Sample records for aerocapture performance analysis

  1. Neptune Aerocapture Systems Analysis

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae

    2004-01-01

    A Neptune Aerocapture Systems Analysis is completed to determine the feasibility, benefit and risk of an aeroshell aerocapture system for Neptune and to identify technology gaps and technology performance goals. The high fidelity systems analysis is completed by a five center NASA team and includes the following disciplines and analyses: science; mission design; aeroshell configuration screening and definition; interplanetary navigation analyses; atmosphere modeling; computational fluid dynamics for aerodynamic performance and database definition; initial stability analyses; guidance development; atmospheric flight simulation; computational fluid dynamics and radiation analyses for aeroheating environment definition; thermal protection system design, concepts and sizing; mass properties; structures; spacecraft design and packaging; and mass sensitivities. Results show that aerocapture can deliver 1.4 times more mass to Neptune orbit than an all-propulsive system for the same launch vehicle. In addition aerocapture results in a 3-4 year reduction in trip time compared to all-propulsive systems. Aerocapture is feasible and performance is adequate for the Neptune aerocapture mission. Monte Carlo simulation results show 100% successful capture for all cases including conservative assumptions on atmosphere and navigation. Enabling technologies for this mission include TPS manufacturing; and aerothermodynamic methods and validation for determining coupled 3-D convection, radiation and ablation aeroheating rates and loads, and the effects on surface recession.
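The 1.4x delivered-mass advantage quoted above comes from avoiding a large propulsive orbit-insertion burn. A minimal sketch of that trade using the ideal rocket equation is below; the arrival mass, burn sizes, and specific impulse are invented for illustration and are not taken from the NTRS study (a real comparison would also charge the aeroshell and TPS mass against the aerocapture case).

```python
import math

def delivered_mass(m_initial, delta_v, isp, g0=9.80665):
    """Mass remaining after an impulsive burn, per the ideal rocket equation."""
    return m_initial * math.exp(-delta_v / (isp * g0))

# Illustrative numbers only: a 5000 kg arrival mass, a hypothetical 3000 m/s
# propulsive orbit-insertion burn vs. a 200 m/s post-aerocapture
# periapsis-raise burn, both with a 320 s bipropellant engine.
m_arrival = 5000.0
m_propulsive = delivered_mass(m_arrival, 3000.0, 320.0)
m_aerocapture = delivered_mass(m_arrival, 200.0, 320.0)
print(round(m_propulsive), round(m_aerocapture), round(m_aerocapture / m_propulsive, 2))
```

Even before subtracting aeroshell mass, the exponential cost of a multi-km/s insertion burn is what drives mass ratios of this order.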

  2. Aerocapture Performance Analysis of A Venus Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.

    2005-01-01

    A performance analysis of a Discovery-class Venus Exploration Mission, in which aerocapture is used to capture a spacecraft into a 300 km polar orbit for a two-year science mission, has been conducted to quantify its performance. A preliminary performance assessment determined that a high-heritage 70-degree sphere-cone rigid aeroshell with a 0.25 lift-to-drag ratio has adequate control authority to provide an entry flight path angle corridor large enough for the mission's aerocapture maneuver. A 114 kilograms per square meter ballistic coefficient reference vehicle was developed from the science requirements and the preliminary assessment's heating indicators and deceleration loads. Performance analyses were conducted for the reference vehicle and for sensitivity studies on vehicle ballistic coefficient and maximum bank rate. The performance analyses used a high fidelity flight simulation within a Monte Carlo executive to define the aerocapture heating environment and deceleration loads and to determine mission success statistics. The simulation utilized the Program to Optimize Simulated Trajectories (POST), modified to include Venus-specific atmospheric and planet models, aerodynamic characteristics, and interplanetary trajectory models. In addition to Venus-specific models, an autonomous guidance system, HYPAS, and a pseudo flight controller were incorporated in the simulation. The Monte Carlo analyses incorporated a reference set of approach trajectory delivery errors, aerodynamic uncertainties, and atmospheric density variations. The reference performance analysis determined that the reference vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 90 meters per second delta-V budget for post-aerocapture orbital adjustments. 
A ballistic coefficient trade study conducted with reference uncertainties determined that the 0.25 L/D vehicle can achieve 100% successful capture with a ballistic coefficient of 228 kilograms
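The ballistic coefficient traded in this study is simply vehicle mass over drag area. A short sketch follows; the mass, drag coefficient, and diameter are hypothetical values chosen only so the result lands near the study's 114 kg/m^2 reference, not figures from the report.

```python
import math

def ballistic_coefficient(mass, c_d, diameter):
    """beta = m / (C_D * A) for a circular-cross-section aeroshell, in kg/m^2."""
    area = math.pi * (diameter / 2.0) ** 2
    return mass / (c_d * area)

# Hypothetical: a 600 kg entry mass on a C_D = 1.5, 2.1 m diameter
# 70-deg sphere-cone gives roughly the 114 kg/m^2 reference value.
beta = ballistic_coefficient(600.0, 1.5, 2.1)
print(round(beta, 1))
```

Raising beta (heavier or smaller vehicle) pushes deceleration deeper into the atmosphere, which is why the trade study bounds how large beta can grow before capture success degrades.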

  3. Aerocapture Systems Analysis for a Titan Mission

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary K.; Queen, Eric M.; Way, David W.; Powell, Richard W.; Edquist, Karl; Starr, Brett R.; Hollis, Brian R.; Zoby, E. Vincent; Hrinda, Glenn A.; Bailey, Robert W.

    2006-01-01

    Performance projections for aerocapture show a vehicle mass savings of between 40 and 80%, dependent on destination, for an aerocapture vehicle compared to an all-propulsive chemical vehicle. In addition aerocapture is applicable to multiple planetary exploration destinations of interest to NASA. The 2001 NASA In-Space Propulsion Program (ISP) technology prioritization effort identified aerocapture as one of the top three propulsion technologies for solar system exploration missions. An additional finding was that aerocapture needed a better system definition and that supporting technology gaps needed to be identified. Consequently, the ISP program sponsored an aerocapture systems analysis effort that was completed in 2002. The focus of the effort was on aerocapture at Titan with a rigid aeroshell system. Titan was selected as the initial destination for the study due to potential interest in a follow-on mission to Cassini/Huygens. Aerocapture is feasible, and the performance is adequate, for the Titan mission and it can deliver 2.4 times more mass to Titan than an all-propulsive system for the same launch vehicle.

  4. Aerocapture Technologies

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.

    2006-01-01

    Aeroassist technology development is a vital part of the NASA In-Space Propulsion Technology (ISPT) Program. One of the main focus areas of ISPT is aeroassist technologies through the Aerocapture Technology (AT) Activity. Within the ISPT, the current aeroassist technology development focus is aerocapture. Aerocapture relies on the exchange of momentum with an atmosphere to achieve thrust, in this case a decelerating thrust leading to orbit capture. Without aerocapture, a substantial propulsion system would be needed on the spacecraft to perform the same reduction of velocity. This could cause reductions in the science payload delivered to the destination, increases in the size of the launch vehicle (to carry the additional fuel required for planetary capture), or could simply make the mission impossible due to additional propulsion requirements. The AT Activity is advancing each technology needed for the successful implementation of aerocapture in future missions. The technology development focuses on rigid aeroshell systems as well as inflatable aerocapture systems, advanced aeroshell performance sensors, lightweight structures and higher temperature adhesives. Inflatable systems such as tethered trailing ballutes ('balloon parachutes'), clamped ballutes, and inflatable aeroshells are also under development. Aerocapture-specific computational tools required to support future aerocapture missions are also an integral part of the AT Activity. Tools include: engineering reference atmosphere models, guidance and navigation, aerothermodynamic modeling, radiation modeling and flight simulation. Systems analysis plays a key role in the AT development process. The NASA in-house aerocapture systems analysis team has been tasked with multiple systems definition and concept studies to complement the technology development tasks. The team derives science requirements, develops guidance and navigation algorithms, as well as engineering reference atmosphere models and

  5. Aerocapture Systems Analysis for a Neptune Mission

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Edquist, Karl T.; Starr, Brett R.; Hollis, Brian R.; Hrinda, Glenn A.; Bailey, Robert W.; Hall, Jeffery L.; Spilker, Thomas R.; Noca, Muriel A.; O'Kongo, N.

    2006-01-01

    A Systems Analysis was completed to determine the feasibility, benefit and risk of an aeroshell aerocapture system for Neptune and to identify technology gaps and technology performance goals. The systems analysis includes the following disciplines: science; mission design; aeroshell configuration; interplanetary navigation analyses; atmosphere modeling; computational fluid dynamics for aerodynamic performance and aeroheating environment; stability analyses; guidance development; atmospheric flight simulation; thermal protection system design; mass properties; structures; spacecraft design and packaging; and mass sensitivities. Results show that aerocapture is feasible and performance is adequate for the Neptune mission. Aerocapture can deliver 1.4 times more mass to Neptune orbit than an all-propulsive system for the same launch vehicle and results in a 3-4 year reduction in trip time compared to all-propulsive systems. Enabling technologies for this mission include TPS manufacturing; and aerothermodynamic methods for determining coupled 3-D convection, radiation and ablation aeroheating rates and loads.

  6. Aerocapture Performance Analysis for a Neptune-Triton Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.; Masciarelli, James P.

    2004-01-01

    A systems analysis has been conducted for a Neptune-Triton Exploration Mission in which aerocapture is used to capture a spacecraft at Neptune. Aerocapture uses aerodynamic drag instead of propulsion to decelerate from the interplanetary approach trajectory to a captured orbit during a single pass through the atmosphere. After capture, propulsion is used to move the spacecraft from the initial captured orbit to the desired science orbit. A preliminary assessment identified that a spacecraft with a lift-to-drag ratio of 0.8 was required for aerocapture. Performance analyses of the 0.8 L/D vehicle were performed using a high fidelity flight simulation within a Monte Carlo executive to determine mission success statistics. The simulation was the Program to Optimize Simulated Trajectories (POST), modified to include Neptune-specific atmospheric and planet models, spacecraft aerodynamic characteristics, and interplanetary trajectory models. To these were added autonomous guidance and pseudo flight controller models. The Monte Carlo analyses incorporated approach trajectory delivery errors, aerodynamic characteristics uncertainties, and atmospheric density variations. Monte Carlo analyses were performed for a reference set of uncertainties and for sets of uncertainties modified to produce increased and reduced atmospheric variability. For the reference uncertainties, the 0.8 L/D flat-bottom ellipsled vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 360 m/s delta-V budget for apoapsis and periapsis adjustment. Monte Carlo analyses were also performed for a guidance system that modulates both bank angle and angle of attack with the reference set of uncertainties. An alpha and bank modulation guidance system reduces the 99.87 percentile delta-V by 173 m/s (48%), to 187 m/s, for the reference set of uncertainties.
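The percentile statistics quoted in studies like this come from a Monte Carlo executive that disperses entry conditions and tallies the post-aerocapture cleanup delta-V. A toy stand-in is sketched below: the linear error model and its coefficients are invented for illustration and bear no relation to the POST simulation or its results.

```python
import random
import statistics

random.seed(1)

# Each "run" perturbs entry flight path angle and an atmospheric density scale
# factor; a crude invented linear model maps those dispersions into the
# post-aerocapture cleanup delta-V in m/s.
def cleanup_delta_v(fpa_err_deg, density_scale):
    return 80.0 + 400.0 * abs(fpa_err_deg) + 150.0 * abs(density_scale - 1.0)

samples = sorted(
    cleanup_delta_v(random.gauss(0.0, 0.2), random.gauss(1.0, 0.1))
    for _ in range(2000)
)
# The 99.87th percentile is the 3-sigma-equivalent budget such studies quote.
budget = samples[int(0.9987 * len(samples)) - 1]
print(round(statistics.mean(samples), 1), round(budget, 1))
```

The interesting output is not the mean but the tail percentile, since the propellant budget must cover nearly all dispersed cases.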

  7. Aerocapture Guidance Performance for the Neptune Orbiter

    NASA Technical Reports Server (NTRS)

    Masciarelli, James P.; Westhelle, Carlos H.; Graves, Claude A.

    2004-01-01

    A performance evaluation of the Hybrid Predictor-corrector Aerocapture Scheme (HYPAS) guidance algorithm for aerocapture at Neptune is presented in this paper for a mission to Neptune and its moon Triton. This mission has several challenges not experienced in previous aerocapture guidance assessments: a very high Neptune arrival speed, atmospheric exit into a high-energy orbit about Neptune, and a very high ballistic coefficient that results in a low altitude acceleration capability when combined with the aeroshell L/D. The evaluation includes a definition of the entry corridor, a comparison to the theoretical optimum performance, and guidance responses to variations in atmospheric density, aerodynamic coefficients and flight path angle for various vehicle configurations (ballistic numbers). The benefits of utilizing angle-of-attack modulation in addition to bank angle modulation to improve flight performance are also discussed. The results show that despite large sensitivities in apoapsis targeting, the algorithm performs within the allocated delta-V budget for the Neptune mission with bank angle modulation only. The addition of angle-of-attack modulation with as little as 5 degrees of amplitude significantly reduces the scatter in final orbit apoapsis. Although the angle-of-attack modulation complicates the vehicle design, the performance enhancement reduces aerocapture risk and reduces the propellant consumption needed to reach the high-energy target orbit with a conventional propulsion system.

  8. Aerocapture Benefits to Future Science Missions

    NASA Technical Reports Server (NTRS)

    Artis, Gwen; James, Bonnie

    2006-01-01

    NASA's In-Space Propulsion Technology (ISPT) Program is investing in technologies to revolutionize the robotic exploration of deep space. One of these technologies is Aerocapture, the most promising of the "aeroassist" techniques used to maneuver a space vehicle within an atmosphere, using aerodynamic forces in lieu of propellant. (Other aeroassist techniques include aeroentry and aerobraking.) Aerocapture relies on atmospheric drag to decelerate an incoming spacecraft and capture it into orbit. This technique is very attractive since it permits spacecraft to be launched from Earth at higher velocities, providing shorter trip times and saving mass and overall cost on future missions. Recent aerocapture systems analysis studies quantify the benefits of aerocapture to future exploration. The 2002 Titan aerocapture study showed that using aerocapture at Titan instead of conventional propulsive capture results in over twice as much payload delivered to Titan. Aerocapture at Venus results in almost twice the payload delivered to Venus as with aerobraking, and over six times more mass delivered into orbit than all-propulsive capture. Aerocapture at Mars shows significant benefits as the payload sizes increase and as missions become more complex. Recent Neptune aerocapture studies show that aerocapture opens up entirely new classes of missions at Neptune. Current aerocapture technology development is advancing the maturity of each subsystem technology needed for successful implementation of aerocapture on future missions. Recent development has focused on both rigid aeroshell and inflatable aerocapture systems. Rigid aeroshell systems development includes new ablative and non-ablative thermal protection systems, advanced aeroshell performance sensors, lightweight structures and higher temperature adhesives. Inflatable systems such as trailing tethered and clamped "ballutes" and inflatable aeroshells are also under development. 
Computational tools required to support

  9. ISP Aerocapture Technology

    NASA Astrophysics Data System (ADS)

    James, B.

    2004-11-01

    Aerocapture technology development is a vital part of the NASA In-Space Propulsion Program (ISP), which is managed by NASA Headquarters and implemented at the NASA Marshall Space Flight Center in Huntsville, Alabama. Aerocapture is a flight maneuver designed to aerodynamically decelerate a spacecraft from hyperbolic approach to a captured orbit during one pass through the atmosphere. Small amounts of propulsive fuel are used for attitude control and periapsis raise only. This technique is very attractive since it permits spacecraft to be launched from Earth at higher velocities, reducing trip times. The aerocapture technique also significantly reduces the overall mass of the propulsion systems. This allows for more science payload to be added to the mission. Alternatively, a smaller launch vehicle could be used, reducing overall mission cost. Aerocapture can be realized in various ways. It can be accomplished using rigid aeroshells, such as those used in previous mission efforts (like Apollo, the planned Aeroassist Flight Experiment and the Mars Exploration Rovers). Aerocapture can also be achieved with inflatable deceleration systems. This family includes the use of a potentially lighter, inflatable aeroshell or a large, trailing ballute - a combination parachute and balloon made of durable, thin material and stowed behind the vehicle for deployment. Aerocapture utilizing inflatable decelerators is also derived from previous efforts, but will necessitate further research to reach the technology readiness level (TRL) that the rigid aeroshell has achieved. Results of recent Aerocapture Systems analysis studies for small bodies and giant planets show that aerocapture can be enhancing for most missions and absolutely enabling for some mission scenarios. In this way, Aerocapture could open up exciting, new science mission opportunities.
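The velocity change that aerocapture absorbs can be sized with the vis-viva equation: the spacecraft arrives at periapsis on a hyperbola and must leave on the target ellipse. The sketch below uses an approximate Neptune gravitational parameter and a hypothetical arrival state (the v-infinity, periapsis radius, and target orbit are illustrative, not from any of these studies).

```python
import math

MU_NEPTUNE = 6.8365e15  # m^3/s^2, Neptune gravitational parameter (approximate)

def vis_viva(r, a):
    """Orbital speed (m/s) at radius r on an orbit of semi-major axis a."""
    return math.sqrt(MU_NEPTUNE * (2.0 / r - 1.0 / a))

# Hypothetical arrival: 15 km/s v-infinity, 25,000 km periapsis radius,
# capturing to an ellipse with a 200,000 km semi-major axis.
r_p = 25_000e3
v_inf = 15_000.0
v_hyperbolic = math.sqrt(v_inf**2 + 2.0 * MU_NEPTUNE / r_p)  # periapsis speed
v_captured = vis_viva(r_p, 200_000e3)
dv_saved = (v_hyperbolic - v_captured) / 1000.0  # km/s shed by drag, not propellant
print(round(dv_saved, 2))
```

Several km/s shed aerodynamically in one pass is the entire case for aerocapture at the outer planets: a burn of that size would dominate the spacecraft mass budget.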

  10. Aerocapture Guidance and Performance at Mars for High-Mass Systems

    NASA Technical Reports Server (NTRS)

    Zumwalt, Carlie H.; Sostaric, Ronald R.; Westhelle, Carlos H.; Cianciolo, Alicia Dwyer

    2010-01-01

    The objective of this study is to understand the performance associated with using the aerocapture maneuver to slow high-mass systems from an Earth-approach trajectory into orbit around Mars. This work is done in conjunction with the Mars Entry, Descent and Landing Systems Analysis (EDL-SA) task to explore candidate technologies necessary for development in order to land large-scale payloads on the surface of Mars. Technologies considered include hypersonic inflatable aerodynamic decelerators (HIADs) and rigid mid lift-to-drag ratio (L/D) aeroshells. Nominal aerocapture trajectories were developed for the mid-L/D aeroshell and two sizes of HIADs, and Monte Carlo analysis was completed to understand sensitivities to dispersions. Additionally, a study was completed to determine the size of the larger of the two HIADs that would maintain design constraints on peak heat rate and diameter. Results show that each of the three aeroshell designs studied is a viable option for landing high-mass payloads, as none of the three exceeds its performance constraints.

  11. Aerocapture Technology Development Needs for Outer Planet Exploration

    NASA Technical Reports Server (NTRS)

    Wercinski, Paul; Munk, Michelle; Powell, Richard; Hall, Jeff; Graves, Claude; Partridge, Harry (Technical Monitor)

    2002-01-01

    The purpose of this white paper is to identify aerocapture technology and system-level development needs to enable NASA future mission planning in support of Outer Planet Exploration. Aerocapture is a flight maneuver that takes place at very high speeds within a planet's atmosphere and provides a change in velocity using aerodynamic forces (in contrast to propulsive thrust) for orbit insertion. Aerocapture is very much a system-level technology in which individual disciplines such as systems analysis and integrated vehicle design, aerodynamics, aerothermal environments, thermal protection systems (TPS), guidance, navigation and control (GN&C), and instrumentation need to be integrated and optimized to meet mission-specific requirements. This paper identifies ongoing activities, and their relevance and potential benefit to outer planet aerocapture; these include the New Millennium ST7 Aerocapture concept definition study, Mars Exploration Program aeroassist project-level support, and FY01 Aeroassist In-Space Guideline tasks. The challenges of performing aerocapture for outer planet missions such as Titan Explorer or Neptune Orbiter require investments to advance the technology readiness of the aerocapture technology disciplines for the unique application of outer planet aerocapture. This white paper identifies critical technology gaps (with emphasis on aeroshell concepts) and strategies for advancement.

  12. Analysis of aerothermodynamic environment of a Titan aerocapture vehicle

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Chow, H.; Moss, J. N.

    1982-01-01

    The feasibility of an aerocapture vehicle mission has been emphasized recently for inner and outer planetary missions. Aerocapture involves a system concept which utilizes aerodynamic drag to acquire the velocity reduction necessary to obtain a closed planetary orbit from a hyperbolic flyby trajectory. It has been proposed to use the atmosphere of Titan for braking into a Saturn orbit. This approach for a Saturn orbital mission is expected to cut the interplanetary cruise travel time to Saturn from 8 to 3.5 years. In connection with the preparation of such a mission, it will be necessary to provide a complete analysis of the aerodynamic environment of the Titan aerocapture vehicle. The main objective of the present investigation is, therefore, to determine the extent of convective and radiative heating for the aerocapture vehicle under different entry conditions. This can be essentially accomplished by assessing the heating rates in the stagnation and windward regions of an equivalent body.
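Stagnation-point convective heating of the kind assessed in this record is commonly estimated with a Sutton-Graves-style correlation, q = k * sqrt(rho / r_n) * v^3. A minimal sketch follows; the coefficient used is the commonly quoted Earth-air constant, applied here purely for illustration rather than a Titan-atmosphere-calibrated value, and the flight point is hypothetical.

```python
import math

K_AIR = 1.7415e-4  # Sutton-Graves coefficient for Earth air, kg^0.5/m

def stagnation_heat_rate(rho, velocity, nose_radius):
    """Convective stagnation-point heating in W/m^2 (Sutton-Graves form)."""
    return K_AIR * math.sqrt(rho / nose_radius) * velocity**3

# Hypothetical flight point: 6.5 km/s at 1e-4 kg/m^3 density, 1.25 m nose radius.
q_w_cm2 = stagnation_heat_rate(1e-4, 6500.0, 1.25) / 1e4  # convert to W/cm^2
print(round(q_w_cm2, 1))
```

The cubic dependence on velocity is why the entry-condition sweep in the study matters so much: modest changes in entry speed move the heating environment dramatically.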

  13. Neptune aerocapture mission and spacecraft design overview

    NASA Technical Reports Server (NTRS)

    Bailey, Robert W.; Hall, Jeff L.; Spilker, Tom R.; O'Kongo, Nora

    2004-01-01

    A detailed Neptune aerocapture systems analysis and spacecraft design study was performed as part of NASA's In-Space Propulsion Program. The primary objective was to assess the feasibility of a spacecraft point design for a Neptune/Triton science mission that uses aerocapture as the Neptune orbit insertion mechanism. This paper provides an overview of the science, mission and spacecraft design resulting from that study.

  14. Atmospheric Models for Aerocapture

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta L.; Keller, Vernon W.

    2003-01-01

    There are eight destinations in the Solar System with sufficient atmosphere for aerocapture to be a viable aeroassist option: Venus, Earth, Mars, Jupiter, Saturn and its moon Titan, Uranus, and Neptune. Engineering-level atmospheric models for four of these targets (Earth, Mars, Titan, and Neptune) have been developed for NASA to support systems analysis studies of potential future aerocapture missions. Development of a similar atmospheric model for Venus has recently commenced. An important capability of all of these models is their ability to simulate quasi-random density perturbations for Monte Carlo analyses in developing guidance, navigation and control algorithms, and for thermal systems design. Similarities and differences among these atmospheric models are presented, with emphasis on the recently developed Neptune model and on planned characteristics of the Venus model. Example applications for aerocapture are also presented and illustrated. Recent updates to the Titan atmospheric model, in anticipation of applications for trajectory and atmospheric reconstruction of the Huygens Probe entry at Titan, are discussed. Recent updates to the Mars atmospheric model, in support of ongoing Mars aerocapture systems analysis studies, are also presented.
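The "quasi-random density perturbations" these engineering models provide can be pictured as a correlated noise process layered on a mean density profile. The toy below uses a first-order autoregressive perturbation on an exponential atmosphere; the scale height, perturbation sigma, and correlation are invented stand-ins, not values from any GRAM-series model.

```python
import math
import random

random.seed(0)

# Invented parameters: sea-level density, scale height, 5% perturbation sigma,
# and step-to-step correlation for the AR(1) quasi-random perturbation.
RHO0, SCALE_HEIGHT, SIGMA, CORR = 1.2, 8000.0, 0.05, 0.9

def perturbed_profile(altitudes_m):
    """Exponential mean density with a correlated quasi-random perturbation."""
    pert, profile = 0.0, []
    for z in altitudes_m:
        pert = CORR * pert + random.gauss(0.0, SIGMA * math.sqrt(1 - CORR**2))
        profile.append(RHO0 * math.exp(-z / SCALE_HEIGHT) * (1.0 + pert))
    return profile

rhos = perturbed_profile(range(0, 100_000, 1000))
print(len(rhos), round(rhos[0], 3))
```

Each Monte Carlo trajectory samples a fresh profile, so the guidance sees density that is plausibly smooth yet different every run, which is exactly the stress such models are built to apply.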

  15. A Comparative Study of Aerocapture Missions with a Mars Destination

    NASA Technical Reports Server (NTRS)

    Vaughan, Diane; Miller, Heather C.; Griffin, Brand; James, Bonnie F.; Munk, Michelle M.

    2005-01-01

    Conventional interplanetary spacecraft use propulsive systems to decelerate into orbit. Aerocapture is an alternative approach for orbit capture, in which the spacecraft makes a single pass through a target destination's atmosphere. Although this technique has never been performed, studies show there are substantial benefits of using aerocapture for reduction of propellant mass, spacecraft size, and mission cost. The In-Space Propulsion (ISP) Program, part of NASA's Science Mission Directorate, has invested in aerocapture technology development since 2002. Aerocapture investments within ISP are largely driven by mission systems analysis studies. The purpose of this NASA-funded report is to identify and document the fundamental parameters of aerocapture within previous human and robotic Mars mission studies, which will assist the community in identifying technology research gaps in human and robotic missions, and provide insight for future technology investments. Upon examination of the final data set, some key attributes within the aerocapture disciplines are identified.

  16. Aerocapture Technology to Reduce Trip Time and Cost of Planetary Missions

    NASA Astrophysics Data System (ADS)

    Artis, Gwen R.; James, B.

    2006-12-01

    NASA’s In-Space Propulsion Technology (ISPT) Program is investing in technologies to revolutionize the robotic exploration of deep space. One of these technologies is Aerocapture, the most promising of the “aeroassist” techniques used to maneuver a space vehicle within an atmosphere, using aerodynamic forces in lieu of propellant. (Other aeroassist techniques include aeroentry and aerobraking.) Aerocapture relies on atmospheric drag to decelerate an incoming spacecraft and capture it into orbit. This technique is very attractive since it permits spacecraft to be launched from Earth at higher velocities, providing shorter trip times and saving mass and overall cost on future missions. Recent aerocapture systems analysis studies quantify the benefits of aerocapture to future exploration. The 2002 Titan aerocapture study showed that using aerocapture at Titan instead of conventional propulsive capture results in over twice as much payload delivered to Titan. Aerocapture at Venus results in almost twice the payload delivered to Venus as with aerobraking, and over six times more mass delivered into orbit than all-propulsive capture. Aerocapture at Mars shows significant benefits as the payload sizes increase and as missions become more complex. Recent Neptune aerocapture studies show that aerocapture opens up entirely new classes of missions at Neptune. Current aerocapture technology development is advancing the maturity of each subsystem technology needed for successful implementation of aerocapture on future missions. Recent development has focused on both rigid aeroshell and inflatable aerocapture systems. Rigid aeroshell systems development includes new ablative and non-ablative thermal protection systems, advanced aeroshell performance sensors, lightweight structures and higher temperature adhesives. Inflatable systems such as trailing tethered and clamped “ballutes” and inflatable aeroshells are also under development. Computational tools required

  17. Aerocapture Guidance Algorithm Comparison Campaign

    NASA Technical Reports Server (NTRS)

    Rousseau, Stephane; Perot, Etienne; Graves, Claude; Masciarelli, James P.; Queen, Eric

    2002-01-01

    Aerocapture is a promising technique for future human interplanetary missions. The Mars Sample Return mission was initially based on an insertion by aerocapture, and a CNES orbiter, Mars Premier, was developed to demonstrate this concept. Mainly due to budget constraints, the aerocapture was cancelled for the French orbiter. Many studies were conducted during the last three years to develop and test different guidance algorithms (APC, EC, TPC, NPC). This work was shared between CNES and NASA, with a fruitful joint working group. To conclude this study, an evaluation campaign was performed to test the different algorithms. The objective was to assess the robustness, accuracy, load-limiting capability, and complexity of each algorithm. A simulation campaign was specified and performed by CNES, with a similar activity on the NASA side to confirm the CNES results. This evaluation demonstrated that the numerical guidance principle is not competitive with the analytical concepts. All the other algorithms are well adapted to guarantee the success of the aerocapture. The TPC appears to be the most robust, the APC the most accurate, and the EC a good compromise.

  18. Atmospheric Models for Aerocapture

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta L.; Keller, Vernon W.

    2004-01-01

    There are eight destinations in the Solar System with sufficient atmosphere for aerocapture to be a viable aeroassist option: Venus, Earth, Mars, Jupiter, Saturn and its moon Titan, Uranus, and Neptune. Engineering-level atmospheric models for four of these targets (Earth, Mars, Titan, and Neptune) have been developed for NASA to support systems analysis studies of potential future aerocapture missions. Development of a similar atmospheric model for Venus has recently commenced. An important capability of all of these models is their ability to simulate quasi-random density perturbations for Monte Carlo analyses in developing guidance, navigation and control algorithms, and for thermal systems design. Similarities and differences among these atmospheric models are presented, with emphasis on the recently developed Neptune model and on planned characteristics of the Venus model. Example applications for aerocapture are also presented and illustrated. Recent updates to the Titan atmospheric model are discussed, in anticipation of applications for trajectory and atmospheric reconstruction of the Huygens Probe entry at Titan.

  19. Aerocapture Technology Development Overview

    NASA Technical Reports Server (NTRS)

    Munk, Michelle M.; Moon, Steven A.

    2008-01-01

    This paper will explain the investment strategy, the role of detailed systems analysis, and the hardware and modeling developments that have resulted from the past 5 years of work under NASA's In-Space Propulsion Program (ISPT) Aerocapture investment area. The organizations that have been funded by ISPT over that time period received awards from a 2002 NASA Research Announcement. They are: Lockheed Martin Space Systems, Applied Research Associates, Inc., Ball Aerospace, NASA's Ames Research Center, and NASA's Langley Research Center. Their accomplishments include improved understanding of entry aerothermal environments, particularly at Titan, demonstration of aerocapture guidance algorithm robustness at multiple bodies, manufacture and test of a 2-meter Carbon-Carbon "hot structure," development and test of evolutionary, high-temperature structural systems with efficient ablative materials, and development of aerothermal sensors that will fly on the Mars Science Laboratory in 2009. Due in large part to this sustained ISPT support for Aerocapture, the technology is ready to be validated in flight.

  20. Application of a Fully Numerical Guidance to Mars Aerocapture

    NASA Technical Reports Server (NTRS)

    Matz, Daniel A.; Lu, Ping; Mendeck, Gavin F.; Sostaric, Ronald R.

    2017-01-01

    An advanced guidance algorithm, Fully Numerical Predictor-corrector Aerocapture Guidance (FNPAG), has been developed to perform aerocapture maneuvers in an optimal manner. It is a model-based, numerical guidance that benefits from requiring few adjustments across a variety of different hypersonic vehicle lift-to-drag ratios, ballistic coefficients, and atmospheric entry conditions. In this paper, FNPAG is first applied to the Mars Rigid Vehicle (MRV) mid lift-to-drag ratio concept. Then the study is generalized to a design map of potential Mars aerocapture missions and vehicles, ranging from the scale and requirements of recent robotic missions to potential human and precursor missions. The design map results show the versatility of FNPAG and provide insight for the design of Mars aerocapture vehicles and atmospheric entry conditions to achieve desired performance.
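The predictor-corrector idea behind guidance of this family can be sketched in a few lines: "predict" the exit apoapsis as a function of the commanded bank angle, then "correct" by searching for the bank angle that hits the target. The sketch below is not FNPAG itself; the monotone predictor function is invented, whereas the real algorithm numerically integrates the full entry dynamics on every guidance cycle.

```python
import math

def predicted_apoapsis_km(bank_deg):
    # Invented monotone predictor: more lift-up (smaller bank) -> higher apoapsis.
    return 400.0 + 4000.0 * math.cos(math.radians(bank_deg))

def correct_bank(target_km, lo=0.0, hi=180.0, tol=1e-6):
    """Bisect for the bank angle whose predicted apoapsis matches the target."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if predicted_apoapsis_km(mid) > target_km:
            lo = mid  # apoapsis too high -> bank further over (less lift up)
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

bank = correct_bank(2000.0)
print(round(bank, 2))
```

Because the corrector only needs the predictor to be monotone in bank angle, the same loop transfers across vehicles and entry states, which is the robustness property the paper highlights.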

  1. Mars Aerocapture Systems Study

    NASA Technical Reports Server (NTRS)

    Wright, Henry S.; Oh, David Y.; Westhelle, Carlos H.; Fisher, Jody L.; Dyke, R. Eric; Edquist, Karl T.; Brown, James L.; Justh, Hilary L.; Munk, Michelle M.

    2006-01-01

    Mars Aerocapture Systems Study (MASS) is a detailed study of the application of aerocapture to a large Mars robotic orbiter to assess and identify key technology gaps. This study addressed the use of an Opposition-class return segment in the Mars Sample Return architecture, covering mission architecture issues as well as system design. Key trade studies focused on the design of the aerocapture aeroshell, spacecraft design and packaging, guidance, navigation and control with simulation, computational fluid dynamics, and thermal protection system sizing. Detailed master equipment lists are included, as well as a cursory cost assessment.

  2. Aerocapture Technology Development for Planetary Science - Update

    NASA Technical Reports Server (NTRS)

    Munk, Michelle M.

    2006-01-01

    Within NASA's Science Mission Directorate is a technology program dedicated to improving the cost, mass, and trip time of future scientific missions throughout the Solar System. The In-Space Propulsion Technology (ISPT) Program, established in 2001, is charged with advancing propulsion systems used in space from Technology Readiness Level (TRL) 3 to TRL 6, and with planning activities leading to flight readiness. The program's content has changed considerably since inception, as the program has refocused its priorities. One of the technologies that has remained in the ISPT portfolio through these changes is Aerocapture. Aerocapture is the use of a planetary body's atmosphere to slow a vehicle from hyperbolic velocity to a low-energy orbit suitable for science. Prospective use of this technology has repeatedly shown huge mass savings for missions of interest in planetary exploration, at Titan, Neptune, Venus, and Mars. With launch vehicle costs rising, these savings could be the key to mission viability. This paper provides an update on the current state of the Aerocapture technology development effort, summarizes some recent key findings, and highlights hardware developments that are ready for application to Aerocapture vehicles and entry probes alike. Description of Investments: The Aerocapture technology area within the ISPT program has utilized expertise from around NASA to perform Phase A-level studies of future missions and to identify technology gaps that need to be filled to achieve flight readiness. A 2002 study of the Titan Explorer mission concept showed that the combination of Aerocapture and a Solar Electric Propulsion system could deliver a lander and orbiter to Titan in half the time and on a smaller, less expensive launch vehicle, compared to a mission using chemical propulsion for the interplanetary injection and orbit insertion. The study also identified no component technology breakthroughs necessary to implement Aerocapture on such a mission.

  3. Mars Aerocapture and Validation of Mars-GRAM with TES Data

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta; Keller, Vernon W.

    2005-01-01

    Mars Global Reference Atmospheric Model (Mars-GRAM) is a widely-used engineering-level Mars atmospheric model. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Typical Mars aerocapture periapsis altitudes (for systems with rigid-aeroshell heat shields) are about 50 km. This altitude is above the 0-40 km height range covered by Mars Global Surveyor Thermal Emission Spectrometer (TES) nadir observations. Recently, TES limb sounding data have been made available, spanning more than two Mars years (more than 200,000 data profiles) with altitude coverage up to about 60 km, well within the height range of interest for aerocapture. Results are presented comparing Mars-GRAM atmospheric density with densities from TES nadir and limb sounding observations. A new Mars-GRAM feature is described which allows individual TES nadir or limb profiles to be extracted from the large TES databases and used as an optional replacement for standard Mars-GRAM background (climatology) conditions. For Monte Carlo applications such as aerocapture guidance and control studies, Mars-GRAM perturbations are available using these TES profile background conditions.
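A minimal stand-in for the kind of dispersed density profiles described here (not Mars-GRAM itself; the mean-atmosphere constants and perturbation magnitudes are assumptions) applies a per-profile bias plus an altitude-correlated perturbation about an exponential mean atmosphere:

```python
import math
import random

def perturbed_density_profile(alts_km, rho0=0.020, scale_h_km=11.1,
                              bias_sigma=0.10, wave_sigma=0.05, seed=None):
    """Exponential mean density with a per-profile bias and an AR(1)
    altitude-correlated perturbation -- a toy stand-in for Mars-GRAM
    Monte Carlo dispersions. Returns densities in kg/m^3."""
    rng = random.Random(seed)
    bias = rng.gauss(0.0, bias_sigma)       # one systematic offset per profile
    wave = 0.0
    out = []
    for z in alts_km:
        wave = 0.9 * wave + rng.gauss(0.0, wave_sigma)   # correlated in altitude
        rho = rho0 * math.exp(-z / scale_h_km) * (1.0 + bias + wave)
        out.append(max(rho, 0.0))
    return out
```

Each call with a different seed yields one dispersed atmosphere for a Monte Carlo trajectory case.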

  4. Parametric entry corridors for lunar/Mars aerocapture missions

    NASA Technical Reports Server (NTRS)

    Ling, Lisa M.; Baseggio, Franco M.; Fuhry, Douglas P.

    1991-01-01

    Parametric atmospheric entry corridor data are presented for Earth and Mars aerocapture. Parameter ranges were dictated by the range of mission designs currently envisioned as possibilities for the Human Exploration Initiative (HEI). These data, while not providing a means for exhaustive evaluation of aerocapture performance, should prove to be a useful aid for preliminary mission design and evaluation. Entry corridors are expressed as ranges of allowable vacuum periapse altitude of the planetary approach hyperbolic orbit, with charts provided for conversion to an approximate flight path angle corridor at entry interface (125 km altitude). The corridor boundaries are defined by open-loop aerocapture trajectories which satisfy boundary constraints while utilizing the full aerodynamic control capability of the vehicle (i.e., full lift-up or full lift-down). Parameters examined were limited to those of greatest importance from an aerocapture performance standpoint, including the approach orbit hyperbolic excess velocity, the vehicle lift-to-drag ratio, the maximum aerodynamic load factor limit, and the apoapse of the target orbit. The impact of atmospheric density bias uncertainties is also included. The corridor data are presented in graphical format, and examples of the utilization of these graphs for mission design and evaluation are included.
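The conversion the charts provide, from vacuum periapsis altitude of the approach hyperbola to flight path angle at entry interface, follows from conic-orbit relations (vis-viva plus conservation of angular momentum). A sketch with approximate Mars constants; the altitudes and excess velocity below are illustrative:

```python
import math

def entry_fpa(mu, r_p, v_inf, r_ei):
    """Flight-path angle (deg, negative = descending) at entry-interface
    radius r_ei for a hyperbolic approach with vacuum periapsis radius r_p
    and hyperbolic excess speed v_inf. All inputs SI."""
    v_p = math.sqrt(v_inf**2 + 2.0 * mu / r_p)     # vis-viva at vacuum periapsis
    h = r_p * v_p                                  # velocity ⟂ radius at periapsis
    v_ei = math.sqrt(v_inf**2 + 2.0 * mu / r_ei)   # vis-viva at entry interface
    cos_g = min(1.0, h / (r_ei * v_ei))            # h = r * v * cos(gamma)
    return -math.degrees(math.acos(cos_g))

# Mars example: 40 km vacuum periapsis, 3.5 km/s excess speed, 125 km interface
MU_MARS = 4.2828e13      # m^3/s^2 (approximate)
R_MARS = 3396.2e3        # m
gamma = entry_fpa(MU_MARS, R_MARS + 40e3, 3.5e3, R_MARS + 125e3)
```

For these numbers the entry flight path angle comes out near -10 degrees, a typical Mars aerocapture value.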

  5. Assessing the Relative Risk of Aerocapture Using Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Percy, Thomas K.; Bright, Ellanee; Torres, Abel O.

    2005-01-01

    A recent study performed for the Aerocapture Technology Area in the In-Space Propulsion Technology Projects Office at the Marshall Space Flight Center investigated the relative risk of various capture techniques for Mars missions. Aerocapture has been proposed as a possible capture technique for future Mars missions but has been perceived by many in the community as a higher risk option as compared to aerobraking and propulsive capture. By performing a probabilistic risk assessment on aerocapture, aerobraking and propulsive capture, a comparison was made to uncover the projected relative risks of these three maneuvers. For mission planners, this knowledge will allow them to decide if the mass savings provided by aerocapture warrant any incremental risk exposure. The study focuses on a Mars Sample Return mission currently under investigation at the Jet Propulsion Laboratory (JPL). In each case (propulsive, aerobraking and aerocapture), the Earth return vehicle is inserted into Martian orbit by one of the three techniques being investigated. A baseline spacecraft was established through initial sizing exercises performed by JPL's Team X. While Team X design results provided the baseline and common thread between the spacecraft, in each case the Team X results were supplemented by historical data as needed. Propulsion, thermal protection, guidance, navigation and control, software, solar arrays, navigation and targeting and atmospheric prediction were investigated. A qualitative assessment of human reliability was also included. Results show that different risk drivers contribute significantly to each capture technique. For aerocapture, the significant drivers include propulsion system failures and atmospheric prediction errors. Software and guidance hardware contribute the most to aerobraking risk. Propulsive capture risk is mainly driven by anomalous solar array degradation and propulsion system failures. 
While each subsystem contributes differently to the risk of

  6. An Assessment of Aerocapture and Applications to Future Missions to Uranus and Neptune

    NASA Astrophysics Data System (ADS)

    Beauchamp, P. M.; Spilker, T. R.

    2017-12-01

    Our investigation examined the current state of readiness of aerocapture at several destinations of interest, including Uranus and Neptune, to identify what technologies are needed, and to determine if a technology demonstration mission is required prior to the first use of aerocapture for a science mission. The study team concluded that the current state of readiness is destination dependent, with aerocapture missions feasible at Venus, Mars, and Titan with current technologies. The use of aerocapture for orbit insertion at the ice giant planets Uranus and Neptune requires, at a minimum, additional study to assess the expected performance of new guidance, navigation, and control algorithms, and possibly the development of new hardware, such as a mid-L/D entry vehicle shape or new thermal protection system materials. A variety of near-term activities could contribute to risk reduction for missions proposing use of aerocapture, but a system-level technology demonstration mission is not deemed necessary before the use of aerocapture for a NASA science mission.

  7. Structural Design for a Neptune Aerocapture Mission

    NASA Technical Reports Server (NTRS)

    Dyke, R. Eric; Hrinda, Glenn A.

    2004-01-01

    A multi-center study was conducted in 2003 to assess the feasibility of, and technology requirements for, using aerocapture to insert a scientific platform into orbit around Neptune. The aerocapture technique offers a potential method of greatly reducing orbiter mass, and thus total spacecraft launch mass, by minimizing the required propulsion system mass. This study involved the collaborative efforts of personnel from Langley Research Center (LaRC), Johnson Space Center (JSC), Marshall Space Flight Center (MSFC), Ames Research Center (ARC), and the Jet Propulsion Laboratory (JPL). One aspect of this effort was the structural design of the full spacecraft configuration, including the ellipsled aerocapture orbiter and the in-space solar electric propulsion (SEP) module/cruise stage. This paper will discuss the functional and structural requirements for each of these components, some of the design trades leading to the final configuration, the loading environments, and the analysis methods used to ensure structural integrity. It will also highlight the design and structural challenges faced while trying to integrate all the mission requirements. Component sizes, materials, construction methods and analytical results, including masses and natural frequencies, will be presented, showing the feasibility of the resulting design for use in a Neptune aerocapture mission. Lastly, results of a post-study structural mass optimization effort on the ellipsled will be discussed, showing potential mass savings and their influence on structural strength and stiffness.

  8. Overview of a Proposed Flight Validation of Aerocapture System Technology for Planetary Missions

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Hall, Jeffery L.; Oh, David; Munk, Michelle M.

    2006-01-01

    Aerocapture System Technology for Planetary Missions is being proposed to NASA's New Millennium Program for flight aboard the Space Technology 9 (ST9) flight opportunity. The proposed ST9 aerocapture mission is a system-level flight validation of the aerocapture maneuver as performed by an instrumented, high-fidelity flight vehicle within a true in-space and atmospheric environment. Successful validation of the aerocapture maneuver will be enabled through the flight validation of an advanced guidance, navigation, and control system as developed by Ball Aerospace and two advanced Thermal Protection System (TPS) materials, Silicon Refined Ablative Material-20 (SRAM-20) and SRAM-14, as developed by Applied Research Associates (ARA) Ablatives Laboratory. The ST9 aerocapture flight validation will be sufficient for immediate infusion of these technologies into NASA science missions being proposed for flight to a variety of Solar System destinations possessing a significant planetary atmosphere.

  9. Generic aerocapture atmospheric entry study, volume 1

    NASA Technical Reports Server (NTRS)

    1980-01-01

    An atmospheric entry study to define a generic aerocapture vehicle capable of missions to Mars, Saturn, and Uranus is reported. A single external geometry was developed through atmospheric entry simulations. Aerocapture is a system design concept which uses an aerodynamically controlled atmospheric entry to provide the necessary velocity depletion to capture payloads into planetary orbit. Design concepts are presented which provide the control accuracy required while giving thermal protection for the mission payload. The system design concepts consist of the following elements: (1) an extendable biconic aerodynamic configuration with lift-to-drag ratio between 1.0 and 2.0; (2) roll control system concepts to control aerodynamic lift and disturbance torques; (3) aeroshell design concepts capable of meeting dynamic pressure loads during aerocapture; and (4) entry thermal protection system design concepts to meet thermodynamic loads during aerocapture.

  10. Demonstration of an Aerocapture GN and C System Through Hardware-in-the-Loop Simulations

    NASA Technical Reports Server (NTRS)

    Masciarelli, James; Deppen, Jennifer; Bladt, Jeff; Fleck, Jeff; Lawson, Dave

    2010-01-01

    Aerocapture is an orbit insertion maneuver in which a spacecraft flies through a planetary atmosphere one time, using drag force to decelerate and effect a hyperbolic-to-elliptical orbit change. Aerocapture employs a feedback Guidance, Navigation, and Control (GN&C) system to deliver the spacecraft into a precise post-atmospheric orbit despite the uncertainties inherent in planetary atmosphere knowledge, entry targeting, and aerodynamic predictions. Only small amounts of propellant are required for attitude control and orbit adjustments, thereby providing mass savings of hundreds to thousands of kilograms over conventional all-propulsive techniques. The Analytic Predictor Corrector (APC) guidance algorithm has been developed to steer the vehicle through the aerocapture maneuver using bank angle control. Through funding provided by NASA's In-Space Propulsion Technology Program, the operation of an aerocapture GN&C system has been demonstrated in high-fidelity simulations that include real-time hardware in the loop, thus increasing the Technology Readiness Level (TRL) of aerocapture GN&C. First, a non-real-time (NRT), 6-DOF trajectory simulation was developed for the aerocapture trajectory. The simulation included vehicle dynamics, a gravity model, an atmosphere model, an aerodynamics model, an inertial measurement unit (IMU) model, attitude control thruster torque models, and GN&C algorithms (including the APC aerocapture guidance). The simulation used the vehicle and mission parameters from the ST-9 mission. A 2000-case Monte Carlo simulation was performed; results show an aerocapture success rate greater than 99.7%, with more than 95% of the total delta-V required for orbit insertion provided by aerodynamic drag, and a post-aerocapture orbit plane wedge angle error of less than 0.5 deg (3-sigma).
Then a real-time (RT), 6-DOF simulation for the aerocapture trajectory was developed which demonstrated the guidance software executing on a flight-like computer, interfacing with a
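The Monte Carlo bookkeeping described above can be sketched as follows. The per-case model below is a deliberately toy stand-in with hypothetical corridor and dispersion numbers; it shows only the success-rate and statistics pattern of such an executive, not the ST-9 simulation itself:

```python
import random
import statistics

def run_case(rng):
    """Toy single-case model: capture succeeds when the dispersed entry
    flight-path angle and density scale stay inside fixed bounds
    (all numbers hypothetical)."""
    efpa = rng.gauss(-10.5, 0.15)            # deg, delivery dispersion
    dens = rng.lognormvariate(0.0, 0.10)     # atmospheric density scale factor
    captured = -11.0 < efpa < -10.0 and 0.7 < dens < 1.4
    dv_cleanup = abs(rng.gauss(30.0, 15.0))  # m/s, post-capture orbit adjust
    return captured, dv_cleanup

def monte_carlo(n=2000, seed=1):
    """Run n dispersed cases; return (success rate, mean cleanup delta-V)."""
    rng = random.Random(seed)
    results = [run_case(rng) for _ in range(n)]
    dv_ok = [dv for ok, dv in results if ok]
    return len(dv_ok) / n, statistics.mean(dv_ok)
```

With these dispersion widths the toy corridor is roughly a 3-sigma bound, so the success rate lands near the high-90s percentages quoted in such studies.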

  11. NASA Development of Aerocapture Technologies

    NASA Technical Reports Server (NTRS)

    James, Bonnie; Munk, Michelle; Moon, Steve

    2003-01-01

    Aeroassist technology development is a vital part of the NASA In-Space Propulsion Program (ISP), which is managed by the NASA Headquarters Office of Space Science, and implemented by the Marshall Space Flight Center in Huntsville, Alabama. Aeroassist is the general term given to various techniques to maneuver a space vehicle within an atmosphere, using aerodynamic forces in lieu of propulsive fuel. Within the ISP, the current aeroassist technology development focus is aerocapture. The objective of the ISP Aerocapture Technology Project (ATP) is to develop technologies that can enable and/or benefit NASA science missions by significantly reducing cost, mass, and/or travel times. To accomplish this objective, the ATP identifies and prioritizes the most promising technologies using systems analysis, technology advancement and peer review, coupled with NASA Headquarters Office of Space Science target requirements. Plans are focused on developing mid-Technology Readiness Level (TRL) technologies to TRL 6 (ready for technology demonstration in space).

  12. NASA Development of Aerocapture Technologies

    NASA Technical Reports Server (NTRS)

    James, Bonnie; Munk, Michelle; Moon, Steve

    2004-01-01

    Aeroassist technology development is a vital part of the NASA In-Space Propulsion Program (ISP), which is managed by the NASA Headquarters Office of Space Science, and implemented by the Marshall Space Flight Center in Huntsville, Alabama. Aeroassist is the general term given to various techniques to maneuver a space vehicle within an atmosphere, using aerodynamic forces in lieu of propulsive fuel. Within the ISP, the current aeroassist technology development focus is aerocapture. The objective of the ISP Aerocapture Technology Project (ATP) is to develop technologies that can enable and/or benefit NASA science missions by significantly reducing cost, mass, and/or travel times. To accomplish this objective, the ATP identifies and prioritizes the most promising technologies using systems analysis, technology advancement and peer review, coupled with NASA Headquarters Office of Space Science target requirements. Plans are focused on developing mid-Technology Readiness Level (TRL) technologies to TRL 6 (ready for technology demonstration in space).

  13. Aerocapture Technology Developments from NASA's In-Space Propulsion Technology Program

    NASA Technical Reports Server (NTRS)

    Munk, Michelle M.; Moon, Steven A.

    2007-01-01

    This paper will explain the investment strategy, the role of detailed systems analysis, and the hardware and modeling developments that have resulted from the past 5 years of work under NASA's In-Space Propulsion Program (ISPT) Aerocapture investment area. The organizations that have been funded by ISPT over that time period received awards from a 2002 NASA Research Announcement. They are: Lockheed Martin Space Systems; Applied Research Associates, Inc.; Ball Aerospace; NASA's Ames Research Center; and NASA's Langley Research Center. Their accomplishments include improved understanding of entry aerothermal environments, particularly at Titan; demonstration of aerocapture guidance algorithm robustness at multiple bodies; manufacture and test of a 2-meter Carbon-Carbon "hot structure"; development and test of evolutionary, high-temperature structural systems with efficient ablative materials; and development of aerothermal sensors that will fly on the Mars Science Laboratory in 2009. Due in large part to this sustained ISPT support for Aerocapture, the technology is ready to be validated in flight.

  14. Preliminary studies on the planetary entry to Jupiter by aerocapture technique

    NASA Astrophysics Data System (ADS)

    Aso, Shigeru; Yasaka, Tetsuo; Hirayama, Hiroshi; Poetro, Ridanto Eko; Hatta, Shinji

    2006-10-01

    Preliminary studies of planetary entry to Jupiter by the aerocapture technique were conducted to address the technological challenges of delivering a scientific probe to Jupiter at low cost and with a smaller spacecraft mass. The Jupiter aerocapture corridor, bounded by a maximum deceleration limit of 5 g (lower corridor) and by aerocapture capture capability (upper corridor), is carefully considered and calculated. The results show that a velocity saving of about 1700 m/s due to aerocapture is possible in some cases for the spacecraft to be captured by the Jovian gravitational field. However, the results also show that Jovian aerocapture is not available in some cases; hence, careful selection is needed to realize Jovian aerocapture. Numerical simulation of aerodynamic heating to the spacecraft has also been conducted, with the DSMC method used to simulate the flow fields around the spacecraft. The transient change of drag in the Jovian atmosphere and the total heat load to the spacecraft are obtained. The results show that the estimated heat loads could be within the allowable heat load when an ablative heat shield technique is applied.
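The 5 g lower-corridor constraint follows directly from the drag deceleration relation a = rho * v^2 / (2 * beta), where beta is the ballistic coefficient. A small self-consistency check with assumed vehicle numbers (the speed and ballistic coefficient below are illustrative, not taken from the study):

```python
G0 = 9.80665   # standard gravity, m/s^2

def drag_decel_g(rho, v, beta):
    """Aerodynamic deceleration in Earth g's: a = rho * v**2 / (2 * beta),
    with ballistic coefficient beta = m / (Cd * A) in kg/m^2."""
    return 0.5 * rho * v**2 / beta / G0

# Density at which a hypothetical beta = 200 kg/m^2 vehicle at 48 km/s
# reaches the 5 g deceleration limit (illustrative numbers only).
rho_limit = 2.0 * 5.0 * G0 * 200.0 / 48000.0**2
```

Inverting the relation this way gives the deepest (densest) layer the vehicle may reach before violating the deceleration bound.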

  15. Preliminary studies on the planetary entry to Jupiter by aerocapture technique

    NASA Astrophysics Data System (ADS)

    Aso, Shigeru; Yasaka, Tetsuo; Hirayama, Hiroshi; Eko Poetro, Ridanto; Hatta, Shinji

    2003-11-01

    Preliminary studies of planetary entry to Jupiter by the aerocapture technique were conducted to address the technological challenges of delivering a scientific probe to Jupiter at low cost and with a smaller spacecraft mass. The Jupiter aerocapture corridor, bounded by a maximum deceleration limit of 5 g (lower corridor) and by aerocapture capture capability (upper corridor), is carefully considered and calculated. The results show that a velocity saving of about 1700 m/s due to aerocapture is possible in some cases for the spacecraft to be captured by the Jovian gravitational field. However, the results also show that Jovian aerocapture is not available in some cases; hence, careful selection is needed to realise Jovian aerocapture. Numerical simulation of aerodynamic heating to the spacecraft has also been conducted, with the DSMC method used to simulate the flow fields around the spacecraft. The transient change of drag in the Jovian atmosphere and the total heat load to the spacecraft are obtained. The results show that the estimated heat loads could be within the allowable heat load when an ablative heat shield technique is applied.

  16. Candidate Earth Entry Trajectories to Mimic Venus Aerocapture Using a Lifting ADEPT

    NASA Technical Reports Server (NTRS)

    Williams, Jimmy

    2017-01-01

    A Lifting ADEPT is considered for aerocapture at Venus. Analysis of the heating environment leads to an initial sizing estimate. In tandem, a direct entry profile at Earth is considered as a facsimile for the Venus aerocapture heating environment. The bounds of this direct entry profile are determined, and it is found that a trajectory from a Geostationary Transfer Orbit, flown with a Lifting ADEPT small enough to fit on a rideshare opportunity, can match certain aspects of this heating environment.

  17. Aeroshell Design Techniques for Aerocapture Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Dyke, R. Eric; Hrinda, Glenn A.

    2004-01-01

    A major goal of NASA's In-Space Propulsion Program is to shorten trip times for scientific planetary missions. To meet this challenge arrival speeds will increase, requiring significant braking for orbit insertion, and thus increased deceleration propellant mass that may exceed launch lift capabilities. A technology called aerocapture has been developed to expand the mission potential of exploratory probes destined for planets with suitable atmospheres. Aerocapture inserts a probe into planetary orbit via a single pass through the atmosphere, using the probe's aeroshell drag to reduce velocity. The benefit of an aerocapture maneuver is a large reduction in propellant mass that may result in smaller, less costly missions and reduced mission cruise times. The methodology used to design rigid aerocapture aeroshells is presented, with an emphasis on a new systems tool under development. Current methods for fast, efficient evaluations of structural systems for exploratory vehicles to planets and moons within our solar system have been under development within NASA with limited success. Many of the systems tools attempted to date applied structural mass estimation techniques based on historical data and curve fits that are difficult and cumbersome to apply to new vehicle concepts and missions. The resulting vehicle aeroshell mass may be incorrectly estimated, or may carry high margins to account for uncertainty. This new tool will reduce the guesswork previously found in conceptual aeroshell mass estimations.
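The historical-data curve fits the text describes are typically power laws fit in log space. A minimal sketch, with entirely hypothetical (entry mass, aeroshell mass) pairs standing in for a historical database:

```python
import math

# Hypothetical (entry mass kg, aeroshell mass kg) pairs -- illustrative only,
# standing in for the historical databases the abstract refers to.
DATA = [(600, 90), (1200, 160), (2400, 300), (4800, 560)]

def fit_power_law(data):
    """Least-squares fit of m_shell = a * m_entry**b, linearized in log space."""
    xs = [math.log(m) for m, _ in data]
    ys = [math.log(s) for _, s in data]
    n = len(data)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = math.exp(ybar - b * xbar)
    return a, b

def estimate_shell_mass(m_entry):
    """Aeroshell mass estimate from the fitted power law."""
    a, b = fit_power_law(DATA)
    return a * m_entry ** b
```

The limitation the abstract points out is visible here: the fit interpolates the historical points well but carries no physics, so extrapolation to new configurations is exactly the "guesswork" the new tool aims to replace.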

  18. CNES-NASA Studies of the Mars Sample Return Orbiter Aerocapture Phase

    NASA Technical Reports Server (NTRS)

    Fraysse, H.; Powell, R.; Rousseau, S.; Striepe, S.

    2000-01-01

    A Mars Sample Return (MSR) mission has been proposed as a joint CNES (Centre National d'Etudes Spatiales) and NASA effort in the ongoing Mars Exploration Program. The MSR mission is designed to return the first samples of Martian soil to Earth. The primary elements of the mission are a lander, rover, ascent vehicle, orbiter, and an Earth entry vehicle. The Orbiter has been allocated only 2700 kg on the launch phase to perform its part of the mission. This mass restriction has led to the decision to use an aerocapture maneuver at Mars for the orbiter. Aerocapture replaces the initial propulsive capture maneuver with a single atmospheric pass. This atmospheric pass will result in the proper apoapsis, but a periapsis raise maneuver is required at the first apoapsis. The use of aerocapture reduces the total mass requirement by approx. 45% for the same payload. This mission will be the first to use the aerocapture technique. Because the spacecraft is flying through the atmosphere, guidance algorithms must be developed that will autonomously provide the proper commands to reach the desired orbit while not violating any of the design parameters (e.g. maximum deceleration, maximum heating rate, etc.). The guidance algorithm must be robust enough to account for uncertainties in delivery states, atmospheric conditions, mass properties, control system performance, and aerodynamics. To study this very critical phase of the mission, a joint CNES-NASA technical working group has been formed. This group is composed of atmospheric trajectory specialists from CNES, NASA Langley Research Center and NASA Johnson Space Center. This working group is tasked with developing and testing guidance algorithms, as well as cross-validating CNES and NASA flight simulators for the Mars atmospheric entry phase of this mission. The final result will be a recommendation to CNES on the algorithm to use, and an evaluation of the flight risks associated with the algorithm. 
This paper will describe the

  19. Saturn/Titan Rendezvous: A Solar-Sail Aerocapture Mission

    NASA Technical Reports Server (NTRS)

    Matloff, Gregory L.; Taylor, Travis; Powell, Conley

    2004-01-01

    A low-mass Titan orbiter is proposed that uses conservative or optimistic solar sails for all post-Earth-escape propulsion. After accelerating the probe onto a trans-Saturn trajectory, the sail is used parachute style for Saturn capture during a pass through Saturn's outer atmosphere. If the apoapsis of the Saturn-capture orbit is appropriate, the aerocapture maneuver can later be repeated at Titan so that the spacecraft becomes a satellite of Titan. An isodensity-atmosphere model is applied to screen aerocapture trajectories. Huygens/Cassini should greatly reduce uncertainties regarding the upper atmospheres of Saturn and Titan.
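An isodensity-atmosphere model of the kind used here for screening has a convenient closed form: with constant density, dv/ds = -rho * v / (2 * beta) integrates to an exponential decay of speed with path length. A sketch (the density, path length, and ballistic coefficient below are illustrative assumptions, not values from the study):

```python
import math

def exit_speed_isodensity(v0, rho, path_len, beta):
    """Speed after a drag pass of length path_len (m) through a
    constant-density layer of rho (kg/m^3), for ballistic coefficient
    beta = m / (Cd * A). Closed form of dv/ds = -rho * v / (2 * beta)."""
    return v0 * math.exp(-rho * path_len / (2.0 * beta))

# Illustrative screening case: 30 km/s pass, 2000 km path, beta = 100 kg/m^2
v_out = exit_speed_isodensity(30000.0, 1e-5, 2e6, 100.0)
```

Because the exit speed depends only on the product rho * path_len / beta, a single chart of that parameter screens whole families of candidate trajectories, which is what makes the isodensity assumption useful for quick surveys.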

  20. PredGuid+A: Orion Entry Guidance Modified for Aerocapture

    NASA Technical Reports Server (NTRS)

    Lafleur, Jarret

    2013-01-01

    PredGuid+A software was developed to enable a unique numerical predictor-corrector aerocapture guidance capability that builds on heritage Orion entry guidance algorithms. The software can be used for both planetary entry and aerocapture applications. Furthermore, PredGuid+A implements a new Delta-V minimization guidance option that can take the place of traditional targeting guidance and can result in substantial propellant savings. PredGuid+A allows the user to set a mode flag and input a target orbit's apoapsis and periapsis. Using bank angle control, the guidance will then guide the vehicle to the appropriate post-aerocapture orbit using one of two algorithms: Apoapsis Targeting or Delta-V Minimization (as chosen by the user). Recently, the PredGuid guidance algorithm was adapted for use in skip-entry scenarios for NASA's Orion multi-purpose crew vehicle (MPCV). To leverage flight heritage, most of Orion's entry guidance routines are adapted from the Apollo program.
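The propellant the two guidance options trade against is the post-aerocapture correction; the baseline two-impulse cleanup (periapsis raise at apoapsis, then apoapsis trim at the new periapsis) can be costed with the vis-viva equation. This is a generic orbital-mechanics sketch, not PredGuid+A's internal targeting logic:

```python
import math

def visviva(mu, r, a):
    """Orbital speed at radius r on an orbit with semi-major axis a."""
    return math.sqrt(mu * (2.0 / r - 1.0 / a))

def cleanup_dv(mu, r_a_exit, r_p_exit, r_a_tgt, r_p_tgt):
    """Two-impulse post-aerocapture cleanup: raise periapsis at the exit
    apoapsis, then trim apoapsis at the new periapsis. Radii in m."""
    a0 = 0.5 * (r_a_exit + r_p_exit)     # orbit at atmospheric exit
    a1 = 0.5 * (r_a_exit + r_p_tgt)      # after burn 1 (at apoapsis)
    dv1 = abs(visviva(mu, r_a_exit, a1) - visviva(mu, r_a_exit, a0))
    a2 = 0.5 * (r_a_tgt + r_p_tgt)       # after burn 2 (at periapsis)
    dv2 = abs(visviva(mu, r_p_tgt, a2) - visviva(mu, r_p_tgt, a1))
    return dv1 + dv2
```

Apoapsis Targeting minimizes the apoapsis error term, while a Delta-V Minimization mode can instead steer to whatever exit orbit makes this total cleanup cost smallest.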

  1. Trailing Ballute Aerocapture: Concept and Feasibility Assessment

    NASA Technical Reports Server (NTRS)

    Miller, Kevin L.; Gulick, Doug; Lewis, Jake; Trochman, Bill; Stein, Jim; Lyons, Daniel T.; Wilmoth, Richard G.

    2003-01-01

    Trailing Ballute Aerocapture offers the potential to obtain orbit insertion around a planetary body at a fraction of the mass of traditional methods. This allows for lower costs for launch, faster flight times and additional mass available for science payloads. The technique involves an inflated ballute (balloon-parachute) that provides aerodynamic drag area for use in the atmosphere of a planetary body to provide for orbit insertion in a relatively benign heating environment. To account for atmospheric, navigation and other uncertainties, the ballute is oversized and detached once the desired velocity change (Delta V) has been achieved. Analysis and trades have been performed for the purpose of assessing the feasibility of the technique including aerophysics, material assessments, inflation system and deployment sequence and dynamics, configuration trades, ballute separation and trajectory analysis. Outlined is the technology development required for advancing the technique to a level that would allow it to be viable for use in space exploration missions.
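The oversizing-and-detachment logic can be sketched as a drag-pass integration with a jettison trigger on accumulated delta-V. The dynamics and numbers below are a toy stand-in (constant density, no gravity term), not the referenced analyses:

```python
def ballute_pass(v0, target_dv, rho_of_t, area_ratio, beta_core,
                 dt=0.1, t_max=600.0):
    """Toy drag history: the ballute multiplies drag area by area_ratio
    (dividing the effective ballistic coefficient); it is cut free once
    the accumulated delta-V reaches target_dv."""
    v, dv_acc, t = v0, 0.0, 0.0
    attached = True
    while t < t_max:
        beta = beta_core / area_ratio if attached else beta_core
        a = 0.5 * rho_of_t(t) * v * v / beta     # drag deceleration, m/s^2
        v -= a * dt
        dv_acc += a * dt
        t += dt
        if attached and dv_acc >= target_dv:
            attached = False    # jettison: drag area collapses to the core vehicle
    return v, dv_acc, attached

# Illustrative pass: constant thin density, 100x area ratio, 800 m/s target
v_f, dv_total, still_attached = ballute_pass(
    7000.0, 800.0, lambda t: 2e-7, 100.0, 200.0)
```

Because the ballute is oversized, the target delta-V is reached well before the nominal pass ends; dispersions then shift the jettison time rather than the achieved delta-V, which is the robustness argument made in the abstract.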

  2. Aerocapture Design Study for a Titan Polar Orbiter

    NASA Astrophysics Data System (ADS)

    Nixon, C. A.; Kirchman, F.; Esper, J.; Folta, D.; Mashiku, A.

    2016-03-01

    In 2014 a team at NASA Goddard Space Flight Center (GSFC) studied the feasibility of using active aerocapture to reduce the chemical ΔV requirements for inserting a small scientific satellite into Titan polar orbit. The scientific goals of the mission would be multi-spectral imaging and active radar mapping of Titan's surface and subsurface. The study objectives were to: (i) identify and select from launch window opportunities and refine the trajectory to Titan; (ii) study the aerocapture flight path and refine the entry corridor; (iii) design a carrier spacecraft and systems architecture; (iv) develop a scientific and engineering plan for the orbital portion of the mission. Study results include: (i) a launch in October 2021 on an Atlas V vehicle, using gravity assists from Earth and Venus to arrive at Titan in January 2031; (ii) initial aerocapture via an 8-km wide entry corridor to reach an initial 350-6000 km orbit, followed by aerobraking to reach a 350-1500 km orbit, and a periapse raise maneuver to reach a final 1500 km circular orbit; (iii) a three-part spacecraft system consisting of a cruise stage, radiator module, and orbiter inside a heat shield; (iv) a 22-month mission including station keeping to prevent orbital decay due to Saturn perturbations, with 240 Gb of compressed data returned. 
High-level issues identified include: (i) downlink capability - realistic downlink rates preclude the desired multi-spectral, global coverage of Titan's surface; (ii) power - demise of the NASA ASRG (Advanced Stirling Radioisotope Generator) program, and limited availability at present of MMRTGs (Multi-Mission Radioisotope Generators) needed for competed outer planet missions; (iii) thermal - external radiators must be carried to remove 4 kW of waste heat from MMRTGs inside the aeroshell, requiring heat pipes that pass through the aeroshell lid, compromising shielding ability; (iv) optical navigation to reach the entry corridor; (v) the NASA requirement of continuous
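The final periapse-raise maneuver quoted above (from the 350 x 1500 km aerobraked orbit to the 1500 km circular orbit) can be costed with the vis-viva equation. Using approximate Titan constants, the single apoapsis burn comes out on the order of 100-150 m/s:

```python
import math

MU_TITAN = 8.978e12      # Titan GM, m^3/s^2 (approximate)
R_TITAN = 2575.0e3       # Titan radius, m (approximate)

def visviva(mu, r, a):
    """Orbital speed at radius r on an orbit with semi-major axis a."""
    return math.sqrt(mu * (2.0 / r - 1.0 / a))

# 350 x 1500 km orbit -> burn at the 1500 km apoapsis to circularize there
r_p, r_a = R_TITAN + 350e3, R_TITAN + 1500e3
a_ellipse = 0.5 * (r_p + r_a)
dv_raise = visviva(MU_TITAN, r_a, r_a) - visviva(MU_TITAN, r_a, a_ellipse)
```

The burn is applied at apoapsis, where raising periapsis is cheapest; this is the standard final step after an aerobraking campaign.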

  3. AEROELASTIC SIMULATION TOOL FOR INFLATABLE BALLUTE AEROCAPTURE

    NASA Technical Reports Server (NTRS)

    Liever, P. A.; Sheta, E. F.; Habchi, S. D.

    2006-01-01

    A multidisciplinary analysis tool is under development for predicting the impact of aeroelastic effects on the functionality of inflatable ballute aeroassist vehicles in both the continuum and rarefied flow regimes. High-fidelity modules for continuum and rarefied aerodynamics, structural dynamics, heat transfer, and computational grid deformation are coupled in an integrated multi-physics, multi-disciplinary computing environment. This flexible and extensible approach allows the integration of state-of-the-art, stand-alone NASA and industry-leading continuum and rarefied flow solvers and structural analysis codes into a computing environment in which the modules can run concurrently with synchronized data transfer. Coupled fluid-structure continuum flow demonstrations were conducted on a clamped ballute configuration. The feasibility of implementing a DSMC flow solver in the simulation framework was demonstrated, and loosely coupled rarefied flow aeroelastic demonstrations were performed. A NASA and industry technology survey identified CFD, DSMC and structural analysis codes capable of modeling non-linear shape and material response of thin-film inflated aeroshells. The simulation technology will find direct and immediate applications with NASA and industry in ongoing aerocapture technology development programs.

  4. Physiologically constrained aerocapture for manned Mars missions

    NASA Technical Reports Server (NTRS)

    Lyne, James Evans

    1992-01-01

    Aerobraking has been proposed as a critical technology for manned missions to Mars. The variety of mission architectures currently under consideration presents aerobrake designers with an enormous range of potential entry scenarios. Two of the most important considerations in the design of an aerobrake are the required control authority (lift-to-drag ratio) and the aerothermal environment which the vehicle will encounter. Therefore, this study examined the entry corridor width and stagnation-point heating rate and load for the entire range of probable entry velocities, lift-to-drag ratios, and ballistic coefficients for capture at both Earth and Mars. To accomplish this, a peak deceleration limit for the aerocapture maneuvers had to be established. Previous studies had used a variety of load limits without adequate proof of their validity. Existing physiological and space flight data were examined, and it was concluded that a deceleration limit of 5 G was appropriate. When this load limit was applied, numerical studies showed that an aerobrake with an L/D of 0.3 could provide an entry corridor width of at least 1 degree for all Mars aerocaptures considered with entry velocities up to 9 km/s. If 10 km/s entries are required, an L/D of 0.4 to 0.5 would be necessary to maintain a corridor width of at least 1 degree. For Earth return aerocapture, a vehicle with an L/D of 0.4 to 0.5 was found to provide a corridor width of 0.7 degree or more for all entry velocities up to 14.5 km/s. Aerodynamic convective heating calculations were performed assuming a fully catalytic, 'cold' wall; radiative heating was calculated assuming that the shock layer was in thermochemical equilibrium. Heating rates were low enough for selected entries at Mars that a radiatively cooled thermal protection system might be feasible, although an ablative material would be required for most scenarios. 
    Earth return heating rates were generally more severe than those encountered by the Apollo vehicles.
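    The sensitivity of peak deceleration to entry flight-path angle, which drives the corridor analysis above, can be illustrated with a minimal planar point-mass entry integration. All constants below (Mars scale height and surface density, ballistic coefficient, entry interface altitude) are assumed illustrative values, not taken from the study.

```python
import math

def peak_deceleration(v0, gamma0_deg, beta, rho0=0.020, H=11100.0,
                      g=3.71, R=3396.2e3, h0=125e3):
    """Peak drag deceleration [m/s^2] for a ballistic planar Mars entry.
    Exponential atmosphere; beta = m/(Cd*A) [kg/m^2]. Illustrative only."""
    h, v, gamma = h0, v0, -math.radians(gamma0_deg)
    peak, dt = 0.0, 0.1
    while 0.0 < h < h0 + 5e3 and v > 200.0:   # stop on impact, skip-out, or slowdown
        rho = rho0 * math.exp(-h / H)          # exponential density model
        a_drag = 0.5 * rho * v * v / beta      # drag deceleration
        peak = max(peak, a_drag)
        v += (-a_drag - g * math.sin(gamma)) * dt
        gamma += ((v / (R + h) - g / v) * math.cos(gamma)) * dt
        h += v * math.sin(gamma) * dt
    return peak
```

    Steeper entries produce sharply higher peak loads, which is why a fixed 5 G limit translates directly into a bound on the steep side of the entry corridor.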

  5. Aerocapture Design Study for a Titan Polar Orbiter

    NASA Technical Reports Server (NTRS)

    Nixon, Conor A.; Kirchman, Frank; Esper, Jaime; Folta, David; Mashiku, Alinda

    2016-01-01

    In 2014 a team at NASA Goddard Space Flight Center (GSFC) studied the feasibility of using active aerocapture to reduce the chemical Delta V requirements for inserting a small scientific satellite into Titan polar orbit. The scientific goals of the mission would be multi-spectral imaging and active radar mapping of Titan's surface and subsurface. The study objectives were to: (i) identify and select from launch window opportunities and refine the trajectory to Titan; (ii) study the aerocapture flight path and refine the entry corridor; (iii) design a carrier spacecraft and systems architecture; (iv) develop a scientific and engineering plan for the orbital portion of the mission. Study results include: (i) a launch in October 2021 on an Atlas V vehicle, using gravity assists from Earth and Venus to arrive at Titan in January 2031; (ii) initial aerocapture via an 8-km-wide entry corridor to reach an initial 350 x 6000 km orbit, followed by aerobraking to reach a 350 x 1500 km orbit, and a periapse raise maneuver to reach a final 1500 km circular orbit; (iii) a three-part spacecraft system consisting of a cruise stage, radiator module, and orbiter inside a heat shield; (iv) a 22-month mission including station keeping to prevent orbital decay due to Saturn perturbations, with 240 Gb of compressed data returned. High-level issues identified include: (i) downlink capability - realistic downlink rates preclude the desired multi-spectral, global coverage of Titan's surface; (ii) power - demise of the NASA ASRG (Advanced Stirling Radioisotope Generator) program, and limited availability at present of MMRTGs (Multi-Mission Radioisotope Generators) needed for competed outer planet missions; (iii) thermal - external radiators must be carried to remove 4 kW of waste heat from MMRTGs inside the aeroshell, requiring heat pipes that pass through the aeroshell lid, compromising shielding ability; (iv) optical navigation to reach the entry corridor; (v) the NASA requirement of continuous

  6. An onboard navigation system which fulfills Mars aerocapture guidance requirements

    NASA Technical Reports Server (NTRS)

    Brand, Timothy J.; Fuhry, Douglas P.; Shepperd, Stanley W.

    1989-01-01

    The development of a candidate autonomous onboard Mars approach navigation scheme capable of supporting aerocapture into Mars orbit is discussed. An aerocapture guidance and navigation system which can run independently of the preaerocapture navigation was used to define a preliminary set of accuracy requirements at entry interface. These requirements are used to evaluate the proposed preaerocapture navigation scheme. This scheme uses optical sightings of Deimos with a star tracker, together with an inertial measurement unit, as the source of navigation information. Preliminary results suggest that the approach will adequately support aerocapture into Mars orbit.

  7. Angle-of-Attack-Modulated Terminal Point Control for Neptune Aerocapture

    NASA Technical Reports Server (NTRS)

    Queen, Eric M.

    2004-01-01

    An aerocapture guidance algorithm based on a calculus of variations approach is developed, using angle of attack as the primary control variable. Bank angle is used as a secondary control to alleviate angle of attack extremes and to control inclination. The guidance equations are derived in detail. The controller has very small onboard computational requirements and is robust to atmospheric and aerodynamic dispersions. The algorithm is applied to aerocapture at Neptune. Three versions of the controller are considered with varying angle of attack authority. The three versions of the controller are evaluated using Monte Carlo simulations with expected dispersions.

  8. Aerothermodynamic environment of a Titan aerocapture vehicle

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Chow, H.

    1982-01-01

    The extent of convective and radiative heating for a Titan aerocapture vehicle is investigated. The flow in the shock layer is assumed to be axisymmetric, steady, viscous, and compressible. It is further assumed that the gas is in chemical and local thermodynamic equilibrium, and the tangent slab approximation is used for the radiative transport. The effects of slip boundary conditions on the body surface and at the shock wave are included in the analysis of high-altitude entry conditions. An implicit finite-difference technique is used to solve the viscous shock-layer equations for a 45 degree sphere cone at zero angle of attack. Different compositions for the Titan atmosphere are assumed, and results are obtained for the entry conditions specified by the Jet Propulsion Laboratory.

  9. Trajectory Guidance for Mars Robotic Precursors: Aerocapture, Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Sostaric, Ronald R.; Zumwalt, Carlie; Garcia-Llama, Eduardo; Powell, Richard; Shidner, Jeremy

    2011-01-01

    Future crewed missions to Mars require improvements in landed mass capability beyond that which is possible using state-of-the-art Mars Entry, Descent, and Landing (EDL) systems. Current systems are capable of an estimated maximum landed mass of 1-1.5 metric tons (MT), while human Mars studies require 20-40 MT. A set of technologies were investigated by the EDL Systems Analysis (SA) project to assess the performance of candidate EDL architectures. A single architecture was selected for the design of a robotic precursor mission, entitled Exploration Feed Forward (EFF), whose objective is to demonstrate these technologies. In particular, inflatable aerodynamic decelerators (IADs) and supersonic retro-propulsion (SRP) have been shown to have the greatest mass benefit and extensibility to future exploration missions. In order to evaluate these technologies and develop the mission, candidate guidance algorithms have been coded into the simulation for the purposes of studying system performance. These guidance algorithms include aerocapture, entry, and powered descent. The performance of the algorithms for each of these phases in the presence of dispersions has been assessed using a Monte Carlo technique.

  10. Inflatable Aerocapture Decelerators for Mars Orbiters

    NASA Technical Reports Server (NTRS)

    Brown, Glen J.; Lingard, J. Stephen; Darley, Matthew G.; Underwood, John C.

    2007-01-01

    A multi-disciplinary research program was recently completed, sponsored by NASA Marshall Space Flight Center, on the subject of aerocapture of spacecraft weighing up to 5 metric tons at Mars. Heavier spacecraft will require deployable drag area beyond the dimensional limits of current and planned launch fairings. This research focuses on the approach of lightweight inflatable decelerators constructed with thin films, using fiber reinforcement and having a temperature limitation of 500 C. Trajectory analysis defines trajectories for a range of low ballistic coefficients for which convective heat flux is compatible with the material set. Fluid-Structure Interaction (FSI) tools are expanded to include the rarified flow regime. Several non-symmetrical configurations are evaluated for their capability to develop lift as part of the necessary trajectory control strategy. Manufacturing technology is developed for 3-D stretch forming of polyimide films and for tailored fiber reinforcement of thin films. Finally, the mass of the decelerator is estimated and compared to the mass of a traditional rigid aeroshell.
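    The packaging constraint driving this work can be made concrete with a one-line sizing relation: for a target ballistic coefficient beta = m/(Cd*A), the required deployed diameter quickly exceeds launch-fairing dimensions. The drag coefficient and target beta below are assumed for illustration only.

```python
import math

def deployed_diameter(mass_kg, beta_target, cd=1.5):
    """Diameter [m] of the circular drag area needed to reach a target
    ballistic coefficient beta = m/(Cd*A) [kg/m^2]."""
    area = mass_kg / (cd * beta_target)        # required drag area A
    return math.sqrt(4.0 * area / math.pi)     # A = pi*d^2/4, solved for d
```

    A 5-metric-ton spacecraft flown at an assumed beta of 10 kg/m^2 needs a drag diameter of roughly 20 m, far beyond any rigid aeroshell that fits inside current or planned fairings.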

  11. A Design Study of Onboard Navigation and Guidance During Aerocapture at Mars. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Fuhry, Douglas Paul

    1988-01-01

    The navigation and guidance of a high lift-to-drag ratio sample return vehicle during aerocapture at Mars are investigated. Emphasis is placed on integrated systems design, with guidance algorithm synthesis and analysis based on vehicle state and atmospheric density uncertainty estimates provided by the navigation system. The latter utilizes a Kalman filter for state vector estimation, with useful update information obtained through radar altimeter measurements and density altitude measurements based on IMU-measured drag acceleration. A three-phase guidance algorithm, featuring constant-bank numeric predictor/corrector atmospheric capture and exit phases and an extended constant-altitude cruise phase, is developed to provide controlled capture and depletion of orbital energy, orbital plane control, and exit apoapsis control. Integrated navigation and guidance system performance is analyzed using a four degree-of-freedom computer simulation. The simulation environment includes an atmospheric density model with spatially correlated perturbations to provide realistic variations over the vehicle trajectory. Navigation filter initial conditions for the analysis are based on planetary approach optical navigation results. Results from a selection of test cases are presented to give insight into systems performance.
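    The density-altitude measurement described above can be sketched in two steps: invert the drag equation for density from IMU-sensed deceleration, then map density to altitude through an exponential atmosphere model. All numbers below (vehicle properties, Mars scale height and surface density) are assumed for illustration, not taken from the thesis.

```python
import math

def density_from_drag(a_drag, v, m, cd_a):
    """Atmospheric density [kg/m^3] inferred from sensed drag acceleration:
    a_drag = 0.5 * rho * v^2 * (Cd*A) / m, solved for rho."""
    return 2.0 * m * a_drag / (cd_a * v * v)

def density_altitude(rho, rho0=0.020, H=11100.0):
    """Altitude [m] at which an exponential model atmosphere has density rho."""
    return -H * math.log(rho / rho0)
```

    The resulting "density altitude" is exactly the kind of scalar measurement a Kalman filter can fold in to correct both its state estimate and its density-bias estimate.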

  12. Aerothermodynamic Testing of Aerocapture and Planetary Probe Geometries in Hypersonic Ballistic-Range Environments

    NASA Technical Reports Server (NTRS)

    Wilder, M. C.; Reda, D. C.; Bogdanoff, D. W.; Olejniczak, J.

    2005-01-01

    A viewgraph presentation on aerothermodynamic testing of aerocapture and planetary probe design methods in hypersonic ballistic range environments is shown. The topics include: 1) Ballistic Range Testing; 2) NASA-Ames Hypervelocity Free Flight Facility; and 3) Representative Results.

  13. Earth Return Aerocapture for the TransHab/Ellipsled Vehicle

    NASA Technical Reports Server (NTRS)

    Muth, W. D.; Hoffmann, C.; Lyne, J. E.

    2000-01-01

    The current architecture being considered by NASA for a human Mars mission involves the use of an aerocapture procedure at Mars arrival and possibly upon Earth return. This technique would be used to decelerate the vehicles and insert them into their desired target orbits, thereby eliminating the need for propulsive orbital insertions. The crew may make the interplanetary journey in a large, inflatable habitat known as the TransHab. It has been proposed that upon Earth return, this habitat be captured into orbit for use on subsequent missions. In this case, the TransHab would be complimented with an aeroshell, which would protect it from heating during the atmospheric entry and provide the vehicle with aerodynamic lift. The aeroshell has been dubbed the "Ellipsled" because of its characteristic shape. This paper reports the results of a preliminary study of the aerocapture of the TransHab/Ellipsled vehicle upon Earth return. Undershoot and overshoot boundaries have been determined for a range of entry velocities, and the effects of variations in the atmospheric density profile, the vehicle deceleration limit, the maximum vehicle roll rate, the target orbit, and the vehicle ballistic coefficient have been examined. A simple, 180 degree roll maneuver was implemented in the undershoot trajectories to target the desired 407 km circular Earth orbit. A three-roll sequence was developed to target not only a specific orbital energy, but also a particular inclination, thereby decreasing propulsive inclination changes and post-aerocapture delta-V requirements. Results show that the TransHab/Ellipsled vehicle has a nominal corridor width of at least 0.7 degrees for entry speeds up to 14.0 km/s. Most trajectories were simulated using continuum flow aerodynamics, but the impact of high-altitude viscous effects was evaluated and found to be minimal. 
In addition, entry corridor comparisons have been made between the TransHab/Ellipsled and a modified Apollo capsule which is also

  14. Aerocapture for manned Mars missions - Status and challenges

    NASA Astrophysics Data System (ADS)

    Walberg, Gerald D.

    1991-08-01

    The current status of aerocapture for manned Mars missions and the associated challenges are summarized. Mission benefits are considered to increase with increasing Mars entry velocity. However, significant benefits accrue at moderate entry velocities between 7 and 8 km/sec, which is the realistically achievable range in view of g-limits and heating constraints. Blunt vehicles with low ballistic coefficients (mass per unit drag area) and L/Ds from 0.3 to 0.5 are found to be the preferred configurations, taking into account their adequate control authority and good payload packaging characteristics. The overall design characteristics of Mars aerocapture vehicles can be established with good confidence, using flight and ground test data and state-of-the-art flow-field analysis techniques. The principal challenges are identified as follows: to refine knowledge of the Martian atmosphere in order to reduce design conservatism, to extend present stagnation-region heating analyses to the entire vehicle forebody, and to develop reflective low-wall-catalycity TPS systems to enable reusable vehicles.

  15. Aerocapture for manned Mars missions - Status and challenges

    NASA Technical Reports Server (NTRS)

    Walberg, Gerald D.

    1991-01-01

    The current status of aerocapture for manned Mars missions and the associated challenges are summarized. Mission benefits are considered to increase with increasing Mars entry velocity. However, significant benefits accrue at moderate entry velocities between 7 and 8 km/sec, which is the realistically achievable range in view of g-limits and heating constraints. Blunt vehicles with low ballistic coefficients (mass per unit drag area) and L/Ds from 0.3 to 0.5 are found to be the preferred configurations, taking into account their adequate control authority and good payload packaging characteristics. The overall design characteristics of Mars aerocapture vehicles can be established with good confidence, using flight and ground test data and state-of-the-art flow-field analysis techniques. The principal challenges are identified as follows: to refine knowledge of the Martian atmosphere in order to reduce design conservatism, to extend present stagnation-region heating analyses to the entire vehicle forebody, and to develop reflective low-wall-catalycity TPS systems to enable reusable vehicles.

  16. Aerocapture Inflatable Decelerator (AID)

    NASA Technical Reports Server (NTRS)

    Reza, Sajjad

    2007-01-01

    Forward Attached Inflatable Decelerators, more commonly known as inflatable aeroshells, provide an effective, cost-efficient means of decelerating spacecraft by using atmospheric drag for aerocapture or planetary entry instead of conventional liquid propulsion deceleration systems. Entry into planetary atmospheres results in significant heating and aerodynamic pressures which stress aeroshell systems to their useful limits. Incorporation of lightweight inflatable decelerator surfaces with increased surface-area footprints provides the opportunity to reduce heat flux and induced temperatures, while increasing the payload mass fraction. Furthermore, inflatable aeroshell decelerators provide the needed deceleration at considerably higher altitudes and Mach numbers when compared with conventional rigid aeroshell entry systems. Inflatable aeroshells also provide for stowage in a compact space, with subsequent deployment of a large-area, lightweight heatshield to survive entry heating. Use of a deployable heatshield decelerator not only enables an increase in the spacecraft payload mass fraction but may also eliminate the need for a spacecraft backshell and cruise stage. This document is the viewgraph slides for the paper's presentation.

  17. Post-aerocapture orbit selection and maintenance for the Aerofast mission to Mars

    NASA Astrophysics Data System (ADS)

    Pontani, Mauro; Teofilatto, Paolo

    2012-10-01

    Aerofast is the abbreviation of “aerocapture for future space transportation” and represents a project aimed at developing aerocapture techniques with regard to an interplanetary mission to Mars, in the context of the 7th Framework Program, with the financial support of the European Union. This paper describes the fundamental characteristics of the operational orbit after aerocapture for the mission of interest, as well as the related maintenance strategy. The final orbit selection depends on the desired lighting conditions, maximum revisit time of specific target regions, and feasibility of the orbit maintenance strategy. A sun-synchronous, frozen, repeating-ground-track orbit is chosen. First, the period of repetition is such that adjacent ascending node crossings (over the Mars surface) have a separation compatible with the swath of the optical payload. Secondly, the sun-synchronism condition ensures that a given latitude is periodically visited at the same local time, a condition essential for comparing images of the same region at different epochs. Lastly, the fulfillment of the frozen condition guarantees improved orbit stability with respect to perturbations due to the zonal harmonics of Mars' gravitational field. These three fundamental features of the operational orbit lead to determining its mean orbital elements. The evaluation of short and long period effects (e.g., those due to the sectorial harmonics of the gravitational field or to the aerodynamic drag) requires the determination of the osculating orbital elements at an initial reference time. This research describes a simple and accurate approach that leads to numerically determining these initial values, without employing complicated analytical developments. Numerical simulations demonstrate the long-period stability of the orbit when a significant number of harmonics of the gravitational field are taken into account. However, aerodynamic drag produces a relatively slow orbital decay at the
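    The sun-synchronism condition described above pins the mean inclination once the orbit altitude is chosen: the J2-driven nodal regression rate must match Mars' mean motion about the Sun. A minimal sketch for a circular orbit, using assumed Mars constants rather than values from the paper:

```python
import math

# Mars constants (assumed values for illustration)
MU = 4.2828e13                  # gravitational parameter [m^3/s^2]
J2 = 1.96045e-3                 # J2 zonal harmonic
R_MARS = 3396.2e3               # equatorial radius [m]
MARS_YEAR = 686.98 * 86400.0    # Mars year [s]

def sunsync_inclination(alt_m):
    """Inclination [deg] at which J2 nodal precession equals Mars' mean
    motion around the Sun (circular orbit assumed)."""
    a = R_MARS + alt_m
    n = math.sqrt(MU / a**3)                    # orbital mean motion [rad/s]
    node_rate_req = 2.0 * math.pi / MARS_YEAR   # required node rate [rad/s]
    # node_rate = -1.5 * n * J2 * (R/a)^2 * cos(i), solved for cos(i)
    cos_i = -node_rate_req / (1.5 * n * J2 * (R_MARS / a)**2)
    return math.degrees(math.acos(cos_i))
```

    The negative cosine means the orbit must be retrograde; for a low Mars orbit near 400 km altitude this works out to an inclination just under 93 degrees.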

  18. Technology requirements for a generic aerocapture system. [for atmospheric entry

    NASA Technical Reports Server (NTRS)

    Cruz, M. I.

    1980-01-01

    The technology requirements for the design of a generic aerocapture vehicle system are summarized. These spacecraft have the capability of completely eliminating fuel-costly retropropulsion for planetary orbit capture through a single aerodynamically controlled atmospheric braking pass from a hyperbolic trajectory into a near circular orbit. This generic system has application at both the inner and outer planets. Spacecraft design integration, navigation, communications, and aerothermal protection system design problems were assessed in the technology requirements study and are discussed in this paper.

  19. Aerocapture Inflatable Decelerator for Planetary Entry

    NASA Technical Reports Server (NTRS)

    Reza, Sajjad; Hund, Richard; Kustas, Frank; Willcockson, William; Songer, Jarvis; Brown, Glen

    2007-01-01

    Forward Attached Inflatable Decelerators, more commonly known as inflatable aeroshells, provide an effective, cost-efficient means of decelerating spacecraft by using atmospheric drag for aerocapture or planetary entry instead of conventional liquid propulsion deceleration systems. Entry into planetary atmospheres results in significant heating and aerodynamic pressures which stress aeroshell systems to their useful limits. Incorporation of lightweight inflatable decelerator surfaces with increased surface-area footprints provides the opportunity to reduce heat flux and induced temperatures, while increasing the payload mass fraction. Furthermore, inflatable aeroshell decelerators provide the needed deceleration at considerably higher altitudes and Mach numbers when compared with conventional rigid aeroshell entry systems. Inflatable aeroshells also provide for stowage in a compact space, with subsequent deployment of a large-area, lightweight heatshield to survive entry heating. Use of a deployable heatshield decelerator enables an increase in the spacecraft payload mass fraction and may eliminate the need for a spacecraft backshell.

  20. Blended control, predictor-corrector guidance algorithm: an enabling technology for Mars aerocapture.

    PubMed

    Jits, Roman Y; Walberg, Gerald D

    2004-03-01

    A guidance scheme designed for coping with significant dispersion in the vehicle's state and atmospheric conditions is presented. In order to expand the flyable aerocapture envelope, control of the vehicle is realized through bank angle and angle-of-attack modulation. Thus, blended control of the vehicle is achieved, where the lateral and vertical motions of the vehicle are decoupled. The overall implementation approach is described, together with the guidance algorithm macrologic and structure. Results of guidance algorithm tests in the presence of various single and multiple off-nominal conditions are presented and discussed.
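    The predictor-corrector structure underlying such schemes - propagate to atmospheric exit, compare the predicted apoapsis to the target, correct the command - can be sketched generically. The predictor below is a toy monotone stand-in, not the paper's blended-control dynamics; in flight it would be a numerical integration of the equations of motion.

```python
import math

def predicted_apoapsis(bank_deg):
    """Toy stand-in for the onboard predictor.  Full lift-up (0 deg) yields
    the highest exit apoapsis, full lift-down (180 deg) the lowest."""
    return 50e3 + 5.9e6 * (1.0 + math.cos(math.radians(bank_deg))) / 2.0

def correct_bank(target_apo, lo=0.0, hi=180.0, iters=50):
    """Corrector: bisect on bank angle until the predicted apoapsis matches
    the target (the predictor is monotone decreasing in bank angle)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if predicted_apoapsis(mid) > target_apo:
            lo = mid    # overshooting the target: bank over further
        else:
            hi = mid    # undershooting: roll toward lift-up
    return 0.5 * (lo + hi)
```

    A real guidance cycle repeats this predict-correct loop every few seconds as the measured state evolves; blended control adds angle-of-attack as a second correction variable.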

  1. Entry, Descent, and Landing technological barriers and crewed MARS vehicle performance analysis

    NASA Astrophysics Data System (ADS)

    Subrahmanyam, Prabhakar; Rasky, Daniel

    2017-05-01

    Mars has been explored historically only by robotic craft, but a crewed mission encompasses several new engineering challenges - high ballistic coefficient entry, hypersonic decelerators, guided entry for reaching intended destinations within acceptable margins for error in the landing ellipse, and payload mass are all critical factors for evaluation. A comprehensive EDL parametric analysis has been conducted in support of a high mass landing architecture by evaluating three types of vehicles - a 70° sphere-cone, an ellipsled, and the SpaceX hybrid architecture called Red Dragon - as potential candidate options for crewed entry vehicles. Aerocapture into a Martian orbit of about 400 km and subsequent entry-from-orbit scenarios were investigated at velocities of 6.75 km/s and 4 km/s, respectively. A study on the aerocapture corridor over a range of entry velocities (6-9 km/s) suggests that a hypersonic L/D of 0.3 is sufficient for a Martian aerocapture. Parametric studies conducted by varying aeroshell diameters from 10 m to 15 m for several entry masses up to 150 mt are summarized, and results reveal that vehicles with entry masses in the range of about 40-80 mt are capable of delivering cargo with a mass on the order of 5-20 mt. For vehicles with an entry mass of 20 mt to 80 mt, probabilistic Monte Carlo analyses of 5000 cases for each vehicle were run to determine the final landing ellipse and to quantify the statistical uncertainties associated with the trajectory and attitude conditions during atmospheric entry. Strategies and current technological challenges for a human-rated Entry, Descent, and Landing to the Martian surface are presented in this study.
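    The landing-ellipse statistic quoted above is typically extracted from the dispersed touchdown points by diagonalizing their 2x2 covariance matrix. A minimal sketch with synthetic (assumed) downrange/crossrange dispersions in kilometers:

```python
import math, random

def landing_ellipse(points, nsig=3.0):
    """Semi-axes of the n-sigma dispersion ellipse from Monte Carlo
    landing points given as (downrange, crossrange) pairs."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # eigenvalues of the 2x2 covariance matrix (closed form)
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam1 = tr / 2 + math.sqrt(tr * tr / 4 - det)
    lam2 = tr / 2 - math.sqrt(tr * tr / 4 - det)
    return nsig * math.sqrt(lam1), nsig * math.sqrt(lam2)

random.seed(1)
pts = [(random.gauss(0.0, 4.0), random.gauss(0.0, 1.5)) for _ in range(5000)]
major, minor = landing_ellipse(pts)   # 3-sigma semi-axes [km]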

  2. Enhancement of the Natural Earth Satellite Population Through Meteoroid Aerocapture

    NASA Technical Reports Server (NTRS)

    Moorhead, Althea V.; Cooke, William J.

    2014-01-01

    The vast majority of meteoroids either fall to the ground as meteorites or ablate completely in the atmosphere. However, large meteoroids have been observed to pass through the atmosphere and reenter space in a few instances. These atmosphere-grazing meteoroids have been characterized using ground-based observation and satellite-based infrared detection. As these methods become more sensitive, smaller atmosphere-grazing meteoroids will likely be detected. In anticipation of this increased detection rate, we compute the frequency with which centimeter-sized meteoroids graze and exit Earth's atmosphere. We characterize the post-atmosphere orbital characteristics of these bodies and conduct numerical simulations of their orbital evolution under the perturbing influence of the Sun and Moon. We find that a small subset of aerocaptured meteoroids are perturbed away from immediate atmospheric reentry and become temporary natural Earth satellites.
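    Whether a grazing meteoroid leaves the atmosphere captured comes down to the sign of its specific orbital energy at exit. A minimal check using the standard two-body energy (exit-state values below are assumed for illustration):

```python
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter [m^3/s^2]

def is_captured(v_exit, r_exit):
    """True if the exit state is gravitationally bound to Earth, i.e. the
    specific orbital energy  v^2/2 - mu/r  is negative."""
    return 0.5 * v_exit ** 2 - MU_EARTH / r_exit < 0.0
```

    At a 100 km exit altitude the local escape speed is about 11.1 km/s, so a pass that drains speed below that value leaves the meteoroid as a (possibly temporary) Earth satellite.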

  3. Physiological constraints on deceleration during the aerocapture of manned vehicles

    NASA Technical Reports Server (NTRS)

    Lyne, J. E.

    1992-01-01

    The peak deceleration load allowed for aerobraking of manned vehicles is a critical parameter in planning future excursions to Mars. However, considerable variation exists in the limits used by various investigators. The goal of this study was to determine the most appropriate level for this limit. Methods: Since previous U.S. space flights had been limited to 84 days in duration, Soviet flight results were examined. Published details of Soviet entry trajectories were not available. However, personal communication with Soviet cosmonauts suggested that peak entry loads of 5-6 G had been encountered upon return from 8 months in orbit. The Soyuz entry capsule's characteristics were established and the capsule's entry trajectory was numerically calculated. The results confirm a peak load of 5 to 6 G. Results: Although the Soviet flights were of shorter duration than expected Mars missions, evidence exists that the deceleration experience is applicable. G tolerance has been shown to stabilize after 1 to 3 months in space if adequate countermeasures are used. The calculated Soyuz deceleration histories are graphically compared with those expected for Mars aerobraking. Conclusions: Previous spaceflight experience supports the use of a 5 G limit for the aerocapture of a manned vehicle at Mars.
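    The 5-6 G figure recovered above is consistent with the classical Allen-Eggers result for ballistic entry into an exponential atmosphere, in which peak deceleration depends only on entry speed, flight-path angle, and scale height (not on the ballistic coefficient). The Soyuz actually flies a mildly lifting entry, so this is only a rough cross-check, and the entry conditions below are assumed:

```python
import math

def allen_eggers_peak_g(v_entry, gamma_deg, H=7200.0, g0=9.81):
    """Peak deceleration [Earth g] for ballistic entry into an exponential
    atmosphere (Allen-Eggers): a_max = V^2 * sin(gamma) / (2*e*H)."""
    return v_entry ** 2 * math.sin(math.radians(gamma_deg)) / (2.0 * math.e * H) / g0
```

    For an entry near 7.7 km/s at a flight-path angle around 2 degrees, the formula lands squarely in the 5-6 G range cited in the abstract.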

  4. Numerical Roll Reversal Predictor Corrector Aerocapture and Precision Landing Guidance Algorithms for the Mars Surveyor Program 2001 Missions

    NASA Technical Reports Server (NTRS)

    Powell, Richard W.

    1998-01-01

    This paper describes the development and evaluation of a numerical roll reversal predictor-corrector guidance algorithm for the atmospheric flight portion of the Mars Surveyor Program 2001 Orbiter and Lander missions. The Lander mission utilizes direct entry and has a demanding requirement to deploy its parachute within 10 km of the target deployment point. The Orbiter mission utilizes aerocapture to achieve a precise captured orbit with a single atmospheric pass. Detailed descriptions of these predictor-corrector algorithms are given. Also, results of three and six degree-of-freedom Monte Carlo simulations which include navigation, aerodynamics, mass properties and atmospheric density uncertainties are presented.
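    The Monte Carlo evaluation described above reduces, in skeleton form, to sampling dispersions and counting successes. The corridor bounds and delivery accuracy below are assumed placeholders, not the missions' values:

```python
import random

def monte_carlo_success(n_runs=2000, corridor=(-11.2, -10.4), seed=42):
    """Skeleton of a dispersion analysis: sample delivery flight-path-angle
    errors [deg] and count cases inside an assumed entry corridor."""
    random.seed(seed)
    nominal, sigma = -10.8, 0.15   # assumed delivery accuracy (1-sigma)
    hits = sum(corridor[0] <= random.gauss(nominal, sigma) <= corridor[1]
               for _ in range(n_runs))
    return 100.0 * hits / n_runs   # success rate [%]
```

    A full analysis replaces the one-line success test with a 3- or 6-DOF trajectory propagation per sample, dispersing aerodynamics, mass properties, and atmospheric density as well.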

  5. HyperPASS, a New Aeroassist Tool

    NASA Technical Reports Server (NTRS)

    Gates, Kristin; McRonald, Angus; Nock, Kerry

    2005-01-01

    A new software tool designed to perform aeroassist studies has been developed by Global Aerospace Corporation (GAC). The Hypersonic Planetary Aeroassist Simulation System (HyperPASS) [1] enables users to perform guided aerocapture, guided ballute aerocapture, aerobraking, orbit decay, or unguided entry simulations at any of six target bodies (Venus, Earth, Mars, Jupiter, Titan, or Neptune). HyperPASS is currently being used for trade studies to investigate (1) aerocapture performance with alternate aeroshell types, varying flight path angle and entry velocity, different g-load and heating limits, and angle-of-attack and angle-of-bank variations; (2) variable, attached ballute geometry; (3) railgun-launched projectile trajectories; and (4) preliminary orbit decay evolution. After completing a simulation, there are numerous visualization options in which data can be plotted, saved, or exported to various formats. Several analysis examples will be described.
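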

  6. Testing of Flexible Ballutes in Hypersonic Wind Tunnels for Planetary Aerocapture

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    2007-01-01

    Studies were conducted for the In-Space Propulsion (ISP) Ultralightweight Ballute Technology Development Program to increase the technical readiness level of inflatable decelerator systems for planetary aerocapture. The present experimental study was conducted to develop the capability for testing lightweight, flexible materials in hypersonic facilities. The primary objectives were to evaluate advanced polymer film materials in a high-temperature, high-speed flow environment and provide experimental data for comparisons with fluid-structure interaction modeling tools. Experimental testing was conducted in the Langley Aerothermodynamics Laboratory 20-Inch Hypersonic CF4 and 31-Inch Mach 10 Air blowdown wind tunnels. Quantitative flexure measurements were made for 60 degree half angle afterbody-attached ballutes, in which polyimide films (1-mil and 3-mil thick) were clamped between a 1/2-inch diameter disk and a base ring (4-inch and 6-inch diameters). Deflection measurements were made using a parallel light silhouette of the film surface through an existing schlieren optical system. The purpose of this paper is to discuss these results as well as free-flying testing techniques being developed for both an afterbody-attached and trailing toroidal ballute configuration to determine dynamic fluid-structural stability. Methods for measuring polymer film temperature were also explored using both temperature sensitive paints (for up to 370 C) and laser-etched thin-film gages.

  7. Testing of Flexible Ballutes in Hypersonic Wind Tunnels for Planetary Aerocapture

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    2006-01-01

    Studies were conducted for the In-Space Propulsion (ISP) Ultralightweight Ballute Technology Development Program to increase the technical readiness level of inflatable decelerator systems for planetary aerocapture. The present experimental study was conducted to develop the capability for testing lightweight, flexible materials in hypersonic facilities. The primary objectives were to evaluate advanced polymer film materials in a high-temperature, high-speed flow environment and provide experimental data for comparisons with fluid-structure interaction modeling tools. Experimental testing was conducted in the Langley Aerothermodynamics Laboratory 20-Inch Hypersonic CF4 and 31-Inch Mach 10 Air blowdown wind tunnels. Quantitative flexure measurements were made for 60 degree half angle afterbody-attached ballutes, in which polyimide films (1-mil and 3-mil thick) were clamped between a 1/2-inch diameter disk and a base ring (4-inch and 6-inch diameters). Deflection measurements were made using a parallel light silhouette of the film surface through an existing schlieren optical system. The purpose of this paper is to discuss these results as well as free-flying testing techniques being developed for both an afterbody-attached and trailing toroidal ballute configuration to determine dynamic fluid-structural stability. Methods for measuring polymer film temperature were also explored using both temperature sensitive paints (for up to 370 C) and laser-etched thin-film gages.

  8. Aerothermodynamic environments for Mars entry, Mars return, and lunar return aerobraking missions

    NASA Astrophysics Data System (ADS)

    Rochelle, W. C.; Bouslog, S. A.; Ting, P. C.; Curry, D. M.

    1990-06-01

    The aeroheating environments for vehicles undergoing Mars aerocapture, Earth aerocapture from Mars, and Earth aerocapture from the Moon are presented. An engineering approach for the analysis of various types of vehicles and trajectories was taken, rather than performing a benchmark computation for a specific point at a selected time in a trajectory. Radiative heating during Mars aerocapture for the Mars Rover Sample Return (MRSR) 2-ft nose radius biconic remains a small contributor to heating for 6 to 10 km/sec; at 12 km/sec, however, it becomes comparable to the convective heating. For Earth aerocapture returning from Mars, peak radiative heating for the MRSR SRC is only 25 percent of the peak convective heating for the 12-km/sec trajectory. However, when large vehicles are considered on this trajectory, peak radiative heating can become 2 to 4 times higher than the peak convective heating. For both Mars entry and return, a partially ablative Thermal Protection System (TPS) would be required, but for Lunar Transfer Vehicle return an all-reusable TPS can be used.
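The convective-versus-radiative trade described above can be illustrated with the Sutton-Graves stagnation-point correlation, a standard engineering relation that is not taken from this record; the constant below is the commonly quoted Earth-air value, and the flight condition is illustrative only.

```python
import math

SUTTON_GRAVES_K = 1.7415e-4  # Earth-air constant; SI inputs give q in W/m^2

def stagnation_heating(rho, v, rn):
    """Convective stagnation-point heating q = k*sqrt(rho/rn)*v^3, in W/cm^2.

    rho: free-stream density, kg/m^3; v: velocity, m/s; rn: nose radius, m.
    The v^3 scaling is why radiative heating, which grows even faster with
    speed, overtakes convection at the higher entry velocities noted above.
    """
    return SUTTON_GRAVES_K * math.sqrt(rho / rn) * v**3 / 1e4  # W/m^2 -> W/cm^2

# Illustrative high-speed Earth-return condition (assumed, not from the record):
q = stagnation_heating(rho=3e-4, v=12.6e3, rn=0.23)  # on the order of 1e3 W/cm^2
```

Note how doubling the entry speed raises the convective rate eightfold, while a radiative model would rise faster still, matching the crossover behavior the abstract reports near 12 km/sec.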

  9. Atmospheric Models for Aeroentry and Aeroassist

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta; Keller, Vernon W.

    2005-01-01

    Eight destinations in the Solar System have sufficient atmosphere for aeroentry, aeroassist, or aerobraking/aerocapture: Venus, Earth, Mars, Jupiter, Saturn, Uranus, and Neptune, plus Saturn's moon Titan. Engineering-level atmospheric models for Earth, Mars, Titan, and Neptune have been developed for use in NASA's systems analysis studies of aerocapture applications. Development has begun on a similar atmospheric model for Venus. An important capability of these models is simulation of quasi-random perturbations for Monte Carlo analyses in developing guidance, navigation and control algorithms, and for thermal systems design. Characteristics of these atmospheric models are compared, and example applications for aerocapture are presented. Recent Titan atmospheric model updates are discussed, in anticipation of applications for trajectory and atmospheric reconstruction of the Huygens Probe entry at Titan. Recent and planned updates to the Mars atmospheric model, in support of future Mars aerocapture systems analysis studies, are also presented.

  10. Atmospheric Models for Aeroentry and Aeroassist

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta; Keller, Vernon W.

    2004-01-01

    Eight destinations in the Solar System have sufficient atmosphere for aeroentry, aeroassist, or aerobraking/aerocapture: Venus, Earth, Mars, Jupiter, Saturn, Uranus, and Neptune, plus Saturn's moon Titan. Engineering-level atmospheric models for Earth, Mars, Titan, and Neptune have been developed for use in NASA's systems analysis studies of aerocapture applications. Development has begun on a similar atmospheric model for Venus. An important capability of these models is simulation of quasi-random perturbations for Monte Carlo analyses in developing guidance, navigation and control algorithms, and for thermal systems design. Characteristics of these atmospheric models are compared, and example applications for aerocapture are presented. Recent Titan atmospheric model updates are discussed, in anticipation of applications for trajectory and atmospheric reconstruction of the Huygens Probe entry at Titan. Recent and planned updates to the Mars atmospheric model, in support of future Mars aerocapture systems analysis studies, are also presented.

  11. Titan Flagship Mission 3-Degree-of-Freedom Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Prince, Jill L.; Powell, R. W.; Lockwood, Mary Kae

    2008-01-01

    A NASA flagship mission to Titan, the largest moon of Saturn and the only moon in the solar system with a significant atmosphere, has been designed that uses three separate spacecraft, each requiring significant interaction with the atmosphere. The first vehicle is a Titan lander for lower-atmosphere and surface science. The second is an aerial vehicle for aerial science at approximately 10 km altitude with an expected lifetime of one year; this spacecraft will use the natural winds of Titan to cover a large area over its lifetime. The third vehicle is a Titan orbiter that will interact with the atmosphere in two respects. The first atmospheric interaction is the orbital insertion maneuver, accomplished using aerocapture, during which the hyperbolic approach speed of 6.5 km/s will be reduced to 1.6 km/s over 41 minutes with an exit periapsis altitude of 130 km. The second atmospheric interaction occurs after a propulsive maneuver has raised the post-aerocapture periapsis to 1170 km, where the atmosphere will be sampled over several months; this first aerosampling phase covers southern latitudes. After a 3.3-year circular science phase at an altitude of 1700 km, a second aerosampling phase covers northern latitudes. The atmospheric trajectory analysis for these three spacecraft is discussed throughout this paper.
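The orbiter's aerocapture figures above set the scale of the maneuver. A quick back-of-envelope check, using only the numbers quoted in the record, gives the speed shed aerodynamically and the average deceleration over the pass:

```python
# Figures from the record: 6.5 km/s hyperbolic approach reduced to 1.6 km/s
# over a 41-minute atmospheric pass.
v_in, v_out = 6.5e3, 1.6e3        # m/s
duration = 41 * 60                # s

dv = v_in - v_out                 # speed shed by drag instead of propellant
mean_decel = dv / duration        # average deceleration (peak is much higher)
print(f"{dv/1e3:.1f} km/s shed, mean decel {mean_decel:.2f} m/s^2 "
      f"({mean_decel/9.81:.2f} g)")  # -> 4.9 km/s shed, mean decel 1.99 m/s^2 (0.20 g)
```

Nearly 5 km/s of delta-V is supplied by the atmosphere, which is what makes aerocapture attractive against an all-propulsive orbit insertion.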

  12. An Aeroelastic Analysis of a Thin Flexible Membrane

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Bartels, Robert E.; Kandil, Osama A.

    2007-01-01

    Studies have shown that significant vehicle mass and cost savings are possible with the use of ballutes for aerocapture. Through NASA's In-Space Propulsion program, a preliminary examination of ballute sensitivity to geometry and Reynolds number was conducted, and a single-pass coupling between an aero code and a finite element solver was used to assess the static aeroelastic effects. There remain, however, a variety of open questions regarding the dynamic aeroelastic stability of membrane structures for aerocapture, with the primary challenge being the prediction of membrane flutter onset. The purpose of this paper is to describe and begin addressing these issues. The paper includes a review of the literature associated with the structural analysis of membranes and membrane flutter. Flow/structure analysis coupling and hypersonic flow solver options are also discussed. An approach is proposed for tackling this problem that starts with a relatively simple geometry and develops and evaluates analysis methods and procedures. This preliminary study considers a computationally manageable two-dimensional problem. The membrane structural models used in the paper include a nonlinear finite-difference model for static and dynamic analysis and a NASTRAN finite element membrane model for nonlinear static and linear normal modes analysis. Both structural models are coupled with a structured compressible flow solver for static aeroelastic analysis. For dynamic aeroelastic analyses, the NASTRAN normal modes are used in the structured compressible flow solver, and third-order piston theories are used with the finite-difference membrane model to simulate flutter onset. Results from the various static and dynamic aeroelastic analyses are compared.
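Piston theory, used above for flutter-onset simulation, approximates the unsteady surface pressure from the local downwash alone. A first-order sketch follows (the paper itself used up to third order, and the free-stream and slope values here are invented for illustration):

```python
def piston_pressure(rho_inf, a_inf, u_inf, dwdt, dwdx):
    """First-order piston theory: each surface element acts like a piston
    pushing on the flow,
        dp = rho_inf * a_inf * (dw/dt + U_inf * dw/dx),
    where w is the membrane deflection normal to the free stream."""
    return rho_inf * a_inf * (dwdt + u_inf * dwdx)

# Static membrane bump with a 1% slope in a thin, fast stream
# (all values illustrative, not from the paper):
dp = piston_pressure(rho_inf=1e-3, a_inf=300.0, u_inf=3000.0, dwdt=0.0, dwdx=0.01)
```

In a flutter analysis this pressure closes the loop: deflection produces pressure, pressure feeds back into the structural equations, and instability appears when the coupled motion grows.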

  13. Trajectory and Aeroheating Environment Development and Sensitivity Analysis for Capsule-shaped Vehicles

    NASA Technical Reports Server (NTRS)

    Robinson, Jeffrey S.; Wurster, Kathryn E.

    2006-01-01

    Recently, NASA's Exploration Systems Research and Technology Project funded several tasks that endeavored to develop and evaluate various thermal protection systems and high temperature material concepts for potential use on the crew exploration vehicle. In support of these tasks, NASA Langley's Vehicle Analysis Branch generated trajectory information and associated aeroheating environments for more than 60 unique entry cases. Using the Apollo Command Module as the baseline entry system because of its relevance to the favored crew exploration vehicle design, trajectories were developed for a range of lunar and Mars return, direct and aerocapture Earth-entry scenarios. For direct entry, a matrix of cases was created that reflects reasonably expected minimum and maximum values of vehicle ballistic coefficient, inertial velocity at entry interface, and inertial flight path angle at entry interface. For aerocapture, trajectories were generated for a range of values of initial velocity and ballistic coefficient that, when combined with proper initial flight path angles, resulted in achieving a low Earth orbit by employing either a full lift-vector-up or full lift-vector-down attitude. For each trajectory generated, aeroheating environments were developed that were intended to bound the thermal protection system requirements for likely crew exploration vehicle concepts. The trades examined clearly pointed to a range of missions/concepts that will require ablative systems, as well as a range for which reusable systems may be feasible. In addition, the results clearly indicated those entry conditions and modes suitable for manned flight, considering vehicle deceleration levels experienced during entry. This paper presents an overview of the analysis performed, including the assumptions, methods, and general approach used, as well as a summary of the trajectory and aerothermal environment information that was generated.
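Ballistic coefficient, the key trade variable in the matrix above, is simply beta = m / (Cd * A). A minimal sketch with rough Apollo Command Module numbers, assumed here for illustration rather than taken from the record:

```python
import math

def ballistic_coefficient(mass_kg, cd, diameter_m):
    """beta = m / (Cd * A), in kg/m^2. Higher beta means less deceleration
    per unit dynamic pressure, so the vehicle penetrates deeper into the
    atmosphere before slowing."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return mass_kg / (cd * area)

# Approximate Apollo CM values (illustrative): 5500 kg, Cd ~1.3, 3.9 m diameter.
beta = ballistic_coefficient(mass_kg=5500.0, cd=1.3, diameter_m=3.9)  # ~350 kg/m^2
```

Sweeping mass, drag coefficient, or diameter through their expected extremes is how a matrix of minimum and maximum beta cases like the one described above is built.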

  14. GRAM Series of Atmospheric Models for Aeroentry and Aeroassist

    NASA Technical Reports Server (NTRS)

    Duvall, Aleta; Justus, C. G.; Keller, Vernon W.

    2005-01-01

    The eight destinations in the Solar System with sufficient atmosphere for either aeroentry or aeroassist, including aerocapture, are: Venus, Earth, Mars, Jupiter, Saturn, Uranus, and Neptune, plus Saturn's moon Titan. Engineering-level atmospheric models for four of these (Earth, Mars, Titan, and Neptune) have been developed for use in NASA's systems analysis studies of aerocapture applications in potential future missions. Work has recently commenced on development of a similar atmospheric model for Venus. This series of MSFC-sponsored models is identified as the Global Reference Atmosphere Model (GRAM) series. An important capability of all of the models in the GRAM series is their ability to simulate quasi-random perturbations for Monte Carlo analyses in developing guidance, navigation and control algorithms, and for thermal systems design. Example applications for Earth aeroentry and Mars aerocapture systems analysis studies are presented and illustrated. Current and planned updates to the Earth and Mars atmospheric models, in support of NASA's new exploration vision, are also presented.
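The quasi-random perturbation capability works by adding spatially correlated random deviations to the mean atmosphere along a trajectory. A toy first-order Gauss-Markov sketch in that spirit follows; this is not the actual GRAM formulation, and the 5% sigma and correlation length are assumed values:

```python
import math
import random

def density_perturbations(n, sigma=0.05, corr_length=10.0, step=1.0, seed=42):
    """Exponentially correlated fractional density perturbations.

    sigma: stationary standard deviation (assumed 5% here);
    corr_length/step set how quickly samples decorrelate along the path.
    """
    rng = random.Random(seed)
    phi = math.exp(-step / corr_length)         # step-to-step correlation
    drive = sigma * math.sqrt(1.0 - phi * phi)  # keeps the variance stationary
    x, series = 0.0, []
    for _ in range(n):
        x = phi * x + drive * rng.gauss(0.0, 1.0)
        series.append(x)
    return series

perts = density_perturbations(1000)  # multiply mean density by (1 + perts[i])
```

Each Monte Carlo trajectory sample draws a fresh perturbation sequence, which is what stresses a guidance algorithm against correlated density waves rather than uncorrelated noise.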

  15. Parametric Study of Biconic Re-Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Steele, Bryan; Banks, Daniel W.; Whitmore, Stephen A.

    2007-01-01

    An optimization based on hypersonic aerodynamic performance and volumetric efficiency was accomplished for a range of biconic configurations. Both axisymmetric and quasi-axisymmetric geometries (bent and flattened) were analyzed. The aerodynamic optimization was based on hypersonic simple incidence-angle analysis tools. The range of configurations included those suitable for a lunar return trajectory with a lifting aerocapture at Earth and an overall volume that could support a nominal crew. The results yielded five configurations that had acceptable aerodynamic performance and met overall geometry and size limitations.
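Simple incidence-angle tools of the kind referenced above are typically of the modified Newtonian type, where panel pressure depends only on the local inclination to the free stream. A minimal sketch, assuming the gamma = 1.4 stagnation value for the maximum pressure coefficient:

```python
import math

def modified_newtonian_cp(incidence_deg, cp_max=1.839):
    """Cp = Cp_max * sin^2(delta): pressure coefficient on a panel inclined
    delta degrees to the free stream; shielded (leeward) panels get zero.
    cp_max = 1.839 is the Mach -> infinity stagnation value for gamma = 1.4."""
    if incidence_deg <= 0.0:
        return 0.0  # leeward panels carry no Newtonian pressure
    return cp_max * math.sin(math.radians(incidence_deg)) ** 2

# Illustrative cone flank angle (assumed, not from the record):
cp_flank = modified_newtonian_cp(12.5)
```

Summing such panel pressures over a discretized biconic at a given angle of attack yields lift and drag estimates fast enough to sweep the large configuration space the record describes.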

  16. Advanced Chemical Propulsion System Study

    NASA Technical Reports Server (NTRS)

    Portz, Ron; Alexander, Leslie; Chapman, Jack; England, Chris; Henderson, Scott; Krismer, David; Lu, Frank; Wilson, Kim; Miller, Scott

    2007-01-01

    A detailed, mission-level systems study has been performed to show the benefit resulting from engine performance gains that will result from NASA's In-Space Propulsion ROSS Cycle 3A NRA, Advanced Chemical Technology sub-topic. The technology development roadmap to accomplish the NRA goals is also detailed in this paper. NASA-Marshall and NASA-JPL have conducted mission-level studies to define engine requirements, operating conditions, and interfaces. Five reference missions have been chosen for this analysis based on scientific interest, current launch vehicle capability, and trends in spacecraft size: a) GTO to GEO, 4800 kg, delta-V for GEO insertion only, approx. 1830 m/s; b) Titan Orbiter with aerocapture, 6620 kg, total delta-V approx. 210 m/s, mostly for periapsis raise after aerocapture; c) Enceladus Orbiter (Titan aerocapture), 6620 kg, delta-V approx. 2400 m/s; d) Europa Orbiter, 2170 kg, total delta-V approx. 2600 m/s; and e) Mars Orbiter, 2250 kg, total delta-V approx. 1860 m/s. The figures of merit used to define the benefit of increased propulsion efficiency at the spacecraft level include propulsion subsystem wet mass, volume, and overall cost. The objective of the NRA is to increase the specific impulse of pressure-fed, earth-storable bipropellant rocket engines to greater than 330 seconds with nitrogen tetroxide and monomethylhydrazine propellants, and greater than 335 seconds with nitrogen tetroxide and hydrazine. Achievement of the NRA goals will significantly benefit NASA interplanetary missions and other government and commercial opportunities by enabling reduced launch weight and/or increased payload. The study also constitutes a crucial stepping stone to future development, such as pump-fed storable engines.
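The mass benefit of an Isp increase can be gauged with the Tsiolkovsky rocket equation. Using the record's Enceladus Orbiter figures (6620 kg, approx. 2400 m/s) and an assumed 320 s baseline Isp against the NRA's 330 s goal:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_mass(m_initial, delta_v, isp):
    """Propellant burned from initial mass m_initial to achieve delta_v,
    from the rocket equation m_final = m_initial * exp(-dv / (isp * g0))."""
    return m_initial * (1.0 - math.exp(-delta_v / (isp * G0)))

# Enceladus Orbiter figures are from the record; the 320 s baseline Isp is an
# assumption chosen for comparison with the NRA's 330 s NTO/MMH goal.
baseline = propellant_mass(6620.0, 2400.0, 320.0)
improved = propellant_mass(6620.0, 2400.0, 330.0)
saving = baseline - improved   # roughly 70 kg freed for payload or margin
```

Even a 10-second Isp gain frees tens of kilograms on a multi-km/s mission, which is the spacecraft-level benefit the study quantifies through wet mass, volume, and cost.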

  17. Aero-Assisted Spacecraft Missions Using Hypersonic Waverider Aeroshells

    NASA Astrophysics Data System (ADS)

    Knittel, Jeremy

    This work examines the use of high-lift, low-drag vehicles that perform orbital transfers within a planet's atmosphere to reduce propulsive requirements. For the foreseeable future, spacecraft mission design will include the objective of limiting the mass of fuel required. One means of accomplishing this is using aerodynamics as a supplemental force, in what is termed an aero-assist maneuver. Further, the use of a lifting body enables a mission designer to explore candidate trajectory types wholly unavailable to non-lifting analogs. Examples include missions to outer planets by way of an aero-gravity assist, aero-assisted plane change, aero-capture, and steady atmospheric periapsis probing missions. Engineering-level models are created in order to simulate both atmospheric and extra-atmospheric space flight. Each mission is parameterized using discrete variables which control multiple areas of design. This work combines the areas of hypersonic aerodynamics, re-entry aerothermodynamics, spacecraft orbital mechanics, and vehicle shape optimization. In particular, emphasis is given to the parametric design of vehicles known as "waveriders", which are inversely designed from known shock flowfields. An entirely novel means of generating a class of waveriders known as "starbodies" is presented. A complete analysis is performed of asymmetric starbody forms and compared to a better-understood parameterization, "osculating cone" waveriders. This analysis includes characterization of stability behavior, a critical discipline within hypersonic flight. It is shown that asymmetric starbodies offer significant stability improvement with only a 10% reduction in the lift-to-drag ratio. By combining the optimization of both the shape of the vehicle and the trajectory it flies, much is learned about the benefit that can be expected from lifting aero-assist missions. While previous studies have conceptually proven the viability, this work provides thorough quantification of the

  18. Guidance and Control Architecture Design and Demonstration for Low Ballistic Coefficient Atmospheric Entry

    NASA Technical Reports Server (NTRS)

    Swei, Sean

    2014-01-01

    We propose to develop a robust guidance and control system for the ADEPT (Adaptable Deployable Entry and Placement Technology) entry vehicle. A control-centric model of ADEPT will be developed to quantify the performance of candidate guidance and control architectures for both aerocapture and precision landing missions. The evaluation will be based on recent breakthroughs in constrained controllability/reachability analysis of control systems and constraint-based energy-minimum trajectory optimization for guidance development operating in complex environments.

  19. Ultralightweight Ballute Technology Advances

    NASA Technical Reports Server (NTRS)

    Masciarelli, Jim; Miller, Kevin

    2005-01-01

    Ultralightweight ballutes offer the potential to provide the deceleration for entry and aerocapture missions at a fraction of the mass of traditional methods. A team consisting of Ball Aerospace, ILC Dover, NASA Langley, NASA Johnson, and the Jet Propulsion Laboratory has been addressing the technical issues associated with ultralightweight ballutes for aerocapture at Titan. Significant progress has been made in the areas of ballute materials, aerothermal analysis, trajectory control, and aeroelastic modeling. The status and results of efforts in these areas are presented. The results indicate that an ultralightweight ballute system mass of 8 to 10 percent of the total entry mass is possible.

  20. Review of NASA In-Space Propulsion Technology Program Inflatable Decelerator Investments

    NASA Technical Reports Server (NTRS)

    Richardson, E. H.; Munk, M. M.; James, B. F.; Moon, S. A.

    2005-01-01

    The NASA In-Space Propulsion Technology (ISPT) Program is managed by the NASA Headquarters Science Mission Directorate and is implemented by the Marshall Space Flight Center in Huntsville, Alabama. The ISPT objective is to fund development of promising in-space propulsion technologies that can decrease flight times, decrease cost, or increase delivered payload mass for future science missions. Before ISPT will invest in a technology, the Technology Readiness Level (TRL) of the concept must be estimated to be at TRL 3. A TRL 3 signifies that the technical community agrees that the feasibility of the concept has been proven through experiment or analysis. One of the highest priority technology investments for ISPT is Aerocapture. The aerocapture maneuver uses a planetary atmosphere to reduce or alter the speed of a vehicle allowing for quick, propellantless (or using very little propellant) orbit capture. The atmosphere is used as a brake, transferring the energy associated with the vehicle's high speed into thermal energy. The ISPT Aerocapture Technology Area (ATA) is currently investing in the development of advanced lightweight ablative thermal protection systems, high temperature composite structures, and heat-flux sensors for rigid aeroshells. The heritage of rigid aeroshells extends back to the Apollo era and this technology will most likely be used by the first generation aerocapture vehicle. As a second generation aerocapture technology, ISPT is investing in three inflatable aerodynamic decelerator concepts for planetary aerocapture. They are: trailing ballute (balloon-parachute), attached afterbody ballute, and an inflatable aeroshell. ISPT also leverages the NASA Small Business Innovative Research Program for additional inflatable decelerator technology development. In mid-2004 ISPT requested an independent review of the three inflatable decelerator technologies funded directly by ISPT to validate the TRL and to identify technology maturation concerns. 
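The "atmosphere as a brake" statement above can be quantified: the kinetic energy removed per kilogram of vehicle is (v_in^2 - v_out^2) / 2, essentially all of which appears as heat in the flow and the thermal protection system. The speeds below are illustrative Earth-return values, assumed rather than taken from the record:

```python
v_in = 11.0e3    # hyperbolic arrival speed, m/s (illustrative)
v_out = 7.8e3    # post-aerocapture orbital speed, m/s (illustrative)

# Specific kinetic energy converted to thermal energy during the pass:
e_dissipated = 0.5 * (v_in**2 - v_out**2)   # J per kg of vehicle mass
print(f"{e_dissipated/1e6:.0f} MJ/kg dissipated")  # -> 30 MJ/kg dissipated
```

Tens of megajoules per kilogram must be managed by the decelerator, which is why TPS materials and inflatable-decelerator aerothermal behavior dominate the technology investments listed above.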


  2. The NASA In-Space Propulsion Technology Project, Products, and Mission Applicability

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Pencil, Eric; Liou, Larry; Dankanich, John; Munk, Michelle M.; Kremic, Tibor

    2009-01-01

    The In-Space Propulsion Technology (ISPT) Project, funded by NASA's Science Mission Directorate (SMD), is continuing to invest in propulsion technologies that will enable or enhance NASA robotic science missions. This overview provides development status, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of aerocapture, electric propulsion, advanced chemical thrusters, and systems analysis tools. Aerocapture investments improved: guidance, navigation, and control models of blunt-body rigid aeroshells; atmospheric models for Earth, Titan, Mars, and Venus; and models for aerothermal effects. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6 to 7 kW throttle-able gridded ion system. The project is also concluding its High Voltage Hall Accelerator (HiVHAC) mid-term product specifically designed for a low-cost electric propulsion option. The primary chemical propulsion investment is on the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. The project is also delivering products to assist technology infusion and quantify mission applicability and benefits through mission analysis and tools. In-space propulsion technologies are applicable, and potentially enabling, for flagship destinations currently under evaluation, as well as having broad applicability to future Discovery and New Frontiers mission solicitations.

  3. NASA's In-Space Propulsion Technology Project Overview, Near-term Products and Mission Applicability

    NASA Technical Reports Server (NTRS)

    Dankanich, John; Anderson, David J.

    2008-01-01

    The In-Space Propulsion Technology (ISPT) Project, funded by NASA's Science Mission Directorate (SMD), is continuing to invest in propulsion technologies that will enable or enhance NASA robotic science missions. This overview provides development status, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of aerocapture, electric propulsion, advanced chemical thrusters, and systems analysis tools. Aerocapture investments improved (1) guidance, navigation, and control models of blunt-body rigid aeroshells, (2) atmospheric models for Earth, Titan, Mars, and Venus, and (3) models for aerothermal effects. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system. The project is also concluding its High Voltage Hall Accelerator (HiVHAC) mid-term product specifically designed for a low-cost electric propulsion option. The primary chemical propulsion investment is on the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. The project is also delivering products to assist technology infusion and quantify mission applicability and benefits through mission analysis and tools. In-space propulsion technologies are applicable, and potentially enabling, for flagship destinations currently under evaluation, as well as having broad applicability to future Discovery and New Frontiers mission solicitations.

  4. The Status of Spacecraft Bus and Platform Technology Development under the NASA ISPT Program

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Munk, Michelle M.; Pencil, Eric; Dankanich, John; Glaab, Louis; Peterson, Todd

    2013-01-01

    The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for near-term flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) Aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies being developed with flight infusion in mind are the Advanced Xenon Flow Control System and ultralightweight propellant tank technologies. Future directions for ISPT are technologies that relate to sample return missions and other spacecraft bus technology needs, such as: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV); and 3) electric propulsion. These technologies are more vehicle- and mission-focused, and present a different set of technology development and infusion steps beyond those previously implemented. The Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion and spacecraft bus technologies to a wide variety of mission concepts. These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, and sample return missions currently under consideration, as well as having broad applicability to

  5. The status of spacecraft bus and platform technology development under the NASA ISPT program

    NASA Astrophysics Data System (ADS)

    Anderson, D. J.; Munk, M. M.; Pencil, E.; Dankanich, J.; Glaab, L.; Peterson, T.

    The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for near-term flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) Aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies being developed with flight infusion in mind are the Advanced Xenon Flow Control System and ultra-lightweight propellant tank technologies. Future directions for ISPT are technologies that relate to sample return missions and other spacecraft bus technology needs, such as: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV); and 3) electric propulsion. These technologies are more vehicle- and mission-focused, and present a different set of technology development and infusion steps beyond those previously implemented. The Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion and spacecraft bus technologies to a wide variety of mission concepts. 
These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, and sample return missions currently under consideration, as well as having broad applicability to

  6. Spacecraft Bus and Platform Technology Development under the NASA ISPT Program

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Munk, Michelle M.; Pencil, Eric; Dankanich, John; Glaab, Louis; Peterson, Todd

    2013-01-01

    The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for near-term flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) Aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies being developed with flight infusion in mind are the Advanced Xenon Flow Control System and ultra-lightweight propellant tank technologies. Future directions for ISPT are technologies that relate to sample return missions and other spacecraft bus technology needs, such as: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV) for sample return missions; and 3) electric propulsion for sample return and low-cost missions. These technologies are more vehicle- and mission-focused, and present a different set of technology development and infusion steps beyond those previously implemented. The Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion and spacecraft bus technologies to a wide variety of mission concepts. These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, and sample return missions

  7. NASA In-Space Propulsion Technologies and Their Infusion Potential

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Pencil,Eric J.; Peterson, Todd; Vento, Daniel; Munk, Michelle M.; Glaab, Louis J.; Dankanich, John W.

    2012-01-01

    The In-Space Propulsion Technology (ISPT) program has been developing in-space propulsion technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies that will be ready for flight infusion in the near future are the Advanced Xenon Flow Control System and ultra-lightweight propellant tank technologies. Future focus areas for ISPT are sample return missions and other spacecraft bus technologies such as: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV) for sample return missions; and 3) electric propulsion for sample return and low-cost missions. These technologies are more vehicle-focused, and present a different set of technology infusion challenges. The Systems/Mission Analysis area, meanwhile, is focused on developing tools and assessing the application of propulsion technologies to a wide variety of mission concepts. These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, and sample return missions currently under consideration, as well as having broad applicability to potential Flagship missions.

  8. The Status of Spacecraft Bus and Platform Technology Development Under the NASA ISPT Program

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Munk, Michelle M.; Pencil, Eric J.; Dankanich, John; Glaab, Louis J.

    2013-01-01

    The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for near-term flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies being developed with flight infusion in mind are the Advanced Xenon Flow Control System and ultra-lightweight propellant tank technologies. Future directions for ISPT are technologies that relate to sample return missions and other spacecraft bus technology needs, such as: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV); and 3) electric propulsion. These technologies are more vehicle- and mission-focused, and present a different set of technology development and infusion steps beyond those previously implemented. The Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion and spacecraft bus technologies to a wide variety of mission concepts.
These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, and sample return missions currently under consideration, as well as having broad applicability to potential Flagship missions.

  9. Spacecraft Bus and Platform Technology Development under the NASA ISPT Program

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Munk, Michelle M.; Pencil, Eric J.; Dankanich, John W.; Glaab, Louis J.; Peterson, Todd T.

    2013-01-01

    The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for near-term flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies being developed with flight infusion in mind are the Advanced Xenon Flow Control System and ultra-lightweight propellant tank technologies. Future directions for ISPT are technologies that relate to sample return missions and other spacecraft bus technology needs, such as: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV) for sample return missions; and 3) electric propulsion for sample return and low-cost missions. These technologies are more vehicle- and mission-focused, and present a different set of technology development and infusion steps beyond those previously implemented. The Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion and spacecraft bus technologies to a wide variety of mission concepts. These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, and sample return missions currently under consideration, as well as having broad applicability to potential Flagship missions.

  10. A probabilistic sizing tool and Monte Carlo analysis for entry vehicle ablative thermal protection systems

    NASA Astrophysics Data System (ADS)

    Mazzaracchio, Antonio; Marchetti, Mario

    2010-03-01

    Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool uses the Monte Carlo technique to run simulations over stochastic series, performing an uncertainty and sensitivity analysis that estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems, and are proposed as an alternative to traditional approaches such as the Root-Sum-Square method. The developed tool was verified by comparing its results with those from previous work on thermal protection system probabilistic sizing methodologies, which are based on an industry-standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins for sizing heat shields currently proposed for vehicles using rigid aeroshells for future aerocapture missions at Neptune, and to identify the major sources of uncertainty in the material response.
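The probabilistic sizing loop described above can be illustrated with a toy Monte Carlo sketch. Everything here is an assumption for illustration: the closed-form constant-flux semi-infinite-slab solution is a crude stand-in for a real charring-ablator response model, and the material properties, heat flux, dispersion levels, and temperature limit are invented placeholders.

```python
import math
import random

def bondline_temp(x, k, q, t, rho, cp, T0=300.0):
    """Back-face temperature of a semi-infinite slab under a constant
    surface heat flux q (classic conduction solution); a crude stand-in
    for a real ablation/thermal response solver."""
    alpha = k / (rho * cp)  # thermal diffusivity, m^2/s
    rise = (2.0 * q / k) * math.sqrt(alpha * t / math.pi) \
           * math.exp(-x * x / (4.0 * alpha * t)) \
           - (q * x / k) * math.erfc(x / (2.0 * math.sqrt(alpha * t)))
    return T0 + rise

def capture_probability(thickness, n=2000, limit=650.0, seed=1):
    """Fraction of Monte Carlo draws that keep the bondline below `limit` (K)."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        k = rng.gauss(0.5, 0.05)     # W/m-K, assumed material property scatter
        q = rng.gauss(5.0e5, 5.0e4)  # W/m^2, assumed aeroheating uncertainty
        if bondline_temp(thickness, k, q, 120.0, 280.0, 1200.0) <= limit:
            ok += 1
    return ok / n
```

Sweeping `thickness` until `capture_probability` exceeds a required level (say 0.997) yields a probabilistic thickness margin in place of a Root-Sum-Square stack-up, which is the essence of the approach the abstract describes.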

  11. Automating Structural Analysis of Spacecraft Vehicles

    NASA Technical Reports Server (NTRS)

    Hrinda, Glenn A.

    2004-01-01

    A major effort within NASA's vehicle analysis discipline has been to automate structural analysis and sizing optimization during conceptual design studies of advanced spacecraft. Traditional spacecraft structural sizing has involved detailed finite element analysis (FEA) requiring large degree-of-freedom (DOF) finite element models (FEM). Creation and analysis of these models can be time-consuming and limit model size during conceptual designs. The goal is to find an optimal design that meets the mission requirements but produces the lightest structure. A structural sizing tool called HyperSizer has been successfully used in the conceptual design phase of a reusable launch vehicle and planetary exploration spacecraft. The program couples with FEA to enable system-level performance assessments and weight predictions, including design optimization of material selections and sizing of spacecraft members. The software's analysis capabilities are based on established aerospace structural methods for strength, stability, and stiffness that produce adequately sized members and reliable structural weight estimates. The software also helps to identify potential structural deficiencies early in the conceptual design so changes can be made without wasted time. HyperSizer's automated analysis and sizing optimization increases productivity and brings standardization to a systems study. These benefits will be illustrated by examining two different types of conceptual spacecraft designed using the software. A hypersonic air-breathing, single stage to orbit (SSTO), reusable launch vehicle (RLV) will be highlighted, as well as an aeroshell for a planetary exploration vehicle used for aerocapture at Mars. By showing the two different types of vehicles, the software's flexibility will be demonstrated with an emphasis on reducing aeroshell structural weight. Member sizes, concepts, and material selections will be discussed, as well as analysis methods used in optimizing the structure.

  12. Venus Global Reference Atmospheric Model

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.

    2017-01-01

    Venus Global Reference Atmospheric Model (Venus-GRAM) is an engineering-level atmospheric model developed by MSFC that is widely used for diverse mission applications, including systems design, performance analysis, and operations planning for aerobraking, Entry, Descent and Landing, and aerocapture. It is not a forecast model. Outputs include density, temperature, pressure, wind components, and chemical composition; the model provides dispersions of thermodynamic parameters, winds, and density, and accepts optional trajectory and auxiliary profile input files. Venus-GRAM has been used in multiple studies and proposals, including NASA Engineering and Safety Center (NESC) Autonomous Aerobraking and various Discovery proposals. Released in 2005, it is available at: https://software.nasa.gov/software/MFS-32314-1.

  13. Mars Global Reference Atmospheric Model 2010 Version: Users Guide

    NASA Technical Reports Server (NTRS)

    Justh, H. L.

    2014-01-01

    This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.
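To give a flavor of how an engineering-level atmosphere model plugs into a trajectory code as a subroutine, here is a minimal stand-in: a nominal exponential density profile scaled by a GRAM-style fractional dispersion. The surface density, scale height, and 1-sigma dispersion fraction are rough illustrative values, not Mars-GRAM outputs.

```python
import math

RHO0 = 0.020  # kg/m^3, assumed nominal surface density (rough Mars value)
H = 11100.0   # m, assumed density scale height (rough Mars value)

def density(alt_m, sigma=0.0, frac=0.3):
    """Nominal exponential density times a dispersion factor.
    `sigma` is a standard-normal draw supplied by the Monte Carlo driver;
    `frac` is an assumed 1-sigma fractional density dispersion."""
    return RHO0 * math.exp(-alt_m / H) * (1.0 + sigma * frac)
```

A trajectory Monte Carlo would draw one `sigma` (or an altitude-correlated sequence of them) per run and call `density` at every integration step, which is essentially the role Mars-GRAM plays when incorporated as an atmospheric subroutine in a trajectory code.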

  14. Neptune Orbiters Utilizing Solar and Radioisotope Electric Propulsion

    NASA Technical Reports Server (NTRS)

    Fiehler, Douglas I.; Oleson, Steven R.

    2004-01-01

    In certain cases, Radioisotope Electric Propulsion (REP), used in conjunction with other propulsion systems, could be used to reduce the trip times for outer planetary orbiter spacecraft. It also has the potential to improve the maneuverability and power capabilities of the spacecraft when the target body is reached as compared with non-electric propulsion spacecraft. Current missions under study baseline aerocapture systems to capture into a science orbit after a Solar Electric Propulsion (SEP) stage is jettisoned. Other options under study would use all REP transfers with small payloads. Compared to the SEP stage/Aerocapture scenario, adding REP to the science spacecraft as well as a chemical capture system can replace the aerocapture system but with a trip time penalty. Eliminating both the SEP stage and the aerocapture system and utilizing a slightly larger launch vehicle, Star 48 upper stage, and a combined REP/Chemical capture system, the trip time can nearly be matched while providing over a kilowatt of science power reused from the REP maneuver. A Neptune Orbiter mission is examined utilizing single propulsion systems and combinations of SEP, REP, and chemical systems to compare concepts.

  15. In-Space Propulsion for Science and Exploration

    NASA Technical Reports Server (NTRS)

    Bishop-Behel, Karen; Johnson, Les

    2004-01-01

    This paper presents viewgraphs on the development of In-Space Propulsion Technologies for Science and Exploration. The topics include: 1) In-Space Propulsion Technology Program Overview; 2) In-Space Propulsion Technology Project Status; 3) Solar Electric Propulsion; 4) Next Generation Electric Propulsion; 5) Aerocapture Technology Alternatives; 6) Aerocapture; 7) Advanced Thermal Protection Systems Developed and Being Tested; 8) Solar Sails; 9) Advanced Chemical Propulsion; 10) Momentum Exchange Tethers; and 11) Momentum-exchange/electrodynamic reboost (MXER) Tether Basic Operation.

  16. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  17. Structural-Thermal-Optical-Performance (STOP) Analysis

    NASA Technical Reports Server (NTRS)

    Bolognese, Jeffrey; Irish, Sandra

    2015-01-01

    The presentation will be given at the 26th Annual Thermal and Fluids Analysis Workshop (TFAWS 2015) hosted by the Goddard Space Flight Center (GSFC) Thermal Engineering Branch (Code 545). A STOP analysis is a multidiscipline analysis, consisting of structural, thermal, and optical performance analyses, that is performed for all space flight instruments and satellites. This course will explain the different parts of performing this analysis. The student will learn how to effectively interact with each discipline in order to accurately obtain the system analysis results.

  18. Dual Heat Pulse, Dual Layer Thermal Protection System Sizing Analysis and Trade Studies for Human Mars Entry Descent and Landing

    NASA Technical Reports Server (NTRS)

    McGuire, Mary Kathleen

    2011-01-01

    NASA has been recently updating design reference missions for the human exploration of Mars and evaluating the technology investments required to do so. The first of these started in January 2007 and developed the Mars Design Reference Architecture 5.0 (DRA5). As part of DRA5, Thermal Protection System (TPS) sizing analysis was performed on a mid L/D rigid aeroshell undergoing a dual heat pulse (aerocapture and atmospheric entry) trajectory. The DRA5 TPS subteam determined that using traditional monolithic ablator systems would be mass-expensive. They proposed a new dual-layer TPS concept utilizing an ablator atop a low thermal conductivity insulative substrate to address the issue. Using existing thermal response models for an ablator and insulative tile, preliminary hand analysis of the dual-layer concept at a few key heating points indicated that the concept showed potential to reduce TPS masses and warranted further study. In FY09, the follow-on Entry, Descent and Landing Systems Analysis (EDL-SA) project continued by focusing on Exploration-class cargo or crewed missions requiring 10 to 50 metric tons of landed payload. The TPS subteam advanced the preliminary dual-layer TPS analysis by developing a new process and updated TPS sizing code to rapidly evaluate mass-optimized, full body sizing for a dual-layer TPS that is capable of dual heat pulse performance. This paper describes the process and presents the results of the EDL-SA FY09 dual-layer TPS analyses on the rigid mid L/D aeroshell. Additionally, several trade studies were conducted with the sizing code to evaluate the impact of various design factors, assumptions, and margins.
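The mass-optimized dual-layer sizing idea can be caricatured in a few lines: replace the transient solver with a required series conduction resistance and grid-search the ablator/insulator split for minimum areal mass. All properties and the resistance requirement below are invented placeholders; a real sizing code marches a transient ablation solution through both heat pulses rather than using a steady resistance budget.

```python
# Assumed placeholder properties: (density kg/m^3, conductivity W/m-K)
ABLATOR = (280.0, 0.40)
INSULATOR = (96.0, 0.05)
R_REQ = 0.50       # m^2-K/W, assumed resistance needed to protect the bondline
T_ABL_MIN = 0.010  # m, assumed minimum ablator thickness to cover recession

def lightest_stack(step=1.0e-4, t_abl_max=0.10):
    """Grid search over ablator thickness; the insulator is sized to make up
    the remaining resistance. Returns (areal_mass, t_ablator, t_insulator)."""
    rho_a, k_a = ABLATOR
    rho_i, k_i = INSULATOR
    best = None
    t_a = T_ABL_MIN
    while t_a <= t_abl_max:
        t_i = max(0.0, R_REQ - t_a / k_a) * k_i  # close the resistance budget
        mass = rho_a * t_a + rho_i * t_i         # areal mass, kg/m^2
        if best is None or mass < best[0]:
            best = (mass, t_a, t_i)
        t_a += step
    return best
```

With these numbers the insulator buys thermal resistance far more cheaply per unit mass than the ablator, so the optimum sits at the minimum ablator thickness, which is the qualitative result motivating the dual-layer concept.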

  19. Materials Needs for Future In-space Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Johnson, Charles Les

    2008-01-01

    NASA is developing the next generation of in-space propulsion systems in support of robotic exploration missions throughout the solar system. The propulsion technologies being developed are non-traditional and have stressing materials performance requirements. (Chemical Propulsion) Earth-storable chemical bipropellant performance is constrained by temperature limitations of the columbium used in the chamber. Iridium/rhenium (Ir/Re) is now available and has been implemented in initial versions of Earth-storable rockets with specific impulses (Isp) about 10 seconds higher than columbium rocket chambers. New chamber fabrication methods that improve process and performance of Ir/Re and other promising material systems are needed. (Solar Sail Propulsion) The solar sail is a propellantless propulsion system that gains momentum by reflecting sunlight. The sails need to be very large in area (from 10000 m2 up to 62500 m2) yet be very lightweight in order to achieve adequate accelerations for realistic mission times. Lightweight materials that can be manufactured in thicknesses of less than 1 micron and that are not harmed by the space environment are desired. (Aerocapture) Blunt-body aerocapture uses aerodynamic drag to slow an approaching spacecraft and insert it into a science orbit around any planet or moon with an atmosphere. The spacecraft is enclosed by a rigid aeroshell that protects it from the entry heating and aerodynamic environment. Lightweight, high-temperature structural systems, adhesives, insulators, and ablatives are key components for improving aeroshell efficiencies at heating rates of 1000-2000 W/cm2 and beyond. Inflatable decelerators in the forms of ballutes and inflatable aeroshells will use flexible polymeric thin film materials, high temperature fabrics, and structural adhesives. The inflatable systems will be tightly packaged during cruise and will be inflated prior to entry interface at the destination. Materials must maintain strength and

  20. Shock Layer Radiation Measurements and Analysis for Mars Entry

    NASA Technical Reports Server (NTRS)

    Bose, Deepak; Grinstead, Jay Henderson; Bogdanoff, David W.; Wright, Michael J.

    2009-01-01

    NASA's In-Space Propulsion program is supporting the development of shock radiation transport models for aerocapture missions to Mars. A comprehensive test series in the NASA Ames Electric Arc Shock Tube facility at a representative flight condition was recently completed. The facility optical instrumentation enabled spectral measurements of shocked gas radiation from the vacuum ultraviolet to the near infrared. The instrumentation captured the nonequilibrium post-shock excitation and relaxation dynamics of dispersed spectral features. A description of the shock tube facility, optical instrumentation, and examples of the test data are presented. Comparisons of measured spectra with model predictions are also made.

  1. Performance Analysis of MYSEA

    DTIC Science & Technology

    2012-09-01

    Services; FSD: Federated Services Daemon; I&A: Identification and Authentication; IKE: Internet Key Exchange; KPI: Key Performance Indicator; LAN: Local Area...spection takes place in different processes in the server architecture. Key Performance Indicator (KPI)s associated with the system need to be...application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties

  2. Co-Optimization of Blunt Body Shapes for Moving Vehicles

    NASA Technical Reports Server (NTRS)

    Kinney, David J. (Inventor); Mansour, Nagi N. (Inventor); Brown, James L. (Inventor); Garcia, Joseph A. (Inventor); Bowles, Jeffrey V. (Inventor)

    2014-01-01

    A method and associated system for multi-disciplinary optimization of various parameters associated with a space vehicle that experiences aerocapture and atmospheric entry in a specified atmosphere. In one embodiment, simultaneous maximization of a ratio of landed payload to vehicle atmospheric entry mass, maximization of fluid flow distance before flow separation from vehicle, and minimization of heat transfer to the vehicle are performed with respect to vehicle surface geometric parameters, and aerostructure and aerothermal vehicle response for the vehicle moving along a specified trajectory. A Pareto Optimal set of superior performance parameters is identified.
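The "Pareto Optimal set" named in the abstract is the standard non-dominated subset of a multi-objective search. A minimal sketch follows; the tuple-of-objectives interface and the maximize-everything convention are assumptions for illustration, not the patent's actual method.

```python
def pareto_front(points):
    """Return the non-dominated members of `points`, where each point is a
    tuple of objective values and every objective is to be maximized.
    A point p is dominated if some q is >= p in every objective and
    differs from p (hence is strictly better in at least one)."""
    return [p for p in points
            if not any(all(qi >= pi for qi, pi in zip(q, p)) and q != p
                       for q in points)]
```

For the vehicle problem, the tuple entries would be quantities like payload-to-entry-mass ratio, flow-attachment distance, and negated heat load, each evaluated from the coupled aerothermal and aerostructural analysis; the front then exposes the trade-offs among superior designs.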

  3. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access, and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time, and to perform automatic corrective actions. In parallel with an upgrade of our production system we are aiming for low-level improvements related to data format, data management, and merging of results to allow for a better-performing ALICE analysis.

  4. Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    The primary tasks performed are: (1) the development of a second order local thermodynamic nonequilibrium (LTNE) model for atoms; (2) the continued development of vibrational nonequilibrium models; and (3) the development of a new multicomponent diffusion model. In addition, studies comparing these new models with previous models and results were conducted and reported.

  5. Performance Analysis of GYRO: A Tool Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worley, P.; Roth, P.; Candy, J.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.
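The "embedded wall clock timers" part of the manual approach can be mimicked with a tiny decorator that accumulates per-function wall time. This is an illustrative sketch, not GYRO's actual instrumentation, and `field_solve` is a made-up stand-in for a solver kernel.

```python
import time
from functools import wraps

timings = {}  # function name -> accumulated wall-clock seconds

def timed(fn):
    """Accumulate wall-clock time spent in fn, like an embedded code timer."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            timings[fn.__name__] = timings.get(fn.__name__, 0.0) \
                                   + (time.perf_counter() - t0)
    return wrapper

@timed
def field_solve(n):
    """Stand-in for a compute kernel being profiled."""
    return sum(i * i for i in range(n))
```

After a run, sorting `timings` by value gives the same first-cut hot-spot ranking the authors extracted from timer output before turning to tools such as IPM, TAU, or KOJAK.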

  6. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  7. Debugging and Performance Analysis Software Tools for Peregrine System |

    Science.gov Websites

    High-Performance Computing | NREL: Debugging and Performance Analysis Software Tools for Peregrine System. Learn about debugging and performance analysis software tools available to use with the Peregrine system. Allinea

  8. Comprehensive analysis of transport aircraft flight performance

    NASA Astrophysics Data System (ADS)

    Filippone, Antonio

    2008-04-01

    This paper reviews the state of the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. A variety of results is shown, including specific air range charts, take-off weight-altitude charts, and payload-range performance.

  9. COBRA ATD minefield detection model initial performance analysis

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.

  10. Model Performance Evaluation and Scenario Analysis ...

    EPA Pesticide Factsheets

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors. The performance measures include error analysis, coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about the overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components and the reconstruction back to time series provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is due to the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool
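Two of the goodness-of-fit measures named above have compact definitions. A minimal sketch of Nash-Sutcliffe efficiency and the coefficient of determination over paired observed/simulated series (the list-of-floats interface is an assumption, not MPESA's API):

```python
def nash_sutcliffe(obs, sim):
    """NSE = 1 - SSE/SST: 1.0 is a perfect fit, 0.0 means no better than
    predicting the observed mean, negative is worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def r_squared(obs, sim):
    """Square of the Pearson correlation between obs and sim."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    vo = sum((o - mo) ** 2 for o in obs)
    vs = sum((s - ms) ** 2 for s in sim)
    return cov * cov / (vo * vs)
```

Note the diagnostic gap the abstract alludes to: a simulation offset by a constant bias can score a perfect R-squared yet a strongly negative NSE, which is one motivation for separating magnitude and sequence errors.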

  11. Performance Analysis: Control of Hazardous Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Grange, Connie E.; Freeman, Jeff W.; Kerr, Christine E.

    2010-10-06

    LLNL experienced 26 occurrences related to the control of hazardous energy from January 1, 2008 through August 2010. These occurrences were 17% of the total number of reported occurrences during this 32-month period. The Performance Analysis and Reporting Section of the Contractor Assurance Office (CAO) routinely analyzes reported occurrences and issues looking for patterns that may indicate changes in LLNL’s performance and early indications of performance trends. It became apparent through these analyses that LLNL might have experienced a change in the control of hazardous energy and that these occurrences should be analyzed in more detail to determine if the perceived change in performance was real, whether that change is significant and if the causes of the occurrences are similar. This report documents the results of this more detailed analysis.

  12. Performance Analysis of Surfing: A Review.

    PubMed

    Farley, Oliver R L; Abbiss, Chris R; Sheppard, Jeremy M

    2017-01-01

    Farley, ORL, Abbiss, CR, and Sheppard, JM. Performance Analysis of Surfing: A Review. J Strength Cond Res 31(1): 260-271, 2017-Despite the increased professionalism and substantial growth of surfing worldwide, there is limited information available to practitioners and coaches in terms of key performance analytics that are common in other field-based sports. Indeed, research analyzing surfing performance is limited to a few studies examining male surfers' heart rates, surfing activities through time-motion analysis (TMA) using video recordings and Global Positioning System (GPS) data during competition and recreational surfing. These studies have indicated that the specific activities undertaken during surfing are unique and varied (i.e., paddling, resting, wave riding, breath holding, and recovery of the surfboard in the surf). Furthermore, environmental and wave conditions also seem to influence the physical demands of competition surfing. It is due to these demands that surfers are required to have a high cardiorespiratory fitness, high muscular endurance, and considerable strength and anaerobic power, particularly within the upper torso. By exploring various methods of performance analysis used within other sports, it is possible to improve our understanding of surfing demands. In so doing, this will assist in the development of protocols and strategies to assess physiological characteristics of surfers, monitor athlete performance, improve training prescription, and identify talent. Therefore, this review explores the current literature to provide insights into methodological protocols, delimitations of research into athlete analysis and an overview of surfing dynamics. Specifically, this review will describe and review the use of TMA, GPS, and other technologies (i.e., HR) that are used in external and internal load monitoring as they pertain to surfing.

  13. Analytical guidance law development for aerocapture at Mars

    NASA Technical Reports Server (NTRS)

    Calise, A. J.

    1992-01-01

    During the first part of this reporting period research has concentrated on performing a detailed evaluation, to zero order, of the guidance algorithm developed in the first period, taking the numerical approach developed in the third period. A zero order matched asymptotic expansion (MAE) solution that closely satisfies a set of 6 implicit equations in 6 unknowns, to an accuracy of 10(exp -10), was evaluated. Guidance law implementation entails treating the current state as a new initial state and repetitively solving the MAE problem to obtain the feedback controls. A zero order guided solution was evaluated and compared with an optimal solution obtained by numerical methods. Numerical experience shows that the zero order guided solution is close to the optimal solution, and that the zero order MAE outer solution plays a critical role in accounting for the variations in Loh's term near the exit phase of the maneuver. However, the deficiency that remains in several of the critical variables indicates the need for a first order correction. During the second part of this period, methods for computing a first order correction were explored.

  14. Status and Mission Applicability of NASA's In-Space Propulsion Technology Project

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Munk, Michelle M.; Dankanich, John; Pencil, Eric; Liou, Larry

    2009-01-01

    The In-Space Propulsion Technology (ISPT) project develops propulsion technologies that will enable or enhance NASA robotic science missions. Since 2001, the ISPT project has developed and delivered products to assist technology infusion and to quantify mission applicability and benefits through mission analysis and tools. These in-space propulsion technologies are applicable to, and potentially enabling for, flagship destinations currently under evaluation, and have broad applicability to future Discovery and New Frontiers mission solicitations. This paper provides status of the technology development, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of advanced chemical thrusters, electric propulsion, aerocapture, and systems analysis tools. The current chemical propulsion investment is in the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine, providing higher performance for lower cost. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system, and the High Voltage Hall Accelerator (HiVHAC) thruster, which is a mid-term product specifically designed for a low-cost electric propulsion option. Aerocapture investments developed a family of thermal protection system materials and structures; guidance, navigation, and control models of blunt-body rigid aeroshells; atmospheric models for Earth, Titan, Mars and Venus; and models for aerothermal effects. In 2009 ISPT started the development of propulsion technologies that would enable future sample return missions. The paper describes the ISPT project's future focus on propulsion for sample return missions. 
The future technology development areas for ISPT are: Planetary Ascent Vehicles (PAV), with a Mars Ascent Vehicle (MAV) being the initial development focus; multi-mission technologies for Earth Entry Vehicles (MMEEV) needed

  15. A case study in nonconformance and performance trend analysis

    NASA Technical Reports Server (NTRS)

    Maloy, Joseph E.; Newton, Coy P.

    1990-01-01

    As part of NASA's effort to develop an agency-wide approach to trend analysis, a pilot nonconformance and performance trending analysis study was conducted on the Space Shuttle auxiliary power unit (APU). The purpose of the study was to (1) demonstrate that nonconformance analysis can be used to identify repeating failures of a specific item (and the associated failure modes and causes) and (2) determine whether performance parameters could be analyzed and monitored to provide an indication of component or system degradation prior to failure. The nonconformance analysis of the APU did identify repeating component failures, which possibly could be reduced if key performance parameters were monitored and analyzed. The performance-trending analysis verified that the characteristics of hardware parameters can be effective in detecting degradation of hardware performance prior to failure.

  16. Frame synchronization performance and analysis

    NASA Technical Reports Server (NTRS)

    Aguilera, C. S. R.; Swanson, L.; Pitt, G. H., III

    1988-01-01

    The analysis used to generate the theoretical models showing the performance of the frame synchronizer is described for various frame lengths and marker lengths at various signal-to-noise ratios and bit error tolerances.
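
    The acceptance rule such models describe can be illustrated: a frame synchronizer slides a known marker pattern over the bit stream and declares a match wherever the number of disagreeing bits is within the error tolerance. This is a generic sketch with a hypothetical marker and stream, not the analysis in the report:

```python
def find_marker(bits, marker, tolerance):
    """Return the offsets at which the marker matches within
    `tolerance` bit errors (Hamming distance acceptance rule)."""
    hits = []
    for i in range(len(bits) - len(marker) + 1):
        errors = sum(b != m for b, m in zip(bits[i:i + len(marker)], marker))
        if errors <= tolerance:
            hits.append(i)
    return hits

marker = [1, 0, 1, 1, 0, 1]
stream = [0, 0, 1, 0, 1, 1, 0, 1, 0, 0]  # marker embedded at offset 2
print(find_marker(stream, marker, tolerance=1))  # → [2]
```

    Longer markers and tighter tolerances reduce false acquisitions but increase the chance of missing a true marker at low signal-to-noise ratio, which is the trade-off the theoretical models quantify.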

  17. Performance analysis of mini-propellers based on FlightGear

    NASA Astrophysics Data System (ADS)

    Vogeltanz, Tomáš

    2016-06-01

    This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before the use of FlightGear, for a complex and more practical performance analysis it is advantageous to use a propeller model in cooperation with a particular aircraft model. This approach may determine whether the propeller has sufficient quality with respect to aircraft requirements. In the first section, the software used for the analysis is illustrated. Then, the parameters of the analyzed mini-propellers and the tested UAV are described. Finally, the main section shows and discusses the results of the performance analysis of the mini-propellers.

  18. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  19. CMS endcap RPC performance analysis

    NASA Astrophysics Data System (ADS)

    Teng, H.; CMS Collaboration

    2014-08-01

    The Resistive Plate Chamber (RPC) detector system in the LHC-CMS experiment is designed for trigger purposes. The endcap RPC system has been operated successfully from the commissioning period (2008) to the end of Run 1 (2013). We have developed an analysis tool for endcap RPC performance and validated the efficiency calculation algorithm, focusing on the first endcap station, which was assembled and tested by the Peking University group. We cross-checked the results obtained against those extracted with alternative methods and found good agreement in terms of performance parameters [1]. The results showed that the CMS-RPC endcap system fulfilled the performance expected in the Technical Design Report [2].
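
    At its core, a chamber-efficiency calculation of the general kind validated here reduces to a binomial estimate: matched hits divided by expected crossings, with a statistical uncertainty. The counts below are invented for illustration and this is not the collaboration's algorithm:

```python
import math

def efficiency(hits, expected):
    """Detection efficiency with a simple binomial standard error,
    as estimated when probing a chamber with externally tracked hits."""
    eff = hits / expected
    err = math.sqrt(eff * (1.0 - eff) / expected)
    return eff, err

# Hypothetical counts: 9450 matched hits out of 10000 expected crossings
eff, err = efficiency(hits=9450, expected=10000)
print(f"efficiency = {eff:.3f} +/- {err:.3f}")
```

    In practice, efficiencies near 0 or 1 call for interval methods (e.g., Clopper-Pearson) rather than this simple error formula, which degenerates at the boundaries.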

  20. Factors affecting construction performance: exploratory factor analysis

    NASA Astrophysics Data System (ADS)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors, with 57 items, affecting construction performance. The findings further reveal ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multi-dimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technological aspects. It is important to understand a multi-dimensional performance evaluation framework that includes all key factors affecting the construction performance of a company, so that management can plan an effective performance development program to match the mission and vision of the company.

  1. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  2. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  3. 21st century early mission concepts for Mars delivery and earth return

    NASA Technical Reports Server (NTRS)

    Cruz, Manuel I.; Ilgen, Marc R.

    1990-01-01

    In the 21st century, the early missions to Mars will entail unmanned Rover and Sample Return reconnaissance missions to be followed by manned exploration missions. High performance leverage technologies will be required to reach Mars and return to earth. This paper describes the mission concepts currently identified for these early Mars missions. These concepts include requirements and capabilities for Mars and earth aerocapture, Mars surface operations and ascent, and Mars and earth rendezvous. Although the focus is on the unmanned missions, synergism with the manned missions is also discussed.

  4. A New Approach to Aircraft Robust Performance Analysis

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Tierno, Jorge E.

    2004-01-01

    A recently developed algorithm for nonlinear system performance analysis has been applied to an F-16 aircraft to begin evaluating the suitability of the method for aerospace problems. The algorithm has the potential to be much more efficient than current methods of performance analysis for aircraft. This paper is the initial step in evaluating this potential.

  5. Thermal protection system development, testing, and qualification for atmospheric probes and sample return missions. Examples for Saturn, Titan and Stardust-type sample return

    NASA Astrophysics Data System (ADS)

    Venkatapathy, E.; Laub, B.; Hartman, G. J.; Arnold, J. O.; Wright, M. J.; Allen, G. A.

    2009-07-01

    The science community has continued to be interested in planetary entry probes, aerocapture, and sample return missions to improve our understanding of the Solar System. As in the case of the Galileo entry probe, such missions are critical to the understanding not only of the individual planets, but also to further knowledge regarding the formation of the Solar System. It is believed that Saturn probes to depths corresponding to 10 bars will be sufficient to provide the desired data on its atmospheric composition. An aerocapture mission would enable delivery of a satellite to provide insight into how gravitational forces cause dynamic changes in Saturn's ring structure that are akin to the evolution of protoplanetary accretion disks. Heating rates for the "shallow" Saturn probes, Saturn aerocapture, and sample Earth return missions with higher re-entry speeds (13-15 km/s) from Mars, Venus, comets, and asteroids are in the range of 1-6 kW/cm2. New, mid-density thermal protection system (TPS) materials for such probes can be mission enabling for mass efficiency and also for use on smaller vehicles enabled by advancements in scientific instrumentation. New Jovian multiprobe missions have been considered problematic without the Giant Planet arcjet facility that was used to qualify carbon phenolic for the Galileo probe. This paper describes emerging TPS technologies and the proposed use of an affordable, small 5 MW arcjet that can be used for TPS development, in test gases appropriate for future planetary probe and aerocapture applications. Emerging TPS technologies of interest include new versions of the Apollo Avcoat material and a densified variant of Phenolic Impregnated Carbon Ablator (PICA). Application of these and other TPS materials and the use of other facilities for development and qualification of TPS for Saturn, Titan, and Sample Return missions of the Stardust class with entry speeds from 6.0 to 28.6 km/s are discussed.

  6. Current Developments in Future Planetary Probe Sensors for TPS

    NASA Technical Reports Server (NTRS)

    Martinez, Ed; Venkatapathy, Ethiraj; Oishu, Tomo

    2003-01-01

    In-situ Thermal Protection System (TPS) sensors are required to provide traceability of TPS performance and sizing tools. Traceability will lead to higher fidelity design tools, which in turn will lead to lower design safety margins and decreased heatshield mass. Decreasing TPS mass will enable certain missions that are not otherwise feasible, and directly increase science payload. NASA Ames is currently developing two flight measurements as essential to advancing the state of TPS traceability for material modeling and aerothermal simulation: heat flux and surface recession (for ablators). The heat flux gage is applicable to both ablators and non-ablators and is therefore the more generalized sensor concept of the two, with wider applicability to mission scenarios. This paper describes the development of a microsensor capable of surface and in-depth temperature and heat flux measurements for TPS materials appropriate to Titan, Neptune, and Mars aerocapture, and direct entry. The thermal sensors will be monolithic solid-state devices composed of a thick-film platinum RTD on an alumina substrate. Choice of materials and critical dimensions is used to tailor gage response, determined during calibration activities, to specific (forebody vs. aftbody) heating environments. The current design has a maximum operating temperature of 1500 K, an allowable constant heat flux of q = 28.7 watts per square centimeter, and time constants between 0.05 and 0.2 seconds. The catalytic and radiative response of these heat flux gages can also be changed through the use of appropriate coatings. By using several co-located gages with various surface coatings, data can be obtained to isolate surface heat flux components due to radiation, catalycity and convection. Selectivity to radiative heat flux is a useful feature even for an in-depth gage, as radiative transport may be a significant heat transport mechanism for porous TPS materials in Titan aerocapture. 
This paper also reports on progress to

  7. Performance analysis in saber.

    PubMed

    Aquili, Andrea; Tancredi, Virginia; Triossi, Tamara; De Sanctis, Desiree; Padua, Elvira; DʼArcangelo, Giovanna; Melchiorri, Giovanni

    2013-03-01

    Fencing is a sport practiced by both men and women using 3 weapons: foil, épée, and saber. In general, few scientific studies are available in the international literature; they are limited to performance analysis of fencing bouts, and there is nothing about saber. There are 2 kinds of competitions in the World Cup for both men and women: the "FIE GP" and "A." The aim of this study was to carry out a saber performance analysis to gain useful indicators for the definition of a performance model, and to verify whether performance is influenced by the type of competition and whether there are differences between men and women. Sixty bouts were analyzed: 33 from FIE GP and 27 from "A" competitions (35 men's and 25 women's saber bouts). The results indicated that most actions are offensive (55% for men and 49% for women); the central area of the piste is most used (72% for men and 67% for women); the effective fighting time is 13.6% for men and 17.1% for women; and the ratio between action and break times is 1:6.5 for men and 1:5.1 for women. A lunge is carried out every 23.9 seconds by men and every 20 seconds by women, and a direction change every 65.3 seconds by men and every 59.7 seconds by women. The data confirm the differences between the saber and the other 2 weapons. There is no significant difference between the data of the 2 kinds of competitions.
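
    The activity indicators reported above, effective fighting time and the action:break ratio, derive from simple bout arithmetic. A sketch with illustrative durations only (the study's raw timings are not given in the abstract); the 60 s / 390 s split is chosen so the ratio matches the reported men's value of 1:6.5:

```python
def bout_profile(action_s, break_s):
    """Effective fighting time (% of bout) and action:break ratio,
    the two time-structure indicators used in bout analysis."""
    total = action_s + break_s
    effective = 100.0 * action_s / total
    ratio = break_s / action_s  # expressed as 1:ratio
    return effective, ratio

# Illustrative durations, not the study's data
effective, ratio = bout_profile(action_s=60.0, break_s=390.0)
print(f"effective fighting time = {effective:.1f}%, ratio = 1:{ratio:.1f}")
```

    Note that the two published figures (13.6% and 1:6.5) are each averages over different bouts, so they need not be exactly mutually consistent.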

  8. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data that needs to be processed from the Large Hadron Collider (LHC) experiments requires good and efficient use of the available resources. Good CPU efficiency for end users' analysis jobs requires that the performance of the storage system scale with I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on work to improve the SE performance at the Helsinki Institute of Physics (HIP) Tier-2 used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework, CMS used the JobRobot, which sent 100 analysis jobs to each site every four hours; CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the effect of site configuration changes, since the analysis workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90% by tuning the SE and by improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done at other CMS Tier sites, since on average the CPU efficiency for CMSSW jobs increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. 
The next storage upgrade at HIP consists of SAS disk enclosures which can be stress tested on demand with HammerCloud workflows, to make sure that the I/O-performance

  9. Analysis of ultra-triathlon performances

    PubMed Central

    Lepers, Romuald; Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas

    2011-01-01

    Despite increased interest in ultra-endurance events, little research has examined ultra-triathlon performance. The aims of this study were: (i) to compare swimming, cycling, running, and overall performances in three ultra-distance triathlons, the double Ironman distance triathlon (2IMT) (7.6 km swimming, 360 km cycling, and 84.4 km running), the triple Ironman distance triathlon (3IMT) (11.4 km, 540 km, and 126.6 km), and the deca Ironman distance triathlon (10IMT) (38 km, 1800 km, and 420 km), and (ii) to examine the relationships between the 2IMT, 3IMT, and 10IMT performances to create predictive equations for 10IMT performance. Race results from 1985 through 2009 were examined to identify triathletes who had completed all three ultra-distances. In total, 73 triathletes (68 men and 5 women) were identified. The contribution of swimming to overall ultra-triathlon performance was lower than that of cycling and running. Running performance was more important to overall performance for 2IMT and 3IMT compared with 10IMT. The 2IMT and 3IMT performances were significantly correlated with 10IMT performances for swimming and cycling, but not for running. 10IMT total time performance might be predicted by the following equation: 10IMT race time (minutes) = 5885 + 3.69 × 3IMT race time (minutes). This analysis of human performance during ultra-distance triathlons represents a unique data set in the field of ultra-endurance events. Additional studies are required to determine the physiological and psychological factors associated with ultra-triathlon performance. PMID:24198579
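
    The reported regression can be applied directly. A small sketch using the equation from the abstract, with a hypothetical 3IMT finishing time of 40 hours:

```python
def predict_10imt_minutes(t_3imt_minutes):
    """Deca-Ironman (10IMT) total time predicted from triple-Ironman
    (3IMT) time, using the regression reported in the abstract:
    10IMT (min) = 5885 + 3.69 * 3IMT (min)."""
    return 5885.0 + 3.69 * t_3imt_minutes

# Hypothetical 3IMT finish of 40 hours (2400 minutes)
t10 = predict_10imt_minutes(2400.0)
print(f"predicted 10IMT time: {t10:.0f} min ({t10 / 1440:.1f} days)")
```

    The large intercept (5885 min, roughly four days) reflects how strongly pacing and recovery constraints dominate at the deca distance, beyond simple proportional scaling.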

  10. Caffeine ingestion enhances Wingate performance: a meta-analysis.

    PubMed

    Grgic, Jozo

    2018-03-01

    The positive effects of caffeine ingestion on aerobic performance are well-established; however, recent findings suggest that caffeine ingestion might also enhance components of anaerobic performance. A commonly used test of anaerobic performance and power output is the 30-second Wingate test. Several studies explored the effects of caffeine ingestion on Wingate performance, with equivocal findings. To elucidate this topic, this paper aims to determine the effects of caffeine ingestion on Wingate performance using meta-analytic statistical techniques. Following a search through PubMed/MEDLINE, Scopus, and SportDiscus®, 16 studies were found meeting the inclusion criteria (pooled number of participants = 246). Random-effects meta-analysis of standardized mean differences (SMD) for peak power output and mean power output was performed. Study quality was assessed using the modified version of the PEDro checklist. Results of the meta-analysis indicated a significant difference (p = .005) between the placebo and caffeine trials on mean power output with SMD values of small magnitude (0.18; 95% confidence interval: 0.05, 0.31; +3%). The meta-analysis performed for peak power output indicated a significant difference (p = .006) between the placebo and caffeine trials (SMD = 0.27; 95% confidence interval: 0.08, 0.47 [moderate magnitude]; +4%). The results from the PEDro checklist indicated that, in general, the studies are of good and excellent methodological quality. This meta-analysis adds to the current body of evidence showing that caffeine ingestion can also enhance components of anaerobic performance. The results presented herein may be helpful for developing more efficient evidence-based recommendations regarding caffeine supplementation.
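
    The effect-size unit pooled in such a meta-analysis, the standardized mean difference, can be computed for a single two-group comparison as below; the Wingate power values and sample sizes here are hypothetical, not drawn from any of the 16 included studies:

```python
import math

def pooled_sd(sd1, n1, sd2, n2):
    """Pooled standard deviation of two groups."""
    return math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

def cohens_d(mean_treat, mean_placebo, sd_pool):
    """Standardized mean difference (Cohen's d): the effect-size
    unit combined across studies in the meta-analysis."""
    return (mean_treat - mean_placebo) / sd_pool

# Hypothetical Wingate mean power (W) under caffeine vs placebo
sd_p = pooled_sd(sd1=80.0, n1=15, sd2=85.0, n2=15)
print(round(cohens_d(700.0, 685.0, sd_p), 2))
```

    A random-effects pooling step would then weight each study's SMD by the inverse of its variance plus a between-study variance term, which is what allows heterogeneous studies to be combined.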

  11. Structural performance analysis and redesign

    NASA Technical Reports Server (NTRS)

    Whetstone, W. D.

    1978-01-01

    Program performs stress, buckling, and vibrational analysis of large, linear, finite-element systems in excess of 50,000 degrees of freedom. Cost, execution time, and storage requirements are kept reasonable through use of sparse matrix solution techniques and other computational and data management procedures designed for problems of very large size.
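
    The sparse-matrix techniques credited above store only nonzero entries, so cost scales with the number of nonzeros rather than with the square of the system size. A minimal sketch of a compressed sparse row (CSR) matrix-vector product, a basic building block of such solvers; the 3x3 stiffness-like matrix is illustrative:

```python
def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x for a matrix stored in compressed sparse row (CSR)
    form: values holds the nonzeros row by row, col_idx their columns,
    and row_ptr[i]:row_ptr[i+1] spans row i's entries."""
    y = []
    for row in range(len(row_ptr) - 1):
        total = 0.0
        for k in range(row_ptr[row], row_ptr[row + 1]):
            total += values[k] * x[col_idx[k]]
        y.append(total)
    return y

# CSR storage of [[4, -1, 0], [-1, 4, -1], [0, -1, 4]]
values = [4.0, -1.0, -1.0, 4.0, -1.0, -1.0, 4.0]
col_idx = [0, 1, 0, 1, 2, 1, 2]
row_ptr = [0, 2, 5, 7]
print(csr_matvec(values, col_idx, row_ptr, [1.0, 2.0, 3.0]))
```

    For a 50,000-degree-of-freedom structural model whose stiffness matrix averages a few dozen nonzeros per row, this storage scheme reduces memory from billions of entries to a few million.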

  12. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  13. High Altitude Venus Operations Concept Trajectory Design, Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Lugo, Rafael A.; Ozoroski, Thomas A.; Van Norman, John W.; Arney, Dale C.; Dec, John A.; Jones, Christopher A.; Zumwalt, Carlie H.

    2015-01-01

    A trajectory design and analysis that describes aerocapture, entry, descent, and inflation of manned and unmanned High Altitude Venus Operation Concept (HAVOC) lighter-than-air missions is presented. Mission motivation, concept of operations, and notional entry vehicle designs are presented. The initial trajectory design space is analyzed and discussed before investigating specific trajectories that are deemed representative of a feasible Venus mission. Under the project assumptions, while the high-mass crewed mission will require further research into aerodynamic decelerator technology, it was determined that the unmanned robotic mission is feasible using current technology.

  14. Mars-GRAM 2010: Additions and Resulting Improvements

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Burns, K. Lee

    2013-01-01

    factors. The adjustment factors generated by this process had to satisfy the gas law as well as the hydrostatic relation and are expressed as a function of height (z), latitude (Lat) and areocentric solar longitude (Ls). The greatest adjustments are made at large optical depths, such as tau greater than 1. The addition of the adjustment factors has led to better correspondence to TES Limb data from 0-60 km altitude as well as better agreement with MGS, ODY and MRO data at approximately 90-130 km altitude. Improved Mars-GRAM atmospheric simulations for various locations, times and dust conditions on Mars will be presented at the workshop session. The latest results validating Mars-GRAM 2010 against Mars Climate Sounder data will also be presented. Mars-GRAM 2010 updates have resulted in improved atmospheric simulations, which will be very important when beginning systems design, performance analysis, and operations planning for future aerocapture, aerobraking, or landed missions to Mars.
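
    The two constraints mentioned above, the gas law and the hydrostatic relation, can be checked numerically for any adjusted profile. A minimal isothermal sketch with Mars-like constants; the values are illustrative assumptions, not Mars-GRAM's models:

```python
import math

# Illustrative Mars-like constants (not Mars-GRAM's values)
R = 8.314     # J/(mol K), universal gas constant
M = 0.04344   # kg/mol, CO2-dominated atmosphere
g = 3.71      # m/s^2, Mars surface gravity
T = 210.0     # K, isothermal atmosphere for simplicity

def pressure(z, p0=610.0):
    """Isothermal hydrostatic pressure: p = p0 * exp(-z/H)."""
    H = R * T / (M * g)  # scale height, ~10.8 km with these constants
    return p0 * math.exp(-z / H)

def density(z):
    """Gas law: rho = p*M/(R*T)."""
    return pressure(z) * M / (R * T)

# Finite-difference check that dp/dz ~ -rho*g at 20 km altitude
dz = 1.0
lhs = (pressure(20e3 + dz) - pressure(20e3)) / dz
rhs = -density(20e3) * g
print(abs(lhs - rhs) / abs(rhs) < 1e-3)
```

    An adjustment factor applied to density alone would break this balance; expressing the factors so that pressure, density, and temperature move consistently is what the constraint enforces.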

  15. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, Jack; Moore, Shirley; Miller, Bart; Hollingsworth, Jeffrey

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. 
The Paradyn and KOJAK
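The runtime data collection described above targets native binaries and hardware counters; as a language-level analogy only (not PAPI or Dyninst, and far simpler than either), a Python decorator can gather per-function call counts and wall time at runtime:

```python
import time
from collections import defaultdict

# Per-function call counts and cumulative wall time, gathered at runtime.
stats = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

def instrument(fn):
    """Wrap fn so that every call records a count and its elapsed wall time."""
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            rec = stats[fn.__name__]
            rec["calls"] += 1
            rec["seconds"] += time.perf_counter() - t0
    return wrapper

@instrument
def work(n):
    return sum(i * i for i in range(n))

for _ in range(3):
    work(10_000)

print(stats["work"]["calls"])  # 3
```

Real infrastructures of the kind the abstract describes do this by patching running executables and reading hardware counters, which is what makes a portable substrate hard to build.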

  16. Temporal geospatial analysis of secondary school students’ examination performance

    NASA Astrophysics Data System (ADS)

    Nik Abd Kadir, ND; Adnan, NA

    2016-06-01

    Malaysia's Ministry of Education has improved the organization of its data to create a geographical information system (GIS) school database. However, no further analysis has been done using geospatial analysis tools. Mapping has emerged as a communication tool and an effective way to publish digital and statistical data such as school performance results. The objective of this study is to analyse secondary school student performance in the science and mathematics scores of the Sijil Pelajaran Malaysia Examination results from 2010 to 2014 for Kelantan's state schools, with the aid of GIS software and geospatial analysis. School performance according to school grade point average (GPA), from Grade A to Grade G, was interpolated and mapped, and query analysis was performed using geospatial tools. This study will benefit the education sector in analysing student performance not only in Kelantan but across the whole of Malaysia, and mapping offers a good way to publish such results in support of better planning and decision making to prepare young Malaysians for the challenges of the education system.

  17. NPAC-Nozzle Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A simple and accurate nozzle performance analysis methodology has been developed. The geometry modeling requirements are minimal and very flexible, thus allowing rapid design evaluations. The solution techniques accurately couple the continuity, momentum, energy, state, and other relations, which permits fast and accurate calculation of nozzle gross thrust. The control volume and internal flow analyses are capable of accounting for the effects of over/under-expansion, flow divergence, wall friction, heat transfer, and mass addition/loss across surfaces. The results from the nozzle performance methodology are shown to be in excellent agreement with experimental data for a variety of nozzle designs over a range of operating conditions.
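The gross-thrust bookkeeping such an analysis rests on can be sketched with the standard one-dimensional relation, momentum flux plus a pressure-area term (illustrative numbers only, not values from the NPAC report):

```python
def gross_thrust(m_dot, v_exit, p_exit, p_amb, a_exit):
    """Gross thrust (N): exit momentum flux plus pressure-area term, SI units."""
    return m_dot * v_exit + (p_exit - p_amb) * a_exit

# Hypothetical perfectly expanded nozzle: exit pressure equals ambient.
F = gross_thrust(m_dot=50.0, v_exit=600.0,
                 p_exit=101325.0, p_amb=101325.0, a_exit=0.5)
print(F)  # 30000.0
```

At perfect expansion the pressure-area term vanishes; over- or under-expansion makes it negative or positive, which is one of the effects the abstract says the method accounts for.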

  18. Automated Cache Performance Analysis And Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohror, Kathryn

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on the infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS

  19. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A. (Technical Monitor); Jost, G.; Jin, H.; Labarta, J.; Gimenez, J.; Caubet, J.

    2003-01-01

    Parallel programming paradigms include process-level parallelism, thread-level parallelism, and multilevel parallelism. This viewgraph presentation describes a detailed performance analysis of these paradigms for Shared Memory Architectures (SMA). The analysis uses the Paraver Performance Analysis System. The presentation includes diagrams of a flow of useful computations.

  20. Atmospheric Entry Studies for Uranus

    NASA Astrophysics Data System (ADS)

    Agrawal, P.; Allen, G. A.; Hwang, H. H.; Marley, M. S.; McGuire, M. K.; Garcia, J. A.; Sklyanskiy, E.; Huynh, L. C.; Moses, R. W.

    2014-06-01

    To better understand the technology requirements for a Uranus atmospheric entry probe, an internal NASA study funded by the ISPT program was conducted. The talk describes two different approaches to the planet: 1) direct ballistic entry and 2) aerocapture.

  1. What Do HPT Consultants Do for Performance Analysis?

    ERIC Educational Resources Information Center

    Kang, Sung

    2017-01-01

    This study was conducted to contribute to the field of Human Performance Technology (HPT) through the validation of the performance analysis process of the International Society for Performance Improvement (ISPI) HPT model, the most representative and frequently utilized process model in the HPT field. The study was conducted using content…

  2. Performance analysis and prediction in triathlon.

    PubMed

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.
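As a toy illustration of relating component performance to overall performance (synthetic split times, not the study's data or its machine learning methods), a simple correlation of each leg against the total already shows which component tracks the final result most closely:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Synthetic split times (minutes) for five athletes.
swim = [18.0, 19.5, 18.5, 20.0, 19.0]
bike = [55.0, 57.0, 54.0, 58.0, 56.0]
run  = [31.0, 35.0, 30.0, 36.0, 33.0]
total = [s + b + r for s, b, r in zip(swim, bike, run)]

for name, leg in [("swim", swim), ("bike", bike), ("run", run)]:
    print(name, round(pearson(leg, total), 3))
```

In this made-up data the run leg has the largest spread and therefore the strongest correlation with the total; the study's clustering and pattern analysis goes well beyond this, but the underlying question is the same.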

  3. The value of job analysis, job description and performance.

    PubMed

    Wolfe, M N; Coggins, S

    1997-01-01

    All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.

  4. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
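A minimal Monte Carlo sketch of probabilistically evaluating a cycle efficiency (using the ideal Brayton relation and an assumed input distribution, not the paper's engine model or random variables):

```python
import random
import statistics

def brayton_efficiency(pressure_ratio, gamma=1.4):
    """Ideal Brayton-cycle thermal efficiency."""
    return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

random.seed(1)
# Treat the compressor pressure ratio as uncertain: mean 15, sigma 0.45 (assumed).
samples = [brayton_efficiency(random.gauss(15.0, 0.45)) for _ in range(10_000)]

mean_eta = statistics.mean(samples)
p95 = sorted(samples)[int(0.95 * len(samples))]  # one point on the empirical CDF
print(round(mean_eta, 4), round(p95, 4))
```

Sorting the samples gives the empirical cumulative distribution function the abstract refers to; sensitivity factors would come from repeating this while perturbing one input at a time.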

  5. NASA In-Space Propulsion Technology Program: Overview and Update

    NASA Technical Reports Server (NTRS)

    Johnson, Les; Alexander, Leslie; Baggett, Randy M.; Bonometti, Joseph A.; Herrmann, Melody; James, Bonnie F.; Montgomery, Sandy E.

    2004-01-01

    NASA's In-Space Propulsion Technology Program is investing in technologies that have the potential to revolutionize the robotic exploration of deep space. For robotic exploration and science missions, increased efficiencies of future propulsion systems are critical to reduce overall life-cycle costs and, in some cases, enable missions previously considered impossible. Continued reliance on conventional chemical propulsion alone will not enable the robust exploration of deep space - the maximum theoretical efficiencies have almost been reached and they are insufficient to meet needs for many ambitious science missions currently being considered. The In-Space Propulsion Technology Program's technology portfolio includes many advanced propulsion systems. From the next-generation ion propulsion system operating in the 5- to 10-kW range to aerocapture and solar sails, substantial advances in spacecraft propulsion performance are anticipated. Some of the most promising technologies for achieving these goals use the environment of space itself for energy and propulsion and are generically called 'propellantless' because they do not require onboard fuel to achieve thrust. Propellantless propulsion technologies include scientific innovations such as solar sails, electrodynamic and momentum transfer tethers, aeroassist, and aerocapture. This paper will provide an overview of both propellantless and propellant-based advanced propulsion technologies, as well as NASA's plans for advancing them as part of the In-Space Propulsion Technology Program.

  6. NASA's In-Space Propulsion Technology Program: Overview and Update

    NASA Technical Reports Server (NTRS)

    Johnson, Les; Alexander, Leslie; Baggett, Randy M.; Bonometti, Joseph A.; Herrmann, Melody; James, Bonnie F.; Montgomery, Sandy E.

    2004-01-01

    NASA's In-Space Propulsion Technology Program is investing in technologies that have the potential to revolutionize the robotic exploration of deep space. For robotic exploration and science missions, increased efficiencies of future propulsion systems are critical to reduce overall life-cycle costs and, in some cases, enable missions previously considered impossible. Continued reliance on conventional chemical propulsion alone will not enable the robust exploration of deep space - the maximum theoretical efficiencies have almost been reached and they are insufficient to meet needs for many ambitious science missions currently being considered. The In-Space Propulsion Technology Program's technology portfolio includes many advanced propulsion systems. From the next-generation ion propulsion system operating in the 5- to 10-kW range to aerocapture and solar sails, substantial advances in spacecraft propulsion performance are anticipated. Some of the most promising technologies for achieving these goals use the environment of space itself for energy and propulsion and are generically called 'propellantless' because they do not require onboard fuel to achieve thrust. Propellantless propulsion technologies include scientific innovations such as solar sails, electrodynamic and momentum transfer tethers, aeroassist, and aerocapture. This paper will provide an overview of both propellantless and propellant-based advanced propulsion technologies, as well as NASA's plans for advancing them as part of the In-Space Propulsion Technology Program.

  7. Performance Analysis of HF Band FB-MC-SS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hussein Moradi; Stephen Andrew Laraway; Behrouz Farhang-Boroujeny

    Abstract—In a recent paper [1] the filter bank multicarrier spread spectrum (FB-MC-SS) waveform was proposed for wideband spread spectrum HF communications. A significant benefit of this waveform is robustness against narrow and partial band interference. Simulation results in [1] demonstrated good performance in a wideband HF channel over a wide range of conditions. In this paper we present a theoretical analysis of the bit error probability for this system. Our analysis tailors the results from [2], where BER performance was analyzed for maximum ratio combining systems, accounting for correlation between subcarriers and channel estimation error. Equations are given for BER that closely match the simulated performance in most situations.

  8. Intelligent Performance Analysis with a Natural Language Interface

    NASA Astrophysics Data System (ADS)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented in the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are directly used in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into the variable-specific meanings and the directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.
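The scaling idea above maps raw measurements onto a common index range so that stress, condition and management indicators become directly comparable. A simple piecewise-linear stand-in for such a scaling function (the corner points below are invented, and the actual method uses generalized norms rather than fixed break points) might look like:

```python
import bisect

def scaled_index(value, corners):
    """Map a measurement to [-2, 2] by piecewise-linear interpolation
    between five corner points (very low ... very high levels)."""
    levels = [-2.0, -1.0, 0.0, 1.0, 2.0]
    if value <= corners[0]:
        return -2.0
    if value >= corners[-1]:
        return 2.0
    i = bisect.bisect_right(corners, value) - 1
    frac = (value - corners[i]) / (corners[i + 1] - corners[i])
    return levels[i] + frac * (levels[i + 1] - levels[i])

# Hypothetical vibration corner points (mm/s), low to high:
corners = [0.5, 1.5, 3.0, 6.0, 12.0]
print(scaled_index(3.0, corners))  # 0.0
print(scaled_index(9.0, corners))  # 1.5
```

Once every variable lives on the same [-2, 2] scale, fluctuations and trends can be computed uniformly, which is the property the abstract exploits.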

  9. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  10. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
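The core computation in such an analysis is the combination of component uncertainties. A minimal root-sum-square sketch for a PV power measurement, with all numbers hypothetical:

```python
from math import sqrt

def combined_uncertainty(sensitivities, uncertainties):
    """Root-sum-square combination of sensitivity-weighted standard uncertainties."""
    return sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, uncertainties)))

# PV module power P = V * I, so the sensitivities are dP/dV = I and dP/dI = V.
V, I = 30.0, 5.0          # measured voltage (V) and current (A)
u_V, u_I = 0.1, 0.02      # standard uncertainties of each measurement
u_P = combined_uncertainty([I, V], [u_V, u_I])
print(round(u_P, 3))  # 0.781
```

The interval "measured value ± k * u_P" (with a coverage factor k) is exactly the kind of interval about the result that the abstract describes.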

  11. Performance analysis of LAN bridges and routers

    NASA Technical Reports Server (NTRS)

    Hajare, Ankur R.

    1991-01-01

    Bridges and routers are used to interconnect Local Area Networks (LANs). The performance of these devices is important since they can become bottlenecks in large multi-segment networks. Performance metrics and test methodologies for bridges and routers have not been standardized. Performance data reported by vendors are not applicable to the actual scenarios encountered in an operational network. However, vendor-provided data can be used to calibrate models of bridges and routers that, along with other models, yield performance data for a network. Several tools are available for modeling bridges and routers; Network II.5 was used in this study. The results of the analysis of some bridges and routers are presented.
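Device models of this kind are usually queueing models. A minimal single-queue (M/M/1) sketch of a forwarding device, with hypothetical rates (this says nothing about Network II.5's internals, which are more detailed), shows the kind of latency estimate such models produce:

```python
def mm1_delay(arrival_rate, service_rate):
    """Mean time a frame spends in an M/M/1 forwarding device (queue + service)."""
    if arrival_rate >= service_rate:
        raise ValueError("device saturated: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# Hypothetical bridge that can forward 1000 frames/s, offered 500 frames/s:
print(mm1_delay(500.0, 1000.0))  # 0.002 (seconds per frame)
```

As the offered load approaches the device's service rate the delay grows without bound, which is precisely how a bridge or router becomes the bottleneck of a multi-segment network.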

  12. Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance

    DTIC Science & Technology

    2003-07-21

    Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance. Vincent A. Cicirello, CMU-RI-TR-03-27. Submitted in partial fulfillment... lead to the development of a search control framework, called QD-BEACON, that uses online-generated statistical models of search performance to

  13. Engineering-Level Model Atmospheres for Titan and Neptune

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta; Johnson, D. L.

    2003-01-01

    Engineering-level atmospheric models for Titan and Neptune have been developed for use in NASA's systems analysis studies of aerocapture applications in missions to the outer planets. Analogous to the highly successful Global Reference Atmospheric Models for Earth (GRAM, Justus et al., 2000) and Mars (Mars-GRAM, Justus and Johnson, 2001; Justus et al., 2002), the new models are called Titan-GRAM and Neptune-GRAM. Like GRAM and Mars-GRAM, an important feature of Titan-GRAM and Neptune-GRAM is their ability to simulate quasi-random perturbations for Monte Carlo analyses in developing guidance, navigation and control algorithms, and for thermal systems design.
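The GRAM models' actual perturbation schemes are more elaborate, but the quasi-random perturbations mentioned above can be sketched generically as a first-order autocorrelated (AR(1)) fractional disturbance applied to a mean density profile, with illustrative parameters:

```python
import random

def perturbed_density(mean_profile, sigma=0.05, corr=0.9, seed=0):
    """Mean density profile multiplied by an AR(1) fractional perturbation.

    sigma is the stationary 1-sigma fractional perturbation; corr controls how
    strongly one altitude step's perturbation carries over to the next.
    Both values are illustrative, not GRAM parameters.
    """
    rng = random.Random(seed)
    pert = 0.0
    out = []
    for rho in mean_profile:
        pert = corr * pert + rng.gauss(0.0, sigma * (1.0 - corr ** 2) ** 0.5)
        out.append(rho * (1.0 + pert))
    return out

# Toy density profile (kg/m^3) at ten descending altitude steps:
profile = [1.0e-7 * 2 ** i for i in range(10)]
sample = perturbed_density(profile)
print(len(sample))  # 10
```

Drawing many such profiles with different seeds is what lets a Monte Carlo trajectory simulation exercise guidance and thermal designs against atmospheric variability.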

  14. Multiprocessor smalltalk: Implementation, performance, and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pallas, J.I.

    1990-01-01

    Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.

  15. How to Perform an Ethical Risk Analysis (eRA).

    PubMed

    Hansson, Sven Ove

    2018-02-26

    Ethical analysis is often needed in the preparation of policy decisions on risk. A three-step method is proposed for performing an ethical risk analysis (eRA). In the first step, the people concerned are identified and categorized in terms of the distinct but compatible roles of being risk-exposed, a beneficiary, or a decisionmaker. In the second step, a more detailed classification of roles and role combinations is performed, and ethically problematic role combinations are identified. In the third step, further ethical deliberation takes place, with an emphasis on individual risk-benefit weighing, distributional analysis, rights analysis, and power analysis. Ethical issues pertaining to subsidiary risk roles, such as those of experts and journalists, are also treated in this phase. An eRA should supplement, not replace, a traditional risk analysis that puts emphasis on the probabilities and severities of undesirable events but does not cover ethical issues such as agency, interpersonal relationships, and justice. © 2018 Society for Risk Analysis.

  16. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  17. The development of a reliable amateur boxing performance analysis template.

    PubMed

    Thomson, Edward; Lamb, Kevin; Nicholas, Ceri

    2013-01-01

    The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.
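The proportion-of-agreement statistic used in the reliability assessment above can be sketched directly (the tallies below are hypothetical, not data from the study):

```python
def proportion_agreement(obs_a, obs_b, tolerance=1):
    """Share of paired tallies that differ by no more than `tolerance`."""
    hits = sum(abs(a - b) <= tolerance for a, b in zip(obs_a, obs_b))
    return hits / len(obs_a)

# Hypothetical punch counts per round from two analysts:
analyst_1 = [12, 8, 15, 9, 11]
analyst_2 = [13, 8, 18, 9, 10]
print(proportion_agreement(analyst_1, analyst_2))  # 0.8
```

With tolerance=0 this is exact agreement; the study's "reference value of ±1" corresponds to tolerance=1, which is why its inter-observer agreement reaches 100% within that band even when exact agreement is lower.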

  18. Performance criteria for emergency medicine residents: a job analysis.

    PubMed

    Blouin, Danielle; Dagnone, Jeffrey Damon

    2008-11-01

    A major role of admission interviews is to assess a candidate's suitability for a residency program. Structured interviews have greater reliability and validity than do unstructured ones. The development of content for a structured interview is typically based on the dimensions of performance that are perceived as important to succeed in a particular line of work. A formal job analysis is normally conducted to determine these dimensions. The dimensions essential to succeed as an emergency medicine (EM) resident have not yet been studied. We aimed to analyze the work of EM residents to determine these essential dimensions. The "critical incident technique" was used to generate scenarios of poor and excellent resident performance. Two reviewers independently read each scenario and labelled the performance dimensions that were reflected in each. All labels assigned to a particular scenario were pooled and reviewed again until a consensus was reached. Five faculty members (25% of our total faculty) comprised the subject experts. Fifty-one incidents were generated and 50 different labels were applied. Eleven dimensions of performance applied to at least 5 incidents. "Professionalism" was the most valued performance dimension, represented in 56% of the incidents, followed by "self-confidence" (22%), "experience" (20%) and "knowledge" (20%). "Professionalism," "self-confidence," "experience" and "knowledge" were identified as the performance dimensions essential to succeed as an EM resident based on our formal job analysis using the critical incident technique. Performing a formal job analysis may assist training program directors with developing admission interviews.

  19. Using Importance-Performance Analysis To Evaluate Teaching Effectiveness.

    ERIC Educational Resources Information Center

    Attarian, Aram

    This paper introduces Importance-Performance (IP) analysis as a method to evaluate teaching effectiveness in a university outdoor program. Originally developed for use in the field of marketing, IP analysis is simple and easy to administer, and provides the instructor with a visual representation of what teaching attributes are important, how…

  20. RTOD- RADIAL TURBINE OFF-DESIGN PERFORMANCE ANALYSIS

    NASA Technical Reports Server (NTRS)

    Glassman, A. J.

    1994-01-01

    The RTOD program was developed to accurately predict radial turbine off-design performance. The radial turbine has been used extensively in automotive turbochargers and aircraft auxiliary power units. It is now being given serious consideration for primary powerplant applications. In applications where the turbine will operate over a wide range of power settings, accurate off-design performance prediction is essential for a successful design. RTOD predictions have already illustrated a potential improvement in off-design performance offered by rotor back-sweep for high-work-factor radial turbines. RTOD can be used to analyze other potential performance enhancing design features. RTOD predicts the performance of a radial turbine (with or without rotor blade sweep) as a function of pressure ratio, speed, and stator setting. The program models the flow with the following: 1) stator viscous and trailing edge losses; 2) a vaneless space loss between the stator and the rotor; and 3) rotor incidence, viscous, trailing-edge, clearance, and disk friction losses. The stator and rotor viscous losses each represent the combined effects of profile, endwall, and secondary flow losses. The stator inlet and exit and the rotor inlet flows are modeled by a mean-line analysis, but a sector analysis is used at the rotor exit. The leakage flow through the clearance gap in a pivoting stator is also considered. User input includes gas properties, turbine geometry, and the stator and rotor viscous losses at a reference performance point. RTOD output includes predicted turbine performance over a specified operating range and any user selected flow parameters. The RTOD program is written in FORTRAN IV for batch execution and has been implemented on an IBM 370 series computer with a central memory requirement of approximately 100K of 8 bit bytes. The RTOD program was developed in 1983.

  1. Performance analysis of the ascent propulsion system of the Apollo spacecraft

    NASA Technical Reports Server (NTRS)

    Hooper, J. C., III

    1973-01-01

    Activities involved in the performance analysis of the Apollo lunar module ascent propulsion system are discussed. A description of the ascent propulsion system, including hardware, instrumentation, and system characteristics, is included. The methods used to predict the inflight performance and to establish performance uncertainties of the ascent propulsion system are discussed. The techniques of processing the telemetered flight data and performing postflight performance reconstruction to determine actual inflight performance are discussed. Problems that have been encountered and results from the analysis of the ascent propulsion system performance during the Apollo 9, 10, and 11 missions are presented.

  2. Dynamic performances analysis of a real vehicle driving

    NASA Astrophysics Data System (ADS)

    Abdullah, M. A.; Jamil, J. F.; Salim, M. A.

    2015-12-01

    Vehicle dynamics concerns the effects of vehicle motion generated by acceleration, braking, ride and handling activities. The dynamic behaviors are determined by the forces from the tires, gravity and aerodynamics acting on the vehicle. This paper emphasizes the analysis of the dynamic performance of a real vehicle. A real driving experiment is conducted to determine the vehicle's roll, pitch and yaw responses along with its longitudinal, lateral and vertical accelerations. The experiment uses an accelerometer to record the vehicle's dynamic performance while it is driven on the road. The experiment starts by weighing the car to locate the center of gravity (COG), where the accelerometer sensor is placed for data acquisition (DAQ). The COG of the vehicle is determined from the vehicle's weight distribution. A rural route is set for the experiment and the road conditions along the route are recorded for the test. The dynamic performance of the vehicle depends on the road conditions and the driving maneuvers, and the stability of a vehicle can be controlled with the aid of dynamic performance analysis.
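    The COG-from-weight step can be illustrated with a static moment balance on the axle-scale readings. This is a hypothetical sketch: the wheelbase and axle weights below are invented, as the abstract does not publish the actual figures.

```python
# Longitudinal center of gravity (COG) from static wheel-scale readings,
# as done before mounting the accelerometer. Example numbers are made up.
def longitudinal_cog(front_axle_kg, rear_axle_kg, wheelbase_m):
    """Distance of the COG behind the front axle, from a moment balance."""
    total = front_axle_kg + rear_axle_kg
    return rear_axle_kg / total * wheelbase_m

x_cog = longitudinal_cog(front_axle_kg=680.0, rear_axle_kg=520.0,
                         wheelbase_m=2.55)
print(f"COG is {x_cog:.2f} m behind the front axle")
```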

  3. Performance Test Data Analysis of Scintillation Cameras

    NASA Astrophysics Data System (ADS)

    Demirkaya, Omer; Mazrou, Refaat Al

    2007-10-01

    In this paper, we present a set of image analysis tools to calculate the performance parameters of gamma camera systems from test data acquired according to the National Electrical Manufacturers Association (NEMA) NU 1-2001 guidelines. The calculation methods are either completely automated or require minimal user interaction, minimizing potential human error. The developed methods are robust with respect to the varying conditions under which these tests may be performed. The core algorithms have been validated for accuracy and extensively tested on images acquired by gamma cameras from different vendors. All the algorithms are incorporated into a graphical user interface that provides a convenient way to process the data and report the results. The entire application has been developed in the MATLAB programming environment and is compiled to run as a stand-alone program. The developed image analysis tools provide an automated, convenient and accurate means to calculate the performance parameters of gamma cameras and SPECT systems. The application is available upon request for personal or non-commercial use. The results of this study have been partially presented at the Society of Nuclear Medicine Annual Meeting as an InfoSNM presentation.
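    One of the basic operations behind NEMA NU 1 resolution measurements is extracting the full width at half maximum (FWHM) of a line-spread profile. The sketch below is illustrative, not the authors' MATLAB code; the profile is synthetic, whereas a real analysis works on acquired slit images.

```python
# FWHM of a line-spread profile by linear interpolation between samples.
# The profile below is synthetic, standing in for a measured slit response.
def fwhm(profile, pixel_mm):
    peak = max(profile)
    half = peak / 2.0

    def crossing(indices):
        # first sign change of (profile - half) along the given index order
        for i in indices:
            a, b = profile[i], profile[i + 1]
            if (a - half) * (b - half) <= 0 and a != b:
                return i + (half - a) / (b - a)
        raise ValueError("no half-maximum crossing found")

    left = crossing(range(len(profile) - 1))            # scan from the left
    right = crossing(range(len(profile) - 2, -1, -1))   # scan from the right
    return (right - left) * pixel_mm

profile = [0, 1, 5, 20, 60, 100, 60, 20, 5, 1, 0]
print(f"FWHM = {fwhm(profile, pixel_mm=1.0):.2f} mm")
```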

  4. Performance management in healthcare: a critical analysis.

    PubMed

    Hewko, Sarah J; Cummings, Greta G

    2016-01-01

    Purpose - The purpose of this paper is to explore the underlying theoretical assumptions and implications of current micro-level performance management and evaluation (PME) practices, specifically within health-care organizations. PME encompasses all activities that are designed and conducted to align employee outputs with organizational goals. Design/methodology/approach - PME, in the context of healthcare, is analyzed through the lens of critical theory. Specifically, Habermas' theory of communicative action is used to highlight some of the questions that arise in looking critically at PME. To provide a richer definition of key theoretical concepts, the authors conducted a preliminary, exploratory hermeneutic semantic analysis of the key words "performance" and "management" and of the term "performance management". Findings - Analysis reveals that existing micro-level PME systems in health-care organizations have the potential to create a workforce that is compliant, dependent, technically oriented and passive, and to support health-care systems in which inequalities and power imbalances are perpetually reinforced. Practical implications - At a time when the health-care system is under increasing pressure to provide high-quality, affordable services with fewer resources, it may be wise to investigate new sector-specific ways of evaluating and managing performance. Originality/value - In this paper, written for health-care leaders and health human resource specialists, the theoretical assumptions and implications of current PME practices within health-care organizations are explored. It is hoped that readers will be inspired to support innovative PME practices within their organizations that encourage peak performance among health-care professionals.

  5. Performance Indicators in Math: Implications for Brief Experimental Analysis of Academic Performance

    ERIC Educational Resources Information Center

    VanDerheyden, Amanda M.; Burns, Matthew K.

    2009-01-01

    Brief experimental analysis (BEA) can be used to specify intervention characteristics that produce positive learning gains for individual students. A key challenge to the use of BEA for intervention planning is the identification of performance indicators (including topography of the skill, measurement characteristics, and decision criteria) that…

  6. SUBSONIC WIND TUNNEL PERFORMANCE ANALYSIS SOFTWARE

    NASA Technical Reports Server (NTRS)

    Eckert, W. T.

    1994-01-01

    This program was developed as an aid in the design and analysis of subsonic wind tunnels. It brings together and refines previously scattered and over-simplified techniques used for the design and loss prediction of the components of subsonic wind tunnels. It implements a system of equations for determining the total pressure losses and provides general guidelines for the design of diffusers, contractions, corners and the inlets and exits of non-return tunnels. The algorithms used in the program are applicable to compressible flow through most closed- or open-throated, single-, double- or non-return wind tunnels or ducts. A comparison between calculated performance and that actually achieved by several existing facilities produced generally good agreement. Any system through which air is flowing which involves turns, fans, contractions etc. (e.g., an HVAC system) may benefit from analysis using this software. This program is an update of ARC-11138 which includes PC compatibility and an improved user interface. The method of loss analysis used by the program is a synthesis of theoretical and empirical techniques. Generally, the algorithms used are those which have been substantiated by experimental test. The basic flow-state parameters used by the program are determined from input information about the reference control section and the test section. These parameters were derived from standard relationships for compressible flow. The local flow conditions, including Mach number, Reynolds number and friction coefficient are determined for each end of each component or section. The loss in total pressure caused by each section is calculated in a form non-dimensionalized by local dynamic pressure. The individual losses are based on the nature of the section, local flow conditions and input geometry and parameter information. The loss forms for typical wind tunnel sections considered by the program include: constant area ducts, open throat ducts, contractions, constant
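    The section-by-section bookkeeping the abstract describes, where each section's total-pressure loss is non-dimensionalized by its local dynamic pressure and the results are referenced to the test section, can be sketched as follows. The section list and every number here are illustrative assumptions, not output of the program.

```python
# Sum per-section loss coefficients K = dp0 / q_local and reference the
# total to the test-section dynamic pressure. Numbers are invented.
def tunnel_loss(sections, q_test):
    """sections: list of (K_local, q_local) pairs; returns total dp0/q_test."""
    return sum(k * q_local for k, q_local in sections) / q_test

sections = [
    (0.02, 600.0),   # test section (constant-area duct)
    (0.15, 300.0),   # diffuser
    (0.20, 80.0),    # corner with turning vanes
    (0.04, 50.0),    # contraction, referenced to its inlet q (assumed)
]
print(f"total loss coefficient: {tunnel_loss(sections, q_test=600.0):.4f}")
```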

  7. Scalable Performance Measurement and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
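    The statistical-sampling idea, estimating a system-wide quantity from traces of a small random subset of ranks rather than all of them, can be sketched in a few lines. The synthetic load values and the sample size below are assumptions for illustration, not Libra's actual data or parameters.

```python
# Estimate a mean per-process metric from a random subset of ranks.
import random
import statistics

random.seed(7)
n_procs = 100_000
# synthetic per-process load values standing in for measured data
load = [50.0 + (i % 97) * 0.1 for i in range(n_procs)]

sample = random.sample(range(n_procs), k=512)   # trace only 512 ranks
est = statistics.mean(load[i] for i in sample)
true = statistics.mean(load)
print(f"estimated mean load {est:.2f} vs true {true:.2f}")
```

    The attraction is that the sampling error shrinks with the sample size, not the machine size, so the trace volume stays roughly constant as the job scales.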

  8. Phobos/Deimos Sample Return via Solar Sail

    NASA Technical Reports Server (NTRS)

    Matloff, Gregory L.; Taylor, Travis; Powell, Conley; Moton, Tryshanda

    2004-01-01

    A sample-return mission to the Martian satellites using a contemporary solar sail for all post-Earth-escape propulsion is proposed. The 0.015 kg/sq m areal mass-thickness sail unfurls after launch and injection onto a Mars-bound Hohmann-transfer ellipse. Structure and payload increase the spacecraft areal mass thickness to 0.028 kg/sq m. During Mars encounter, the sail functions parachute-like in Mars's outer atmosphere to accomplish aerocapture. On-board thrusters or the sail maneuver the spacecraft into an orbit with periapsis near Mars and apoapsis near Phobos. The orbit is circularized for Phobos rendezvous, and surface samples are collected. The sail then raises the orbit for Deimos rendezvous and sample collection. The sail next places the spacecraft on an Earth-bound Hohmann-transfer ellipse. During Earth encounter, the sail accomplishes Earth aerocapture or partially decelerates the sample container for entry into Earth's atmosphere. Mission mass budget is about 218 grams, and mission duration is <5 years.

  9. Using Ratio Analysis to Evaluate Financial Performance.

    ERIC Educational Resources Information Center

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can aid long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)
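    Two of the ratio families mentioned can be computed directly from financial statement line items. The figures below are invented for illustration; the definitions are the conventional ones, not formulas taken from this article.

```python
# Illustrative financial ratios: a balance sheet (current) ratio and a
# net operating ratio. All dollar figures are made-up examples.
def current_ratio(current_assets, current_liabilities):
    return current_assets / current_liabilities

def net_operating_ratio(operating_revenue, operating_expense):
    return (operating_revenue - operating_expense) / operating_revenue

cr = current_ratio(4_200_000, 1_750_000)
nor = net_operating_ratio(9_800_000, 9_400_000)
print(f"current ratio: {cr:.2f}, net operating ratio: {nor:.3f}")
```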

  10. Advanced orbit transfer vehicle propulsion system study

    NASA Technical Reports Server (NTRS)

    Cathcart, J. A.; Cooper, T. W.; Corringrato, R. M.; Cronau, S. T.; Forgie, S. C.; Harder, M. J.; Mcallister, J. G.; Rudman, T. J.; Stoneback, V. W.

    1985-01-01

    A reusable orbit transfer vehicle concept was defined and subsequent recommendations for the design criteria of an advanced LO2/LH2 engine were presented. The major characteristics of the vehicle preliminary design include a low lift-to-drag aerocapture capability; main propulsion system failure criteria of fail-operational/fail-safe; and either two main engines with an attitude control system for backup or three main engines to meet the failure criteria. A maintenance and servicing approach was also established for the advanced vehicle and engine concepts. Design tradeoff study conclusions were based on consideration of reliability, performance, life cycle costs, and mission flexibility.

  11. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  12. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  13. Design and performance analysis of gas and liquid radial turbines

    NASA Astrophysics Data System (ADS)

    Tan, Xu

    In the first part of the research, pumps running in reverse as turbines are studied. This work uses experimental data from a wide range of pumps representing centrifugal pump configurations in terms of specific speed. Based on specific speed and specific diameter, an accurate correlation is developed to predict the performance at the best efficiency point of a centrifugal pump in its turbine-mode operation. The proposed prediction method yields very good results compared to previous attempts: against nine earlier methods found in the literature, the comparison shows that the method proposed here is the most accurate. The method can be further refined and supplemented by future tests to increase its accuracy, and it is meaningful because it is based on both specific speed and specific diameter. The second part of the research focuses on the design and analysis of a radial gas turbine. The turbine specification is obtained from a solar biogas hybrid system, which is theoretically analyzed and constructed around a purchased compressor. Theoretical analysis results in a specification of 100 lb/min mass flow, 900ºC inlet total temperature, and 1.575 atm inlet total pressure. The 1-D and 3-D geometry of the rotor is generated based on Aungier's method. A 1-D loss model analysis and 3-D CFD simulations are performed to examine the performance of the rotor; its total-to-total efficiency exceeds 90%. With the help of CFD analysis, modifications to the preliminary design yielded optimized aerodynamic performance. Finally, a theoretical performance analysis of the hybrid system is performed with the designed turbine.
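    The two similarity parameters the correlation is built on can be computed from an operating point. The sketch below assumes the common dimensionless definitions (angular speed in rad/s, flow in m³/s, head as specific energy gH in J/kg); the operating point itself is invented, not taken from the thesis.

```python
# Dimensionless specific speed and specific diameter for a pump/turbine.
import math

def specific_speed(omega_rad_s, q_m3_s, gh_j_kg):
    return omega_rad_s * math.sqrt(q_m3_s) / gh_j_kg ** 0.75

def specific_diameter(d_m, q_m3_s, gh_j_kg):
    return d_m * gh_j_kg ** 0.25 / math.sqrt(q_m3_s)

omega = 1450 * 2 * math.pi / 60          # 1450 rpm in rad/s
q, head_m, d = 0.05, 20.0, 0.25          # flow, head, impeller diameter
gh = 9.81 * head_m                       # head as specific energy, J/kg
ns = specific_speed(omega, q, gh)
ds = specific_diameter(d, q, gh)
print(f"Ns = {ns:.3f}, Ds = {ds:.3f}")
```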

  14. Radio-science performance analysis software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
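    The frequency-stability measure STBLTY reports, the Allan deviation, has a simple non-overlapping estimator. The sketch below is a generic illustration of that estimator, not STBLTY code; the synthetic white-noise samples stand in for USO-referenced fractional frequency residuals.

```python
# Non-overlapping Allan deviation from uniformly spaced fractional
# frequency samples y. Data is synthetic white frequency noise.
import math
import random

def allan_deviation(y, m=1):
    """y: fractional frequency samples; m: averaging factor (tau = m*tau0)."""
    groups = [sum(y[i:i + m]) / m for i in range(0, len(y) - m + 1, m)]
    diffs = [(groups[i + 1] - groups[i]) ** 2 for i in range(len(groups) - 1)]
    return math.sqrt(sum(diffs) / (2 * len(diffs)))

random.seed(1)
y = [random.gauss(0.0, 1e-12) for _ in range(1000)]  # white FM, 1e-12 level
print(f"sigma_y(tau0) ~ {allan_deviation(y):.2e}")
```

    For white frequency noise the Allan deviation at the basic sampling interval equals the sample standard deviation, which the output reflects.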

  15. Radio-Science Performance Analysis Software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1994-10-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussion on operating the program set on Galileo and Ulysses data will be presented.

  16. Radio-science performance analysis software

    NASA Technical Reports Server (NTRS)

    Morabito, D. D.; Asmar, S. W.

    1995-01-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.

  17. Statistical analysis in MSW collection performance assessment.

    PubMed

    Teixeira, Carlos Afonso; Avelino, Catarina; Ferreira, Fátima; Bentes, Isabel

    2014-09-01

    The increase in Municipal Solid Waste (MSW) generated over recent years forces waste managers to pursue more effective collection schemes that are technically viable, environmentally effective and economically sustainable. The assessment of MSW services using performance indicators plays a crucial role in improving service quality. In this work, we focus on the relevance of regular system monitoring as a service assessment tool. In particular, we select and test a core set of MSW collection performance indicators (effective collection distance, effective collection time and effective fuel consumption) that highlights collection system strengths and weaknesses and supports pro-active management decision-making and strategic planning. A statistical analysis was conducted with data collected in the mixed collection system of Oporto Municipality, Portugal, during one year, one week per month. This analysis provides an operational assessment of the collection circuits and supports effective short-term municipal collection strategies at the level of, e.g., collection frequency, timetables, and type of containers.
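    Indicators of the kind tested here are typically normalized per tonne of waste collected. The sketch below shows that normalization on fabricated circuit records; the field names and numbers are invented, not the Oporto data.

```python
# Per-tonne collection indicators from per-circuit records (fabricated data).
def per_tonne(circuits):
    km = sum(c["km"] for c in circuits)
    hours = sum(c["hours"] for c in circuits)
    litres = sum(c["fuel_l"] for c in circuits)
    tonnes = sum(c["tonnes"] for c in circuits)
    return {"km/t": km / tonnes, "h/t": hours / tonnes, "L/t": litres / tonnes}

circuits = [
    {"km": 38.0, "hours": 6.5, "fuel_l": 21.0, "tonnes": 8.2},
    {"km": 45.5, "hours": 7.0, "fuel_l": 26.5, "tonnes": 9.6},
]
indicators = per_tonne(circuits)
print(indicators)
```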

  18. An Analysis of Effects of Variable Factors on Weapon Performance

    DTIC Science & Technology

    1993-03-01

    ALTERNATIVE ANALYSIS A. CATEGORICAL DATA ANALYSIS Statistical methodology for categorical data analysis traces its roots to the work of Francis Galton in the...choice of statistical tests. This thesis examines an analysis performed by the Surface Warfare Development Group (SWDG). The SWDG analysis is shown to be...incorrect due to the misapplication of testing methods. A corrected analysis is presented and recommendations suggested for changes to the testing

  19. Statistical analysis of RHIC beam position monitors performance

    NASA Astrophysics Data System (ADS)

    Calaga, R.; Tomás, R.

    2004-04-01

    A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
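    The SVD-based identification idea can be sketched on synthetic data: in a matrix of turn-by-turn readings, a malfunctioning BPM shows up as a spatial singular vector concentrated on a single monitor. Everything below, including the fault index and noise levels, is invented for illustration and is not the RHIC analysis itself.

```python
# Flag a faulty BPM via SVD of a (turns x BPMs) orbit matrix.
import numpy as np

rng = np.random.default_rng(0)
n_turns, n_bpms = 2000, 160
phase = rng.uniform(0, 2 * np.pi, n_bpms)
turns = np.arange(n_turns)[:, None]
# coherent betatron-like oscillation seen by every BPM, plus sensor noise
data = np.sin(2 * np.pi * 0.31 * turns + phase)
data += 0.05 * rng.standard_normal((n_turns, n_bpms))
data[:, 42] += 10.0 * rng.standard_normal(n_turns)   # one faulty, noisy BPM

_, _, vt = np.linalg.svd(data - data.mean(axis=0), full_matrices=False)
localization = np.abs(vt).max(axis=1)   # peak component of each spatial mode
mode = vt[localization.argmax()]        # most localized mode flags the fault
suspect = int(np.abs(mode).argmax())
print("suspect BPM index:", suspect)
```

    Genuine beam motion spreads its energy across all monitors, so its singular vectors are delocalized; a monitor whose signal is dominated by its own noise produces a nearly unit spatial vector, which is what the localization test picks out.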

  20. Performance characterization of image and video analysis systems at Siemens Corporate Research

    NASA Astrophysics Data System (ADS)

    Ramesh, Visvanathan; Jolly, Marie-Pierre; Greiffenhagen, Michael

    2000-06-01

    There has been a significant increase in commercial products using image analysis techniques to solve real-world problems in diverse fields such as manufacturing, medical imaging, document analysis, transportation and public security. This has been accelerated by various factors: more advanced algorithms, the availability of cheaper sensors, and faster processors. While algorithms continue to improve in performance, a major stumbling block in translating improvements in algorithms to faster deployment of image analysis systems is the lack of characterization of the limits of algorithms and how they affect total system performance. The research community has realized the need for performance analysis and there have been significant efforts in the last few years to remedy the situation. Our efforts at SCR have been on statistical modeling and characterization of modules and systems. The emphasis is on both white-box and black-box methodologies to evaluate and optimize vision systems. In the first part of this paper we review the literature on performance characterization and then provide an overview of the status of research in performance characterization of image and video understanding systems. The second part of the paper is on performance evaluation of medical image segmentation algorithms. Finally, we highlight some research issues in performance analysis in medical imaging systems.

  1. Performance bounds for modal analysis using sparse linear arrays

    NASA Astrophysics Data System (ADS)

    Li, Yuanxin; Pezeshki, Ali; Scharf, Louis L.; Chi, Yuejie

    2017-05-01

    We study the performance of modal analysis using sparse linear arrays (SLAs), such as nested and co-prime arrays, in both first-order and second-order measurement models. We treat SLAs as constructed from a subset of sensors in a dense uniform linear array (ULA), and characterize the performance loss of SLAs with respect to the ULA due to using far fewer sensors. In particular, we show that, for the same aperture, in order to achieve comparable performance in terms of the Cramér-Rao bound (CRB) for modal analysis, SLAs require more snapshots: approximately the number of snapshots used by the ULA times the compression ratio in the number of sensors. This is shown analytically for the case with one undamped mode, as well as empirically via extensive numerical experiments for more complex scenarios. Moreover, the misspecified CRB proposed by Richmond and Horowitz is also studied, under which SLAs suffer more performance loss than their ULA counterpart.
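    The claimed snapshot scaling reduces to simple arithmetic. The sketch below is only a back-of-envelope reading of that claim; the array sizes are illustrative and not taken from the paper.

```python
# To match the ULA's CRB at the same aperture, an SLA needs roughly
# (ULA snapshots) x (compression ratio in sensor count) snapshots.
def sla_snapshots_needed(ula_snapshots, n_ula_sensors, n_sla_sensors):
    compression = n_ula_sensors / n_sla_sensors
    return ula_snapshots * compression

needed = sla_snapshots_needed(ula_snapshots=200,
                              n_ula_sensors=64, n_sla_sensors=16)
print(needed)
```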

  2. PATHA: Performance Analysis Tool for HPC Applications

    DOE PAGES

    Yoo, Wucherl; Koo, Michelle; Cao, Yi; ...

    2016-02-18

    Large science projects rely on complex workflows to analyze terabytes or petabytes of data. These jobs often run over thousands of CPU cores while simultaneously performing data accesses, data movements, and computation, and it is difficult to identify bottlenecks or to debug performance issues in such large workflows. To address these challenges, we have developed the Performance Analysis Tool for HPC Applications (PATHA) using state-of-the-art open source big data processing tools. Our framework can ingest system logs to extract key performance measures, and apply sophisticated statistical tools and data mining methods on the performance data. Furthermore, it utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of PATHA, we conduct a case study on the workflows from an astronomy project known as the Palomar Transient Factory (PTF). This study processed 1.6 TB of system logs collected on the NERSC supercomputer Edison. Using PATHA, we were able to identify performance bottlenecks, which reside in three tasks of the PTF workflow that depend on the density of celestial objects.

  3. Cross-industry Performance Modeling: Toward Cooperative Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reece, Wendy Jane; Blackman, Harold Stabler

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible data bases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  4. Cross-Industry Performance Modeling: Toward Cooperative Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    H. S. Blackman; W. J. Reece

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible data bases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  5. Multiplex network analysis of employee performance and employee social relationships

    NASA Astrophysics Data System (ADS)

    Cai, Meng; Wang, Wei; Cui, Ying; Stanley, H. Eugene

    2018-01-01

    In human resource management, employee performance is strongly affected by both formal and informal employee networks. Most previous research on employee performance has focused on monolayer networks that can represent only single categories of employee social relationships. We study employee performance by taking into account the entire multiplex structure of underlying employee social networks. We collect three datasets consisting of five different employee relationship categories in three firms, and predict employee performance using degree centrality and eigenvector centrality in a superimposed multiplex network (SMN) and an unfolded multiplex network (UMN). We use a quadratic assignment procedure (QAP) analysis and a regression analysis to demonstrate that the different categories of relationship are mutually embedded and that the strength of their impact on employee performance differs. We also use weighted/unweighted SMN/UMN to measure the predictive accuracy of this approach and find that employees with high centrality in a weighted UMN are more likely to perform well. Our results shed new light on how social structures affect employee performance.
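    The superimposed multiplex network (SMN) construction amounts to summing the layer adjacency matrices before computing centralities. The two tiny layers below are invented for illustration; they are not the firms' data, and the paper's weighted/unweighted variants are not reproduced.

```python
# Degree and eigenvector centrality on a superimposed multiplex network.
import numpy as np

formal = np.array([[0, 1, 1, 0],      # formal-relationship layer (invented)
                   [1, 0, 1, 0],
                   [1, 1, 0, 1],
                   [0, 0, 1, 0]], dtype=float)
informal = np.array([[0, 0, 1, 1],    # informal-relationship layer (invented)
                     [0, 0, 0, 1],
                     [1, 0, 0, 1],
                     [1, 1, 1, 0]], dtype=float)

smn = formal + informal               # superimpose the layers
degree = smn.sum(axis=1)              # weighted degree centrality
eigvals, eigvecs = np.linalg.eigh(smn)            # symmetric, so eigh works
v = np.abs(eigvecs[:, eigvals.argmax()])          # eigenvector centrality
print("degree:", degree, "eigenvector centrality:", v / v.sum())
```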

  6. An Empirical Analysis of the Performance of Vietnamese Higher Education Institutions

    ERIC Educational Resources Information Center

    Tran, Carolyn-Dung T. T.; Villano, Renato A.

    2017-01-01

    This article provides an analysis of the academic performance of higher education institutions (HEIs) in Vietnam with 50 universities and 50 colleges in 2011/12. The two-stage semiparametric data envelopment analysis is used to estimate the efficiency of HEIs and investigate the effects of various factors on their performance. The findings reveal…

  7. Visuo-spatial performance in autism: a meta-analysis.

    PubMed

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M

    2014-12-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large heterogeneity that is unaccounted for. No clear differences were found for Mental Rotation. ASD samples showed a stronger local processing preference for Navon tasks (d = 0.35); less clear evidence for performance differences of a similar magnitude emerged. We discuss the meta-analysis results together with other findings relating to visuo-spatial processing and three cognitive theories of ASD: Weak Central Coherence, Enhanced Perceptual Functioning and Extreme Male Brain theory.
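    The effect sizes (d) aggregated in such a meta-analysis are standardized mean differences. The toy computation below uses the conventional pooled-SD form of Cohen's d on two invented groups; neither the scores nor the result come from the studies reviewed.

```python
# Cohen's d with a pooled standard deviation (invented example data).
import math
import statistics

def cohens_d(group1, group2):
    n1, n2 = len(group1), len(group2)
    s1, s2 = statistics.variance(group1), statistics.variance(group2)
    pooled = math.sqrt(((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2))
    return (statistics.mean(group1) - statistics.mean(group2)) / pooled

asd = [52, 55, 49, 58, 54, 51]        # e.g., Block Design scores (invented)
control = [50, 48, 53, 47, 51, 49]
d = cohens_d(asd, control)
print(f"d = {d:.2f}")
```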

  8. A Systemic Cause Analysis Model for Human Performance Technicians

    ERIC Educational Resources Information Center

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  9. Predicting performance with traffic analysis tools : final report.

    DOT National Transportation Integrated Search

    2008-03-01

    This document provides insights into the common pitfalls and challenges associated with use of traffic analysis tools for predicting future performance of a transportation facility. It provides five in-depth case studies that demonstrate common ways ...

  10. Idaho National Laboratory Quarterly Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Lisbeth

    2014-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 60 reportable events (23 from the 4th Qtr FY14 and 37 from the prior three reporting quarters) as well as 58 other issue reports (including not reportable events and Significant Category A and B conditions) identified at INL from July 2013 through October 2014. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  11. Approaches to Cycle Analysis and Performance Metrics

    NASA Technical Reports Server (NTRS)

    Parson, Daniel E.

    2003-01-01

    The following notes were prepared as part of an American Institute of Aeronautics and Astronautics (AIAA) sponsored short course entitled Air Breathing Pulse Detonation Engine (PDE) Technology. The course was presented in January of 2003, and again in July of 2004 at two different AIAA meetings. It was taught by seven instructors, each of whom provided information on particular areas of PDE research. These notes cover two areas. The first is titled Approaches to Cycle Analysis and Performance Metrics. Here, the various methods of cycle analysis are introduced. These range from algebraic, thermodynamic equations, to single and multi-dimensional Computational Fluid Dynamic (CFD) solutions. Also discussed are the various means by which performance is measured, and how these are applied in a device which is fundamentally unsteady. The second topic covered is titled PDE Hybrid Applications. Here the concept of coupling a PDE to a conventional turbomachinery based engine is explored. Motivation for such a configuration is provided in the form of potential thermodynamic benefits. This is accompanied by a discussion of challenges to the technology.

  12. Performance evaluation of existing building structure with pushover analysis

    NASA Astrophysics Data System (ADS)

    Handana, MAP; Karolina, R.; Steven

    2018-02-01

    In managing building infrastructure, damage commonly accumulates over a building's service life, and earthquakes are among the most frequent causes. A building is designed for a certain service life, but throughout that life it remains vulnerable to damage from various sources. Any damage should be detected as early as possible, because damage can spread, triggering and worsening further deterioration. The newest concept in earthquake engineering is Performance Based Earthquake Engineering (PBEE), which divides into Performance Based Seismic Design (PBSD) and Performance Based Seismic Evaluation (PBSE). One PBSE method is nonlinear pushover analysis, a static nonlinear analysis in which the effect of the design earthquake on the building structure is represented as static loads applied at the center of mass of each floor. These loads are increased gradually until they cause the first yielding (plastic hinge) in the structure; further load increases then produce large post-elastic deformations until the structure reaches its limit condition, with subsequent yielding (plastic hinges) forming at other locations in the structure.

  13. Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1993-01-01

    An elastic-plastic algorithm based on von Mises and associative flow criteria is implemented in MHOST, a mixed-iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating the implementation of elastic-plastic mixed-iterative analysis is appropriate.

  14. Inertial Sensor Technology for Elite Swimming Performance Analysis: A Systematic Review

    PubMed Central

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Quinlan, Leo R; ÓLaighin, Gearóid

    2015-01-01

    Technical evaluation of swimming performance is an essential factor of elite athletic preparation. Novel methods of analysis, incorporating body worn inertial sensors (i.e., Microelectromechanical systems, or MEMS, accelerometers and gyroscopes), have received much attention recently from both research and commercial communities as an alternative to video-based approaches. This technology may allow for improved analysis of stroke mechanics, race performance and energy expenditure, as well as real-time feedback to the coach, potentially enabling more efficient, competitive and quantitative coaching. The aim of this paper is to provide a systematic review of the literature related to the use of inertial sensors for the technical analysis of swimming performance. This paper focuses on providing an evaluation of the accuracy of different feature detection algorithms described in the literature for the analysis of different phases of swimming, specifically starts, turns and free-swimming. The consequences associated with different sensor attachment locations are also considered for both single and multiple sensor configurations. Additional information such as this should help practitioners to select the most appropriate systems and methods for extracting the key performance related parameters that are important to them for analysing their swimmers’ performance and may serve to inform both applied and research practices. PMID:26712760

  15. Modeling and performance analysis of QoS data

    NASA Astrophysics Data System (ADS)

    Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.

    2016-09-01

    The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service. Models are designed and tested that take into account a multiservice network architecture, i.e. one supporting the transmission of data belonging to different traffic classes. The traffic-shaping mechanisms studied are based on Priority Queuing, with both an integrated data source and various sources of generated data. The basic problems of QoS-supporting architectures and queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built, and simulation tools were used to verify the traffic-shaping mechanisms under the applied queuing algorithms. It is shown that temporal Petri net models can be used effectively in modeling and analyzing the performance of computer networks.
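
    A minimal discrete-event sketch of the Priority Queuing discipline the abstract studies: a hand-rolled single-server simulation with invented arrival and service parameters, not the paper's Petri-net models. It shows the qualitative effect priority shaping has on per-class waiting time:

```python
import heapq
import random

random.seed(1)

# 200 jobs: (arrival time, traffic class, service time); class 0 is the
# high-priority class. All distributions are invented for illustration.
jobs = [(random.uniform(0, 100), random.choice([0, 1]), random.uniform(0.5, 1.5))
        for _ in range(200)]
jobs.sort()

# Non-preemptive priority queue: among waiting jobs, always serve the
# lowest class number first; FIFO within a class (via the seq counter).
waiting, clock, seq = [], 0.0, 0
waits = {0: [], 1: []}
pending = iter(jobs)
nxt = next(pending, None)
while nxt or waiting:
    if not waiting and nxt and clock < nxt[0]:
        clock = nxt[0]                        # server idles until next arrival
    while nxt and nxt[0] <= clock:            # enqueue everything arrived so far
        heapq.heappush(waiting, (nxt[1], seq, nxt[0], nxt[2]))
        seq += 1
        nxt = next(pending, None)
    if waiting:
        cls, _, arrival, service = heapq.heappop(waiting)
        waits[cls].append(clock - arrival)    # time spent queued
        clock += service

mean_wait = {c: sum(w) / len(w) for c, w in waits.items()}
```

    Under this overloaded arrival pattern the high-priority class sees a far lower mean wait, which is exactly the class-differentiation behavior the paper's QoS models formalize.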

  16. Relative performance of academic departments using DEA with sensitivity analysis.

    PubMed

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P

    2009-05-01

    The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper attempts to evaluate the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK, and Australia; to the best of our knowledge, however, this is its first application in the Indian context. Applying DEA models, we calculate technical, pure technical and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance and teaching performance are assessed separately using sensitivity analysis.
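
    The DEA efficiency scores described above come from solving one small linear program per department. A sketch of the standard input-oriented CCR model, assuming SciPy is available and using invented inputs/outputs rather than the IIT Roorkee data:

```python
import numpy as np
from scipy.optimize import linprog

# X[i, j]: input i consumed by department j; Y[r, j]: output r produced.
# Three hypothetical departments, illustrative numbers only.
X = np.array([[2.0, 4.0, 3.0],    # e.g. faculty count
              [3.0, 1.0, 5.0]])   # e.g. budget
Y = np.array([[1.0, 1.0, 2.0]])   # e.g. publications

def ccr_efficiency(o):
    """Input-oriented CCR: min theta s.t. X@lam <= theta*X[:,o], Y@lam >= Y[:,o]."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lam_1 .. lam_n]
    c = np.concatenate(([1.0], np.zeros(n)))
    A_ub = np.block([[-X[:, [o]], X],           # X lam - theta x_o <= 0
                     [np.zeros((s, 1)), -Y]])   # -Y lam <= -y_o
    b_ub = np.concatenate((np.zeros(m), -Y[:, o]))
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.fun

scores = [ccr_efficiency(j) for j in range(X.shape[1])]
```

    Departments with a score of 1 lie on the efficient frontier; a score below 1 gives the proportional input reduction needed to reach it, and the optimal lambdas identify the reference set the abstract mentions.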

  17. The Current State of Human Performance Technology: A Citation Network Analysis of "Performance Improvement Quarterly," 1988-2010

    ERIC Educational Resources Information Center

    Cho, Yonjoo; Jo, Sung Jun; Park, Sunyoung; Kang, Ingu; Chen, Zengguan

    2011-01-01

    This study conducted a citation network analysis (CNA) of human performance technology (HPT) to examine its current state of the field. Previous reviews of the field have used traditional research methods, such as content analysis, survey, Delphi, and citation analysis. The distinctive features of CNA come from using a social network analysis…

  18. Performance analysis of CCSDS path service

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1989-01-01

    A communications service, called Path Service, is currently being developed by the Consultative Committee for Space Data Systems (CCSDS) to provide a mechanism for the efficient transmission of telemetry data from space to ground for complex space missions of the future. This is an important service, due to the large volumes of telemetry data that will be generated during these missions. A preliminary analysis of performance of Path Service is presented with respect to protocol-processing requirements and channel utilization.

  19. Methodology issues concerning the accuracy of kinematic data collection and analysis using the ariel performance analysis system

    NASA Technical Reports Server (NTRS)

    Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)

    1992-01-01

    Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body kinematics as well as mechanical system kinematics using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g. force plate data collection) as well as digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to evaluate the accuracy impact due to a single axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.

  20. Study of Solid State Drives performance in PROOF distributed analysis system

    NASA Astrophysics Data System (ADS)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

    Solid State Drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited for situations where multiple jobs concurrently access data located on the same drive. They also have lower energy consumption and higher vibration tolerance than Hard Disk Drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility - PROOF is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in the PROOF environment. We will compare the performance of HDDs with SSDs in I/O intensive analysis scenarios. In particular we will discuss PROOF system performance scaling with the number of simultaneously running analysis jobs.

  1. Performance analysis: a study using data envelopment analysis in 26 Brazilian hospitals.

    PubMed

    Guerra, Mariana; de Souza, Antônio Artur; Moreira, Douglas Rafael

    2012-01-01

    This article describes a proposal for analyzing the performance of public Brazilian hospitals using financial and non-financial (i.e., operational) rates, and thereby highlights the effectiveness (or otherwise) of the financial management of the organizations in this study. A total of 72 hospitals in the Brazilian Unified Health Care System (in Portuguese, Sistema Unico de Saúde-SUS) were selected for the accessibility and completeness of their data. Twenty-six organizations formed the study sample, consisting of entities that had publicly disclosed financial statements for the period from 2008 (in particular, via the Internet) and whose operational data could be found in the SUS database. Our proposal, based on models using the method of Data Envelopment Analysis (DEA), was the construction of six initial models that were later compiled into a standard model. The relations between the rates that comprised the models were based on the variables and the notes of: Schuhmann; McCue and Nayar; Barnum and Kutzin; Younis, Younies, and Okojie; Marinho, Moreno, and Cavalini; and Ersoy, Kavuncubasi, Ozcan, and Harris II. We put forward an enhanced proposal applicable to Brazil aiming to (i) confirm or refute the rates that show the effectiveness or ineffectiveness of financial management of national hospitals; and (ii) determine the best performances, which could be used as a reference for future studies. The results obtained were: (i) of all financial indicators considered, only one showed no significance in all models; and (ii) for operational indicators, the results were not relevant when the number of occupied beds was considered. Though the analysis was related only to services provided by SUS, we conclude that our study has great potential for analyzing the financial management performance of Brazilian hospitals in general, for the following reasons: (i) it shows the relationship of financial and operational rates that can be used to analyze the performance of

  2. Using Importance-Performance Analysis to Guide Instructional Design of Experiential Learning Activities

    ERIC Educational Resources Information Center

    Anderson, Sheri; Hsu, Yu-Chang; Kinney, Judy

    2016-01-01

    Designing experiential learning activities requires an instructor to think about what they want the students to learn. Using importance-performance analysis can assist with the instructional design of the activities. This exploratory study used importance-performance analysis in an online introduction to criminology course. There is limited…

  3. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: The OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads and the Paraver graphical user interface for inspection and analyses of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory

  4. Analysis of Photovoltaic System Energy Performance Evaluation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, S.; Newmiller, J.; Kimber, A.

    2013-11-01

    Documentation of the energy yield of a large photovoltaic (PV) system over a substantial period can be useful to measure a performance guarantee, as an assessment of the health of the system, for verification of a performance model to then be applied to a new system, or for a variety of other purposes. Although the measurement of this performance metric might appear to be straightforward, there are a number of subtleties associated with variations in weather and imperfect data collection that complicate the determination and data analysis. A performance assessment is most valuable when it is completed with a very low uncertainty and when the subtleties are systematically addressed, yet currently no standard exists to guide this process. This report summarizes a draft methodology for an Energy Performance Evaluation Method, the philosophy behind the draft method, and the lessons that were learned by implementing the method.
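
    As a toy illustration of the kind of metric such an evaluation formalizes, the widely used performance ratio compares metered energy against what the nameplate rating would produce over the measured equivalent full-sun hours. The numbers are invented; this is not the report's draft method:

```python
# Invented monitoring-period numbers for a hypothetical array.
rated_dc_kw = 500.0                 # nameplate array rating, kW
measured_kwh = 61_000.0             # energy metered over the period
plane_of_array_kwh_m2 = 150.0       # measured insolation over the period
STC_IRRADIANCE = 1.0                # kW/m^2 at Standard Test Conditions

# Reference yield: hours of equivalent full sun seen by the array.
reference_yield_h = plane_of_array_kwh_m2 / STC_IRRADIANCE

# Performance ratio: actual energy vs. nameplate energy over those hours.
expected_kwh = rated_dc_kw * reference_yield_h
performance_ratio = measured_kwh / expected_kwh
```

    Because the reference yield is built from measured insolation, the ratio is largely weather-corrected, which is one reason a formal method still has to handle the data-collection subtleties the report describes.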

  5. Artificial Intelligence: An Analysis of Potential Applications to Training, Performance Measurement, and Job Performance Aiding.

    DTIC Science & Technology

    1983-09-01

    OCR fragment of the DTIC record cover sheet: "Artificial Intelligence: An Analysis of Potential Applications to Training..." (AD-A133 592), interim report AFHRL-TP-83-28, Denver Research Institute, J. Richardson, September 1983. Indexed keywords: artificial intelligence, military research, computer-aided diagnosis, performance tests, computer

  6. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatele, Abhinav

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  7. The Status of Spacecraft Bus and Platform Technology Development Under the NASA ISPT Program

    NASA Technical Reports Server (NTRS)

    Anderson, David; Munk, Michelle M.; Pencil, Eric; Dankanich, John; Glaab, Louis; Peterson, Todd

    2014-01-01

    The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in three areas that include Propulsion System Technologies, Entry Vehicle Technologies, and Systems Mission Analysis. ISPTs propulsion technologies include: 1) NASAs Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; 2) a Hall-effect electric propulsion (HEP) system for sample return and low cost missions; 3) the Advanced Xenon Flow Control System (AXFS); ultra-lightweight propellant tank technologies (ULTT); and propulsion technologies for a Mars Ascent Vehicle (MAV). The AXFS and ULTT are two component technologies being developed with nearer-term flight infusion in mind, whereas NEXT and the HEP are being developed as EP systems. ISPTs entry vehicle technologies are: 1) Aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GNC) models of blunt-body rigid aeroshells; and aerothermal effect models; and 2) Multi-mission technologies for Earth Entry Vehicles (MMEEV) for sample return missions. The Systems Mission Analysis area is focused on developing tools and assessing the application of propulsion, entry vehicle, and spacecraft bus technologies to a wide variety of mission concepts. Several of the ISPT technologies are related to sample return missions and other spacecraft bus technology needs like: MAV propulsion, MMEEV, and electric propulsion. These technologies, as well as Aerocapture, are more vehicle and mission-focused, and present a different set of technology development challenges. These in-space propulsion technologies are applicable, and potentially enabling for future NASA Discovery, New Frontiers, Flagship and sample return missions currently under consideration. 
This paper provides

  8. Accounting for trip frequency in importance-performance analysis

    Treesearch

    Joshua K. Gill; J.M. Bowker; John C. Bergstrom; Stanley J. Zarnoch

    2010-01-01

    Understanding customer satisfaction is critical to the successful operation of both privately and publicly managed recreation venues. A popular tool for assessing recreation visitor satisfaction is Importance- Performance Analysis (IPA). IPA provides resource managers, government officials, and private businesses with easy-to-understand and -use information about...

  9. Performance analysis of a generalized upset detection procedure

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Masson, Gerald M.

    1987-01-01

    A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a given length data block to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.
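
    The three performance measures can also be estimated empirically for a concrete detector. A Monte Carlo sketch with an invented threshold test on per-block anomaly counts (illustrative only, not the paper's rigorous analytical characterization):

```python
import random

random.seed(7)

# Capture fixed-length data blocks and flag a block as "upset" when its
# anomaly count crosses a threshold. All parameters are invented.
BLOCK_LEN, THRESHOLD, TRIALS = 64, 4, 2000

def capture_block(upset):
    # Upset-free blocks show rare spurious anomalies; upsets show bursts.
    p = 0.30 if upset else 0.01
    return sum(random.random() < p for _ in range(BLOCK_LEN))

detections, false_alarms, latencies = 0, 0, []
for _ in range(TRIALS):
    # In each trial, an upset begins at a random block within a 10-block window.
    onset = random.randrange(10)
    for i in range(10):
        flagged = capture_block(upset=(i >= onset)) >= THRESHOLD
        if flagged and i < onset:
            false_alarms += 1            # flagged before the upset existed
        if flagged and i >= onset:
            detections += 1
            latencies.append(i - onset)  # blocks elapsed before detection
            break

p_detect = detections / TRIALS                 # measure (1)
mean_false_alarms = false_alarms / TRIALS      # measure (2)
mean_latency = sum(latencies) / len(latencies) # measure (3)
```

    With these invented rates the detector catches essentially every upset in the first affected block, so the estimated latency is near zero while false alarms stay rare; tightening the threshold trades the two off, which is the design space the paper analyzes.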

  10. Estimating Driving Performance Based on EEG Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Wu, Ruei-Cheng; Jung, Tzyy-Ping; Liang, Sheng-Fu; Huang, Teng-Yi

    2005-12-01

    The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by the driver's drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents caused by drowsiness is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectrum, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
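
    The core of such a pipeline, log subband power features feeding a linear model, can be sketched on synthetic signals. The toy EEG generator and band edges below are invented and stand in for the paper's data and exact method:

```python
import numpy as np

rng = np.random.default_rng(0)

FS, SECONDS, N_TRIALS = 128, 2, 300          # sampling rate (Hz), trial length
t = np.arange(FS * SECONDS) / FS
drowsiness = rng.uniform(0, 1, N_TRIALS)     # hidden "ground truth" levels

def eeg_trial(level):
    # Toy signal: theta (4-8 Hz) grows and beta (13-30 Hz) shrinks with
    # drowsiness, plus broadband noise. Invented model, not real EEG.
    theta = (0.5 + level) * np.sin(2 * np.pi * 6 * t)
    beta = (1.5 - level) * np.sin(2 * np.pi * 20 * t)
    return theta + beta + 0.3 * rng.standard_normal(t.size)

def log_band_power(x, lo, hi):
    # Log subband power from the FFT power spectrum.
    freqs = np.fft.rfftfreq(x.size, 1 / FS)
    psd = np.abs(np.fft.rfft(x)) ** 2
    return np.log(psd[(freqs >= lo) & (freqs < hi)].sum())

def features_of(level):
    x = eeg_trial(level)
    return [log_band_power(x, 4, 8), log_band_power(x, 13, 30)]

features = np.array([features_of(d) for d in drowsiness])

# Least-squares linear model from log band powers to drowsiness level.
A = np.column_stack([features, np.ones(N_TRIALS)])
coef, *_ = np.linalg.lstsq(A, drowsiness, rcond=None)
r = np.corrcoef(A @ coef, drowsiness)[0, 1]
```

    On this synthetic data the fitted model correlates strongly with the hidden drowsiness level; the paper's system adds correlation analysis and PCA stages between the band-power features and the regression.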

  11. Phobos/Deimos sample return via solar sail.

    PubMed

    Matloff, Gregory L; Taylor, Travis; Powell, Conley; Moton, Tryshanda

    2005-12-01

    A sample-return mission to the Martian satellites using a contemporary solar sail for all post-Earth-escape propulsion is proposed. The 0.015 kg/m(2) areal mass-thickness sail unfurls after launch and injection onto a Mars-bound Hohmann-transfer ellipse. Structure and payload increase spacecraft areal mass thickness to 0.028 kg/m(2). During the Mars encounter, the sail functions as a parachute in the outer atmosphere of Mars to accomplish aerocapture. On-board thrusters or the sail maneuver the spacecraft into an orbit with periapsis near Mars and apoapsis near Phobos. The orbit is circularized for Phobos-rendezvous; surface samples are collected. The sail then raises the orbit for Deimos-rendezvous and sample collection. The sail next places the spacecraft on an Earth-bound Hohmann-transfer ellipse. During Earth encounter, the sail accomplishes Earth-aerocapture or partially decelerates the sample container for entry into the Earth's atmosphere. Mission mass budget is about 218 grams and mission duration is less than five years.

  12. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    DOEpatents

    Vetter, Jeffrey S.

    2005-02-01

    The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
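
    A decision stump (a depth-one decision tree) trained on microbenchmark-style timing features illustrates the classification idea; the feature, labels, and numbers below are hypothetical, not the patent's actual training data or tree:

```python
# Each event: (measured_duration_us, expected_duration_us), where the
# expectation would come from a microbenchmark for that message size.
training = [
    # "efficient" events: duration close to the microbenchmark expectation
    ((105, 100), "efficient"), ((52, 50), "efficient"),
    ((210, 200), "efficient"), ((98, 100), "efficient"),
    # "inefficient" events: e.g. a late sender/receiver inflates duration
    ((400, 100), "inefficient"), ((180, 50), "inefficient"),
    ((900, 200), "inefficient"), ((350, 100), "inefficient"),
]

def slowdown(event):
    measured, expected = event
    return measured / expected

def train_stump(samples):
    # Choose the slowdown threshold that best separates the two labels.
    best = None
    for event, _ in samples:
        thr = slowdown(event)
        correct = sum((slowdown(e) > thr) == (lbl == "inefficient")
                      for e, lbl in samples)
        if best is None or correct > best[1]:
            best = (thr, correct)
    return best[0]

threshold = train_stump(training)

def classify(event):
    return "inefficient" if slowdown(event) > threshold else "efficient"
```

    Because the threshold is learned from the (hypothetical) microbenchmark samples rather than hard-coded, the classifier adapts to the target system in miniature, which is the property the patent attributes to its microbenchmark-trained decision tree.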

  13. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  14. Stability, performance and sensitivity analysis of I.I.D. jump linear systems

    NASA Astrophysics Data System (ADS)

    Chávez Fuentes, Jorge R.; González, Oscar R.; Gray, W. Steven

    2018-06-01

    This paper presents a symmetric Kronecker product analysis of independent and identically distributed jump linear systems, developing new, lower dimensional equations for the stability and performance analysis of this type of system than are currently available. In addition, new closed form expressions characterising multi-parameter relative sensitivity functions for performance metrics are introduced. The analysis technique is illustrated with a distributed fault-tolerant flight control example in which the communication links are allowed to fail randomly.
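
    For i.i.d. jump linear systems, the standard Kronecker-product test checks mean-square stability through the spectral radius of the probability-weighted sum of self-Kronecker products. A sketch with invented mode matrices (not the paper's reduced symmetric-Kronecker formulation) in which the degraded mode alone is unstable, yet the random mixture is mean-square stable:

```python
import numpy as np

# x_{k+1} = A_{s_k} x_k, with modes s_k drawn i.i.d. with probabilities p.
A1 = np.array([[0.5, 0.2],
               [0.0, 0.4]])     # nominal mode (illustrative numbers)
A2 = np.array([[1.1, 0.0],
               [0.3, 0.9]])     # degraded mode, e.g. a failed link
p = [0.8, 0.2]

# Mean-square stability holds iff the spectral radius of
# sum_i p_i (A_i kron A_i) is strictly less than one.
M = p[0] * np.kron(A1, A1) + p[1] * np.kron(A2, A2)
rho = max(abs(np.linalg.eigvals(M)))
mean_square_stable = rho < 1.0
```

    The plain Kronecker form used here works on n^2-dimensional matrices; the symmetric Kronecker product the paper develops exploits the symmetry of second-moment matrices to cut that dimension down.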

  15. Job Analysis, Job Descriptions, and Performance Appraisal Systems.

    ERIC Educational Resources Information Center

    Sims, Johnnie M.; Foxley, Cecelia H.

    1980-01-01

    Job analysis, job descriptions, and performance appraisal can benefit student services administration in many ways. Involving staff members in the development and implementation of these techniques can increase commitment to and understanding of the overall objectives of the office, as well as communication and cooperation among colleagues.…

  16. Performance analysis and dynamic modeling of a single-spool turbojet engine

    NASA Astrophysics Data System (ADS)

    Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin

    2017-01-01

    The purposes of modeling and simulation of a turbojet engine are steady state analysis and transient analysis. The steady state analysis investigates the operating (equilibrium) regimes and is based on appropriate modeling of turbojet operation at design and off-design regimes; it yields the performance analysis, concluded by the engine's operational maps (i.e. the altitude map, velocity map and speed map) and the engine's universal map. The mathematical model that allows the calculation of the design and off-design performances of a single-spool turbojet is detailed. An in-house code was developed and calibrated with the J85 turbojet engine as the test case. The dynamic model of the turbojet engine is obtained from the energy balance equations for the compressor, combustor and turbine, the engine's main parts. The transient analysis, based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results provided by the performance analysis. In the case of a single-spool turbojet engine with fixed nozzle geometry, thrust is controlled by a single parameter, the fuel flow rate. The design and management of the aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight path cycle analysis and optimizations. This paper presents numerical simulations for a single-spool turbojet engine (J85 as test case), with appropriate modeling for steady-state and dynamic analysis.
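
    The flavor of one on-design steady-state point can be sketched with the textbook ideal-turbojet cycle relations; the flight condition and cycle numbers below are generic illustrations, not calibrated J85 values:

```python
import math

GAMMA, R = 1.4, 287.0               # air: ratio of specific heats, gas constant
T0, M0 = 216.7, 0.8                 # ambient temperature (K), flight Mach
PI_C, TT4 = 10.0, 1400.0            # compressor pressure ratio, turbine inlet T (K)

tau_r = 1 + (GAMMA - 1) / 2 * M0 ** 2          # ram temperature ratio
tau_c = PI_C ** ((GAMMA - 1) / GAMMA)          # compressor temperature ratio
tau_l = TT4 / T0                               # cycle temperature ratio
# Work balance: the turbine's temperature drop pays for the compressor.
tau_t = 1 - tau_r * (tau_c - 1) / tau_l

a0 = math.sqrt(GAMMA * R * T0)                 # ambient speed of sound
# Ideal exhaust velocity ratio, fully expanded to ambient pressure.
v9_a0 = math.sqrt(2 / (GAMMA - 1) * tau_l / (tau_r * tau_c)
                  * (tau_r * tau_c * tau_t - 1))
specific_thrust = a0 * (v9_a0 - M0)            # N per (kg/s) of airflow
```

    Sweeping M0, altitude (through T0) and spool speed (through PI_C) over such a point calculation is what builds the velocity, altitude and speed maps the abstract describes; the real model adds component efficiencies and off-design matching.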

  17. An Atmospheric Guidance Algorithm Testbed for the Mars Surveyor Program 2001 Orbiter and Lander

    NASA Technical Reports Server (NTRS)

    Striepe, Scott A.; Queen, Eric M.; Powell, Richard W.; Braun, Robert D.; Cheatwood, F. McNeil; Aguirre, John T.; Sachi, Laura A.; Lyons, Daniel T.

    1998-01-01

    An Atmospheric Flight Team was formed by the Mars Surveyor Program '01 mission office to develop aerocapture and precision landing testbed simulations and candidate guidance algorithms. Three- and six-degree-of-freedom Mars atmospheric flight simulations have been developed for testing, evaluation, and analysis of candidate guidance algorithms for the Mars Surveyor Program 2001 Orbiter and Lander. These simulations are built around the Program to Optimize Simulated Trajectories. Subroutines were supplied by Atmospheric Flight Team members for modeling the Mars atmosphere, spacecraft control system, aeroshell aerodynamic characteristics, and other Mars 2001 mission specific models. This paper describes these models and their perturbations applied during Monte Carlo analyses to develop, test, and characterize candidate guidance algorithms.
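    The Monte Carlo characterization described above can be illustrated with a toy dispersion study: perturb the delivered entry flight-path angle and atmospheric density, then count how often the entry stays inside an assumed capture corridor. The corridor bounds and sigma values are invented for illustration, not Mars '01 values, and no real trajectory is propagated.

```python
import random

# Toy Monte Carlo dispersion study: perturb entry flight-path angle and
# atmospheric density, count cases inside a hypothetical capture corridor.
random.seed(1)

def corridor_bounds(density_mult):
    # Hypothetical: a thinner atmosphere shifts the usable corridor slightly.
    center = -12.0 + 0.5 * (1.0 - density_mult)   # deg
    half_width = 0.6                              # deg
    return center - half_width, center + half_width

def run_monte_carlo(n=10000, gamma_sigma=0.25):
    successes = 0
    for _ in range(n):
        gamma = random.gauss(-12.0, gamma_sigma)   # delivered entry angle, deg
        density_mult = random.gauss(1.0, 0.05)     # density scale factor
        lo, hi = corridor_bounds(density_mult)
        if lo <= gamma <= hi:
            successes += 1
    return successes / n

success_rate = run_monte_carlo()
```

    A real testbed replaces the corridor check with a full 3- or 6-DOF trajectory integration per sample; the statistical bookkeeping is the same.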

  18. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-01-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.

  19. Efficacy of Ginseng Supplements on Fatigue and Physical Performance: a Meta-analysis

    PubMed Central

    2016-01-01

    We conducted a meta-analysis to investigate the efficacy of ginseng supplements on fatigue reduction and physical performance enhancement as reported by randomized controlled trials (RCTs). RCTs that investigated the efficacy of ginseng supplements on fatigue reduction and physical performance enhancement compared with placebos were included. The main outcome measures were fatigue reduction and physical performance enhancement. Out of 155 articles meeting initial criteria, 12 RCTs involving 630 participants (311 participants in the intervention group and 319 participants in the placebo group) were included in the final analysis. In the fixed-effect meta-analysis of four RCTs, there was a statistically significant efficacy of ginseng supplements on fatigue reduction (standardized mean difference, SMD = 0.34; 95% confidence interval [CI] = 0.16 to 0.52). However, ginseng supplements were not associated with physical performance enhancement in the fixed-effect meta-analysis of eight RCTs (SMD = −0.01; 95% CI = −0.29 to 0.27). We found that there was insufficient clinical evidence to support the use of ginseng supplements for reducing fatigue and enhancing physical performance, because only a few RCTs with small sample sizes have been published so far. Further, larger RCTs are required to confirm the efficacy of ginseng supplements on fatigue reduction. PMID:27822924
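    The fixed-effect pooling behind SMD figures like those quoted above is inverse-variance weighting. The (SMD, standard error) pairs below are made-up illustrative values, not the trial data from this meta-analysis.

```python
import math

# Fixed-effect (inverse-variance) pooling of standardized mean differences.
def pool_fixed_effect(effects):
    """effects: list of (smd, standard_error) tuples."""
    weights = [1.0 / se ** 2 for _, se in effects]
    pooled = sum(w * smd for (smd, _), w in zip(effects, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# Hypothetical per-trial SMDs and standard errors:
trials = [(0.40, 0.15), (0.25, 0.20), (0.35, 0.12), (0.30, 0.18)]
pooled, (lo, hi) = pool_fixed_effect(trials)
```

    Each trial's weight is 1/SE²; a CI that excludes zero, as in the fatigue result above, indicates a statistically significant pooled effect.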

  20. Materials Needs for Future In-Space Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Johnson, Les

    2006-01-01

    NASA's In-Space Propulsion Technology Project is developing the next generation of in-space propulsion systems in support of robotic exploration missions throughout the solar system. The propulsion technologies being developed are non-traditional and have stressing materials performance requirements. Earth-storable bipropellant performance is constrained by temperature limitations of the columbium used in the chamber. Iridium/rhenium (Ir/Re) is now available and has been implemented in initial versions of Earth-storable rockets with specific impulses about 10 seconds higher than columbium rocket chambers. New chamber fabrication methods that improve process and performance of Ir/Re and other promising material systems are needed. The solar sail is a propellantless propulsion system that gains momentum by reflecting sunlight. The sails need to be very large in area (from 10000 sq m up to 62500 sq m) yet be very lightweight in order to achieve adequate accelerations for realistic mission times. Lightweight materials that can be manufactured in thicknesses of less than 1 micron and that are not harmed by the space environment are desired. Blunt Body Aerocapture uses aerodynamic drag to slow an approaching spacecraft and insert it into a science orbit around any planet or moon with an atmosphere. The spacecraft is enclosed by a rigid aeroshell that protects it from the entry heating and aerodynamic environment. Lightweight, high-temperature structural systems, adhesives, insulators, and ablatives are key components for improving aeroshell efficiencies at heating rates of 1000-2000 W/sq cm and beyond. Inflatable decelerators in the forms of ballutes and inflatable aeroshells will use flexible polymeric thin film materials, high temperature fabrics, and structural adhesives. The inflatable systems will be tightly packaged during cruise and will be inflated prior to entry interface at the destination. Materials must maintain strength and flexibility while packaged at

  1. A multifaceted independent performance analysis of facial subspace recognition algorithms.

    PubMed

    Bajwa, Usama Ijaz; Taj, Imtiaz Ahmad; Anwar, Muhammad Waqas; Wang, Xuan

    2013-01-01

    Face recognition has emerged as the fastest growing biometric technology and has expanded considerably in recent years. Many new algorithms and commercial systems have been proposed and developed. Most of them use Principal Component Analysis (PCA) as a base for their techniques. Different and even conflicting results have been reported by researchers comparing these algorithms. The purpose of this study is to provide an independent comparative analysis, considering both performance and computational complexity, of six appearance-based face recognition algorithms, namely PCA, 2DPCA, A2DPCA, (2D)(2)PCA, LPP and 2DLPP, under equal working conditions. This study was motivated by the lack of unbiased comprehensive comparative analysis of some recent subspace methods with diverse distance metric combinations. For comparison with other studies, the FERET, ORL and YALE databases have been used, with evaluation criteria matching those of the FERET evaluations, which closely simulate real-life scenarios. A comparison of results with previous studies is performed and anomalies are reported. An important contribution of this study is that it presents the suitable performance conditions for each of the algorithms under consideration.

  2. An assessment of SBS modified asphalt concrete pavements performance features performing numerical analysis

    NASA Astrophysics Data System (ADS)

    Karakas, Ahmet Sertac; Bozkurt, Tarik Serhat; Sayin, Baris; Ortes, Faruk

    2017-07-01

    In road passenger and freight traffic, asphalt concrete pavement prepared from hot mix asphalt (HMA), which carries the largest share of this traffic, is one of the most preferred types of flexible superstructure. During the service life of the road, the pavement must deliver the performance expected of it. HMA must have a high-performance mix design and be comfortable, safe, and resistant to degradation. In addition, because raw material supplies are limited, the use of various additive materials becomes critical if roads are to serve long term against environmental conditions such as traffic and climate. Styrene-butadiene-styrene (SBS) polymers are widely used among these additives. In this study, numerical analysis of asphalt concrete pavements designed with SBS-modified HMA and prepared with different thicknesses is performed. Stress and deformation values of the three pavement models are then compared and evaluated.

  3. Mission analysis and performance specification studies report, appendix A

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The Near Term Hybrid Passenger Vehicle Development Program tasks included defining missions, developing distributions of daily travel and composite driving cycles for these missions, providing information necessary to estimate the potential replacement of the existing fleet by hybrids, and estimating acceleration/gradeability performance requirements for safe operation. The data was then utilized to develop mission specifications, define reference vehicles, develop hybrid vehicle performance specifications, and make fuel consumption estimates for the vehicles. The major assumptions which underlie the approach taken to the mission analysis and development of performance specifications are the following: the daily operating range of a hybrid vehicle should not be limited by the stored energy capacity and the performance of such a vehicle should not be strongly dependent on the battery state of charge.

  4. Products from NASA's In-Space Propulsion Technology Program Applicable to Low-Cost Planetary Missions

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Pencil, Eric; Vento, Daniel; Peterson, Todd; Dankanich, John; Hahne, David; Munk, Michelle M.

    2011-01-01

    Since September 2001 NASA's In-Space Propulsion Technology (ISPT) program has been developing technologies for lowering the cost of planetary science missions. Recently completed is the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. Two other cost saving technologies nearing completion are the NEXT ion thruster and the Aerocapture technology project. Also under development are several technologies for low cost sample return missions. These include a low cost Hall effect thruster (HIVHAC) which will be completed in 2011, lightweight propellant tanks, and a Multi-Mission Earth Entry Vehicle (MMEEV). This paper will discuss the status of the technology development, the cost savings or performance benefits, and applicability of these in-space propulsion technologies to NASA's future Discovery and New Frontiers missions, as well as their relevance for sample return missions.

  5. Diversity Performance Analysis on Multiple HAP Networks.

    PubMed

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-06-30

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques.
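    The diversity gain discussed above can be illustrated by simulation. The sketch below uses BPSK with maximal-ratio combining over independent Rayleigh branches; the paper treats a shadowed Rician channel analytically, so Rayleigh fading is used here only to keep the sketch short and the numbers are not comparable to the paper's ASER results.

```python
import math
import random

# Monte Carlo symbol-error rate for BPSK with maximal-ratio combining (MRC)
# over independent unit-power Rayleigh fading branches (illustrative only).
random.seed(7)

def ser_bpsk_mrc(branches, snr_db, n_bits=20000):
    snr = 10 ** (snr_db / 10)
    errors = 0
    for _ in range(n_bits):
        # MRC sums the per-branch instantaneous SNRs; |h|^2 is the sum of
        # two squared N(0, 1/2) Gaussians (unit average power).
        gamma = sum(
            snr * (random.gauss(0, math.sqrt(0.5)) ** 2
                   + random.gauss(0, math.sqrt(0.5)) ** 2)
            for _ in range(branches)
        )
        # Conditional BPSK error probability Q(sqrt(2*gamma)) = erfc(sqrt(gamma))/2.
        p_err = 0.5 * math.erfc(math.sqrt(gamma))
        if random.random() < p_err:
            errors += 1
    return errors / n_bits

ser1 = ser_bpsk_mrc(branches=1, snr_db=10)
ser4 = ser_bpsk_mrc(branches=4, snr_db=10)
```

    Adding branches steepens the error-rate slope, which is the diversity effect the multiple-HAP V-MIMO configuration exploits.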

  6. Diversity Performance Analysis on Multiple HAP Networks

    PubMed Central

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102

  7. Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser

    NASA Astrophysics Data System (ADS)

    Adib, M. A. H. M.; Adnan, F.; Ismail, A. R.; Kardigama, K.; Salaam, H. A.; Ahmad, Z.; Johari, N. H.; Anuar, Z.; Azmi, N. S. N.

    2012-09-01

    Application of a thermal energy storage (TES) system reduces cost and energy consumption, and the performance of the overall operation is affected by diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter ratios (HD), diffuser openings, and numbers of diffuser holes are investigated. Simulations of medium-HD tanks with a double-ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline thickness at 50% charging time occurs in the medium tank with a height-to-diameter ratio of 4.0 and a double-ring octagonal diffuser with 48 holes at the 9 mm (~60%) opening, compared with the 6 mm (~40%) and 12 mm (~80%) openings. The conclusion is that computational analysis is a very useful method for studying the performance of thermal energy storage (TES).

  8. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Koo, Michelle; Cao, Yu

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance of such complex workflows, running over a large number of nodes with multiple parallel task executions, when the workflow data or execution measurements amount to terabytes or petabytes. To help identify performance bottlenecks or debug performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from a genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and big data workflows.

  9. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
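    A minimal version of the discrete event simulation approach described above: service requests arrive at a single shared resource, queue, and are served in order. Rates are illustrative; a real cloud model would add resource scheduling and multiple request classes drawn from per-type demand distributions.

```python
import random

# Minimal discrete-event sketch: an M/M/1 queue simulated job-by-job.
random.seed(42)

def mm1_mean_wait(arrival_rate, service_rate, n_jobs=50000):
    """Mean time-in-queue for an M/M/1 queue, estimated by simulation."""
    clock = 0.0            # arrival time of the current job
    server_free_at = 0.0   # time the server finishes its current job
    total_wait = 0.0
    for _ in range(n_jobs):
        clock += random.expovariate(arrival_rate)
        start = max(clock, server_free_at)
        total_wait += start - clock
        server_free_at = start + random.expovariate(service_rate)
    return total_wait / n_jobs

# Queueing theory gives Wq = lambda / (mu * (mu - lambda)) = 0.1 here.
wq = mm1_mean_wait(arrival_rate=5.0, service_rate=10.0)
```

    Matching the simulated mean wait against the closed-form M/M/1 result is a standard sanity check before adding the cloud-specific scheduling behavior that makes static analysis infeasible.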

  10. Visuo-Spatial Performance in Autism: A Meta-Analysis

    ERIC Educational Resources Information Center

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M.

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large…

  11. Evaluating Service Quality from Patients' Perceptions: Application of Importance-performance Analysis Method.

    PubMed

    Mohebifar, Rafat; Hasani, Hana; Barikani, Ameneh; Rafiei, Sima

    2016-08-01

    Providing high service quality is one of the main functions of health systems. Measuring service quality is the basic prerequisite for improving quality. The aim of this study was to evaluate the quality of service in teaching hospitals using importance-performance analysis matrix. A descriptive-analytic study was conducted through a cross-sectional method in six academic hospitals of Qazvin, Iran, in 2012. A total of 360 patients contributed to the study. The sampling technique was stratified random sampling. Required data were collected based on a standard questionnaire (SERVQUAL). Data analysis was done through SPSS version 18 statistical software and importance-performance analysis matrix. The results showed a significant gap between importance and performance in all five dimensions of service quality (p < 0.05). In reviewing the gap, "reliability" (2.36) and "assurance" (2.24) dimensions had the highest quality gap and "responsiveness" had the lowest gap (1.97). Also, according to findings, reliability and assurance were in Quadrant (I), empathy was in Quadrant (II), and tangibles and responsiveness were in Quadrant (IV) of the importance-performance matrix. The negative gap in all dimensions of quality shows that quality improvement is necessary in all dimensions. Using quality and diagnosis measurement instruments such as importance-performance analysis will help hospital managers with planning of service quality improvement and achieving long-term goals.
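    The importance-performance matrix above places each service dimension in a quadrant relative to the grand means of importance and performance. The sketch below uses the common Martilla-James quadrant labels; the 5-point-scale scores are invented for illustration and are not the study's SERVQUAL data.

```python
# Importance-performance analysis: classify dimensions into quadrants
# relative to the mean importance and mean performance scores.
def ipa_quadrants(scores):
    """scores: {dimension: (importance, performance)} -> {dimension: quadrant}"""
    mean_imp = sum(i for i, _ in scores.values()) / len(scores)
    mean_perf = sum(p for _, p in scores.values()) / len(scores)
    quadrant = {}
    for dim, (imp, perf) in scores.items():
        if imp >= mean_imp and perf < mean_perf:
            quadrant[dim] = "I: concentrate here"
        elif imp >= mean_imp and perf >= mean_perf:
            quadrant[dim] = "II: keep up the good work"
        elif imp < mean_imp and perf < mean_perf:
            quadrant[dim] = "III: low priority"
        else:
            quadrant[dim] = "IV: possible overkill"
    return quadrant

# Hypothetical (importance, performance) scores on a 5-point scale:
scores = {
    "reliability":    (4.8, 2.4),
    "assurance":      (4.7, 2.5),
    "empathy":        (4.5, 3.1),
    "tangibles":      (4.2, 3.0),
    "responsiveness": (4.1, 2.9),
}
result = ipa_quadrants(scores)
```

    With these illustrative scores the quadrant assignments mirror the study's pattern: reliability and assurance fall in Quadrant I, empathy in II, and tangibles and responsiveness in IV.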

  12. Waveguide-based electro-absorption modulator performance: comparative analysis

    NASA Astrophysics Data System (ADS)

    Amin, Rubab; Khurgin, Jacob B.; Sorger, Volker J.

    2018-06-01

    Electro-optic modulation is a key function for data communication. Given the vast amount of data handled, understanding the intricate physics and trade-offs of modulators on-chip allows revealing performance regimes not yet explored. Here we show a holistic performance analysis for waveguide-based electro-absorption modulators. Our approach centers around material properties revealing obtainable optical absorption leading to effective modal cross-section, and material broadening effects. Taken together, both describe the modulator physical behavior entirely. We consider a plurality of material modulation classes, including two-level absorbers such as quantum dots, free carrier accumulation or depletion such as ITO or silicon, two-dimensional electron gases in semiconductors such as quantum wells, Pauli blocking in graphene, and excitons in two-dimensional atomic layered materials such as found in transition metal dichalcogenides. Our results show that reducing the modal area generally improves modulator performance, defined by the amount of induced electrical charge, and hence the energy-per-bit function, required to switch the signal. We find that broadening increases the amount of switching charge needed. While some material classes allow for reduced broadening, such as quantum dots and 2-dimensional materials, due to their reduced Coulomb screening leading to increased oscillator strengths, the sharpness of broadening is overshadowed by thermal effects independent of the material class. Further, we find that plasmonics allows the switching charge and energy-per-bit function to be reduced by about one order of magnitude compared to bulk photonics. This analysis is aimed as a guide for the community to predict anticipated modulator performance based on both existing and emerging materials.

  13. The Development of a Handbook for Astrobee F Performance and Stability Analysis

    NASA Technical Reports Server (NTRS)

    Wolf, R. S.

    1982-01-01

    An Astrobee F performance and stability analysis is presented for use by the NASA Sounding Rocket Division. The performance analysis provides information regarding altitude, mach number, dynamic pressure, and velocity as functions of time since launch. It is found that payload weight has the greatest effect on performance, and performance prediction accuracy was calculated to remain within 1%. In addition, to assure sufficient flight stability, a predicted rigid-body static margin of at least 8% of the total vehicle length is required. Finally, fin cant angle predictions are given in order to achieve a 2.5 cycle per second burnout roll rate, based on obtaining 75% of the steady roll rate. It is noted that this method can be used by flight performance engineers to create a similar handbook for any sounding rocket series.

  14. Thermodynamic performance analysis of ramjet engine at wide working conditions

    NASA Astrophysics Data System (ADS)

    Ou, Min; Yan, Li; Tang, Jing-feng; Huang, Wei; Chen, Xiao-qian

    2017-03-01

    Although the ramjet has the advantages of high-speed flight and high specific impulse, its performance parameters decline seriously with increasing flight Mach number and flight height. Investigation of the thermodynamic performance of the ramjet is therefore crucial for broadening its working range. In the current study, a typical ramjet model is employed to investigate the performance characteristics at wide working conditions. First, a compression characteristic analysis is carried out based on the Brayton cycle. The obtained results show that the specific cross-sectional areas (A2 and A5) and the air-fuel ratio (f) have a great influence on the ramjet performance indexes. Second, the thermodynamic calculation process of the ramjet is given from the view of aerothermal analysis. Then, the trends of the ramjet performance indexes with the flow conditions, the air-fuel ratio (f), and the specific cross-sectional areas (A2 and A5) under the fixed operating condition, the equipotential dynamic pressure condition, and the variable dynamic pressure condition are discussed. Finally, the optimum values of the specific cross-sectional area (A5) and the air-fuel ratio (f) of the ramjet model at a fixed working condition (Ma = 3.5, H = 12 km) are obtained.
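    A Brayton-cycle analysis of the kind described above reduces, in the ideal limit, to a few temperature ratios. The sketch below is the textbook ideal-ramjet model (perfect inlet and nozzle, calorically perfect gas); the numbers are generic values, not the paper's model at its Ma = 3.5, H = 12 km condition.

```python
import math

# Ideal ramjet cycle (Brayton-based): specific thrust, fuel-air ratio, Isp.
def ideal_ramjet(M0, T0, Tt4, gamma=1.4, cp=1004.0, h_pr=42.8e6, g0=9.81):
    R = cp * (gamma - 1.0) / gamma
    a0 = math.sqrt(gamma * R * T0)                 # freestream speed of sound, m/s
    tau_r = 1.0 + 0.5 * (gamma - 1.0) * M0 ** 2    # ram temperature ratio
    tau_lam = Tt4 / T0                             # burner exit / ambient temperature
    spec_thrust = a0 * M0 * (math.sqrt(tau_lam / tau_r) - 1.0)  # N per (kg/s) of air
    f = cp * T0 * (tau_lam - tau_r) / h_pr         # fuel-air ratio
    isp = spec_thrust / (f * g0)                   # specific impulse, s
    return spec_thrust, f, isp

spec_thrust, f, isp = ideal_ramjet(M0=3.5, T0=216.7, Tt4=1800.0)
```

    As tau_r grows with Mach number it approaches tau_lam and the specific thrust collapses, which is the performance decline with flight Mach number noted in the abstract.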

  15. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer control parameters were noted as shapes, dimensions, probability range factors, and cost. Structural failure concept is presented, and first-order reliability and deterministic methods, benefits, and limitations are discussed. A deterministic reliability technique combining benefits of both is proposed for static structures which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  16. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high-dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all of the reactions of large and huge-scale networks, on any number of threads or nodes.

  17. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE PAGES

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    2017-01-16

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high-dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all of the reactions of large and huge-scale networks, on any number of threads or nodes.
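    Flux balance analysis is a linear program: maximize a biomass flux subject to steady state S·v = 0 and flux bounds. The two-metabolite toy network below (uptake → A → B → biomass) illustrates the formulation with SciPy's generic LP solver; it is a hedged sketch, not the DistributedFBA.jl API, which solves such LPs in Julia at much larger scale.

```python
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B;
# columns: v_uptake, v_conversion, v_biomass).
S = [[1, -1, 0],   # A: produced by uptake, consumed by conversion
     [0, 1, -1]]   # B: produced by conversion, consumed by biomass
b = [0, 0]         # steady state: no net accumulation of A or B
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 units

# linprog minimizes, so negate the biomass objective to maximize it.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=b, bounds=bounds)
optimal_biomass = -res.fun   # the uptake cap sets the optimum here
```

    Flux variability analysis, one of the "variants" mentioned above, reruns this LP with each reaction in turn as the objective, which is exactly the embarrassingly parallel workload DistributedFBA.jl distributes over threads and nodes.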

  18. Relative performance analysis of IR FPA technologies from the perspective of system level performance

    NASA Astrophysics Data System (ADS)

    Haran, Terence L.; James, J. Christopher; Cincotta, Tomas E.

    2017-08-01

    The majority of high performance infrared systems today utilize FPAs composed of intrinsic direct bandgap semiconductor photon detectors such as MCT or InSb. Quantum well detector technologies such as QWIPs, QDIPs, and SLS photodetectors are potentially lower-cost alternatives to MCT and InSb, but the relative performance of these technologies has not been sufficiently high to allow widespread adoption outside of a handful of applications. While detectors are often evaluated using figures of merit such as NETD or D∗, these metrics, which include many underlying aspects such as spectral quantum efficiency, dark current, well size, MTF, and array response uniformity, may be far removed from the performance metrics used to judge performance of a system in an operationally relevant scenario. True comparisons of performance for various detector technologies from the perspective of end-to-end system performance have rarely been conducted, especially considering the rapid progress of the newer quantum well technologies. System level models such as the US Army's Night Vision Integrated Performance Model (NV-IPM) can calculate image contrast and spatial frequency content using data from the target/background, intervening atmosphere, and system components. This paper includes results from a performance parameter sensitivity analysis using NV-IPM to determine the relative importance of various FPA performance parameters to the overall performance of a long-range imaging system. Parameters included are: QE, dark current density, quantum well capacity, downstream readout noise, well fill, image frame rate, frame averaging, and residual fixed pattern noise. The state-of-the-art for XBn, QWIP, and SLS detector technologies operating in the MWIR and LWIR bands will be surveyed to assess performance of quantum structures compared to MCT and InSb. The intent is to provide a comprehensive assessment of quantum detector performance and to identify areas where increased research

  19. Past Performance analysis of HPOTP bearings

    NASA Technical Reports Server (NTRS)

    Bhat, B. N.; Dolan, F. J.

    1982-01-01

    The past performance analysis conducted on three High Pressure Oxygen Turbopump (HPOTP) bearings from the Space Shuttle Main Engine is presented. Metallurgical analysis of failed bearing balls and races, and wear track and crack configuration analyses, were carried out. In addition, one bearing was tested in the laboratory at very high axial loads. The results showed that the cracks were surface-initiated and propagated to subsurface locations at relatively small angles. Subsurface cracks were much more extensive than appeared on the surface. The location of major cracks in the races corresponded to high radial loads rather than high axial loads. There was evidence to suggest that the inner races were heated to elevated temperatures. A failure scenario was developed based on the above findings. According to this scenario, the HPOTP bearings are heated by a combination of high loads and a high coefficient of friction (poor lubrication). Different methods of extending HPOTP bearing life are also discussed. These include reduction of axial loads, improvements in bearing design, lubrication, and cooling, and use of improved bearing materials.

  20. Advanced flight design systems subsystem performance models. Sample model: Environmental analysis routine library

    NASA Technical Reports Server (NTRS)

    Parker, K. C.; Torian, J. G.

    1980-01-01

    A sample environmental control and life support model performance analysis using the environmental analysis routines library is presented. An example of a complete model set up and execution is provided. The particular model was synthesized to utilize all of the component performance routines and most of the program options.

  1. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain deeper knowledge of why certain KPI targets are not met.
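The core idea of the dependency trees described above can be illustrated with a single decision-tree split search: find the lower-level metric and threshold that best separate runs that missed a KPI target from runs that met it. The sketch below is not the authors' implementation; the metric names and monitoring data are invented for illustration.

```python
# Hypothetical sketch: one CART-style split over process/QoS metrics,
# scored by weighted Gini impurity (lower = better separation of KPI misses).

def best_split(samples, kpi_key, metric_keys):
    """Return (score, metric, threshold) of the best single split."""
    def gini(group):
        if not group:
            return 0.0
        p = sum(1 for s in group if s[kpi_key]) / len(group)
        return 2.0 * p * (1.0 - p)

    best = None
    for m in metric_keys:
        values = sorted({s[m] for s in samples})
        for lo, hi in zip(values, values[1:]):
            t = (lo + hi) / 2.0  # candidate threshold between observed values
            left = [s for s in samples if s[m] <= t]
            right = [s for s in samples if s[m] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(samples)
            if best is None or score < best[0]:
                best = (score, m, t)
    return best

# Invented monitoring data: KPI is True when the process deadline was missed.
data = [
    {"kpi_missed": False, "svc_latency_ms": 40,  "approval_min": 5},
    {"kpi_missed": False, "svc_latency_ms": 55,  "approval_min": 7},
    {"kpi_missed": True,  "svc_latency_ms": 180, "approval_min": 6},
    {"kpi_missed": True,  "svc_latency_ms": 210, "approval_min": 8},
]
score, metric, threshold = best_split(data, "kpi_missed", ["svc_latency_ms", "approval_min"])
```

Applied recursively to each resulting subset, this split search yields the kind of dependency tree an analyst can drill down through.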

  2. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
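The summary measures named in this abstract derive from the extracted confusion counts. A minimal sketch, assuming invented study counts: the naive count-pooling shown here illustrates how sensitivity and specificity follow from TP/FP/TN/FN, whereas formal meta-analyses such as this one typically fit bivariate random-effects models instead.

```python
# Illustrative only: pool per-study confusion counts into summary
# sensitivity and specificity. Study counts are hypothetical.

def pooled_accuracy(studies):
    """studies: iterable of (TP, FP, TN, FN) tuples."""
    tp = sum(s[0] for s in studies)
    fp = sum(s[1] for s in studies)
    tn = sum(s[2] for s in studies)
    fn = sum(s[3] for s in studies)
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return sensitivity, specificity

studies = [(45, 10, 60, 5), (30, 8, 40, 10)]  # invented (TP, FP, TN, FN)
sens, spec = pooled_accuracy(studies)
```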

  3. Open-cycle systems performance analysis programming guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, D.A.

    1981-12-01

    The Open-Cycle OTEC Systems Performance Analysis Program is an algorithm programmed on SERI's CDC Cyber 170/720 computer to predict the performance of a Claude-cycle, open-cycle OTEC plant. The algorithm models the Claude-cycle system as consisting of an evaporator, a turbine, a condenser, deaerators, a condenser gas exhaust, a cold water pipe, and cold and warm seawater pumps. Each component is a separate subroutine in the main program. A description is given of how to write Fortran subroutines to fit into the main program for the components of the OTEC plant. An explanation is provided of how to use the algorithm. The main program and existing component subroutines are described. Appropriate common blocks and input and output variables are listed. Preprogrammed thermodynamic property functions for steam, fresh water, and seawater are described.

  4. Electro-optical system for gunshot detection: analysis, concept, and performance

    NASA Astrophysics Data System (ADS)

    Kastek, M.; Dulski, R.; Madura, H.; Trzaskawka, P.; Bieszczad, G.; Sosnowski, T.

    2011-08-01

    The paper discusses technical possibilities to build an effective electro-optical sensor unit for sniper detection using infrared cameras. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. First, an analysis is presented of the three distinct phases of sniper activity: before, during and after the shot. On the basis of experimental data, the parameters defining the relevant sniper signatures were determined, which are essential in assessing the capability of an infrared camera to detect sniper activity. A sniper body and muzzle flash were analyzed as targets, and descriptions of the phenomena which make it possible to detect sniper activities in the infrared spectra, as well as an analysis of physical limitations, were performed. The analyzed infrared systems were simulated using NVTherm software. Calculations for several cameras, equipped with different lenses and detector types, were performed. The simulation of detection ranges was performed for selected scenarios of sniper detection tasks. After the analysis of simulation results, the technical specifications of an infrared sniper detection system required to provide the assumed detection range were discussed. Finally, an infrared camera setup was proposed which can detect a sniper from a range of 1000 meters.

  5. Integrated Model for Performance Analysis of All-Optical Multihop Packet Switches

    NASA Astrophysics Data System (ADS)

    Jeong, Han-You; Seo, Seung-Woo

    2000-09-01

    The overall performance of an all-optical packet switching system is usually determined by two criteria, i.e., switching latency and packet loss rate. In some real-time applications, however, in which packets arriving later than a timeout period are discarded as lost, the packet loss rate becomes the dominant criterion for system performance. Here we focus on evaluating the performance of all-optical packet switches in terms of the packet loss rate, which normally arises from insufficient hardware or the degradation of an optical signal. Considering both aspects, we propose what we believe is a new analysis model for the packet loss rate that reflects the complicated interactions between physical impairments and system-level parameters. On the basis of the estimation model for signal quality degradation in a multihop path, we construct an equivalent analysis model of a switching network for evaluating an average bit error rate. With the model constructed, we then propose an integrated model for estimating the packet loss rate in three architectural examples of multihop packet switches, each of which is based on a different switching concept. We also derive the bounds on the packet loss rate induced by bit errors. Finally, it is verified through simulation studies that our analysis model accurately predicts system performance.
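The link between the average bit error rate and the packet loss rate that this abstract describes can be sketched with a toy model (an assumption for illustration, not the paper's formulation): a packet of L bits traversing h hops is lost if any bit is flipped on any hop, with independent errors.

```python
# Toy model: packet loss induced by bit errors on a multihop path.
# Assumes independent, identically distributed bit errors per hop.

def packet_loss_rate(ber_per_hop, packet_bits, hops):
    """Probability that at least one bit error corrupts the packet."""
    p_bit_ok = (1.0 - ber_per_hop) ** hops    # a bit survives all hops
    return 1.0 - p_bit_ok ** packet_bits      # any of the L bits failing loses the packet

# e.g. a 1500-byte packet (12,000 bits) over 3 hops at BER 1e-9 per hop:
loss = packet_loss_rate(ber_per_hop=1e-9, packet_bits=12_000, hops=3)
```

For small BER this is well approximated by L·h·BER, which is why signal degradation that raises the per-hop BER dominates loss on long multihop paths.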

  6. Effects of specified performance criterion and performance feedback on staff behavior: a component analysis.

    PubMed

    Hardesty, Samantha L; Hagopian, Louis P; McIvor, Melissa M; Wagner, Leaora L; Sigurdsson, Sigurdur O; Bowman, Lynn G

    2014-09-01

    The present study isolated the effects of frequently used staff training intervention components to increase communication between direct care staff and clinicians working on an inpatient behavioral unit. Written "protocol review" quizzes developed by clinicians were designed to assess knowledge about a patient's behavioral protocols. Direct care staff completed these at the beginning of each day and evening shift. Clinicians were required to score and discuss these protocol reviews with direct care staff for at least 75% of shifts over a 2-week period. During baseline, only 21% of clinicians met this requirement. Completing and scoring of protocol reviews did not improve following additional in-service training (M = 15%) or following an intervention aimed at decreasing response effort combined with prompting (M = 28%). After implementing an intervention involving specified performance criterion and performance feedback, 86% of clinicians reached the established goal. Results of a component analysis suggested that the presentation of both the specified performance criterion and supporting contingencies was necessary to maintain acceptable levels of performance. © The Author(s) 2014.

  7. In-Space Propulsion Technologies for Robotic Exploration of the Solar System

    NASA Technical Reports Server (NTRS)

    Johnson, Les; Meyer, Rae Ann; Frame, Kyle

    2006-01-01

    Supporting NASA's Science Mission Directorate, the In-Space Propulsion Technology Program is developing the next generation of space propulsion technologies for robotic, deep-space exploration. Recent technological advancements and demonstrations of key, high-payoff propulsion technologies have been achieved and will be described. Technologies under development and test include aerocapture, solar electric propulsion, solar sail propulsion, and advanced chemical propulsion.

  8. Thermal Protection System Aerothermal Screening Tests in HYMETS Facility

    NASA Technical Reports Server (NTRS)

    Szalai, Christine E.; Beck, Robin A. S.; Gasch, Matthew J.; Alumni, Antonella I.; Chavez-Garcia, Jose F.; Splinter, Scott C.; Gragg, Jeffrey G.; Brewer, Amy

    2011-01-01

    The Entry, Descent, and Landing (EDL) Technology Development Project has been tasked to develop Thermal Protection System (TPS) materials for insertion into future Mars Entry Systems. A screening arc jet test of seven rigid ablative TPS material candidates was performed in the Hypersonic Materials Environmental Test System (HYMETS) facility at NASA Langley Research Center, in both an air and carbon dioxide test environment. Recession, mass loss, surface temperature, and backface thermal response were measured for each test specimen. All material candidates survived the Mars aerocapture relevant heating condition, and some materials showed a clear increase in recession rate in the carbon dioxide test environment. These test results supported subsequent down-selection of the most promising material candidates for further development.

  9. Performance (Off-Design) Cycle Analysis for a Turbofan Engine With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This report presents the performance of a steady-state, dual-spool, separate-exhaust turbofan engine, with an interstage turbine burner (ITB) serving as a secondary combustor. The ITB, which is located in the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet-engine propulsion. A detailed off-design performance analysis of ITB engines is written in Microsoft(Registered Trademark) Excel (Redmond, Washington) macro code with Visual Basic for Applications to calculate engine performance over the entire operating envelope. Several design-point engine cases are pre-selected for off-design analysis using a parametric cycle-analysis code developed previously in Microsoft(Registered Trademark) Excel. The off-design code calculates engine performance (i.e., thrust and thrust-specific fuel consumption) at various flight conditions and throttle settings.
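The two outputs the off-design code reports, thrust and thrust-specific fuel consumption, come from a momentum balance and a fuel-flow ratio. A back-of-the-envelope sketch (not the report's Excel/VBA code; the flow and velocity values are invented, and the nozzle is assumed fully expanded):

```python
# Minimal sketch: uninstalled thrust and TSFC for a single-stream engine.

def thrust_and_tsfc(mdot_air, mdot_fuel, v_exit, v_flight):
    """Mass flows in kg/s, speeds in m/s; returns (thrust in N, TSFC in kg/(N*s))."""
    thrust = (mdot_air + mdot_fuel) * v_exit - mdot_air * v_flight  # momentum balance
    tsfc = mdot_fuel / thrust                                       # fuel burned per unit thrust
    return thrust, tsfc

# Hypothetical cruise point: 100 kg/s of air, 2 kg/s of fuel.
F, tsfc = thrust_and_tsfc(mdot_air=100.0, mdot_fuel=2.0, v_exit=600.0, v_flight=250.0)
```

At a fixed design point, sweeping `v_flight` and the throttle-dependent quantities is what an off-design analysis like this report's does across the operating envelope.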

  10. Initial empirical analysis of nuclear power plant organization and its effect on safety performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, J.; McLaughlin, S.D.; Osborn, R.N.

    This report contains an analysis of the relationship between selected aspects of organizational structure and the safety-related performance of nuclear power plants. The report starts by identifying and operationalizing certain key dimensions of organizational structure that may be expected to be related to plant safety performance. Next, indicators of plant safety performance are created by combining existing performance measures into more reliable indicators. Finally, the indicators of plant safety performance are analyzed using correlational and discriminant analysis. The overall results show that plants with better developed coordination mechanisms, shorter vertical hierarchies, and a greater number of departments tend to perform more safely.

  11. Analysis relating to pavement material characterizations and their effects on pavement performance.

    DOT National Transportation Integrated Search

    1998-01-01

    This report presents the analysis conducted on relating pavement performance or response measures and design considerations to specific pavement layers utilizing data contained in the Long-Term Pavement Performance Program National Information Manage...

  12. Characterizing the flow field around ballutes of various geometries

    NASA Astrophysics Data System (ADS)

    Panko, Jeffrey; Carnasciali, Maria-Isabel

    2016-11-01

    A ballute combines the performance of large parachutes with the rigidity and design flexibility of aeroshells. Such designs, when optimized, could drastically increase the allowable payload for interplanetary missions associated with high reentry velocities, for which the current capabilities of thermal protection systems are being reached. Using commercially available software, a CFD investigation into the flow phenomena and performance characteristics of various such designs was conducted in order to determine features which may prove conducive for use in aerocapture missions, a primary application of such technology. Concerns around current ballute designs stem from the aerodynamic heating loads and flow instabilities at reentry velocities, and as such, the study revolved around geometries which would provide favorable performance under such environments. Design parameters included: blunt versus sharp bodies, boundary layer control, and turbulence model. Results were monitored for changes in lift-to-drag ratios (L/D), separation point, vortex shedding, and control authority. Funding for this work was provided, in part, by the CT Space Grant Consortium.

  13. Starting Performance Analysis for Universal Motors by FEM

    NASA Astrophysics Data System (ADS)

    Kurihara, Kazumi; Sakamoto, Shin-Ichi

    This paper presents a novel transient analysis of universal motors taking into account the time-varying brush-contact resistance and mechanical loss. The transient current, torque and speed during the starting process are computed by solving the electromagnetic, circuit and dynamic motion equations simultaneously. The computed performances have been validated by tests on a 500-W, 2-pole, 50-Hz, 100-V universal motor.

  14. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
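One of the descriptive analyses named above, namespace interaction, amounts to tallying how often subjects in one URI namespace link to objects in another. A simplified, serial sketch (the Cray XMT implementation is massively multi-threaded; the triples and the namespace heuristic here are illustrative assumptions):

```python
# Hypothetical sketch: count (subject namespace, object namespace) pairs
# across a list of RDF triples represented as (s, p, o) URI strings.
from collections import Counter

def namespace(uri):
    """Crude namespace heuristic: the URI up to its last '/' or '#'."""
    cut = max(uri.rfind('/'), uri.rfind('#'))
    return uri[:cut + 1]

def namespace_interactions(triples):
    return Counter((namespace(s), namespace(o)) for s, _, o in triples)

triples = [  # invented example data
    ("http://dbpedia.org/resource/Berlin", "rdf:type", "http://xmlns.com/foaf/0.1/City"),
    ("http://dbpedia.org/resource/Paris", "rdf:type", "http://xmlns.com/foaf/0.1/City"),
]
counts = namespace_interactions(triples)
```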

  15. Free wake analysis of hover performance using a new influence coefficient method

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho

    1990-01-01

    A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time-marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting surface loads analysis to produce a novel, self-contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.

  16. Longitudinal Trend Analysis of Performance Indicators for South Carolina's Technical Colleges

    ERIC Educational Resources Information Center

    Hossain, Mohammad Nurul

    2010-01-01

    This study included an analysis of the trend of performance indicators for the technical college sector of higher education in South Carolina. In response to demands for accountability and transparency in higher education, the state of South Carolina developed sector specific performance indicators to measure various educational outcomes for each…

  17. Crew Exploration Vehicle Launch Abort Controller Performance Analysis

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.; Raney, David L.

    2007-01-01

    This paper covers the simulation and evaluation of a controller design for the Crew Module (CM) Launch Abort System (LAS), to measure its ability to meet the abort performance requirements. The controller used in this study is a hybrid design, including features developed by the Government and the Contractor. Testing is done using two separate 6-degree-of-freedom (DOF) computer simulation implementations of the LAS/CM throughout the ascent trajectory: 1) executing a series of abort simulations along a nominal trajectory for the nominal LAS/CM system; and 2) using a series of Monte Carlo runs with perturbed initial flight conditions and perturbed system parameters. The performance of the controller is evaluated against a set of criteria, which is based upon the current functional requirements of the LAS. Preliminary analysis indicates that the performance of the present controller meets (with the exception of a few cases) the evaluation criteria mentioned above.

  18. Performance analysis in sport: contributions from a joint analysis of athletes' experience and biomechanical indicators.

    PubMed

    Sève, C; Nordez, A; Poizat, G; Saury, J

    2013-10-01

    The purpose of this study was to test the usefulness of combining two types of analysis to investigate sports performance with the aim of optimizing it. These two types of analysis correspond to two levels of athletes' activity: (a) their experiences during performance and (b) the biomechanical characteristics of their movements. Rowing served as an illustration, and the activity of one female crew member was studied during a race. Three types of data were collected: (a) audiovisual data recorded during the race; (b) verbalization data obtained in interviews conducted afterward; and (c) biomechanical data. The courses of experience of the two rowers during the race were reconstructed on the basis of the audiovisual and verbalization data. This paper presents a detailed analysis of a single phenomenon of the race experienced by one of the rowers. According to the coaches, it reflected a dysfunction in crew coordination. The aim of this analysis was to identify the biomechanical characteristics of the rowers' movements that might explain it. The results showed that the phenomenon could be explained principally by an amplitude differential between the two rowers' strokes. On this basis, the coaches defined new training objectives to remedy the dysfunction in crew coordination. © 2011 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. High-efficiency high performance liquid chromatographic analysis of red wine anthocyanins.

    PubMed

    de Villiers, André; Cabooter, Deirdre; Lynen, Frédéric; Desmet, Gert; Sandra, Pat

    2011-07-22

    The analysis of anthocyanins in natural products is of significant relevance in recent times due to the recognised health benefits associated with their consumption. In red grapes and wines in particular, anthocyanins are known to contribute important sensory (colour and taste), antioxidant, and ageing characteristics. However, the detailed investigation of the alteration of these compounds during wine ageing is hampered by the challenges associated with the separation of grape-derived anthocyanins and their derived products. High performance liquid chromatography (HPLC) is primarily used for this purpose, often in combination with mass spectrometric (MS) detection, although conventional HPLC methods provide incomplete resolution. We have previously demonstrated how on-column inter-conversion reactions are responsible for poor chromatographic efficiency in the HPLC analysis of anthocyanins, and how an increase in temperature and a decrease in particle size may improve the chromatographic performance. In the current contribution an experimental configuration for the high efficiency analysis of anthocyanins is derived using the kinetic plot method (KPM). Further, it is shown how analysis under optimal conditions, in combination with MS detection, delivers much improved separation and identification of red wine anthocyanins and their derived products. This improved analytical performance holds promise for the in-depth investigation of these influential compounds in wine during ageing. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Exact Performance Analysis of Two Distributed Processes with Multiple Synchronization Points.

    DTIC Science & Technology

    1987-05-01

    number of processes with straight-line sequences of semaphore operations. We use the geometric model for performance analysis, in contrast to proving...

  1. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  2. Performance Cycle Analysis of a Two-Spool, Separate-Exhaust Turbofan With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This paper presents the performance cycle analysis of a dual-spool, separate-exhaust turbofan engine with an Interstage Turbine Burner (ITB) serving as a secondary combustor. The ITB, which is located in the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet engine propulsion. A detailed performance analysis of this engine has been conducted for steady-state engine performance prediction. A code was written that is capable of predicting engine performance (i.e., thrust and thrust specific fuel consumption) at varying flight conditions and throttle settings. Two design-point engines were studied to reveal trends in performance at both full and partial throttle operations. A mission analysis is also presented to confirm the fuel-saving advantage of adding the ITB.

  3. Benefits of Application of Advanced Technologies for a Neptune Orbiter, Atmospheric Probes and Triton Lander

    NASA Technical Reports Server (NTRS)

    Somers, Alan; Celano, Luigi; Kauffman, Jeffrey; Rogers, Laura; Peterson, Craig

    2005-01-01

    Missions with planned launch dates several years from today pose significant design challenges in properly accounting for technology advances that may occur in the time leading up to actual spacecraft design, build, test and launch. Conceptual mission and spacecraft designs that rely solely on off-the-shelf technology will result in conservative estimates that may not be attractive or truly representative of the mission as it actually will be designed and built. This past summer, as part of one of NASA's Vision Mission Studies, a group of students at the Laboratory for Spacecraft and Mission Design (LSMD) developed and analyzed different Neptune mission baselines and determined the benefits of various assumed technology improvements. The baseline mission uses either a chemical propulsion system or a solar-electric system. Insertion into orbit around Neptune is achieved by means of aerocapture. Neptune's large moon Triton is used as a tour engine. With these technologies a comprehensive Cassini-class investigation of the Neptune system is possible. Technologies under investigation include the aerocapture heat shield and thermal protection system, both chemical and solar electric propulsion systems, spacecraft power, and energy storage systems.

  4. TPS design for aerobraking at Earth and Mars

    NASA Astrophysics Data System (ADS)

    Williams, S. D.; Gietzel, M. M.; Rochelle, W. C.; Curry, D. M.

    1991-08-01

    An investigation was made to determine the feasibility of using an aerobrake system for manned and unmanned missions to Mars, and to Earth from Mars and lunar orbits. A preliminary thermal protection system (TPS) was examined for five unmanned small-nose-radius, straight bi-conic vehicles and a scaled-up Aeroassist Flight Experiment (AFE) vehicle aerocapturing at Mars. Analyses were also conducted for the scaled-up AFE and an unmanned Sample Return Cannister (SRC) returning from Mars and aerocapturing into Earth orbit. Also analyzed were three different classes of lunar transfer vehicles (LTVs): an expendable, scaled-up, modified Apollo Command Module (CM); a raked cone (modified AFE); and three large-nose-radius domed cylinders. The LTVs would be used to transport personnel and supplies between Earth and the moon in order to establish a manned base on the lunar surface. The TPS for all vehicles analyzed is shown to have an advantage over an all-propulsive velocity reduction for orbit insertion. Results indicate that TPS weight penalties of less than 28 percent can be achieved using current material technology, and slightly less for the most favorable LTV using advanced material technology.

  5. Performance Analysis of GAME: A Generic Automated Marking Environment

    ERIC Educational Resources Information Center

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  6. Performance Analysis, Design Considerations, and Applications of Extreme-Scale In Situ Infrastructures

    DOE PAGES

    Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...

    2016-11-01

    A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.
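The post hoc versus in situ trade-off described above can be made concrete with a toy cost model (all numbers are assumptions for illustration, not measurements from the paper): post hoc time is dominated by writing every step's raw output, while in situ pays a per-step analysis overhead but writes only small derived products.

```python
# Toy cost model contrasting the two workflows the study compares.

def post_hoc_seconds(steps, raw_bytes_per_step, io_bw):
    """Write all raw data now; analysis happens later, off the clock."""
    return steps * raw_bytes_per_step / io_bw

def in_situ_seconds(steps, analysis_s_per_step, product_bytes_per_step, io_bw):
    """Analyze in memory each step, write only images/statistics."""
    return steps * (analysis_s_per_step + product_bytes_per_step / io_bw)

# Assumed scenario: 1000 steps, 10 GB raw output/step, 1 GB/s file system,
# 2 s of in situ analysis/step, 100 MB of products/step.
t_post = post_hoc_seconds(1000, 10e9, 1e9)
t_in = in_situ_seconds(1000, 2.0, 100e6, 1e9)
```

Under these assumed numbers the in situ workflow spends far less time blocked on I/O, which is the widening-gap argument in miniature; real overheads depend on the coupling interface and scale, which is what the paper measures.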

  7. Performance Analysis of Three-Phase Induction Motor with AC Direct and VFD

    NASA Astrophysics Data System (ADS)

    Kumar, Dinesh

    2018-03-01

    Electrical machine analysis and performance calculation are important aspects of efficient drive system design. The development of power electronics devices and power converters provides smooth speed control of induction motors by changing the frequency of the input supply. These converters, on the one hand, provide more flexible speed control; on the other hand, they introduce harmonics and their associated ailments, such as pulsating torque, distorted current and voltage waveforms, and increased losses. This paper includes the performance analysis of a three-phase induction motor with three-phase AC direct supply and with a variable frequency drive (VFD). The comparison is drawn with respect to various parameters. MATLAB-SIMULINKTM is used for the analysis.
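The baseline relations behind VFD speed control can be sketched briefly: synchronous speed follows the supply frequency (N_s = 120 f / p), and the rotor runs below it by the slip. A minimal sketch with illustrative values (not the paper's MATLAB-SIMULINK model):

```python
# Induction motor speed relations under direct AC supply versus a VFD.

def synchronous_speed_rpm(freq_hz, poles):
    """Stator field speed: N_s = 120 * f / p."""
    return 120.0 * freq_hz / poles

def rotor_speed_rpm(freq_hz, poles, slip):
    """Rotor runs below synchronous speed by the slip fraction."""
    return synchronous_speed_rpm(freq_hz, poles) * (1.0 - slip)

# 4-pole motor, 4% slip: direct on 50 Hz mains versus a VFD set to 30 Hz.
n_mains = rotor_speed_rpm(50.0, 4, slip=0.04)
n_vfd = rotor_speed_rpm(30.0, 4, slip=0.04)
```

Changing the frequency moves the whole speed curve, which is the smooth speed control the abstract refers to; the harmonic side effects come from how the converter synthesizes that variable-frequency waveform.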

  8. Design and performance of an analysis-by-synthesis class of predictive speech coders

    NASA Technical Reports Server (NTRS)

    Rose, Richard C.; Barnwell, Thomas P., III

    1990-01-01

    The performance of a broad class of analysis-by-synthesis linear predictive speech coders is quantified experimentally. The class of coders includes a number of well-known techniques as well as a very large number of speech coders which have not been named or studied. A general formulation for deriving the parametric representation used in all of the coders in the class is presented. A new coder, named the self-excited vocoder, is discussed because of its good performance with low complexity, and because of the insight this coder gives to analysis-by-synthesis coders in general. The results of a study comparing the performances of different members of this class are presented. The study takes the form of a series of formal subjective and objective speech quality tests performed on selected coders. The results of this study lead to some interesting and important observations concerning the controlling parameters for analysis-by-synthesis speech coders.

  9. Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.

    PubMed

    Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie

    2010-07-01

    Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to a lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy-to-use, easy-to-learn, and error-resistant EHR systems. To evaluate the usability of an EHR system and suggest areas of improvement in the user interface, the user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task as a Mental (Internal) or Physical (External) operator. This analysis was performed by two analysts independently, and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users, and show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, users would spend about 22 min (independent of system response time) on data entry, of which 11 min are spent on the more effortful mental operators. The inter-rater reliability for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically identifies the following findings related to the performance of AHLTA: (1) a large number of average total steps to complete common tasks, (2) high average execution time, and (3) a large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks.
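
    The KLM technique described above predicts execution time by summing standard per-operator times. The sketch below uses the classic operator estimates; the task sequence is hypothetical, not one of the 14 AHLTA tasks.

```python
# Keystroke-Level Model (KLM) sketch: estimate task execution time by summing
# standard operator times. Operator values are the widely used KLM estimates;
# the task sequence below is illustrative, not from the AHLTA study.
KLM_TIMES = {
    "K": 0.20,   # keystroke or button press (s)
    "P": 1.10,   # point with mouse (s)
    "H": 0.40,   # home hands between keyboard and mouse (s)
    "M": 1.35,   # mental preparation (s)
}

def klm_estimate(operators):
    """Return total predicted execution time (s) for a sequence of operators."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical data-entry step: think, point, click, home, type five characters.
task = ["M", "P", "K", "H"] + ["K"] * 5
total = klm_estimate(task)
```

    Counting the mental ("M") operators separately in such a sequence is what yields the paper's split between mental and physical effort.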

  10. Evaluating health service quality: using importance performance analysis.

    PubMed

    Izadi, Azar; Jahani, Younes; Rafiei, Sima; Masoud, Ali; Vali, Leila

    2017-08-14

    Purpose Measuring healthcare service quality provides an objective guide for managers and policy makers to improve their services and patient satisfaction. Consequently, the purpose of this paper is to measure the service quality provided to surgical and medical inpatients at Kerman University of Medical Sciences (KUMS) in 2015. Design/methodology/approach A descriptive-analytic study, using a cross-sectional method in the KUMS training hospitals, was implemented between October 2 and March 15, 2015. Using stratified random sampling, 268 patients were selected. Data were collected using an importance-performance analysis (IPA) questionnaire, which measures current performance and determines each item's importance from the patients' perspectives. These data indicate overall satisfaction and suggest practical strategies for managers to plan accordingly. Findings Findings revealed a significant gap between service importance and performance. From the patients' viewpoint, tangibility was the highest priority (mean=3.54), while reliability was given the highest performance (mean=3.02). The least important and lowest-performing dimension was social accountability (mean=1.91 and 1.98, respectively). Practical implications Healthcare managers should focus on patient viewpoints and apply patient comments to solve problems, improve service quality, and increase patient satisfaction. Originality/value The authors applied an IPA questionnaire to measure the service quality provided to surgical and medical ward patients. This method identifies and corrects service quality shortcomings, improving service recipient perceptions.
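
    The core of an IPA is the two-by-two grid that places each item relative to the mean importance and mean performance scores. The sketch below shows that classification step; the item names echo the abstract, but the paired scores are illustrative, not the KUMS data.

```python
# Importance-performance analysis (IPA) sketch: classify each service item into
# one of the four standard quadrants by comparing its (importance, performance)
# scores to the grand means. Scores below are illustrative only.
items = {
    "tangibility":           (3.54, 2.80),
    "reliability":           (3.20, 3.02),
    "social accountability": (1.91, 1.98),
    "responsiveness":        (3.00, 2.50),
}

imp_mean = sum(i for i, _ in items.values()) / len(items)
perf_mean = sum(p for _, p in items.values()) / len(items)

def quadrant(importance, performance):
    """Standard IPA quadrant labels."""
    if importance >= imp_mean:
        return "concentrate here" if performance < perf_mean else "keep up the good work"
    return "low priority" if performance < perf_mean else "possible overkill"

grid = {name: quadrant(i, p) for name, (i, p) in items.items()}
```

    Items landing in "concentrate here" (high importance, low performance) are the ones that expose the importance-performance gap the paper reports.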

  11. Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Robert Meyer and Michael Christian examine select performance-pay plans…

  12. Analysis performance of proton exchange membrane fuel cell (PEMFC)

    NASA Astrophysics Data System (ADS)

    Mubin, A. N. A.; Bahrom, M. H.; Azri, M.; Ibrahim, Z.; Rahim, N. A.; Raihan, S. R. S.

    2017-06-01

    Recently, the proton exchange membrane fuel cell (PEMFC) has gained much attention in renewable energy technology because it is mechanically simple and a zero-emission power source. PEMFC performance depends on surrounding conditions such as temperature and pressure. This paper presents an analysis of PEMFC performance based on a mathematical thermodynamic model developed in Matlab/Simulink. The differential equations of the thermodynamic model are used to explain the contribution of heat to the PEMFC's output voltage, and the partial pressure equation for hydrogen is included in the model to study the voltage behaviour with respect to the input hydrogen pressure. The efficiency of the model is 33.8%, calculated by applying the energy conversion device equations to the thermal efficiency. The PEMFC's output voltage increases with increasing hydrogen input pressure and temperature.
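
    The pressure dependence reported above is commonly captured by the Nernst term in simplified PEMFC voltage models. The sketch below uses that standard simplified form; the standard potential E0 and the operating pressures are illustrative assumptions, not values from the paper.

```python
# Sketch of the Nernst open-circuit voltage term used in simple PEMFC models,
# illustrating why output voltage rises with hydrogen partial pressure.
# Physical constants are exact; E0 and the pressures are assumptions.
import math

R = 8.314      # J/(mol*K), universal gas constant
F = 96485.0    # C/mol, Faraday constant
E0 = 1.229     # V, standard reversible potential (assumed)

def nernst_voltage(T, p_h2, p_o2):
    """Reversible cell voltage (V) at temperature T (K), partial pressures in atm."""
    return E0 + (R * T / (2 * F)) * math.log(p_h2 * math.sqrt(p_o2))

v_low = nernst_voltage(333.0, 1.0, 0.21)    # 1 atm hydrogen, air-side oxygen
v_high = nernst_voltage(333.0, 3.0, 0.21)   # higher hydrogen feed pressure
```

    A full model would subtract activation, ohmic, and concentration losses from this reversible voltage; the Nernst term alone already shows the pressure trend.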

  13. An analysis of functional shoulder movements during task performance using Dartfish movement analysis software.

    PubMed

    Khadilkar, Leenesh; MacDermid, Joy C; Sinden, Kathryn E; Jenkyn, Thomas R; Birmingham, Trevor B; Athwal, George S

    2014-01-01

    Video-based movement analysis software (Dartfish) has potential clinical applications for understanding shoulder motion if functional measures can be reliably obtained. The primary purpose of this study was to describe the functional range of motion (ROM) of the shoulder used to perform a subset of functional tasks. A second purpose was to assess the reliability of functional ROM measurements obtained by different raters using Dartfish software. Ten healthy participants, mean age 29 ± 5 years, were videotaped while performing five tasks selected from the Disabilities of the Arm, Shoulder and Hand (DASH) questionnaire. Video cameras and markers were used to obtain video images suitable for analysis in Dartfish software. Three repetitions of each task were performed, and shoulder movements from all three repetitions were analyzed using the tracking tool of the Dartfish software to obtain shoulder joint angles and arcs of motion. Test-retest and inter-rater reliability of the measurements were evaluated using intraclass correlation coefficients (ICC). Maximum (coronal plane) abduction (118° ± 16°) and (sagittal plane) flexion (111° ± 15°) were observed during 'washing one's hair;' maximum extension (-68° ± 9°) was identified during 'washing one's own back.' Minimum shoulder ROM was observed during 'opening a tight jar' (33° ± 13° abduction and 13° ± 19° flexion). Test-retest reliability (ICC = 0.45 to 0.94) suggests high inter-individual task variability, and inter-rater reliability (ICC = 0.68 to 1.00) showed moderate to excellent agreement. Key findings include: 1) the functional shoulder ROM identified in this study was comparable to that of similar studies; 2) healthy individuals require less than full ROM when performing five common ADL tasks; 3) high participant variability was observed during performance of the five ADL tasks; and 4) Dartfish software provides a clinically relevant tool to analyze shoulder function.
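
    At its core, a video tracking tool converts digitized marker positions into joint angles frame by frame. The sketch below shows that 2-D computation for a shoulder angle; the marker coordinates are illustrative, not study data, and Dartfish's internal method may differ.

```python
# Sketch of a 2-D joint-angle computation from digitized marker coordinates
# (hip, shoulder, elbow) in one video frame. Positions are illustrative.
import math

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` between segments joint->proximal and joint->distal."""
    v1 = (proximal[0] - joint[0], proximal[1] - joint[1])
    v2 = (distal[0] - joint[0], distal[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Hypothetical frame: trunk vertical, upper arm horizontal -> 90 deg abduction.
hip, shoulder, elbow = (0.0, 0.0), (0.0, 0.5), (0.3, 0.5)
abduction = joint_angle(hip, shoulder, elbow)
```

    Repeating this over every frame of a repetition yields the arcs of motion from which maximum and minimum ROM are read off.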

  14. A Study of ATLAS Grid Performance for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Fine, Valery; Wenaus, Torre

    2012-12-01

    In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of groundbreaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011, including studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining of data archived by the PanDA workload management system.

  15. Space tug economic analysis study. Volume 2: Tug concepts analysis. Appendix: Tug design and performance data base

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.

  16. Thermodynamic Analysis of Dual-Mode Scramjet Engine Operation and Performance

    NASA Technical Reports Server (NTRS)

    Riggins, David; Tacket, Regan; Taylor, Trent; Auslender, Aaron

    2006-01-01

    Recent analytical advances in understanding the performance continuum (the thermodynamic spectrum) for air-breathing engines based on fundamental second-law considerations have clarified scramjet and ramjet operation, performance, and characteristics. Second-law based analysis is extended specifically in this work to clarify and describe the performance characteristics for dual-mode scramjet operation in the mid-speed range of flight Mach 4 to 7. This is done by a fundamental investigation of the complex but predictable interplay between heat release and irreversibilities in such an engine; results demonstrate the flow and performance character of the dual mode regime and of dual mode transition behavior. Both analytical and computational (multi-dimensional CFD) studies of sample dual-mode flow-fields are performed in order to demonstrate the second-law capability and performance and operability issues. The impact of the dual-mode regime is found to be characterized by decreasing overall irreversibility with increasing heat release, within the operability limits of the system.

  17. Sigma metric analysis for performance of creatinine with fresh frozen serum.

    PubMed

    Kang, Fengfeng; Zhang, Chuanbao; Wang, Wei; Wang, Zhiguo

    2016-01-01

    Six sigma provides an objective and quantitative methodology to describe laboratory testing performance. In this study, we conducted a national trueness verification scheme with fresh frozen serum (FFS) for serum creatinine to evaluate its performance in China. Two different concentration levels of FFS, value-assigned with a reference method, were sent to 98 laboratories in China. The imprecision and bias of the measurement procedure were calculated for each participant to further evaluate the sigma value. Quality goal index (QGI) analysis was used to investigate the reason for unacceptable performance in laboratories with σ < 3. Our study indicated that the sample with the high concentration of creatinine had preferable sigma values. For the enzymatic method, 7.0% (5/71) to 45.1% (32/71) of the laboratories need to improve their measurement procedures (σ < 3); for the Jaffe method, the percentages ranged from 11.5% (3/26) to 73.1% (19/26). QGI analysis suggested that most of the laboratories (62.5% for the enzymatic method and 68.4% for the Jaffe method) should make an effort to improve trueness (QGI > 1.2). Only 3.1-5.3% of the laboratories should improve both precision and trueness. Sigma metric analysis of the serum creatinine assays is disappointing, mainly due to unacceptable analytical bias according to the QGI analysis. Further effort is needed to enhance the trueness of creatinine measurement.
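
    The decision rules in the abstract follow directly from the standard sigma-metric and QGI formulas. The sketch below applies them; the allowable total error and the example bias/CV values are illustrative, not taken from the scheme data.

```python
# Sigma-metric and quality goal index (QGI) sketch using the standard formulas
# sigma = (TEa - |bias|) / CV and QGI = |bias| / (1.5 * CV), all in percent.
# TEa and the example bias/CV values below are illustrative assumptions.
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma value from allowable total error, bias, and imprecision (CV)."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def qgi(bias_pct, cv_pct):
    """Quality goal index; QGI > 1.2 points to a trueness (bias) problem."""
    return abs(bias_pct) / (1.5 * cv_pct)

# Hypothetical lab: allowable total error 12%, bias 6%, imprecision (CV) 2%.
sigma = sigma_metric(12.0, 6.0, 2.0)
g = qgi(6.0, 2.0)
needs_trueness_improvement = sigma < 3 or g > 1.2
```

    Labs with σ < 3 are flagged as unacceptable, and the QGI then attributes the failure to bias (QGI > 1.2), imprecision (QGI < 0.8), or both.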

  18. Internet Performance Analysis of South Asian Countries Using End-to-End Internet Performance Measurements

    DOE PAGES

    Ali, Saqib; Wang, Guojun; Cottrell, Roger Leslie; ...

    2018-05-28

    Internet performance is highly correlated with key economic development metrics of a region. According to the World Bank, the economic growth of a country increases 1.3% with a 10% increase in the speed of the Internet. It is therefore necessary to monitor and understand the performance of the Internet links in the region: doing so helps to identify infrastructural inefficiencies, poor resource allocation, and routing issues, and provides sound suggestions for future upgrades. The objective of this paper is to understand the Internet performance and routing infrastructure of South Asian countries in comparison to the developed world and neighboring countries, using end-to-end Internet performance measurements. The South Asian countries comprise nearly 32% of the Internet users in Asia and nearly 16% of the world. The Internet performance metrics in the region are collected through the PingER framework, developed by the SLAC National Accelerator Laboratory, USA, which has been running for the last 20 years. PingER has 16 monitoring nodes in the region, and in the last year PingER monitored about 40 sites in South Asia using the ubiquitous ping facility. The collected data is used to estimate the key Internet performance metrics of South Asian countries, which are compared with the neighboring countries and the developed world. In particular, the TCP throughput of the countries is correlated with different development indices, and worldwide Internet connectivity and routing patterns of the countries are investigated to identify inconsistencies in the region. The performance analysis revealed that the South Asia region is 7-10 years behind the developed regions of North America (USA and Canada), Europe, and East Asia.
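
    Ping-based frameworks such as PingER commonly estimate achievable TCP throughput from measured RTT and loss using the Mathis et al. approximation. The sketch below applies that formula; the MSS, RTT, and loss figures are hypothetical, not PingER measurements.

```python
# Mathis et al. approximation sketch: achievable TCP throughput estimated from
# round-trip time and packet loss as throughput ~ MSS / (RTT * sqrt(loss)).
# The MSS/RTT/loss values below are illustrative assumptions.
import math

def mathis_throughput(mss_bytes, rtt_s, loss_rate):
    """Approximate TCP throughput in bits per second."""
    return (mss_bytes * 8) / (rtt_s * math.sqrt(loss_rate))

# Hypothetical long-haul path: 1460-byte MSS, 300 ms RTT, 2% packet loss.
bps = mathis_throughput(1460, 0.300, 0.02)
mbps = bps / 1e6
```

    The inverse dependence on RTT and on the square root of loss is why high-latency, lossy regional links show throughput figures far below those of the developed regions.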


  20. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented for a group of subjects with significant coronary artery stenosis and a group of controls, obtained by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  1. Analysis of architect’s performance indicators in project delivery process

    NASA Astrophysics Data System (ADS)

    Marisa, A.

    2018-03-01

    An architect, as a professional in the construction industry, should perform well in the project delivery process. As a design professional, the architect plays an important role in ensuring that the process is well conducted by delivering a high-quality product to the clients; analyzing architects' performance indicators is therefore crucial. This study aims to analyze the relative importance of architect performance indicators in the project delivery process among registered architects in North Sumatera, Indonesia. A total of five indicators that measure architect performance in the project delivery process were identified, and 110 completed questionnaires were obtained and used for data analysis. A relative importance index was used to rank the indicators. Results indicate that focus on the clients is the most important indicator of architect performance in the project delivery process. This study also identifies project communication as a crucial indicator perceived by the architects for measuring their performance, and fills a knowledge gap, overlooked by previous studies, on identifying the most important indicators of architect performance from architects' own perspectives to improve performance assessment in the project delivery process.
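
    The ranking described above rests on the standard relative importance index, RII = ΣW / (A·N), where W are the Likert ratings, A the highest possible rating, and N the number of respondents. The sketch below computes it; the indicator names echo the abstract, but the rating tallies are illustrative, not the survey data.

```python
# Relative importance index (RII) sketch: RII = sum(W) / (A * N) for Likert
# ratings W, top rating A, and N respondents. Tallies below are hypothetical.
def rii(rating_counts, highest=5):
    """rating_counts[k] = number of respondents giving rating k (1..highest)."""
    n = sum(rating_counts.values())
    total = sum(k * c for k, c in rating_counts.items())
    return total / (highest * n)

# Hypothetical tallies for 110 respondents on two indicators.
focus_on_clients = rii({5: 60, 4: 40, 3: 10})
project_communication = rii({5: 30, 4: 50, 3: 25, 2: 5})

ranking = sorted(
    {"focus on clients": focus_on_clients,
     "project communication": project_communication}.items(),
    key=lambda kv: kv[1], reverse=True)
```

    Indicators are then ranked by descending RII, which is how a "most important indicator" is identified.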

  2. Body sway, aim point fluctuation and performance in rifle shooters: inter- and intra-individual analysis.

    PubMed

    Ball, Kevin A; Best, Russell J; Wrigley, Tim V

    2003-07-01

    In this study, we examined the relationships between body sway, aim point fluctuation and performance in rifle shooting on an inter- and intra-individual basis. Six elite shooters performed 20 shots under competition conditions. For each shot, body sway parameters and four aim point fluctuation parameters were quantified for the time periods 5 s to shot, 3 s to shot and 1 s to shot. Three parameters were used to indicate performance. An AMTI LG6-4 force plate was used to measure body sway parameters, while a SCATT shooting analysis system was used to measure aim point fluctuation and shooting performance. Multiple regression analysis indicated that body sway was related to performance for four shooters. Also, body sway was related to aim point fluctuation for all shooters. These relationships were specific to the individual, with the strength of association, parameters of importance and time period of importance different for different shooters. Correlation analysis of significant regressions indicated that, as body sway increased, performance decreased and aim point fluctuation increased for most relationships. We conclude that body sway and aim point fluctuation are important in elite rifle shooting and performance errors are highly individual-specific at this standard. Individual analysis should be a priority when examining elite sports performance.
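
    The per-shooter regressions described above boil down to fitting shot performance against sway parameters. The sketch below shows the simplest single-predictor case with an ordinary least-squares line; the 20 shot values are synthetic, constructed so that score falls as sway grows, as reported for most relationships.

```python
# Least-squares sketch for one shooter: relate a body sway parameter to shot
# score over 20 shots. The data are synthetic, not from the study.
def linear_fit(x, y):
    """Return (slope, intercept) of the ordinary least-squares line y = a*x + b."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

sway_mm = [2.0 + 0.1 * i for i in range(20)]   # hypothetical sway per shot
score = [10.5 - 0.4 * s for s in sway_mm]      # score decreasing with sway

slope, intercept = linear_fit(sway_mm, score)
```

    A negative slope here corresponds to the study's finding that performance decreases as body sway increases; the multiple-regression case extends this to several sway and aim-point parameters at once.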

  3. Clinical laboratory as an economic model for business performance analysis.

    PubMed

    Buljanović, Vikica; Patajac, Hrvoje; Petrovecki, Mladen

    2011-08-15

    To perform a SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve the business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. The impact of possible threats and weaknesses on the business performance of the Clinical Laboratory at Našice General County Hospital, and the use of strengths and opportunities to improve operating profit, were simulated using models created on the basis of the SWOT analysis results. The operating profit, as a measure of the profitability of the clinical laboratory, was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters in the profit and loss account for 2008 were determined using opportunities and potential threats, and an economic sensitivity analysis was made using changes in the key parameters. The profit and loss account and economic sensitivity analysis were tools for quantifying the impact of changes in revenues and expenses on the business operations of the clinical laboratory. Results of the simulation models showed that the operating profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same, and could be increased to €535 804 if the laboratory's strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operating profit would decrease by €384 465. This type of modeling may be used to monitor the business operations of any clinical laboratory and improve its financial situation by implementing changes in the next fiscal year.
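
    The simulation amounts to recomputing operating profit (total revenue minus total expenses) under scenario-scaled inputs. The sketch below mirrors that structure: the baseline profit matches the €470 723 figure from the abstract, but the revenue/expense split and the scenario multipliers are illustrative assumptions, not the paper's figures.

```python
# Profit-and-loss sensitivity sketch: operating profit = revenue - expenses,
# recomputed under SWOT-style scenarios that scale the inputs. The baseline
# split and the scenario multipliers are illustrative assumptions.
def operating_profit(revenue, expenses):
    return revenue - expenses

baseline_revenue = 1_500_000.0    # EUR (assumed split giving the 2008 profit)
baseline_expenses = 1_029_277.0   # EUR

scenarios = {
    "baseline":            (1.00, 1.00),
    "threats_materialize": (0.90, 1.10),   # revenue -10%, expenses +10%
    "strengths_utilized":  (1.05, 0.98),   # revenue +5%, expenses -2%
}

profits = {
    name: operating_profit(baseline_revenue * r, baseline_expenses * e)
    for name, (r, e) in scenarios.items()
}
```

    Sweeping the multipliers over a range of values turns this into the economic sensitivity analysis the paper describes.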

  4. Energy performance analysis of a detached single-family house to be refurbished

    NASA Astrophysics Data System (ADS)

    Aleixo, Kevin; Curado, António

    2017-07-01

    This study was developed to analyze the reinforcement of the energy performance of a detached single-family house to be refurbished, using the building as a case study for simulation and experimental analysis. The building is located in Viana do Castelo, a city in the northwest of Portugal near the Atlantic Ocean. The study characterized the thermal performance of the house using simulation analysis in a dynamic regime, while the energy consumption study was developed using steady-state simulation tools. Finally, the study aimed to propose the best retrofitting solutions, both passive and active, to improve the energy performance of the building. The outcomes show the importance of passive retrofitting solutions for thermal comfort and energy performance: with a set of thermal solutions, such as insulating the roof, the walls, and the windows, it is possible to achieve a global gain of 0.63 °C and to reduce energy consumption by 61.46 kWh/(m².year). The study of the building in a simplified thermal regime, according to the Portuguese energy efficiency regulation, allowed the determination of the energy efficiency class of the house and of the proposed retrofitting solutions. The initial energy performance class of the building is C. With the application of a set of passive solutions, it is possible to improve the energy performance to class B; with the implementation of some active solutions, it is possible to reach energy class A+.

  5. Experimental and Numerical analysis of Metallic Bellow for Acoustic Performance

    NASA Astrophysics Data System (ADS)

    Panchwadkar, Amit A.; Awasare, Pradeep J., Dr.; Ingle, Ravidra B., Dr.

    2017-08-01

    Noise is a concern in industrial work environments: overall machinery noise interrupts communication between workers and poses a health hazard, which motivates noise attenuation. Since modifying the machine setup may affect its performance, attenuation along the transmission path using the Helmholtz resonator principle is a better option. The resonator's design variables determine its resonating frequency and hence the frequency range of attenuation. This paper deals with a metallic bellow, which behaves like an inertial mass under an incident sound wave; the sound wave energy is affected by the hard boundary conditions of the resonator and bellow. The metallic bellow is used in combination with a resonator to find the transmission loss (TL). A microphone attached to an FFT analyzer gives the frequency range for the numerical analysis. Numerical analysis of the bellow and resonator is carried out to summarize the acoustic behavior of the bellow, which can then be analyzed numerically to check noise attenuation for a centrifugal blower. An impedance tube measurement technique is performed to validate the numerical results for the assembly. Dimensional and shape modifications can be made to tune the acoustic performance of the bellow.
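
    The "design variables give the resonating frequency" statement refers to the classical Helmholtz relation f = (c/2π)·√(A/(V·L)), with neck area A, cavity volume V, and effective neck length L. The sketch below evaluates it; the dimensions are illustrative, not the paper's resonator geometry.

```python
# Helmholtz resonator sketch: resonant frequency from neck area, cavity volume,
# and effective neck length via f = (c / 2*pi) * sqrt(A / (V * L)).
# The dimensions below are illustrative assumptions.
import math

def helmholtz_frequency(c, neck_area, cavity_volume, neck_length_eff):
    """Resonant frequency (Hz) of a Helmholtz resonator."""
    return (c / (2 * math.pi)) * math.sqrt(
        neck_area / (cavity_volume * neck_length_eff))

c = 343.0                # speed of sound in air, m/s
A = math.pi * 0.01**2    # neck cross-section for a 2 cm diameter neck, m^2
V = 1.0e-3               # 1 litre cavity, m^3
L = 0.05                 # effective neck length incl. end correction, m

f = helmholtz_frequency(c, A, V, L)
```

    Choosing A, V, and L to place f at the blower's dominant tone is how the resonator is tuned to the target attenuation band.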

  6. GLOBAL REFERENCE ATMOSPHERIC MODELS FOR AEROASSIST APPLICATIONS

    NASA Technical Reports Server (NTRS)

    Duvall, Aleta; Justus, C. G.; Keller, Vernon W.

    2005-01-01

    Aeroassist is a broad category of advanced transportation technology encompassing aerocapture, aerobraking, aeroentry, precision landing, hazard detection and avoidance, and aerogravity assist. The eight destinations in the Solar System with sufficient atmosphere to enable aeroassist technology are Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Saturn's moon Titan. Engineering-level atmospheric models for five of these targets - Earth, Mars, Titan, Neptune, and Venus - have been developed at NASA's Marshall Space Flight Center. These models are useful as tools in mission planning and systems analysis studies associated with aeroassist applications. The series of models is collectively named the Global Reference Atmospheric Model or GRAM series. An important capability of all the models in the GRAM series is their ability to simulate quasi-random perturbations for Monte Carlo analysis in developing guidance, navigation and control algorithms, for aerothermal design, and for other applications sensitive to atmospheric variability. Recent example applications are discussed.
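
    The Monte Carlo perturbation capability mentioned above works by dispersing a nominal atmospheric profile with correlated random noise. The sketch below is a minimal stand-in for that idea, not the GRAM algorithm itself: the exponential nominal profile, the 5% sigma, and the AR(1) correlation are all illustrative assumptions.

```python
# Sketch of quasi-random atmospheric perturbation for Monte Carlo dispersion
# analysis: disperse a nominal density profile with altitude-correlated noise.
# Profile, sigma, and correlation are illustrative, not GRAM's actual model.
import math
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def perturbed_density(rho_nominal, sigma_frac=0.05, corr=0.9):
    """Perturb a profile with a first-order autocorrelated (AR(1)) sequence."""
    out, p = [], 0.0
    for rho in rho_nominal:
        # AR(1) update keeps perturbations correlated between altitude steps.
        p = corr * p + math.sqrt(1 - corr**2) * random.gauss(0.0, 1.0)
        out.append(rho * (1.0 + sigma_frac * p))
    return out

# Nominal exponential atmosphere: 1.2 kg/m^3 at the surface, 8 km scale height.
alts_km = range(0, 100, 10)
nominal = [1.2 * math.exp(-h / 8.0) for h in alts_km]
sample = perturbed_density(nominal)
```

    A guidance study would draw many such profiles and fly the trajectory through each one, which is the Monte Carlo usage the GRAM series supports.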

  7. Link Performance Analysis and monitoring - A unified approach to divergent requirements

    NASA Astrophysics Data System (ADS)

    Thom, G. A.

    Link performance analysis and real-time monitoring are generally covered by a wide range of equipment. Bit error rate testers provide digital link performance measurements but are not useful during real-time data flows. Real-time performance monitors utilize the fixed overhead content but vary widely from format to format. Link quality information is also available from signal reconstruction equipment in the form of receiver AGC, bit synchronizer AGC, and bit synchronizer soft-decision level outputs, but no general approach to utilizing this information exists. This paper presents an approach to link tests, real-time data quality monitoring, and results presentation that utilizes a set of general-purpose modules in a flexible architectural environment. The system operates over a wide range of bit rates (up to 150 Mb/s) and employs several measurement techniques, including P/N code errors or fixed PCM format errors, real-time BER derived from frame sync errors, and data quality analysis derived by counting significant sync status changes. The architecture performs with a minimum of elements in place, permitting a phased update of the user's unit in accordance with his needs.
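
    Deriving a real-time BER from frame sync errors, as mentioned above, works because each frame carries a known sync pattern: mismatches in those known bits, accumulated over many frames, estimate the link error rate without interrupting the data flow. The sketch below shows the arithmetic; the counts are hypothetical.

```python
# Derived-BER sketch: estimate bit error rate from errors observed only in the
# known frame sync pattern, sampled over many frames. Counts are hypothetical.
def derived_ber(sync_bit_errors, sync_bits_per_frame, frames_observed):
    """Estimate BER from mismatches in the fixed sync pattern."""
    bits_checked = sync_bits_per_frame * frames_observed
    return sync_bit_errors / bits_checked

# Hypothetical run: 32-bit sync word, 1,000,000 frames, 16 sync-bit errors.
ber = derived_ber(16, 32, 1_000_000)
```

    Because only the sync bits are checked, the estimate assumes errors are uniformly distributed across the frame; burst errors would bias it.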

  8. Safety and Performance Analysis of the Non-Radar Oceanic/Remote Airspace In-Trail Procedure

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Munoz, Cesar A.

    2007-01-01

    This document presents a safety and performance analysis of the nominal case for the In-Trail Procedure (ITP) in a non-radar oceanic/remote airspace. The analysis estimates the risk of collision between the aircraft performing the ITP and a reference aircraft. The risk of collision is only estimated for the ITP maneuver and it is based on nominal operating conditions. The analysis does not consider human error, communication error conditions, or the normal risk of flight present in current operations. The hazards associated with human error and communication errors are evaluated in an Operational Hazards Analysis presented elsewhere.

  9. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  10. Individual and population pharmacokinetic compartment analysis: a graphic procedure for quantification of predictive performance.

    PubMed

    Eksborg, Staffan

    2013-01-01

    Pharmacokinetic studies are important for optimizing drug dosing but require proper validation of the pharmacokinetic procedures used. However, simple and reliable statistical methods suitable for evaluating the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data, obtained from published scatter plots of observed vs predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm and NONMEM computer program and Bayesian forecasting procedures, were used for estimating the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified by the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
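    The final quantification step, the fraction of predicted-vs-observed concentration ratios falling inside a specified range such as 0.8-1.2, can be sketched as follows (the concentration values are hypothetical):

```python
def predictive_performance(observed, predicted, lo=0.8, hi=1.2):
    """Fraction of predicted/observed concentration ratios falling
    inside an arbitrarily specified acceptance range (here 0.8-1.2)."""
    ratios = [p / o for o, p in zip(observed, predicted)]
    within = sum(lo <= r <= hi for r in ratios)
    return within / len(ratios)

# Hypothetical observed and model-predicted drug concentrations
obs  = [10.0, 8.0, 5.0, 2.0, 1.0]
pred = [ 9.0, 9.9, 4.1, 2.1, 0.5]
frac = predictive_performance(obs, pred)  # 3 of 5 ratios lie in 0.8-1.2
```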

  11. An Analysis of Factors That Affect the Educational Performance of Agricultural Students

    ERIC Educational Resources Information Center

    Greenway, Gina

    2012-01-01

    Many factors contribute to student achievement. This study focuses on three areas: how students learn, how student personality type affects performance, and how course format affects performance outcomes. The analysis sought to improve understanding of the direction and magnitude with which each of these factors impacts student success. Improved…

  12. Embedded Figures Test Performance in the Broader Autism Phenotype: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cribb, Serena J.; Olaithe, Michelle; Di Lorenzo, Renata; Dunlop, Patrick D.; Maybery, Murray T.

    2016-01-01

    People with autism show superior performance to controls on the Embedded Figures Test (EFT). However, studies examining the relationship between autistic-like traits and EFT performance in neurotypical individuals have yielded inconsistent findings. To examine the inconsistency, a meta-analysis was conducted of studies that (a) compared high and…

  13. The Effects of Music on Microsurgical Technique and Performance: A Motion Analysis Study.

    PubMed

    Shakir, Afaaf; Chattopadhyay, Arhana; Paek, Laurence S; McGoldrick, Rory B; Chetta, Matthew D; Hui, Kenneth; Lee, Gordon K

    2017-05-01

    Music is commonly played in operating rooms (ORs) throughout the country. If a preferred genre of music is played, surgeons have been shown to perform surgical tasks more quickly and with greater accuracy. However, there are currently no studies investigating the effects of music on microsurgical technique. Motion analysis technology has recently been validated in the objective assessment of plastic surgery trainees' performance of microanastomoses. Here, we aimed to examine the effects of music on microsurgical skills using motion analysis technology as a primary objective assessment tool. Residents and fellows in the Plastic and Reconstructive Surgery program were recruited to complete a demographic survey and participate in microsurgical tasks. Each participant completed 2 arterial microanastomoses on a chicken foot model, one with music playing, and the other without music playing. Participants were blinded to the study objectives and encouraged to perform their best. The order of music and no music was randomized. Microanastomoses were video recorded using a digitalized S-video system and deidentified. Video segments were analyzed using ProAnalyst motion analysis software for automatic noncontact markerless video tracking of the needle driver tip. Nine residents and 3 plastic surgery fellows were tested. Reported microsurgical experience ranged from 1 to 10 arterial anastomoses performed (n = 2), 11 to 100 anastomoses (n = 9), and 101 to 500 anastomoses (n = 1). Mean age was 33 years (range, 29-36 years), with 11 participants right-handed and 1 ambidextrous. Of the 12 subjects tested, 11 (92%) preferred music in the OR. Composite instrument motion analysis scores significantly improved with playing preferred music during testing versus no music (paired t test, P < 0.001). 
Improvement with music was significant even after stratifying scores by order in which variables were tested (music first vs no music first), postgraduate year, and number of anastomoses (analysis

  14. New ways to analyze word generation performance in brain injury: A systematic review and meta-analysis of additional performance measures.

    PubMed

    Thiele, Kristina; Quinting, Jana Marie; Stenneken, Prisca

    2016-09-01

    The investigation of word generation performance is an accepted, widely used, and well-established method for examining cognitive, language, or communication impairment due to brain damage. The performance measure traditionally applied in the investigation of word generation is the number of correct responses. Previous studies, however, have suggested that this measure does not capture all potentially relevant aspects of word generation performance, and hence of its underlying processes, so that its analytical and explanatory power might be rather limited. Therefore, additional qualitative or quantitative performance measures have been introduced to gain information that goes beyond the deficit and allows for therapeutic implications. We undertook a systematic review and meta-analysis of original research that focused on the application of additional measures of word generation performance in adult clinical populations with acquired brain injury. Word generation tasks are an integral part of many different tests, but only a few studies use measures beyond the number of correct responses in the analysis of word generation performance. Additional measures that showed diagnostic utility greater than or similar to the traditional performance measure concerned clustering and switching, error types, and temporal characteristics. The potential of additional performance measures is not yet fully exploited in patients with brain injury. The temporal measure of response latencies in particular is not adequately represented, though it may be a reliable measure, especially for identifying subtle impairments. Unfortunately, there is as yet no general consensus on which additional measures are best suited to characterizing word generation performance. Further research is needed to specify the additional parameters best qualified for identifying and characterizing impaired word generation performance.
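    The clustering-and-switching measures reviewed above can be reduced, in their simplest form, to counting switches between subcategory-coded responses (Troyer-style coding). A minimal sketch with a hypothetical animal-fluency word list and hand-coded subcategories:

```python
def count_switches(responses, category_of):
    """Count switches between semantic subcategories in a word
    generation sequence, a basic clustering/switching measure."""
    cats = [category_of[w] for w in responses]
    return sum(1 for a, b in zip(cats, cats[1:]) if a != b)

# Hypothetical fluency output with hand-coded subcategories
category_of = {"dog": "pets", "cat": "pets", "lion": "wild",
               "tiger": "wild", "cow": "farm"}
seq = ["dog", "cat", "lion", "tiger", "cow"]
switches = count_switches(seq, category_of)  # pets->wild, wild->farm: 2
```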

  15. Statistical analysis of microgravity experiment performance using the degrees of success scale

    NASA Technical Reports Server (NTRS)

    Upshaw, Bernadette; Liou, Ying-Hsin Andrew; Morilak, Daniel P.

    1994-01-01

    This paper describes an approach to identify factors that significantly influence microgravity experiment performance. Investigators developed the 'degrees of success' scale to provide a numerical representation of success. A degree of success was assigned to 293 microgravity experiments. Experiment information including the degree of success rankings and factors for analysis was compiled into a database. Through an analysis of variance, nine significant factors in microgravity experiment performance were identified. The frequencies of these factors are presented along with the average degree of success at each level. A preliminary discussion of the relationship between the significant factors and the degree of success is presented.
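    The analysis-of-variance step can be illustrated with a minimal one-way ANOVA F statistic computed over hypothetical degree-of-success scores grouped by one candidate factor (a sketch of the technique, not the study's actual model):

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic: ratio of between-group to
    within-group mean squares for lists of scores per factor level."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical degree-of-success scores at two levels of a factor
F = one_way_anova_F([[4, 5, 6], [1, 2, 3]])
```

    A large F relative to the critical value for (k-1, n-k) degrees of freedom flags the factor as significant, which is how the nine significant factors would be identified.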

  16. Availability Performance Analysis of Thermal Power Plants

    NASA Astrophysics Data System (ADS)

    Bhangu, Navneet Singh; Singh, Rupinder; Pahuja, G. L.

    2018-03-01

    This case study presents an availability evaluation method for thermal power plants and applies it to performance analysis in the Indian environment. A generic availability model has been proposed for a maintained system (thermal plants) using reliability block diagrams and fault tree analysis. The availability indices have been evaluated under a realistic working environment using the inclusion-exclusion principle. A four-year failure database has been used to compute availability for different combinations of plant capacity, that is, full working state, reduced capacity, or failure state. Availability is found to be very low even at full rated capacity (440 MW), which is not acceptable, especially in the prevailing energy scenario. One probable reason for this may be differences in the age/health of the existing thermal power plants, which require attention on a unit-by-unit, case-by-case basis. The maintenance techniques being used are conventional (50 years old) and improper in the context of modern equipment, which further aggravates the problem of low availability. This study highlights a procedure for finding critical plants/units/subsystems and helps in deciding a preventive maintenance program.
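    The inclusion-exclusion evaluation of availability over a reliability block diagram can be sketched as follows; the plant layout (one boiler in series with two redundant feed pumps) and the availability figures are hypothetical:

```python
from itertools import combinations

def series(avails):
    """A series block is up only if every unit is up."""
    p = 1.0
    for a in avails:
        p *= a
    return p

def parallel(avails):
    """Availability of redundant units via the inclusion-exclusion
    principle: P(any up) = sum P(Ai) - sum P(Ai and Aj) + ..."""
    total = 0.0
    for k in range(1, len(avails) + 1):
        sign = (-1) ** (k + 1)
        for combo in combinations(avails, k):
            total += sign * series(combo)
    return total

# Hypothetical plant: boiler (0.95) in series with two 0.90 feed pumps
A_plant = series([0.95, parallel([0.90, 0.90])])
```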

  17. Core-Shell Columns in High-Performance Liquid Chromatography: Food Analysis Applications

    PubMed Central

    Preti, Raffaella

    2016-01-01

    The increased separation efficiency provided by the new technology of columns packed with core-shell particles in high-performance liquid chromatography (HPLC) has resulted in their widespread diffusion in several analytical fields, from pharmaceutical and biological to environmental and toxicological. The present paper presents their most recent applications in food analysis. Their use has proved to be particularly advantageous for the determination of compounds at trace levels or when a large number of samples must be analyzed quickly using reliable and solvent-saving apparatus. The literature described here shows how the outstanding performance provided by core-shell particle columns on traditional HPLC instruments is comparable to that obtained with costly UHPLC instrumentation, making this novel column type a promising key tool in food analysis. PMID:27143972

  18. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance

    PubMed Central

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew

    2018-01-01

    This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4. PMID:29434562

  20. How to Perform a Systematic Review and Meta-analysis of Diagnostic Imaging Studies.

    PubMed

    Cronin, Paul; Kelly, Aine Marie; Altaee, Duaa; Foerster, Bradley; Petrou, Myria; Dwamena, Ben A

    2018-05-01

    A systematic review is a comprehensive search, critical evaluation, and synthesis of all the relevant studies on a specific (clinical) topic that can be applied to the evaluation of diagnostic and screening imaging studies. It can be a qualitative or a quantitative (meta-analysis) review of available literature. A meta-analysis uses statistical methods to combine and summarize the results of several studies. In this review, a 12-step approach to performing a systematic review (and meta-analysis) is outlined under the four domains: (1) Problem Formulation and Data Acquisition, (2) Quality Appraisal of Eligible Studies, (3) Statistical Analysis of Quantitative Data, and (4) Clinical Interpretation of the Evidence. This review is specifically geared toward the performance of a systematic review and meta-analysis of diagnostic test accuracy (imaging) studies. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  1. Response Modeling of Lightweight Charring Ablators and Thermal Radiation Testing Results

    NASA Technical Reports Server (NTRS)

    Congdon, William M.; Curry, Donald M.; Rarick, Douglas A.; Collins, Timothy J.

    2003-01-01

    Under NASA's In-Space Propulsion/Aerocapture Program, ARA conducted arc-jet and thermal-radiation ablation test series in 2003 for advanced development, characterization, and response modeling of SRAM-20, SRAM-17, SRAM-14, and PhenCarb-20 ablators. Testing was focused on the future Titan Explorer mission. Convective heating rates (CW) were as high as 153 W/sq cm in the IHF, and radiation rates were 100 W/sq cm in the Solar Tower Facility. The ablators showed good performance in the radiation environment without spallation, which was initially a concern, but they also showed higher in-depth temperatures than analytical predictions based on arc-jet thermal-ablation response models. More testing in 2003 is planned in both of these facilities to generate a sufficient database for Titan TPS engineering.

  2. Performance analysis of an integrated GPS/inertial attitude determination system. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Sullivan, Wendy I.

    1994-01-01

    The performance of an integrated GPS/inertial attitude determination system is investigated using a linear covariance analysis. The principles of GPS interferometry are reviewed, and the major error sources of both interferometers and gyroscopes are discussed and modeled. A new figure of merit, attitude dilution of precision (ADOP), is defined for two possible GPS attitude determination methods, namely single difference and double difference interferometry. Based on this figure of merit, a satellite selection scheme is proposed. The performance of the integrated GPS/inertial attitude determination system is determined using a linear covariance analysis. Based on this analysis, it is concluded that the baseline errors (i.e., knowledge of the GPS interferometer baseline relative to the vehicle coordinate system) are the limiting factor in system performance. By reducing baseline errors, it should be possible to use lower quality gyroscopes without significantly reducing performance. For the cases considered, single difference interferometry is only marginally better than double difference interferometry. Finally, the performance of the system is found to be relatively insensitive to the satellite selection technique.

  3. A meta-analysis of math performance in Turner syndrome.

    PubMed

    Baker, Joseph M; Reiss, Allan L

    2016-02-01

    Studies investigating the relationship between Turner syndrome and math learning disability have used a wide variation of tasks designed to test various aspects of mathematical competencies. Although these studies have revealed much about the math deficits common to Turner syndrome, their diversity makes comparisons between individual studies difficult. As a result, the consistency of outcomes among these diverse measures remains unknown. The overarching aim of this review is to provide a systematic meta-analysis of the differences in math and number performance between females with Turner syndrome and age-matched neurotypical peers. We provide a meta-analysis of behavioral performance in Turner syndrome relative to age-matched neurotypical populations on assessments of math and number aptitude. In total, 112 comparisons collected across 17 studies were included. Although 54% of all statistical comparisons in our analyses failed to reject the null hypothesis, our results indicate that meaningful group differences exist on all comparisons except those that do not require explicit calculation. Taken together, these results help elucidate our current understanding of math and number weaknesses in Turner syndrome, while highlighting specific topics that require further investigation. © 2015 Mac Keith Press.

  4. Performance analysis of medical video streaming over mobile WiMAX.

    PubMed

    Alinejad, Ali; Philip, N; Istepanian, R H

    2010-01-01

    Wireless medical ultrasound streaming is considered one of the emerging applications within the broadband mobile healthcare domain. These applications are bandwidth-demanding services that require high data rates with acceptable diagnostic quality of the transmitted medical images. In this paper, we present the performance analysis of medical ultrasound video streaming acquired via a special robotic ultrasonography system over an emulated WiMAX wireless network. The experimental set-up of this application is described together with the performance of the relevant medical quality of service (m-QoS) metrics.

  5. A Preliminary Analysis of LANDSAT-4 Thematic Mapper Radiometric Performance

    NASA Technical Reports Server (NTRS)

    Justice, C.; Fusco, L.; Mehl, W.

    1984-01-01

    Analysis was performed to characterize the radiometry of three Thematic Mapper (TM) digital products of a scene of Arkansas. The three digital products examined were the NASA raw (BT) product, the radiometrically corrected (AT) product, and the radiometrically and geometrically corrected (PT) product. The frequency distribution of the digital data, the statistical correlation between the bands, and the variability between the detectors within a band were examined on a series of image subsets from the full scene. The results are presented from one 1024 x 1024 pixel subset of Reelfoot Lake, Tennessee, which displayed a representative range of ground conditions and cover types occurring within the full-frame image. Bands 1, 2 and 5 of the sample area are presented. The subsets were extracted from the three digital data products to cover the same geographic area. This analysis provides the first step towards a full appraisal of the TM radiometry being performed as part of the ESA/CEC contribution to the NASA/LIDQA program.
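    The per-band frequency distributions and inter-band correlations examined above can be sketched on a synthetic image subset (illustrative only; real TM digital products would be read from file, and the band data here are randomly generated):

```python
import numpy as np

def band_statistics(bands):
    """Per-band frequency distributions (histograms of 8-bit digital
    numbers) and the inter-band correlation matrix for co-registered
    arrays of equal shape."""
    flat = np.stack([b.ravel() for b in bands])
    hists = [np.bincount(b.ravel(), minlength=256) for b in bands]
    return hists, np.corrcoef(flat)

# Synthetic stand-in for a subset: band 2 is band 1 plus small noise
rng = np.random.default_rng(0)
b1 = rng.integers(0, 256, (64, 64), dtype=np.uint8)
b2 = np.clip(b1 + rng.integers(-5, 6, (64, 64)), 0, 255).astype(np.uint8)
hists, corr = band_statistics([b1, b2])
```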

  6. Sleep-deprivation effect on human performance: a meta-analysis approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candice D. Griffith; Candice D. Griffith; Sankaran Mahadevan

    Human fatigue is hard to define since there is no direct measure of fatigue, much like stress. Instead, fatigue must be inferred from measures that are affected by fatigue. One such measurable output affected by fatigue is reaction time. In this study the relationship of reaction time to sleep deprivation is studied. These variables were selected because reaction time and hours of sleep deprivation are straightforward characteristics of fatigue with which to begin the investigation of fatigue effects on performance. Meta-analysis, a widely used procedure in medical and psychological studies, is applied to the variety of fatigue literature collected from various fields in this study. Meta-analysis establishes a procedure for coding and analyzing information from various studies to compute an effect size. In this research the effect size reported is the difference between standardized means, and is found to be -0.6341, implying a strong relationship between sleep deprivation and performance degradation.
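    The effect size used here, a difference between standardized means, is conventionally computed as Cohen's d: the mean difference divided by the pooled standard deviation. A sketch with hypothetical reaction-time summary statistics (not the study's data):

```python
import math

def standardized_mean_difference(m1, s1, n1, m2, s2, n2):
    """Cohen's d: difference between group means divided by the
    pooled standard deviation, a standard meta-analytic effect size."""
    pooled_var = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# Hypothetical reaction-time means (ms): sleep-deprived vs rested
d = standardized_mean_difference(310.0, 40.0, 30, 285.0, 38.0, 30)
```

    Per-study d values are then weighted and averaged across the coded literature to give the pooled effect reported above.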

  7. Receiver operating characteristic analysis of age-related changes in lineup performance.

    PubMed

    Humphries, Joyce E; Flowe, Heather D

    2015-04-01

    In the basic face memory literature, support has been found for the late maturation hypothesis, which holds that face recognition ability is not fully developed until at least adolescence. Support for the late maturation hypothesis in the criminal lineup identification literature, however, has been equivocal because of the analytic approach that has been used to examine age-related changes in identification performance. Recently, receiver operator characteristic (ROC) analysis was applied for the first time in the adult eyewitness memory literature to examine whether memory sensitivity differs across different types of lineup tests. ROC analysis allows for the separation of memory sensitivity from response bias in the analysis of recognition data. Here, we have made the first ROC-based comparison of adults' and children's (5- and 6-year-olds and 9- and 10-year-olds) memory performance on lineups by reanalyzing data from Humphries, Holliday, and Flowe (2012). In line with the late maturation hypothesis, memory sensitivity was significantly greater for adults compared with young children. Memory sensitivity for older children was similar to that for adults. The results indicate that the late maturation hypothesis can be generalized to account for age-related performance differences on an eyewitness memory task. The implications for developmental eyewitness memory research are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.
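    The ROC comparison described above reduces to computing the area under an empirical (false-alarm rate, hit rate) curve for each age group, which indexes memory sensitivity independently of response bias. A minimal trapezoidal sketch with hypothetical operating points from confidence binning:

```python
def roc_auc(points):
    """Trapezoidal area under an empirical ROC curve given
    (false-alarm rate, hit rate) operating points."""
    pts = sorted(points + [(0.0, 0.0), (1.0, 1.0)])
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

# Hypothetical cumulative operating points per age group
adults   = roc_auc([(0.1, 0.6), (0.2, 0.8), (0.4, 0.9)])
children = roc_auc([(0.1, 0.3), (0.3, 0.5), (0.5, 0.7)])
```

    A larger area for adults than for young children would correspond to the greater memory sensitivity reported above.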

  8. Meta-analysis of the relationship between TQM and Business Performance

    NASA Astrophysics Data System (ADS)

    F, Ahmad M.; N, Zakuan; A, Jusoh; Z, Tasir; J, Takala

    2013-06-01

    A meta-analysis has been conducted based on 20 previous works covering 4,040 firms in 16 countries across Asia, Europe and America. Through this meta-analysis, the paper reviews the relationships between TQM and business performance across the regions. The meta-analysis concludes that the average rc is 0.47: Asia (rc=0.54), America (rc=0.43) and Europe (rc=0.38). The analysis also shows that developed countries in Asia exhibit the greatest impact of TQM (rc=0.56). However, ANOVA and t-tests show that there is no significant difference among country types (developed and developing) or regions at p=0.05. In addition, the average rc2 is 0.24: Asia (rc2=0.33), America (rc2=0.22) and Europe (rc2=0.15). Meanwhile, rc2 in developing countries (rc2=0.28) is higher than in developed countries (rc2=0.21).
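    Pooled correlations of this kind are typically sample-size-weighted means across studies (Hunter-Schmidt-style pooling), with rc2 reported as the variance explained. A sketch with hypothetical (correlation, firm count) pairs, not the paper's actual study list:

```python
def weighted_mean_r(studies):
    """Sample-size-weighted mean correlation across studies and the
    corresponding variance explained (r squared)."""
    total_n = sum(n for _, n in studies)
    r_bar = sum(r * n for r, n in studies) / total_n
    return r_bar, r_bar ** 2

# Hypothetical (correlation, firm count) pairs from three regions
rc, rc2 = weighted_mean_r([(0.54, 200), (0.43, 150), (0.38, 150)])
```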

  9. Advanced multiphysics coupling for LWR fuel performance analysis

    DOE PAGES

    Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; ...

    2015-10-01

    Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code. However, it is
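    The Picard iterative algorithm mentioned above can be sketched, in scalar toy form, as a fixed-point loop that alternates the two single-physics solves until the exchanged field stops changing. The feedback functions below are hypothetical stand-ins, not the DeCART/BISON models:

```python
def picard_couple(solve_neutronics, solve_thermal, T0, tol=1e-8, max_iter=50):
    """Fixed-point (Picard) iteration between two single-physics
    solvers: power depends on temperature, temperature on power."""
    T = T0
    for _ in range(max_iter):
        q = solve_neutronics(T)   # map temperature field -> power
        T_new = solve_thermal(q)  # map power -> temperature field
        if abs(T_new - T) < tol:
            return T_new, q
        T = T_new
    raise RuntimeError("Picard iteration did not converge")

# Toy scalar feedback: power falls with temperature, temperature rises with power
T, q = picard_couple(lambda T: 100.0 - 0.05 * T,
                     lambda q: 300.0 + 2.0 * q, T0=300.0)
```

    Convergence here relies on the combined feedback being a contraction; in practice relaxation is often added when the coupling is stiff.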

  10. Mars Transportation Environment Definition Document

    NASA Technical Reports Server (NTRS)

    Alexander, M. (Editor)

    2001-01-01

    This document provides a compilation of environments knowledge about the planet Mars. Information is divided into three categories: (1) interplanetary space environments (environments required by the technical community to travel to and from Mars); (2) atmospheric environments (environments needed to aerocapture, aerobrake, or use aeroassist for precision trajectories down to the surface); and (3) surface environments (environments needed to have robots or explorers survive and work on the surface).

  11. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    NASA Astrophysics Data System (ADS)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the main stages of the study developed the hierarchical structure of the fuel system, a block scheme for synthesizing the most efficient fuel system alternative using mathematical models, and a set of performance criteria. Results from introducing specific engineering solutions to develop in-house energy supply sources at RH processing facilities are also provided.

  12. Results of a 24-inch Hybrid Motor Performance Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Sims, Joseph D.; Coleman, Hugh W.

    1998-01-01

    The subscale (11- and 24-inch) hybrid motors at the Marshall Space Flight Center (MSFC) have been used as versatile and cost-effective testbeds for developing new technology. Comparisons between motor configurations, ignition systems, feed systems, fuel formulations, and nozzle materials have been carried out without detailed consideration as to how "good" the motor performance data were. For the 250,000-lb thrust motor developed by the Hybrid Propulsion Demonstration Program consortium, this shortcoming is particularly risky because motor performance will likely be used as part of a set of downselect criteria to choose between competing ignition and feed systems under development. This analysis directly addresses that shortcoming by applying uncertainty analysis techniques to the experimental determination of the characteristic velocity, theoretical characteristic velocity, and characteristic velocity efficiency for a 24-inch motor firing. With the adoption of fuel-lined headends, flow restriction, and aft mixing chambers, state-of-the-art 24-inch hybrid motors have become very efficient. However, impossibly high combustion efficiencies (some computed as high as 108%) have been measured in some tests with 11-inch motors. This analysis has given new insight into explaining how these efficiencies were measured to be so high, and into which experimental measurements contribute the most to the overall uncertainty.
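    The uncertainty analysis of an experimentally determined characteristic velocity, c* = Pc·At/mdot, can be sketched with standard root-sum-square propagation of the measurement uncertainties. The measurement values and uncertainties below are hypothetical, not the test-stand data:

```python
import math

def cstar_uncertainty(Pc, uPc, At, uAt, mdot, umdot):
    """Characteristic velocity c* = Pc*At/mdot and its uncertainty by
    root-sum-square propagation (valid for a pure product/quotient,
    where relative uncertainties add in quadrature)."""
    cstar = Pc * At / mdot
    rel = math.sqrt((uPc / Pc) ** 2 + (uAt / At) ** 2 + (umdot / mdot) ** 2)
    return cstar, cstar * rel

# Hypothetical chamber pressure (Pa), throat area (m^2), mass flow (kg/s)
cstar, u = cstar_uncertainty(3.0e6, 3.0e4, 0.01, 1.0e-4, 18.0, 0.36)
```

    Comparing the relative contributions of each term identifies which measurement dominates the overall uncertainty, the question the analysis above addresses.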

  13. Path Analysis Tests of Theoretical Models of Children's Memory Performance

    ERIC Educational Resources Information Center

    DeMarie, Darlene; Miller, Patricia H.; Ferron, John; Cunningham, Walter R.

    2004-01-01

    Path analysis was used to test theoretical models of relations among variables known to predict differences in children's memory--strategies, capacity, and metamemory. Children in kindergarten to fourth grade (chronological ages 5 to 11) performed different memory tasks. Several strategies (i.e., sorting, clustering, rehearsal, and self-testing)…

  14. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  15. Space Shuttle Main Engine performance analysis

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both
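    The "gains model" described above is, in essence, a linear-in-coefficients regression of a performance parameter on six influences with full quadratic terms. A minimal sketch under that reading (synthetic data and hypothetical variable names; plain least squares stands in for the BFGS-based fitting procedure the abstract mentions):

    ```python
    # Hypothetical "gains model"-style fit: approximate an engine performance
    # parameter as a full quadratic function of six normalized influences
    # (power level, mixture ratio, fuel/oxidizer inlet pressures and temperatures),
    # solving for coefficients by linear least squares.
    import numpy as np

    rng = np.random.default_rng(0)
    n_obs, n_inf = 200, 6
    X = rng.uniform(-1.0, 1.0, size=(n_obs, n_inf))  # normalized influences

    def quadratic_design(X):
        """Design matrix: intercept, linear terms, squares, and cross terms."""
        cols = [np.ones(len(X))]
        cols += [X[:, i] for i in range(X.shape[1])]
        for i in range(X.shape[1]):
            for j in range(i, X.shape[1]):
                cols.append(X[:, i] * X[:, j])
        return np.column_stack(cols)

    A = quadratic_design(X)
    true_coefs = rng.normal(size=A.shape[1])
    y = A @ true_coefs + 0.01 * rng.normal(size=n_obs)  # noisy "test data"

    coefs, *_ = np.linalg.lstsq(A, y, rcond=None)  # regression coefficients
    resid = y - A @ coefs
    print(resid.std())  # deviation statistic, as in the abstract's final step
    ```

    The residual statistics computed at the end play the role of the "statistical information relative to data deviation from regression derived relations" that the abstract describes.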

  16. Performance Analysis of Scientific and Engineering Applications Using MPInside and TAU

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Mehrotra, Piyush; Taylor, Kenichi Jun Haeng; Shende, Sameer Suresh; Biswas, Rupak

    2010-01-01

    In this paper, we present performance analysis of two NASA applications using performance tools like Tuning and Analysis Utilities (TAU) and SGI MPInside. MITgcmUV and OVERFLOW are two production-quality applications used extensively by scientists and engineers at NASA. MITgcmUV is a global ocean simulation model, developed by the Estimating the Circulation and Climate of the Ocean (ECCO) Consortium, for solving the fluid equations of motion using the hydrostatic approximation. OVERFLOW is a general-purpose Navier-Stokes solver for computational fluid dynamics (CFD) problems. Using these tools, we analyze the MPI functions (MPI_Sendrecv, MPI_Bcast, MPI_Reduce, MPI_Allreduce, MPI_Barrier, etc.) with respect to message size of each rank, time consumed by each function, and how ranks communicate. MPI communication is further analyzed by studying the performance of MPI functions used in these two applications as a function of message size and number of cores. Finally, we present the compute time, communication time, and I/O time as a function of the number of cores.

  17. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  18. Clinical laboratory as an economic model for business performance analysis

    PubMed Central

    Buljanović, Vikica; Patajac, Hrvoje; Petrovečki, Mladen

    2011-01-01

    Aim To perform SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. Methods The impact of possible threats to and weaknesses of the business performance of the Clinical Laboratory at Našice General County Hospital, and the use of strengths and opportunities to improve operating profit, were simulated using models created on the basis of SWOT analysis results. The operating profit as a measure of profitability of the clinical laboratory was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters in the profit and loss account for 2008 were determined using opportunities and potential threats, and economic sensitivity analysis was made by using changes in the key parameters. The profit and loss account and economic sensitivity analysis were tools for quantifying the impact of changes in the revenues and expenses on the business operations of the clinical laboratory. Results Results of simulation models showed that operational profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Also, operational gain could be increased to €535 804 if laboratory strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operational profit would decrease by €384 465. Conclusion The operational profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same. The operational profit could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by

  19. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    PubMed Central

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  20. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments.

    PubMed

    Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein

    2013-03-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.

  1. Development of Solid State Thermal Sensors for Aeroshell TPS Flight Applications

    NASA Technical Reports Server (NTRS)

    Martinez, Ed; Oishi, Tomo; Gorbonov, Sergey

    2005-01-01

    In-situ Thermal Protection System (TPS) sensors are required to provide verification by traceability of TPS performance and sizing tools. Traceability will lead to higher fidelity design tools, which in turn will lead to lower design safety margins and decreased heatshield mass. Decreasing TPS mass will enable certain missions that are not otherwise feasible, and directly increase science payload. NASA Ames is currently developing two flight measurements essential to advancing the state of TPS traceability for material modeling and aerothermal simulation: heat flux and surface recession (for ablators). The heat flux gage is applicable to both ablators and non-ablators and is therefore the more generalized sensor concept of the two, with wider applicability to mission scenarios. This paper describes the continuing development of a thermal microsensor capable of surface and in-depth temperature and heat flux measurements for TPS materials appropriate to Titan, Neptune, and Mars aerocapture, and direct entry. The thermal sensor is a monolithic solid state device composed of a thick film platinum RTD on an alumina substrate. Choice of materials and critical dimensions are used to tailor gage response, determined during calibration activities, to specific (forebody vs. aftbody) heating environments. The current design has a maximum operating temperature of 1500 K, an allowable constant heat flux of q = 28.7 W/cm^2, and time constants between 0.05 and 0.2 seconds. The catalytic and radiative response of these heat flux gages can also be changed through the use of appropriate coatings. By using several co-located gages with various surface coatings, data can be obtained to isolate surface heat flux components due to radiation, catalycity and convection. Selectivity to radiative heat flux is a useful feature even for an in-depth gage, as radiative transport may be a significant heat transport mechanism for porous TPS materials in Titan aerocapture.

  2. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
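    The mean-based, second-moment baseline that the abstract contrasts against can be sketched in a few lines: linearize the response at the input means to estimate its first two statistical moments, then check against Monte Carlo. The response function and input statistics below are invented stand-ins for an expensive, implicitly defined finite element response; the AMV distribution correction itself (re-evaluating the response near most-probable points) is not reproduced here.

    ```python
    # Mean-based, second-moment approximation of an implicit performance function:
    # first-order Taylor expansion at the input means gives approximate mean and
    # standard deviation of the response, compared against a Monte Carlo reference.
    import numpy as np

    def response(x):                       # stand-in "performance function"
        return x[0]**2 + 3.0 * x[1]

    mu = np.array([2.0, 1.0])              # input means
    sigma = np.array([0.1, 0.2])           # input std devs (independent inputs)

    # Finite-difference gradient at the mean point
    eps = 1e-6
    grad = np.array([(response(mu + eps*e) - response(mu - eps*e)) / (2*eps)
                     for e in np.eye(2)])
    mean_fo = response(mu)                           # first-order mean
    std_fo = np.sqrt(np.sum((grad * sigma)**2))      # first-order std dev

    # Monte Carlo reference (feasible here only because response() is cheap)
    rng = np.random.default_rng(1)
    samples = rng.normal(mu, sigma, size=(100_000, 2)).T
    z = response(samples)
    print(mean_fo, std_fo, z.mean(), z.std())
    ```

    For this mildly nonlinear example the two-moment estimate is already close to the sampled values; the AMV method's contribution, per the abstract, is recovering the full distribution shape where such linearization fails.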

  3. Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a Particle Model

    DTIC Science & Technology

    2012-03-01

    Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a Particle Model (AFIT/GCS/ENG/12-01; distribution is unlimited). ...challenging as the complexity of actual implementation specifics are considered. Two components common to most quantum key distribution...

  4. The UTRC wind energy conversion system performance analysis for horizontal axis wind turbines (WECSPER)

    NASA Technical Reports Server (NTRS)

    Egolf, T. A.; Landgrebe, A. J.

    1981-01-01

    The theory for the UTRC Energy Conversion System Performance Analysis (WECSPER) for the prediction of horizontal axis wind turbine performance is presented. Major features of the analysis are the ability to: (1) treat the wind turbine blades as lifting lines with a prescribed wake model; (2) solve for the wake-induced inflow and blade circulation using real nonlinear airfoil data; and (3) iterate internally to obtain a compatible wake transport velocity and blade loading solution. This analysis also provides an approximate treatment of wake distortions due to tower shadow or wind shear profiles. Finally, selected results of internal UTRC application of the analysis to existing wind turbines and correlation with limited test data are described.

  5. Analysis of Workplace Health Education Performed by Occupational Health Managers in Korea.

    PubMed

    Kim, Yeon-Ha; Jung, Moon-Hee

    2016-09-01

    To evaluate workplace health education as practiced by occupational health managers based on standardized job tasks and suggest priority tasks and areas to be trained. The study was conducted between November 10, 2013 and April 30, 2014. The tool used in this study was the standardized job tasks of workplace health education for occupational health managers, which was developed through methodological steps. It was evaluated by 233 worksite occupational health managers. Data were analyzed using SPSS 21.0. Predictor variables of workplace health education performance were the "analysis and planning" factor, type of enterprise, and form of management. Healthcare professionals and occupational health managers who managed the nonmanufacturing industry showed a high importance and low performance level for the "analysis and planning" factor. "Analysis and planning" skill is a priority training area for healthcare professionals and occupational health managers who manage the nonmanufacturing industry. It is necessary to develop a training curriculum for occupational health managers that includes improving analysis of worksites and plans for a health education program. Copyright © 2016. Published by Elsevier B.V.

  6. Performance analysis of a Principal Component Analysis ensemble classifier for Emotiv headset P300 spellers.

    PubMed

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2014-01-01

    The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method.
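    A hedged sketch of the general approach (not the authors' implementation): PCA for feature reduction on synthetic "P300-like" epochs, followed by a small ensemble of nearest-centroid classifiers combined by majority vote. All dimensions, data, and the choice of base classifier are made up for illustration.

    ```python
    # PCA feature reduction + a bootstrap ensemble of nearest-centroid
    # classifiers with majority voting, on synthetic two-class epoch data.
    import numpy as np

    rng = np.random.default_rng(42)
    n_per_class, n_feat, n_comp = 200, 64, 8

    # Synthetic epochs: the "target" class carries a small added deflection
    nontarget = rng.normal(0, 1, (n_per_class, n_feat))
    target = rng.normal(0, 1, (n_per_class, n_feat)) + np.linspace(0, 1.5, n_feat)
    X = np.vstack([nontarget, target])
    y = np.array([0] * n_per_class + [1] * n_per_class)

    # PCA via SVD of the mean-centered data matrix
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:n_comp].T                     # projected features

    def fit_centroids(Z, y):
        return np.array([Z[y == c].mean(axis=0) for c in (0, 1)])

    def predict(centroids, Z):
        d = np.linalg.norm(Z[:, None, :] - centroids[None, :, :], axis=2)
        return d.argmin(axis=1)

    # Ensemble: members trained on bootstrap resamples, combined by majority vote
    votes = []
    for _ in range(11):
        idx = rng.integers(0, len(Z), len(Z))
        votes.append(predict(fit_centroids(Z[idx], y[idx]), Z))
    pred = (np.mean(votes, axis=0) > 0.5).astype(int)
    accuracy = (pred == y).mean()
    print(accuracy)
    ```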

  7. Lightweight Ablative and Ceramic Thermal Protection System Materials for NASA Exploration Systems Vehicles

    NASA Technical Reports Server (NTRS)

    Valentine, Peter G.; Lawrence, Timothy W.; Gubert, Michael K.; Milos, Frank S.; Kiser, James D.; Ohlhorst, Craig W.; Koenig, John R.

    2006-01-01

    As a collaborative effort among NASA Centers, the "Lightweight Nonmetallic Thermal Protection Materials Technology" Project was set up to assist mission/vehicle design trade studies, to support risk reduction in thermal protection system (TPS) material selections, to facilitate vehicle mass optimization, and to aid development of human-rated TPS qualification and certification plans. Missions performing aerocapture, aerobraking, or direct aeroentry rely on advanced heatshields that allow reductions in spacecraft mass by minimizing propellant requirements. Information will be presented on candidate materials for such reentry approaches and on screening tests conducted (material property and space environmental effects tests) to evaluate viable candidates. Seventeen materials, in three classes (ablatives, tiles, and ceramic matrix composites), were studied. In additional to physical, mechanical, and thermal property tests, high heat flux laser tests and simulated-reentry oxidation tests were performed. Space environmental effects testing, which included exposures to electrons, atomic oxygen, and hypervelocity impacts, was also conducted.

  8. Mixed-Integer Nonconvex Quadratic Optimization Relaxations and Performance Analysis

    DTIC Science & Technology

    2016-10-11

    "Analysis of Interior Point Algorithms for Non-Lipschitz and Nonconvex Minimization" (W. Bian, X. Chen, and Ye), Math Programming, 149 (2015) 301-327. ... (Chen, Ge, Wang, Ye), Math Programming, 143 (1-2) (2014) 371-383. This paper resolved an important open question in cardinality constrained ... "Statistical Performance, and Algorithmic Theory for Local Solutions" (H. Liu, T. Yao, R. Li, Y. Ye), manuscript, 2nd revision in Math Programming

  9. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    ERIC Educational Resources Information Center

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  10. Gender Differences in Performance of Script Analysis by Older Adults

    ERIC Educational Resources Information Center

    Helmes, E.; Bush, J. D.; Pike, D. L.; Drake, D. G.

    2006-01-01

    Script analysis as a test of executive functions is presumed sensitive to cognitive changes seen with increasing age. Two studies evaluated if gender differences exist in performance on scripts for familiar and unfamiliar tasks in groups of cognitively intact older adults. In Study 1, 26 older adults completed male and female stereotypical…

  11. Independent Verification of Mars-GRAM 2010 with Mars Climate Sounder Data

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Burns, Kerry L.

    2014-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission and engineering applications. Applications of Mars-GRAM include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Atmospheric influences on landing site selection and long-term mission conceptualization and development can also be addressed utilizing Mars-GRAM. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte Carlo mode, to perform high-fidelity engineering end-to-end simulations for entry, descent, and landing. Mars-GRAM is an evolving software package resulting in improved accuracy and additional features. Mars-GRAM 2005 has been validated against Radio Science data, and both nadir and limb data from the Thermal Emission Spectrometer (TES). From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). Above 80 km, Mars-GRAM is based on the University of Michigan Mars Thermospheric General Circulation Model (MTGCM). The most recent release of Mars-GRAM 2010 includes an update to Fortran 90/95 and the addition of adjustment factors. These adjustment factors are applied to the input data from the MGCM and the MTGCM for the mapping year 0 user-controlled dust case. The adjustment factors are expressed as a function of height (z), latitude and areocentric solar longitude (Ls).

  12. High Altitude Venus Operational Concept (HAVOC): Proofs of Concept

    NASA Technical Reports Server (NTRS)

    Jones, Christopher A.; Arney, Dale C.; Bassett, George Z.; Clark, James R.; Hennig, Anthony I.; Snyder, Jessica C.

    2015-01-01

    The atmosphere of Venus is an exciting destination for both further scientific study and future human exploration. A recent internal NASA study of a High Altitude Venus Operational Concept (HAVOC) led to the development of an evolutionary program for the exploration of Venus, with focus on the mission architecture and vehicle concept for a 30-day crewed mission into Venus's atmosphere at 50 kilometers. Key technical challenges for the mission include performing the aerocapture maneuvers at Venus and Earth, inserting and inflating the airship at Venus during the entry sequence, and protecting the solar panels and structure from the sulfuric acid in the atmosphere. Two proofs of concept were identified that would aid in addressing some of the key technical challenges. To mitigate the threat posed by the sulfuric acid ambient in the atmosphere of Venus, a material was needed that could protect the systems while being lightweight and not inhibiting the performance of the solar panels. The first proof of concept identified candidate materials and evaluated them, finding FEP-Teflon (Fluorinated Ethylene Propylene-Teflon) to maintain 90 percent transmittance to relevant spectra even after 30 days of immersion in concentrated sulfuric acid. The second proof of concept developed and verified a packaging algorithm for the airship envelope to inform the entry, descent, and inflation analysis.

  13. Performance analysis of a SOFC under direct internal reforming conditions

    NASA Astrophysics Data System (ADS)

    Janardhanan, Vinod M.; Heuveline, Vincent; Deutschmann, Olaf

    This paper presents the performance analysis of a planar solid-oxide fuel cell (SOFC) under direct internal reforming conditions. A detailed solid-oxide fuel cell model is used to study the influences of various operating parameters on cell performance. Significant differences in efficiency and power density are observed for isothermal and adiabatic operational regimes. The influence of air number, specific catalyst area, anode thickness, steam to carbon (s/c) ratio of the inlet fuel, and extent of pre-reforming on cell performance is analyzed. In all cases except that of pre-reformed fuel, adiabatic operation results in lower performance compared to isothermal operation. It is further discussed that, though direct internal reforming may lead to cost reduction and increased efficiency by effective utilization of waste heat, the efficiency of the fuel cell itself is higher for pre-reformed fuel than for non-reformed fuel. Furthermore, criteria for the choice of optimal operating conditions for cell stacks operating under direct internal reforming conditions are discussed.

  14. Long-term performance analysis of CIGS thin-film PV modules

    NASA Astrophysics Data System (ADS)

    Dhere, Neelkanth G.; Kaul, Ashwani; Pethe, Shirish A.

    2011-09-01

    Current accelerated qualification tests of photovoltaic (PV) modules mostly assist in avoiding infant mortality but can neither duplicate changes occurring in the field nor predict useful lifetime. Therefore, outdoor monitoring of field-deployed thin-film PV modules was undertaken at FSEC with the goals of assessing their performance in a hot and humid climate under high system voltage operation and correlating the PV performance with meteorological parameters. Significant and comparable degradation rates of -5.13% and -4.5% per year were found by PVUSA-type regression analysis for the positive and negative strings, respectively, of 40 W glass-to-glass CIGS thin-film PV modules in the hot and humid climate of Florida. Current-voltage measurements showed that the performance degradation within the PV array was mainly due to a few (8-12%) modules having substantially high degradation. The remaining modules within the array continued to show reasonable performance (>96% of the rated power after ~4 years).
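    The PVUSA-type regression mentioned above fits measured power to a standard model form, P = I(a + bI + cW + dT), and evaluates the fit at PVUSA Test Conditions (assumed here as I = 1000 W/m^2, T = 20 °C, W = 1 m/s); tracking that rating over successive periods yields the %/year degradation slope. A sketch on synthetic array data:

    ```python
    # PVUSA-type rating: regress measured power on irradiance I, wind speed W,
    # and ambient temperature T using P = I*(a + b*I + c*W + d*T), then
    # evaluate the fitted model at PVUSA Test Conditions.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 500
    I = rng.uniform(600, 1100, n)          # plane-of-array irradiance, W/m^2
    W = rng.uniform(0, 6, n)               # wind speed, m/s
    T = rng.uniform(10, 35, n)             # ambient temperature, C

    # "True" coefficients used only to generate synthetic measured power (kW)
    a, b, c, d = 0.045, -5e-6, 2e-4, -1.5e-4
    P = I * (a + b*I + c*W + d*T) + rng.normal(0, 0.05, n)

    # Linear least squares on the columns I, I^2, I*W, I*T
    A = np.column_stack([I, I*I, I*W, I*T])
    coef, *_ = np.linalg.lstsq(A, P, rcond=None)

    ptc = np.array([1000.0, 1000.0**2, 1000.0 * 1.0, 1000.0 * 20.0])
    p_rating = ptc @ coef
    print(p_rating)  # PVUSA rating in kW at test conditions
    ```

    Repeating this fit on each month or year of data and regressing the ratings against time gives the per-year degradation rates quoted in the abstract.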

  15. Conceptual Design and Performance Analysis for a Large Civil Compound Helicopter

    NASA Technical Reports Server (NTRS)

    Russell, Carl; Johnson, Wayne

    2012-01-01

    A conceptual design study of a large civil compound helicopter is presented. The objective is to determine how a compound helicopter performs when compared to both a conventional helicopter and a tiltrotor using a design mission that is shorter than optimal for a tiltrotor and longer than optimal for a helicopter. The designs are generated and analyzed using conceptual design software and are further evaluated with a comprehensive rotorcraft analysis code. Multiple metrics are used to determine the suitability of each design for the given mission. Plots of various trade studies and parameter sweeps as well as comprehensive analysis results are presented. The results suggest that the compound helicopter examined for this study would not be competitive with a tiltrotor or conventional helicopter, but multiple possibilities are identified for improving the performance of the compound helicopter in future research.

  16. IPAC-Inlet Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A series of analyses have been developed which permit the calculation of the performance of common inlet designs. The methods presented are useful for determining the inlet weight flows, total pressure recovery, and aerodynamic drag coefficients for given inlet geometric designs. Limited geometric input data is required to use this inlet performance prediction methodology. The analyses presented here may also be used to perform inlet preliminary design studies. The calculated inlet performance parameters may be used in subsequent engine cycle analyses or installed engine performance calculations for existing uninstalled engine data.
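    As an illustration of the kind of recovery estimate such a methodology produces (not IPAC's actual formulation), the widely used MIL-E-5008B ram recovery schedule gives supersonic inlet total pressure recovery as a function of flight Mach number; the subsonic duct recovery applied on top of it below is an assumed value.

    ```python
    # MIL-E-5008B ram recovery schedule: total pressure recovery vs flight Mach
    # number for a supersonic inlet, combined with an assumed fixed duct loss.
    def mil_e_5008b_recovery(mach):
        """Inlet ram recovery: 1.0 subsonic, 1 - 0.075*(M-1)**1.35 supersonic."""
        if mach <= 1.0:
            return 1.0
        return 1.0 - 0.075 * (mach - 1.0) ** 1.35

    duct_recovery = 0.97   # assumed subsonic duct recovery (illustrative)
    for m in (0.8, 1.5, 2.0, 2.5):
        print(m, duct_recovery * mil_e_5008b_recovery(m))
    ```

    A recovery value like this, together with captured weight flow and drag coefficients, is the kind of output an inlet performance code passes downstream to engine cycle analysis.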

  17. FMEA team performance in health care: A qualitative analysis of team member perceptions.

    PubMed

    Wetterneck, Tosha B; Hundt, Ann Schoofs; Carayon, Pascale

    2009-06-01

    Failure mode and effects analysis (FMEA) is a commonly used prospective risk assessment approach in health care. Failure mode and effects analyses are time consuming and resource intensive, and team performance is crucial for FMEA success. We evaluate FMEA team members' perceptions of FMEA team performance to provide recommendations to improve the FMEA process in health care organizations. Structured interviews and survey questionnaires were administered to team members of 2 FMEA teams at a Midwest Hospital to evaluate team member perceptions of FMEA team performance and factors influencing team performance. Interview transcripts underwent content analysis, and descriptive statistics were performed on questionnaire results to identify and quantify FMEA team performance. Theme-based nodes were categorized using the input-process-outcome model for team performance. Twenty-eight interviews and questionnaires were completed by 24 team members. Four persons participated on both teams. There were significant differences between the 2 teams regarding perceptions of team functioning and overall team effectiveness that are explained by difference in team inputs and process (e.g., leadership/facilitation, team objectives, attendance of process owners). Evaluation of team members' perceptions of team functioning produced useful insights that can be used to model future team functioning. Guidelines for FMEA team success are provided.

  18. An importance-performance analysis of hospital information system attributes: A nurses' perspective.

    PubMed

    Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J

    2016-02-01

    Health workers have numerous concerns about hospital IS (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and therefore should be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. CFD analysis of heat transfer performance of graphene based hybrid nanofluid in radiators

    NASA Astrophysics Data System (ADS)

    Bharadwaj, Bharath R.; Sanketh Mogeraya, K.; Manjunath, D. M.; Rao Ponangi, Babu; Rajendra Prasad, K. S.; Krishna, V.

    2018-04-01

    Cooling systems are among the critical systems that need attention for improved performance of an automobile engine: with increased capacity to carry away large amounts of waste heat, engine performance is increased. Current research on nanofluids suggests that they offer higher heat transfer rates than conventional coolants. Hence this project investigates the use of hybrid nanofluids in radiators to increase their heat transfer performance. Carboxyl graphene and graphene oxide based nanoparticles were selected due to the very high thermal conductivity of graphene. System analysis of the radiator was performed by considering a small part of the whole automobile radiator, modelled using Siemens NX. CFD analysis was conducted using ANSYS FLUENT® for the nanofluid defined, and the increase in effectiveness was compared to that of conventional coolants. Use of such nanofluids for a fixed cooling requirement can lead to significant downsizing of the radiator in the future.
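
    The conductivity gain expected from adding high-conductivity particles can be ballparked with the classical Maxwell effective-medium model, a standard first-order estimate for dilute suspensions. This is a minimal sketch; the property values are illustrative, not taken from the study:

```python
def maxwell_k_eff(k_f, k_p, phi):
    """Maxwell effective thermal conductivity of a dilute particle suspension.

    k_f : base-fluid conductivity (W/m-K)
    k_p : particle conductivity (W/m-K)
    phi : particle volume fraction (model assumes phi is small, ~<5%)
    """
    num = k_p + 2 * k_f + 2 * phi * (k_p - k_f)
    den = k_p + 2 * k_f - phi * (k_p - k_f)
    return k_f * num / den

# Water base fluid with graphene-like particles (order-of-magnitude values)
k_water = 0.6        # W/m-K
k_graphene = 3000.0  # W/m-K, in-plane
for phi in (0.005, 0.01, 0.02):
    print(f"phi={phi:.3f}: k_eff={maxwell_k_eff(k_water, k_graphene, phi):.4f} W/m-K")
```

    Even at sub-percent loadings the model predicts a measurable conductivity increase, which is why graphene-based particles are attractive despite the dilute concentrations used.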

  20. Poor Gait Performance and Prediction of Dementia: Results From a Meta-Analysis.

    PubMed

    Beauchet, Olivier; Annweiler, Cédric; Callisaya, Michele L; De Cock, Anne-Marie; Helbostad, Jorunn L; Kressig, Reto W; Srikanth, Velandai; Steinmetz, Jean-Paul; Blumen, Helena M; Verghese, Joe; Allali, Gilles

    2016-06-01

    Poor gait performance predicts risk of developing dementia, but no structured critical evaluation of this association has yet been conducted. The aim of this meta-analysis was to systematically examine the association of poor gait performance with incidence of dementia. An English and French Medline search was conducted in June 2015, with no limit of date, using the medical subject headings terms "Gait" OR "Gait Disorders, Neurologic" OR "Gait Apraxia" OR "Gait Ataxia" AND "Dementia" OR "Frontotemporal Dementia" OR "Dementia, Multi-Infarct" OR "Dementia, Vascular" OR "Alzheimer Disease" OR "Lewy Body Disease" OR "Frontotemporal Dementia With Motor Neuron Disease" (Supplementary Concept). Poor gait performance was defined by standardized tests of walking, and dementia was diagnosed according to international consensus criteria. Four etiologies of dementia were identified: any dementia, Alzheimer disease (AD), vascular dementia (VaD), and non-AD (ie, pooling VaD, mixed dementias, and other dementias). Fixed effects meta-analyses were performed on the estimates to generate summary values. Of the 796 identified abstracts, 12 (1.5%) were included in this systematic review and meta-analysis. Poor gait performance predicted dementia [pooled hazard ratio (HR), combined with relative risks and odds ratios, = 1.53, P < .001 for any dementia; pooled HR = 1.79, P < .001 for VaD; HR = 1.89, P < .001 for non-AD]. Findings were weaker for predicting AD (HR = 1.03, P = .004). This meta-analysis provides evidence that poor gait performance predicts dementia. This association depends on the type of dementia; poor gait performance is a stronger predictor of non-AD dementias than of AD.
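
    The fixed-effects pooling step described above amounts to inverse-variance weighting of log hazard ratios, with each study's standard error recovered from its confidence interval. A minimal sketch, using hypothetical study-level numbers rather than the paper's data:

```python
import math

def pooled_hr(hrs, cis, z=1.96):
    """Inverse-variance fixed-effect pooling of hazard ratios.

    hrs: study point estimates; cis: (lower, upper) 95% CI for each study.
    Returns (pooled HR, lower 95% bound, upper 95% bound).
    """
    logs = [math.log(h) for h in hrs]
    # SE on the log scale, recovered from the CI width
    ses = [(math.log(u) - math.log(l)) / (2 * z) for l, u in cis]
    ws = [1.0 / se ** 2 for se in ses]
    pooled_log = sum(w * x for w, x in zip(ws, logs)) / sum(ws)
    pooled_se = math.sqrt(1.0 / sum(ws))
    return (math.exp(pooled_log),
            math.exp(pooled_log - z * pooled_se),
            math.exp(pooled_log + z * pooled_se))

# Hypothetical study estimates, for illustration only
hrs = [1.4, 1.8, 1.5]
cis = [(1.1, 1.8), (1.2, 2.7), (1.0, 2.2)]
hr, lo, hi = pooled_hr(hrs, cis)
```

    Precise studies (narrow CIs) dominate the weighted average, which is why the pooled estimate sits closest to the tightest study.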

  1. Mind-wandering, cognition, and performance: a theory-driven meta-analysis of attention regulation.

    PubMed

    Randall, Jason G; Oswald, Frederick L; Beier, Margaret E

    2014-11-01

    The current meta-analysis accumulates empirical findings on the phenomenon of mind-wandering, integrating and interpreting findings in light of psychological theories of cognitive resource allocation. Cognitive resource theory emphasizes both individual differences in attentional resources and task demands together to predict variance in task performance. This theory motivated our conceptual and meta-analytic framework by introducing moderators indicative of task demands to predict who is more likely to mind-wander under what conditions, and to predict when mind-wandering and task-related thought are more (or less) predictive of task performance. Predictions were tested via a random-effects meta-analysis of correlations obtained from normal adult samples (k = 88) based on measurement of specified episodes of off-task and/or on-task thought frequency and task performance. Results demonstrated that people with fewer cognitive resources tend to engage in more mind-wandering, whereas those with more cognitive resources are more likely to engage in task-related thought. Addressing predictions of resource theory, we found that greater time-on-task, although not greater task complexity, tended to strengthen the negative relation between cognitive resources and mind-wandering. Additionally, increases in mind-wandering were generally associated with decreases in task performance, whereas increases in task-related thought were associated with increased performance. Further supporting resource theory, the negative relation between mind-wandering and performance was more pronounced for more complex tasks, though not longer tasks. Complementarily, the positive association between task-related thought and performance was stronger for more complex tasks and for longer tasks. We conclude by discussing implications and future research directions for mind-wandering as a construct of interest in psychological research.
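
    A random-effects meta-analysis of correlations of the kind described is commonly carried out on Fisher-z-transformed values with a DerSimonian-Laird estimate of between-study variance. A minimal sketch under those standard assumptions, with made-up study values (not the k = 88 sample):

```python
import math

def dl_random_effects_r(rs, ns):
    """DerSimonian-Laird random-effects pooling of correlations via Fisher z.

    rs: per-study correlations; ns: per-study sample sizes.
    Returns (pooled correlation, tau^2 between-study variance estimate).
    """
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]   # Fisher z
    vs = [1.0 / (n - 3) for n in ns]                       # var of Fisher z
    ws = [1.0 / v for v in vs]
    zbar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    q = sum(w * (z - zbar) ** 2 for w, z in zip(ws, zs))   # heterogeneity Q
    c = sum(ws) - sum(w * w for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)               # DL estimator
    ws_re = [1.0 / (v + tau2) for v in vs]                 # re-weight
    z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)
    return math.tanh(z_re), tau2                           # back-transform

# Illustrative mind-wandering/performance correlations and sample sizes
r_pooled, tau2 = dl_random_effects_r([-0.30, -0.15, -0.25], [60, 120, 90])
```

    When observed heterogeneity Q falls below its degrees of freedom, tau² truncates to zero and the random-effects result coincides with the fixed-effects one.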

  2. Performance of Blind Source Separation Algorithms for FMRI Analysis using a Group ICA Method

    PubMed Central

    Correa, Nicolle; Adali, Tülay; Calhoun, Vince D.

    2007-01-01

    Independent component analysis (ICA) is a popular blind source separation (BSS) technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely information maximization, maximization of non-gaussianity, joint diagonalization of cross-cumulant matrices, and second-order correlation based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study the variability among different ICA algorithms and propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA, and JADE all yield reliable results, each having their strengths in specific areas. EVD, an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for the iterative ICA algorithms, it is important to investigate the variability of the estimates from different runs. We test the consistency of the iterative algorithms, Infomax and FastICA, by running each algorithm a number of times with different initializations, and note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis. PMID:17540281
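
    The blind source separation idea can be illustrated on synthetic mixtures with scikit-learn's FastICA, one of the algorithms compared above. This toy example is not the paper's group ICA pipeline; it only shows that independent non-Gaussian sources are recoverable (up to sign and order) from linear mixtures:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
# Two statistically independent synthetic sources
s1 = np.sin(2 * t)                         # smooth oscillation
s2 = np.sign(np.sin(3 * t))                # square wave (non-Gaussian)
S = np.c_[s1, s2] + 0.02 * rng.standard_normal((2000, 2))

A = np.array([[1.0, 0.5],                  # mixing matrix
              [0.5, 1.0]])
X = S @ A.T                                # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)               # estimated sources
```

    Because ICA is blind to scale and ordering, results are evaluated by matching each estimated component to whichever true source it correlates with most strongly, much as the paper matches components to expected activation areas.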

  3. Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance

    NASA Technical Reports Server (NTRS)

    Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.

    2016-01-01

    Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.

  4. Freight performance measures : approach analysis.

    DOT National Transportation Integrated Search

    2010-05-01

    This report reviews the existing state of the art and also the state of the practice of freight performance measurement. Most performance measures at the state level have aimed at evaluating highway or transit infrastructure performance with an empha...

  5. Comparative study on DuPont analysis and DEA models for measuring stock performance using financial ratio

    NASA Astrophysics Data System (ADS)

    Arsad, Roslah; Shaari, Siti Nabilah Mohd; Isa, Zaidi

    2017-11-01

    Determining stock performance using financial ratios is challenging for many investors and researchers. Financial ratios can indicate the strengths and weaknesses of a company's stock performance. There are five categories of financial ratios, namely liquidity, efficiency, leverage, profitability and market ratios, and it is important to interpret each ratio correctly for proper financial decision making. The purpose of this study is to compare the performance of listed companies in Bursa Malaysia using Data Envelopment Analysis (DEA) and DuPont analysis models. The study is conducted on 2015 data for 116 consumer products companies listed in Bursa Malaysia. Data Envelopment Analysis computes efficiency scores and ranks the companies accordingly; the Alirezaee and Afsharian method based on the Charnes, Cooper and Rhodes (CCR) model, which assumes constant returns to scale (CRS), is employed. DuPont analysis is a traditional tool for measuring the operating performance of companies. In this study, DuPont analysis is used to evaluate three different aspects: profitability, efficiency of asset utilization, and financial leverage. Return on Equity (ROE) is also calculated in the DuPont analysis. This study finds that the two models produce different rankings of the selected samples. Hypothesis testing based on Pearson's correlation indicates that there is no correlation between the rankings produced by DEA and DuPont analysis. The DEA ranking model proposed by Alirezaee and Afsharian is unstable; it cannot provide a complete ranking because the Balance Index values are equal or zero.
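
    The three aspects the study evaluates correspond to the classic three-step DuPont identity, which factors ROE into profit margin, asset turnover, and an equity multiplier. A minimal sketch with illustrative figures, not data from the Bursa Malaysia sample:

```python
def dupont_roe(net_income, revenue, total_assets, equity):
    """Three-step DuPont decomposition: ROE = margin * turnover * leverage."""
    margin = net_income / revenue        # profitability
    turnover = revenue / total_assets    # efficiency of asset utilization
    leverage = total_assets / equity     # financial leverage (equity multiplier)
    return margin * turnover * leverage, (margin, turnover, leverage)

# Illustrative figures (thousands of any currency)
roe, (margin, turnover, leverage) = dupont_roe(
    net_income=80, revenue=1000, total_assets=800, equity=400)
```

    The identity is exact: the product of the three factors always equals net income over equity, so the decomposition attributes a given ROE to its profitability, efficiency, and leverage components.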

  6. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    NASA Astrophysics Data System (ADS)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieves better anti-noise performance than the conventional square-wave approach. The anti-noise performance of the m-sequence varies with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance of m-sequences with different coding parameters is required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence is achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is verified by field experiment. The quantitative analysis method proposed in this paper provides new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
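
    An m-sequence is the output of a maximal-length linear-feedback shift register built on a primitive feedback polynomial; its flat spectrum and sharp autocorrelation underlie the anti-noise behavior studied above. A minimal generator sketch (the tap choice is one known maximal-length configuration for n = 5, shown for illustration):

```python
def m_sequence(taps, n):
    """One period (2**n - 1 bits) of a maximal-length LFSR sequence.

    taps: feedback tap positions, 1-indexed from the newest stage,
    chosen from a primitive polynomial; e.g. taps=(5, 3) for n=5.
    """
    state = [1] * n                  # any nonzero seed works
    seq = []
    for _ in range(2 ** n - 1):
        seq.append(state[-1])        # output the oldest stage
        fb = 0
        for t in taps:               # XOR the tapped stages
            fb ^= state[t - 1]
        state = [fb] + state[:-1]    # shift, inserting feedback
    return seq

seq = m_sequence(taps=(5, 3), n=5)   # period 31
```

    Two defining properties make a convenient sanity check: one period contains exactly 2^(n-1) ones, and the periodic autocorrelation of the ±1-mapped sequence equals -1 at every nonzero lag.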

  7. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    PubMed

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

    Background: The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives: To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources: Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods: Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results: A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%, P < .006). Conclusion: In the context of a diverse population of virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.

  8. Idaho National Laboratory Quarterly Performance Analysis - 2nd Quarter FY2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lisbeth A. Mitchell

    2014-06-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of occurrence reports and other deficiency reports (including not reportable events) identified at INL from January 2014 through March 2014.

  9. Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals

    NASA Astrophysics Data System (ADS)

    Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam

    A power analysis attack is a well-known side-channel attack, but its efficiency is frequently degraded by power components unrelated to the encryption that are present in the signals used for the attack. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting the encryption-related parts of the measured power signals. Experimental results show that attacks using the preprocessed signals recover correct keys with far fewer signals than conventional power analysis attacks.
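
    The core of a correlation power analysis attack is ranking key guesses by how well their predicted leakage correlates with measured traces. A toy sketch on simulated Hamming-weight leakage, using a 4-bit S-box (PRESENT's) as a small stand-in for AES's 8-bit S-box; all trace parameters here are illustrative:

```python
import numpy as np

# PRESENT's 4-bit S-box, standing in for AES's 8-bit S-box
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]
HW = [bin(x).count("1") for x in range(16)]   # Hamming weights 0..15

rng = np.random.default_rng(1)
true_key = 0x9
plaintexts = rng.integers(0, 16, size=500)
# Simulated traces: leakage = HW(SBOX[p ^ k]) plus Gaussian noise
traces = np.array([HW[SBOX[p ^ true_key]] for p in plaintexts], dtype=float)
traces += 0.5 * rng.standard_normal(500)

# For each key guess, correlate predicted leakage with the traces;
# the correct key should yield the strongest correlation
corrs = []
for k in range(16):
    hyp = np.array([HW[SBOX[p ^ k]] for p in plaintexts], dtype=float)
    corrs.append(abs(np.corrcoef(hyp, traces)[0, 1]))
recovered = int(np.argmax(corrs))
```

    The preprocessing proposed in the paper attacks the noise term in this model: removing encryption-unrelated components raises the correct key's correlation, so fewer traces are needed to separate it from wrong guesses.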

  10. Enabling High-performance Interactive Geoscience Data Analysis Through Data Placement and Movement Optimization

    NASA Astrophysics Data System (ADS)

    Zhu, F.; Yu, H.; Rilee, M. L.; Kuo, K. S.; Yu, L.; Pan, Y.; Jiang, H.

    2017-12-01

    Since the establishment of data archive centers and the standardization of file formats, scientists have been required to search metadata catalogs for the data they need and download the data files to their local machines to carry out analysis. This approach has facilitated data discovery and access for decades, but it inevitably leads to data transfer from archive centers to scientists' computers through low-bandwidth Internet connections, and data transfer becomes a major performance bottleneck. Combined with generally constrained local compute and storage resources, this limits the extent of scientists' studies and deprives them of timely outcomes. The conventional approach is thus not scalable with respect to either the volume or the variety of geoscience data. A much more viable solution is to couple analysis and storage systems to minimize data transfer. In our study, we compare loosely coupled approaches (exemplified by Spark and Hadoop) and tightly coupled approaches (exemplified by parallel distributed database management systems, e.g., SciDB). In particular, we investigate the optimization of data placement and movement to effectively tackle the variety challenge, and broaden the use of parallelization to address the volume challenge. Our goal is to enable high-performance interactive analysis for a large portion of geoscience data analysis exercises. We show that tightly coupled approaches can concentrate data traffic between local storage systems and compute units, thereby optimizing bandwidth utilization to achieve better throughput. Based on these observations, we develop a geoscience data analysis system that tightly couples analysis engines with storage and has direct access to the detailed map of data partition locations. Through an innovative data partitioning and distribution scheme, our system has demonstrated scalable and interactive performance in real-world geoscience data analysis applications.

  11. Performance analysis of a coherent free space optical communication system based on experiment.

    PubMed

    Cao, Jingtai; Zhao, Xiaohui; Liu, Wei; Gu, Haijun

    2017-06-26

    Based on our previous study and a designed experimental AO system with a 97-element continuous-surface deformable mirror, we conduct a performance analysis of a coherent free space optical communication (FSOC) system in terms of mixing efficiency (ME), bit error rate (BER) and outage probability under different Greenwood frequencies and atmospheric coherence lengths. The results show that the influence of the atmospheric temporal characteristics on performance is slightly stronger than that of the spatial characteristics when the receiving aperture and the number of sub-apertures are given. This analysis provides a reference for the design of coherent FSOC systems.

  12. Seventy-meter antenna performance predictions: GTD analysis compared with traditional ray-tracing methods

    NASA Technical Reports Server (NTRS)

    Schredder, J. M.

    1988-01-01

    A comparative analysis was performed, using both the Geometrical Theory of Diffraction (GTD) and traditional pathlength error analysis techniques, to predict RF antenna gain performance and pointing corrections. The NASA/JPL 70-meter antenna with its shaped surface was analyzed for gravity loading over the range of elevation angles. Also analyzed were the effects of lateral and axial displacements of the subreflector. Significant differences were noted between the predictions of the two methods, in the effect of subreflector displacements and in the optimal subreflector positions for focusing a gravity-deformed main reflector. The results are relevant to future design procedures.

  13. A Shot Number Based Approach to Performance Analysis in Table Tennis

    PubMed Central

    Yoshida, Kazuto; Yamada, Koshi

    2017-01-01

    Abstract The current study proposes a novel approach that improves conventional performance analysis in table tennis by introducing the concept of frequency, or the number of shots, at each shot number. The improvements over the conventional method are as follows: better accuracy in evaluating the skills and tactics of players, additional insights into scoring and returning skills, and ease of understanding the results with a single criterion. A performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed effects of shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has advantages over the conventional method. PMID:28210334

  14. Analysis of latency performance of bluetooth low energy (BLE) networks.

    PubMed

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2014-12-23

    Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE fundamentally changes the design of the discovery mechanism, including the use of three advertising channels. Several recent works have analyzed BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model of the BLE discovery process. In particular, the wide range of parameter settings gives BLE devices considerable latitude to customize their discovery performance. This motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper focuses on building an analytical model to investigate the discovery probability, as well as the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to quantitatively examine to what extent parameters influence the performance of the discovery process.

  15. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed, addressing the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  16. Deep Space Optical Link ARQ Performance Analysis

    NASA Technical Reports Server (NTRS)

    Clare, Loren; Miles, Gregory

    2016-01-01

    Substantial advancements have been made toward the use of optical communications for deep space exploration missions, promising a much higher volume of data to be communicated in comparison with present-day Radio Frequency (RF) based systems. One or more ground-based optical terminals are assumed to communicate with the spacecraft. Both short-term and long-term link outages will arise due to weather at the ground station(s), space platform pointing stability, and other effects. To mitigate these outages, an Automatic Repeat Query (ARQ) retransmission method is assumed, together with a reliable back channel for acknowledgement traffic. Specifically, the Licklider Transmission Protocol (LTP) is used, which is a component of the Disruption-Tolerant Networking (DTN) protocol suite that is well suited for high bandwidth-delay product links subject to disruptions. We provide an analysis of envisioned deep space mission scenarios and quantify buffering, latency and throughput performance, using a simulation in which long-term weather effects are modeled with a Gilbert-Elliott Markov chain, short-term outages occur as a Bernoulli process, and scheduled outages arising from geometric visibility or operational constraints are represented. We find that both short- and long-term effects impact throughput, but long-term weather effects dominate buffer sizing and overflow losses as well as latency performance.
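
    The long-term weather model mentioned, a Gilbert-Elliott two-state Markov chain, alternates between a "good" state (low loss) and a "bad" state (high loss), producing the bursty outages that drive buffer sizing. A minimal simulation sketch; the transition and loss probabilities are illustrative, not the mission values:

```python
import random

def gilbert_elliott(n, p_gb, p_bg, loss_good=0.0, loss_bad=1.0, seed=42):
    """Simulate n time slots of a two-state Gilbert-Elliott link model.

    p_gb: P(good -> bad) per slot; p_bg: P(bad -> good) per slot.
    loss_good / loss_bad: per-slot loss probability in each state.
    Returns a list of booleans (True = slot lost).
    """
    rng = random.Random(seed)
    state_bad = False
    losses = []
    for _ in range(n):
        p_loss = loss_bad if state_bad else loss_good
        losses.append(rng.random() < p_loss)
        if state_bad:
            if rng.random() < p_bg:
                state_bad = False
        elif rng.random() < p_gb:
            state_bad = True
    return losses

# Long-run loss rate ~ stationary P(bad) = p_gb / (p_gb + p_bg) = 0.1 here
losses = gilbert_elliott(200_000, p_gb=0.01, p_bg=0.09)
loss_rate = sum(losses) / len(losses)
```

    Unlike an independent Bernoulli loss process with the same average rate, the chain produces correlated outage bursts of mean length 1/p_bg slots, which is what makes retransmission buffering and latency behave so differently under long-term weather effects.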

  17. Factor-Analysis Methods for Higher-Performance Neural Prostheses

    PubMed Central

    Santhanam, Gopal; Yu, Byron M.; Gilja, Vikash; Ryu, Stephen I.; Afshar, Afsheen; Sahani, Maneesh; Shenoy, Krishna V.

    2009-01-01

    Neural prostheses aim to provide treatment options for individuals with nervous-system disease or injury. It is necessary, however, to increase the performance of such systems before they can be clinically viable for patients with motor dysfunction. One performance limitation is the presence of correlated trial-to-trial variability that can cause neural responses to wax and wane in concert as the subject is, for example, more attentive or more fatigued. If a system does not properly account for this variability, it may mistakenly interpret such variability as an entirely different intention by the subject. We report here the design and characterization of factor-analysis (FA)–based decoding algorithms that can contend with this confound. We characterize the decoders (classifiers) on experimental data where monkeys performed both a real reach task and a prosthetic cursor task while we recorded from 96 electrodes implanted in dorsal premotor cortex. The decoder attempts to infer the underlying factors that comodulate the neurons' responses and can use this information to substantially lower error rates (one of eight reach endpoint predictions) by ≲75% (e.g., ∼20% total prediction error using traditional independent Poisson models reduced to ∼5%). We also examine additional key aspects of these new algorithms: the effect of neural integration window length on performance, an extension of the algorithms to use Poisson statistics, and the effect of training set size on the decoding accuracy of test data. We found that FA-based methods are most effective for integration windows >150 ms, although still advantageous at shorter timescales, that Gaussian-based algorithms performed better than the analogous Poisson-based algorithms and that the FA algorithm is robust even with a limited amount of training data. We propose that FA-based methods are effective in modeling correlated trial-to-trial neural variability and can be used to substantially increase overall
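
    The idea of inferring shared trial-to-trial factors that comodulate many recorded channels can be illustrated with scikit-learn's FactorAnalysis on synthetic data. This is a Gaussian toy stand-in for the paper's decoders, with made-up dimensions echoing the 96-electrode setup:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_trials, n_units, n_factors = 500, 96, 3

# Latent per-trial factors (e.g., attentiveness, fatigue) that
# comodulate all channels, plus independent per-channel noise
Z = rng.standard_normal((n_trials, n_factors))
W = rng.standard_normal((n_factors, n_units))      # loading matrix
X = Z @ W + 0.5 * rng.standard_normal((n_trials, n_units))

fa = FactorAnalysis(n_components=n_factors, random_state=0)
scores = fa.fit_transform(X)      # inferred per-trial factor state
```

    A decoder that conditions on the inferred factor state can discount variability shared across channels instead of misreading it as a change in the subject's intention, which is the confound the paper's FA-based classifiers address.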

  18. HabEx Optical Telescope Concepts: Design and Performance Analysis

    NASA Astrophysics Data System (ADS)

    Stahl, H. Philip; NASA MSFC HabEx Telescope Design Team

    2018-01-01

    The Habitable-Exoplanet Imaging Mission (HabEx) engineering study team has been tasked by NASA with developing a compelling and feasible exoplanet direct imaging concept as part of the 2020 Decadal Survey. This paper summarizes two off-axis, unobscured telescope design concepts: a 4-meter monolithic aperture and a 6-meter segmented aperture. HabEx telescopes are designed for launch vehicle accommodation. Analysis includes prediction of on-orbit dynamic structural and thermal optical performance.

  19. Development and Implementation of a Generic Analysis Template for Structural-Thermal-Optical-Performance Modeling

    NASA Technical Reports Server (NTRS)

    Scola, Salvatore; Stavely, Rebecca; Jackson, Trevor; Boyer, Charlie; Osmundsen, Jim; Turczynski, Craig; Stimson, Chad

    2016-01-01

    Performance-related effects of system level temperature changes can be a key consideration in the design of many types of optical instruments. This is especially true for space-based imagers, which may require complex thermal control systems to maintain alignment of the optical components. Structural-Thermal-Optical-Performance (STOP) analysis is a multi-disciplinary process that can be used to assess the performance of these optical systems when subjected to the expected design environment. This type of analysis can be very time consuming, which makes it difficult to use as a trade study tool early in the project life cycle. In many cases, only one or two iterations can be performed over the course of a project. This limits the design space to best practices since it may be too difficult, or take too long, to test new concepts analytically. In order to overcome this challenge, automation, and a standard procedure for performing these studies is essential. A methodology was developed within the framework of the Comet software tool that captures the basic inputs, outputs, and processes used in most STOP analyses. This resulted in a generic, reusable analysis template that can be used for design trades for a variety of optical systems. The template captures much of the upfront setup such as meshing, boundary conditions, data transfer, naming conventions, and post-processing, and therefore saves time for each subsequent project. A description of the methodology and the analysis template is presented, and results are described for a simple telescope optical system.

  20. Development and implementation of a generic analysis template for structural-thermal-optical-performance modeling

    NASA Astrophysics Data System (ADS)

    Scola, Salvatore; Stavely, Rebecca; Jackson, Trevor; Boyer, Charlie; Osmundsen, Jim; Turczynski, Craig; Stimson, Chad

    2016-09-01

    Performance-related effects of system level temperature changes can be a key consideration in the design of many types of optical instruments. This is especially true for space-based imagers, which may require complex thermal control systems to maintain alignment of the optical components. Structural-Thermal-Optical-Performance (STOP) analysis is a multi-disciplinary process that can be used to assess the performance of these optical systems when subjected to the expected design environment. This type of analysis can be very time consuming, which makes it difficult to use as a trade study tool early in the project life cycle. In many cases, only one or two iterations can be performed over the course of a project. This limits the design space to best practices since it may be too difficult, or take too long, to test new concepts analytically. In order to overcome this challenge, automation, and a standard procedure for performing these studies is essential. A methodology was developed within the framework of the Comet software tool that captures the basic inputs, outputs, and processes used in most STOP analyses. This resulted in a generic, reusable analysis template that can be used for design trades for a variety of optical systems. The template captures much of the upfront setup such as meshing, boundary conditions, data transfer, naming conventions, and post-processing, and therefore saves time for each subsequent project. A description of the methodology and the analysis template is presented, and results are described for a simple telescope optical system.

  1. Testing and Performance Analysis of the Multichannel Error Correction Code Decoder

    NASA Technical Reports Server (NTRS)

    Soni, Nitin J.

    1996-01-01

    This report provides the test results and performance analysis of the multichannel error correction code decoder (MED) system for a regenerative satellite with asynchronous, frequency-division multiple access (FDMA) uplink channels. It discusses the system performance relative to various critical parameters: the coding length, data pattern, unique word value, unique word threshold, and adjacent-channel interference. Testing was performed under laboratory conditions and used a computer control interface with specifically developed control software to vary these parameters. Needed technologies - the high-speed Bose-Chaudhuri-Hocquenghem (BCH) codec from Harris Corporation and the TRW multichannel demultiplexer/demodulator (MCDD) - were fully integrated into the mesh very small aperture terminal (VSAT) onboard processing architecture and were demonstrated.

  2. Linear static structural and vibration analysis on high-performance computers

    NASA Technical Reports Server (NTRS)

    Baddourah, M. A.; Storaasli, O. O.; Bostic, S. W.

    1993-01-01

    Parallel computers offer the opportunity to significantly reduce the computation time necessary to analyze large-scale aerospace structures. This paper presents algorithms developed for and implemented on massively parallel computers, hereafter referred to as Scalable High-Performance Computers (SHPC), for the most computationally intensive tasks involved in structural analysis, namely, generation and assembly of system matrices, solution of systems of equations, and calculation of the eigenvalues and eigenvectors. Results on SHPC are presented for large-scale structural problems (i.e., models for the High-Speed Civil Transport). The goal of this research is to develop a new, efficient technique which extends structural analysis to SHPC and makes large-scale structural analyses tractable.
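    A minimal sketch of the two most computationally intensive steps named above (element matrix generation/assembly and equation solving), using 1D bar elements. The element loop is the part a massively parallel code would distribute across processors; the element model and the dense solver here are illustrative stand-ins, not the paper's SHPC algorithms.

```python
# Sketch (not the paper's SHPC code): assemble element stiffness matrices for
# a 1D bar into a global system K u = f, then solve it. A parallel version
# distributes the element loop across processors; here it runs sequentially.

def assemble_global(n_elems, EA=1.0, length=1.0):
    """Assemble the global stiffness matrix for n_elems equal bar elements."""
    n_nodes = n_elems + 1
    k = EA / (length / n_elems)  # element stiffness
    K = [[0.0] * n_nodes for _ in range(n_nodes)]
    for e in range(n_elems):  # this loop is what gets parallelized
        K[e][e] += k
        K[e][e + 1] -= k
        K[e + 1][e] -= k
        K[e + 1][e + 1] += k
    return K

def solve(K, f):
    """Naive dense Gaussian elimination; SHPC codes use parallel sparse solvers."""
    n = len(f)
    A = [row[:] + [f[i]] for i, row in enumerate(K)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))  # partial pivoting
        A[i], A[p] = A[p], A[i]
        for r in range(i + 1, n):
            m = A[r][i] / A[i][i]
            for c in range(i, n + 1):
                A[r][c] -= m * A[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (A[i][n] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

# Fix the left end (drop row/col 0) and apply a unit tip load.
K = assemble_global(4)
Kr = [row[1:] for row in K[1:]]
u = solve(Kr, [0.0, 0.0, 0.0, 1.0])  # tip displacement should be PL/EA = 1.0
```

For a unit-stiffness bar under a unit tip load the exact tip displacement is 1.0, which the sketch reproduces.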

  3. Performance Metrics Development Analysis for Information and Communications Technology Outsourcing: A Case Study

    ERIC Educational Resources Information Center

    Travis, James L., III

    2014-01-01

    This study investigated how and to what extent the development and use of the OV-5a operational architecture decomposition tree (OADT) from the Department of Defense (DoD) Architecture Framework (DoDAF) affects requirements analysis with respect to complete performance metrics for performance-based services acquisition of ICT under rigid…

  4. A Finite Rate Chemical Analysis of Nitric Oxide Flow Contamination Effects on Scramjet Performance

    NASA Technical Reports Server (NTRS)

    Cabell, Karen F.; Rock, Kenneth E.

    2003-01-01

    The level of nitric oxide contamination in the test gas of the Langley Research Center Arc-Heated Scramjet Test Facility and the effect of the contamination on scramjet test engine performance were investigated analytically. A finite rate chemical analysis was performed to determine the levels of nitric oxide produced in the facility at conditions corresponding to Mach 6 to 8 flight simulations. Results indicate that nitric oxide levels range from one to three mole percent, corroborating previously obtained measurements. A three-stream combustor code with finite rate chemistry was used to investigate the effects of nitric oxide on scramjet performance. Results indicate that nitric oxide in the test gas causes a small increase in heat release and thrust performance for the test conditions investigated. However, a rate constant uncertainty analysis suggests that the effect of nitric oxide ranges from no net effect, to an increase of about 10 percent in thrust performance.

  5. Identification of human operator performance models utilizing time series analysis

    NASA Technical Reports Server (NTRS)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
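    Time-series identification of an operator model can be sketched as a least-squares fit of a discrete transfer function to recorded input/output tracking data. The first-order ARX form and the synthetic data below are assumptions for illustration, not Sperry's actual model.

```python
import random

# Hedged sketch of time-series identification: fit a first-order ARX model
# y[k] = a*y[k-1] + b*u[k-1] to input/output records, as one might for a
# human-operator transfer function. (Illustrative, not Sperry's method.)

def fit_arx1(u, y):
    """Least-squares estimate of (a, b) via the 2x2 normal equations."""
    s_yy = s_yu = s_uu = s_y1y = s_u1y = 0.0
    for k in range(1, len(y)):
        y1, u1 = y[k - 1], u[k - 1]
        s_yy += y1 * y1
        s_yu += y1 * u1
        s_uu += u1 * u1
        s_y1y += y1 * y[k]
        s_u1y += u1 * y[k]
    det = s_yy * s_uu - s_yu * s_yu
    a = (s_y1y * s_uu - s_u1y * s_yu) / det
    b = (s_u1y * s_yy - s_y1y * s_yu) / det
    return a, b

# Synthetic check: data generated by a known model should be recovered.
random.seed(1)
u = [random.uniform(-1, 1) for _ in range(500)]
y = [0.0]
for k in range(1, 500):
    y.append(0.8 * y[k - 1] + 0.3 * u[k - 1])
a, b = fit_arx1(u, y)  # recovers a ~ 0.8, b ~ 0.3
```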

  6. Evaluating Language Environment Analysis System Performance for Chinese: A Pilot Study in Shanghai

    ERIC Educational Resources Information Center

    Gilkerson, Jill; Zhang, Yiwen; Xu, Dongxin; Richards, Jeffrey A.; Xu, Xiaojuan; Jiang, Fan; Harnsberger, James; Topping, Keith

    2015-01-01

    Purpose: The purpose of this study was to evaluate performance of the Language Environment Analysis (LENA) automated language-analysis system for the Chinese Shanghai dialect and Mandarin (SDM) languages. Method: Volunteer parents of 22 children aged 3-23 months were recruited in Shanghai. Families provided daylong in-home audio recordings using…

  7. Propeller performance analysis and multidisciplinary optimization using a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Burger, Christoph

    A propeller performance analysis program has been developed and integrated into a Genetic Algorithm for design optimization. The design tool will produce optimal propeller geometries for a given goal, which includes performance and/or acoustic signature. A vortex lattice model is used for the propeller performance analysis and a subsonic compact source model is used for the acoustic signature determination. Compressibility effects are taken into account with the implementation of Prandtl-Glauert domain stretching. Viscous effects are considered with a simple Reynolds number based model to account for the effects of viscosity in the spanwise direction. An empirical flow separation model developed from experimental lift and drag coefficient data of a NACA 0012 airfoil is included. The propeller geometry is generated using a recently introduced Class/Shape function methodology to allow for efficient use of a wide design space. Optimizing the angle of attack, the chord, the sweep, and the local airfoil sections produced blades with favorable tradeoffs between single and multiple point optimizations of propeller performance and acoustic noise signatures. Optimizations using a binary-encoded IMPROVE(c) Genetic Algorithm (GA) and a real-encoded GA were obtained after optimization runs with some premature convergence. The newly developed real-encoded GA was used to obtain the majority of the results; it produced generally better convergence characteristics when compared to the binary-encoded GA. The optimization trade-offs show that single point optimized propellers have favorable performance, but circulation distributions were less smooth when compared to dual point or multiobjective optimizations. Some of the single point optimizations generated propellers with proplets which show a loading shift to the blade tip region. 
When noise is included into the objective functions some propellers indicate a circulation shift to the inboard sections of the propeller as well as a
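    The real-encoded GA machinery this record describes (selection, blend crossover, mutation) can be sketched in miniature. The two-parameter "blade" objective below is a toy stand-in, not the vortex-lattice analysis, and all constants are assumed.

```python
import random

# Minimal real-coded GA sketch (illustrative, not the IMPROVE GA): evolve a
# two-parameter "blade" (pitch, chord scale) toward a toy efficiency peak.

def fitness(x):
    pitch, chord = x
    # Toy objective with a single optimum at pitch=0.5, chord=0.3.
    return -((pitch - 0.5) ** 2 + (chord - 0.3) ** 2)

def evolve(pop_size=40, gens=100, seed=7):
    rng = random.Random(seed)
    pop = [[rng.random(), rng.random()] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]          # truncation selection (elitist)
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            w = rng.random()                  # blend (arithmetic) crossover
            child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]
            if rng.random() < 0.2:            # small Gaussian mutation
                i = rng.randrange(2)
                child[i] += rng.gauss(0.0, 0.05)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()  # converges near (0.5, 0.3)
```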

  8. Performance evaluation of the Engineering Analysis and Data Systems (EADS) 2

    NASA Technical Reports Server (NTRS)

    Debrunner, Linda S.

    1994-01-01

    The Engineering Analysis and Data System (EADS) II (1) was installed in March 1993 to provide high performance computing for science and engineering at Marshall Space Flight Center (MSFC). EADS II increased the computing capabilities over the existing EADS facility in the areas of throughput and mass storage. EADS II includes a Vector Processor Compute System (VPCS), a Virtual Memory Compute System (VMCS), a Common File System (CFS), and a Common Output System (COS), as well as an Image Processing Station, Mini Super Computers, and Intelligent Workstations. These facilities are interconnected by a sophisticated network system. This work considers only the performance of the VPCS and the CFS. The VPCS is a Cray YMP. The CFS is implemented on an RS 6000 using the UniTree Mass Storage System. To better meet the science and engineering computing requirements, EADS II must be monitored, its performance analyzed, and appropriate modifications for performance improvement made. Implementing this approach requires tool(s) to assist in performance monitoring and analysis. In Spring 1994, PerfStat 2.0 was purchased to meet these needs for the VPCS and the CFS. PerfStat (2) is a set of tools that can be used to analyze both historical and real-time performance data. Its flexible design allows significant user customization. The user identifies what data is collected, how it is classified, and how it is displayed for evaluation. Both graphical and tabular displays are supported. The capability of the PerfStat tool was evaluated, appropriate modifications to EADS II to optimize throughput and enhance productivity were suggested and implemented, and the effects of these modifications on the system's performance were observed. In this paper, the PerfStat tool is described, then its use with EADS II is outlined briefly. Next, the evaluation of the VPCS, as well as the modifications made to the system, are described. Finally, conclusions are drawn and recommendations for future work are outlined.

  9. Performance Analysis of a NASA Integrated Network Array

    NASA Technical Reports Server (NTRS)

    Nessel, James A.

    2012-01-01

    The Space Communications and Navigation (SCaN) Program is planning to integrate its individual networks into a unified network which will function as a single entity to provide services to user missions. This integrated network architecture is expected to provide SCaN customers with the capabilities to seamlessly use any of the available SCaN assets to support their missions and to efficiently meet the collective needs of Agency missions. One potential optimal application of these assets, based on this envisioned architecture, is that of arraying across existing networks to significantly enhance data rates and/or link availabilities. As such, this document provides an analysis of the transmit and receive performance of a proposed SCaN inter-network antenna array. From the study, it is determined that a fully integrated inter-network array does not provide any significant advantage over an intra-network array, one in which the assets of an individual network are arrayed for enhanced performance. Therefore, it is the recommendation of this study that NASA proceed with an arraying concept with a fundamental focus on network-centric arraying.
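    The basic appeal of arraying is easy to quantify: coherently combining N identical apertures ideally raises the downlink SNR by 10*log10(N) dB. The sketch below uses that textbook relation with an assumed combining-loss parameter; it is not the study's link-budget model.

```python
import math

# Back-of-envelope sketch of why arraying helps: coherently combining N
# identical antennas raises received SNR by ~10*log10(N) dB in the ideal
# case. The combining-loss term is an assumed, illustrative parameter.

def array_gain_db(n_antennas, combining_loss_db=0.0):
    """Ideal coherent-combining SNR improvement for n identical antennas."""
    return 10.0 * math.log10(n_antennas) - combining_loss_db

two = array_gain_db(2)   # ~3.01 dB for a two-antenna array
four = array_gain_db(4)  # ~6.02 dB for four antennas
```

Doubling the number of arrayed apertures buys roughly 3 dB each time, which is why arraying within a network can already capture most of the benefit.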

  10. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

    Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems; and a performance analysis method, based on scheduling theory, which can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
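    Scheduling-theory analysis of hard-real-time response typically starts from a schedulability test. As an illustration (not necessarily the paper's exact method), the Liu and Layland rate-monotonic utilization bound gives a sufficient condition for a periodic task set to meet all deadlines:

```python
# Liu & Layland rate-monotonic utilization bound: a sufficient (but not
# necessary) test that a set of periodic tasks meets its hard deadlines
# under rate-monotonic priority assignment. Task values are illustrative.

def rm_schedulable(tasks):
    """tasks: list of (compute_time, period) pairs."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)  # ~0.78 for n=3, ->ln 2 as n grows
    return utilization <= bound, utilization, bound

# Three hypothetical control tasks: (compute ms, period ms).
ok, u, bound = rm_schedulable([(1, 10), (2, 20), (3, 40)])  # U = 0.275
```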

  11. Performance Evaluation of Counter-Based Dynamic Load Balancing Schemes for Massive Contingency Analysis with Different Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Chavarría-Miranda, Daniel

    Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimation. Contingency analysis is also extensively used in power market operation for feasibility test of market solutions. High performance computing holds the promise of faster analysis of more contingency cases for the purpose of safe and reliable operation of today’s power grids with less operating margin and more intermittent renewable energy sources. This paper evaluates the performance of counter-based dynamic load balancing schemes for massive contingency analysis under different computing environments. Insights from the performance evaluation can be used as guidance for users to select suitable schemes in the application of massive contingency analysis. Case studies, as well as MATLAB simulations, of massive contingency cases using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing with counter-based dynamic load balancing schemes.
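    The counter-based scheme itself is simple: each worker atomically increments a shared counter to claim the next contingency case, so faster workers naturally process more cases. A thread-based sketch of that shape (illustrative only; the paper's implementation targets HPC clusters, not Python threads):

```python
import threading

# Counter-based dynamic load balancing sketch: workers atomically take the
# next contingency index from a shared counter until all cases are done.

class Counter:
    def __init__(self):
        self._next = 0
        self._lock = threading.Lock()

    def take(self):
        with self._lock:          # atomic fetch-and-increment
            i = self._next
            self._next += 1
            return i

def run_contingencies(n_cases, n_workers=4):
    counter, results = Counter(), {}
    res_lock = threading.Lock()

    def worker():
        while True:
            case = counter.take()
            if case >= n_cases:
                return
            outcome = case * case  # stand-in for a power-flow solve
            with res_lock:
                results[case] = outcome

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

results = run_contingencies(100)  # all 100 cases claimed exactly once
```

Because cases are claimed one at a time, no static partition of the case list is needed and load imbalance from cases of unequal cost is absorbed automatically.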

  12. Performance Analysis of Visible Light Communication Using CMOS Sensors.

    PubMed

    Do, Trong-Hop; Yoo, Myungsik

    2016-02-29

    This paper elucidates the fundamentals of visible light communication systems that use the rolling shutter mechanism of CMOS sensors. All related information involving different subjects, such as photometry, camera operation, photography and image processing, are studied in tandem to explain the system. Then, the system performance is analyzed with respect to signal quality and data rate. To this end, a measure of signal quality, the signal to interference plus noise ratio (SINR), is formulated. Finally, a simulation is conducted to verify the analysis.
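    A generic SINR figure of merit has the form SINR = S / (I + N), usually reported in dB. The numeric values below are assumed for illustration and are not the paper's measurements:

```python
import math

# Generic signal-to-interference-plus-noise ratio in dB. The power levels
# are assumed example values, not measurements from the paper.

def sinr_db(signal_w, interference_w, noise_w):
    return 10.0 * math.log10(signal_w / (interference_w + noise_w))

value = sinr_db(1e-6, 2e-8, 5e-9)  # 1 uW signal vs 25 nW interference+noise
```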

  13. Performance Analysis and Optimization on the UCLA Parallel Atmospheric General Circulation Model Code

    NASA Technical Reports Server (NTRS)

    Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos

    1996-01-01

    An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code aimed at improving its numerical efficiency, interprocessor communication cost, load balance, and issues affecting single-node code performance are discussed.

  14. Zarya Energy Balance Analysis: The Effect of Spacecraft Shadowing on Solar Array Performance

    NASA Technical Reports Server (NTRS)

    Hoffman, David J.; Kolosov, Vladimir

    1999-01-01

    The first element of the International Space Station (ISS), Zarya, was funded by NASA and built by the Russian aerospace company Khrunichev State Research and Production Space Center (KhSC). NASA Glenn Research Center (GRC) and KhSC collaborated in performing analytical predictions of the on-orbit electrical performance of Zarya's solar arrays. GRC assessed the pointing characteristics of and shadow patterns on Zarya's solar arrays to determine the average solar energy incident on the arrays. KhSC used the incident energy results to determine Zarya's electrical power generation capability and orbit-average power balance. The power balance analysis was performed over a range of solar beta angles and vehicle operational conditions. This analysis enabled identification of problems that could impact the power balance for specific flights during ISS assembly and was also used as the primary means of verifying that Zarya complied with electrical power requirements. Analytical results are presented for select stages in the ISS assembly sequence along with a discussion of the impact of shadowing on the electrical performance of Zarya's solar arrays.

  15. Physiological stress and performance analysis to karate combat.

    PubMed

    Chaabene, Helmi; Hellara, Ilhem; Ghali, Faten B; Franchini, Emerson; Neffati, Fedoua; Tabben, Montassar; Najjar, Mohamed F; Hachana, Younés

    2016-10-01

    This study aimed to evaluate the relationship between physiological measures and performance-analysis parameters during karate contests. Nine elite-level karate athletes participated in this study. Saliva samples were collected pre- and post-combat. Salivary cortisol (sC) post-combat 2 rose significantly compared to that recorded at pre-combat 1 (Δ%=105.3%; P=0.04; dz=0.78). The largest decrease of the salivary T/C ratio (sR) compared to pre-combat 1 was recorded post-combat 2 (Δ%=-43.5%; P=0.03). Moreover, blood lactate concentration post-combat 1 correlated positively to sC post-combat 1 (r=0.66; P=0.05) and negatively to both salivary testosterone (sT) (r=-0.76; P=0.01) and sR post-combat 1 (r=-0.76; P=0.01). There was no significant relationship between hormonal measures and parameters of match analysis. Although under simulated conditions, karate combat imposes substantial physiological stress on the karateka. Additionally, the physiological strain of karate combat led to a catabolic hormonal response.

  16. SFDT-1 Camera Pointing and Sun-Exposure Analysis and Flight Performance

    NASA Technical Reports Server (NTRS)

    White, Joseph; Dutta, Soumyo; Striepe, Scott

    2015-01-01

    The Supersonic Flight Dynamics Test (SFDT) vehicle was developed to advance and test technologies of NASA's Low Density Supersonic Decelerator (LDSD) Technology Demonstration Mission. The first flight test (SFDT-1) occurred on June 28, 2014. In order to optimize the usefulness of the camera data, analysis was performed to optimize parachute visibility in the camera field of view during deployment and inflation and to determine the probability of sun-exposure issues with the cameras given the vehicle heading and launch time. This paper documents the analysis, results and comparison with flight video of SFDT-1.

  17. An analysis for high speed propeller-nacelle aerodynamic performance prediction. Volume 1: Theory and application

    NASA Technical Reports Server (NTRS)

    Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.

    1988-01-01

    A computer program, the Propeller Nacelle Aerodynamic Performance Prediction Analysis (PANPER), was developed for the prediction and analysis of the performance and airflow of propeller-nacelle configurations operating over a forward speed range inclusive of high speed flight typical of recent propfan designs. A propeller lifting line, wake program was combined with a compressible, viscous center body interaction program, originally developed for diffusers, to compute the propeller-nacelle flow field, blade loading distribution, propeller performance, and the nacelle forebody pressure and viscous drag distributions. The computer analysis is applicable to single and coaxial counterrotating propellers. The blade geometries can include spanwise variations in sweep, droop, taper, thickness, and airfoil section type. In the coaxial mode of operation the analysis can treat both equal and unequal blade number and rotational speeds on the propeller disks. The nacelle portion of the analysis can treat both free air and tunnel wall configurations including wall bleed. The analysis was applied to many different sets of flight conditions using selected aerodynamic modeling options. The influence of different propeller nacelle-tunnel wall configurations was studied. Comparisons with available test data for both single and coaxial propeller configurations are presented along with a discussion of the results.

  18. Acoustic Analysis and Electroglottography in Elite Vocal Performers.

    PubMed

    Villafuerte-Gonzalez, Rocio; Valadez-Jimenez, Victor M; Sierra-Ramirez, Jose A; Ysunza, Pablo Antonio; Chavarria-Villafuerte, Karen; Hernandez-Lopez, Xochiquetzal

    2017-05-01

    Acoustic analysis of voice (AAV) and electroglottography (EGG) have been used for assessing vocal quality in patients with voice disorders. The effectiveness of these procedures for detecting mild disturbances in vocal quality in elite vocal performers has been controversial. To compare acoustic parameters obtained by AAV and EGG before and after vocal training to determine the effectiveness of these procedures for detecting vocal improvements in elite vocal performers. Thirty-three elite vocal performers were studied. The study group included 14 males and 19 females, ages 18-40 years, without a history of voice disorders. Acoustic parameters were obtained through AAV and EGG before and after vocal training using the Linklater method. Nonsignificant differences (P > 0.05) were found between values of fundamental frequency (F0), shimmer, and jitter obtained by both procedures before vocal training. Mean F0 was similar after vocal training. Jitter percentage as measured by AAV showed nonsignificant differences (P > 0.05) before and after vocal training. Shimmer percentage as measured by AAV demonstrated a significant reduction (P < 0.05) after vocal training. As measured by EGG after vocal training, shimmer and jitter were significantly reduced (P < 0.05); open quotient was significantly increased (P < 0.05); and irregularity was significantly reduced (P < 0.05). AAV and EGG were effective for detecting improvements in vocal function in male and female elite vocal performers undergoing vocal training. EGG demonstrated better efficacy for detecting improvements and provided additional parameters as compared to AAV.

  19. Performance testing and analysis results of AMTEC cells for space applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borkowski, C.A.; Barkan, A.; Hendricks, T.J.

    1998-01-01

    Testing and analysis have shown that AMTEC (Alkali Metal Thermal to Electric Conversion) (Weber, 1974) cells can reach the performance (power) levels required by a variety of space applications. The performance of an AMTEC cell is highly dependent on the thermal environment to which it is subjected. A guard heater assembly has been designed, fabricated, and used to expose individual AMTEC cells to various thermal environments. The design and operation of the guard heater assembly will be discussed. Performance test results of an AMTEC cell operated under guard heated conditions to simulate an adiabatic cell wall thermal environment are presented. Experimental data and analytic model results are compared to illustrate validation of the model. © 1998 American Institute of Physics.

  20. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Maman, N.; Piperno, S.; Gumaste, U.

    1994-01-01

    This research program deals with the application of high-performance computing methods for the analysis of complete jet engines. We initiated this program by applying the two-dimensional parallel aeroelastic codes to the interior gas flow problem of a bypass jet engine. The fluid mesh generation, domain decomposition, and solution capabilities were successfully tested. We then focused attention on methodology for the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion that results from these structural displacements. This is treated by a new arbitrary Lagrangian-Eulerian (ALE) technique that models the fluid mesh motion as that of a fictitious mass-spring network. New partitioned analysis procedures to treat this coupled three-component problem are developed. These procedures involve delayed corrections and subcycling. Preliminary results on the stability, accuracy, and MPP computational efficiency are reported.

  1. Improving Department of Defense Global Distribution Performance Through Network Analysis

    DTIC Science & Technology

    2016-06-01

    network performance increase. Subject terms: supply chain metrics, distribution networks, requisition shipping time, strategic distribution database ... peace and war” (p. 4). USTRANSCOM Metrics and Analysis Branch defines, develops, tracks, and maintains outcomes-based supply chain metrics to ... (2014a, p. 8). The Joint Staff defines a TDD standard as the maximum number of days the supply chain can take to deliver requisitioned materiel.

  2. Analysis of Latency Performance of Bluetooth Low Energy (BLE) Networks

    PubMed Central

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2015-01-01

    Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE has a fundamental change in the design of the discovery mechanism, including the usage of three advertising channels. Recently, several works have analyzed the topic of BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide range settings of the parameters introduce lots of potential for BLE devices to customize their discovery performance. This motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper is focused on building an analytical model to investigate the discovery probability, as well as the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to quantitatively examine to what extent parameters influence the performance metric of the discovery processes. PMID:25545266
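    The discovery process such a model captures can also be explored by Monte Carlo simulation. The sketch below uses a heavily simplified geometry (zero packet airtime, a scanner cycling the three advertising channels, and the spec's 0-10 ms pseudo-random advDelay); all parameter values and simplifications are illustrative, not the paper's analytical model.

```python
import random

# Monte Carlo sketch of BLE neighbor discovery latency (simplified: zero
# packet airtime; the advertiser sends one packet per advertising channel
# per event; the scanner dwells on channels 37->38->39 in turn).

def discovery_latency(adv_interval=100.0, scan_interval=60.0,
                      scan_window=30.0, rng=random):
    t = rng.uniform(0, adv_interval)            # time of first adv event (ms)
    while True:
        for ch in range(3):                     # one packet per adv channel
            phase = t % (3 * scan_interval)     # where the scanner is
            scan_ch = int(phase // scan_interval)
            in_window = (phase % scan_interval) < scan_window
            if in_window and scan_ch == ch:     # packet lands in open window
                return t
        t += adv_interval + rng.uniform(0, 10)  # advDelay per the spec

rng = random.Random(42)
mean = sum(discovery_latency(rng=rng) for _ in range(2000)) / 2000
```

Sweeping adv_interval, scan_interval, and scan_window in such a simulation is one way to reproduce the kind of sensitivity study the abstract describes.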

  3. Transient analysis techniques in performing impact and crash dynamic studies

    NASA Technical Reports Server (NTRS)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that grew out of the PLANS system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  4. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    PubMed

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.
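    The ensemble idea, perturbing postoperative parameters for many virtual eyes and summarizing an image-quality metric, can be sketched outside ZEMAX with a toy merit function. All tolerances and the metric below are assumptions for illustration:

```python
import random

# Sketch of an "optical ensemble analysis" style Monte Carlo: draw random
# misalignments for many virtual pseudophakic eyes and summarize a quality
# metric. The tolerances and the quadratic merit are toy values, not ZEMAX.

def simulated_eye_metric(rng):
    tilt = rng.gauss(0.0, 0.5)      # IOL tilt (deg), assumed tolerance
    decenter = rng.gauss(0.0, 0.2)  # IOL decentration (mm), assumed tolerance
    # Toy merit: quality drops quadratically with misalignment, floored at 0.
    return max(0.0, 1.0 - 0.1 * tilt ** 2 - 0.5 * decenter ** 2)

rng = random.Random(0)
ensemble = [simulated_eye_metric(rng) for _ in range(5000)]
mean_quality = sum(ensemble) / len(ensemble)  # ensemble summary statistic
```

The distribution of the metric over the ensemble, rather than a single nominal design point, is what gets compared against clinical performance data.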

  5. Performance Analysis of Visible Light Communication Using CMOS Sensors

    PubMed Central

    Do, Trong-Hop; Yoo, Myungsik

    2016-01-01

    This paper elucidates the fundamentals of visible light communication systems that use the rolling shutter mechanism of CMOS sensors. All related information involving different subjects, such as photometry, camera operation, photography and image processing, are studied in tandem to explain the system. Then, the system performance is analyzed with respect to signal quality and data rate. To this end, a measure of signal quality, the signal to interference plus noise ratio (SINR), is formulated. Finally, a simulation is conducted to verify the analysis. PMID:26938535

  6. Performance analysis and improvement of WPAN MAC for home networks.

    PubMed

    Mehta, Saurabh; Kwak, Kyung Sup

    2010-01-01

    The wireless personal area network (WPAN) is an emerging wireless technology for future short range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) is proposed to coordinate the access to the wireless medium among the competing devices, especially for short range and high data rate applications in home networks. In this paper we use analytical modeling to study the performance analysis of WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various ACK policies under error channel condition. This allows us to introduce a K-Dly-ACK-AGG policy, payload size adjustment mechanism, and Improved Backoff algorithm to improve the performance of the WPAN MAC. Performance evaluation results demonstrate the impact of our improvements on network capacity. Moreover, these results can be very useful to WPAN application designers and protocol architects to easily and correctly implement WPAN for home networking.
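    The throughput benefit of a delayed-ACK style policy can be seen from a simple airtime budget: aggregating K frames per acknowledgment amortizes the ACK and turnaround overhead. The timing constants below are illustrative round numbers, not values from the 802.15.3 standard:

```python
# Sketch of how the ACK policy changes MAC efficiency. Aggregating K data
# frames per delayed ACK amortizes the ACK + interframe-space overhead;
# K = 1 models an immediate-ACK policy. Constants are assumed, not 802.15.3's.

DATA_US, ACK_US, SIFS_US, PAYLOAD_BITS = 200.0, 30.0, 10.0, 8000

def throughput_mbps(frames_per_ack):
    k = frames_per_ack
    airtime_us = k * (DATA_US + SIFS_US) + ACK_US + SIFS_US
    return k * PAYLOAD_BITS / airtime_us  # bits per microsecond == Mbps

imm = throughput_mbps(1)  # immediate ACK: 32.0 Mbps with these constants
dly = throughput_mbps(8)  # delayed ACK over an 8-frame burst: higher
```

This is the intuition behind the K-Dly-ACK-AGG policy the abstract introduces: larger bursts per ACK raise channel utilization, at the cost of delayed error recovery.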

  7. Performance Analysis and Improvement of WPAN MAC for Home Networks

    PubMed Central

    Mehta, Saurabh; Kwak, Kyung Sup

    2010-01-01

    The wireless personal area network (WPAN) is an emerging wireless technology for future short range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) is proposed to coordinate the access to the wireless medium among the competing devices, especially for short range and high data rate applications in home networks. In this paper we use analytical modeling to study the performance analysis of WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various ACK policies under error channel condition. This allows us to introduce a K-Dly-ACK-AGG policy, payload size adjustment mechanism, and Improved Backoff algorithm to improve the performance of the WPAN MAC. Performance evaluation results demonstrate the impact of our improvements on network capacity. Moreover, these results can be very useful to WPAN application designers and protocol architects to easily and correctly implement WPAN for home networking. PMID:22319274

  8. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing. The PRIMA Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Wolf, Felix G.

    2014-01-31

    The growing number of cores provided by today’s high-end computing systems presents substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data - even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to

  9. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing: the PRIMA Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Wolf, Felix G.

    2014-01-31

    The growing number of cores provided by today’s high-end computing systems presents substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in a different format, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to

  10. Performance analysis of a laser propelled interorbital transfer vehicle

    NASA Technical Reports Server (NTRS)

    Minovitch, M. A.

    1976-01-01

    The performance capabilities of a laser-propelled interorbital transfer vehicle receiving propulsive power from a single ground-based transmitter were investigated. The laser transmits propulsive energy to the vehicle during successive station fly-overs. By applying a series of these propulsive maneuvers, large payloads can be economically transferred between low earth orbits and synchronous orbits. Operations involving the injection of large payloads onto escape trajectories are also studied. The duration of each successive engine burn must be carefully timed so that the vehicle reappears over the laser station to receive additional propulsive power within the shortest possible time. The analytical solution for determining these time intervals is presented, as is a solution to the problem of determining maximum injection payloads. A parametric computer analysis based on these optimization studies is presented. The results show that relatively low beam powers, on the order of 50 MW to 60 MW, produce significant performance capabilities.

  11. Commissioning and Performance Analysis of WhisperGen Stirling Engine

    NASA Astrophysics Data System (ADS)

    Pradip, Prashant Kaliram

    Stirling engine based cogeneration systems have the potential to reduce energy consumption and greenhouse gas emissions, owing to their high cogeneration efficiency and the emission control afforded by steady external combustion. To date, most studies of this unit have focused on performance based on both experimentation and computer models, and experimental data for diversified operating ranges are lacking. This thesis starts with the commissioning of a WhisperGen Stirling engine with components and instrumentation to evaluate the power and thermal performance of the system. Next, a parametric study of primary engine variables, including air, diesel, and coolant flowrates and temperatures, was carried out to further understand their effect on engine power and efficiency. This trend was then validated against the thermodynamic model developed for the energy analysis of a Stirling cycle. Finally, the energy balance of the Stirling engine was compared with and without heat recovery from the engine block and the combustion chamber exhaust.

  12. Multiple Criteria and Multiple Periods Performance Analysis: The Comparison of North African Railways

    NASA Astrophysics Data System (ADS)

    Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.

    2008-10-01

    Multi-period differences in the technical and financial performance of five North African railways are analysed over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the ARGOS software. These methods provide complementary detailed information: Malmquist discriminates between technological and management progress, while PROMETHEE captures two dimensions of performance, service to the community and enterprise performance, which are often in conflict.

  13. The Effect of Birth Weight on Academic Performance: Instrumental Variable Analysis.

    PubMed

    Lin, Shi Lin; Leung, Gabriel Matthew; Schooling, C Mary

    2017-05-01

    Observationally, lower birth weight is usually associated with poorer academic performance; whether this association is causal or the result of confounding is unknown. To investigate this question, we obtained an effect estimate, which can have a causal interpretation under specific assumptions, of birth weight on educational attainment using instrumental variable analysis based on single nucleotide polymorphisms determining birth weight combined with results from the Social Science Genetic Association Consortium study of 126,559 Caucasians. We similarly obtained an estimate of the effect of birth weight on academic performance in 4,067 adolescents from Hong Kong's (Chinese) Children of 1997 birth cohort (1997-2016), using twin status as an instrumental variable. Birth weight was not associated with years of schooling (per 100-g increase in birth weight, -0.006 years, 95% confidence interval (CI): -0.02, 0.01) or college completion (odds ratio = 1.00, 95% CI: 0.96, 1.03). Birth weight was also unrelated to academic performance in adolescents (per 100-g increase in birth weight, -0.004 grade, 95% CI: -0.04, 0.04) using instrumental variable analysis, although conventional regression gave a small positive association (0.02 higher grade, 95% CI: 0.01, 0.03). Observed associations of birth weight with academic performance may not be causal, suggesting that interventions should focus on the contextual factors generating this correlation.
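
    In its simplest form, with a binary instrument such as twin status, the instrumental-variable approach described above reduces to the Wald estimator: the ratio of instrument-group differences in the outcome and the exposure. A minimal sketch; the function name and all data below are invented for illustration and are not from the study.

```python
# Wald IV estimator for a binary instrument: the causal effect of exposure x
# on outcome y is estimated as the ratio of between-instrument-group
# differences in means. All data below are invented toy values.
def wald_iv(y, x, z):
    mean = lambda v: sum(v) / len(v)
    y1 = [yi for yi, zi in zip(y, z) if zi == 1]
    y0 = [yi for yi, zi in zip(y, z) if zi == 0]
    x1 = [xi for xi, zi in zip(x, z) if zi == 1]
    x0 = [xi for xi, zi in zip(x, z) if zi == 0]
    return (mean(y1) - mean(y0)) / (mean(x1) - mean(x0))

# toy data: z is the instrument, x a birth weight in grams,
# y an academic outcome on an arbitrary scale
z = [0, 0, 1, 1]
x = [3000, 3200, 3400, 3600]
y = [3.0, 3.2, 3.4, 3.6]
effect = wald_iv(y, x, z)  # per-gram effect implied by the toy data
```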

  14. Rotorcraft performance data for AEDT : Methods of using the NASA Design and Analysis of Rotorcraft tool for developing data for AEDT's Rotorcraft Performance Model

    DOT National Transportation Integrated Search

    2016-09-01

    This report documents use of the NASA Design and Analysis of Rotorcraft (NDARC) helicopter performance software tool in developing data for the FAA's Aviation Environmental Design Tool (AEDT). These data support the Rotorcraft Performance Model (RP...

  15. Performance Analysis of ICA in Sensor Array

    PubMed Central

    Cai, Xin; Wang, Xiang; Huang, Zhitao; Wang, Fenghua

    2016-01-01

    As the best-known scheme in the field of Blind Source Separation (BSS), Independent Component Analysis (ICA) has been used intensively in various domains, including biomedical and acoustics applications and cooperative or non-cooperative communication. While sensor arrays are involved in most of these applications, the influence of practical factors on the performance of ICA has not yet been sufficiently investigated. In this manuscript, the issue is researched by taking the typical antenna array as an illustrative example. Factors taken into consideration include the environmental noise level, the properties of the array, and those of the radiators. We analyze the analytic relationship between the noise variance, the source variance, the condition number of the mixing matrix, and the optimal signal to interference-plus-noise ratio, as well as the relationship between the singularity of the mixing matrix and the practical factors concerned. Special attention is paid to situations where the mixing process becomes (nearly) singular, since such circumstances are critical in applications. The results and conclusions obtained should be instructive when applying ICA algorithms to mixtures from sensor arrays. Moreover, an effective countermeasure against the case of singular mixtures is proposed on the basis of this analysis. Experiments validating the theoretical conclusions as well as the effectiveness of the proposed scheme are included. PMID:27164100

  16. Visualization and Analysis of Climate Simulation Performance Data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses, etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, jointly developed by the Max Planck Institute for Meteorology and the German Weather Service in partnership with DKRZ. The visualization and analysis of the model's performance data allows us to optimize and fine tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  17. Importance-performance analysis as a guide for hospitals in improving their provision of services.

    PubMed

    Whynes, D K; Reed, G

    1995-11-01

    As a result of the 1990 National Health Services Act, hospitals now compete with one another to win service contracts. A high level of service quality represents an important ingredient of a successful competitive strategy, yet, in general, hospitals have little external information on which to base quality decisions. Specifically, in their efforts to win contracts from fundholding general practitioners, hospitals require information on what these purchasers deem important with respect to quality, and on how these purchasers assess the quality of their current service performance. The problem is complicated by the fact that hospital service quality is, in itself, multi-dimensional. In other areas of economic activity, the information problem has been resolved by importance-performance analysis, and this paper reports the findings of such an analysis conducted for hospitals in the Trent region. The importance and performance service quality ratings of fundholders were obtained from a questionnaire survey and used in a particular variant of importance-performance analysis, which possesses certain advantages over more conventional approaches. In addition to providing empirical data on the determinants of service quality, as perceived by the purchasers of hospital services, this paper demonstrates how such information can be successfully employed in a quality enhancement strategy.

  18. Analysis of EDP performance

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The objective of this contract was the investigation of the potential performance gains that would result from an upgrade of the Space Station Freedom (SSF) Data Management System (DMS) Embedded Data Processor (EDP) '386' design with the Intel Pentium (registered trade-mark of Intel Corp.) '586' microprocessor. The Pentium ('586') is the latest member of the industry standard Intel X86 family of CISC (Complex Instruction Set Computer) microprocessors. This contract was scheduled to run in parallel with an internal IBM Federal Systems Company (FSC) Internal Research and Development (IR&D) task that had the goal to generate a baseline flight design for an upgraded EDP using the Pentium. This final report summarizes the activities performed in support of Contract NAS2-13758. Our plan was to baseline performance analyses and measurements on the latest state-of-the-art commercially available Pentium processor, representative of the proposed space station design, and then phase to an IBM capital funded breadboard version of the flight design (if available from IR&D and Space Station work) for additional evaluation of results. Unfortunately, the phase-over to the flight design breadboard did not take place, since the IBM Data Management System (DMS) for the Space Station Freedom was terminated by NASA before the referenced capital funded EDP breadboard could be completed. The baseline performance analyses and measurements, however, were successfully completed, as planned, on the commercial Pentium hardware. The results of those analyses, evaluations, and measurements are presented in this final report.

  19. Autotasked Performance in the NAS Workload: A Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)

    1998-01-01

    A statistical analysis of the workload performance of a production quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure that was designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. Speedups for UNICOS 6 releases show consistent wall clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking. The new algorithm favors jobs requesting 8 CPUs over those that request fewer, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristics when applied to the data: higher overhead jobs requesting 8 CPUs are penalized when compared to moderate overhead jobs requesting 4 CPUs, thereby providing a charging incentive for NAS users to use autotasking in a manner that gives them significantly improved turnaround while also maintaining system throughput.

  20. The Diagnostic Performance of Stool DNA Testing for Colorectal Cancer: A Systematic Review and Meta-Analysis.

    PubMed

    Zhai, Rong-Lin; Xu, Fei; Zhang, Pei; Zhang, Wan-Li; Wang, Hui; Wang, Ji-Liang; Cai, Kai-Lin; Long, Yue-Ping; Lu, Xiao-Ming; Tao, Kai-Xiong; Wang, Guo-Bin

    2016-02-01

    This meta-analysis was designed to evaluate the diagnostic performance of stool DNA testing for colorectal cancer (CRC) and to compare the performance of single-gene and multiple-gene tests. The MEDLINE, Cochrane, and EMBASE databases were searched using the keywords colorectal cancers, stool/fecal, sensitivity, specificity, DNA, and screening. Sensitivity analysis, quality assessments, and performance bias were performed for the included studies. Fifty-three studies were included in the analysis with a total sample size of 7524 patients. The studies were heterogeneous with regard to the genes being analyzed for fecal genetic biomarkers of CRC, as well as the laboratory methods being used for each assay. The sensitivity of the different assays ranged from 2% to 100% and the specificity ranged from 81% to 100%. The meta-analysis found that the pooled sensitivities for single- and multigene assays were 48.0% and 77.8%, respectively, while the pooled specificities were 97.0% and 92.7%. Receiver operator curves and diagnostic odds ratios showed no significant difference between the two tests with regard to sensitivity or specificity. This meta-analysis revealed that using assays that evaluated multiple genes compared with single-gene assays did not increase the sensitivity or specificity of stool DNA testing in detecting CRC.
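
    The pooled sensitivities and specificities reported above can be illustrated with the simplest pooling scheme, summing counts across studies; the meta-analysis itself likely used weighted or random-effects methods, and all counts below are invented.

```python
# Naive pooling of a diagnostic-test proportion across studies:
# sum the events and the totals, then take the ratio.
def pooled_rate(events, totals):
    """Pooled proportion across studies (simple count pooling)."""
    return sum(events) / sum(totals)

# invented per-study counts for three hypothetical studies
tp = [40, 55, 70]        # true positives per study
diseased = [50, 80, 90]  # patients with CRC per study
tn = [180, 150, 200]     # true negatives per study
healthy = [190, 160, 210]  # patients without CRC per study

sensitivity = pooled_rate(tp, diseased)  # fraction of cancers detected
specificity = pooled_rate(tn, healthy)   # fraction of healthy tested negative
```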

  1. Funding Ohio Community Colleges: An Analysis of the Performance Funding Model

    ERIC Educational Resources Information Center

    Krueger, Cynthia A.

    2013-01-01

    This study examined Ohio's community college performance funding model that is based on seven student success metrics. A percentage of the regular state subsidy is withheld from institutions; funding is earned back based on the three-year average of success points achieved in comparison to other community colleges in the state. Analysis of…

  2. New Trends in Gender and Mathematics Performance: A Meta-Analysis

    PubMed Central

    Lindberg, Sara M.; Hyde, Janet Shibley; Petersen, Jennifer L.; Linn, Marcia C.

    2010-01-01

    In this paper, we use meta-analysis to analyze gender differences in recent studies of mathematics performance. First, we meta-analyzed data from 242 studies published between 1990 and 2007, representing the testing of 1,286,350 people. Overall, d = .05, indicating no gender difference, and VR = 1.08, indicating nearly equal male and female variances. Second, we analyzed data from large data sets based on probability sampling of U.S. adolescents over the past 20 years: the NLSY, NELS88, LSAY, and NAEP. Effect sizes for the gender difference ranged between −0.15 and +0.22. Variance ratios ranged from 0.88 to 1.34. Taken together these findings support the view that males and females perform similarly in mathematics. PMID:21038941
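
    The two summary statistics used above, the standardized mean difference d and the variance ratio VR, can be computed from per-group summary statistics. A minimal sketch; the function names and all numbers are invented illustrations, not data from the meta-analysis.

```python
import math

def cohens_d(mean_a, mean_b, sd_a, sd_b, n_a, n_b):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2)
    return (mean_a - mean_b) / math.sqrt(pooled_var)

def variance_ratio(sd_a, sd_b):
    """Group-A-to-group-B variance ratio; 1.0 means equal variances."""
    return sd_a**2 / sd_b**2

# invented example: two groups with means 500 vs 495, common SD 100
d = cohens_d(500, 495, 100, 100, 1000, 1000)   # small effect
vr = variance_ratio(104, 100)                   # slightly unequal variances
```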

  3. Performance of blind source separation algorithms for fMRI analysis using a group ICA method.

    PubMed

    Correa, Nicolle; Adali, Tülay; Calhoun, Vince D

    2007-06-01

    Independent component analysis (ICA) is a popular blind source separation technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely, information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices and second-order correlation-based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study variability among different ICA algorithms, and we propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA and joint approximate diagonalization of eigenmatrices (JADE) all yield reliable results, with each having its strengths in specific areas. Eigenvalue decomposition (EVD), an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for iterative ICA algorithms, it is important to investigate the variability of estimates from different runs. We test the consistency of the iterative algorithms Infomax and FastICA by running the algorithm a number of times with different initializations, and we note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis.

  4. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    PubMed

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes.

  5. Meta-analysis of the technical performance of an imaging procedure: Guidelines and statistical methodology

    PubMed Central

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2017-01-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test–retest repeatability data for illustrative purposes. PMID:24872353

  6. Performance analysis of wireless sensor networks in geophysical sensing applications

    NASA Astrophysics Data System (ADS)

    Uligere Narasimhamurthy, Adithya

    Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of the GeoMotes to various frequency components associated with seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against the industry standard wired seismograph system (Geometrics Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?

  7. Performance of an Axisymmetric Rocket Based Combined Cycle Engine During Rocket Only Operation Using Linear Regression Analysis

    NASA Technical Reports Server (NTRS)

    Smith, Timothy D.; Steffen, Christopher J., Jr.; Yungster, Shaye; Keller, Dennis J.

    1998-01-01

    The all rocket mode of operation is shown to be a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. An axisymmetric RBCC engine was used to determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and multiple linear regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inlet diameter ratio. A perfect gas computational fluid dynamics analysis, using both the Spalart-Allmaras and k-omega turbulence models, was performed with the NPARC code to obtain values of vacuum specific impulse. Results from the multiple linear regression analysis showed that for both the full flow and gas generator configurations increasing mixer-ejector area ratio and rocket area ratio increase performance, while increasing mixer-ejector inlet area ratio and mixer-ejector length-to-diameter ratio decrease performance. Increasing injected secondary flow increased performance for the gas generator analysis, but was not statistically significant for the full flow analysis. Chamber pressure was found to be not statistically significant.
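
    The multiple linear regression step described above, fitting a parametric performance model to design-of-experiments runs, can be sketched with ordinary least squares. The factor values and response below are invented stand-ins for the CFD-derived specific impulse results, not the study's data.

```python
import numpy as np

# Invented toy design matrix: two factors standing in for, e.g.,
# mixer-ejector area ratio and rocket exit area ratio.
X = np.array([[1.0, 2.0],
              [1.5, 2.0],
              [1.0, 3.0],
              [1.5, 3.0],
              [1.25, 2.5]])

# Synthetic noise-free response: intercept 0.80 plus linear factor effects.
y = 0.80 + 0.04 * X[:, 0] + 0.02 * X[:, 1]

# Fit the parametric model y = b0 + b1*x1 + b2*x2 by least squares.
A = np.column_stack([np.ones(len(X)), X])     # prepend intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # [b0, b1, b2]
```

    With real data the coefficient signs and magnitudes would indicate which factors raise or lower performance, mirroring the study's conclusions about area ratios.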

  8. The age of peak performance in Ironman triathlon: a cross-sectional and longitudinal data analysis

    PubMed Central

    2013-01-01

    Background The aims of the present study were, firstly, to investigate in a cross-sectional analysis the age of peak Ironman performance within one calendar year in all qualifiers for Ironman Hawaii and Ironman Hawaii; secondly, to determine in a longitudinal analysis on a qualifier for Ironman Hawaii whether the age of peak Ironman performance and Ironman performance itself change across years; and thirdly, to determine the gender difference in performance. Methods In a cross-sectional analysis, the age of the top ten finishers for all qualifier races for Ironman Hawaii and Ironman Hawaii was determined in 2010. For a longitudinal analysis, the age and the performance of the annual top ten female and male finishers in a qualifier for Ironman Hawaii was determined in Ironman Switzerland between 1995 and 2010. Results In 19 of the 20 analyzed triathlons held in 2010, there was no difference in the age of peak Ironman performance between women and men (p > 0.05). The only difference in the age of peak Ironman performance between genders was in ‘Ironman Canada’ where men were older than women (p = 0.023). For all 20 races, the age of peak Ironman performance was 32.2 ± 1.5 years for men and 33.0 ± 1.6 years for women (p > 0.05). In Ironman Switzerland, there was no difference in the age of peak Ironman performance between genders for top ten women and men from 1995 to 2010 (F = 0.06, p = 0.8). The mean age of top ten women and men was 31.4 ± 1.7 and 31.5 ± 1.7 years (Cohen's d = 0.06), respectively. The gender difference in performance in the three disciplines and for overall race time decreased significantly across years. Men and women improved overall race times by approximately 1.2 and 4.2 min/year, respectively. Conclusions Women and men peak at a similar age of 32–33 years in an Ironman triathlon with no gender difference. In a qualifier for Ironman Hawaii, the age of peak Ironman performance remained unchanged across years. In contrast, gender

  9. Working Performance Analysis of Rolling Bearings Used in Mining Electric Excavator Crowd Reducer

    NASA Astrophysics Data System (ADS)

    Zhang, Y. H.; Hou, G.; Chen, G.; Liang, J. F.; Zheng, Y. M.

    2017-12-01

    Based on statistical load data for the digging process, and on a dynamics simulation of the crowd reducer system, a simulation analysis of the working performance of the rolling bearings used in the crowd reducer of a large mining electric excavator is completed. The simulation analysis covers the internal load distribution, the contact stresses on the rolling elements, and the fatigue life of the rolling bearings. The internal load characteristics of the rolling elements in the cylindrical roller bearings are obtained. The results of this study show that all rolling bearings satisfy the requirements for contact strength and fatigue life. The rationality of the bearing selection and arrangement is also verified.

  10. Thermal Performance Analysis of Solar Collectors Installed for Combisystem in the Apartment Building

    NASA Astrophysics Data System (ADS)

    Žandeckis, A.; Timma, L.; Blumberga, D.; Rochas, C.; Rošā, M.

    2012-01-01

    The paper focuses on the application of a wood pellet and solar combisystem for space heating and hot water preparation in apartment buildings under the climate of Northern Europe. A pilot project has been implemented in the city of Sigulda (N 57° 09.410 E 024° 52.194), Latvia. The system was designed and optimised using TRNSYS, a dynamic simulation tool. The pilot project was continuously monitored. The heat transfer fluid flow rate and the influence of the inlet temperature on the performance of the solar collectors were subjected to analysis. The thermal performance of the solar collector loop was studied using a direct method. A multiple regression analysis was carried out using STATGRAPHICS Centurion 16.1.15 to identify the operational and weather parameters of the system that most strongly influence the collectors' performance. The parameters to be used for the system's optimisation have been evaluated.
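
    The multiple-regression step described above can be sketched with ordinary least squares in numpy; the predictor names and the synthetic data below are hypothetical illustrations, not the study's actual STATGRAPHICS model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical operational/weather predictors of collector efficiency
irradiance = rng.uniform(200.0, 1000.0, n)   # global irradiance, W/m^2
inlet_temp = rng.uniform(20.0, 70.0, n)      # collector inlet temperature, deg C
flow_rate = rng.uniform(0.5, 2.0, n)         # heat transfer fluid flow, kg/s

# Synthetic response: efficiency falls with inlet temperature,
# rises slightly with irradiance and flow rate (illustrative only)
eff = (0.75 + 1e-4 * irradiance - 4e-3 * inlet_temp
       + 2e-2 * flow_rate + rng.normal(0.0, 0.01, n))

# Ordinary least squares: design matrix with an intercept column
X = np.column_stack([np.ones(n), irradiance, inlet_temp, flow_rate])
coef, *_ = np.linalg.lstsq(X, eff, rcond=None)
print(coef)   # intercept and slopes, close to the generating values
```

    The fitted slopes indicate which parameters most strongly influence the response, which is the role the regression plays in the study.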

  11. SERVER DEVELOPMENT FOR NSLS-II PHYSICS APPLICATIONS AND PERFORMANCE ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, G.; Kraimer, M.

    2011-03-28

    The beam commissioning software framework of the NSLS-II project adopts a client/server architecture in place of the more traditional monolithic high-level application approach. The server software under development is available via an open source SourceForge project named epics-pvdata, which consists of the modules pvData, pvAccess, pvIOC, and pvService. Two services that already exist in the pvService module are itemFinder and gather. Each service uses pvData to store in-memory transient data, pvAccess to transfer data over the network, and pvIOC as the service engine. Performance benchmarking for pvAccess and for both the gather and itemFinder services is presented in this paper, along with a performance comparison between pvAccess and Channel Access. For an ultra-low-emittance synchrotron radiation light source like NSLS-II, the control system requirements, especially for beam control, are tight. To control and manipulate the beam effectively, a use case study and a theoretical evaluation have been performed. The analysis shows that model-based control is indispensable for beam commissioning and routine operation. However, there are many challenges, such as how to re-use a design model for on-line model-based control, and how to combine numerical methods for modeling a realistic lattice with analytical techniques for analyzing its properties. To meet these requirements and challenges, an adequate system architecture for the beam commissioning and operation software framework is critical. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low-level hardware processing from numerical algorithm computing, physics modelling, data manipulation and plotting, and error handling. However, none of the existing approaches satisfies the requirements. A new design has been proposed by introducing

  12. A computer analysis of the RF performance of a ground-mounted, air-supported radome

    NASA Astrophysics Data System (ADS)

    Punnett, M. B.; Joy, E. B.

    Several reports, as well as actual operating experience, have highlighted the degradation of RF performance that can occur when SSR or IFF antennas are mounted above a primary search antenna within metal space frame or dielectric space frame radomes. These effects are usually attributed to both the high incidence angles and the sensitivity of the low-gain antennas to sidelobe changes caused by scattered energy. Although it has been widely accepted that thin membrane radomes would provide superior performance for this application, there has been little supporting documentation. A plane-wave-spectrum (PWS) computer-based radome analysis was conducted to assess the performance of a specific air-supported radome for the SSR application. In conducting the analysis, a mathematical model of a modern SSR antenna was combined with a model of an existing Birdair radome design.

  13. Performance Analysis of a Static Synchronous Compensator (STATCOM)

    NASA Astrophysics Data System (ADS)

    Kambey, M. M.; Ticoh, J. D.

    2018-02-01

    Reactive power and voltage regulation are among the problems in electric power supply. A Gate Turn-Off (GTO) thyristor based Static Synchronous Compensator (STATCOM) is a shunt-connected FACTS device that can supply variable reactive power and regulate the voltage of the bus to which it is connected. This study discusses the performance characteristics of a three-phase six-pulse STATCOM by analysing the current waveform flowing through the DC capacitor, which depends on the switching current and the capacitor voltage waveform. The simulation method used in this research starts from a mathematical analysis of the AC current and DC voltage and current equations for the STATCOM taken from the literature. The results show that the capacitor voltage ripple also alters the AC current waveform, although the resulting errors are not very significant, and that the symmetry constraint of the circuit is valid only if the source voltages have no zero-sequence components and the impedances in all three phases are identical. Therefore, to improve STATCOM performance it is necessary to use a 12-, 24-, 36-, 48- or higher-pulse configuration and/or a multilevel converter.

  14. Removing Grit During Wastewater Treatment: CFD Analysis of HDVS Performance.

    PubMed

    Meroney, Robert N; Sheker, Robert E

    2016-05-01

    Computational Fluid Dynamics (CFD) was used to simulate the grit and sand separation effectiveness of a typical hydrodynamic vortex separator (HDVS) system. The analysis examined the influences on the separator efficiency of: flow rate, fluid viscosities, total suspended solids (TSS), and particle size and distribution. It was found that separator efficiency for a wide range of these independent variables could be consolidated into a few curves based on the particle fall velocity to separator inflow velocity ratio, Ws/Vin. Based on CFD analysis it was also determined that systems of different sizes with length scale ratios ranging from 1 to 10 performed similarly when Ws/Vin and TSS were held constant. The CFD results have also been compared to a limited range of experimental data.
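
    The consolidating parameter Ws/Vin can be illustrated with Stokes' law for the particle fall velocity; the grain size, densities, and inflow velocity below are hypothetical example values, not figures from the paper:

```python
def stokes_settling_velocity(d, rho_p, rho_f=998.0, mu=1.0e-3, g=9.81):
    """Terminal fall velocity Ws (m/s) of a small sphere via Stokes' law;
    valid only for particle Reynolds numbers well below 1."""
    return g * (rho_p - rho_f) * d ** 2 / (18.0 * mu)

# A 100-micron sand grain (density ~2650 kg/m^3) settling in water
ws = stokes_settling_velocity(100e-6, 2650.0)
vin = 0.5                    # hypothetical separator inflow velocity, m/s
ratio = ws / vin             # the consolidating parameter Ws/Vin
print(f"Ws = {ws:.3e} m/s, Ws/Vin = {ratio:.3e}")
```

    Holding this ratio (and TSS) constant is what lets separators of different sizes collapse onto the same efficiency curves.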

  15. Independent component analysis algorithm FPGA design to perform real-time blind source separation

    NASA Astrophysics Data System (ADS)

    Meyer-Baese, Uwe; Odom, Crispin; Botella, Guillermo; Meyer-Baese, Anke

    2015-05-01

    The conditions that arise in the cocktail party problem prevail across many fields, creating a need for Blind Source Separation (BSS). BSS has become prevalent in several fields, including array processing, communications, medical signal processing, speech processing, wireless communication, audio, acoustics, and biomedical engineering. The concept of the cocktail party problem and BSS led to the development of Independent Component Analysis (ICA) algorithms. ICA proves useful for applications needing real-time signal processing. The goal of this research was to perform an extensive study of the ability and efficiency of ICA algorithms to perform blind source separation on mixed signals in software, and of their implementation in hardware with a Field Programmable Gate Array (FPGA). The Algebraic ICA (A-ICA), Fast ICA, and Equivariant Adaptive Separation via Independence (EASI) ICA algorithms were examined and compared. The best algorithm was the one requiring the least complexity and fewest resources while effectively separating the mixed sources; this was the EASI algorithm. The EASI ICA was implemented on an FPGA to analyze its performance in real time.
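
    A minimal sketch of the EASI algorithm the study selected, assuming the standard Cardoso-Laheld serial update with a cubic nonlinearity; the step size, synthetic sources, and mixing matrix below are illustrative, and the actual FPGA implementation would of course differ:

```python
import numpy as np

def easi(X, mu=1e-3):
    """Sketch of the EASI serial update (Cardoso & Laheld):
    W <- W - mu * (y y^T - I + g(y) y^T - y g(y)^T) W, here with g(y) = y**3,
    which is stable for the sub-Gaussian sources used below."""
    n = X.shape[0]
    W = np.eye(n)
    I = np.eye(n)
    for x in X.T:                        # one stochastic update per sample
        y = W @ x
        g = y ** 3
        W -= mu * (np.outer(y, y) - I + np.outer(g, y) - np.outer(y, g)) @ W
    return W

# Two bounded, roughly unit-variance sub-Gaussian sources, mildly mixed
rng = np.random.default_rng(1)
s = np.vstack([np.sign(rng.standard_normal(20000)),
               rng.uniform(-1.7, 1.7, 20000)])
A = np.array([[1.0, 0.6],
              [0.5, 1.0]])
W = easi(A @ s)
Y = W @ (A @ s)   # recovered components, up to scale and permutation
```

    The per-sample update uses only matrix-vector products and outer products, which is one reason EASI maps well onto streaming FPGA hardware.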

  16. Numerical performance analysis of quartz tuning fork-based force sensors

    NASA Astrophysics Data System (ADS)

    Dagdeviren, Omur E.; Schwarz, Udo D.

    2017-01-01

    Quartz tuning fork-based force sensors where one prong is immobilized onto a holder while the other one is allowed to oscillate freely (‘qPlus’ configuration) are in widespread use for high-resolution scanning probe microscopy applications. Due to the small size of the tuning forks (≈3 mm) and the complexity of the sensor assemblies, the reliable and repeatable manufacturing of the sensors has been challenging. In this paper, we investigate the contribution of the amount and location of the epoxy glue used to attach the tuning fork to its holder on the sensor’s performance. Towards this end, we use finite element analysis to model the entire sensor assembly and to perform static and dynamic numerical simulations. Our analysis reveals that increasing the thickness of the epoxy layer between prong and holder results in a decrease of the sensor’s spring constant, eigenfrequency, and quality factor while showing an increasing deviation from oscillation in its primary modal shape. Adding epoxy at the sides of the tuning fork also leads to a degradation of the quality factor even though in this case, spring constant and eigenfrequency rise in tandem with a lessening of the deviation from its ideal modal shape.

  17. Topology design and performance analysis of an integrated communication network

    NASA Technical Reports Server (NTRS)

    Li, V. O. K.; Lam, Y. F.; Hou, T. C.; Yuen, J. H.

    1985-01-01

    A research study on the topology design and performance analysis for the Space Station Information System (SSIS) network is conducted. It is begun with a survey of existing research efforts in network topology design. Then a new approach for topology design is presented. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. The algorithm for generating subsets is described in detail, and various aspects of the overall design procedure are discussed. Two more efficient versions of this algorithm (applicable in specific situations) are also given. Next, two important aspects of network performance analysis: network reliability and message delays are discussed. A new model is introduced to study the reliability of a network with dependent failures. For message delays, a collection of formulas from existing research results is given to compute or estimate the delays of messages in a communication network without making the independence assumption. The design algorithm coded in PASCAL is included as an appendix.
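
    The candidate-generation idea above, enumerating subsets of network components in increasing order of total cost, can be sketched with a heap; this is a generic version of the scheme, not the paper's PASCAL implementation:

```python
import heapq

def subsets_by_cost(costs):
    """Yield (total_cost, subset_indices) over all non-empty subsets in
    nondecreasing total cost. After sorting items by cost, a subset whose
    largest sorted position is i spawns 'append i+1' and 'replace i by
    i+1'; each subset is produced exactly once, so a heap yields them in
    cost order."""
    order = sorted(range(len(costs)), key=lambda k: costs[k])
    c = [costs[k] for k in order]
    heap = [(c[0], (0,))]
    while heap:
        total, subset = heapq.heappop(heap)
        yield total, tuple(sorted(order[i] for i in subset))
        last = subset[-1]
        if last + 1 < len(c):
            heapq.heappush(heap, (total + c[last + 1], subset + (last + 1,)))
            heapq.heappush(heap, (total - c[last] + c[last + 1],
                                  subset[:-1] + (last + 1,)))

costs = [4, 1, 3, 2]          # hypothetical component costs
cheapest = [t for t, _ in subsets_by_cost(costs)]
print(cheapest[:6])           # → [1, 2, 3, 3, 4, 4]
```

    In the design procedure, each subset popped from the heap would be checked for network feasibility, and the first acceptable one is cost-optimal by construction.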

  18. Open | SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades, a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open | SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open | SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open | SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.

  19. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time, and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a Features, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference case and many alternative cases representing various groups of FEPs were defined, and individual numerical simulations were performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities, and geometric variations. Both point estimates using high and low values for individual parameters and a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.

  20. Dependability and performability analysis

    NASA Technical Reports Server (NTRS)

    Trivedi, Kishor S.; Ciardo, Gianfranco; Malhotra, Manish; Sahner, Robin A.

    1993-01-01

    Several practical issues regarding specifications and solution of dependability and performability models are discussed. Model types with and without rewards are compared. Continuous-time Markov chains (CTMC's) are compared with (continuous-time) Markov reward models (MRM's) and generalized stochastic Petri nets (GSPN's) are compared with stochastic reward nets (SRN's). It is shown that reward-based models could lead to more concise model specifications and solution of a variety of new measures. With respect to the solution of dependability and performability models, three practical issues were identified: largeness, stiffness, and non-exponentiality, and a variety of approaches are discussed to deal with them, including some of the latest research efforts.

  1. Diagnostic Performance of Mammographic Texture Analysis in the Differential Diagnosis of Benign and Malignant Breast Tumors.

    PubMed

    Li, Zhiming; Yu, Lan; Wang, Xin; Yu, Haiyang; Gao, Yuanxiang; Ren, Yande; Wang, Gang; Zhou, Xiaoming

    2017-11-09

    The purpose of this study was to investigate the diagnostic performance of mammographic texture analysis in the differential diagnosis of benign and malignant breast tumors. Digital mammography images were obtained from the Picture Archiving and Communication System at our institute, and texture features of the mammographic images were calculated. The Mann-Whitney U test was used to identify differences between the benign and malignant groups, and receiver operating characteristic (ROC) curve analysis was used to assess the diagnostic performance of the texture features. Significant differences in the texture features of the histogram, gray-level co-occurrence matrix (GLCM), and run length matrix (RLM) were found between the benign and malignant groups (P < .05). The areas under the ROC curve (AUROC) for the histogram, GLCM, and RLM were 0.800, 0.787, and 0.761, with no differences between them (P > .05). The AUROCs of imaging-based diagnosis, texture analysis, and imaging-based diagnosis combined with texture analysis were 0.873, 0.863, and 0.961, respectively. When imaging-based diagnosis was combined with texture analysis, the AUROC was higher than that of imaging-based diagnosis or texture analysis alone (P < .05). Mammographic texture analysis is a reliable technique for the differential diagnosis of benign and malignant breast tumors. Furthermore, the combination of imaging-based diagnosis and texture analysis can significantly improve diagnostic performance. Copyright © 2017 Elsevier Inc. All rights reserved.
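
    A minimal sketch of how GLCM-type texture features are computed, assuming the standard co-occurrence definition; the toy image and the single offset below are illustrative, not the study's mammography pipeline:

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Symmetric, normalized gray-level co-occurrence matrix
    for a single pixel offset (dx, dy)."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    P = P + P.T                          # count each pair in both directions
    return P / P.sum()

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
P = glcm(img, levels=4)

# Two classic Haralick-style features derived from the matrix
i, j = np.indices(P.shape)
contrast = float(((i - j) ** 2 * P).sum())               # ≈ 0.583 here
homogeneity = float((P / (1.0 + np.abs(i - j))).sum())   # ≈ 0.819 here
print(contrast, homogeneity)
```

    Features like these, computed per lesion, are what feed the Mann-Whitney tests and ROC analysis described above.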

  2. Performance Evaluation and Analysis for Gravity Matching Aided Navigation.

    PubMed

    Wu, Lin; Wang, Hubiao; Chai, Hua; Zhang, Lu; Hsu, Houtse; Wang, Yong

    2017-04-05

    Simulation tests were accomplished in this paper to evaluate the performance of gravity matching aided navigation (GMAN). Four essential factors were focused in this study to quantitatively evaluate the performance: gravity database (DB) resolution, fitting degree of gravity measurements, number of samples in matching, and gravity changes in the matching area. Marine gravity anomaly DB derived from satellite altimetry was employed. Actual dynamic gravimetry accuracy and operating conditions were referenced to design the simulation parameters. The results verified that the improvement of DB resolution, gravimetry accuracy, number of measurement samples, or gravity changes in the matching area generally led to higher positioning accuracies, while the effects of them were different and interrelated. Moreover, three typical positioning accuracy targets of GMAN were proposed, and the conditions to achieve these targets were concluded based on the analysis of several different system requirements. Finally, various approaches were provided to improve the positioning accuracy of GMAN.

  3. Performance Evaluation and Analysis for Gravity Matching Aided Navigation

    PubMed Central

    Wu, Lin; Wang, Hubiao; Chai, Hua; Zhang, Lu; Hsu, Houtse; Wang, Yong

    2017-01-01

    Simulation tests were accomplished in this paper to evaluate the performance of gravity matching aided navigation (GMAN). Four essential factors were focused in this study to quantitatively evaluate the performance: gravity database (DB) resolution, fitting degree of gravity measurements, number of samples in matching, and gravity changes in the matching area. Marine gravity anomaly DB derived from satellite altimetry was employed. Actual dynamic gravimetry accuracy and operating conditions were referenced to design the simulation parameters. The results verified that the improvement of DB resolution, gravimetry accuracy, number of measurement samples, or gravity changes in the matching area generally led to higher positioning accuracies, while the effects of them were different and interrelated. Moreover, three typical positioning accuracy targets of GMAN were proposed, and the conditions to achieve these targets were concluded based on the analysis of several different system requirements. Finally, various approaches were provided to improve the positioning accuracy of GMAN. PMID:28379178

  4. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.

  5. Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models

    PubMed Central

    Goel, Shivali; Abawajy, Jemal H.; Kim, Tai-hoon

    2010-01-01

    Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region, with sensors deployed on a field. We verify that in a sensor network, the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied receive diversity techniques. Performance analyses based on variations in receiver height, maximum multipath delay, and transmit power have been performed considering different numbers of antenna elements in the receiver array. Our results show that increasing the number of antenna elements in a wireless sensor network does indeed improve the achievable BER. PMID:22163510

  6. Enhanced terahertz imaging system performance analysis and design tool for concealed weapon identification

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Franck, Charmaine C.; Espinola, Richard L.; Petkie, Douglas T.; De Lucia, Frank C.; Jacobs, Eddie L.

    2011-11-01

    The U.S. Army Research Laboratory (ARL) and the U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) have developed a terahertz-band imaging system performance model/tool for detection and identification of concealed weaponry. The details of the MATLAB-based model which accounts for the effects of all critical sensor and display components, and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security & Defence Symposium (Brugge). An advanced version of the base model that accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported on at the 2007 SPIE Defense and Security Symposium (Orlando). This paper will provide a comprehensive review of an enhanced, user-friendly, Windows-executable, terahertz-band imaging system performance analysis and design tool that now includes additional features such as a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures. This newly enhanced THz imaging system design tool is an extension of the advanced THz imaging system performance model that was developed under the Defense Advanced Research Project Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. This paper will also provide example system component (active-illumination source and detector) trade-study analyses using the new features of this user-friendly THz imaging system performance analysis and design tool.

  7. 41 CFR 102-80.130 - Who must perform the equivalent level of safety analysis?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 41, Public Contracts and Property Management, Volume 3 (2010-07-01 edition); Part 102-80, Safety and Environmental Management; Accident and Fire Prevention; Equivalent Level of Safety Analysis; Section 102-80.130: Who must perform the equivalent level of safety analysis?

  8. Performance Evaluation of Technical Institutions: An Application of Data Envelopment Analysis

    ERIC Educational Resources Information Center

    Debnath, Roma Mitra; Shankar, Ravi; Kumar, Surender

    2008-01-01

    Technical institutions (TIs) are playing an important role in making India a knowledge hub of this century. There is still great diversity in their relative performance, which is a matter of concern to the education planner. This article employs the method of data envelopment analysis (DEA) to compare the relative efficiency of TIs in India. The…

  9. Application of Data Envelopment Analysis on the Indicators Contributing to Learning and Teaching Performance

    ERIC Educational Resources Information Center

    Montoneri, Bernard; Lin, Tyrone T.; Lee, Chia-Chi; Huang, Shio-Ling

    2012-01-01

    This paper applies data envelopment analysis (DEA) to explore the quantitative relative efficiency of 18 classes of freshmen students studying a course of English conversation in a university of Taiwan from the academic year 2004-2006. A diagram of teaching performance improvement mechanism is designed to identify key performance indicators for…

  10. Plausibility assessment of a 2-state self-paced mental task-based BCI using the no-control performance analysis.

    PubMed

    Faradji, Farhad; Ward, Rabab K; Birch, Gary E

    2009-06-15

    The feasibility of having a self-paced brain-computer interface (BCI) based on mental tasks is investigated. The EEG signals of four subjects performing five mental tasks each are used in the design of a 2-state self-paced BCI. The output of the BCI should only be activated when the subject performs a specific mental task and should remain inactive otherwise. For each subject and each task, the feature coefficient and the classifier that yield the best performance are selected, using the autoregressive coefficients as the features. The classifier with a zero false positive rate and the highest true positive rate is selected as the best classifier. The classifiers tested include: linear discriminant analysis, quadratic discriminant analysis, Mahalanobis discriminant analysis, support vector machine, and radial basis function neural network. The results show that: (1) some classifiers obtained the desired zero false positive rate; (2) the linear discriminant analysis classifier does not yield acceptable performance; (3) the quadratic discriminant analysis classifier outperforms the Mahalanobis discriminant analysis classifier and performs almost as well as the radial basis function neural network; and (4) the support vector machine classifier has the highest true positive rates but unfortunately has nonzero false positive rates in most cases.
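
    The selection rule described above, keep only classifiers with a zero false positive rate and pick the one with the highest true positive rate, can be sketched as follows; the candidate names and prediction vectors are hypothetical, not the study's actual classifier outputs:

```python
def rates(y_true, y_pred):
    """True and false positive rates for binary labels
    (1 = intentional control state, 0 = no-control state)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    pos = sum(y_true)
    neg = len(y_true) - pos
    return tp / pos, fp / neg

def best_zero_fp(candidates, y_true):
    """Keep only candidates with a zero false positive rate and
    return the one with the highest true positive rate."""
    best, best_tpr = None, -1.0
    for name, y_pred in candidates.items():
        tpr, fpr = rates(y_true, y_pred)
        if fpr == 0.0 and tpr > best_tpr:
            best, best_tpr = name, tpr
    return best, best_tpr

y_true = [1, 1, 1, 1, 0, 0, 0, 0]
candidates = {                          # hypothetical classifier outputs
    "LDA": [1, 1, 1, 1, 1, 0, 0, 0],    # high TPR but one false positive
    "SVM": [1, 1, 1, 0, 0, 0, 0, 0],    # zero FPR, TPR = 0.75
    "RBF": [1, 1, 0, 0, 0, 0, 0, 0],    # zero FPR, TPR = 0.50
}
print(best_zero_fp(candidates, y_true))   # → ('SVM', 0.75)
```

    The hard zero-FPR constraint reflects the self-paced BCI requirement that the output stay inactive during no-control periods.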

  11. A Divergence Statistics Extension to VTK for Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]) in which we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order, and auto-correlative statistics engines that we developed within the Visualization Tool Kit (VTK) as a scalable, parallel, and versatile statistics package. We now report on a new engine developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
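
    One standard divergence between an observed empirical distribution and a theoretical one is the Kullback-Leibler divergence; the sketch below (in Python rather than the engine's C++) illustrates the concept only, and is not necessarily the exact statistic the VTK engine computes:

```python
import math
from collections import Counter

def kl_divergence(samples, theoretical):
    """D_KL(empirical || theoretical) over a discrete support,
    in nats; zero iff the empirical frequencies match the model."""
    n = len(samples)
    counts = Counter(samples)
    d = 0.0
    for value, q in theoretical.items():
        p = counts.get(value, 0) / n
        if p > 0.0:
            d += p * math.log(p / q)
    return d

# Observed die rolls compared against a fair-die model
rolls = [1, 1, 2, 3, 3, 3, 4, 5, 6, 6]
fair = {v: 1 / 6 for v in range(1, 7)}
print(kl_divergence(rolls, fair))   # small positive value, in nats
```

    In an HPC setting, the "samples" could be observed per-task timings binned over a support, compared against an idealized load-balance model.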

  12. Task versus relationship conflict, team performance, and team member satisfaction: a meta-analysis.

    PubMed

    De Dreu, Carsten K W; Weingart, Laurie R

    2003-08-01

    This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong negative correlations between relationship conflict and both team performance and team member satisfaction. In contrast to what has been suggested in both academic research and introductory textbooks, however, results also revealed strong negative (instead of the predicted positive) correlations between task conflict and both team performance and team member satisfaction. As predicted, conflict had stronger negative relations with team performance in highly complex (decision making, project, mixed) tasks than in less complex (production) tasks. Finally, task conflict was less negatively related to team performance when task conflict and relationship conflict were weakly, rather than strongly, correlated.

  13. MTF measurements on real time for performance analysis of electro-optical systems

    NASA Astrophysics Data System (ADS)

    Stuchi, Jose Augusto; Signoreto Barbarini, Elisa; Vieira, Flavio Pascoal; dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fatima Maria Mitsue; Castro Neto, Jarbas C.; Linhari Rodrigues, Evandro Luis

    2012-06-01

    The need for methods and tools that assist in determining the performance of optical systems is increasing. One of the most widely used methods for analyzing optical systems is measurement of the Modulation Transfer Function (MTF). The MTF provides a direct and quantitative verification of image quality. This paper presents the implementation of software to calculate the MTF of electro-optical systems. The software was used to calculate the MTF of a digital fundus camera, a thermal imager, and an ophthalmologic surgery microscope. The MTF information aids the analysis of alignment, the measurement of optical quality, and the definition of the limiting resolution of optical systems. The results obtained with the fundus camera and the thermal imager were compared with theoretical values. For the microscope, the results were compared with the MTF measured for a Zeiss microscope model, which is the quality standard for ophthalmological microscopes.
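
    The measurement principle, the MTF as the normalized Fourier magnitude of a line spread function, can be sketched as follows; the Gaussian spread functions below are synthetic stand-ins for measured data, not output from the paper's software:

```python
import numpy as np

def mtf_from_lsf(lsf):
    """MTF as the normalized magnitude of the discrete Fourier transform
    of the line spread function (measurement principle only)."""
    otf = np.fft.fft(lsf)
    mtf = np.abs(otf) / np.abs(otf[0])
    return mtf[: len(mtf) // 2]          # non-redundant half of the spectrum

# Synthetic Gaussian line spread functions: a wider blur rolls off faster
x = np.arange(-32, 32)
sharp = np.exp(-x ** 2 / (2 * 1.5 ** 2))
blurry = np.exp(-x ** 2 / (2 * 4.0 ** 2))
mtf_sharp = mtf_from_lsf(sharp)
mtf_blurry = mtf_from_lsf(blurry)
print(mtf_sharp[5], mtf_blurry[5])   # sharper optics retain more contrast
```

    Comparing such curves against the diffraction-limited prediction is how measured MTF verifies alignment and optical quality.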

  14. Mobility performance analysis of an innovation lunar rover with diameter-variable wheel

    NASA Astrophysics Data System (ADS)

    Sun, Gang; Gao, Feng; Sun, Peng; Xu, Guoyan

    2007-11-01

    To achieve excellent mobility performance, a four-wheel, all-wheel-drive lunar rover with diameter-variable wheels is presented. Each wheel can be contracted and extended by a motor in the wheel hub, varying the wheel diameter from 200 mm to 390 mm. The wheel sinkage and drawbar pull were predicted using terramechanics formulae and lunar regolith mechanical parameters. Furthermore, slope traversability was investigated through quasi-static mechanical analysis, and the obstacle resistance and the maximum negotiable obstacle height for different wheel radii were derived from the static equilibrium equations of the rover. The analysis results show that the presented lunar rover offers much better slope-traveling stability and obstacle-climbing capability than rovers with conventional wheels; this improves the rover's mobility performance, stabilizes its frame, and smooths the motion of its sensors.

  15. Multivariate meta-analysis of individual participant data helped externally validate the performance and implementation of a prediction model.

    PubMed

    Snell, Kym I E; Hua, Harry; Debray, Thomas P A; Ensor, Joie; Look, Maxime P; Moons, Karel G M; Riley, Richard D

    2016-01-01

    Our aim was to improve meta-analysis methods for summarizing a prediction model's performance when individual participant data are available from multiple studies for external validation. We suggest multivariate meta-analysis for jointly synthesizing calibration and discrimination performance, while accounting for their correlation. The approach estimates a prediction model's average performance, the heterogeneity in performance across populations, and the probability of "good" performance in new populations. This allows different implementation strategies (e.g., recalibration) to be compared. Application is made to a diagnostic model for deep vein thrombosis (DVT) and a prognostic model for breast cancer mortality. In both examples, multivariate meta-analysis reveals that calibration performance is excellent on average but highly heterogeneous across populations unless the model's intercept (baseline hazard) is recalibrated. For the cancer model, the probability of "good" performance (defined by C statistic ≥0.7 and calibration slope between 0.9 and 1.1) in a new population was 0.67 with recalibration but 0.22 without recalibration. For the DVT model, even with recalibration, there was only a 0.03 probability of "good" performance. Multivariate meta-analysis can be used to externally validate a prediction model's calibration and discrimination performance across multiple populations and to evaluate different implementation strategies. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
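    The probability of "good" performance quoted above can be estimated from a multivariate meta-analysis by drawing from the joint predictive distribution of (C statistic, calibration slope) in a new population. A minimal Monte Carlo sketch; the mean vector and covariance below are illustrative values, not the paper's estimates:

```python
import numpy as np

def prob_good_performance(mu, cov, n_draws=200_000, seed=0):
    """Monte Carlo estimate of P(C >= 0.7 and 0.9 <= slope <= 1.1)
    under a bivariate normal predictive distribution for (C, slope)."""
    rng = np.random.default_rng(seed)
    draws = rng.multivariate_normal(mu, cov, size=n_draws)
    c, slope = draws[:, 0], draws[:, 1]
    good = (c >= 0.7) & (slope >= 0.9) & (slope <= 1.1)
    return good.mean()

# Illustrative predictive mean and (diagonal) covariance:
p = prob_good_performance(mu=[0.72, 1.0], cov=[[0.001, 0.0], [0.0, 0.004]])
```

    In practice the off-diagonal covariance term, which the multivariate approach estimates, is what distinguishes this joint probability from the product of two marginal probabilities.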

  16. Transition play in team performance of volleyball: a log-linear analysis.

    PubMed

    Eom, H J; Schutz, R W

    1992-09-01

    The purpose of this study was to develop and test a method to analyze and evaluate sequential skill performances in a team sport. An on-line computerized system was developed to record and summarize the sequential skill performances in volleyball. Seventy-two sample games from the third Federation of International Volleyball Cup men's competition were videotaped and grouped into two categories according to the final team standing and game outcome. Log-linear procedures were used to investigate the nature and degree of the relationship in the first-order (pass-to-set, set-to-spike) and second-order (pass-to-spike) transition plays. Results showed that there was a significant dependency in both the first-order and second-order transition plays, indicating that the outcome of a skill performance is highly influenced by the quality of a preceding skill performance. In addition, the pattern of the transition plays was stable and consistent, regardless of the classification status: Game Outcome, Team Standing, or Transition Process. The methodology and subsequent results provide valuable aids for a thorough understanding of the characteristics of transition plays in volleyball. In addition, the concept of sequential performance analysis may serve as an example for sport scientists in investigating probabilistic patterns of motor performance.
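    The dependency tests in such a log-linear analysis rest on comparing observed transition counts with the counts expected under independence. A minimal sketch of the likelihood-ratio statistic G² for a pass-to-set transition table; the counts are invented for illustration, not taken from the study:

```python
import numpy as np

def g_squared(table):
    """Likelihood-ratio statistic G^2 for independence in a two-way table."""
    table = np.asarray(table, dtype=float)
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
    mask = table > 0                          # 0 * log(0) contributes nothing
    return 2.0 * np.sum(table[mask] * np.log(table[mask] / expected[mask]))

# Hypothetical pass quality (rows) vs set quality (cols):
counts = [[40, 10, 5], [15, 30, 10], [5, 12, 28]]
g2 = g_squared(counts)
```

    A large G² relative to the chi-square critical value (here with 4 degrees of freedom) indicates the dependency between successive skill performances that the study reports.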

  17. Performance analysis of underwater pump for water-air dual-use engine

    NASA Astrophysics Data System (ADS)

    Xia, Jun; Wang, Yun; Chen, Yu

    2017-10-01

    For a water-air dual-use engine to work both in air and under water, its compressor must not only meet the requirements of atmospheric flight but also be able to operate underwater. To verify the compressor's performance when the engine is in underwater propulsion mode, the underwater pumping behavior of the air compressor was simulated with commercial CFD software and a flow field analysis was carried out. The results show that a conventional air compressor has some capability to work in a water environment; however, the blades strongly influence the flow, and the compressor structure also affects pump performance. The compressor can thus provisionally serve both water and air modes, but to obtain better performance its structure needs further improvement and optimization.

  18. Analysis of integrated healthcare networks' performance: a contingency-strategic management perspective.

    PubMed

    Lin, B Y; Wan, T T

    1999-12-01

    Few empirical analyses have been done in organizational research on integrated healthcare networks (IHNs), also called integrated healthcare delivery systems. Using a contingency-derived context-process-performance model, this study explores the relationships among an IHN's strategic direction, structural design, and performance. A cross-sectional analysis of 100 IHNs suggests that certain contextual factors, such as market competition, network age, and tax status, have statistically significant effects on the implementation of an IHN's service differentiation strategy, which addresses coordination and control in the market. An IHN's service differentiation strategy is positively related to its integrated structural design, characterized by the integration of administration, patient care, and information systems across different settings. However, no evidence supports the notion that an integrated structural design benefits an IHN's performance in terms of clinical efficiency and financial viability.

  19. Consequences of sludge composition on combustion performance derived from thermogravimetry analysis.

    PubMed

    Li, Meiyan; Xiao, Benyi; Wang, Xu; Liu, Junxin

    2015-01-01

    Wastewater treatment plants produce millions of tons of sewage sludge. Sewage sludge is recognized as a promising feedstock for power generation via combustion and can be used for energy crisis adaptation. We aimed to investigate the quantitative effects of various sludge characteristics on the overall sludge combustion process performance. Different types of sewage sludge were derived from numerous wastewater treatment plants in Beijing for further thermogravimetric analysis. Thermogravimetric-differential thermogravimetric curves were used to compare the performance of the studied samples. Proximate analytical data, organic compositions, elementary composition, and calorific value of the samples were determined. The relationship between combustion performance and sludge composition was also investigated. Results showed that the performance of sludge combustion was significantly affected by the concentration of protein, which is the main component of volatiles. Carbohydrates and lipids were not correlated with combustion performance, unlike protein. Overall, combustion performance varied with different sludge organic composition. The combustion rate of carbohydrates was higher than those of protein and lipid, and carbohydrate weight loss mainly occurred during the second stage (175-300°C). Carbohydrates have a substantial effect on the rate of system combustion during the second stage considering the specific combustion feature. Additionally, the combustion performance of digested sewage sludge is more negative than that of the others. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Robust, affordable, semi-direct Mars mission

    NASA Astrophysics Data System (ADS)

    Salotti, Jean-Marc

    2016-10-01

    A new architecture is proposed for the first manned Mars mission, based on current NASA developments (SLS and Orion), chemical propulsion for interplanetary transit, aerocapture for all vehicles, a split strategy, and a long stay on the surface. Two important choices make this architecture affordable and appropriate for the first mission. The first is splitting the Earth return vehicle into two parts that are launched separately and dock in Mars orbit. This is necessary to make aerocapture feasible and efficient, which considerably reduces mass. The second is reducing the crew to 3 astronauts. This simplifies the mission and reduces the SLS payload mass under the 45-metric ton limit for a direct TMI (trans-Mars injection) burn without LEO assembly. Only 4 SLS launches are required. The first takes the Mars ascent vehicle and in situ resource utilization systems to the planet's surface. The second takes the first part of the Earth return vehicle, the habitat, into Mars orbit. Two years later, two further SLS launches take a dual-use habitat (outbound trip and surface), Orion, and an enhanced service module to LEO, and then into Mars orbit, followed by the landing of the habitat on the surface. Transit time is demonstrated to be easily reduced to less than 6 months, with relatively low impact on propellant mass and none at all on the architecture.

  1. Analysis of satellite multibeam antennas’ performances

    NASA Astrophysics Data System (ADS)

    Sterbini, Guido

    2006-07-01

    In this work, we discuss the application of the frequency-reuse concept in satellite communications, stressing the importance of a design-oriented mathematical model as a first step in dimensioning antenna systems. We consider multibeam reflector antennas. The first part of the work reorganizes, unifies, and completes the models already developed in the scientific literature, adopting a multidimensional Taylor-expansion formalism. For computing the spillover efficiency of the antenna, we consider different feed illuminations and propose a completely original mathematical model obtained by interpolating simulator results. The second part of the work characterizes the secondary far-field pattern. Combining this model with information on the cellular coverage geometry makes it possible to evaluate the isolation and the minimum directivity over a cell. In the third part, to test the model and its analysis and synthesis capabilities, we implement a software tool that helps the designer rapidly tune the fundamental quantities for performance optimization; the proposed model shows excellent agreement with simulation results.

  2. A performance analysis of DS-CDMA and SCPC VSAT networks

    NASA Technical Reports Server (NTRS)

    Hayes, David P.; Ha, Tri T.

    1990-01-01

    Spread-spectrum and single-channel-per-carrier (SCPC) transmission techniques work well in very small aperture terminal (VSAT) networks for multiple-access purposes while allowing the earth station antennas to remain small. Direct-sequence code-division multiple-access (DS-CDMA) is the simplest spread-spectrum technique to use in a VSAT network since a frequency synthesizer is not required for each terminal. An examination is made of the DS-CDMA and SCPC Ku-band VSAT satellite systems for low-density (64-kb/s or less) communications. A method for improving the standard link analysis of DS-CDMA satellite-switched networks by including certain losses is developed. The performance of 50-channel full mesh and star network architectures is analyzed. The selection of operating conditions producing optimum performance is demonstrated.
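    One standard ingredient of such a DS-CDMA link analysis is the Gaussian approximation of bit error rate under multiple-access interference, BER ≈ Q(1 / sqrt((K-1)/(3N) + N0/(2Eb))) for K simultaneous users and processing gain N. A sketch under that textbook approximation; the parameter values are illustrative, not the paper's link budget:

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def ds_cdma_ber(K, N, eb_n0_db):
    """Standard Gaussian approximation of DS-CDMA BER: K users,
    processing gain N, per-user Eb/N0 given in dB."""
    eb_n0 = 10.0 ** (eb_n0_db / 10.0)
    sinr_inv = (K - 1) / (3.0 * N) + 1.0 / (2.0 * eb_n0)
    return q_func(1.0 / math.sqrt(sinr_inv))

# Multiple-access interference degrades BER as users are added:
ber_1user = ds_cdma_ber(K=1, N=127, eb_n0_db=7.0)
ber_50users = ds_cdma_ber(K=50, N=127, eb_n0_db=7.0)
```

    The 50-user case mirrors the 50-channel networks analyzed in the paper: even at a fixed per-user Eb/N0, self-interference dominates the error rate as the network fills.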

  3. Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis

    PubMed Central

    Tamayo, Alain; Granell, Carlos; Huerta, Joaquín

    2012-01-01

    Although smartphone applications represent the most typical data consumer tool from the citizen perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to limited bandwidth and processing capabilities on mobile phones. In this article we present a performance analysis about the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.

  4. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-03-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.
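    The Latin hypercube sampling step can be sketched directly: each parameter's range is split into n equal-probability strata and exactly one sample is drawn per stratum, with strata paired across dimensions by independent random permutations. A minimal NumPy version; the two parameter ranges below are invented for illustration and are not the SCTEG model's inputs:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=0):
    """Latin hypercube sample: one point per equal-probability stratum
    in every dimension, strata paired by independent permutations."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    strata = np.tile(np.arange(n_samples), (d, 1))      # stratum indices per dim
    u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, d))) / n_samples
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Hypothetical ranges: solar flux (W/m^2) and ambient temperature (K)
samples = latin_hypercube(100, bounds=[(600.0, 1000.0), (280.0, 320.0)])
```

    Each of the 100 equal-width strata in each dimension then contains exactly one sample, which is what gives LHS better space-filling than plain random sampling for building the regression model.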

  5. The pragmatics of "madness": performance analysis of a Bangladeshi woman's "aberrant" lament.

    PubMed

    Wilce, J M

    1998-03-01

    A fine-grained analysis of the transcript of a Bangladeshi woman's lament is used to argue for an anthropology of "madness" that attends closely to performance and performativity. The emergent, interactive production of wept speech, together with the conflicting use to which it is put by the performer and her relatives, is linked problematically to performance genres and to ethnopsychiatric indexes of madness. Tuneful weeping is taken by relatives to be performative of madness, in a sense something like Austin's. Yet, exploration of the divergent linguistic ideologies which are brought to bear on the lament not only enables more nuanced ethnographic treatment but also has reflexive ramifications for medical and psychological anthropology. This leads to a critique of the referentialism in our own treatment of language. The role played by transparent reference is overshadowed by indexicality and by dialogical processes of proposing and resisting labels for speech genres attributed to the "mad."

  6. Consequences of sludge composition on combustion performance derived from thermogravimetry analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Meiyan; Xiao, Benyi; Wang, Xu

    Highlights: • Volatiles, particularly proteins, play a key role in sludge combustion. • Sludge combustion performance varies with different sludge organic concentrations. • Carbohydrates significantly affect the combustion rate in the second stage. • Combustion performance of digested sludge is more negative compared with others. - Abstract: Wastewater treatment plants produce millions of tons of sewage sludge. Sewage sludge is recognized as a promising feedstock for power generation via combustion and can be used for energy crisis adaptation. We aimed to investigate the quantitative effects of various sludge characteristics on the overall sludge combustion process performance. Different types of sewage sludge were derived from numerous wastewater treatment plants in Beijing for further thermogravimetric analysis. Thermogravimetric–differential thermogravimetric curves were used to compare the performance of the studied samples. Proximate analytical data, organic compositions, elementary composition, and calorific value of the samples were determined. The relationship between combustion performance and sludge composition was also investigated. Results showed that the performance of sludge combustion was significantly affected by the concentration of protein, which is the main component of volatiles. Carbohydrates and lipids were not correlated with combustion performance, unlike protein. Overall, combustion performance varied with different sludge organic composition. The combustion rate of carbohydrates was higher than those of protein and lipid, and carbohydrate weight loss mainly occurred during the second stage (175–300 °C). Carbohydrates have a substantial effect on the rate of system combustion during the second stage considering the specific combustion feature. Additionally, the combustion performance of digested sewage sludge is more negative than that of the others.

  7. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Lisbeth A.

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS) as prescribed in DOE Order 232.2 “Occurrence Reporting and Processing of Operations Information” requires a quarterly analysis of events, both reportable and not reportable for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not reportable events) identified at the Idaho National Laboratory (INL) during the period of October 2012 through September 2013.

  8. Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1993-01-01

    The period from January 1993 through August 1993 is covered. The primary tasks during this period were the development of a single and multi-vibrational temperature preferential vibration-dissociation coupling model, the development of a normal shock nonequilibrium radiation-gasdynamic coupling model based upon the blunt body model, and the comparison of results obtained with these models with experimental data. In addition, an extensive series of computations were conducted using the blunt body model to develop a set of reference results covering a wide range of vehicle sizes, altitudes, and entry velocities.

  9. Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1990-01-01

    The continued development and improvement of the viscous shock layer (VSL) nonequilibrium chemistry blunt body engineering code, the incorporation in a coupled manner of radiation models into the VSL code, and the initial development of appropriate precursor models are presented.

  10. Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1990-01-01

    The primary tasks during January 1990 to June 1990 have been the development and evaluation of various electron and electron-electronic energy equation models, the continued development of improved nonequilibrium radiation models for molecules and atoms, and the continued development and investigation of precursor models and their effects. In addition, work was initiated to develop a vibrational model for the viscous shock layer (VSL) nonequilibrium chemistry blunt body engineering code. Also, an effort was started associated with the effects of including carbon species, say from an ablator, in the flowfield.

  11. Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1994-01-01

    The primary accomplishments of the project were as follows: (1) From an overall standpoint, the primary accomplishment of this research was the development of a complete gasdynamic-radiatively coupled nonequilibrium viscous shock layer solution method for axisymmetric blunt bodies. This method can be used for rapid engineering modeling of nonequilibrium re-entry flowfields over a wide range of conditions. (2) Another significant accomplishment was the development of an air radiation model that included local thermodynamic nonequilibrium (LTNE) phenomena. (3) As part of this research, three electron-electronic energy models were developed. The first was a quasi-equilibrium electron (QEE) model which determined an effective free electron temperature and assumed that the electronic states were in equilibrium with the free electrons. The second was a quasi-equilibrium electron-electronic (QEEE) model which computed an effective electron-electronic temperature. The third model was a full electron-electronic (FEE) differential equation model which included convective, collisional, viscous, conductive, vibrational coupling, and chemical effects on electron-electronic energy. (4) Since vibration-dissociation coupling phenomena as well as vibrational thermal nonequilibrium phenomena are important in the nonequilibrium zone behind a shock front, a vibrational energy and vibration-dissociation coupling model was developed and included in the flowfield model. This model was a modified coupled vibrational dissociation vibrational (MCVDV) model and also included electron-vibrational coupling. (5) Another accomplishment of the project was the usage of the developed models to investigate radiative heating. (6) A multi-component diffusion model which properly models the multi-component nature of diffusion in complex gas mixtures such as air, was developed and incorporated into the blunt body model. 
(7) A model was developed to predict the magnitude and characteristics of the shock wave precursor ahead of vehicles entering the Earth's atmosphere. (8) Since considerable data exists for radiating nonequilibrium flow behind normal shock waves, a normal shock wave version of the blunt body code was developed. (9) By comparing predictions from the models and codes with available normal shock data and the flight data of Fire II, it is believed that the developed flowfield and nonequilibrium radiation models have been essentially validated for engineering applications.

  12. Analysis of the Influence of Quantile Regression Model on Mainland Tourists' Service Satisfaction Performance

    PubMed Central

    Wang, Wen-Cheng; Cho, Wen-Chien; Chen, Yin-Jen

    2014-01-01

    It is estimated that mainland Chinese tourists travelling to Taiwan can bring annual revenues of 400 billion NTD to the Taiwan economy. Thus, how the Taiwanese Government formulates relevant measures to satisfy both sides is the focus of most concern. Taiwan must improve the facilities and service quality of its tourism industry so as to attract more mainland tourists. This paper conducted a questionnaire survey of mainland tourists and used grey relational analysis in grey mathematics to analyze the satisfaction performance of all satisfaction question items. The first eight satisfaction items were used as independent variables, and the overall satisfaction performance was used as a dependent variable for quantile regression model analysis to discuss the relationship between the dependent variable under different quantiles and independent variables. Finally, this study further discussed the predictive accuracy of the least mean regression model and each quantile regression model, as a reference for research personnel. The analysis results showed that other variables could also affect the overall satisfaction performance of mainland tourists, in addition to occupation and age. The overall predictive accuracy of quantile regression model Q0.25 was higher than that of the other three models. PMID:24574916
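    The grey relational analysis step used on the satisfaction items can be sketched as follows: normalize each item, measure each respondent's deviation from an ideal reference series, and average the grey relational coefficients into a grade per respondent. The scores below are invented for illustration, not the survey data:

```python
import numpy as np

def grey_relational_grades(X, ref, rho=0.5):
    """Grey relational grade of each alternative (row of X)
    against a reference series, with distinguishing coefficient rho."""
    X = np.asarray(X, dtype=float)
    ref = np.asarray(ref, dtype=float)
    allv = np.vstack([X, ref])
    lo, hi = allv.min(axis=0), allv.max(axis=0)
    Xn = (X - lo) / (hi - lo)                 # larger-is-better normalization
    refn = (ref - lo) / (hi - lo)
    delta = np.abs(Xn - refn)                 # deviation from the reference
    coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coef.mean(axis=1)                  # grade = mean coefficient per row

# Hypothetical satisfaction scores (rows: respondents, cols: items);
# the reference series is the ideal maximum score on every item.
scores = np.array([[4, 5, 3], [5, 5, 5], [2, 3, 4]])
grades = grey_relational_grades(scores, ref=[5, 5, 5])
```

    A respondent matching the ideal series on every item receives grade 1; lower grades reflect larger deviations, which is how the item-level satisfaction performance is ranked before the quantile regression step.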

  13. Analysis of the influence of quantile regression model on mainland tourists' service satisfaction performance.

    PubMed

    Wang, Wen-Cheng; Cho, Wen-Chien; Chen, Yin-Jen

    2014-01-01

    It is estimated that mainland Chinese tourists travelling to Taiwan can bring annual revenues of 400 billion NTD to the Taiwan economy. Thus, how the Taiwanese Government formulates relevant measures to satisfy both sides is the focus of most concern. Taiwan must improve the facilities and service quality of its tourism industry so as to attract more mainland tourists. This paper conducted a questionnaire survey of mainland tourists and used grey relational analysis in grey mathematics to analyze the satisfaction performance of all satisfaction question items. The first eight satisfaction items were used as independent variables, and the overall satisfaction performance was used as a dependent variable for quantile regression model analysis to discuss the relationship between the dependent variable under different quantiles and independent variables. Finally, this study further discussed the predictive accuracy of the least mean regression model and each quantile regression model, as a reference for research personnel. The analysis results showed that other variables could also affect the overall satisfaction performance of mainland tourists, in addition to occupation and age. The overall predictive accuracy of quantile regression model Q0.25 was higher than that of the other three models.

  14. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

    An open-cycle gas turbine numerical modeling code suitable for thermodynamic performance analysis (i.e., thermal efficiency, specific fuel consumption, cycle state points, working fluid flow rates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. The code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes a weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.
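    The code itself is not listed in the record, but the kind of cycle-level quantity it computes can be illustrated with the ideal open Brayton cycle, whose thermal efficiency depends only on the pressure ratio r and the specific-heat ratio gamma: eta = 1 - r^((1-gamma)/gamma). A sketch of this textbook relation, not an excerpt from the NASA code:

```python
def brayton_thermal_efficiency(pressure_ratio, gamma=1.4):
    """Ideal (isentropic, no losses) open Brayton cycle thermal efficiency."""
    return 1.0 - pressure_ratio ** ((1.0 - gamma) / gamma)

# A pressure ratio of 8 with air (gamma = 1.4) gives roughly 45% ideal efficiency;
# a real engine's efficiency is lower once component losses are modeled.
eta = brayton_thermal_efficiency(8.0)
```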

  15. Portable Life Support Subsystem Thermal Hydraulic Performance Analysis

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce; Pinckney, John; Conger, Bruce

    2010-01-01

    This paper presents the current state of the thermal hydraulic modeling efforts being conducted for the Constellation Space Suit Element (CSSE) Portable Life Support Subsystem (PLSS). The goal of these efforts is to provide realistic simulations of the PLSS under various modes of operation. The PLSS thermal hydraulic model simulates the thermal, pressure, flow characteristics, and human thermal comfort related to the PLSS performance. This paper presents modeling approaches and assumptions as well as component model descriptions. Results from the models are presented that show PLSS operations at steady-state and transient conditions. Finally, conclusions and recommendations are offered that summarize results, identify PLSS design weaknesses uncovered during review of the analysis results, and propose areas for improvement to increase model fidelity and accuracy.

  16. A theoretical analysis of vacuum arc thruster performance

    NASA Technical Reports Server (NTRS)

    Polk, James E.; Sekerak, Mike; Ziemer, John K.; Schein, Jochen; Qi, Niansheng; Binder, Robert; Anders, Andre

    2001-01-01

    In vacuum arc discharges the current is conducted through vapor evaporated from the cathode surface. In these devices very dense, highly ionized plasmas can be created from any metallic or conducting solid used as the cathode. This paper describes theoretical models of performance for several thruster configurations which use vacuum arc plasma sources. This analysis suggests that thrusters using vacuum arc sources can be operated efficiently with a range of propellant options that gives great flexibility in specific impulse. In addition, the efficiency of plasma production in these devices appears to be largely independent of scale because the metal vapor is ionized within a few microns of the cathode electron emission sites, so this approach is well-suited for micropropulsion.

  17. High-performance equation solvers and their impact on finite element analysis

    NASA Technical Reports Server (NTRS)

    Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. Dale, Jr.

    1990-01-01

    The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.
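    The diagonally preconditioned conjugate gradient solver described above can be sketched in a few lines; the Jacobi preconditioner is simply the inverse of the matrix diagonal, which is why it can use a plain diagonal storage scheme. This is an illustrative dense-matrix implementation, not the vectorized production solver the paper describes:

```python
import numpy as np

def pcg_jacobi(A, b, tol=1e-10, max_iter=1000):
    """Conjugate gradient with a diagonal (Jacobi) preconditioner
    for a symmetric positive-definite system A x = b."""
    M_inv = 1.0 / np.diag(A)          # preconditioner: inverse of diag(A)
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r                     # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p     # update search direction
        rz = rz_new
    return x

# Small SPD test system standing in for an assembled stiffness matrix:
A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x = pcg_jacobi(A, b)
```

    A sparse incomplete-factorization preconditioner, like the paper's second option, follows the same loop but replaces the elementwise `M_inv * r` step with a sparse triangular solve, trading cost per iteration for fewer iterations.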

  18. High-performance equation solvers and their impact on finite element analysis

    NASA Technical Reports Server (NTRS)

    Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. D., Jr.

    1992-01-01

    The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.

  19. A Review and Analysis of Performance Appraisal Processes, Volume III. Performance Appraisal for Professional Service Employees: Non-Technical Report. Professionalism in Schools Series.

    ERIC Educational Resources Information Center

    Ondrack, D. A.; Oliver, C.

    The third of three volumes, this report summarizes the findings of, first, a review and analysis of published literature on performance appraisal in general and particularly on the use of appraisals in public education systems, and, second, a series of field-site investigations of performance appraisal systems in action. The field site studies of…

  20. Performance and analysis of MAC protocols based on application

    NASA Astrophysics Data System (ADS)

    Yadav, Ravi; Daniel, A. K.

    2018-04-01

    Wireless Sensor Networks (WSNs) are one of the most rapidly emerging technologies of recent decades, covering a large application area spanning civilian and military uses. A WSN primarily consists of low-power, low-cost, multifunctional sensor nodes that collaborate and communicate via a wireless medium. Because sensor nodes are deployed in an ad hoc fashion, they must organize themselves to communicate with one another. These characteristics make WSNs a challenging research area. This paper gives an overview of the characteristics of WSNs, their architecture, and contention-based MAC protocols, and presents an analysis of various MAC protocols based on application performance.

  1. Performance analysis of jump-gliding locomotion for miniature robotics.

    PubMed

    Vidyasagar, A; Zufferey, Jean-Christophe; Floreano, Dario; Kovač, M

    2015-03-26

    Recent work suggests that jumping locomotion in combination with a gliding phase can be used as an effective mobility principle in robotics. Compared to pure jumping without a gliding phase, the potential benefits of hybrid jump-gliding locomotion include the ability to extend the distance travelled and to reduce the potentially damaging impact forces upon landing. This publication evaluates the performance of jump-gliding locomotion and provides models for the analysis of the relevant dynamics of flight. It also defines a jump-gliding envelope that encompasses the range that can be achieved with jump-gliding robots and that can be used to evaluate the performance and improvement potential of jump-gliding robots. We present first a planar dynamic model and then a simplified closed-form model, which allow for quantification of the distance travelled and the impact energy on landing. To validate the predictions of these models, we performed experiments with a novel jump-gliding robot, named the 'EPFL jump-glider'. It has a mass of 16.5 g and is able to perform jumps from elevated positions, perform steered gliding flight, land safely and traverse on the ground by repetitive jumping. The experiments indicate that the developed jump-gliding model fits the measured flight data of the EPFL jump-glider very well, confirming the benefits of jump-gliding locomotion for mobile robotics. The jump-gliding envelope considerations indicate that the EPFL jump-glider, when traversing from a 2 m height, reaches 74.3% of the optimal jump-gliding distance, whereas pure jumping without a gliding phase reaches only 33.4% of the optimal jump-gliding distance. Methods of further improving flight performance based on the models and on inspiration from biological systems are presented, providing mechanical design pathways for future jump-gliding robot designs.
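The jump-versus-jump-glide trade-off can be illustrated with a much simpler point-mass model than the planar dynamics used in the paper: a ballistic jump phase followed by a constant-glide-ratio descent from the apex. All parameter values below are hypothetical and the model is only a sketch of the idea, not the authors' closed-form model:

```python
import math

def ballistic_range(v, theta, h, g=9.81):
    """Horizontal distance of a pure jump launched at speed v (m/s),
    angle theta (rad), from elevation h (m) above the landing plane."""
    vx = v * math.cos(theta)
    vy = v * math.sin(theta)
    # Time to reach the ground: solve h + vy*t - g*t^2/2 = 0 for t > 0.
    t = (vy + math.sqrt(vy * vy + 2.0 * g * h)) / g
    return vx * t

def jump_glide_range(v, theta, h, glide_ratio, g=9.81):
    """Jump ballistically to the apex, then glide down to the ground at
    a constant glide (lift-to-drag) ratio."""
    vx = v * math.cos(theta)
    vy = v * math.sin(theta)
    t_apex = vy / g
    x_apex = vx * t_apex                 # horizontal travel to apex
    h_apex = h + vy * vy / (2.0 * g)     # apex height above ground
    return x_apex + glide_ratio * h_apex
```

Even this crude model reproduces the qualitative result: with any useful glide ratio, gliding after the apex beats falling ballistically from it.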

  2. Driver performance measurement and analysis system (DPMAS). Volume 1, Description and operations manual

    DOT National Transportation Integrated Search

    1976-08-01

    A prototype driver performance measurement and analysis system (DPMAS) has been developed for the National Highway Traffic Safety Administration (NHTSA). This system includes a completely instrumented 1974 Chevrolet Impala capable of digitally record...

  3. Experimental Analysis of Small-Group Performance Effectiveness: Behavioral and Biological Interactions.

    DTIC Science & Technology

    1982-04-01

    processes requiring systematic experimental analysis. Accordingly, group performance effectiveness studies were initiated to assess the effects on...the experiment. active processes associated with joining the respective established groups, but the absence of baseline levels precludes such an...novitiate in comparison to such values observed during baseline days suggested an active process associated with the joining of the group and emphasized the

  4. Performance analysis of Supply Chain Management with the Supply Chain Operation Reference model

    NASA Astrophysics Data System (ADS)

    Hasibuan, Abdurrozzaq; Arfah, Mahrani; Parinduri, Luthfi; Hernawati, Tri; Suliawati; Harahap, Bonar; Rahmah Sibuea, Siti; Krianto Sulaiman, Oris; purwadi, Adi

    2018-04-01

    This research was conducted at PT. Shamrock Manufacturing Corpora, a company that must think creatively to implement its competition strategy by producing goods and services of higher quality at lower cost. It is therefore necessary to measure Supply Chain Management performance in order to improve competitiveness, and the company is required to optimize its production output to meet export quality standards. The research begins with the creation of initial dimensions based on the Supply Chain Management processes (Plan, Source, Make, Delivery, and Return), with a hierarchy based on the Supply Chain Operation Reference attributes of Reliability, Responsiveness, Agility, Cost, and Asset. Key Performance Indicator identification provides the benchmark for performance measurement, while Snorm De Boer normalization serves to put Key Performance Indicator values on a common scale. An Analytical Hierarchy Process is used to help determine priority criteria. Measurement of Supply Chain Management performance at PT. Shamrock Manufacturing Corpora shows that Responsiveness (0.649) has a higher weight (priority) than the other alternatives. The performance analysis using the Supply Chain Operation Reference model indicates that Supply Chain Management performance at PT. Shamrock Manufacturing Corpora is good, since its monitored score falls within the 50-100 range considered good.
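The Snorm De Boer normalization mentioned above maps each raw KPI score onto a common 0-100 scale so that heterogeneous indicators can be aggregated. A minimal sketch, assuming the standard larger-is-better / smaller-is-better formula pair (the study's exact variant is not given in the record):

```python
def snorm(score, s_min, s_max, larger_is_better=True):
    """Snorm (De Boer) normalization: map a raw KPI score onto 0-100.
    For larger-is-better KPIs a high raw score maps near 100; for
    smaller-is-better KPIs (e.g. cost) the scale is inverted."""
    if larger_is_better:
        return 100.0 * (score - s_min) / (s_max - s_min)
    return 100.0 * (s_max - score) / (s_max - s_min)
```

On this scale, a normalized value in the 50-100 band corresponds to the "good" monitoring range cited in the abstract.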

  5. Human performance consequences of stages and levels of automation: an integrated meta-analysis.

    PubMed

    Onnasch, Linda; Wickens, Christopher D; Li, Huiyang; Manzey, Dietrich

    2014-05-01

    We investigated how automation-induced human performance consequences depended on the degree of automation (DOA). Function allocation between human and automation can be represented in terms of the stages and levels taxonomy proposed by Parasuraman, Sheridan, and Wickens. Higher DOAs are achieved both by later stages and higher levels within stages. A meta-analysis based on data of 18 experiments examines the mediating effects of DOA on routine system performance, performance when the automation fails, workload, and situation awareness (SA). The effects of DOA on these measures are summarized by level of statistical significance. We found (a) a clear automation benefit for routine system performance with increasing DOA, (b) a similar but weaker pattern for workload when automation functioned properly, and (c) a negative impact of higher DOA on failure system performance and SA. Most interesting was the finding that negative consequences of automation seem to be most likely when DOA moved across a critical boundary, which was identified between automation supporting information analysis and automation supporting action selection. Results support the proposed cost-benefit trade-off with regard to DOA. It seems that routine performance and workload on one hand, and the potential loss of SA and manual skills on the other hand, directly trade off and that appropriate function allocation can serve only one of the two aspects. Findings contribute to the body of research on adequate function allocation by providing an overall picture through quantitatively combining data from a variety of studies across varying domains.

  6. How motivation affects academic performance: a structural equation modelling analysis.

    PubMed

    Kusurkar, R A; Ten Cate, Th J; Vos, C M P; Westers, P; Croiset, G

    2013-03-01

    Few studies in medical education have studied the effect of quality of motivation on performance. Self-Determination Theory, which is based on the quality of motivation, differentiates between Autonomous Motivation (AM), which originates within an individual, and Controlled Motivation (CM), which originates from external sources. The aims were to determine whether Relative Autonomous Motivation (RAM, a measure of the balance between AM and CM) affects academic performance through good study strategy and higher study effort, and to compare this model between subgroups: males and females, and students selected via two different systems, namely qualitative and weighted lottery selection. Data on motivation, study strategy, and effort were collected from 383 medical students of VU University Medical Center Amsterdam, and their academic performance results were obtained from the student administration. The Structural Equation Modelling analysis technique was used to test a hypothesized model in which high RAM would positively affect Good Study Strategy (GSS) and study effort, which in turn would positively affect academic performance in the form of grade point averages. This model fit the data well (Chi square = 1.095, df = 3, p = 0.778, RMSEA = 0.000), and it also fit well for all tested subgroups of students. Differences were found in the strength of the relationships between the variables for the different subgroups, as expected. In conclusion, RAM positively correlated with academic performance through a deep strategy towards study and higher study effort. This model appears valid in medical education for subgroups such as males, females, and students selected by qualitative and weighted lottery selection.

  7. Integration of Pharmacy Practice and Pharmaceutical Analysis: Quality Assessment of Laboratory Performance.

    ERIC Educational Resources Information Center

    McGill, Julian E.; Holly, Deborah R.

    1996-01-01

    Laboratory portions of courses in pharmacy practice and pharmaceutical analysis at the Medical University of South Carolina are integrated and coordinated to provide feedback on student performance in compounding medications. Students analyze the products they prepare, with early exposure to compendia requirements and other references. Student…

  8. Pavement marking performance analysis

    DOT National Transportation Integrated Search

    2009-06-30

    This research evaluated pavement marking performance and developed useful degradation models for thermoplastic : and paint pavement markings which can help North Carolina meet the pending FHWA minimum retroreflectivity : requirements. The impacts of ...

  9. Managing in-hospital quality improvement: An importance-performance analysis to set priorities for ST-elevation myocardial infarction care.

    PubMed

    Aeyels, Daan; Seys, Deborah; Sinnaeve, Peter R; Claeys, Marc J; Gevaert, Sofie; Schoors, Danny; Sermeus, Walter; Panella, Massimiliano; Bruyneel, Luk; Vanhaecht, Kris

    2018-02-01

    A focus on specific priorities increases the success rate of quality improvement efforts for broad and complex care processes. Importance-performance analysis presents a possible approach to set priorities around which to design and implement effective quality improvement initiatives. Persistent variation in hospital performance makes ST-elevation myocardial infarction care relevant to consider for importance-performance analysis. The purpose of this study was to identify quality improvement priorities in ST-elevation myocardial infarction care. Importance and performance levels of ST-elevation myocardial infarction key interventions were combined in an importance-performance analysis. Content validity indexes on 23 ST-elevation myocardial infarction key interventions of a multidisciplinary RAND Delphi Survey defined importance levels. Structured review of 300 patient records in 15 acute hospitals determined performance levels. The significance of between-hospital variation was determined by a Kruskal-Wallis test. A performance heat-map allowed for hospital-specific priority setting. Seven key interventions were each rated as an overall improvement priority. Priority key interventions related to risk assessment, timely reperfusion by percutaneous coronary intervention, and secondary prevention. Between-hospital performance varied significantly for the majority of key interventions. The type and number of priorities varied strongly across hospitals. Guideline adherence in ST-elevation myocardial infarction care is low and improvement priorities vary between hospitals. Importance-performance analysis helps clinicians and management in demarcation of the nature, number and order of improvement priorities. By offering a tailored improvement focus, this methodology makes improvement efforts more specific and achievable.
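Importance-performance analysis places each intervention in a two-by-two grid defined by importance and performance cut-offs; items that are important but under-performed are the improvement priorities. The quadrant labels below follow the classic IPA terminology; the cut-off values are the analyst's choice and the study's actual thresholds are not given here:

```python
def ipa_quadrant(importance, performance, imp_cut, perf_cut):
    """Classify one intervention into the classic importance-performance
    grid. 'Concentrate here' items are the improvement priorities."""
    if importance >= imp_cut and performance < perf_cut:
        return "concentrate here"       # high importance, low performance
    if importance >= imp_cut:
        return "keep up the good work"  # high importance, high performance
    if performance < perf_cut:
        return "low priority"           # low importance, low performance
    return "possible overkill"          # low importance, high performance
```

Running this over each hospital's per-intervention scores yields exactly the kind of hospital-specific priority heat-map described in the abstract.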

  10. Statistical model to perform error analysis of curve fits of wind tunnel test data using the techniques of analysis of variance and regression analysis

    NASA Technical Reports Server (NTRS)

    Alston, D. W.

    1981-01-01

    The objective of this research was to design a statistical model that could perform an error analysis of curve fits of wind tunnel test data using analysis of variance and regression analysis techniques. Four related subproblems were defined, and by solving each of these a solution to the general research problem was obtained. The capabilities of the resulting statistical model are considered. A least squares fit is used to determine the nature of the force, moment, and pressure data. The order of the curve fit is increased in order to remove the quadratic effect in the residuals. The analysis of variance is used to determine the magnitude and effect of the error factor associated with the experimental data.
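The core of such an analysis is an ordinary least-squares fit followed by inspection of the residuals. A minimal sketch for the linear case is below; the study's step of raising the polynomial order until the quadratic trend leaves the residuals is omitted, and the fit formulas are the standard OLS ones rather than anything specific to this report:

```python
def fit_line(xs, ys):
    """Ordinary least-squares line fit; returns (slope, intercept)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

def residuals(xs, ys, slope, intercept):
    """Observed minus fitted values; systematic structure here (e.g. a
    quadratic trend) signals that the fit order is too low."""
    return [y - (slope * x + intercept) for x, y in zip(xs, ys)]
```

An analysis of variance would then partition the residual sum of squares into lack-of-fit and pure-error components to quantify the error factor.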

  11. Network analysis of patient flow in two UK acute care hospitals identifies key sub-networks for A&E performance

    PubMed Central

    Stringer, Clive; Beeknoo, Neeraj

    2017-01-01

    The topology of the patient flow network in a hospital is complex, comprising hundreds of overlapping patient journeys, and is a determinant of operational efficiency. To understand the network architecture of patient flow, we performed a data-driven network analysis of patient flow through two acute hospital sites of King’s College Hospital NHS Foundation Trust. Administration databases were queried for all intra-hospital patient transfers in an 18-month period and modelled as a dynamic weighted directed graph. A ‘core’ subnetwork containing only 13–17% of all edges channelled 83–90% of the patient flow, while an ‘ephemeral’ network constituted the remainder. Unsupervised cluster analysis and differential network analysis identified sub-networks where traffic is most associated with A&E performance. Increased flow to clinical decision units was associated with the best A&E performance in both sites. The component analysis also detected a weekend effect on patient transfers which was not associated with performance. We have performed the first data-driven hypothesis-free analysis of patient flow which can enhance understanding of whole healthcare systems. Such analysis can drive transformation in healthcare as it has in industries such as manufacturing. PMID:28968472
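The "core" sub-network finding can be reproduced in miniature: rank transfer edges by traffic and keep the smallest set that carries a target fraction of total patient flow. A sketch with hypothetical ward names (the hospital's actual edge-selection method is not detailed in the abstract):

```python
def core_subnetwork(edges, flow_fraction=0.85):
    """Given weighted directed edges {(src, dst): n_transfers}, return the
    smallest set of highest-traffic edges whose combined weight reaches
    at least the given fraction of total flow."""
    total = sum(edges.values())
    core, carried = [], 0
    for edge, weight in sorted(edges.items(), key=lambda kv: -kv[1]):
        core.append(edge)
        carried += weight
        if carried >= flow_fraction * total:
            break
    return core
```

In the paper's data, such a core contained only 13-17% of edges yet carried 83-90% of the flow, which is the kind of heavy-tailed edge-weight distribution this greedy selection exposes.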

  12. Data Link Performance Analysis for LVLASO Experiments

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1998-01-01

    Low-visibility Landing and Surface Operations System (LVLASO) is currently being prototyped and tested at NASA Langley Research Center. Since the main objective of the system is to maintain aircraft landings and take-offs even during low-visibility conditions, timely exchange of positional and other information between the aircraft and ground control is critical. For safety and reliability reasons, there are several redundant sources on the ground (e.g., ASDE, AMASS) that collect and disseminate information about the environment to the aircraft. The data link subsystem of LVLASO is responsible for supporting the timely transfer of information between the aircraft and the ground controllers. In fact, if not properly designed, the data link subsystem could become a bottleneck in the proper functioning of LVLASO. Currently, the other components of the system are being designed assuming that the data link has adequate capacity and is capable of delivering the information in a timely manner. During August 1-28, 1997, several flight experiments were conducted to test the prototypes of subsystems developed under the LVLASO project; the background and details of the tests are described in the next section. The test results have been collected on two CDs by the FAA and Rockwell-Collins. Under the current grant, we have analyzed the data and evaluated the performance of the Mode S data link. In this report, we summarize the results of our analysis. Most of the results are shown as graphs or histograms, with the test date (or experiment number) typically on the X-axis and the metric of interest on the Y-axis. In interpreting these charts, one needs to take into account the vehicular traffic during a particular experiment. 
In general, the performance of the data link was found to be quite satisfactory in terms of delivering long and short Mode S squitters from the vehicles to the ground receiver. Similarly, its performance in delivering control

  13. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2006-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
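Fiber strength randomness of the kind this framework simulates is commonly modeled with a Weibull distribution, whose weakest-link scaling explains why a longer fiber (more segments, each a potential flaw site) fails at lower stress. The sketch below illustrates that standard assumption with hypothetical parameters; it is not the actual damage model operating inside MAC/GMC:

```python
import math

def weibull_cdf(stress, scale, shape):
    """Probability that a single fiber segment has failed by the given
    stress, under a two-parameter Weibull strength distribution."""
    return 1.0 - math.exp(-((stress / scale) ** shape))

def chain_failure_prob(stress, scale, shape, n_segments):
    """Weakest-link scaling: a fiber of n segments fails if ANY segment
    fails, so segment survival probabilities multiply along the length."""
    survive_one = 1.0 - weibull_cdf(stress, scale, shape)
    return 1.0 - survive_one ** n_segments
```

This is why incorporating strength randomness on multiple scales matters: the effective strength of a fiber, a ply, and a specimen each scale differently with size.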

  14. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2007-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis-Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.

  15. Development of PRIME for irradiation performance analysis of U-Mo/Al dispersion fuel

    NASA Astrophysics Data System (ADS)

    Jeong, Gwan Yoon; Kim, Yeon Soo; Jeong, Yong Jin; Park, Jong Man; Sohn, Dong-Seong

    2018-04-01

    A prediction code for the thermo-mechanical performance of research reactor fuel (PRIME) has been developed with the implementation of developed models to analyze the irradiation behavior of U-Mo dispersion fuel. The code is capable of predicting the two-dimensional thermal and mechanical performance of U-Mo dispersion fuel during irradiation. A finite element method was employed to solve the governing equations for thermal and mechanical equilibria. Temperature- and burnup-dependent material properties of the fuel meat constituents and cladding were used. The numerical solution schemes in PRIME were verified by benchmarking solutions obtained using a commercial finite element analysis program (ABAQUS). The code was validated using irradiation data from RERTR, HAMP-1, and E-FUTURE tests. The measured irradiation data used in the validation were IL thickness, volume fractions of fuel meat constituents for the thermal analysis, and profiles of the plate thickness changes and fuel meat swelling for the mechanical analysis. The prediction results were in good agreement with the measurement data for both thermal and mechanical analyses, confirming the validity of the code.

  16. A Comparison of Platforms for the Aerial Exploration of Titan

    NASA Technical Reports Server (NTRS)

    Wright, Henry S.; Gasbarre, Joseph F.; Levine, Joel S.

    2005-01-01

    Exploration of Titan, envisioned as a follow-on to the highly successful Cassini-Huygens mission, is described in this paper. A mission blending measurements from a dedicated orbiter and an in-situ aerial explorer is discussed. Summary description of the science rationale and the mission architecture, including the orbiter, is provided. The mission has been sized to ensure it can be accommodated on an existing expendable heavy-lift launch vehicle. A launch to Titan in 2018 with a 6-year time of flight to Titan using a combination of Solar Electric Propulsion and aeroassist (direct entry and aerocapture) forms the basic mission architecture. A detailed assessment of different platforms for aerial exploration of Titan has been performed. A rationale for the selection of the airship as the baseline platform is provided. Detailed description of the airship, its subsystems, and its operational strategies are provided.

  17. New trends in gender and mathematics performance: a meta-analysis.

    PubMed

    Lindberg, Sara M; Hyde, Janet Shibley; Petersen, Jennifer L; Linn, Marcia C

    2010-11-01

    In this article, we use meta-analysis to analyze gender differences in recent studies of mathematics performance. First, we meta-analyzed data from 242 studies published between 1990 and 2007, representing the testing of 1,286,350 people. Overall, d = 0.05, indicating no gender difference, and variance ratio = 1.08, indicating nearly equal male and female variances. Second, we analyzed data from large data sets based on probability sampling of U.S. adolescents over the past 20 years: the National Longitudinal Surveys of Youth, the National Education Longitudinal Study of 1988, the Longitudinal Study of American Youth, and the National Assessment of Educational Progress. Effect sizes for the gender difference ranged between -0.15 and +0.22. Variance ratios ranged from 0.88 to 1.34. Taken together, these findings support the view that males and females perform similarly in mathematics.
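The two statistics reported above can be computed directly from group summary statistics. A minimal sketch of Cohen's d with a pooled standard deviation and of the male/female variance ratio (standard textbook formulas, not the meta-analytic weighting scheme the authors used to combine studies):

```python
def cohens_d(mean_m, mean_f, sd_m, sd_f, n_m, n_f):
    """Standardized mean difference using the pooled standard deviation.
    d = 0 means no gender difference in means."""
    pooled_var = ((n_m - 1) * sd_m ** 2 + (n_f - 1) * sd_f ** 2) / (n_m + n_f - 2)
    return (mean_m - mean_f) / pooled_var ** 0.5

def variance_ratio(sd_m, sd_f):
    """Male/female variance ratio; 1.0 means equal variances."""
    return sd_m ** 2 / sd_f ** 2
```

On this scale the article's overall d = 0.05 and variance ratios near 1 correspond to essentially identical male and female distributions.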

  18. Authentic Performance in the Instrumental Analysis Laboratory: Building a Visible Spectrophotometer Prototype

    ERIC Educational Resources Information Center

    Wilson, Mark V.; Wilson, Erin

    2017-01-01

    In this work we describe an authentic performance project for Instrumental Analysis in which students designed, built, and tested spectrophotometers made from simple components. The project addressed basic course content such as instrument design principles, UV-vis spectroscopy, and spectroscopic instrument components as well as skills such as…

  19. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    PubMed

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology for Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues, but profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method that portray the behavior of each compute node over time. When a large number of behavioral lines are visualized together, distinct patterns often appear, suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data, or a selected subset of it, at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual approach is effective in identifying trends and anomalies in the systems.
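One simple way to compare such behavioral lines is a plain Euclidean distance between the sampled node profiles, flagging the node farthest from its peers. This is only a sketch of the idea; the paper's actual similarity measures and layout method are not specified in the abstract, and the (cpu, mem, net) sample tuples here are hypothetical:

```python
def profile_distance(series_a, series_b):
    """Euclidean distance between two equal-length multivariate profiles,
    each a list of (cpu, mem, net) samples taken at the same times."""
    return sum(
        (a - b) ** 2
        for sa, sb in zip(series_a, series_b)
        for a, b in zip(sa, sb)
    ) ** 0.5

def most_anomalous(profiles):
    """Index of the node whose profile is farthest, on average, from all
    the others -- a crude stand-in for spotting an outlying line."""
    n = len(profiles)
    avg = [
        sum(profile_distance(profiles[i], profiles[j])
            for j in range(n) if j != i) / (n - 1)
        for i in range(n)
    ]
    return max(range(n), key=avg.__getitem__)
```

A visual system would additionally lay the lines out so that similar nodes cluster, making such outliers apparent at a glance.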

  20. Instantaneous BeiDou-GPS attitude determination: A performance analysis

    NASA Astrophysics Data System (ADS)

    Nadarajah, Nandakumaran; Teunissen, Peter J. G.; Raziq, Noor

    2014-09-01

    The advent of modernized and new global navigation satellite systems (GNSS) has enhanced the availability of satellite based positioning, navigation, and timing (PNT) solutions. Specifically, it increases redundancy and yields operational back-up or independence in case of failure or unavailability of one system. Among existing GNSS, the Chinese BeiDou system (BDS) is being developed and will consist of geostationary (GEO) satellites, inclined geosynchronous orbit (IGSO) satellites, and medium-Earth-orbit (MEO) satellites. In this contribution, a BeiDou-GPS robustness analysis is carried out for instantaneous, unaided attitude determination. Precise attitude determination using multiple GNSS antennas mounted on a platform relies on the successful resolution of the integer carrier phase ambiguities. The constrained Least-squares AMBiguity Decorrelation Adjustment (C-LAMBDA) method has been developed for the quadratically constrained GNSS compass model that incorporates the known baseline length. In this contribution the method is used to analyse the attitude determination performance when using the GPS and BeiDou systems. The attitude determination performance is evaluated using GPS/BeiDou data sets from a real data campaign in Australia spanning several days. The study includes the performance analyses of both stand-alone and mixed constellation (GPS/BeiDou) attitude estimation under various satellite deprived environments. We demonstrate and quantify the improved availability and accuracy of attitude determination using the combined constellation.

  1. Idaho National Laboratory Quarterly Performance Analysis for the 2nd Quarter FY 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Lisbeth A.

    2015-04-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of events for the 2nd Qtr FY-15.

  2. Sleep Disturbance, Daytime Symptoms, and Functional Performance in Patients With Stable Heart Failure: A Mediation Analysis.

    PubMed

    Jeon, Sangchoon; Redeker, Nancy S

    2016-01-01

    Sleep disturbance is common among patients with heart failure (HF) who also experience symptom burden and poor functional performance. We evaluated the extent to which sleep-related, daytime symptoms (fatigue, excessive daytime sleepiness, and depressive symptoms) mediate the relationship between sleep disturbance and functional performance among patients with stable HF. We recruited patients with stable HF for this secondary analysis of data from a cross-sectional, observational study. Participants completed unattended ambulatory polysomnography from which the Respiratory Disturbance Index was calculated, along with a Six-Minute Walk Test, questionnaires to elicit sleep disturbance (Pittsburgh Sleep Quality Index, Insomnia Symptoms from the Sleep Habits Questionnaire), daytime symptoms (Center for Epidemiologic Studies Depression Scale, Global Fatigue Index, Epworth Sleepiness Scale), and self-reported functional performance (Medical Outcomes Study SF36 V2 Physical Function Scale). We used structural equation modeling with latent variables for the key analysis. Follow-up, exploratory regression analysis with bootstrapped samples was used to examine the extent to which individual daytime symptoms mediated effects of sleep disturbance on functional performance after controlling for clinical and demographic covariates. The sample included 173 New York Heart Association Class I-IV HF patients (n = 60/34.7% women; M = 60.7, SD = 16.07 years of age). Daytime symptoms mediated the relationship between sleep disturbance and functional performance. Fatigue and depression mediated the relationship between insomnia symptoms and self-reported functional performance, whereas fatigue and sleepiness mediated the relationship between sleep quality and functional performance. Sleepiness mediated the relationship between the respiratory index and self-reported functional performance only in people who did not report insomnia. 
Daytime symptoms explain the relationships between sleep
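Mediation of this kind is often quantified as a product of regression coefficients: path a (sleep disturbance to daytime symptom) times path b (symptom to functional performance, with sleep disturbance partialled out). The sketch below uses simple OLS slopes with Frisch-Waugh residualization rather than the structural equation models and bootstrapping the study actually used:

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation estimate a*b for X -> M -> Y."""
    a = slope(x, m)  # path a: effect of X on the mediator M
    # Frisch-Waugh: partial X out of both M and Y, then regress the
    # residuals to get path b (effect of M on Y controlling for X).
    rm = [mi - a * xi for xi, mi in zip(x, m)]
    ry = [yi - slope(x, y) * xi for xi, yi in zip(x, y)]
    b = slope(rm, ry)
    return a * b
```

A bootstrap over resampled subjects would then give the confidence interval used to decide whether the indirect effect is significant.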

  3. Bimodal Nuclear Thermal Rocket Sizing and Trade Matrix for Lunar, Near Earth Asteroid and Mars Missions

    NASA Astrophysics Data System (ADS)

    McCurdy, David R.; Krivanek, Thomas M.; Roche, Joseph M.; Zinolabedini, Reza

    2006-01-01

    The concept of a human rated transport vehicle for various near earth missions is evaluated using a liquid hydrogen fueled Bimodal Nuclear Thermal Propulsion (BNTP) approach. In an effort to determine the preliminary sizing and optimal propulsion system configuration, as well as the key operating design points, an initial investigation into the main system level parameters was conducted. This assessment considered not only the performance variables but also the more subjective reliability, operability, and maintainability attributes. The SIZER preliminary sizing tool was used to facilitate rapid modeling of the trade studies, which included tank materials, propulsive versus an aero-capture trajectory, use of artificial gravity, reactor chamber operating pressure and temperature, fuel element scaling, engine thrust rating, engine thrust augmentation by adding oxygen to the flow in the nozzle for supersonic combustion, and the baseline turbopump configuration to address mission redundancy and safety requirements. A high level system perspective was maintained to avoid focusing solely on individual component optimization at the expense of system level performance, operability, and development cost.

  4. Mission and Design Sensitivities for Human Mars Landers Using Hypersonic Inflatable Aerodynamic Decelerators

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara P.; Thomas, Herbert D.; Dwyer Ciancio, Alicia; Collins, Tim; Samareh, Jamshid

    2017-01-01

    Landing humans on Mars is one of NASA's long term goals. NASA's Evolvable Mars Campaign (EMC) is focused on evaluating architectural trade options to define the capabilities and elements needed to sustain human presence on the surface of Mars. The EMC study teams have considered a variety of in-space propulsion options and surface mission options. Understanding how these choices affect the performance of the lander will allow a balanced optimization of this complex system of systems problem. This paper presents the effects of mission and vehicle design options on lander mass and performance. Beginning with Earth launch, options include fairing size assumptions, co-manifesting elements with the lander, and Earth-Moon vicinity operations. Capturing into Mars orbit using either aerocapture or propulsive capture is assessed. For entry, descent, and landing both storable as well as oxygen and methane propellant combinations are considered, engine thrust level is assessed, and sensitivity to landed payload mass is presented. This paper focuses on lander designs using the Hypersonic Inflatable Aerodynamic Decelerators, one of several entry system technologies currently considered for human missions.

  5. Magnetohydrodynamic Augmented Propulsion Experiment: I. Performance Analysis and Design

    NASA Technical Reports Server (NTRS)

    Litchford, R. J.; Cole, J. W.; Lineberry, J. T.; Chapman, J. N.; Schmidt, H. J.; Lineberry, C. W.

    2003-01-01

    The performance of conventional thermal propulsion systems is fundamentally constrained by the specific energy limitations associated with chemical fuels and the thermal limits of available materials. Electromagnetic thrust augmentation represents one intriguing possibility for improving the fuel composition of thermal propulsion systems, thereby increasing overall specific energy characteristics; however, realization of such a system requires an extremely high-energy-density electrical power source as well as an efficient plasma acceleration device. This Technical Publication describes the development of an experimental research facility for investigating the use of cross-field magnetohydrodynamic (MHD) accelerators as a possible thrust augmentation device for thermal propulsion systems. In this experiment, a 1.5-MW(sub e) Aerotherm arc heater is used to drive a 2-MW(sub e) MHD accelerator. The heatsink MHD accelerator is configured as an externally diagonalized, segmented channel, which is inserted into a large-bore, 2-T electromagnet. The performance analysis and engineering design of the flow path are described, as well as the parameter measurements and flow diagnostics planned for the initial series of test runs.

  6. Performance Analysis of Joule-Thomson Cooler Supplied with Gas Mixtures

    NASA Astrophysics Data System (ADS)

    Piotrowska, A.; Chorowski, M.; Dorosz, P.

    2017-02-01

    Joule-Thomson (J-T) cryocoolers working in closed cycles and supplied with gas mixtures are the subject of intensive research in different laboratories. Replacing pure nitrogen with nitrogen-hydrocarbon mixtures makes it possible to improve both the thermodynamic parameters and the economics of the refrigerators: high pressures in the heat exchanger can be avoided, and a standard refrigeration compressor can be used instead of gas bottles or a high-pressure oil-free compressor. A closed-cycle, mixture-filled Joule-Thomson cryogenic refrigerator providing 10-20 W of cooling power in the 90-100 K temperature range has been designed and manufactured. Thermodynamic analysis, including optimization of the cryocooler mixture, has been performed with ASPEN HYSYS software. The paper describes the design of the cryocooler and provides a thermodynamic analysis of the system. The test results are presented and discussed.

  7. NEXT Performance Curve Analysis and Validation

    NASA Technical Reports Server (NTRS)

    Saripalli, Pratik; Cardiff, Eric; Englander, Jacob

    2016-01-01

    Performance curves of the NEXT (NASA Evolutionary Xenon Thruster) ion thruster are highly important in determining the thruster's ability to meet mission-specific goals. New performance curves are proposed and examined here. The Evolutionary Mission Trajectory Generator (EMTG) is used to verify variations in mission solutions based on both the available thruster curves and the newly generated curves. Variations between beginning-of-life (BOL) and end-of-life (EOL) curves are also examined. The mission design results shown here validate the use of EMTG and the new performance curves.

  8. Analysis of Factors Affecting System Performance in the ASpIRE Challenge

    DTIC Science & Technology

    2015-12-13

    performance in the ASpIRE (Automatic Speech recognition In Reverberant Environments) challenge. In particular, overall word error rate (WER) of the solver...systems is analyzed as a function of room, distance between talker and microphone, and microphone type. We also analyze speech activity detection...analysis will inform the design of future challenges and provide insight into the efficacy of current solutions addressing noisy reverberant speech

  9. High-performance liquid chromatography coupled with tandem mass spectrometry technology in the analysis of Chinese Medicine Formulas: A bibliometric analysis (1997-2015).

    PubMed

    He, Xi-Ran; Li, Chun-Guang; Zhu, Xiao-Shu; Li, Yuan-Qing; Jarouche, Mariam; Bensoussan, Alan; Li, Ping-Ping

    2017-01-01

    There is a recognized challenge in analyzing traditional Chinese medicine formulas because of their complex chemical compositions. The application of modern analytical techniques such as high-performance liquid chromatography coupled with tandem mass spectrometry has significantly improved the characterization of the various compounds in traditional Chinese medicine formulas. This study aims to conduct a bibliometric analysis to recognize the overall trend of high-performance liquid chromatography coupled with tandem mass spectrometry approaches in the analysis of traditional Chinese medicine formulas, its significance, and possible underlying interactions between individual herbs in these formulas. Electronic databases were searched systematically, and the identified studies were collected and analyzed using Microsoft Access 2010, GraphPad 5.0 software and the Ucinet software package. 338 publications between 1997 and 2015 were identified and analyzed in terms of annual growth and accumulated publications, top journals, forms of traditional Chinese medicine preparations, highly studied formulas and single herbs, and social network analysis of single herbs. There is a significantly increasing trend in the use of high-performance liquid chromatography coupled with tandem mass spectrometry techniques in the analysis of commonly used forms of traditional Chinese medicine formulas over the last 3 years. Stringent quality control is of great significance for the modernization and globalization of traditional Chinese medicine, and this bibliometric analysis provides the first comprehensive summary of this field.

  10. Analysis of Shroud Options in Support of the Human Exploration of Mars

    NASA Technical Reports Server (NTRS)

    Feldman, Stuart; Borowski, Stanley; Engelund, Walter; Hundley, Jason; Monk, Timothy; Munk, Michelle

    2010-01-01

    In support of the Mars Design Reference Architecture (DRA) 5.0, the NASA study team analyzed several shroud options for use on the Ares V launch vehicle [1, 2]. These shroud options included conventional "large encapsulation" shrouds with outer diameters ranging from 8.4 to 12.9 meters (m) and overall lengths of 22.0 to 54.3 m, along with a "nosecone-only" shroud option used for Mars transfer vehicle component delivery. Also examined was a "multi-use" aerodynamic encapsulation shroud used for launch, Mars aerocapture, and entry, descent, and landing of the cargo and habitat landers. All conventional shroud options assessed for use on the Mars launch vehicles were of the standard biconic design derived from the reference shroud utilized in the Constellation Program's lunar campaign. The purpose of this paper is to discuss the technical details of each of these shroud options, including material properties, structural mass, etc., while also discussing the volume and mass of the various space transportation and surface system payload elements required to support a "minimum launch" Mars mission strategy, as well as the synergy, potential differences, and upgrade paths that may be required between the lunar and Mars mission shrouds.

  11. Creatine Supplementation and Upper Limb Strength Performance: A Systematic Review and Meta-Analysis.

    PubMed

    Lanhers, Charlotte; Pereira, Bruno; Naughton, Geraldine; Trousselard, Marion; Lesage, François-Xavier; Dutheil, Frédéric

    2017-01-01

    Creatine is the most widely used supplement for increasing strength performance; however, the most recent meta-analysis focused specifically on supplementation responses in muscles of the lower limbs without regard to the upper limbs. We aimed to systematically review the effect of creatine supplementation on upper limb strength performance. We conducted a systematic review and meta-analyses of all randomized controlled trials comparing creatine supplementation with a placebo, with strength performance measured in exercises shorter than 3 min in duration. The search strategy used the keywords 'creatine', 'supplementation', and 'performance'. Independent variables were age, sex and level of physical activity at baseline, while dependent variables were creatine loading, total dose, duration, time interval between baseline (T0) and the end of the supplementation (T1), and any training during supplementation. We conducted three meta-analyses: at T0, at T1, and on changes between T0 and T1. Each meta-analysis was stratified within upper limb muscle groups. We included 53 studies (563 individuals in the creatine supplementation group and 575 controls). Results did not differ at T0, while, at T1, the effect sizes (ES) for bench press and chest press were 0.265 (95% CI 0.132-0.398; p < 0.001) and 0.677 (95% CI 0.149-1.206; p = 0.012), respectively. Overall, pectoral ES was 0.289 (95% CI 0.160-0.419; p < 0.001), and global upper limb ES was 0.317 (95% CI 0.185-0.449; p < 0.001). Meta-analysis of changes between T0 and T1 gave similar results. The meta-regression showed no link with characteristics of the population or supplementation, demonstrating the efficacy of creatine independently of all listed conditions. Creatine supplementation is effective in upper limb strength performance for exercise of less than 3 min in duration, independent of population characteristics, training protocols, and supplementary doses or duration.
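
    The pooled effect sizes reported above are standardized mean differences combined across trials. As a rough illustration of the underlying computation (invented trial numbers, not the review's data or software), the sketch below pools Hedges' g values from two hypothetical bench-press trials with inverse-variance weights under a fixed-effect model:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with pooled standard deviation."""
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / sp

def hedges_g(d, n_t, n_c):
    """Small-sample bias correction applied to Cohen's d."""
    return (1 - 3 / (4 * (n_t + n_c) - 9)) * d

# Invented bench-press strength gains (kg) for two illustrative trials.
trials = [
    dict(mean_t=8.0, mean_c=5.0, sd_t=6.0, sd_c=6.5, n_t=12, n_c=12),
    dict(mean_t=6.5, mean_c=4.5, sd_t=5.0, sd_c=5.5, n_t=20, n_c=18),
]

gs, ws = [], []
for t in trials:
    g = hedges_g(cohens_d(**t), t["n_t"], t["n_c"])
    n_t, n_c = t["n_t"], t["n_c"]
    # Approximate sampling variance of g; weight = 1/variance (fixed-effect model).
    var_g = (n_t + n_c) / (n_t * n_c) + g**2 / (2 * (n_t + n_c))
    gs.append(g)
    ws.append(1.0 / var_g)

pooled = sum(w * g for w, g in zip(ws, gs)) / sum(ws)
print(round(pooled, 3))
```

    A full meta-analysis such as the one above would additionally report confidence intervals, heterogeneity statistics, and random-effects estimates.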

  12. A comparison of hierarchical cluster analysis and league table rankings as methods for analysis and presentation of district health system performance data in Uganda.

    PubMed

    Tashobya, Christine K; Dubourg, Dominique; Ssengooba, Freddie; Speybroeck, Niko; Macq, Jean; Criel, Bart

    2016-03-01

    In 2003, the Uganda Ministry of Health introduced the district league table for district health system performance assessment. The league table presents district performance against a number of input, process and output indicators and a composite index to rank districts. This study explores the use of hierarchical cluster analysis for analysing and presenting district health systems performance data and compares this approach with the use of the league table in Uganda. Ministry of Health and district plans and reports, and published documents, were used to provide information on the development and utilization of the Uganda district league table. Quantitative data were accessed from the Ministry of Health databases. Statistical analysis was performed using SPSS version 20, with hierarchical cluster analysis based on Ward's method. The hierarchical cluster analysis was conducted on the basis of seven clusters determined for each year from 2003 to 2010, ranging from a cluster of good through moderate-to-poor performers. The characteristics and membership of clusters varied from year to year and were determined by the identity and magnitude of performance of the individual variables. Criticisms of the league table include perceived unfairness, as it did not take district peculiarities into consideration, and being oversummarized and not adequately informative. Clustering organizes the many data points into clusters of similar entities according to an agreed set of indicators and can provide a starting point for identifying the factors behind the observed performance of districts. Although league table rankings emphasize summation and external control, clustering has the potential to encourage a formative, learning approach. More research is required to shed light on the factors behind the observed performance of the different clusters. Other countries, especially low-income countries that share many similarities with Uganda, can learn from these experiences.
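
    A minimal sketch of this kind of clustering, using SciPy's implementation of Ward's minimum-variance criterion on synthetic stand-in data (the study itself used SPSS on Ministry of Health indicators), might look like:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Hypothetical district-by-indicator matrix: 20 districts x 7 health indicators.
X = rng.normal(size=(20, 7))
X[:10] += 2.0  # create two loose performance groups

Z = linkage(X, method="ward")                    # Ward's minimum-variance criterion
labels = fcluster(Z, t=7, criterion="maxclust")  # cut the tree into at most 7 clusters
```

    Cluster membership (`labels`) can then be tabulated per year to track how districts move between good and poor performance clusters, as the paper does.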

  13. A comparison of hierarchical cluster analysis and league table rankings as methods for analysis and presentation of district health system performance data in Uganda†

    PubMed Central

    Tashobya, Christine K; Dubourg, Dominique; Ssengooba, Freddie; Speybroeck, Niko; Macq, Jean; Criel, Bart

    2016-01-01

    In 2003, the Uganda Ministry of Health introduced the district league table for district health system performance assessment. The league table presents district performance against a number of input, process and output indicators and a composite index to rank districts. This study explores the use of hierarchical cluster analysis for analysing and presenting district health systems performance data and compares this approach with the use of the league table in Uganda. Ministry of Health and district plans and reports, and published documents, were used to provide information on the development and utilization of the Uganda district league table. Quantitative data were accessed from the Ministry of Health databases. Statistical analysis was performed using SPSS version 20, with hierarchical cluster analysis based on Ward's method. The hierarchical cluster analysis was conducted on the basis of seven clusters determined for each year from 2003 to 2010, ranging from a cluster of good through moderate-to-poor performers. The characteristics and membership of clusters varied from year to year and were determined by the identity and magnitude of performance of the individual variables. Criticisms of the league table include perceived unfairness, as it did not take district peculiarities into consideration, and being oversummarized and not adequately informative. Clustering organizes the many data points into clusters of similar entities according to an agreed set of indicators and can provide a starting point for identifying the factors behind the observed performance of districts. Although league table rankings emphasize summation and external control, clustering has the potential to encourage a formative, learning approach. More research is required to shed light on the factors behind the observed performance of the different clusters. Other countries, especially low-income countries that share many similarities with Uganda, can learn from these experiences. PMID:26024882

  14. Analysis of material parameter effects on fluidlastic isolators performance

    NASA Astrophysics Data System (ADS)

    Cheng, Q. Y.; Deng, J. H.; Feng, Z. Z.; Qian, F.

    2018-01-01

    Control of vibration in helicopters has always been a complex and challenging task. Fluidlastic isolators are becoming more widely used because their fluids are non-toxic, non-corrosive, nonflammable, and compatible with most elastomers and adhesives. In fluidlastic isolator design, the selection of the fluid and rubber design parameters is critical to obtaining efficient vibration suppression. To determine the sensitivity of fluidlastic isolator performance to material design parameters, a dynamic equation is set up based on dynamic theory and a dynamic analysis is carried out. The influences of the design parameters on isolator performance are calculated; the material parameters examined are the properties of the fluid and the rubber. The analysis shows that design parameters such as the density of the fluid, the viscosity coefficient of the fluid, the stiffness of the rubber (K1), and the loss coefficient of the rubber have an obvious influence on isolator performance. Based on the results of the study, it is concluded that efficient vibration suppression can be obtained through appropriate selection of these design parameters.

  15. SU-F-T-295: MLCs Performance and Patient-Specific IMRT QA Using Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osman, A; American University of Beirut Medical Center, Beirut; Maalej, N

    2016-06-15

    Purpose: To analyze the performance of the multi-leaf collimators (MLCs) from the log files recorded during intensity modulated radiotherapy (IMRT) treatment, and to construct relative fluence maps and perform gamma analysis to compare the planned and executed MLC movement. Methods: We developed a program to extract and analyze the data from dynamic log files (dynalog files) generated from sliding-window IMRT delivery treatments. The program extracts the planned and executed (actual or delivered) MLC movement, then calculates and compares the relative planned and executed fluences. The fluence maps were used to perform gamma analysis (with 3% dose difference and 3 mm distance to agreement) for 3 IMRT patients. We compared our gamma analysis results with those obtained from the portal dose image prediction (PDIP) algorithm performed using the EPID. Results: For 3 different IMRT patient treatments, the maximum difference between the planned and executed MLC positions was 1.2 mm. The gamma analysis results of the planned and delivered fluences were in good agreement with the gamma analysis from portal dosimetry. The maximum difference in the number of pixels passing the gamma criteria (3%/3 mm) was 0.19% with respect to the portal dosimetry results. Conclusion: MLC log files can be used to verify the performance of the MLCs. Patient-specific IMRT QA based on MLC movement log files gives results similar to EPID dosimetry. This promising method for patient-specific IMRT QA is fast, does not require dose measurements in a phantom, can be done before the treatment and for every fraction, and significantly reduces the IMRT workload. The authors would like to thank King Fahd University of Petroleum and Minerals for its support.
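
    The 3%/3 mm gamma comparison between planned and executed fluence can be illustrated in one dimension. The sketch below applies a global gamma index to two synthetic fluence profiles differing by a small leaf-position shift (invented profiles, not the study's data):

```python
import numpy as np

def gamma_1d(ref, evl, dx_mm, dose_tol=0.03, dta_mm=3.0):
    """Global 1-D gamma index: for each reference point, minimize the combined
    dose-difference / distance-to-agreement metric over the evaluated profile."""
    norm = ref.max()                    # global dose normalization
    x = np.arange(len(ref)) * dx_mm
    gam = np.empty(len(ref))
    for i in range(len(ref)):
        dd = (evl - ref[i]) / (dose_tol * norm)   # dose axis, in units of 3%
        dr = (x - x[i]) / dta_mm                  # space axis, in units of 3 mm
        gam[i] = np.sqrt(dd**2 + dr**2).min()
    return gam

# Synthetic planned vs. delivered fluence profiles (arbitrary units);
# the delivered profile carries a small 0.4 mm leaf-position shift.
x = np.linspace(0.0, 40.0, 81)                   # 0.5 mm sample spacing
planned = np.exp(-(((x - 20.0) / 8.0) ** 2))
delivered = np.exp(-(((x - 20.4) / 8.0) ** 2))

g = gamma_1d(planned, delivered, dx_mm=0.5)
pass_rate = 100.0 * (g <= 1.0).mean()            # percent of points with gamma <= 1
```

    A sub-millimeter delivery shift passes comfortably; leaf errors approaching the 3 mm / 3% tolerances push gamma above 1 and lower the pass rate.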

  16. Introduction on performance analysis and profiling methodologies for KVM on ARM virtualization

    NASA Astrophysics Data System (ADS)

    Motakis, Antonios; Spyridakis, Alexander; Raho, Daniel

    2013-05-01

    The introduction of hardware virtualization extensions on ARM Cortex-A15 processors has enabled full virtualization solutions for this architecture, such as KVM on ARM. This trend motivates the need to quantify and understand the performance impact arising from the application of this technology. In this work we examine several performance metrics of interest on KVM for ARM processors, which can provide useful insight and may lead to potential improvements in the future. This includes measurements such as interrupt latency and guest exit cost, performed on ARM Versatile Express and Samsung Exynos 5250 hardware platforms. Furthermore, we discuss additional methodologies that can provide a deeper understanding of the performance footprint of KVM. We identify some of the most interesting approaches in this field and perform a tentative analysis of how they may be implemented in the KVM on ARM port. These take into consideration hardware- and software-based counters for profiling, and issues related to the limitations of the simulators that are often used, such as the ARM Fast Models platform.

  17. Analysis of Multi-Antenna GNSS Receiver Performance under Jamming Attacks.

    PubMed

    Vagle, Niranjana; Broumandan, Ali; Lachapelle, Gérard

    2016-11-17

    Although antenna array-based Global Navigation Satellite System (GNSS) receivers can be used to mitigate both narrowband and wideband electronic interference sources, measurement distortions induced by array processing methods are not suitable for high precision applications. The measurement distortions have an adverse effect on carrier phase ambiguity resolution, affecting the navigation solution. Depending on the availability of array attitude information and calibration parameters, different spatial processing methods can be implemented, although they distort carrier phase measurements in some cases. This paper provides a detailed investigation of the effect of different array processing techniques on array-based GNSS receiver measurements and navigation performance. The main novelty of the paper is its thorough analysis of array-based GNSS receivers employing different beamforming techniques, from tracking to the navigation solution. Two beamforming techniques, namely Power Minimization (PM) and Minimum Power Distortionless Response (MPDR), are investigated. In the tracking domain, the carrier Doppler, Phase Lock Indicator (PLI), and Carrier-to-Noise Ratio (C/N₀) are analyzed. Pseudorange and carrier phase measurement distortions and carrier phase position performance are also evaluated. Results of performance analyses from simulated GNSS signals and field tests are provided.
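
    As a sketch of the MPDR technique named above (illustrative array geometry and jammer scenario, not the paper's setup): the weights w = R^-1 a / (a^H R^-1 a) keep unity gain toward the desired steering vector while minimizing total output power, which drives a spatial null toward a strong jammer:

```python
import numpy as np

def mpdr_weights(R, a):
    """MPDR weights: w = R^-1 a / (a^H R^-1 a) -- unity (distortionless) gain
    toward steering vector a, minimum total output power otherwise."""
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)

# Hypothetical 4-element uniform linear array with half-wavelength spacing.
M = 4
n = np.arange(M)
a_sig = np.exp(1j * np.pi * n * np.sin(np.deg2rad(0.0)))   # desired signal at 0 deg
a_jam = np.exp(1j * np.pi * n * np.sin(np.deg2rad(40.0)))  # jammer at 40 deg

# Array covariance: strong jammer (power 100) plus unit-power white noise.
R = 100.0 * np.outer(a_jam, a_jam.conj()) + np.eye(M)

w = mpdr_weights(R, a_sig)
gain_sig = abs(w.conj() @ a_sig)   # distortionless constraint: stays at 1
gain_jam = abs(w.conj() @ a_jam)   # jammer direction: deeply attenuated
```

    The distortionless constraint preserves the satellite signal's amplitude, but, as the abstract notes, the adaptive weights can still perturb carrier phase in ways that matter for precise positioning.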

  18. Pyrolysis of coal, biomass and their blends: performance assessment by thermogravimetric analysis.

    PubMed

    Ferrara, Francesca; Orsini, Alessandro; Plaisant, Alberto; Pettinau, Alberto

    2014-11-01

    With the aim of supporting experimental tests in a gasification pilot plant, the thermal decomposition of coal, biomass and their mixtures has been studied through thermogravimetric analysis (TGA) and a simplified kinetic analysis. The TGA of the pure fuels indicates the low reactivity of South African coal and the relatively high reactivity of Sardinian Sulcis coal during pyrolysis. Among the tested fuels, biomass (stone pine wood chips) is the most reactive. These results fully confirm those obtained during the experimental tests in the gasification pilot plant. As for the fuel blends, the analysis shows that synergistic effects between the considered coals and biomass are negligible when they are co-pyrolyzed. The results confirm that TGA can be very useful for predicting gasification performance in general and for optimizing experimental campaigns in pilot-scale gasification plants.
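
    A simplified kinetic analysis of this kind commonly fits a single-step Arrhenius model to the TGA mass-loss curve. The sketch below simulates such a model under a constant heating rate (assumed, illustrative kinetic parameters, not values fitted in the study) and locates the temperature of maximum mass-loss rate (the DTG peak):

```python
import numpy as np

# Single-step, first-order Arrhenius pyrolysis model (illustrative):
# dalpha/dt = A * exp(-E / (R*T)) * (1 - alpha) at constant heating rate beta.
R_GAS = 8.314       # gas constant, J/(mol K)
A = 1.0e8           # pre-exponential factor, 1/s (assumed)
E = 1.2e5           # activation energy, J/mol (assumed)
beta = 10.0 / 60.0  # heating rate: 10 K/min expressed in K/s

T = np.arange(300.0, 1100.0, 0.5)   # temperature program, K
dt = 0.5 / beta                     # seconds spent per 0.5 K step
alpha = np.zeros_like(T)            # conversion (normalized mass loss)
for i in range(1, len(T)):
    k = A * np.exp(-E / (R_GAS * T[i - 1]))
    # (1 - alpha) decays exponentially within each step; unconditionally stable.
    alpha[i] = 1.0 - (1.0 - alpha[i - 1]) * np.exp(-k * dt)

# Temperature of maximum mass-loss rate (the DTG peak).
T_peak = float(T[int(np.argmax(np.gradient(alpha)))])
print(round(T_peak, 1))
```

    In a real analysis the parameters A and E would be fitted to the measured curves, and a more reactive fuel (here, the biomass) would show up as a lower DTG peak temperature.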

  19. Performance Analysis of a Cost-Effective Electret Condenser Microphone Directional Array

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.; Gerhold, Carl H.; Zuckerwar, Allan J.; Herring, Gregory C.; Bartram, Scott M.

    2003-01-01

    Microphone directional array technology continues to be a critical part of the overall instrumentation suite for experimental aeroacoustics. Unfortunately, high sensor cost remains one of the limiting factors in the construction of very high-density arrays (i.e., arrays containing several hundred channels or more) which could be used to implement advanced beamforming algorithms. In an effort to reduce the implementation cost of such arrays, the authors have undertaken a systematic performance analysis of a prototype 35-microphone array populated with commercial electret condenser microphones. An ensemble of microphones coupling commercially available electret cartridges with passive signal conditioning circuitry was fabricated for use with the Langley Large Aperture Directional Array (LADA). A performance analysis consisting of three phases was then performed: (1) characterize the acoustic response of the microphones via laboratory testing and calibration, (2) evaluate the beamforming capability of the electret-based LADA using a series of independently controlled point sources in an anechoic environment, and (3) demonstrate the utility of an electret-based directional array in a real-world application, in this case a cold flow jet operating at high subsonic velocities. The results of the investigation revealed a microphone frequency response suitable for directional array use over a range of 250 Hz - 40 kHz, a successful beamforming evaluation using the electret-populated LADA to measure simple point sources at frequencies up to 20 kHz, and a successful demonstration using the array to measure noise generated by the cold flow jet. This paper presents an overview of the tests conducted along with sample data obtained from those tests.
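
    The beamforming evaluated above coherently sums delayed microphone signals so that the steered focus point reinforces while off-focus points cancel. A minimal frequency-domain delay-and-sum sketch on a small synthetic linear array (illustrative geometry, not the LADA's 35-microphone layout) is:

```python
import numpy as np

c = 343.0     # speed of sound, m/s
f = 5000.0    # single analysis frequency, Hz

mics = np.linspace(-0.5, 0.5, 11)    # 11 microphones on a 1 m line, positions in m
src_x = 0.2                          # true source location on the scan line, m
scan = np.linspace(-0.6, 0.6, 121)   # steering (focus) grid, m

# Measured phasors: phase set by the propagation delay from source to each mic.
p = np.exp(-2j * np.pi * f * np.abs(mics - src_x) / c)

# Delay-and-sum: undo the delays hypothesized for each focus point, sum coherently.
power = []
for xs in scan:
    steer = np.exp(2j * np.pi * f * np.abs(mics - xs) / c)
    power.append(abs(np.sum(p * steer)) ** 2)

peak = float(scan[int(np.argmax(power))])   # the beamform map peaks at the source
```

    Only at the true source position do all eleven phasors align, so the map's peak recovers the source location; a dense high-channel-count array sharpens this peak and suppresses sidelobes.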

  20. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis]

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.