NASA Technical Reports Server (NTRS)
LaBel, Kenneth A.; Barth, Janet L.; Brewer, Dana A.
2003-01-01
This viewgraph presentation provides information on flight validation experiments that determine the effects of the solar environment on technologies. The experiments are intended to demonstrate tolerance to a variable solar environment. The technologies tested are microelectronics, photonics, materials, and sensors.
The New Millennium Program: Validating Advanced Technologies for Future Space Missions
NASA Technical Reports Server (NTRS)
Minning, Charles P.; Luers, Philip
1999-01-01
This presentation reviews the activities of the New Millennium Program (NMP) in validating advanced technologies for space missions. The focus of these breakthrough technologies is to enable new capabilities that fulfill science needs while reducing the costs of future missions. NMP partners span a broad spectrum, including government agencies, universities, and private industry. DS-1 was launched on October 24, 1998. Among the technologies validated by the NMP on DS-1 are a Low Power Electronics Experiment, the Power Activation and Switching Module, and Multi-Functional Structures. The first two of these technologies are operational and their data analysis is still ongoing; the third is also operational, and its performance parameters have been verified. The second mission, DS-2, was launched on January 3, 1999, and is expected to impact near Mars' southern polar region on December 3, 1999. The technologies on this mission awaiting validation are an advanced microcontroller, a power microelectronics unit, an evolved water and soil thermal conductivity experiment, Lithium-Thionyl Chloride batteries, a flexible cable interconnect, an aeroshell/entry system, and a compact telecom system. EO-1, on schedule for launch in December 1999, carries several technologies to be validated, among them a Carbon-Carbon Radiator, an X-band Phased Array Antenna, a pulsed plasma thruster, a wideband advanced recorder processor, an atmospheric corrector, lightweight flexible solar arrays, the Advanced Land Imager, and the Hyperion instrument.
Electrolysis Performance Improvement and Validation Experiment
NASA Technical Reports Server (NTRS)
Schubert, Franz H.
1992-01-01
Viewgraphs on electrolysis performance improvement and validation experiment are presented. Topics covered include: water electrolysis: an ever increasing need/role for space missions; static feed electrolysis (SFE) technology: a concept developed for space applications; experiment objectives: why test in microgravity environment; and experiment description: approach, hardware description, test sequence and schedule.
NASA's In-Space Technology Experiments Program
NASA Technical Reports Server (NTRS)
Levine, J.; Prusha, S. L.
1992-01-01
The objective of the In-Space Technology Experiments Program is to evaluate and validate innovative space technologies and to provide better knowledge of the effects of microgravity and the space environment. The history, organization, methodology, and current program characteristics are presented. Results of the tank pressure control experiment and the middeck zero-gravity dynamics experiment are described to demonstrate the types of technologies that have flown and the experimental results obtained from these low-cost space flight experiments.
Cloud computing and validation of expandable in silico livers.
Ropella, Glen E P; Hunt, C Anthony
2010-12-03
In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling to use more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster, we undertook the re-validation experiments using the Amazon EC2 cloud platform. Doing so required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstration of results equivalency from two different wet-labs. The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters. The availability of cloud technology, coupled with the evidence of scientific equivalency, has lowered the barrier and will greatly facilitate model sharing, as well as provide straightforward tools for scaling simulations to encompass greater detail with no extra investment in hardware.
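For readers who want to see what such a platform-equivalence check can look like in practice, here is a minimal Python sketch. The ISL authors' actual similarity measure is not reproduced here; a two-sample Kolmogorov-Smirnov test on per-run summaries is an illustrative stand-in, and all names and data below are hypothetical.

```python
# Minimal sketch: assessing whether outflow profiles generated on two
# platforms (local cluster vs. cloud) are statistically indistinguishable.
# The ISL authors' actual equivalence criterion is not reproduced here;
# a two-sample Kolmogorov-Smirnov test is used as a stand-in.
import numpy as np
from scipy import stats

def indistinguishable(profiles_a: np.ndarray, profiles_b: np.ndarray,
                      alpha: float = 0.05) -> bool:
    """profiles_*: (n_samples, n_timepoints) arrays of simulated outflow."""
    # Compare the distributions of per-run AUCs (one scalar per lobule sample).
    auc_a = profiles_a.sum(axis=1)
    auc_b = profiles_b.sum(axis=1)
    _, p = stats.ks_2samp(auc_a, auc_b)
    return p > alpha  # fail to reject -> treat the platforms as equivalent

# Hypothetical usage with the sample sizes reported in the abstract:
for n in (49, 70, 84):
    local = np.random.rand(n, 200)   # placeholder for local-cluster output
    cloud = np.random.rand(n, 200)   # placeholder for EC2 output
    print(n, indistinguishable(local, cloud))
```

With real outflow profiles, running the same comparison at each sample size would mirror the abstract's finding that indistinguishability emerges at 84 or more samples.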
NASA Technical Reports Server (NTRS)
Ku, Jentung; Ottenstein, Laura; Douglas, Donya; Hoang, Triem
2010-01-01
Under NASA's New Millennium Program Space Technology 8 (ST 8) Project, four experiments - Thermal Loop, Dependable Microprocessor, SAILMAST, and UltraFlex - were conducted to advance the maturity of individual technologies from proof of concept to prototype demonstration in a relevant environment, i.e., from a technology readiness level (TRL) of 3 to a level of 6. This paper presents the new technologies and validation approach of the Thermal Loop experiment. The Thermal Loop is an advanced thermal control system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers designed for future small system applications requiring low mass, low power, and compactness. The MLHP retains all features of state-of-the-art loop heat pipes (LHPs) and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. Details of the thermal loop concept, technical advances, benefits, objectives, Level 1 requirements, and performance characteristics are described. Also included in the paper are descriptions of the test articles and mathematical modeling used for the technology validation. An MLHP breadboard was built and tested in the laboratory and thermal vacuum environments for TRL 4 and TRL 5 validations, and an MLHP proto-flight unit was built and tested in a thermal vacuum chamber for the TRL 6 validation. In addition, an analytical model was developed to simulate the steady state and transient behaviors of the MLHP during various validation tests. Capabilities and limitations of the analytical model are also addressed.
New millennium program ST6: autonomous technologies for future NASA spacecraft
NASA Technical Reports Server (NTRS)
Chmielewski, Arthur B.; Chien, Steve; Sherwood, Robert; Wyman, William; Brady, T.; Buckley, S.; Tillier, C.
2005-01-01
The purpose of NASA's New Millennium Program (NMP) is to validate advanced technologies in space and thus lower the risk for the first mission user. NMP focuses only on those technologies that need the space environment for proper validation. The ST6 project has developed two advanced, experimental technologies for use on spacecraft of the future: the Autonomous Sciencecraft Experiment and the Inertial Stellar Compass. These technologies will improve a spacecraft's ability to decide what information to gather and send back to the ground, and to determine and adjust its own attitude and pointing.
Large Space Systems Technology, Part 2, 1981
NASA Technical Reports Server (NTRS)
Boyer, W. J. (Compiler)
1982-01-01
Four major areas of interest are covered: technology pertinent to large antenna systems; technology related to the control of large space systems; basic technology concerning structures, materials, and analyses; and flight technology experiments. Large antenna systems and flight technology experiments are described. Design studies, structural testing results, and theoretical applications are presented with accompanying validation data. These research studies represent state-of-the art technology that is necessary for the development of large space systems. A total systems approach including structures, analyses, controls, and antennas is presented as a cohesive, programmatic plan for large space systems.
ERIC Educational Resources Information Center
Edmunds, Rob; Thorpe, Mary; Conole, Grainne
2012-01-01
The increasing use of information and communication technology (ICT) in higher education has been explored largely in relation to student experience of coursework and university life. Students' lives and experience beyond the university have been largely unexplored. Research into student experience of ICT used a validated model--the technology…
Autonomous rendezvous and docking: A commercial approach to on-orbit technology validation
NASA Technical Reports Server (NTRS)
Tchoryk, Peter, Jr.; Whitten, Raymond P.
1991-01-01
SpARC, in conjunction with its corporate affiliates, is planning an on-orbit validation of autonomous rendezvous and docking (ARD) technology. The emphasis in this program is to utilize existing technology and commercially available components wherever possible. The primary subsystems to be validated by this demonstration include GPS receivers for navigation, a video-based sensor for proximity operations, a fluid connector mechanism to demonstrate fluid resupply capability, and a compliant, single-point docking mechanism. The focus for this initial experiment will be ELV based and will make use of two residual Commercial Experiment Transporter (COMET) service modules. The first COMET spacecraft will be launched in late 1992 and will serve as the target vehicle. After the second COMET spacecraft has been launched in late 1994, the ARD demonstration will take place. The service module from the second COMET will serve as the chase vehicle.
U.S. perspective on technology demonstration experiments for adaptive structures
NASA Technical Reports Server (NTRS)
Aswani, Mohan; Wada, Ben K.; Garba, John A.
1991-01-01
Evaluation of design concepts for adaptive structures is being performed in support of several focused research programs. These include programs such as Precision Segmented Reflector (PSR), Control Structure Interaction (CSI), and the Advanced Space Structures Technology Research Experiment (ASTREX). Although not specifically designed for adaptive structure technology validation, relevant experiments can be performed using the Passive and Active Control of Space Structures (PACOSS) testbed, the Space Integrated Controls Experiment (SPICE), the CSI Evolutionary Model (CEM), and the Dynamic Scale Model Test (DSMT) Hybrid Scale. In addition to the ground test experiments, several space flight experiments have been planned, including a reduced gravity experiment aboard the KC-135 aircraft, shuttle middeck experiments, and the Inexpensive Flight Experiment (INFLEX).
Technology for the Future: In-Space Technology Experiments Program, part 1
NASA Technical Reports Server (NTRS)
Breckenridge, Roger A. (Compiler); Clark, Lenwood G. (Compiler); Willshire, Kelli F. (Compiler); Beck, Sherwin M. (Compiler); Collier, Lisa D. (Compiler)
1991-01-01
The purpose of the Office of Aeronautics and Space Technology (OAST) In-Space Technology Experiments Program (In-STEP) 1988 Workshop was to identify and prioritize technologies that are critical for future national space programs and require validation in the space environment, and to review current NASA (In-Reach) and industry/university (Out-Reach) experiments. A prioritized list of the critical technology needs was developed for the following eight disciplines: structures; environmental effects; power systems and thermal management; fluid management and propulsion systems; automation and robotics; sensors and information systems; in-space systems; and humans in space. This is part one of two and contains the executive summary and the experiment descriptions. The executive summary portion contains keynote addresses, strategic planning information, and the critical technology needs summaries for each theme. The experiment description portion contains brief overviews of the objectives, technology needs and backgrounds, descriptions, and development schedules for current industry, university, and NASA space flight technology experiments.
NASA IN-STEP Cryo System Experiment flight test
NASA Astrophysics Data System (ADS)
Russo, S. C.; Sugimura, R. S.
The Cryo System Experiment (CSE), a NASA In-Space Technology Experiments Program (IN-STEP) flight experiment, was flown on Space Shuttle Discovery (STS 63) in February 1995. The experiment was developed by Hughes Aircraft Company to validate in zero-g space a 65 K cryogenic system for focal planes, optics, instruments or other equipment (gamma-ray spectrometers and infrared and submillimetre imaging instruments) that requires continuous cryogenic cooling. The CSE is funded by the NASA Office of Advanced Concepts and Technology's IN-STEP and managed by the Jet Propulsion Laboratory (JPL). The overall goal of the CSE was to validate and characterize the on-orbit performance of the two thermal management technologies that comprise a hybrid cryogenic system. These thermal management technologies consist of (1) a second-generation long-life, low-vibration, Stirling-cycle 65 K cryocooler that was used to cool a simulated thermal energy storage device (TRP) and (2) a diode oxygen heat pipe thermal switch that enables physical separation between a cryogenic refrigerator and a TRP. All CSE experiment objectives and 100% of the experiment success criteria were achieved. The level of confidence provided by this flight experiment is an important NASA and Department of Defense (DoD) milestone prior to multi-year mission commitment. Presented are generic lessons learned from the system integration of cryocoolers for a flight experiment and the recorded zero-g performance of the Stirling cryocooler and the diode oxygen heat pipe.
A statistical approach to selecting and confirming validation targets in -omics experiments
2012-01-01
Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
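The abstract's core idea, validating a whole result list from a small random sample, can be illustrated with a short sketch. This is not the authors' estimator; it is a simple one-sided binomial bound under assumptions (normal approximation, random sampling) stated in the comments, with hypothetical numbers.

```python
# Minimal sketch of the random-subset strategy described above: confirm a
# small random sample from a list of significant results, then bound the
# fraction of the whole list expected to validate. Illustrative stand-in,
# not the paper's exact method.
import math, random

def lower_bound_validation_rate(sample_size: int, confirmed: int,
                                z: float = 1.645) -> float:
    """One-sided normal-approximation lower bound on the list's true
    validation rate, from `confirmed` successes in `sample_size` draws."""
    p_hat = confirmed / sample_size
    se = math.sqrt(p_hat * (1 - p_hat) / sample_size)
    return max(0.0, p_hat - z * se)

# e.g., 18 of 20 randomly chosen genes confirmed from a 500-gene list:
targets = random.sample(range(500), 20)        # pick validation targets at random
print(lower_bound_validation_rate(20, 18))     # ~0.79 at 95% one-sided confidence
```

The point of sampling at random, rather than taking the most significant hits, is exactly what lets a bound like this extend from the 20 confirmed targets to the full 500-gene list.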
In-Flight Thermal Performance of the Lidar In-Space Technology Experiment
NASA Technical Reports Server (NTRS)
Roettker, William
1995-01-01
The Lidar In-Space Technology Experiment (LITE) was developed at NASA's Langley Research Center to explore the applications of lidar operated from an orbital platform. As a technology demonstration experiment, LITE was developed to gain experience designing and building future operational orbiting lidar systems. Since LITE was the first lidar system to be flown in space, an important objective was to validate instrument design principles in such areas as thermal control, laser performance, instrument alignment and control, and autonomous operations. Thermal and structural analysis models of the instrument were developed during the design process to predict the behavior of the instrument during its mission. In order to validate those mathematical models, extensive engineering data was recorded during all phases of LITE's mission. This in-flight engineering data was compared with preflight predictions and, when required, adjustments to the thermal and structural models were made to more accurately match the instrument's actual behavior. The results of this process for the thermal analysis and design of LITE are presented in this paper.
Zapf, Susan A; Scherer, Marcia J; Baxter, Mary F; H Rintala, Diana
2016-01-01
The purpose of this study was to measure the predictive validity, internal consistency and clinical utility of the Matching Assistive Technology to Child & Augmentative Communication Evaluation Simplified (MATCH-ACES) assessment. Twenty-three assistive technology team evaluators assessed 35 children using the MATCH-ACES assessment. This quasi-experimental study examined the internal consistency, predictive validity and clinical utility of the MATCH-ACES assessment. The MATCH-ACES assessment predisposition scales had good internal consistency across all three scales. A significant relationship was found between (a) high student perseverance and need for assistive technology and (b) high teacher comfort and interest in technology use (p = 0.002). Study results indicate that the MATCH-ACES assessment has good internal consistency and validity. Predisposition characteristics of student and teacher combined can influence the level of assistive technology use; therefore, assistive technology teams should assess predisposition factors of the user when recommending assistive technology. Implications for Rehabilitation: Educational and medical professionals should be educated on evidence-based assistive technology assessments. Personal experience and psychosocial factors can influence the outcome use of assistive technology. Assistive technology assessments must include an intervention plan for assistive technology service delivery to measure effective outcome use.
Technology for the Future: In-Space Technology Experiments Program, part 2
NASA Technical Reports Server (NTRS)
Breckenridge, Roger A. (Compiler); Clark, Lenwood G. (Compiler); Willshire, Kelli F. (Compiler); Beck, Sherwin M. (Compiler); Collier, Lisa D. (Compiler)
1991-01-01
The purpose of the Office of Aeronautics and Space Technology (OAST) In-Space Technology Experiments Program In-STEP 1988 Workshop was to identify and prioritize technologies that are critical for future national space programs and require validation in the space environment, and review current NASA (In-Reach) and industry/ university (Out-Reach) experiments. A prioritized list of the critical technology needs was developed for the following eight disciplines: structures; environmental effects; power systems and thermal management; fluid management and propulsion systems; automation and robotics; sensors and information systems; in-space systems; and humans in space. This is part two of two parts and contains the critical technology presentations for the eight theme elements and a summary listing of critical space technology needs for each theme.
Stellar Interferometer Technology Experiment (SITE)
NASA Technical Reports Server (NTRS)
Crawley, Edward F.; Miller, David; Laskin, Robert; Shao, Michael
1995-01-01
The MIT Space Engineering Research Center and the Jet Propulsion Laboratory stand ready to advance science sensor technology for discrete-aperture astronomical instruments such as space-based optical interferometers. The objective of the Stellar Interferometer Technology Experiment (SITE) is to demonstrate system-level functionality of a space-based stellar interferometer through the use of enabling and enhancing Controlled-Structures Technologies (CST). SITE mounts to the Mission Peculiar Experiment Support System inside the Shuttle payload bay. Starlight, entering through two apertures, is steered to a combining plate where it is interfered. Interference requires 27 nanometer pathlength (phasing) and 0.29 arcsecond wavefront-tilt (pointing) control. The resulting 15 milli-arcsecond angular resolution exceeds that of current earth-orbiting telescopes while maintaining low cost by exploiting active optics and structural control technologies. With these technologies, unforeseen and time-varying disturbances can be rejected while relaxing reliance on ground alignment and calibration. SITE will reduce the risk and cost of advanced optical space systems by validating critical technologies in their operational environment. Moreover, these technologies are directly applicable to commercially driven applications such as precision machining, optical scanning, and vibration and noise control systems for the aerospace, medical, and automotive sectors. The SITE team consists of experienced university, government, and industry researchers, scientists, and engineers with extensive expertise in optical interferometry, nano-precision opto-mechanical control and spaceflight experimentation. The experience exists and the technology is mature. SITE will validate these technologies on a functioning interferometer science sensor in order to definitively confirm their readiness to be baselined for future science missions.
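As a back-of-envelope check (not taken from the paper), the quoted 15 milli-arcsecond resolution is consistent with a Shuttle-bay-sized baseline if one assumes visible light and the usual interferometric resolution scaling:

```latex
% Assumptions: \lambda \approx 0.5\,\mu\mathrm{m} (visible light) and the
% standard two-aperture resolution relation \theta \approx \lambda / (2B).
\theta = 15\,\mathrm{mas} \approx 7.3\times10^{-8}\,\mathrm{rad},
\qquad
B \approx \frac{\lambda}{2\theta}
  \approx \frac{0.5\times10^{-6}\,\mathrm{m}}{2 \times 7.3\times10^{-8}}
  \approx 3.4\,\mathrm{m}.
```

A baseline of a few metres is plausible for apertures mounted across the Shuttle payload bay, which is why a small interferometer can out-resolve much larger filled-aperture telescopes.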
Autonomous rendezvous and docking: A commercial approach to on-orbit technology validation
NASA Technical Reports Server (NTRS)
Tchoryk, Peter, Jr.; Dobbs, Michael E.; Conrad, David J.; Apley, Dale J.; Whitten, Raymond P.
1991-01-01
The Space Automation and Robotics Center (SpARC), a NASA-sponsored Center for the Commercial Development of Space (CCDS), in conjunction with its corporate affiliates, is planning an on-orbit validation of autonomous rendezvous and docking (ARD) technology. The emphasis in this program is to utilize existing technology and commercially available components whenever possible. The primary subsystems that will be validated by this demonstration include GPS receivers for navigation, a video-based sensor for proximity operations, a fluid connector mechanism to demonstrate fluid resupply capability, and a compliant, single-point docking mechanism. The focus for this initial experiment will be expendable launch vehicle (ELV) based and will make use of two residual Commercial Experiment Transporter (COMET) service modules. The first COMET spacecraft will be launched in late 1992 and will serve as the target vehicle. The ARD demonstration will take place in late 1994, after the second COMET spacecraft has been launched. The service module from the second COMET will serve as the chase vehicle.
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Ly, Vuong; Frye, Stuart
2006-01-01
One of the shared problems for new space mission developers is that it is extremely difficult to infuse new technology into new missions unless that technology has been flight validated. Therefore, the issue is that new technology is required to fly on a successful mission for flight validation. We have been experimenting with new technology on existing satellites by retrofitting primarily the flight software while the missions are on-orbit to experiment with new operations concepts. Experiments have been performed using Earth Observing 1 (EO-1), which is part of the New Millennium Program at NASA. EO-1 finished its prime mission one year after its launch on November 21, 2000. From November 21, 2001 until the present, EO-1 has been used in parallel with additional science data gathering to test out various sensor web concepts. Similarly, the Cosmic Hot Interstellar Plasma Spectrometer (CHIPS) satellite was also a one-year mission flown by the University of California, Berkeley, sponsored by NASA and whose prime mission ended August 30, 2005. Presently, CHIPS is being used to experiment with a seamless space-to-ground interface by installing the Core Flight System (cFS), a "plug-and-play" architecture developed by the Flight Software Branch at NASA/GSFC, on top of the existing space-to-ground Internet Protocol (IP) interface that CHIPS implemented. For example, one targeted experiment is to connect CHIPS to a rover via this interface and the Internet, and trigger autonomous actions on CHIPS, the rover, or both. Thus far, having satellites to experiment with new concepts has turned out to be an inexpensive way to infuse new technology for future missions. Relevant experiences thus far and future plans will be discussed in this presentation.
NASA Technical Reports Server (NTRS)
Brady, Tye; Bailey, Erik; Crain, Timothy; Paschall, Stephen
2011-01-01
NASA has embarked on a multiyear technology development effort to develop a safe and precise lunar landing capability. The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is investigating a range of landing hazard detection methods while developing a hazard avoidance capability to best field test the proper set of relevant autonomous GNC technologies. Ultimately, the advancement of these technologies through the ALHAT Project will provide an ALHAT System capable of enabling next-generation lunar lander vehicles to land globally, precisely, and safely regardless of lighting conditions. This paper provides an overview of the ALHAT System and describes recent validation experiments that have advanced the highly capable GNC architecture.
NASA Technical Reports Server (NTRS)
O'Donnell, James R.; Hsu, Oscar C.; Maghami, Peirman G.; Markley, F. Landis
2006-01-01
As originally proposed, the Space Technology-7 Disturbance Reduction System (DRS) project, managed out of the Jet Propulsion Laboratory, was designed to validate technologies required for future missions such as the Laser Interferometer Space Antenna (LISA). The two technologies to be demonstrated by DRS were Gravitational Reference Sensors (GRSs) and Colloidal MicroNewton Thrusters (CMNTs). Control algorithms being designed by the Dynamic Control System (DCS) team at the Goddard Space Flight Center would control the spacecraft so that it flew about a freely-floating GRS test mass, keeping it centered within its housing. For programmatic reasons, the GRSs were descoped from DRS. The primary goals of the new mission are to validate the performance of the CMNTs and to demonstrate precise spacecraft position control. DRS will fly as a part of the European Space Agency (ESA) LISA Pathfinder (LPF) spacecraft along with a similar ESA experiment, the LISA Technology Package (LTP). With no GRS, the DCS attitude and drag-free control systems make use of the sensor being developed by ESA as a part of the LTP. The control system is designed to maintain the spacecraft's position with respect to the test mass to within 10 nm/√Hz over the DRS science frequency band of 1 to 30 mHz.
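To make the 10 nm/√Hz requirement concrete, the sketch below shows one conventional way such a spectral-density requirement can be checked against time-series data, using Welch's PSD estimate. The signal, sampling rate, and pass/fail logic are all illustrative assumptions, not DRS flight software.

```python
# Illustrative sketch (not flight code): checking a simulated spacecraft-to-
# test-mass position-error signal against a 10 nm/sqrt(Hz) requirement over
# the 1-30 mHz DRS science band, using Welch's PSD estimate.
import numpy as np
from scipy.signal import welch

fs = 1.0                                   # 1 Hz sampling rate (assumed)
x = 2e-9 * np.random.randn(200_000)        # placeholder position-error series [m]

f, psd = welch(x, fs=fs, nperseg=32_768)   # power spectral density [m^2/Hz]
asd = np.sqrt(psd)                         # amplitude spectral density [m/sqrt(Hz)]
band = (f >= 1e-3) & (f <= 30e-3)          # DRS science band, 1-30 mHz
print("meets 10 nm/sqrt(Hz) requirement:", bool(np.all(asd[band] < 10e-9)))
```

The long record and large `nperseg` are needed because resolving millihertz frequencies requires averaging over segments many thousands of seconds long.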
OAST Technology for the Future. Executive Summary
NASA Technical Reports Server (NTRS)
1988-01-01
NASA's Office of Aeronautics and Space Technology (OAST) conducted a workshop on the In-Space Technology Experiments Program (IN-STEP) December 6-9, 1988, in Atlanta, Georgia. The purpose of this workshop was to identify and prioritize space technologies which are critical for future national space programs and which require validation in the space environment. A secondary objective was to review the current NASA (In-Reach) and Industry/University (Out-Reach) experiments. Finally, the aerospace community was requested to review and comment on the proposed plans for the continuation of the In-Space Technology Experiments Program. In particular, the review included the proposed process for focusing the next experiment selection on specific, critical technologies and the process for implementing the hardware development and integration on the Space Shuttle vehicle. The product of the workshop was a prioritized listing of the critical space technology needs in each of eight technology disciplines. These listings were the cumulative recommendations of nearly 400 participants, which included researchers, technologists, and managers from aerospace industries, universities, and government organizations.
Space technology research plans
NASA Technical Reports Server (NTRS)
Hook, W. Ray
1992-01-01
Development of new technologies is the primary purpose of the Office of Aeronautics and Space Technology (OAST). OAST's mission includes the following two goals: (1) to conduct research to provide fundamental understanding, develop advanced technology and promote technology transfer to assure U.S. preeminence in aeronautics and to enhance and/or enable future civil space missions; and (2) to provide unique facilities and technical expertise to support national aerospace needs. OAST includes both NASA Headquarters operations as well as programmatic and institutional management of the Ames Research Center, the Langley Research Center and the Lewis Research Center. In addition, a considerable portion of OAST's Space R&T Program is conducted through the flight and science program field centers of NASA. Within OAST, the Space Technology Directorate is responsible for the planning and implementation of the NASA Space Research and Technology Program. The Space Technology Directorate's mission is 'to assure that OAST shall provide technology for future civil space missions and provide a base of research and technology capabilities to serve all national space goals.' Accomplishing this mission entails the following objectives: identify, develop, validate and transfer technology to (1) increase mission safety and reliability, (2) reduce flight program development and operations costs, (3) enhance mission performance, and (4) enable new missions; and provide the capability to (1) advance technology in critical disciplines and (2) respond to unanticipated mission needs. In-space experiments are an integral part of OAST's program, which provides for experimental studies, development and support for in-space flight research and validation of advanced space technologies. Conducting technology experiments in space is a valuable and cost effective way to introduce advanced technologies into flight programs. These flight experiments support both the R&T base and the focused programs within OAST.
NASA Technical Reports Server (NTRS)
Cable, D. A.; Diewald, C. A.; Hills, T. C.; Parmentier, T. J.; Spencer, R. A.; Stone, G. E.
1984-01-01
Volume 2 contains the Technical Report of the approach and results of the Phase 2 study. The Phase 2 servicing study was initiated in June 1983, and is being reported in this document. The scope of the contract was to: (1) define in detail five selected technology development missions (TDM); (2) conduct a design requirement analysis to refine definitions of satellite servicing requirements at the space station; and (3) develop a technology plan that would identify and schedule prerequisite precursor technology development, associated STS flight experiments, and space station experiments needed to provide on-orbit validation of the evolving technology.
NASA Technical Reports Server (NTRS)
Didion, Jeffrey R.
2018-01-01
Electrically Driven Thermal Management is an active research and technology development initiative incorporating ISS technology flight demonstrations (STP-H5), development of a Microgravity Science Glovebox (MSG) flight experiment, and laboratory-based investigations of electrically based thermal management techniques. The program targets integrated thermal management for future generations of RF electronics and power electronic devices. This presentation reviews four program elements: (i) results from the Electrohydrodynamic (EHD) Long Term Flight Demonstration launched in February 2017; (ii) development of the Electrically Driven Liquid Film Boiling Experiment; (iii) two university-based research efforts; and (iv) development of an Oscillating Heat Pipe evaluation at Goddard Space Flight Center.
OAST Technology for the Future. Volume 3 - Critical Technologies, Themes 5-8
NASA Technical Reports Server (NTRS)
1988-01-01
NASA's Office of Aeronautics and Space Technology (OAST) conducted a workshop on the In-Space Technology Experiments Program (IN-STEP) December 6-9, 1988, in Atlanta, Georgia. The purpose of this workshop was to identify and prioritize space technologies which are critical for future national space programs and which require validation in the space environment. A secondary objective was to review the current NASA (In-Reach) and Industry/University (Out-Reach) experiments. Finally, the aerospace community was requested to review and comment on the proposed plans for the continuation of the In-Space Technology Experiments Program. In particular, the review included the proposed process for focusing the next experiment selection on specific, critical technologies and the process for implementing the hardware development and integration on the Space Shuttle vehicle. The product of the workshop was a prioritized listing of the critical space technology needs in each of eight technology disciplines. These listings were the cumulative recommendations of nearly 400 participants, which included researchers, technologists, and managers from aerospace industries, universities, and government organizations.
OAST Technology for the Future. Volume 2 - Critical Technologies, Themes 1-4
NASA Technical Reports Server (NTRS)
1988-01-01
NASA's Office of Aeronautics and Space Technology (OAST) conducted a workshop on the In-Space Technology Experiments Program (IN-STEP) December 6-9, 1988, in Atlanta, Georgia. The purpose of this workshop was to identify and prioritize space technologies which are critical for future national space programs and which require validation in the space environment. A secondary objective was to review the current NASA (In-Reach) and Industry/University (Out-Reach) experiments. Finally, the aerospace community was requested to review and comment on the proposed plans for the continuation of the In-Space Technology Experiments Program. In particular, the review included the proposed process for focusing the next experiment selection on specific, critical technologies and the process for implementing the hardware development and integration on the Space Shuttle vehicle. The product of the workshop was a prioritized listing of the critical space technology needs in each of eight technology disciplines. These listings were the cumulative recommendations of nearly 400 participants, which included researchers, technologists, and managers from aerospace industries, universities, and government organizations.
Multi-Evaporator Miniature Loop Heat Pipe for Small Spacecraft Thermal Control
NASA Technical Reports Server (NTRS)
Ku, Jentung; Ottenstein, Laura; Douglas, Donya
2008-01-01
This paper presents the development of the Thermal Loop experiment under NASA's New Millennium Program Space Technology 8 (ST8) Project. The Thermal Loop experiment was originally planned for validating in space an advanced heat transport system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers. Details of the thermal loop concept, technical advances and benefits, Level 1 requirements and the technology validation approach are described. An MLHP breadboard has been built and tested in the laboratory and thermal vacuum environments, and has demonstrated excellent performance that met or exceeded the design requirements. The MLHP retains all features of state-of-the-art loop heat pipes and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. In addition, an analytical model has been developed to simulate the steady state and transient operation of the MLHP, and the model predictions agreed very well with experimental results. A protoflight MLHP has been built and is being tested in a thermal vacuum chamber to validate its performance and technical readiness for a flight experiment.
In-step inflatable antenna experiment
NASA Astrophysics Data System (ADS)
Freeland, R. E.; Bilyeu, G.
Large deployable space antennas are needed to accommodate a number of applications that include mobile communications, earth observation radiometry, active microwave sensing, space-orbiting very long baseline interferometry, and Department of Defense (DoD) space-based radar. The criteria for evaluating candidate structural concepts for essentially all the applications are the same: high deployment reliability, low cost, low weight, low launch volume, and high aperture precision. A new class of space structures, called inflatable deployable structures, has tremendous potential for completely satisfying the first four criteria and good potential for accommodating the longer wavelength applications. An inflatable deployable antenna under development by L'Garde Inc. of Tustin, California, represents such a concept. Its level of technology is mature enough to support a meaningful orbital technology experiment. The NASA Office of Aeronautics and Space Technology initiated the In-Space Technology Experiments Program (IN-STEP) specifically to sponsor the verification and/or validation of unique and innovative space technologies in the space environment. The potential of the L'Garde concept has been recognized, resulting in its selection for an IN-STEP experiment. The objective of the experiment is to (a) validate the deployment of a 14-meter, inflatable parabolic reflector structure, (b) measure the reflector surface accuracy, and (c) investigate structural damping characteristics under operational conditions. The experiment approach will be to use the NASA Spartan Spacecraft to carry the experiment on orbit. Reflector deployment will be monitored by two high-resolution video cameras. Reflector surface quality will be measured with a digital imaging radiometer. Structural damping will be based on measuring the decay of reflector structure amplitude. The experiment is being managed by the Jet Propulsion Laboratory. The experiment definition phase (Phase B) will be completed by the end of fiscal year (FY) 1992; hardware development (Phase C/D) is expected to start by early FY 1993; and launch is scheduled for 1995. The paper describes the accomplishments to date and the approach for the remainder of the experiment.
The Objectives of NASA's Living with a Star Space Environment Testbed
NASA Technical Reports Server (NTRS)
Barth, Janet L.; LaBel, Kenneth A.; Brewer, Dana; Kauffman, Billy; Howard, Regan; Griffin, Geoff; Day, John H. (Technical Monitor)
2001-01-01
NASA is planning to fly a series of Space Environment Testbeds (SET) as part of the Living With A Star (LWS) Program. The goal of the testbeds is to improve and develop capabilities to mitigate and/or accommodate the effects of solar variability in spacecraft and avionics design and operation. This will be accomplished by performing technology validation in space to enable routine operations, characterize technology performance in space, and improve and develop models, guidelines and databases. The anticipated result of the LWS/SET program is improved spacecraft performance, design, and operation for survival of the radiation, spacecraft charging, meteoroid, orbital debris and thermosphere/ionosphere environments. The program calls for a series of NASA Research Announcements (NRAs) to be issued to solicit flight validation experiments, improvement in environment effects models and guidelines, and collateral environment measurements. The selected flight experiments may fly on the SET experiment carriers and flights of opportunity on other commercial and technology missions. This paper presents the status of the project so far, including a description of the types of experiments that are intended to fly on SET-1 and a description of the SET-1 carrier parameters.
Validation of Automated Scoring of Oral Reading
ERIC Educational Resources Information Center
Balogh, Jennifer; Bernstein, Jared; Cheng, Jian; Van Moere, Alistair; Townshend, Brent; Suzuki, Masanori
2012-01-01
A two-part experiment is presented that validates a new measurement tool for scoring oral reading ability. Data collected by the U.S. government in a large-scale literacy assessment of adults were analyzed by a system called VersaReader that uses automatic speech recognition and speech processing technologies to score oral reading fluency. In the…
In-Space Structural Validation Plan for a Stretched-Lens Solar Array Flight Experiment
NASA Technical Reports Server (NTRS)
Pappa, Richard S.; Woods-Vedeler, Jessica A.; Jones, Thomas W.
2001-01-01
This paper summarizes in-space structural validation plans for a proposed Space Shuttle-based flight experiment. The test article is an innovative, lightweight solar array concept that uses pop-up, refractive stretched-lens concentrators to achieve a power/mass density of at least 175 W/kg, which is more than three times greater than current capabilities. The flight experiment will validate this new technology to retire the risk associated with its first use in space. The experiment includes structural diagnostic instrumentation to measure the deployment dynamics, static shape, and modes of vibration of the 8-meter-long solar array and several of its lenses. These data will be obtained by photogrammetry using the Shuttle payload-bay video cameras and miniature video cameras on the array. Six accelerometers are also included in the experiment to measure base excitations and small-amplitude tip motions.
Experimenting with Sensor Webs Using Earth Observing 1
NASA Technical Reports Server (NTRS)
Mandl, Dan
2004-01-01
The New Millennium Program (NMP) Earth Observing 1 (EO-1) satellite was launched November 21, 2000 as a one-year technology validation mission. After an almost flawless first year of operations, EO-1 continued to operate in a test bed mode to validate additional technologies and concepts that will be applicable to future sensor webs. A sensor web is a group of sensors, whether space-based, ground-based or airplane-based, which act in a collaborative, autonomous manner to produce more value than would otherwise result from the individual observations.
2009-03-01
RIGEX was an Air Force Institute of Technology graduate-student-built Space Shuttle cargo bay experiment intended to heat and inflate... Suggestions for future experiments and applications are provided. RIGEX successfully accomplished its mission statement by validating the heating and...
NASA Astrophysics Data System (ADS)
Nakamura, Hiroo; Riccardi, B.; Loginov, N.; Ara, K.; Burgazzi, L.; Cevolani, S.; Dell'Orco, G.; Fazio, C.; Giusti, D.; Horiike, H.; Ida, M.; Ise, H.; Kakui, H.; Matsui, H.; Micciche, G.; Muroga, T.; Nakamura, Hideo; Shimizu, K.; Sugimoto, M.; Suzuki, A.; Takeuchi, H.; Tanaka, S.; Yoneoka, T.
2004-08-01
During the three-year key element technology phase of the International Fusion Materials Irradiation Facility (IFMIF) project, completed at the end of 2002, key technologies were validated. In this paper, these results are summarized. A water jet experiment simulating Li flow validated stable flow up to 20 m/s with a double reducer nozzle. In addition, a small Li loop experiment validated stable Li flow up to 14 m/s. Controlling the nitrogen content in Li below 10 wppm will require a V-Ti alloy getter surface area of 135 m². Conceptual designs of the diagnostics have been carried out. Moreover, the concept of a remote handling system to replace the back wall, based on 'cut and reweld' and 'bayonet' options, has been established. Analysis by FMEA showed safe operation of the target system. Recent activities in the transition phase, started in 2003, and plans for the next phase are also described.
ERIC Educational Resources Information Center
Foster, W. Tad; Shahhosseini, A. Mehran; Maughan, George
2016-01-01
Facilitating student growth and development in diagnosing and solving technical problems remains a challenge for technology and engineering educators. With funding from the National Science Foundation, this team of researchers developed a self-guided, computer-based instructional program to experiment with conceptual mapping as a treatment to…
Evaluation of Future Internet Technologies for Processing and Distribution of Satellite Imagery
NASA Astrophysics Data System (ADS)
Becedas, J.; Perez, R.; Gonzalez, G.; Alvarez, J.; Garcia, F.; Maldonado, F.; Sucari, A.; Garcia, J.
2015-04-01
Satellite imagery data centres are designed to operate a defined number of satellites; difficulties appear, for instance, when new satellites have to be incorporated into the system. This occurs because traditional infrastructures are neither flexible nor scalable. With the appearance of Future Internet technologies, new solutions can be provided to manage large and variable amounts of data on demand. These technologies optimize resources and facilitate the appearance of new applications and services in the traditional Earth Observation (EO) market. The use of Future Internet technologies for the EO sector was validated with the GEO-Cloud experiment, part of the Fed4FIRE FP7 European project. This work presents the final results of the project, in which a constellation of satellites records the whole Earth surface on a daily basis. The satellite imagery is downloaded into a distributed network of ground stations and ingested in a cloud infrastructure, where the data is processed, stored, archived and distributed to the end users. The processing and transfer times inside the cloud, workload of the processors, automatic cataloguing and accessibility through the Internet are evaluated to validate whether Future Internet technologies present advantages over traditional methods. The applicability of these technologies to provide high added value services is also evaluated. Finally, the advantages of using federated testbeds to carry out large-scale, industry-driven experiments are analysed, evaluating the feasibility of an experiment developed in the European infrastructure Fed4FIRE and its migration to a commercial cloud: SoftLayer, an IBM Company.
SDG and qualitative trend based model multiple scale validation
NASA Astrophysics Data System (ADS)
Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike
2017-09-01
Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, operate at a single scale, and depend on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.
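A minimal sketch of the qualitative-inference step may help: in an SDG, each edge carries a sign, and a deviation introduced at a root node propagates by sign multiplication, yielding the qualitative trend pattern a testing scenario should exhibit. The graph below is a toy example, not the paper's model.

```python
# Minimal sketch of SDG-based scenario generation: propagate a qualitative
# deviation (+1, 0, -1) from a root node through a signed directed graph,
# multiplying edge signs, to produce the expected trend pattern that a
# simulation run can then be checked against. Illustrative only.
from collections import deque

# edges: (source, target, sign); a toy reactor-like model, not the paper's.
EDGES = [("feed", "level", +1), ("level", "outflow", +1),
         ("coolant", "temp", -1), ("temp", "pressure", +1)]

def propagate(root: str, deviation: int) -> dict:
    trends = {root: deviation}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for src, dst, sign in EDGES:
            if src == node and dst not in trends:
                trends[dst] = trends[node] * sign   # qualitative inference
                queue.append(dst)
    return trends

# Positive inference from a "coolant flow drops" scenario:
print(propagate("coolant", -1))   # -> coolant -1, temp +1, pressure +1
```

Comparing such expected trend patterns against simulated outputs, windowed at different time scales, is the essence of the multiple-scale comparison the abstract describes.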
What Makes AS Marking Reliable? An Experiment with Some Stages from the Standardisation Process
ERIC Educational Resources Information Center
Greatorex, Jackie; Bell, John F.
2008-01-01
It is particularly important that GCSE and A-level marking is valid and reliable as it affects the life chances of many young people in England. Current developments in marking technology are coinciding with potential changes in procedures to ensure valid and reliable marking. In this research the effectiveness of procedures to facilitate the…
Improving STEM Program Quality in Out-of-School-Time: Tool Development and Validation
ERIC Educational Resources Information Center
Shah, Ashima Mathur; Wylie, Caroline; Gitomer, Drew; Noam, Gil
2018-01-01
In and out-of-school time (OST) experiences are viewed as complementary in contributing to students' interest, engagement, and performance in science, technology, engineering, and mathematics (STEM). While tools exist to measure quality in general afterschool settings and others to measure structured science classroom experiences, there is a need…
Hanauer, David I.; Bauerle, Cynthia
2015-01-01
Science, technology, engineering, and mathematics education reform efforts have called for widespread adoption of evidence-based teaching in which faculty members attend to student outcomes through assessment practice. Awareness about the importance of assessment has illuminated the need to understand what faculty members know and how they engage with assessment knowledge and practice. The Faculty Self-Reported Assessment Survey (FRAS) is a new instrument for evaluating science faculty assessment knowledge and experience. Instrument validation was composed of two distinct studies: an empirical evaluation of the psychometric properties of the FRAS and a comparative known-groups validation to explore the ability of the FRAS to differentiate levels of faculty assessment experience. The FRAS was found to be highly reliable (α = 0.96). The dimensionality of the instrument enabled distinction of assessment knowledge into categories of program design, instrumentation, and validation. In the known-groups validation, the FRAS distinguished between faculty groups with differing levels of assessment experience. Faculty members with formal assessment experience self-reported higher levels of familiarity with assessment terms, higher frequencies of assessment activity, increased confidence in conducting assessment, and more positive attitudes toward assessment than faculty members who were novices in assessment. These results suggest that the FRAS can reliably and validly differentiate levels of expertise in faculty knowledge of assessment. PMID:25976653
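For context on the reliability figure reported above (α = 0.96), the sketch below computes Cronbach's alpha from a respondents-by-items score matrix. The data are synthetic and the code is illustrative; it does not reproduce the FRAS analysis.

```python
# Sketch of the reliability statistic reported above: Cronbach's alpha for a
# Likert-style survey, computed from an (n_respondents, n_items) matrix.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    k = scores.shape[1]                           # number of items
    item_vars = scores.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical data: 40 faculty respondents, 12 survey items on a 1-5 scale,
# with correlated responses so alpha comes out high.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(40, 1))
responses = np.clip(base + rng.integers(-1, 2, size=(40, 12)), 1, 5)
print(round(cronbach_alpha(responses.astype(float)), 2))
```

Alpha near 1 indicates that the items move together across respondents, which is what "highly reliable" means for an instrument like the FRAS.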
NASA Technical Reports Server (NTRS)
Ku, Jentung; Ottenstein, Laura; Douglas, Donya; Hoang, Triem
2010-01-01
Under NASA's New Millennium Program Space Technology 8 (ST 8) Project, Goddard Space Flight Center has conducted a Thermal Loop experiment to advance the maturity of the Thermal Loop technology from proof of concept to prototype demonstration in a relevant environment, i.e., from a technology readiness level (TRL) of 3 to a level of 6. The Thermal Loop is an advanced thermal control system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers designed for future small system applications requiring low mass, low power, and compactness. The MLHP retains all features of state-of-the-art loop heat pipes (LHPs) and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. An MLHP breadboard was built and tested in the laboratory and thermal vacuum environments for the TRL 4 and TRL 5 validations, respectively, and an MLHP proto-flight unit was built and tested in a thermal vacuum chamber for the TRL 6 validation. In addition, an analytical model was developed to simulate the steady state and transient behaviors of the MLHP during various validation tests. The MLHP demonstrated excellent performance during experimental tests and the analytical model predictions agreed very well with experimental data. All success criteria at various TRLs were met. Hence, the Thermal Loop technology has reached a TRL of 6. This paper presents the validation results, both experimental and analytical, of this technology development effort.
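As a flavor of what such an analytical model does (the actual ST8 model is far more detailed), here is a deliberately simplified, single-node transient sketch: one evaporator with an assumed thermal mass and loop conductance responding to an instrument power step. All parameter values are invented for illustration.

```python
# Highly simplified lumped-parameter transient sketch in the spirit of the
# MLHP analytical model described above; not the ST8 model itself.
C = 500.0        # evaporator thermal mass [J/K] (assumed)
G = 2.5          # evaporator-to-loop conductance [W/K] (assumed)
T_loop = 273.0   # condenser/loop sink temperature [K] (assumed)

def step_power(t: float) -> float:
    return 20.0 if t < 600 else 50.0          # power step at t = 600 s [W]

dt, T = 1.0, 293.0                            # time step [s], initial temp [K]
for i in range(1200):
    Q_in = step_power(i * dt)
    T += dt / C * (Q_in - G * (T - T_loop))   # explicit Euler energy balance
print(f"evaporator temperature after transient: {T:.1f} K")
```

Even this toy version shows the behavior a validation test exercises: a power step drives the evaporator toward a new steady state set by the sink temperature plus power divided by conductance.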
A Hardware Model Validation Tool for Use in Complex Space Systems
NASA Technical Reports Server (NTRS)
Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.
2010-01-01
One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.
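The following sketch illustrates the general pattern the abstract describes, broad parameter exploration followed by automated analysis of failures; it is a stand-in under stated assumptions, not the authors' tool. It samples a 100-dimensional parameter space, runs a placeholder model over thousands of points, and ranks the parameters that best separate passing from failing runs.

```python
# Illustrative sketch: explore a high-dimensional parameter space, flag
# failing runs, and rank which parameters best separate pass from fail.
# The model and ranking heuristic are placeholders, not the paper's method.
import numpy as np

rng = np.random.default_rng(1)
n_params, n_runs = 100, 10_000
X = rng.uniform(0.0, 1.0, size=(n_runs, n_params))   # sampled parameter sets

def model_fails(x: np.ndarray) -> bool:
    # Placeholder system model: fails when two parameters interact badly.
    return x[3] * x[17] > 0.8

fail = np.array([model_fails(x) for x in X])         # automated failure flags

# Rank parameters by separation of means between failing and passing runs;
# irrelevant dimensions wash out, interacting ones (3 and 17) stand out.
separation = np.abs(X[fail].mean(axis=0) - X[~fail].mean(axis=0))
print("most failure-relevant parameters:", np.argsort(separation)[-3:][::-1])
```

The appeal of this style of analysis, as the abstract notes, is that it degrades gracefully with discontinuities and redundant variables: the exploration makes no smoothness assumptions, and the failure analysis simply asks which dimensions matter.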
NASA Technical Reports Server (NTRS)
Knouse, G.; Weber, W.
1985-01-01
A three phase development program for ground and space segment technologies which will enhance and enable the second and third generation mobile satellite systems (MSS) is outlined. Phase 1, called the Mobile Satellite Experiment (MSAT-X), is directed toward the development of ground segment technology needed for future MSS generations. Technology validation and preoperational experiments with other government agencies will be carried out during the two year period following launch. The satellite channel capacity needed to carry out these experiments will be obtained from industry under a barter type agreement in exchange for NASA provided launch services. Phase 2 will develop and flight test the multibeam spacecraft antenna technology needed to obtain substantial frequency reuse for second generation commercial systems. Industry will provide the antenna, and NASA will fly it on the Shuttle and test it in orbit. Phase 3 is similar to Phase 2 but will develop an even larger multibeam antenna and test it on the space station.
Networking Technologies Enable Advances in Earth Science
NASA Technical Reports Server (NTRS)
Johnson, Marjory; Freeman, Kenneth; Gilstrap, Raymond; Beck, Richard
2004-01-01
This paper describes an experiment to prototype a new way of conducting science by applying networking and distributed computing technologies to an Earth Science application. A combination of satellite, wireless, and terrestrial networking provided geologists at a remote field site with interactive access to supercomputer facilities at two NASA centers, thus enabling them to validate and calibrate remotely sensed geological data in near-real time. This represents a fundamental shift in the way that Earth scientists analyze remotely sensed data. In this paper we describe the experiment and the network infrastructure that enabled it, analyze the data flow during the experiment, and discuss the scientific impact of the results.
The NASA/MSFC Coherent Lidar Technology Advisory Team
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.
1999-01-01
The SPAce Readiness Coherent Lidar Experiment (SPARCLE) mission was proposed as a low-cost technology demonstration mission, using a 2-micron, 100-mJ, 6-Hz, 25-cm, coherent lidar system based on demonstrated technology. SPARCLE was selected in late October 1997 to be NASA's New Millennium Program (NMP) second earth-observing (EO-2) mission. To maximize the success probability of SPARCLE, NASA/MSFC desired expert guidance in the areas of coherent laser radar (CLR) theory, CLR wind measurement, fielding of CLR systems, CLR alignment validation, and space lidar experience. This led to the formation of the NASA/MSFC Coherent Lidar Technology Advisory Team (CLTAT) in December 1997. A threefold purpose for the advisory team was identified: 1) guidance to the SPARCLE mission, 2) advice regarding the roadmap of post-SPARCLE coherent Doppler wind lidar (CDWL) space missions and the desired matching technology development plan, and 3) general coherent lidar theory, simulation, hardware, and experiment information exchange. The current membership of the CLTAT is shown. Membership does not result in any NASA or other funding at this time. We envision the business of the CLTAT to be conducted mostly by email, teleconference, and occasional meetings. The three meetings of the CLTAT to date, in Jan. 1998, July 1998, and Jan. 1999, have all been collocated with previously scheduled meetings of the Working Group on Space-Based Lidar Winds. The meetings have been very productive. Topics discussed include the SPARCLE technology validation plan including pre-launch end-to-end testing, the space-based wind mission roadmap beyond SPARCLE and its implications for the resultant technology development, the current values of and proposed future advancement in lidar system efficiency, and the difference between using single-mode fiber optical mixing vs. the traditional free-space optical mixing.
The Space Technology-7 Disturbance Reduction Systems
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Hsu, Oscar C.; Hanson, John; Hruby, Vlad
2004-01-01
The Space Technology 7 Disturbance Reduction System (DRS) is an in-space technology demonstration designed to validate technologies that are required for future missions such as the Laser Interferometer Space Antenna (LISA) and the Micro-Arcsecond X-ray Imaging Mission (MAXIM). The primary sensors that will be used by DRS are two Gravitational Reference Sensors (GRSs) being developed by Stanford University. DRS will control the spacecraft so that it flies about one of the freely floating Gravitational Reference Sensor test masses, keeping it centered within its housing. The other GRS serves as a cross-reference for the first, as well as a reference for the spacecraft's attitude control. Colloidal MicroNewton Thrusters being developed by the Busek Co. will be used to control the spacecraft's position and attitude using a six degree-of-freedom Dynamic Control System being developed by Goddard Space Flight Center. A laser interferometer being built by the Jet Propulsion Laboratory will be used to help validate the results of the experiment. The DRS will be launched in 2008 on the European Space Agency (ESA) LISA Pathfinder spacecraft along with a similar ESA experiment, the LISA Test Package.
ACTS Ka-Band Earth Stations: Technology, Performance, and Lessons Learned
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Struharik, Steven J.; Diamond, John J.; Stewart, David
2000-01-01
The Advanced Communications Technology Satellite (ACTS) Project invested heavily in prototype Ka-band satellite ground terminals to conduct an experiments program with ACTS. The ACTS experiments program proposed to validate Ka-band satellite and ground-station technology, demonstrate future telecommunication services, demonstrate commercial viability and market acceptability of these new services, evaluate system networking and processing technology, and characterize Ka-band propagation effects, including development of techniques to mitigate signal fading. This paper will present a summary of the fixed ground terminals developed by the NASA Glenn Research Center and its industry partners, emphasizing the technology and performance of the terminals and the lessons learned throughout their 6-year operation, including the inclined-orbit phase of operations. The fixed ground stations used for experiments by government, academic, and commercial entities used reflector-based offset-fed antenna systems with antennas ranging in size from 0.35 to 3.4 m in diameter. Gateway earth stations included two systems referred to as the NASA Ground Station (NGS) and the Link Evaluation Terminal (LET).
Optimization of EGFR high positive cell isolation procedure by design of experiments methodology.
Levi, Ofer; Tal, Baruch; Hileli, Sagi; Shapira, Assaf; Benhar, Itai; Grabov, Pavel; Eliaz, Noam
2015-01-01
Circulating tumor cells (CTCs) in the blood circulation may play a role in monitoring and even in the early detection of metastasis. Due to the limited presence of CTCs in the blood circulation, a viable CTC isolation technology must provide a very high recovery rate. Here, we implement design of experiments (DOE) methodology in order to optimize the Bio-Ferrography (BF) immunomagnetic isolation (IMI) procedure for the EGFR high positive CTC application. All consequent DOE phases, such as screening design, optimization experiments, and validation experiments, were used. A significant recovery rate of more than 95% was achieved while isolating 100 EGFR high positive CTCs from 1 mL of human whole blood. The recovery achievement in this research positions BF technology as one of the most efficient IMI technologies, which is ready to be challenged with patients' blood samples. © 2015 International Clinical Cytometry Society.
COFS 1 Guest Investigator Program
NASA Technical Reports Server (NTRS)
Fontana, Anthony; Wright, Robert L.
1986-01-01
The process for selecting guest investigators for participation in the Control of Flexible Structures (COFS)-1 program is described. Contracts and grants will be awarded in late CY87. A straw-man list of types of experiments and a distribution of the experiments have been defined to initiate the definition of an experiments package which supports development and validation of control-structures interaction technology. A schedule of guest investigator participation has been developed.
Propulsion Risk Reduction Activities for Non-Toxic Cryogenic Propulsion
NASA Technical Reports Server (NTRS)
Smith, Timothy D.; Klem, Mark D.; Fisher, Kenneth
2010-01-01
The Propulsion and Cryogenics Advanced Development (PCAD) Project's primary objective is to develop propulsion system technologies for non-toxic or "green" propellants. The PCAD project focuses on the development of non-toxic propulsion technologies needed to provide necessary data and relevant experience to support informed decisions on implementation of non-toxic propellants for space missions. Implementation of non-toxic propellants in high-performance propulsion systems offers NASA an opportunity to consider options other than current hypergolic propellants. The PCAD Project is emphasizing technology efforts in reaction control system (RCS) thruster designs, ascent main engines (AME), and descent main engines (DME). PCAD has a series of tasks and contracts to conduct risk reduction and/or retirement activities to demonstrate that non-toxic cryogenic propellants can be a feasible option for space missions. Work has focused on 1) reducing the risk of liquid oxygen/liquid methane ignition, demonstrating the key enabling technologies, and validating performance levels for reaction control engines for use on descent and ascent stages; 2) demonstrating the key enabling technologies and validating performance levels for liquid oxygen/liquid methane ascent engines; and 3) demonstrating the key enabling technologies and validating performance levels for deep-throttling liquid oxygen/liquid hydrogen descent engines. The progress of these risk reduction and/or retirement activities will be presented.
Disturbance Reduction Control Design for the ST7 Flight Validation Experiment
NASA Technical Reports Server (NTRS)
Maghami, P. G.; Hsu, O. C.; Markley, F. L.; Houghton, M. B.
2003-01-01
The Space Technology 7 experiment will perform an on-orbit system-level validation of two specific Disturbance Reduction System technologies: a gravitational reference sensor employing a free-floating test mass, and a set of micro-Newton colloidal thrusters. The ST7 Disturbance Reduction System is designed to maintain the spacecraft's position with respect to a free-floating test mass to less than 10 nm/√Hz over the frequency range of 1 to 30 mHz. This paper presents the design and analysis of the coupled drag-free and attitude control systems that close the loop between the gravitational reference sensor and the micro-Newton thrusters, while incorporating star tracker data at low frequencies. A full 18 degree-of-freedom model, which incorporates rigid-body models of the spacecraft and two test masses, is used to evaluate the effects of actuation and measurement noise and disturbances on the performance of the drag-free system.
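As a much-reduced illustration of the drag-free principle described here (thrust the spacecraft so it stays centered on the free-floating test mass), below is a hypothetical single-axis proportional-derivative loop; the masses, gains, noise level, and time step are invented for the sketch and are not ST7 values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-axis parameters (not ST7 flight values)
m_sc, dt = 400.0, 0.1           # spacecraft mass [kg], control step [s]
kp, kd = 0.05, 10.0             # PD gains [N/m], [N/(m/s)]

x_sc, v_sc = 0.0, 0.0           # spacecraft position [m] and velocity [m/s]
x_tm = 50e-9                    # test-mass position in its housing [m]

for _ in range(10_000):
    gap = x_sc - x_tm                        # sensed gap to the test mass
    thrust = -kp * gap - kd * v_sc           # micro-thruster force command
    noise = rng.normal(0.0, 1e-9)            # hypothetical disturbance force [N]
    v_sc += (thrust + noise) / m_sc * dt     # integrate spacecraft dynamics
    x_sc += v_sc * dt

print(f"residual gap after {10_000 * dt:.0f} s: {abs(x_sc - x_tm):.1e} m")
```

The flight system replaces this toy loop with a full 18-degree-of-freedom design that also blends in star tracker attitude data at low frequencies.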
LWS/SET Technology Experiment Carrier
NASA Technical Reports Server (NTRS)
Sherman, Barry; Giffin, Geoff
2002-01-01
This paper examines the approach taken to building a low-cost, modular spacecraft bus that can be used to support a variety of technology experiments in different space environments. It describes the techniques used and design drivers considered to ensure experiment independence from as-yet-unselected host spacecraft. It describes the technology experiment carriers that will support NASA's Living With a Star Space Environment Testbed space missions. NASA has initiated the Living With a Star (LWS) Program to develop the scientific understanding needed to address the aspects of the connected Sun-Earth system that affect life and society. A principal goal of the program is to bridge the gap between the science, engineering, and user application communities. The Space Environment Testbed (SET) Project is one element of LWS. The Project will enable future science, operational, and commercial objectives in space and atmospheric environments by improving engineering approaches to the accommodation and/or mitigation of the effects of solar variability on technological systems. The SET Project is highly budget-constrained and must seek to take advantage of as-yet-undetermined partnering opportunities for access to space. SET will conduct technology validation experiments hosted on available flight opportunities. The SET Testbeds will be developed in a manner that minimizes the requirements for accommodation, and will be flown as flight opportunities become available. To access the widest range of flight opportunities, two key development requirements are to maintain flexibility with respect to accommodation constraints and to have the capability to respond quickly to flight opportunities. Experiments, already developed to the technology readiness level of needing flight validation in the variable Sun-Earth environment, will be selected on the basis of the need for the subject technology, readiness for flight, need for flight resources, and particular orbit. Experiments will be accumulated by the Project and manifested for specific flight opportunities as they become available. The SET Carrier is designed to present a standard set of interfaces to SET technology experiments and to be modular and flexible enough to interface to a variety of possible host spacecraft. The Carrier will have core components and mission-unique components. Once the core carrier elements have been developed, only the mission-unique components need to be defined and developed for any particular mission. This approach will minimize the mission-specific cost and development schedule for a given flight opportunity. The standard set of interfaces provided by SET to experiments allows them to be developed independent of the particulars of a host spacecraft. The Carrier will provide the power, communication, and necessary monitoring features to operate experiments. The Carrier will also provide all of the mechanical assemblies and harnesses required to adapt experiments to a particular host. Experiments may be hosted locally with the Carrier or remotely on the host spacecraft. The Carrier design will allow a single Carrier to support a variable number of experiments and will include features that support the ability to incrementally add experiments without disturbing the core architecture.
A Validation Framework for the Long Term Preservation of High Energy Physics Data
NASA Astrophysics Data System (ADS)
Ozerov, Dmitri; South, David M.
2014-06-01
The study group on data preservation in high energy physics, DPHEP, is moving to a new collaboration structure, which will focus on the implementation of preservation projects, such as those described in the group's large-scale report published in 2012. One such project is the development of a validation framework, which checks the compatibility of evolving computing environments and technologies with the experiments' software for as long as possible, with the aim of substantially extending the lifetime of the analysis software, and hence of the usability of the data. The framework is designed to automatically test and validate the software and data of an experiment against changes and upgrades to the computing environment, as well as changes to the experiment software itself. Technically, this is realised using a framework capable of hosting a number of virtual machine images, built with different configurations of operating systems and the relevant software, including any necessary external dependencies.
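Conceptually, such a framework is a matrix job runner: for every (OS image, software release) pair, boot the image, run the experiment's validation suite, and compare the output against a frozen reference. A rough sketch of that control logic follows; the image names, the `vm-run` launcher, and the byte-identical comparison rule are all hypothetical placeholders, not the DPHEP implementation:

```python
import itertools
import subprocess

# Hypothetical VM images and experiment software releases
os_images = ["sl6-base", "centos7-base", "alma9-base"]
sw_releases = ["analysis-2012", "analysis-head"]
REFERENCE = "reference_histograms.txt"     # frozen known-good output

def run_in_vm(image: str, release: str) -> str:
    """Boot the image, run the validation suite, return its output file."""
    out = f"out_{image}_{release}.txt"
    subprocess.run(
        ["vm-run", "--image", image,        # placeholder VM launcher command
         "--", "run-validation", "--release", release, "--out", out],
        check=True,
    )
    return out

def matches_reference(path: str) -> bool:
    """Crude comparison: byte-identical to the frozen reference output."""
    with open(path, "rb") as f, open(REFERENCE, "rb") as ref:
        return f.read() == ref.read()

for image, release in itertools.product(os_images, sw_releases):
    out = run_in_vm(image, release)
    status = "OK" if matches_reference(out) else "REGRESSION"
    print(f"{image:14s} {release:14s} {status}")
```

A real deployment would replace byte comparison with physics-aware checks (e.g., histogram comparisons within tolerances), since bit-level reproducibility across platforms is rarely achievable.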
Utilization of sounding rockets and balloons in the German Space Programme
NASA Astrophysics Data System (ADS)
Preu, Peter; Friker, Achim; Frings, Wolfgang; Püttmann, Norbert
2005-08-01
Sounding rockets and balloons are important tools of Germany's Space Programme. DLR manages these activities and promotes scientific experiments and validation programmes within (1) Space Science, (2) Earth Observation, (3) Microgravity Research and (4) Re-entry Technologies (SHEFEX). In Space Science the present focus is on atmospheric research. Concerning Earth Observation, balloon-borne measurements play a key role in the validation of atmospheric satellite sounders (ENVISAT). TEXUS and MAXUS sounding rockets are successfully used for short-duration microgravity experiments. The Sharp Edge Flight Experiment SHEFEX will deliver data from a hypersonic flight for the validation of a new Thermal Protection System (TPS), wind tunnel testing, and numerical analysis of aerothermodynamics. By signing the Revised Esrange and Andøya Special Project (EASP) Agreement 2006-2010 in June 2004, Germany has made an essential contribution to the long-term availability of the Scandinavian ranges for the European science community.
Lee, Jung Jae; Clarke, Charlotte L
2015-05-01
The aim of this study was to develop and psychometrically test a shortened version of the Information Technology Attitude Scales for Health in an investigation of nursing students with clinical placement experience. Nurses and nursing students need to develop high levels of competency in information and communication technology, yet they encounter significant barriers in its use. Although some instruments have been developed to measure factors that influence nurses' attitudes towards technology, their validity is questionable, and few studies have tested the attitudes of nursing students in particular. A cross-sectional survey design was used. The Information Technology Attitude Scales for Health was used to collect data from October 2012-December 2012. A panel of experts reviewed the content of the instrument and a pilot study was conducted. Following this, a total of 508 nursing students, who were engaged in clinical placements, were recruited from six universities in South Korea. Exploratory and confirmatory factor analyses were performed, and reliability and construct validity were assessed. The resulting instrument consisted of 19 items across four factors. Reliability of the four factors was acceptable and the validity was supported. The instrument was shown to be both valid and reliable for measuring nursing students' attitudes towards technology. Through such measurement, nursing educators and students can be more reflexive about their attitudes and can thus seek to develop them positively. © 2015 John Wiley & Sons Ltd.
CFD Validation with Experiment and Verification with Physics of a Propellant Damping Device
NASA Technical Reports Server (NTRS)
Yang, H. Q.; Peugeot, John
2011-01-01
This paper will document our effort in validating a coupled fluid-structure interaction CFD tool for predicting the performance of a damping device under laboratory conditions. Consistently good comparisons of "blind" CFD predictions against experimental data under various operating conditions, design parameters, and cryogenic environments will be presented. The power of the coupled CFD-structure interaction code in explaining some unexpected phenomena of the device observed during the technology development will be illustrated. The evolution of the damper device design inside the LOX tank will be used to demonstrate the contribution of the tool to the understanding, optimization, and implementation of the LOX damper in the Ares I vehicle. Owing to the present validation effort, the LOX damper technology has matured to TRL 5. The present effort has also contributed to the transition of the technology from an early conceptual observation to the baseline design for thrust oscillation mitigation for the Ares I within a 10-month period.
Status of the Combustion Devices Injector Technology Program at the NASA MSFC
NASA Technical Reports Server (NTRS)
Jones, Gregg; Protz, Christopher; Trinh, Huu; Tucker, Kevin; Nesman, Tomas; Hulka, James
2005-01-01
To support the NASA Space Exploration Mission, an in-house program called Combustion Devices Injector Technology (CDIT) is being conducted at the NASA Marshall Space Flight Center (MSFC) for fiscal year 2005. CDIT is focused on developing combustor technology and analysis tools to improve the reliability and durability of upper-stage and in-space liquid propellant rocket engines. The three areas of focus are injector/chamber thermal compatibility, ignition, and combustion stability. In the compatibility and ignition areas, small-scale single- and multi-element hardware experiments will be conducted to demonstrate advanced technological concepts as well as to provide experimental data for validation of computational analysis tools. In addition, advanced analysis tools will be developed to eventually include 3-dimensional and multi-element effects and to improve the capability and validity of heat transfer and ignition analyses for large, multi-element injectors.
Development of a verification program for deployable truss advanced technology
NASA Technical Reports Server (NTRS)
Dyer, Jack E.
1988-01-01
Use of large deployable space structures to satisfy the growth demands of space systems is contingent upon reducing the associated risks that pervade many related technical disciplines. The overall objectives of this program were to develop a detailed plan to verify deployable truss advanced technology applicable to future large space structures and to develop a preliminary design of a deployable truss reflector/beam structure for use as a technology demonstration test article. The planning is based on a Shuttle flight experiment program using deployable 5- and 15-meter-aperture tetrahedral truss reflectors and a 20-m-long deployable truss beam structure. The plan addresses validation of analytical methods, the degree to which ground testing adequately simulates flight, and in-space testing requirements for large precision antenna designs. Based on an assessment of future NASA and DOD space system requirements, the program was developed to verify four critical technology areas: deployment, shape accuracy and control, pointing and alignment, and articulation and maneuvers. The flight experiment technology verification objectives can be met using two Shuttle flights with the total experiment integrated on a single Shuttle Test Experiment Platform (STEP) and a Mission Peculiar Experiment Support Structure (MPESS). First flight of the experiment can be achieved 60 months after go-ahead, with a total program duration of 90 months.
From Research to Flight: Surviving the TRL Valley of Death for Robotic and Human Space Exploration
NASA Technical Reports Server (NTRS)
Johnson, Les
2009-01-01
There must be a plan or opportunities for flight validation: a) To reduce the bottleneck of new technologies at the TRL Valley of Death; b) To allow frequent infusion of new technologies into flight missions. Risk must be tolerated for new technology flight experiments. Risk must also be accepted on early-adopting missions to enable new capabilities. Fundamental research is critical to taking the next giant leap in the scientific exploration of space. Technology push is often required to meet current mission requirements. Technology management requires more than issuing NRAs and overseeing contracts.
Large-scale experimental technology with remote sensing in land surface hydrology and meteorology
NASA Technical Reports Server (NTRS)
Brutsaert, Wilfried; Schmugge, Thomas J.; Sellers, Piers J.; Hall, Forrest G.
1988-01-01
Two field experiments to study atmospheric and land surface processes and their interactions are summarized. The Hydrologic-Atmospheric Pilot Experiment, which tested techniques for measuring evaporation, soil moisture storage, and runoff at scales of about 100 km, was conducted over a 100 X 100 km area in France from mid-1985 to early 1987. The first International Satellite Land Surface Climatology Program field experiment was conducted in 1987 to develop and use relationships between current satellite measurements and hydrologic, climatic, and biophysical variables at the earth's surface and to validate these relationships with ground truth. This experiment also validated surface parameterization methods for simulation models that describe surface processes from the scale of vegetation leaves up to scales appropriate to satellite remote sensing.
Validation of a unique concept for a low-cost, lightweight space-deployable antenna structure
NASA Technical Reports Server (NTRS)
Freeland, R. E.; Bilyeu, G. D.; Veal, G. R.
1993-01-01
An experiment conducted in the framework of the NASA In-Space Technology Experiments Program, based on a concept of inflatable deployable structures, is described. The concept utilizes very low inflation pressure to maintain the required geometry on orbit; gravity-induced deflection of the structure precludes any meaningful ground-based demonstration of functional performance. The experiment is aimed at validating and characterizing the mechanical functional performance of a 14-m-diameter inflatable deployable reflector antenna structure in the orbital operational environment. Results of the experiment are expected to significantly reduce the user risk associated with using large space-deployable antennas by demonstrating the functional performance of a concept that meets the criteria for low-cost, lightweight, and highly reliable space-deployable structures.
Experimental Validation: Subscale Aircraft Ground Facilities and Integrated Test Capability
NASA Technical Reports Server (NTRS)
Bailey, Roger M.; Hostetler, Robert W., Jr.; Barnes, Kevin N.; Belcastro, Celeste M.; Belcastro, Christine M.
2005-01-01
Experimental testing is an important aspect of validating complex integrated safety critical aircraft technologies. The Airborne Subscale Transport Aircraft Research (AirSTAR) Testbed is being developed at NASA Langley to validate technologies under conditions that cannot be flight validated with full-scale vehicles. The AirSTAR capability comprises a series of flying sub-scale models, associated ground-support equipment, and a base research station at NASA Langley. The subscale model capability utilizes a generic 5.5% scaled transport class vehicle known as the Generic Transport Model (GTM). The AirSTAR Ground Facilities encompass the hardware and software infrastructure necessary to provide comprehensive support services for the GTM testbed. The ground facilities support remote piloting of the GTM aircraft, and include all subsystems required for data/video telemetry, experimental flight control algorithm implementation and evaluation, GTM simulation, data recording/archiving, and audio communications. The ground facilities include a self-contained, motorized vehicle serving as a mobile research command/operations center, capable of deployment to remote sites when conducting GTM flight experiments. The ground facilities also include a laboratory based at NASA LaRC providing near identical capabilities as the mobile command/operations center, as well as the capability to receive data/video/audio from, and send data/audio to the mobile command/operations center during GTM flight experiments.
CANYVAL-X: CubeSat Astronomy by NASA and Yonsei Using Virtual Telescope Alignment Experiment
NASA Technical Reports Server (NTRS)
Shah, Neerav
2016-01-01
CANYVAL-X is a technology demonstration CubeSat mission with a primary objective of validating technologies that allow two spacecraft to fly in formation along an inertial line-of-sight (i.e., align two spacecraft to an inertial source). Demonstration of precision dual-spacecraft alignment achieving fine angular precision enables a variety of cutting-edge heliophysics and astrophysics science.
Disruption Tolerant Networking Flight Validation Experiment on NASA's EPOXI Mission
NASA Technical Reports Server (NTRS)
Wyatt, Jay; Burleigh, Scott; Jones, Ross; Torgerson, Leigh; Wissler, Steve
2009-01-01
In October and November of 2008, the Jet Propulsion Laboratory installed and tested essential elements of Delay/Disruption Tolerant Networking (DTN) technology on the Deep Impact spacecraft. This experiment, called Deep Impact Network Experiment (DINET), was performed in close cooperation with the EPOXI project which has responsibility for the spacecraft. During DINET some 300 images were transmitted from the JPL nodes to the spacecraft. Then they were automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. All transmitted bundles were successfully received, without corruption. The DINET experiment demonstrated DTN readiness for operational use in space missions. This activity was part of a larger NASA space DTN development program to mature DTN to flight readiness for a wide variety of mission types by the end of 2011. This paper describes the DTN protocols, the flight demo implementation, validation metrics which were created for the experiment, and validation results.
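The custody-transfer and automatic-retransmission behavior exercised by DINET can be illustrated generically: a node keeps a copy of each bundle until the next hop acknowledges custody, retransmitting on timeout. The sketch below is an illustration of that idea only, not the actual DTN/ION software; the class names and timeout are invented:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Bundle:
    bundle_id: int
    payload: bytes
    sent_at: float = field(default=0.0)   # time of last transmission

class CustodyNode:
    """Toy store-and-forward node: retains each bundle until the next
    hop accepts custody, retransmitting automatically on timeout."""
    RETRANSMIT_AFTER = 5.0                # hypothetical timeout [s]

    def __init__(self, link):
        self.link = link                  # function that transmits a bundle
        self.custody = {}                 # bundle_id -> Bundle still in custody

    def originate(self, bundle):
        self.custody[bundle.bundle_id] = bundle
        self._send(bundle)

    def _send(self, bundle):
        bundle.sent_at = time.monotonic()
        self.link(bundle)

    def on_custody_accepted(self, bundle_id):
        self.custody.pop(bundle_id, None) # downstream node now has custody

    def tick(self):
        now = time.monotonic()
        for bundle in self.custody.values():
            if now - bundle.sent_at > self.RETRANSMIT_AFTER:
                self._send(bundle)        # automatic retransmission

node = CustodyNode(link=lambda b: print(f"tx bundle {b.bundle_id}"))
node.originate(Bundle(1, b"image-data"))  # prints "tx bundle 1"
node.tick()                               # retransmits once the timeout lapses
```

The real protocol adds dynamic route computation, prioritization, and congestion control on top of this store-and-forward core, which is what allowed all 300 DINET images to arrive uncorrupted over a 27-day run.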
Validation of a wireless modular monitoring system for structures
NASA Astrophysics Data System (ADS)
Lynch, Jerome P.; Law, Kincho H.; Kiremidjian, Anne S.; Carryer, John E.; Kenny, Thomas W.; Partridge, Aaron; Sundararajan, Arvind
2002-06-01
A wireless sensing unit for use in a Wireless Modular Monitoring System (WiMMS) has been designed and constructed. Drawing upon advanced technological developments in the areas of wireless communications, low-power microprocessors and micro-electro mechanical system (MEMS) sensing transducers, the wireless sensing unit represents a high-performance yet low-cost solution to monitoring the short-term and long-term performance of structures. A sophisticated reduced instruction set computer (RISC) microcontroller is placed at the core of the unit to accommodate on-board computations, measurement filtering and data interrogation algorithms. The functionality of the wireless sensing unit is validated through various experiments involving multiple sensing transducers interfaced to the sensing unit. In particular, MEMS-based accelerometers are used as the primary sensing transducer in this study's validation experiments. A five degree of freedom scaled test structure mounted upon a shaking table is employed for system validation.
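As a flavor of the on-board measurement filtering such a microcontroller might perform before transmission, here is a minimal streaming moving-average filter over accelerometer samples; the window length and sample values are hypothetical:

```python
from collections import deque

class MovingAverage:
    """Fixed-window streaming average, cheap enough for a microcontroller."""
    def __init__(self, window: int):
        self.buf = deque(maxlen=window)

    def update(self, sample: float) -> float:
        self.buf.append(sample)
        return sum(self.buf) / len(self.buf)

# Hypothetical raw MEMS accelerometer samples [g] with sensor noise
raw = [0.01, -0.02, 0.15, 0.12, 0.14, -0.01, 0.0, 0.02]
filt = MovingAverage(window=4)
smoothed = [filt.update(s) for s in raw]
print(smoothed)
```

Doing even this simple smoothing at the sensing unit reduces the volume of data that must cross the power-hungry wireless link, which is the design rationale for placing a capable microcontroller at the core of the unit.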
Flight Testing of an Airport Surface Guidance, Navigation, and Control System
NASA Technical Reports Server (NTRS)
Young, Steven D.; Jones, Denise R.
1998-01-01
This document describes operations associated with a set of flight experiments and demonstrations using a Boeing 757-200 (B-757) research aircraft as part of low visibility landing and surface operations (LVLASO) research activities. To support this experiment, the B-757 performed flight and taxi operations at the Hartsfield-Atlanta International Airport (ATL) in Atlanta, GA. The B-757 was equipped with experimental displays that were designed to provide flight crews with sufficient information to enable safe, expedient surface operations in any weather condition down to a runway visual range (RVR) of 300 feet. In addition to the flight deck displays and supporting equipment onboard the B-757, there was also a ground-based component of the system that provided for ground controller inputs and surveillance of airport surface movements. The integrated ground and airborne components resulted in a system that has the potential to significantly improve the safety and efficiency of airport surface movements, particularly as weather conditions deteriorate. Several advanced technologies were employed to show the validity of the operational concept at a major airport facility, to validate flight simulation findings, and to assess the performance of each individual technology in an airport environment. Results show that while the maturity of some of the technologies does not permit immediate implementation, the operational concept is valid and the performance is more than adequate in many areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, J.D.
1994-12-31
Technical developments on the neutral particle beam (NPB) program over a period of 18 years led to significant developments in accelerator technology. Many of these state-of-the-art technologies were integrated into the Ground Test Accelerator (GTA). GTA beam experiments were completed on components and systems that included the ion source through the low-energy DTL modules. Provisions for beam funneling, matching, cryogenic (20 K) operation, and detailed transverse and longitudinal beam characterization, combined with state-of-the-art accelerator and rf controls, made this GTA system unique. The authors summarize the types and magnitudes of these technology advances, which culminated in the fabrication of the 24 MeV front end of the GTA. A number of highly instrumented beam experiments at several stages validated the innovative designs. Applications of GTA-developed technology to several new accelerators will highlight the practical benefits of the GTA technology integration.
NASA Technical Reports Server (NTRS)
Jenkins, Phillip P.; Krasowski, Michael J.; Greer, Lawrence C.; Flatico, Joseph M.
2005-01-01
The Forward Technology Solar Cell Experiment (FTSCE) is a space solar cell experiment built as part of the Fifth Materials on the International Space Station Experiment (MISSE-5); this report covers the experiment's data acquisition and control hardware and software. It represents a collaborative effort between the NASA Glenn Research Center, the Naval Research Laboratory, and the U.S. Naval Academy. The purpose of this experiment is to place current and future solar cell technologies on orbit where they will be characterized and validated. This is in response to recent on-orbit and ground test results that raised concerns about the in-space survivability of new solar cell technologies and about current ground test methodology. The various components of the FTSCE are assembled into a passive experiment container, a 2-ft by 2-ft by 4-in. folding metal container that will be attached by an astronaut to the outer structure of the International Space Station. Data collected by the FTSCE will be relayed to the ground through a transmitter assembled by the U.S. Naval Academy. Data-acquisition electronics and software were designed to be tolerant of the thermal and radiation effects expected on orbit. The experiment has been verified and readied for flight on STS-114.
The Role of Empirical Evidence for Transferring a New Technology to Industry
NASA Astrophysics Data System (ADS)
Baldassarre, Maria Teresa; Bruno, Giovanni; Caivano, Danilo; Visaggio, Giuseppe
Technology transfer and innovation diffusion are key success factors for an enterprise. The shift to a new software technology involves, on one hand, inevitable changes to ingrained and familiar processes and, on the other, requires training, changes in practices, and commitment on behalf of technical staff and management. Nevertheless, industry is often reluctant to adopt innovations because of the changes they entail. The process of innovation diffusion is easier if the new technology is supported by empirical evidence. In this sense, our conjecture is that Empirical Software Engineering (ESE) serves as a means for validating and transferring a new technology into production processes. In this paper, the authors report their experience with a method, the Multiview Framework, defined in the SERLAB research laboratory as a support for designing and managing a goal-oriented measurement program, which was validated through various empirical studies before being transferred to an Italian SME. Our discussion points out the important role of empirical evidence in obtaining management commitment and buy-in from technical staff, and in making technology transfer possible.
Fuel-Air Mixing and Combustion in Scramjets
NASA Technical Reports Server (NTRS)
Drummond, J. P.; Diskin, Glenn S.; Cutler, A. D.
2002-01-01
Activities in the area of scramjet fuel-air mixing and combustion associated with the Research and Technology Organization Working Group on Technologies for Propelled Hypersonic Flight are described. Work discussed in this paper has centered on the design of two basic experiments for studying the mixing and combustion of fuel and air in a scramjet. Simulations were conducted to aid in the design of these experiments. The experimental models were then constructed, and data were collected in the laboratory. Comparison of the data from a coaxial jet mixing experiment and a supersonic combustor experiment with a combustor code were then made and described. This work was conducted by NATO to validate combustion codes currently employed in scramjet design and to aid in the development of improved turbulence and combustion models employed by the codes.
Applications of CRISPR genome editing technology in drug target identification and validation.
Lu, Quinn; Livi, George P; Modha, Sundip; Yusa, Kosuke; Macarrón, Ricardo; Dow, David J
2017-06-01
The analysis of pharmaceutical industry data indicates that the major reason for drug candidates failing in late stage clinical development is lack of efficacy, with a high proportion of these due to erroneous hypotheses about target to disease linkage. More than ever, there is a requirement to better understand potential new drug targets and their role in disease biology in order to reduce attrition in drug development. Genome editing technology enables precise modification of individual protein coding genes, as well as noncoding regulatory sequences, enabling the elucidation of functional effects in human disease relevant cellular systems. Areas covered: This article outlines applications of CRISPR genome editing technology in target identification and target validation studies. Expert opinion: Applications of CRISPR technology in target validation studies are in evidence and gaining momentum. Whilst technical challenges remain, we are on the cusp of CRISPR being applied in complex cell systems such as iPS derived differentiated cells and stem cell derived organoids. In the meantime, our experience to date suggests that precise genome editing of putative targets in primary cell systems is possible, offering more human disease relevant systems than conventional cell lines.
Kilopower: Small and Affordable Fission Power Systems for Space
NASA Technical Reports Server (NTRS)
Mason, Lee; Palac, Don; Gibson, Marc
2017-01-01
The Nuclear Systems Kilopower Project was initiated by NASA's Space Technology Mission Directorate Game Changing Development Program in fiscal year 2015 to demonstrate subsystem-level technology readiness of small space fission power in a relevant environment (Technology Readiness Level 5) for space science and human exploration power needs. The Nuclear Systems Kilopower Project centerpiece is the Kilopower Reactor Using Stirling Technology (KRUSTY) test, which consists of the development and testing of a fission ground technology demonstrator of a 1 kWe-class fission power system. The technologies to be developed and validated by KRUSTY are extensible to space fission power systems from 1 to 10 kWe, which can enable higher-power deep space science missions as well as modular surface fission power systems for exploration. The Kilopower Project is cofunded by NASA and the Department of Energy National Nuclear Security Administration (NNSA). KRUSTY includes the reactor core, heat pipes to transfer heat from the core to the power conversion system, and the power conversion system itself. Los Alamos National Laboratory leads the design of the reactor, and the Y-12 National Security Complex is fabricating it. NASA Glenn Research Center (GRC) has designed, built, and demonstrated the balance-of-plant heat transfer and power conversion portions of the KRUSTY experiment. NASA MSFC developed an electrical reactor simulator for non-nuclear testing, and the design of the reflector and shielding for nuclear testing. In 2016, an electrically heated, non-fissionable depleted uranium (DU) core was tested at GRC in a configuration identical to the planned nuclear test. Once the reactor core has been fabricated and shipped to the Device Assembly Facility at the NNSA's Nevada National Security Site, the KRUSTY nuclear experiment will be assembled and tested. Completion of the KRUSTY experiment will validate the readiness of 1 to 10 kWe space fission technology for NASA's future requirements for sunlight-independent space power. An early opportunity for demonstration of In-Situ Resource Utilization (ISRU) capability on the surface of Mars is currently being considered for a 2026 launch. Since a space fission system is the leading option for power generation for the first Mars human outpost, a smaller version of a planetary surface fission power system could be built to power the ISRU demonstration and ensure its end-to-end validity. Planning is underway to start hardware development of this subscale flight demonstrator in 2018.
Electrical Capacitance Volume Tomography for the Packed Bed Reactor ISS Flight Experiment
NASA Technical Reports Server (NTRS)
Marashdeh, Qussai; Motil, Brian; Wang, Aining; Liang-Shih, Fan
2013-01-01
Fixed packed bed reactors are compact, require minimum power and maintenance to operate, and are highly reliable. These features make this technology a highly desirable unit operation for long duration life support systems in space. NASA is developing an ISS experiment to address this technology with particular focus on water reclamation and air revitalization. Earlier research and development efforts funded by NASA have resulted in two hydrodynamic models which require validation with appropriate instrumentation in an extended microgravity environment. To validate these models, the instantaneous distribution of the gas and liquid phases must be measured.Electrical Capacitance Volume Tomography (ECVT) is a non-invasive imaging technology recently developed for multi-phase flow applications. It is based on distributing flexible capacitance plates on the peripheral of a flow column and collecting real-time measurements of inter-electrode capacitances. Capacitance measurements here are directly related to dielectric constant distribution, a physical property that is also related to material distribution in the imaging domain. Reconstruction algorithms are employed to map volume images of dielectric distribution in the imaging domain, which is in turn related to phase distribution. ECVT is suitable for imaging interacting materials of different dielectric constants, typical in multi-phase flow systems. ECVT is being used extensively for measuring flow variables in various gas-liquid and gas-solid flow systems. Recent application of ECVT include flows in risers and exit regions of circulating fluidized beds, gas-liquid and gas-solid bubble columns, trickle beds, and slurry bubble columns. ECVT is also used to validate flow models and CFD simulations. The technology is uniquely qualified for imaging phase concentrations in packed bed reactors for the ISS flight experiments as it exhibits favorable features of compact size, low profile sensors, high imaging speed, and flexibility to fit around columns of various shapes and sizes. ECVT is also safer than other commonly used imaging modalities as it operates in the range of low frequencies (1 MHz) and does not radiate radioactive energy. In this effort, ECVT is being used to image flow parameters in a packed bed reactor for an ISS flight experiment.
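The reconstruction step mentioned above can be illustrated with the simplest common algorithm, linear back-projection, in which normalized capacitance measurements are smeared back over the image through the transpose of a sensitivity matrix. In the toy sketch below the sensitivity matrix is random; in a real ECVT system it comes from an electrostatic model of the sensor:

```python
import numpy as np

rng = np.random.default_rng(1)

n_pairs, n_voxels = 66, 512              # e.g., 12 electrodes -> 66 pairs
S = rng.random((n_pairs, n_voxels))      # sensitivity matrix (from a field model)
g_true = (rng.random(n_voxels) > 0.8).astype(float)   # toy phase distribution

c = S @ g_true                           # simulated capacitance measurements

# Linear back-projection: g ~ S^T c, normalized by the summed sensitivities
g_lbp = (S.T @ c) / S.sum(axis=0)
g_lbp = (g_lbp - g_lbp.min()) / np.ptp(g_lbp)   # rescale to [0, 1]
print(f"correlation with true distribution: {np.corrcoef(g_lbp, g_true)[0, 1]:.2f}")
```

Back-projection trades accuracy for speed, which suits the high imaging rates quoted for ECVT; iterative methods can sharpen the image offline when needed.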
NASA Technical Reports Server (NTRS)
Whiteman, D. N.; Russo, F.; Demoz, B.; Miloshevich, L. M.; Veselovskii, I.; Hannon, S.; Wang, Z.; Vomel, H.; Schmidlin, F.; Lesht, B.
2005-01-01
Early work within the Aqua validation activity revealed large differences in water vapor measurement accuracy among the various technologies in use for providing validation data. The validation measurements were made at globally distributed sites, making it difficult to isolate the sources of the apparent measurement differences among the various sensors, which included both Raman lidar and radiosondes. Because of this, the AIRS Water Vapor Experiment-Ground (AWEX-G) was held in October-November 2003 with the goal of bringing validation technologies to a common site for intercomparison and resolution of the measurement discrepancies. Using the University of Colorado Cryogenic Frostpoint Hygrometer (CFH) as the water vapor reference, the AWEX-G field campaign resulted in new correction techniques for Raman lidar, Vaisala RS80-H, and RS90/92 measurements that significantly improve the absolute accuracy of those measurement systems, particularly in the upper troposphere. Mean comparisons of radiosondes and lidar are performed, demonstrating agreement between corrected sensors and the CFH to generally within 5%, thereby providing data of sufficient accuracy for Aqua validation purposes. Examples of the use of the correction techniques in radiance and retrieval comparisons are provided and discussed.
Materials on the International Space Station - Forward Technology Solar Cell Experiment
NASA Technical Reports Server (NTRS)
Walters, R. J.; Garner, J. C.; Lam, S. N.; Vazquez, J. A.; Braun, W. R.; Ruth, R. E.; Lorentzen, J. R.; Bruninga, R.; Jenkins, P. P.; Flatico, J. M.
2005-01-01
This paper describes a space solar cell experiment currently being built by the Naval Research Laboratory (NRL) in collaboration with NASA Glenn Research Center (GRC), and the US Naval Academy (USNA). The experiment has been named the Forward Technology Solar Cell Experiment (FTSCE), and the purpose is to rapidly put current and future generation space solar cells on orbit and provide validation data for these technologies. The FTSCE is being fielded in response to recent on-orbit and ground test anomalies associated with space solar arrays that have raised concern over the survivability of new solar technologies in the space environment and the validity of present ground test protocols. The FTSCE is being built as part of the Fifth Materials on the International Space Station (MISSE) Experiment (MISSE-5), which is a NASA program to characterize the performance of new prospective spacecraft materials when subjected to the synergistic effects of the space environment. Telemetry, command, control, and communication (TNC) for the FTSCE will be achieved through the Amateur Satellite Service using the PCSat2 system, which is an Amateur Radio system designed and built by the USNA. In addition to providing an off-the-shelf solution for FTSCE TNC, PCSat2 will provide a communications node for the Amateur Radio satellite system. The FTSCE and PCSat2 will be housed within the passive experiment container (PEC), which is an approximately 2 ft x 2 ft x 4 in. metal container built by NASA Langley Research Center (NASA LaRC) as part of the MISSE-5 program. NASA LaRC has also supplied a thin film materials experiment that will fly on the exterior of the thermal blanket covering the PCSat2. The PEC is planned to be transported to the ISS on a Shuttle flight. The PEC will be mounted on the exterior of the ISS by an astronaut during an extravehicular activity (EVA). After nominally one year, the PEC will be retrieved and returned to Earth. At the time of writing this paper, the subsystems of the experiment are being integrated at NRL, and we are preparing to commence environmental testing.
NASA Astrophysics Data System (ADS)
Tanaka, Hiroshi; Hashimura, Shinji; Hiroo, Yasuaki
We present a program for developing engineering problem-solving ability, called "Experiments in Creative Engineering," in the Department of Mechanical Engineering of the Kurume National College of Technology advanced engineering school. In the program, students must choose their own theme and manufacture experimental devices or machines by themselves. The students must also perform experiments to validate the function and performance of their devices. The constraints on the theme are that the device's function must not already exist in the world and that the cost is limited (up to 20,000 yen). Results of a student questionnaire suggest that the program is very effective for creative engineering education.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calderer, Antoni; Yang, Xiaolei; Angelidis, Dionysios
2015-10-30
The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.
Digital Fly-By-Wire Flight Control Validation Experience
NASA Technical Reports Server (NTRS)
Szalai, K. J.; Jarvis, C. R.; Krier, G. E.; Megna, V. A.; Brock, L. D.; Odonnell, R. N.
1978-01-01
The experience gained in digital fly-by-wire technology through a flight test program being conducted by the NASA Dryden Flight Research Center in an F-8C aircraft is described. The system requirements are outlined, along with the requirements for flight qualification. The system is described, including the hardware components, the aircraft installation, and the system operation. The flight qualification experience is emphasized. The qualification process included the theoretical validation of the basic design, laboratory testing of the hardware and software elements, systems level testing, and flight testing. The most productive testing was performed on an iron bird aircraft, which used the actual electronic and hydraulic hardware and a simulation of the F-8 characteristics to provide the flight environment. The iron bird was used for sensor and system redundancy management testing, failure modes and effects testing, and stress testing in many cases with the pilot in the loop. The flight test program confirmed the quality of the validation process by achieving 50 flights without a known undetected failure and with no false alarms.
Development of the Packed Bed Reactor ISS Flight Experiment
NASA Technical Reports Server (NTRS)
Patton, Martin O.; Bruzas, Anthony E.; Rame, Enrique; Motil, Brian J.
2012-01-01
Packed bed reactors are compact, require minimum power and maintenance to operate, and are highly reliable. These features make this technology a leading candidate as a potential unit operation in support of long duration human space exploration. On earth, this type of reactor accounts for approximately 80% of all the reactors used in the chemical process industry today. Development of this technology for space exploration is truly crosscutting with many other potential applications (e.g., in-situ chemical processing of planetary materials and transport of nutrients through soil). NASA is developing an ISS experiment to address this technology with particular focus on water reclamation and air revitalization. Earlier research and development efforts funded by NASA have resulted in two hydrodynamic models which require validation with appropriate instrumentation in an extended microgravity environment. The first model developed by Motil et al., (2003) is based on a modified Ergun equation. This model was demonstrated at moderate gas and liquid flow rates, but extension to the lower flow rates expected in many advanced life support systems must be validated. The other model, developed by Guo et al., (2004) is based on Darcy s (1856) law for two-phase flow. This model has been validated for a narrow range of flow parameters indirectly (without full instrumentation) and included test points where the flow was not fully developed. The flight experiment presented will be designed with removable test sections to test the hydrodynamic models. The experiment will provide flexibility to test additional beds with different types of packing in the future. One initial test bed is based on the VRA (Volatile Removal Assembly), a packed bed reactor currently on ISS whose behavior in micro-gravity is not fully understood. Improving the performance of this system through an accurate model will increase our ability to purify water in the space environment.
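For reference, the classical single-phase Ergun relation that the first model modifies estimates the pressure gradient from bed and fluid properties; below is a minimal sketch with hypothetical property values (the flight models add two-phase and reduced-gravity terms not attempted here):

```python
def ergun_dp_dl(u, eps, d_p, rho, mu):
    """Single-phase Ergun pressure gradient [Pa/m].

    u: superficial velocity [m/s], eps: bed void fraction [-],
    d_p: particle diameter [m], rho: fluid density [kg/m^3],
    mu: dynamic viscosity [Pa.s].
    """
    viscous = 150.0 * mu * (1 - eps) ** 2 * u / (eps**3 * d_p**2)
    inertial = 1.75 * rho * (1 - eps) * u**2 / (eps**3 * d_p)
    return viscous + inertial

# Hypothetical water flow through a fine packed bed
print(f"{ergun_dp_dl(u=0.005, eps=0.4, d_p=1e-3, rho=1000.0, mu=1e-3):.1f} Pa/m")
```

At the low flow rates expected in life support systems, the viscous term dominates, which is exactly the regime where the flight data are needed to confirm the modified correlation.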
The introduction of a new consulting technology into the National Health Service (NHS) for Scotland.
Heaney, David; Caldow, Jan; McClusky, Christine; King, Gerry; Webster, Karyn; Mair, Fiona; Ferguson, James
2009-01-01
New technologies can change healthcare delivery. Cisco HealthPresence, an integrated platform that combines video, audio, and call center technology with medical information to create a virtual clinic experience, was piloted on emergency department patients. The aim was to assess primary care consultations. Patients were supported by an assistant. The doctor, who was remote from the patient, collected details and recorded a recommended management plan. The same doctor then re-examined the patient face-to-face. All patients completed a questionnaire about the experience. Key staff and a small sample of patients were interviewed. One hundred and five (N = 105) patients were included; 42% were given advice, 25% were prescribed analgesia, 26% were prescribed antibiotics, and 15% were x-rayed. There were early problems with the digital stethoscope. Doctors reported that the management plan changed in 7% of cases after seeing the patient. At least 90% of patients reported a positive experience. All patients and staff interviewed were positive. Staff found the equipment to be valid and reliable; a concern was the inability to perform "hands-on" examination. Telemedicine requires a change in the way of consulting, and staff must be interested in using the technology to understand the differences. As one of the doctors said, "HealthPresence was a positive experience." Greater numbers would be required to validate key findings. As judged by clinicians, HealthPresence was successful and potentially safe for triage of unscheduled cases. Different types of staffing models need to be considered to ensure optimum use of health professionals. This study has shown that, despite some limitations, most HealthPresence consultations were found to be safe and appropriate. Further study of this consultation technology is required. HealthPresence has the potential to transform access to services for many patients, and to improve the effectiveness of delivery across a number of services.
Plume Measurement System (PLUMES) Calibration Experiment
1994-08-01
To apply acoustic technology to monitoring suspended sediment, a calibration chamber was designed and built; particles were suspended in the chamber, and its characteristics were established and documented. The procedures are described in Chapter 2. Repeatability of the experiments and validity of the results are described in Chapter 3. Chapter 4 describes the calibration chamber and the considerations that went into its design: the first two subsections give an overview of the chamber and its characteristics, and the remaining subsections describe…
Experimental Evaluation of Verification and Validation Tools on Martian Rover Software
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich
2003-01-01
We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is, to the best of our knowledge, the first controlled experiment to compare formal-methods-based tools to testing on a realistic, industrial-size example, with the emphasis on collecting as much data as possible on the performance of both the tools and the participants. The paper includes a description of the Rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future studies of this kind. It confirmed our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.
Results of the Vapor Compression Distillation Flight Experiment (VCD-FE)
NASA Technical Reports Server (NTRS)
Hutchens, Cindy; Graves, Rex
2004-01-01
Vapor Compression Distillation (VCD) is the chosen technology for urine processing aboard the International Space Station (ISS). Key aspects of the VCD design have been verified and significant improvements made throughout the ground-based development history. However, an important element lacking from previous subsystem development efforts was flight-testing. Consequently, the demonstration and validation of the VCD technology and the investigation of subsystem performance in micro-gravity were the primary goals of the VCD-FE. The Vapor Compression Distillation Flight Experiment (VCD-FE) was a flight experiment aboard the Space Shuttle Columbia during the STS-107 mission. The VCD-FE was a full-scale developmental version of the Space Station Urine Processor Assembly (UPA) and was designed to test some of the potential micro-gravity issues with the design. This paper summarizes the experiment results.
Janssen, Ellen M; Marshall, Deborah A; Hauber, A Brett; Bridges, John F P
2017-12-01
The recent endorsement of discrete-choice experiments (DCEs) and other stated-preference methods by regulatory and health technology assessment (HTA) agencies has placed a greater focus on demonstrating the validity and reliability of preference results. Areas covered: We present a practical overview of tests of validity and reliability that have been applied in the health DCE literature and explore other study qualities of DCEs. From the published literature, we identify a variety of methods to assess the validity and reliability of DCEs and organize them into a conceptual model with four domains: measurement validity, measurement reliability, choice validity, and choice reliability. Each domain consists of three categories that can be assessed using one to four procedures (for a total of 24 tests). We present how these tests have been applied in the literature and direct readers to applications of these tests in the health DCE literature. Based on a stakeholder engagement exercise, we consider the importance of study characteristics beyond traditional concepts of validity and reliability. Expert commentary: We discuss study design considerations to assess the validity and reliability of a DCE, consider limitations to the current application of tests, and discuss future work to consider the quality of DCEs in healthcare.
Space Missions for Automation and Robotics Technologies (SMART) Program
NASA Technical Reports Server (NTRS)
Cliffone, D. L.; Lum, H., Jr.
1985-01-01
NASA is currently considering the establishment of a Space Mission for Automation and Robotics Technologies (SMART) Program to define, develop, integrate, test, and operate a spaceborne national research facility for the validation of advanced automation and robotics technologies. Initially, the concept is envisioned to be implemented through a series of shuttle based flight experiments which will utilize telepresence technologies and real time operation concepts. However, eventually the facility will be capable of a more autonomous role and will be supported by either the shuttle or the space station. To ensure incorporation of leading edge technology in the facility, performance capability will periodically and systematically be upgraded by the solicitation of recommendations from a user advisory group. The facility will be managed by NASA, but will be available to all potential investigators. Experiments for each flight will be selected by a peer review group. Detailed definition and design is proposed to take place during FY 86, with the first SMART flight projected for FY 89.
TELMA: Technology-enhanced learning environment for minimally invasive surgery.
Sánchez-González, Patricia; Burgos, Daniel; Oropesa, Ignacio; Romero, Vicente; Albacete, Antonio; Sánchez-Peralta, Luisa F; Noguera, José F; Sánchez-Margallo, Francisco M; Gómez, Enrique J
2013-06-01
Cognitive skills training for minimally invasive surgery has traditionally relied upon diverse tools, such as seminars or lectures. Web technologies for e-learning have been adopted to provide ubiquitous training and serve as structured repositories for the vast amount of laparoscopic video sources available. However, these technologies fail to offer such features as formative and summative evaluation, guided learning, or collaborative interaction between users. The "TELMA" environment is presented as a new technology-enhanced learning platform that enhances the user's experience through a four-pillared architecture: (1) an authoring tool for the creation of didactic contents; (2) a learning content and knowledge management system that incorporates a modular and scalable system to capture, catalogue, search, and retrieve multimedia content; (3) an evaluation module that provides learning feedback to users; and (4) a professional network for collaborative learning between users. Face validation of the environment and the authoring tool is presented. Face validation of TELMA reveals the positive perception of surgeons regarding the implementation of TELMA and their willingness to use it as a cognitive skills training tool. Preliminary validation data also reflect the importance of providing an easy-to-use, functional authoring tool to create didactic content. The TELMA environment is currently installed and used at the Jesús Usón Minimally Invasive Surgery Centre and several other Spanish hospitals. Face validation results ascertain the acceptance and usefulness of this new minimally invasive surgery training environment. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Demeo, Martha E.
1990-01-01
The feasibility of an experiment which will provide an on-orbit validation of Controls-Structures Interaction (CSI) technology was investigated. The experiment will demonstrate the on-orbit characterization and flexible-body control of large flexible structure dynamics using the Shuttle Remote Manipulator System (RMS) with an attached payload as a test article. By utilizing existing hardware as well as establishing integration, operation and safety algorithms, techniques and procedures, the experiment will minimize the costs and risks of implementing a flight experiment. The experiment will also offer spin-off enhancements to both the Shuttle RMS (SRMS) and the Space Station RMS (SSRMS).
Pre-Launch End-to-End Testing Plans for the SPAce Readiness Coherent Lidar Experiment (SPARCLE)
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.
1999-01-01
The SPAce Readiness Coherent Lidar Experiment (SPARCLE) mission was proposed as a low-cost technology demonstration mission, using a 2-micron, 100-mJ, 6-Hz, 25-cm, coherent lidar system based on demonstrated technology. SPARCLE was selected in late October 1997 to be NASA's New Millennium Program (NMP) second earth-observing (EO-2) mission. To maximize the success probability of SPARCLE, NASA/MSFC desired expert guidance in the areas of coherent laser radar (CLR) theory, CLR wind measurement, fielding of CLR systems, CLR alignment validation, and space lidar experience. This led to the formation of the NASA/MSFC Coherent Lidar Technology Advisory Team (CLTAT) in December 1997. A threefold purpose for the advisory team was identified: 1) guidance to the SPARCLE mission, 2) advice regarding the roadmap of post-SPARCLE coherent Doppler wind lidar (CDWL) space missions and the desired matching technology development plan, and 3) general coherent lidar theory, simulation, hardware, and experiment information exchange. The current membership of the CLTAT is shown. Membership does not result in any NASA or other funding at this time. We envision the business of the CLTAT to be conducted mostly by email, teleconference, and occasional meetings. The three meetings of the CLTAT to date, in January 1998, July 1998, and January 1999, have all been collocated with previously scheduled meetings of the Working Group on Space-Based Lidar Winds, and have been very productive. Topics discussed include the SPARCLE technology validation plan, including pre-launch end-to-end testing; the space-based wind mission roadmap beyond SPARCLE and its implications for the resultant technology development; the current values of, and proposed future advancements in, lidar system efficiency; and the difference between using single-mode fiber optical mixing and traditional free-space optical mixing. Attitude information from lidar and non-lidar sensors, together with pointing knowledge algorithms, will meet this second requirement. The topic of this paper is the pre-launch demonstration of the first requirement, adequate sensitivity of the SPARCLE lidar.
Measuring user experience in digital gaming: theoretical and methodological issues
NASA Astrophysics Data System (ADS)
Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte
2007-01-01
There are innumerable concepts, terms and definitions for user experience, and few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating both theoretically and methodologically sound methods for quantification of the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested on a large data set (n = 2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.
Disruption Tolerant Network Technology Flight Validation Report: DINET
NASA Technical Reports Server (NTRS)
Jones, Ross M.
2009-01-01
In October and November of 2008, the Jet Propulsion Laboratory installed and tested essential elements of Delay/Disruption Tolerant Networking (DTN) technology on the Deep Impact spacecraft. This experiment, called Deep Impact Network Experiment (DINET), was performed in close cooperation with the EPOXI project which has responsibility for the spacecraft. During DINET some 300 images were transmitted from the JPL nodes to the spacecraft. Then, they were automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. All transmitted bundles were successfully received, without corruption. The DINET experiment demonstrated DTN readiness for operational use in space missions.
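The custody-transfer and retransmission behavior exercised by DINET can be illustrated with a short store-and-forward sketch. This is a conceptual toy under stated assumptions, not the JPL flight code or the Bundle Protocol wire format, and every class and parameter name in it is hypothetical.

```python
# Minimal sketch of DTN-style store-and-forward with custody transfer and
# automatic retransmission, the concepts DINET exercised. Illustration only;
# not the actual flight software or the Bundle Protocol format.
import time
from dataclasses import dataclass

@dataclass
class Bundle:
    bundle_id: int
    payload: bytes
    sent_at: float = 0.0

class Link:
    """Stand-in for a deep-space link (real DTN rides on underlying protocols)."""
    def transmit(self, bundle: Bundle) -> None:
        print(f"transmitting bundle {bundle.bundle_id} ({len(bundle.payload)} bytes)")

class CustodyNode:
    """Retains each bundle until the next hop signals custody acceptance."""
    def __init__(self, retransmit_after: float = 5.0):
        self.custody: dict[int, Bundle] = {}
        self.retransmit_after = retransmit_after

    def send(self, bundle: Bundle, link: Link) -> None:
        bundle.sent_at = time.time()
        self.custody[bundle.bundle_id] = bundle    # keep until custody transfers
        link.transmit(bundle)

    def on_custody_signal(self, bundle_id: int) -> None:
        self.custody.pop(bundle_id, None)          # downstream node took custody

    def tick(self, link: Link) -> None:
        """Retransmit bundles whose custody signal is overdue."""
        now = time.time()
        for bundle in self.custody.values():
            if now - bundle.sent_at > self.retransmit_after:
                bundle.sent_at = now
                link.transmit(bundle)

node, link = CustodyNode(), Link()
node.send(Bundle(1, b"image-data"), link)
node.on_custody_signal(1)   # acknowledged: bundle released from local custody
node.tick(link)             # nothing overdue, nothing retransmitted
```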
Validation of Automated Scoring of Science Assessments
ERIC Educational Resources Information Center
Liu, Ou Lydia; Rios, Joseph A.; Heilman, Michael; Gerard, Libby; Linn, Marcia C.
2016-01-01
Constructed response items can both measure the coherence of student ideas and serve as reflective experiences to strengthen instruction. We report on new automated scoring technologies that can reduce the cost and complexity of scoring constructed-response items. This study explored the accuracy of c-rater-ML, an automated scoring engine…
Proceedings of Tenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1985-01-01
Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.
Outsourcing bioanalytical services at Janssen Research and Development: the sequel anno 2017.
Dillen, Lieve; Verhaeghe, Tom
2017-08-01
The strategy of outsourcing bioanalytical services at Janssen has been evolving over the last years, and an update is given on the recent changes in our processes. In 2016, all internal GLP-related activities were phased out, and this decision led to the re-orientation of the in-house bioanalytical activities. As a consequence, in-depth experience with the validated bioanalytical assays for new drug candidates is currently gained together with the external partner, since development and validation of the assay and execution of GLP preclinical studies are now transferred to the CRO. The evolution toward externalizing more bioanalytical support has created opportunities to build even stronger partnerships with the CROs and to refocus internal resources. Case studies are presented illustrating challenges encountered during method development and validation at preferred partners when limited internal experience is available or when new technology is introduced.
A study on the attitude of use the mobile clinic registration system in Taiwan.
Lai, Yi-Horng; Huang, Fen-Fen; Yang, Hsieh-Hua
2015-01-01
Mobile apps provide diverse services and various convenient functions. This study applied the modified technology acceptance model (MTAM) from information systems research to the use of the mobile hospital registration system in Taiwan. The MTAM posits that perceived ease of use and perceived usefulness of technology influence users' attitudes toward using that technology. Research studies using the MTAM have identified information technology experience as a factor in predicting attitude. The objective of the present study is to test the validity of the MTAM when applied to the mobile registration system. Data were collected from 501 patients at a medical center in Taiwan. Path analysis results show that the model is applicable for examining factors influencing users' attitudes toward using the mobile registration system: perceived usefulness and perceived ease of use are positively associated with users' attitudes toward using the system and can improve those attitudes. In addition, perceived ease of use is positively associated with perceived usefulness. As for prior personal experience, information technology experience is positively associated with both perceived usefulness and perceived ease of use.
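A path analysis of this kind estimates standardized coefficients along the hypothesized paths (perceived ease of use to perceived usefulness to attitude). The sketch below shows the idea on synthetic data using plain least squares; it is not the study's model or its 501-patient data set, and the path strengths are made-up illustrations.

```python
# TAM-style path analysis via ordinary least squares on standardized
# variables (synthetic data; not the study's sample).
import numpy as np

rng = np.random.default_rng(0)
n = 500
peou = rng.normal(size=n)                                 # perceived ease of use
pu = 0.5 * peou + rng.normal(scale=0.8, size=n)           # perceived usefulness
att = 0.4 * pu + 0.3 * peou + rng.normal(scale=0.7, size=n)  # attitude

def standardize(x):
    return (x - x.mean()) / x.std()

peou, pu, att = map(standardize, (peou, pu, att))

# Path 1: PEOU -> PU (simple standardized regression).
beta_peou_pu = np.linalg.lstsq(peou[:, None], pu, rcond=None)[0][0]

# Paths 2 and 3: PEOU and PU -> attitude (standardized multiple regression).
X = np.column_stack([peou, pu])
betas = np.linalg.lstsq(X, att, rcond=None)[0]

print(f"PEOU -> PU:        {beta_peou_pu:.2f}")
print(f"PEOU -> attitude:  {betas[0]:.2f}")
print(f"PU   -> attitude:  {betas[1]:.2f}")
```

Positive coefficients on all three paths would correspond to the associations the abstract reports.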
Fuel Cell and Hydrogen Technology Validation | NREL
The NREL technology validation team works on validating hydrogen fuel cell electric vehicles; hydrogen fueling infrastructure; hydrogen system components; and fuel cell use in early market applications such as…
Presence capture cameras - a new challenge to the image quality
NASA Astrophysics Data System (ADS)
Peltoketo, Veli-Tapani
2016-04-01
Commercial presence capture cameras are coming to market, and a new era of visual entertainment is starting to take shape. Since true presence capturing is still a very new technology, the actual technical solutions have only just passed the prototyping phase and vary widely. Presence capture cameras still face the same quality issues as earlier generations of digital imaging, along with numerous new ones. This work concentrates on the quality challenges of presence capture cameras. A camera system that can record 3D audio-visual reality as it is must have several camera modules, several microphones, and, above all, technology that can synchronize the output of several sources into a seamless and smooth virtual reality experience. Several traditional quality features remain valid for presence capture cameras: features like color fidelity, noise removal, resolution, and dynamic range form the basis of virtual reality stream quality. However, the cooperation of several cameras brings a new dimension to these quality factors, and new quality features can be validated as well; for example, how should the camera streams be stitched together into a 3D experience without noticeable errors, and how should the stitching be validated? The work describes the quality factors that remain valid for presence capture cameras and establishes their importance. Moreover, new challenges of presence capture cameras are investigated from an image and video quality point of view. The work also considers how well current measurement methods can be applied to presence capture cameras.
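As a toy illustration of the stitching-validation question raised above, the sketch below computes a simple seam-consistency metric over the overlap region of two adjacent camera strips. This is an assumption-laden stand-in: real stitching validation involves geometric alignment, parallax, and perceptual metrics well beyond a mean absolute difference.

```python
# Toy seam-consistency check for stitched panoramas: compare the overlap
# region of two adjacent camera strips. Illustrative objective metric only.
import numpy as np

def seam_error(left: np.ndarray, right: np.ndarray, overlap: int) -> float:
    """Mean absolute difference between the shared overlap columns."""
    return float(np.abs(left[:, -overlap:].astype(float)
                        - right[:, :overlap].astype(float)).mean())

rng = np.random.default_rng(4)
strip_a = rng.integers(0, 256, size=(480, 640))
strip_b = strip_a.copy()
# Simulate a near-matching overlap from the neighboring camera module.
strip_b[:, :64] = strip_a[:, -64:] + rng.normal(scale=2.0, size=(480, 64))
print(f"seam error: {seam_error(strip_a, strip_b, 64):.2f} gray levels")
```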
Experience in Evaluating AAL Solutions in Living Labs
Colomer, Juan Bautista Montalvá; Salvi, Dario; Cabrera-Umpierrez, Maria Fernanda; Arredondo, Maria Teresa; Abril, Patricia; Jimenez-Mixco, Viveca; García-Betances, Rebeca; Fioravanti, Alessio; Pastorino, Matteo; Cancela, Jorge; Medrano, Alejandro
2014-01-01
Ambient assisted living (AAL) is a complex field, where different technologies are integrated to offer solutions for the benefit of different stakeholders. Several evaluation techniques are commonly applied that tackle specific aspects of AAL; however, holistic evaluation approaches are lacking when addressing the needs of both developers and end-users. Living labs have been often used as real-life test and experimentation environments for co-designing AAL technologies and validating them with relevant stakeholders. During the last five years, we have been evaluating AAL systems and services in the framework of various research projects. This paper presents the lessons learned in this experience and proposes a set of harmonized guidelines to conduct evaluations in living labs. PMID:24763209
An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)
NASA Technical Reports Server (NTRS)
Baumann, Ethan; Hernandez, Joe; Ruhf, John C.
2013-01-01
National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.
Reiter, Harald; Muehlsteff, Jens; Sipilä, Auli
2011-01-01
Functional textiles are seen as a promising technology for enabling healthcare services and medical care outside hospitals, owing to their ability to integrate textile-based sensing and monitoring technologies into daily life. In the past, much effort has been spent on basic functional textile research, which has already shown that reliable monitoring solutions can be realized. The challenge remains to find and develop suitable medical applications and to fulfil the boundary conditions for medical endorsement and exploitation. The HeartCycle vest described in this abstract serves as an example of a functional textile carefully developed according to the requirements of a specific medical application, its clinical validation, the related certification aspects, and the next improvement steps towards exploitation.
Experimental Evaluation of Verification and Validation Tools on Martian Rover Software
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem
2003-01-01
To achieve its science objectives in deep space exploration, NASA has a need for science platform vehicles to autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light-time is one significant factor motivating autonomy capability; another factor is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced Verification and Validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian Rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In section 4 the methodology for the experiment is described; this includes the code preparation, seeding of defects, participant training and experimental setup. Next we give a qualitative overview of how the experiment went from the point of view of each technology: model checking (section 5), static analysis (section 6), runtime analysis (section 7) and testing (section 8). The final section gives some preliminary quantitative results on how the tools compared.
NASA Technical Reports Server (NTRS)
Cayeux, P.; Raballand, F.; Borde, J.; Berges, J.-C.; Meyssignac, B.
2007-01-01
Within the framework of a partnership agreement, EADS ASTRIUM has worked since June 2006 on the CNES formation flying experiment for the PRISMA mission. EADS ASTRIUM is responsible for the anti-collision function. This responsibility covers the design and development of the function as a Matlab/Simulink library, as well as its functional validation and performance assessment. PRISMA is a technology in-orbit testbed mission of the Swedish National Space Board, mainly devoted to formation flying demonstration. PRISMA consists of two micro-satellites that will be launched in 2009 into a quasi-circular SSO at about 700 km altitude. The CNES FFIORD experiment embedded on PRISMA aims at flight-validating an FFRF sensor designed for formation control and assessing its performance, in preparation for future formation flying missions such as Simbol X; FFIORD aims as well at validating various typical autonomous rendezvous and formation guidance and control algorithms. This paper presents the principles of the collision avoidance function developed by EADS ASTRIUM for FFIORD; three kinds of maneuvers were implemented and are presented in this paper with their performances.
English, Sangeeta B.; Shih, Shou-Ching; Ramoni, Marco F.; Smith, Lois E.; Butte, Atul J.
2014-01-01
Though genome-wide technologies, such as microarrays, are widely used, data from these methods are considered noisy; there is still varied success in downstream biological validation. We report a method that increases the likelihood of successfully validating microarray findings using real time RT-PCR, including genes at low expression levels and with small differences. We use a Bayesian network to identify the most relevant sources of noise based on the successes and failures in validation for an initial set of selected genes, and then improve our subsequent selection of genes for validation based on eliminating these sources of noise. The network displays the significant sources of noise in an experiment, and scores the likelihood of validation for every gene. We show how the method can significantly increase validation success rates. In conclusion, in this study, we have successfully added a new automated step to determine the contributory sources of noise that determine successful or unsuccessful downstream biological validation. PMID:18790084
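As a rough illustration of scoring validation likelihood from noise-related features, the sketch below substitutes a Gaussian naive Bayes classifier for the authors' full Bayesian network and uses two synthetic features (expression level and fold change). The network structure, feature set, and data of the actual study are not reproduced.

```python
# Stand-in for the paper's Bayesian network: score each gene's probability
# of successful RT-PCR validation from noise-related features. Features and
# data are synthetic; Gaussian naive Bayes replaces the full network.
import numpy as np

class GaussianNB:
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.priors = np.array([(y == c).mean() for c in self.classes])
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.vars = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        return self

    def predict_proba(self, X):
        # Per-class Gaussian log-likelihood, summed over features.
        log_lik = -0.5 * (np.log(2 * np.pi * self.vars[:, None, :])
                          + (X[None] - self.means[:, None, :]) ** 2
                          / self.vars[:, None, :])
        log_post = log_lik.sum(axis=2).T + np.log(self.priors)
        log_post -= log_post.max(axis=1, keepdims=True)
        p = np.exp(log_post)
        return p / p.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
# Columns: log2 expression level, |log2 fold change| (illustrative noise sources).
X = rng.normal(loc=[6.0, 1.0], scale=[2.0, 0.5], size=(200, 2))
# Genes with higher expression and larger fold changes validate more often.
p_true = 1 / (1 + np.exp(-(0.6 * (X[:, 0] - 6) + 2.0 * (X[:, 1] - 1))))
y = rng.random(200) < p_true

model = GaussianNB().fit(X, y)
new_genes = np.array([[4.0, 0.6], [9.0, 1.8]])   # low- vs. high-confidence cases
print(model.predict_proba(new_genes)[:, 1])       # P(successful validation)
```

The key idea carried over from the paper is the feedback loop: each round of validation successes and failures retrains the model, sharpening the next round of gene selection.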
Rebuilding the space technology base
NASA Technical Reports Server (NTRS)
Povinelli, Frederick P.; Stephenson, Frank W.; Sokoloski, Martin M.; Montemerlo, Melvin D.; Venneri, Samuel L.; Mulville, Daniel R.; Hirschbein, Murray S.; Smith, Paul H.; Schnyer, A. Dan; Lum, Henry
1989-01-01
NASA's Civil Space Technology Initiative (CSTI) will not only develop novel technologies for space exploration and exploitation, but also take mature technologies into their demonstration phase in earth orbit. In the course of five years, CSTI will pay off in ground- and space-tested hardware, software, processes, methods for low-orbit transport and operation, and fundamental scientific research on the orbital environment. Attention is given to LOX/hydrogen and LOX/hydrocarbon reusable engines, liquid/solid fuel hybrid boosters, and aeroassist flight experiments for the validation of aerobraking with atmospheric friction. Also discussed are advanced scientific sensors, systems autonomy and telerobotics, control of flexible structures, precise segmented reflectors, high-rate high-capacity data handling, and advanced nuclear power systems.
Mouse Click Plagiarism: Can Technology Help to Fight Back?
ERIC Educational Resources Information Center
Tulley Pitchford, Kay
2012-01-01
Many students arrive at university accustomed to adopting the internet as their primary source of information, but with no prior experience of referencing. This raises issues of the reliability and validity of digital sources, as well as bringing new opportunities for cheating. The internet has made plagiarism quicker and easier; a student simply…
Modern Guidelines for Forming the International Image of Universities: Foreign Experience
ERIC Educational Resources Information Center
Padalka, Oleh; Voronenko, Oleksandr; Humeniuk, Tetiana
2015-01-01
The article presents the analysis of scientific and technological activities of higher education pedagogical institutions of Ukraine at the end of 2014 according to the staffing situation, the functioning of postgraduate and doctoral studies and the availability of the valid specialized academic councils on thesis defense, according to students'…
What Motivates Introductory Geology Students to Study for an Exam?
ERIC Educational Resources Information Center
Lukes, Laura A.; McConnell, David A.
2014-01-01
There is a need to understand why some students succeed and persist in STEM fields and others do not. While numerous studies have focused on the positive results of using empirically validated teaching methods in introductory science, technology, engineering, and math (STEM) courses, little data has been collected about the student experience in…
Development of chemistry attitudes and experiences questionnaire (CAEQ)
NASA Astrophysics Data System (ADS)
Dalgety, Jacinta; Coll, Richard K.; Jones, Alister
2003-09-01
In this article we describe the development of the Chemistry Attitudes and Experiences Questionnaire (CAEQ) that measures first-year university chemistry students' attitude toward chemistry, chemistry self-efficacy, and learning experiences. The instrument was developed as part of a larger study and sought to fulfill a need for an instrument to investigate factors that influence student enrollment choice. We set out to design the instrument in a manner that would maximize construct validity. The CAEQ was piloted with a cohort of science and technology students (n = 129) at the end of their first year. Based on statistical analysis the instrument was modified and subsequently administered on two occasions at two tertiary institutions (n = 669). Statistical data along with additional data gathered from interviews suggest that the CAEQ possesses good construct validity and will prove a useful tool for tertiary level educators who wish to gain an understanding of factors that influence student choice of chemistry enrolment.
NASA Technical Reports Server (NTRS)
Komendera, Erik E.; Dorsey, John T.
2017-01-01
Developing a capability for the assembly of large space structures has the potential to increase the capabilities and performance of future space missions and spacecraft while reducing their cost. One such application is a megawatt-class solar electric propulsion (SEP) tug, representing a critical transportation ability for the NASA lunar, Mars, and solar system exploration missions. A series of robotic assembly experiments were recently completed at Langley Research Center (LaRC) that demonstrate most of the assembly steps for the SEP tug concept. The assembly experiments used a core set of robotic capabilities: long-reach manipulation and dexterous manipulation. This paper describes cross-cutting capabilities and technologies for in-space assembly (ISA), applies the ISA approach to a SEP tug, describes the design and development of two assembly demonstration concepts, and summarizes results of two sets of assembly experiments that validate the SEP tug assembly steps.
Merlin: Computer-Aided Oligonucleotide Design for Large Scale Genome Engineering with MAGE.
Quintin, Michael; Ma, Natalie J; Ahmed, Samir; Bhatia, Swapnil; Lewis, Aaron; Isaacs, Farren J; Densmore, Douglas
2016-06-17
Genome engineering technologies now enable precise manipulation of organism genotype, but can be limited in scalability by their design requirements. Here we describe Merlin ( http://merlincad.org ), an open-source web-based tool to assist biologists in designing experiments using multiplex automated genome engineering (MAGE). Merlin provides methods to generate pools of single-stranded DNA oligonucleotides (oligos) for MAGE experiments by performing free energy calculation and BLAST scoring on a sliding window spanning the targeted site. These oligos are designed not only to improve recombination efficiency, but also to minimize off-target interactions. The application further assists experiment planning by reporting predicted allelic replacement rates after multiple MAGE cycles, and enables rapid result validation by generating primer sequences for multiplexed allele-specific colony PCR. Here we describe the Merlin oligo and primer design procedures and validate their functionality compared to OptMAGE by eliminating seven AvrII restriction sites from the Escherichia coli genome.
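The sliding-window logic can be sketched as follows. Note the stand-ins: a GC-content proxy replaces the real free-energy calculation and an exact seed-match count replaces BLAST scoring, so this illustrates only the windowing and scoring flow, not Merlin's actual algorithms.

```python
# Illustrative sliding-window scoring for MAGE oligo candidates. A GC-based
# duplex-stability proxy and a naive off-target seed count stand in for the
# free-energy and BLAST computations Merlin actually performs.

def window_scores(target: str, genome: str, window: int = 90):
    """Return (start, oligo, score) for each window spanning the target site."""
    scores = []
    for start in range(0, len(target) - window + 1):
        oligo = target[start:start + window]
        gc = (oligo.count("G") + oligo.count("C")) / window
        stability = gc                       # proxy for hybridization free energy
        # Mock off-target penalty: exact 15-mer seed hits elsewhere in the genome.
        seed = oligo[:15]
        off_target = max(genome.count(seed) - 1, 0)
        scores.append((start, oligo, stability - 0.5 * off_target))
    return scores

target = "ATGGCGTACGTTAGC" * 8               # toy 120-nt region around an edit site
genome = target * 3                          # toy genome with repeats
best = max(window_scores(target, genome), key=lambda t: t[2])
print(f"Best window starts at {best[0]} with score {best[2]:.2f}")
```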
Hanauer, David I; Bauerle, Cynthia
2015-01-01
Science, technology, engineering, and mathematics education reform efforts have called for widespread adoption of evidence-based teaching in which faculty members attend to student outcomes through assessment practice. Awareness about the importance of assessment has illuminated the need to understand what faculty members know and how they engage with assessment knowledge and practice. The Faculty Self-Reported Assessment Survey (FRAS) is a new instrument for evaluating science faculty assessment knowledge and experience. Instrument validation was composed of two distinct studies: an empirical evaluation of the psychometric properties of the FRAS and a comparative known-groups validation to explore the ability of the FRAS to differentiate levels of faculty assessment experience. The FRAS was found to be highly reliable (α = 0.96). The dimensionality of the instrument enabled distinction of assessment knowledge into categories of program design, instrumentation, and validation. In the known-groups validation, the FRAS distinguished between faculty groups with differing levels of assessment experience. Faculty members with formal assessment experience self-reported higher levels of familiarity with assessment terms, higher frequencies of assessment activity, increased confidence in conducting assessment, and more positive attitudes toward assessment than faculty members who were novices in assessment. These results suggest that the FRAS can reliably and validly differentiate levels of expertise in faculty knowledge of assessment. © 2015 D. I. Hanauer and C. Bauerle. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
NASA Technical Reports Server (NTRS)
Bauer, Frank H. (Technical Monitor); Dennehy, Neil; Gambino, Joel; Maynard, Andrew; Brady, T.; Buckley, S.; Zinchuk, J.
2003-01-01
The Inertial Stellar Compass (ISC) is a miniature, low power, stellar inertial attitude determination system with an accuracy of better than 0.1 degree (1 sigma) in three axes. The ISC consumes only 3.5 Watts of power and is contained in a 2.5 kg package. With its embedded on-board processor, the ISC provides attitude quaternion information and has Lost-in-Space (LIS) initialization capability. The attitude accuracy and LIS capability are provided by combining a wide field of view Active Pixel Sensor (APS) star camera and Micro-ElectroMechanical System (MEMS) inertial sensor information in an integrated sensor system. The performance and small form factor make the ISC a useful sensor for a wide range of missions. In particular, the ISC represents an enabling, fully integrated, micro-satellite attitude determination system. Other applications include using the ISC as a single sensor solution for attitude determination on medium performance spacecraft and as a bolt-on independent safe-hold sensor or coarse acquisition sensor for many other spacecraft. NASA's New Millennium Program (NMP) has selected the ISC technology for a Space Technology 6 (ST6) flight validation experiment scheduled for 2004. NMP missions, such as ST6, are intended to validate advanced technologies that have not flown in space in order to reduce the risk associated with their infusion into future NASA missions. This paper describes the design, operation, and performance of the ISC and outlines the technology validation plan. A number of mission applications for the ISC technology are highlighted, both for the baseline ST6 ISC configuration and for more ambitious applications where ISC hardware and software modifications would be required. These applications demonstrate the wide range of Space and Earth Science missions that would benefit from infusion of the ISC technology.
Phase I Development of Neutral Beam Injector Solid-State Power System
NASA Astrophysics Data System (ADS)
Prager, James; Ziemba, Timothy; Miller, Kenneth E.; Slobodov, Ilia; Anderson, Seth
2017-10-01
Neutral beam injection (NBI) is an important tool for plasma heating and current drive, and a diagnostic, at fusion science experiments around the United States, including tokamaks, validation platform experiments, and privately funded fusion concepts. Currently, there are no vendors in the United States for NBI power systems. Eagle Harbor Technologies (EHT), Inc. is developing a new power system for NBI that takes advantage of the latest developments in solid-state switching. EHT has developed a resonant converter that can be scaled to the power levels required for NBI at small-scale validation platform experiments like the Lithium Tokamak Experiment. This power system can be used to modulate the NBI voltages over the course of a plasma shot, which can lead to improved control over the plasma. EHT will present the initial modeling used to design this system, as well as experimental data showing operation at 15 kV and 40 A for 10 ms into a test load. This work is supported by the DOE SBIR program.
The IXV experience, from the mission conception to the flight results
NASA Astrophysics Data System (ADS)
Tumino, G.; Mancuso, S.; Gallego, J.-M.; Dussy, S.; Preaud, J.-P.; Di Vita, G.; Brunner, P.
2016-07-01
The atmospheric re-entry domain is a cornerstone of a wide range of space applications, ranging from reusable launcher stages, robotic planetary exploration and human space flight, to innovative applications such as reusable research platforms for in-orbit validation of technologies for multiple space applications. The Intermediate eXperimental Vehicle (IXV) is an advanced demonstrator which has performed in-flight experimentation on systems and technologies enabling atmospheric re-entry, with significant advancements over Europe's previous flight experiences, consolidating Europe's autonomous position in the strategic field of atmospheric re-entry. The IXV mission objectives were the design, development, manufacturing, assembly, and on-ground to in-flight verification of an autonomous European lifting and aerodynamically controlled re-entry system, integrating critical re-entry technologies at system level. Among these critical technologies, special attention was paid to aerodynamics and aerothermodynamics experimentation, including advanced instrumentation for the investigation of aerothermodynamic phenomena, thermal protection and hot structures, and guidance, navigation and flight control through combined jets and aerodynamic surfaces (i.e., flaps), focusing in particular on the integration of these technologies at system level for the flight, successfully performed on February 11th, 2015.
Cooling tower plume - model and experiment
NASA Astrophysics Data System (ADS)
Cizek, Jan; Gemperle, Jiri; Strob, Miroslav; Nozicka, Jiri
The paper describes a simple model of the so-called steam plume, which in many cases forms during the operation of the evaporative cooling systems of power plants or large technological units. The model is based on semi-empirical equations that describe the behaviour of a mixture of two gases in the case of a free jet stream. In the conclusion of the paper, a simple experiment is presented through which the results of the designed model will be validated in a subsequent period.
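To make the free-jet basis of such a model concrete, the sketch below evaluates a textbook semi-empirical relation for a round free jet: beyond the potential core, the centerline concentration of the source gas decays roughly as k·d/x. The decay constant, outlet diameter, and distances are illustrative assumptions, not values from the paper.

```python
# Textbook semi-empirical dilution estimate for a round free jet of the kind
# such a plume model builds on. k ~ 5 is a common textbook value; the
# paper's own coefficients may differ.
def centerline_fraction(x: float, d: float, k: float = 5.0) -> float:
    """Centerline mass fraction of source gas at axial distance x (m)
    from a round jet of outlet diameter d (m); capped at 1 inside the core."""
    return min(1.0, k * d / x) if x > 0 else 1.0

d = 8.0  # cooling tower outlet diameter (m), illustrative
for x in (10, 50, 100, 200):
    print(f"x = {x:4d} m: source-gas fraction ~ {centerline_fraction(x, d):.2f}")
```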
EVAL mission requirements, phase 1
NASA Technical Reports Server (NTRS)
1976-01-01
Aspects of NASA's applications missions that would be enhanced by utilization of the Shuttle/Spacelab were examined, and payload groupings which optimize the cost of achieving the mission goals were defined. Preliminary Earth Viewing Application Laboratory (EVAL) missions, experiments, sensors, and sensor groupings were developed. The major technological EVAL themes and objectives which NASA will be addressing during the 1980 to 2000 time period were investigated. Missions/experiments which addressed technique development, sensor development, application development, and/or operational data collection were considered as valid roles for EVAL flights.
Designing biomedical proteomics experiments: state-of-the-art and future perspectives.
Maes, Evelyne; Kelchtermans, Pieter; Bittremieux, Wout; De Grave, Kurt; Degroeve, Sven; Hooyberghs, Jef; Mertens, Inge; Baggerman, Geert; Ramon, Jan; Laukens, Kris; Martens, Lennart; Valkenborg, Dirk
2016-05-01
With the current expanded technical capabilities to perform mass spectrometry-based biomedical proteomics experiments, an improved focus on the design of experiments is crucial. As it is clear that ignoring the importance of a good design leads to an unprecedented rate of false discoveries that would poison our results, more and more tools are being developed to help researchers design proteomics experiments. In this review, we apply statistical thinking to go through the entire proteomics workflow for biomarker discovery and validation, and we relate the considerations that should be made at the level of hypothesis building, technology selection, experimental design and the optimization of the experimental parameters.
NASA Astrophysics Data System (ADS)
Rohringer, C.; Engel, G.; Köll, R.; Wagner, W.; van Helden, W.
2017-10-01
The inclusion of solar thermal energy into energy systems requires storage possibilities to overcome the gap between supply and demand. Storage of thermal energy with closed sorption thermal energy systems has the advantage of low thermal losses and high energy density. However, the efficiency of these systems has yet to be increased for them to become competitive on the market. In this paper, the so-called “charge boost technology” is developed and tested experimentally as a new concept for increasing the efficiency of compact thermal energy storages. The main benefit of the charge boost technology is that it can reach a defined state of charge in sorption thermal energy storages at lower temperature levels than classic pure desorption processes. Experiments are conducted to provide a proof of principle for this concept. The results show that the charge boost technology functions as predicted and is a viable option for further improvement of sorption thermal energy storages. Subsequently, a new process application is developed by the authors, with a strong focus on exploiting the advantages of the charge boost technology over conventional desorption processes. After completion of the conceptual design, the theoretical calculations are validated experimentally.
Use of Synchronized Phasor Measurements for Model Validation in ERCOT
NASA Astrophysics Data System (ADS)
Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill
2013-05-01
This paper discusses experiences in the use of synchronized phasor measurement technology in the Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMUs) have been installed in the ERCOT grid in recent years, phasor data with a resolution of 30 samples per second are being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. The real-time monitoring software RTDMS® enables ERCOT to analyze small-signal stability conditions by monitoring phase angles and oscillations. The recorded phasor data enable ERCOT to validate the existing dynamic models of conventional and/or wind generators.
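A minimal version of oscillation screening on a 30-sample/s phase-angle stream might look like the sketch below, which detrends the angle signal and finds the dominant spectral peak in the inter-area band. The synthetic 0.7 Hz mode, the band limits, and the signal model are illustrative assumptions; this is not the RTDMS algorithm.

```python
# Small-signal oscillation screening on a synthetic PMU phase-angle stream
# (30 samples/s, as in the ERCOT deployment). Illustrative only.
import numpy as np

fs = 30.0                                   # PMU reporting rate (samples/s)
t = np.arange(0, 60, 1 / fs)                # one minute of data
# Synthetic inter-area angle difference: slow trend + lightly damped 0.7 Hz mode.
angle = 10 + 0.05 * t + 1.5 * np.exp(-0.02 * t) * np.sin(2 * np.pi * 0.7 * t)
angle += np.random.default_rng(2).normal(scale=0.1, size=t.size)

# Remove the linear trend, window, and inspect the spectrum.
detrended = angle - np.polyval(np.polyfit(t, angle, 1), t)
spectrum = np.abs(np.fft.rfft(detrended * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

band = (freqs > 0.1) & (freqs < 2.0)        # typical inter-area mode range
peak = freqs[band][np.argmax(spectrum[band])]
print(f"Dominant oscillatory mode near {peak:.2f} Hz")
```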
Neuronix enables continuous, simultaneous neural recording and electrical microstimulation.
Zhi Yang; Jian Xu; Anh Tuan Nguyen; Tong Wu; Wenfeng Zhao; Wing-Kin Tam
2016-08-01
This paper reports a novel neurotechnology (Neuronix) and its validation through experiments. It is a miniature system-on-chip (SoC) that allows recording with simultaneous electrical microstimulation. This function has not been demonstrated before and enables precise, closed-loop neuromodulation. Neuronix represents recent advancement in brain technology and applies to both animal research and clinical applications.
ERIC Educational Resources Information Center
Yang, Ya-Ting C.; Gamble, Jeffrey; Tang, Shiun-Yi S.
2012-01-01
The challenge of providing authentic experiences and interactions for fostering oral proficiency and motivation in foreign languages is an opportunity for innovation in educational technology and instructional design. Although several recent innovations have received the attention of scholars, empirical investigation and validation is often…
Satellite-based soil moisture validation and field experiments; Skylab to SMAP
USDA-ARS?s Scientific Manuscript database
Soil moisture remote sensing has reached a level of maturity that is now limited primarily by technology and funding. This is a result of extensive research and development that began in earnest in the 1970s and by the late 1990s had provided the basis and direction needed to support two dedicated s...
Design of a water electrolysis flight experiment
NASA Technical Reports Server (NTRS)
Lee, M. Gene; Grigger, David J.; Thompson, C. Dean; Cusick, Robert J.
1993-01-01
Supply of oxygen (O2) and hydrogen (H2) by electrolyzing water in space will play an important role in meeting the National Aeronautics and Space Administration's (NASA's) needs and goals for future space missions. Both O2 and H2 are envisioned to be used in a variety of processes including crew life support, spacecraft propulsion, extravehicular activity, electrical power generation/storage, as well as in scientific experiments and manufacturing processes. The Electrolysis Performance Improvement Concept Study (EPICS) flight experiment described herein is sponsored by NASA Headquarters as a part of the In-Space Technology Experiment Program (IN-STEP). The objective of the EPICS is to further contribute to the improvement of SFE technology, specifically by demonstrating and validating the SFE electromechanical process in microgravity as well as investigating performance improvements projected to be possible in a microgravity environment. This paper defines the experiment objective and presents the results of the preliminary design of the EPICS. The experiment will include testing three subscale self-contained SFE units: one containing baseline components, and two units having variations in key component materials. Tests will be conducted at varying current and thermal conditions.
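The link between cell current and gas production in a water electrolyzer follows directly from Faraday's law (4 electrons per O2 molecule, 2 per H2), which is handy when planning tests at varying currents. The sketch below is a generic calculation with illustrative numbers, not EPICS test conditions.

```python
# Faraday's-law estimate of O2/H2 production for a water electrolyzer stack.
# Example current and cell count are illustrative, not EPICS conditions.
F = 96485.0          # Faraday constant (C/mol)

def electrolysis_rates(current_a: float, cells: int = 1):
    """Return (O2, H2) production in grams per hour for a cell stack."""
    mol_e_per_s = current_a * cells / F          # mol of electrons per second
    o2_g_per_h = mol_e_per_s / 4 * 32.0 * 3600   # O2: 4 e- per molecule
    h2_g_per_h = mol_e_per_s / 2 * 2.016 * 3600  # H2: 2 e- per molecule
    return o2_g_per_h, h2_g_per_h

o2, h2 = electrolysis_rates(current_a=10.0, cells=3)
print(f"O2: {o2:.2f} g/h, H2: {h2:.2f} g/h")
```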
U.S.A.B.I.L.I.T.Y. Framework for Older Adults.
Caboral-Stevens, Meriam; Whetsell, Martha V; Evangelista, Lorraine S; Cypress, Brigitte; Nickitas, Donna
2015-01-01
The purpose of the current study was to present a framework to determine the potential usability of health websites by older adults. A review of the literature showed a paucity of nursing theory related to the use of technology and usability, particularly in older adults. The Roy Adaptation Model, a widely used nursing theory, was chosen to provide the framework for the new model. Technology constructs from the Technology Acceptance Model and the Unified Theory of Acceptance and Use of Technology, and the behavioral control construct from the Theory of Planned Behavior, were integrated into the construction of the derived model. The Use of Technology for Adaptation by Older Adults and/or Those With Limited Literacy (U.S.A.B.I.L.I.T.Y.) Model was constructed from the integration of these diverse theoretical/conceptual perspectives. The four determinants of usability in the conceptual model are (a) efficiency, (b) learnability, (c) perceived user experience, and (d) perceived control. Because of the lack of well-validated survey questionnaires to measure these determinants, a U.S.A.B.I.L.I.T.Y. Survey was developed. A panel of experts evaluated the face and content validity of the new instrument, and its internal consistency was 0.96. Usability is key to accepting technology, and the derived U.S.A.B.I.L.I.T.Y. framework could serve as a guide for nurses in the formative evaluation of technology. Copyright 2015, SLACK Incorporated.
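Internal consistency of the kind reported here (0.96) is conventionally computed as Cronbach's alpha. The sketch below computes it from a respondents-by-items score matrix; the item count and data are synthetic illustrations, not the survey's.

```python
# Cronbach's alpha, the internal-consistency statistic reported for the
# U.S.A.B.I.L.I.T.Y. Survey (0.96 there). Data below are synthetic.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
trait = rng.normal(size=(200, 1))                        # latent usability perception
items = trait + rng.normal(scale=0.5, size=(200, 12))    # 12 correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")
```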
NASA Technical Reports Server (NTRS)
Wachholz, James J.; Murphy, David M.
1996-01-01
The SCARLET I (Solar Concentrator Array with Refractive Linear Element Technology) solar array wing was designed and built to demonstrate, in flight, the feasibility of integrating deployable concentrator optics within the design envelope of typical rigid array technology. Innovative mechanism designs were used throughout the array, and a full series of qualification tests was successfully performed in anticipation of a flight on the Multiple Experiment Transporter to Earth Orbit and Return (METEOR) spacecraft. Even though the Conestoga launch vehicle was unable to place the spacecraft in orbit, the program effort was successful in achieving the milestones of analytical and design development, functional validation, and flight qualification, thus leading to a future flight evaluation of the SCARLET technology.
Simulation studies for the evaluation of health information technologies: experiences and results.
Ammenwerth, Elske; Hackl, Werner O; Binzer, Kristine; Christoffersen, Tue E H; Jensen, Sanne; Lawton, Kitta; Skjoet, Peter; Nohr, Christian
It is essential for new health information technologies (IT) to undergo rigorous evaluations to ensure they are effective and safe for use in real-world situations. However, evaluation of new health IT is challenging, as field studies are often not feasible when the technology being evaluated is not sufficiently mature. Laboratory-based evaluations have also been shown to have insufficient external validity. Simulation studies seem to be a way to bridge this gap. The aim of this study was to evaluate, using a simulation methodology, the impact of a new prototype of an electronic medication management system on the appropriateness of prescriptions and drug-related activities, including laboratory test ordering or medication changes. This article presents the results of a controlled simulation study with 50 simulation runs, including ten doctors and five simulation patients, and discusses experiences and lessons learnt while conducting the study. Although the new electronic medication management system showed tendencies to improve medication safety when compared with the standard system, this tendency was not significant. Altogether, five distinct situations were identified where the new medication management system did help to improve medication safety. This simulation study provided a good compromise between internal validity and external validity. However, several challenges need to be addressed when undertaking simulation evaluations including: preparation of adequate test cases; training of participants before using unfamiliar applications; consideration of time, effort and costs of conducting the simulation; technical maturity of the evaluated system; and allowing adequate preparation of simulation scenarios and simulation setting. Simulation studies are an interesting but time-consuming approach, which can be used to evaluate newly developed health IT systems, particularly those systems that are not yet sufficiently mature to undergo field evaluation studies.
Ground Testing of a 10 K Sorption Cryocooler Flight Experiment (BETSCE)
NASA Technical Reports Server (NTRS)
Bard, S.; Wu, J.; Karlmann, P.; Cowgill, P.; Mirate, C.; Rodriguez, J.
1994-01-01
The Brilliant Eyes Ten-Kelvin Sorption Cryocooler Experiment (BETSCE) is a Space Shuttle side-wall-mounted flight experiment designed to demonstrate 10 K sorption cryocooler technology in a space environment. The BETSCE objectives are to: (1) provide a thorough end-to-end characterization and space performance validation of a complete, multistage, automated, closed-cycle hydride sorption cryocooler in the 10 to 30 K temperature range, (2) acquire the quantitative microgravity database required to provide confident engineering design, scaling, and optimization, (3) advance the enabling technologies and resolve integration issues, and (4) provide hardware qualification and safety verification heritage. BETSCE ground tests were the first-ever demonstration of a complete closed-cycle 10 K sorption cryocooler. Test results exceeded functional requirements. This paper summarizes functional and environmental ground test results, planned characterization tests, important development challenges that were overcome, and valuable lessons-learned.
Wilkinson, Ann; While, Alison E; Roberts, Julia
2009-04-01
This paper is a report of a review to describe and discuss the psychometric properties of instruments used in healthcare education settings measuring experience and attitudes of healthcare students regarding their information and communication technology skills and their use of computers and the Internet for education. Healthcare professionals are expected to be computer and information literate at registration. A previous review of evaluative studies of computer-based learning suggests that methods of measuring learners' attitudes to computers and computer aided learning are problematic. A search of eight health and social science databases located 49 papers, the majority published between 1995 and January 2007, focusing on the experience and attitudes of students in the healthcare professions towards computers and e-learning. An integrative approach was adopted, with narrative description of findings. Criteria for inclusion were quantitative studies using survey tools with samples of healthcare students and concerning computer and information literacy skills, access to computers, experience with computers and use of computers and the Internet for education purposes. Since the 1980s a number of instruments have been developed, mostly in the United States of America, to measure attitudes to computers, anxiety about computer use, information and communication technology skills, satisfaction and more recently attitudes to the Internet and computers for education. The psychometric properties are poorly described. Advances in computers and technology mean that many earlier tools are no longer valid. Measures of the experience and attitudes of healthcare students to the increased use of e-learning require development in line with computer and technology advances.
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; O'Donnell, James R.; Hsu, Oscar H.; Ziemer, John K.; Dunn, Charles E.
2017-01-01
The Space Technology-7 Disturbance Reduction System (DRS) is an experiment package aboard the European Space Agency (ESA) LISA Pathfinder spacecraft. LISA Pathfinder launched from Kourou, French Guiana on December 3, 2015. The DRS is tasked to validate two specific technologies: colloidal micro-Newton thrusters (CMNT) to provide low-noise control capability of the spacecraft, and drag-free flight control. This validation is performed using highly sensitive drag-free sensors, which are provided by the LISA Technology Package of the European Space Agency. The Disturbance Reduction System is required to maintain the spacecraft's position with respect to a free-floating test mass to better than 10 nm/√Hz along its sensitive axis (the axis in optical metrology). It also has a goal of limiting the residual accelerations of either of the two test masses to below 30 × 10⁻¹⁴ (1 + [f/3 mHz]²) m/s²/√Hz over the frequency range of 1 to 30 mHz. This paper briefly describes the design and the expected on-orbit performance of the control system for the two modes wherein the drag-free performance requirements are verified. The on-orbit performance of these modes is then compared to the requirements, as well as to the expected performance, and discussed.
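The acceleration-noise goal quoted above is easiest to read as a function of frequency; a minimal sketch that evaluates it over the 1-30 mHz verification band, directly transcribing the formula from the abstract:

```python
def drs_accel_goal(f_hz):
    """Residual acceleration goal from the abstract:
    30e-14 * (1 + (f / 3 mHz)^2) m/s^2/sqrt(Hz), over 1 mHz <= f <= 30 mHz."""
    f_mhz = f_hz * 1e3  # Hz -> mHz
    return 30e-14 * (1.0 + (f_mhz / 3.0) ** 2)

# The goal is tightest near the low-frequency floor and relaxes quadratically.
for f in (1e-3, 3e-3, 30e-3):
    print(f"{f * 1e3:5.1f} mHz -> {drs_accel_goal(f):.2e} m/s^2/sqrt(Hz)")
```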
Central Limit Theorem: New SOCR Applet and Demonstration Activity
Dinov, Ivo D.; Christou, Nicolas; Sanchez, Juana
2011-01-01
Modern approaches for information technology based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multifaceted learning environments, which may facilitate student comprehension and information retention. In this manuscript, we describe one such innovative effort of using technological tools for improving student motivation and learning of the theory, practice and usability of the Central Limit Theorem (CLT) in probability and statistics courses. Our approach is based on harnessing the computational libraries developed by the Statistics Online Computational Resource (SOCR) to design a new interactive Java applet and a corresponding demonstration activity that illustrate the meaning and the power of the CLT. The CLT applet and activity have clear common goals: to provide graphical representation of the CLT, to improve student intuition, and to empirically validate and establish the limits of the CLT. The SOCR CLT activity consists of four experiments that demonstrate the assumptions, meaning and implications of the CLT and ties these to specific hands-on simulations. We include a number of examples illustrating the theory and applications of the CLT. Both the SOCR CLT applet and activity are freely available online to the community to test, validate and extend (Applet: http://www.socr.ucla.edu/htmls/SOCR_Experiments.html and Activity: http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_GeneralCentralLimitTheorem). PMID:21833159
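The convergence the applet visualizes can be reproduced in a few lines; a minimal sketch (not SOCR code) that draws sample means from a skewed population and checks the normal approximation the CLT predicts:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample means of a skewed (exponential) population: the CLT predicts they
# are approximately normal with mean mu and standard error mu / sqrt(n).
mu, n, reps = 1.0, 30, 10_000
means = rng.exponential(scale=mu, size=(reps, n)).mean(axis=1)

print("mean of sample means:", means.mean())       # ~ mu = 1.0
print("std of sample means :", means.std(ddof=1))  # ~ mu / sqrt(30) ~= 0.183
```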
Provenance for Runtime Workflow Steering and Validation in Computational Seismology
NASA Astrophysics Data System (ADS)
Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.
2014-12-01
Provenance systems may be offered by modern workflow engines to collect metadata about data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long-lasting cross-correlation analyses and high-resolution simulations, the immediate notification of logical errors and rapid access to intermediate results can produce reactions that foster more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine-grained provenance and the development of provenance-aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc.). This work looks at the adoption of W3C-PROV concepts and data model within a user-driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community's understanding. Moreover, it can contain user-defined terms and annotations. The current implementation of the system is supported by the EU-funded VERCE project (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL storage technology, experimenting with ways to ensure rapid and flexible access to the lineage traces. It supports users with the visualisation of graphical products and offers combined operations to access and download the data, which may be selectively stored at runtime into dedicated data archives.
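As a rough illustration of the W3C-PROV data model mentioned above, a sketch using the Python prov package; the namespace, entity, activity, and agent names are invented for illustration and are not those of the VERCE system:

```python
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace('seis', 'http://example.org/seismology#')  # invented namespace

# One workflow step: a cross-correlation activity, run by a scientist,
# generating an intermediate trace that a validation step can inspect.
xcorr = doc.activity('seis:crossCorrelationRun42')
trace = doc.entity('seis:correlationTrace42')
user = doc.agent('seis:scientistJDoe')

doc.wasGeneratedBy(trace, xcorr)
doc.wasAssociatedWith(xcorr, user)

print(doc.serialize(indent=2))  # PROV-JSON, suitable for a NoSQL lineage store
```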
Structural Analysis and Testing of the Inflatable Re-entry Vehicle Experiment (IRVE)
NASA Technical Reports Server (NTRS)
Lindell, Michael C.; Hughes, Stephen J.; Dixon, Megan; Wiley, Cliff E.
2006-01-01
The Inflatable Re-entry Vehicle Experiment (IRVE) is a 3.0 meter, 60 degree half-angle sphere cone, inflatable aeroshell experiment designed to demonstrate various aspects of inflatable technology during Earth re-entry. IRVE will be launched on a Terrier-Improved Orion sounding rocket from NASA's Wallops Flight Facility in the fall of 2006 to an altitude of approximately 164 kilometers and re-enter the Earth's atmosphere. The experiment will demonstrate exo-atmospheric inflation, inflatable structure leak performance throughout the flight regime, structural integrity under aerodynamic pressure and associated deceleration loads, thermal protection system performance, and aerodynamic stability. Structural integrity and dynamic response of the inflatable will be monitored with photogrammetric measurements of the leeward side of the aeroshell during flight. Aerodynamic stability and drag performance will be verified with on-board inertial measurements and radar tracking from multiple ground radar stations. In addition to demonstrating inflatable technology, IRVE will help validate structural, aerothermal, and trajectory modeling and analysis techniques for the inflatable aeroshell system. This paper discusses the structural analysis and testing of the IRVE inflatable structure. Equations are presented for calculating fabric loads in sphere cone aeroshells, and finite element results are presented which validate the equations. Fabric material properties and testing are discussed along with aeroshell fabrication techniques. Stiffness and dynamics tests conducted on a small-scale development unit and a full-scale prototype unit are presented along with correlated finite element models used to predict the in-flight fundamental modes.
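The fabric-load equations themselves are not reproduced in the abstract. For orientation, classical membrane theory for a cone of half-angle α at local radius r under net pressure p gives stress resultants of the form below (a textbook relation, not necessarily the paper's exact formulation), where N_θ is the hoop load and N_s the meridional load per unit length:

```latex
N_\theta = \frac{p\, r}{\cos\alpha}, \qquad
N_s = \frac{p\, r}{2\cos\alpha}
```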
Inflatable Re-Entry Vehicle Experiment (IRVE) Design Overview
NASA Technical Reports Server (NTRS)
Hughes, Stephen J.; Dillman, Robert A.; Starr, Brett R.; Stephan, Ryan A.; Lindell, Michael C.; Player, Charles J.; Cheatwood, F. McNeil
2005-01-01
Inflatable aeroshells offer several advantages over traditional rigid aeroshells for atmospheric entry. Inflatables offer increased payload volume fraction of the launch vehicle shroud and the possibility to deliver more payload mass to the surface for equivalent trajectory constraints. An inflatable's diameter is not constrained by the launch vehicle shroud. The resultant larger drag area can provide deceleration equivalent to a rigid system at higher atmospheric altitudes, thus offering access to higher landing sites. When stowed for launch and cruise, inflatable aeroshells allow access to the payload after the vehicle is integrated for launch and offer direct access to vehicle structure for structural attachment with the launch vehicle. They also offer an opportunity to eliminate system duplication between the cruise stage and entry vehicle. There are, however, several potential technical challenges for inflatable aeroshells. First and foremost is the fact that they are flexible structures. That flexibility could lead to unpredictable drag performance or an aerostructural dynamic instability. In addition, durability of large inflatable structures may limit their application. They are susceptible to puncture, a potentially catastrophic insult, from many possible sources. Finally, aerothermal heating during planetary entry poses a significant challenge to a thin membrane. NASA Langley Research Center and NASA's Wallops Flight Facility are jointly developing inflatable aeroshell technology for use on future NASA missions. The technology will be demonstrated in the Inflatable Re-entry Vehicle Experiment (IRVE). This paper will detail the development of the initial IRVE inflatable system to be launched on a Terrier/Orion sounding rocket in the fourth quarter of CY2005. The experiment will demonstrate achievable packaging efficiency of the inflatable aeroshell for launch, inflation, leak performance of the inflatable system throughout the flight regime, structural integrity when exposed to a relevant dynamic pressure and aerodynamic stability of the inflatable system. Structural integrity and structural response of the inflatable will be verified with photogrammetric measurements of the back side of the aeroshell in flight. Aerodynamic stability as well as drag performance will be verified with on board inertial measurements and radar tracking from multiple ground radar stations. The experiment will yield valuable information about zero-g vacuum deployment dynamics of the flexible inflatable structure with both inertial and photographic measurements. In addition to demonstrating inflatable technology, IRVE will validate structural, aerothermal, and trajectory modeling techniques for the inflatable. Structural response determined from photogrammetrics will validate structural models, skin temperature measurements and additional in-depth temperature measurements will validate material thermal performance models, and on board inertial measurements along with radar tracking from multiple ground radar stations will validate trajectory simulation models.
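The altitude benefit described above follows from the ballistic coefficient β = m/(C_D A): inflating to a larger drag area lowers β, so a given deceleration is reached where the atmosphere is thinner. A minimal sketch with illustrative numbers (not mission values):

```python
import math

def ballistic_coefficient(mass_kg, cd, diameter_m):
    """beta = m / (Cd * A); a lower beta decelerates higher in the atmosphere."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return mass_kg / (cd * area)

# Same 20 kg entry mass behind a 0.5 m rigid aeroshell versus a 3.0 m
# inflatable like the one above (Cd ~ 1.5 assumed for a 60 deg sphere cone).
print(ballistic_coefficient(20.0, 1.5, 0.5))  # ~68 kg/m^2
print(ballistic_coefficient(20.0, 1.5, 3.0))  # ~1.9 kg/m^2
```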
Li, Tingwen; Rogers, William A.; Syamlal, Madhava; ...
2016-07-29
Here, the MFiX suite of multiphase computational fluid dynamics (CFD) codes is being developed at the U.S. Department of Energy's National Energy Technology Laboratory (NETL). It includes several different approaches to multiphase simulation: MFiX-TFM, a two-fluid (Eulerian-Eulerian) model; MFiX-DEM, an Eulerian fluid model with a Lagrangian Discrete Element Model for the solids phase; and MFiX-PIC, an Eulerian fluid model with Lagrangian particle 'parcels' representing particle groups. These models are undergoing continuous development and application, with verification, validation, and uncertainty quantification (VV&UQ) as integrated activities. After a brief summary of recent progress in VV&UQ, this article highlights two recent accomplishments in the application of MFiX-TFM to fossil energy technology development. First, a recent application of MFiX to the pilot-scale KBR TRIG™ Transport Gasifier located at DOE's National Carbon Capture Center (NCCC) is described. Gasifier performance over a range of operating conditions was modeled and compared to NCCC operational data to validate the ability of the model to predict parametric behavior. Second, comparison of code predictions at a detailed fundamental scale is presented, studying solid sorbents for the post-combustion capture of CO2 from flue gas. Specifically designed NETL experiments are being used to validate hydrodynamics and chemical kinetics for the sorbent-based carbon capture process.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on directly observed occurrences rather than assumed functional forms. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications supporting inspector qualifications are also included.
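For context, the 90/95 criterion can be checked against hit/miss data with an exact (Clopper-Pearson) lower confidence bound on the detection proportion; for example, 29 hits in 29 trials just clears it. A minimal sketch of the standard binomial statistics (not the DOEPOD software itself):

```python
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    """Exact (Clopper-Pearson) lower confidence bound on POD from hit/miss data."""
    if hits == 0:
        return 0.0
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

print(pod_lower_bound(29, 29))  # ~0.902 -> meets the 90/95 POD criterion
print(pod_lower_bound(28, 29))  # ~0.846 -> a single miss at n = 29 fails it
```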
Measuring Tropospheric Winds from Space Using a Coherent Doppler Lidar Technique
NASA Technical Reports Server (NTRS)
Miller, Timothy L.; Kavaya, Michael J.; Emmitt, G. David
1999-01-01
The global measurement of tropospheric wind profiles has been cited by the operational meteorological community as the most important missing element in the present and planned observing system. The most practical and economical method for obtaining this measurement is from low Earth orbit, utilizing a Doppler lidar (laser radar) technique. Specifically, this paper will describe the coherent Doppler wind lidar (CDWL) technique, the design and progress of a current space flight project to fly such a system on the Space Shuttle, and plans for future flights of similar instruments. The SPARCLE (SPAce Readiness Coherent Lidar Experiment) is a Shuttle-based instrument whose flight is targeted for March 2001. The objectives of SPARCLE are three-fold: confirm that the coherent Doppler lidar technique can measure line-of-sight winds to within 1-2 m/s accuracy; collect data to permit validation and improvement of instrument performance models to enable better design of future missions; and collect wind and backscatter data for future mission optimization and for atmospheric studies. These objectives reflect the nature of the experiment and its program sponsor, NASA's New Millennium Program. The experiment is a technology validation mission whose primary purpose is to provide a space flight validation of this particular technology. (It should be noted that the CDWL technique has been implemented successfully from ground-based and aircraft-based platforms for a number of years.) Since the conduct of the SPARCLE mission is tied to future decisions on the choice of technology for free-flying, operational missions, the collection of data is intrinsically tied to the validation and improvement of instrument performance models that predict the sensitivity and accuracy of any particular present or future instrument system. The challenges unique to space flight for an instrument such as SPARCLE and follow-ons include: obtaining the required lidar sensitivity over the long distance from orbit height to the lower atmosphere; maintaining optical alignments after launch to orbit and during operations in "microgravity"; obtaining pointing knowledge of sufficient accuracy to remove the speed of the spacecraft (and the rotating Earth) from the measurements; and providing sufficient power (not a problem on the Shuttle) and cooling to the instrument. The paper will describe the status and challenges of the SPARCLE project and the value of obtaining wind data from orbit, and will present a roadmap to future instruments for scientific research and operational meteorology.
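The line-of-sight (LOS) wind follows from the Doppler relation v = λf_D/2; at the roughly 2 µm wavelength typical of coherent wind lidars (assumed here for SPARCLE), a 1 MHz shift corresponds to about 1 m/s. A minimal sketch:

```python
def los_wind(doppler_shift_hz, wavelength_m=2.05e-6):
    """Line-of-sight wind from a coherent-lidar Doppler shift: v = lambda * f_D / 2."""
    return 0.5 * wavelength_m * doppler_shift_hz

# A ~1 MHz shift at ~2 um corresponds to ~1 m/s of line-of-sight wind.
print(los_wind(1.0e6))  # ~1.03 m/s
```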
A conceptual framework of outcomes for caregivers of assistive technology users.
Demers, Louise; Fuhrer, Marcus J; Jutai, Jeffrey; Lenker, James; Depa, Malgorzata; De Ruyter, Frank
2009-08-01
To develop and validate the content of a conceptual framework concerning outcomes for caregivers whose recipients are assistive technology users. The study was designed in four stages. First, a list of potential key variables relevant to the caregivers of assistive technology users was generated from a review of the existing literature and semistructured interviews with caregivers. Second, the variables were analyzed, regrouped, and partitioned, using a conceptual mapping approach. Third, the key areas were anchored in a general stress model of caregiving. Finally, the judgments of rehabilitation experts were used to evaluate the conceptual framework. An important result of this study is the identification of a complex set of variables that need to be considered when examining the experience of caregivers of assistive technology users. Stressors, such as types of assistance, number of tasks, and physical effort, are predominant contributors to caregiver outcomes, along with caregivers' personal resources acting as mediating factors (intervening variables) and assistive technology acting as a key moderating factor (effect modifier variable). Recipients' use of assistive technology can enhance caregivers' well-being because of its potential for alleviating a number of stressors associated with caregiving. Viewed as a whole, this work demonstrates that the assistive technology experience of caregivers has many facets that merit the attention of outcomes researchers.
Nicola, Kristy; Waugh, Jemimah; Charles, Emily; Russell, Trevor
2018-06-01
In rural and remote communities, children with motor difficulties have less access to rehabilitation services. Telerehabilitation technology is a potential method to overcome barriers restricting access to healthcare in these areas. Assessment is necessary to guide clinical reasoning; however, it is unclear which paediatric assessments can be administered remotely. The Movement Assessment Battery for Children - 2nd Edition is commonly used by various health professionals to assess motor performance of children. The aim of this study was to investigate the feasibility and concurrent validity of performing the Movement Assessment Battery for Children - 2nd Edition remotely via telerehabilitation technology compared to the conventional in-person method. Fifty-nine children enrolled in a state school (5-11 years old) volunteered to perform one in-person and one telerehabilitation-mediated assessment. The order of the method of delivery and the therapist performing the assessment were randomized. After both assessments were complete, a participant satisfaction questionnaire was completed by each child. The Bland-Altman limits of agreement for the total test standard score were -3.15 to 3.22, which is smaller than a pre-determined clinically acceptable margin based on the smallest detectable change. This study establishes the feasibility and concurrent validity of the administration of the Movement Assessment Battery for Children - 2nd Edition via telerehabilitation technology. Overall, participants perceived their experience with telerehabilitation positively.
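The agreement statistic quoted above is the Bland-Altman 95% limits of agreement, the mean paired difference plus or minus 1.96 times its standard deviation; a minimal sketch with made-up scores (not study data):

```python
import numpy as np

def limits_of_agreement(in_person, remote):
    """Bland-Altman 95% limits of agreement for paired scores."""
    d = np.asarray(remote, float) - np.asarray(in_person, float)
    return d.mean() - 1.96 * d.std(ddof=1), d.mean() + 1.96 * d.std(ddof=1)

# Illustrative paired total test standard scores.
in_person = [9, 7, 10, 8, 11, 6, 9]
remote = [10, 7, 9, 8, 12, 7, 9]
print(limits_of_agreement(in_person, remote))
```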
ERIC Educational Resources Information Center
Hassad, Rossi; Coxon, APM
2007-01-01
Despite more than a decade of reform efforts, students continue to experience difficulty understanding and applying statistical concepts. The predominant focus of reform has been on content, pedagogy, technology and assessment, with little attention to instructor characteristics. However, there is strong theoretical and empirical evidence that…
ERIC Educational Resources Information Center
Patacsil, Frederick F.; Tablatin, Christine Lourrine S.
2017-01-01
The research paper proposes a skills gap methodology that utilized the respondent experiences in the internship program to measure the importance of the Information Technology (IT) skills gap as perceived by IT students and the industry. The questionnaires were formulated based on previous studies, however, was slightly modified, validated and…
The Transition from Spacecraft Development to Flight Operation: Human Factor Considerations
NASA Technical Reports Server (NTRS)
Basilio, Ralph R.
2000-01-01
In the field of aeronautics and astronautics, a paradigm shift has been witnessed by those in academia, research and development, and private industry. Long development life cycles and the budgets to support such programs and projects have given way to aggressive task schedules and leaner resources to draw from, all the while challenging assigned individuals to create and produce improved products or processes. However, this "faster, better, cheaper" concept cannot merely be applied to the design, development, and test of complex systems such as Earth-orbiting or interplanetary robotic spacecraft. Full advantage is not possible without due consideration and application to mission operations planning and flight operations. Equally as important as the flight system, the mission operations system, consisting of qualified personnel, ground hardware and software tools, and verified and validated operational processes, should also be regarded as a complex system requiring personnel to draw upon formal education, training, related experiences, and heuristic reasoning in engineering an effective and efficient system. Unquestionably, qualified personnel are the most important element of a mission operations system. This paper examines the experiences of the Deep Space 1 Project, the first in a series of new technology in-flight validation missions sponsored by the United States National Aeronautics and Space Administration (NASA), specifically in developing a subsystems analysis and technology validation team comprised of former spacecraft development personnel. Human factor considerations are investigated from initial concept/vision formulation, through operational process development and personnel test and training, to initial uplink product development and test support. Emphasis has been placed on challenges and applied or recommended solutions, so as to provide opportunities for future programs and projects to address and disposition potential issues and concerns as early as possible to reap the benefits associated with learning from others' past experiences.
Hydrazine Catalyst Production: Sustaining S-405 Technology
NASA Technical Reports Server (NTRS)
Wucherer, E. J.; Cook, Timothy; Stiefel, Mark; Humphries, Randy, Jr.; Parker, Janet
2003-01-01
The development of the iridium-based Shell 405 catalyst for spontaneous decomposition of hydrazine was one of the key enabling technologies for today's spacecraft and launch vehicles. To ensure that this crucial technology was not lost when Shell elected to exit the business, Aerojet, supported by NASA, has developed a dedicated catalyst production facility that will supply catalyst for future spacecraft and launch vehicle requirements. We have undertaken a program to transfer catalyst production from Shell Chemical USA (Houston, TX) to Aerojet's Redmond, WA location. This technology transition was aided by Aerojet's 30 years of catalyst manufacturing experience and NASA diligence and support in sustaining essential technologies. The facility has produced and tested S-405 catalyst to existing Shell 405 specifications and standards. Our presentation will describe the technology transition effort including development of the manufacturing facility, capture of the manufacturing process, test equipment validation, initial batch build and final testing.
Technology readiness levels for the new millennium program
NASA Technical Reports Server (NTRS)
Moynihan, P. I.; Minning, C. P.; Stocky, J. F.
2003-01-01
NASA's New Millennium Program (NMP) seeks to advance space exploration by providing an in-space validation mechanism to verify the maturity of promising advanced technologies that cannot be adequately validated with Earth-based testing alone. In meeting this objective, NMP uses NASA Technology Readiness Levels (TRL) as key indicators of technology advancement and assesses development progress against this generalized metric. By providing an opportunity for in-space validation, NMP can mature a suitable advanced technology from TRL 4 (component and/or breadboard validation in laboratory environment) to a TRL 7 (system prototype demonstrated in an Earth-based space environment). Spaceflight technology comprises a myriad of categories, types, and functions, and as each individual technology emerges, a consistent interpretation of its specific state of technological advancement relative to other technologies is problematic.
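The TRL span referred to above can be captured as a simple lookup using the standard NASA definitions (abridged here), which is convenient when scripting readiness assessments:

```python
# Abridged NASA TRL definitions spanning the NMP validation range (TRL 4 -> 7).
TRL = {
    4: "component and/or breadboard validation in laboratory environment",
    5: "component and/or breadboard validation in relevant environment",
    6: "system/subsystem model or prototype demonstration in a relevant environment",
    7: "system prototype demonstration in a space environment",
}

for level in range(4, 8):
    print(f"TRL {level}: {TRL[level]}")
```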
Development and Validation of the Student Tool for Technology Literacy (ST[superscript 2]L)
ERIC Educational Resources Information Center
Hohlfeld, Tina N.; Ritzhaupt, Albert D.; Barron, Ann E.
2010-01-01
This article provides an overview of the development and validation of the Student Tool for Technology Literacy (ST[superscript 2]L). Developing valid and reliable objective performance measures for monitoring technology literacy is important to all organizations charged with equipping students with the technology skills needed to successfully…
NASA Technical Reports Server (NTRS)
Foster, John D.; Moralez, Ernesto, III; Franklin, James A.; Schroeder, Jeffery A.
1987-01-01
Results of a substantial body of ground-based simulation experiments indicate that a high degree of precision of operation for recovery aboard small ships in heavy seas and low visibility, with acceptable levels of pilot effort, can be achieved by integrating the aircraft flight and propulsion controls. The availability of digital fly-by-wire controls makes it feasible to implement an integrated control design to achieve and demonstrate in flight the operational benefits promised by the simulation experience. It remains to validate these system concepts in flight to establish their value for advanced short takeoff and vertical landing (STOVL) aircraft designs. This paper summarizes analytical studies and simulation experiments which provide a basis for the flight research program that will develop and validate critical technologies for advanced STOVL aircraft through the development and evaluation of advanced, integrated control and display concepts, and lays out the plan for the flight program that will be conducted on NASA's V/STOL Research Aircraft (VSRA).
Development of a PPT for the EO-1 Spacecraft
NASA Technical Reports Server (NTRS)
Benson, Scott W.; Arrington, Lynn A.; Hoskins, W. Andrew; Meckel, Nicole J.
2000-01-01
A Pulsed Plasma Thruster (PPT) has been developed for use in a technology demonstration flight experiment on the Earth Observing 1 (EO-1) New Millennium Program mission. The thruster replaces the spacecraft pitch axis momentum wheel for control and momentum management during an experiment of a minimum three-day duration. The EO-1 PPT configuration is a combination of new technology and design heritage from similar systems flown in the 1970's and 1980's. Acceptance testing of the protoflight unit has validated readiness for flight, and integration with the spacecraft, including initial combined testing, has been completed. The thruster provides a range of capability from 90 microN-sec impulse bit at 650 sec specific impulse for 12 W input power, through 860 microN-sec impulse bit at 1400 sec specific impulse for 70 W input power. Development of this thruster reinitiates technology research and development and re-establishes an industry base for production of flight hardware. This paper reviews the EO-1 PPT development, including technology selection, design and fabrication, acceptance testing, and initial spacecraft integration and test.
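Per-pulse propellant consumption follows from the impulse bit and specific impulse, m_bit = I_bit/(g0 * Isp); a quick check of the two operating points quoted above:

```python
G0 = 9.80665  # standard gravity, m/s^2

def propellant_per_pulse(impulse_bit_ns, isp_s):
    """Propellant mass consumed per pulse: m_bit = I_bit / (g0 * Isp)."""
    return impulse_bit_ns / (G0 * isp_s)

# Operating points from the abstract: 90 uN-s at 650 s and 860 uN-s at 1400 s.
print(propellant_per_pulse(90e-6, 650))    # ~1.4e-8 kg (~14 micrograms/pulse)
print(propellant_per_pulse(860e-6, 1400))  # ~6.3e-8 kg (~63 micrograms/pulse)
```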
HWDA: A coherence recognition and resolution algorithm for hybrid web data aggregation
NASA Astrophysics Data System (ADS)
Guo, Shuhang; Wang, Jian; Wang, Tong
2017-09-01
Aiming at the object conflict recognition and resolution problem in hybrid distributed data stream aggregation, a distributed data stream object coherence technology is proposed. First, a framework for object coherence conflict recognition and resolution, named HWDA, is defined. Second, an object coherence recognition technique is proposed based on formal-language description logic and the hierarchical dependency relationships between logic rules. Third, a conflict traversal recognition algorithm is proposed based on the defined dependency graph. Next, a conflict resolution technique based on resolution pattern matching is presented, including the definition of three types of conflict, the conflict resolution matching patterns, and an arbitration resolution method. Finally, experiments on two kinds of web test data sets validate the effectiveness of applying the conflict recognition and resolution technology of HWDA.
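The abstract does not give the rule formalism, so the following is only a rough illustration of the traversal idea: a sketch that walks a rule-dependency graph and flags rules reachable from one another that bind the same attribute to different values (all names and the conflict test are hypothetical):

```python
# Hypothetical rules: each binds an attribute to a value and may depend on others.
rules = {
    "r1": {"attr": "price", "value": 10, "deps": []},
    "r2": {"attr": "price", "value": 12, "deps": ["r1"]},
    "r3": {"attr": "stock", "value": 5, "deps": ["r1"]},
}

def find_conflicts(rules):
    """Traverse the rule-dependency graph; report rule pairs on a dependency
    path that bind the same attribute to different values."""
    def ancestors(rule_id):
        seen, stack = set(), list(rules[rule_id]["deps"])
        while stack:
            r = stack.pop()
            if r not in seen:
                seen.add(r)
                stack.extend(rules[r]["deps"])
        return seen

    conflicts = []
    for rid, spec in rules.items():
        for anc in ancestors(rid):
            other = rules[anc]
            if other["attr"] == spec["attr"] and other["value"] != spec["value"]:
                conflicts.append((anc, rid, spec["attr"]))
    return conflicts

print(find_conflicts(rules))  # [('r1', 'r2', 'price')]
```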
Dozier, Samantha; Brown, Jeffrey; Currie, Alistair
2011-01-01
Simple Summary: Many vaccines are tested for quality in experiments that require the use of large numbers of animals in procedures that often cause significant pain and distress. Newer technologies have fostered the development of vaccine quality control tests that reduce or eliminate the use of animals, but the availability of these newer methods has not guaranteed their acceptance by regulators or use by manufacturers. We discuss a strategic approach that has been used to assess and ultimately increase the use of non-animal vaccine quality tests in the U.S. and U.K. Abstract: In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches. PMID:26486625
A novel augmented reality simulator for skills assessment in minimal invasive surgery.
Lahanas, Vasileios; Loukas, Constantinos; Smailis, Nikolaos; Georgiou, Evangelos
2015-08-01
Over the past decade, simulation-based training has come to the foreground as an efficient method for training and assessment of surgical skills in minimal invasive surgery. Box-trainers and virtual reality (VR) simulators have been introduced in the teaching curricula and have substituted to some extent the traditional model of training based on animals or cadavers. Augmented reality (AR) is a new technology that allows blending of VR elements and real objects within a real-world scene. In this paper, we present a novel AR simulator for assessment of basic laparoscopic skills. The components of the proposed system include: a box-trainer, a camera and a set of laparoscopic tools equipped with custom-made sensors that allow interaction with VR training elements. Three AR tasks were developed, focusing on basic skills such as perception of depth of field, hand-eye coordination and bimanual operation. The construct validity of the system was evaluated via a comparison between two experience groups: novices with no experience in laparoscopic surgery and experienced surgeons. The observed metrics included task execution time, tool path length and two task-specific errors. The study also included a feedback questionnaire requiring participants to evaluate the face validity of the system. Between-group comparison demonstrated highly significant differences (p < 0.01) in all performance metrics and tasks, denoting the simulator's construct validity. Qualitative analysis of the instruments' trajectories highlighted differences between novices and experts regarding smoothness and economy of motion. Subjects' ratings on the feedback questionnaire highlighted the face validity of the training system. The results highlight the potential of the proposed simulator to discriminate groups with different expertise, providing a proof of concept for the potential use of AR as a core technology for laparoscopic simulation training.
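Of the metrics listed, tool path length is the simplest to compute from tracked tool-tip positions: the sum of Euclidean distances between successive samples. A minimal sketch:

```python
import numpy as np

def path_length(positions):
    """Total tool-tip path length from an (N, 3) array of tracked positions."""
    p = np.asarray(positions, float)
    return np.linalg.norm(np.diff(p, axis=0), axis=1).sum()

# Illustrative trajectory samples (mm).
print(path_length([(0, 0, 0), (1, 0, 0), (1, 2, 0), (0, 2, 1)]))  # ~4.41
```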
A Perspective on Coupled Multiscale Simulation and Validation in Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. P. Short; D. Gaston; C. R. Stanek
2014-01-01
The field of nuclear materials encompasses numerous opportunities to address and ultimately solve longstanding industrial problems by improving the fundamental understanding of materials through the integration of experiments with multiscale modeling and high-performance simulation. A particularly noteworthy example is an ongoing study of axial power distortions in a nuclear reactor induced by corrosion deposits, known as CRUD (Chalk River unidentified deposits). We describe how progress is being made toward achieving scientific advances and technological solutions on two fronts. Specifically, the study of thermal conductivity of CRUD phases has augmented missing data as well as revealed new mechanisms. Additionally, the development of a multiscale simulation framework shows potential for the validation of a new capability to predict the power distribution of a reactor, in effect direct evidence of technological impact. The material- and system-level challenges identified in the study of CRUD are similar to other well-known vexing problems in nuclear materials, such as irradiation accelerated corrosion, stress corrosion cracking, and void swelling; they all involve connecting materials science fundamentals at the atomistic- and mesoscales to technology challenges at the macroscale.
Carrier Plus: A sensor payload for Living With a Star Space Environment Testbed (LWS/SET)
NASA Technical Reports Server (NTRS)
Marshall, Cheryl J.; Moss, Steven; Howard, Regan; LaBel, Kenneth A.; Grycewicz, Tom; Barth, Janet L.; Brewer, Dana
2003-01-01
The Defense Threat Reduction Agency (DTRA) and National Aeronautics and Space Administration (NASA) Goddard Space Flight Center are collaborating to develop the Carrier Plus sensor experiment platform as a capability of the Space Environment Testbed (SET). The SET provides flight opportunities for technology experiments as part of NASA's Living With a Star (LWS) program. The Carrier Plus will provide new capability to characterize sensor technologies such as state-of-the-art visible focal plane arrays (FPAs) in a natural space radiation environment. The technical objectives include on-orbit validation of recently developed FPA technologies and performance prediction methodologies, as well as characterization of the FPA radiation response to total ionizing dose damage, displacement damage, and transients. It is expected that the sensor experiment will carry 4-6 FPAs and associated radiation correlative environment monitors (CEMs) for a 2006-2007 launch. Sensor technology candidates may include n- and p-channel charge-coupled devices (CCDs), active pixel sensors (APS), and hybrid CMOS arrays. The presentation will describe the Carrier Plus goals and objectives, as well as provide details about the architecture and design. More information on the LWS program can be found at http://lws.gsfc.nasa.gov/. Business announcements for LWS/SET and program briefings are posted at http://lws-set.gsfc.nasa.gov
Li, Xiuhong; Cheng, Xiao; Yang, Rongjin; Liu, Qiang; Qiu, Yubao; Zhang, Jialin; Cai, Erli; Zhao, Long
2016-01-01
Of the modern technologies in polar-region monitoring, the remote sensing technology that can instantaneously form large-scale images has become much more important in helping acquire parameters such as the freezing and melting of ice as well as the surface temperature, which can be used in the research of global climate change, Antarctic ice sheet responses, and cap formation and evolution. However, the acquirement of those parameters is impacted remarkably by the climate and satellite transit time which makes it almost impossible to have timely and continuous observation data. In this research, a wireless sensor-based online monitoring platform (WSOOP) for the extreme polar environment is applied to obtain a long-term series of data which is site-specific and continuous in time. Those data are compared and validated with the data from a weather station at Zhongshan Station Antarctica and the result shows an obvious correlation. Then those data are used to validate the remote sensing products of the freezing and melting of ice and the surface temperature and the result also indicated a similar correlation. The experiment in Antarctica has proven that WSOOP is an effective system to validate remotely sensed data in the polar region. PMID:27869668
ERIC Educational Resources Information Center
Wong, Kung-Teck; Osman, Rosma bt; Goh, Pauline Swee Choo; Rahmat, Mohd Khairezan
2013-01-01
This study sets out to validate and test the Technology Acceptance Model (TAM) in the context of Malaysian student teachers' integration of their technology in teaching and learning. To establish factorial validity, data collected from 302 respondents were tested against the TAM using confirmatory factor analysis (CFA), and structural equation…
Validation of model predictions of pore-scale fluid distributions during two-phase flow
NASA Astrophysics Data System (ADS)
Bultreys, Tom; Lin, Qingyang; Gao, Ying; Raeini, Ali Q.; AlRatrout, Ahmed; Bijeljic, Branko; Blunt, Martin J.
2018-05-01
Pore-scale two-phase flow modeling is an important technology to study a rock's relative permeability behavior. To investigate if these models are predictive, the calculated pore-scale fluid distributions which determine the relative permeability need to be validated. In this work, we introduce a methodology to quantitatively compare models to experimental fluid distributions in flow experiments visualized with microcomputed tomography. First, we analyzed five repeated drainage-imbibition experiments on a single sample. In these experiments, the exact fluid distributions were not fully repeatable on a pore-by-pore basis, while the global properties of the fluid distribution were. Then two fractional flow experiments were used to validate a quasistatic pore network model. The model correctly predicted the fluid present in more than 75% of pores and throats in drainage and imbibition. To quantify what this means for the relevant global properties of the fluid distribution, we compare the main flow paths and the connectivity across the different pore sizes in the modeled and experimental fluid distributions. These essential topology characteristics matched well for drainage simulations, but not for imbibition. This suggests that the pore-filling rules in the network model we used need to be improved to make reliable predictions of imbibition. The presented analysis illustrates the potential of our methodology to systematically and robustly test two-phase flow models to aid in model development and calibration.
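The pore-by-pore comparison behind the "75% of pores and throats" figure can be expressed as an agreement fraction between simulated and imaged occupancy labels; a minimal sketch (illustrative labels, not study data):

```python
import numpy as np

def occupancy_agreement(model_labels, image_labels):
    """Fraction of pores/throats whose occupying phase (0 = brine, 1 = oil)
    matches between the network model and the segmented experimental image."""
    return float((np.asarray(model_labels) == np.asarray(image_labels)).mean())

model = [1, 0, 1, 1, 0, 1, 0, 0]
image = [1, 0, 1, 0, 0, 1, 0, 1]
print(occupancy_agreement(model, image))  # 0.75
```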
The New Millennium Program: Serving Earth and Space Sciences
NASA Technical Reports Server (NTRS)
Li, Fuk K.
2000-01-01
NASA has exciting plans for space science and Earth observations during the next decade. A broad range of advanced spacecraft and measurement technologies will be needed to support these plans within the existing budget and schedule constraints. Many of these technology needs are common to both NASA's Office of Earth Science (OES) and Office of Space Science (OSS). Even though some breakthrough technologies have been identified to address these needs, project managers have traditionally been reluctant to incorporate them into flight programs because of their inherent development risk. To accelerate the infusion of new technologies into its OES and OSS missions, NASA established the New Millennium Program (NMP). This program analyzes the capability needs of these enterprises, identifies candidate technologies to address these needs, incorporates advanced technology suites into validation flights, validates them in the relevant space environment, and then proactively infuses the validated technologies into future missions to enhance their capabilities while reducing their life cycle cost. The NMP employs a cross-enterprise Science Working Group and the NASA Enterprise science and technology roadmaps to define the capabilities needed by future Earth and Space science missions. Additional input from the science community is gathered through open workshops and peer-reviewed NASA Research Announcements (NRAs) for advanced measurement concepts. Technology development inputs from the technology organizations within NASA, other government agencies, federally funded research and development centers (FFRDCs), U.S. industry, and academia are sought to identify breakthrough technologies that might address these needs. This approach significantly extends NASA's technology infrastructure. To complement other flight test programs that develop or validate individual components, the NMP places its highest priority on system-level validations of technology suites in the relevant space environment. This approach is not needed for all technologies, but it is usually essential to validate advanced system architectures or new measurement concepts. The NMP has recently revised its processes for defining candidate validation flights and selecting technologies for these flights. The NMP now employs integrated project formulation teams, which include scientists, technologists, and mission planners, to incorporate technology suites into candidate validation flights. These teams develop competing concepts, which can be rigorously evaluated prior to selection for flight. The technology providers for each concept are selected through an open, competitive process during the project formulation phase. If their concept is selected for flight, they are incorporated into the Project Implementation Team, which develops, integrates, tests, launches, and operates the technology validation flight. Throughout the project implementation phase, the Implementation Team will document and disseminate their validation results to facilitate the infusion of their validated technologies into future OSS and OES science missions. The NMP has successfully launched its first two Deep Space flights for the OSS, and is currently implementing its first two Earth Orbiting flights for the OES. The next OSS and OES flights are currently being defined.
Even though these flights are focused on specific Space Science and Earth Science themes, they are designed to validate a range of technologies that could benefit both enterprises, including advanced propulsion, communications, autonomous operations and navigation, multifunctional structures, microelectronics, and advanced instruments. Specific examples of these technologies will be provided in our presentation. The processes developed by the NMP also provide benefits across the Space and Earth Science enterprises. In particular, the extensive, nation-wide technology infrastructure developed by the NMP enhances the access to breakthrough technologies for both enterprises.
A design procedure and handling quality criteria for lateral directional flight control systems
NASA Technical Reports Server (NTRS)
Stein, G.; Henke, A. H.
1972-01-01
A practical design procedure for aircraft augmentation systems is described based on quadratic optimal control technology and handling-quality-oriented cost functionals. The procedure is applied to the design of a lateral-directional control system for the F4C aircraft. The design criteria, design procedure, and final control system are validated with a program of formal pilot evaluation experiments.
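A hedged sketch of the quadratic-optimal (LQR) machinery such a procedure builds on, with a placeholder two-state lateral-directional model standing in for the F4C linearization (the matrices are illustrative, not aircraft data):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Placeholder lateral-directional dynamics: x = [sideslip, yaw rate], u = rudder.
A = np.array([[-0.2, -1.0],
              [4.0, -0.5]])
B = np.array([[0.0],
              [2.5]])

# Handling-quality-oriented weights: penalize sideslip more heavily than yaw rate.
Q = np.diag([10.0, 1.0])
R = np.array([[1.0]])

P = solve_continuous_are(A, B, Q, R)  # algebraic Riccati solution
K = np.linalg.solve(R, B.T @ P)       # optimal state-feedback gain, u = -K x

print("gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```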
Stability Models for Augmentor Design Tools and Technology Assessment
2010-07-14
[Garbled report excerpt; only front matter is recoverable.] AFRL experiments of the flame holders proposed here are suggested for further validation. Contributors: Prof. Heinz Pitsch (Stanford University). Related publications and presentations: Hossam El-Asrag, Heinz Pitsch, Wookyung Kim, Hyungrok Do and M. Godfrey Mungal, "Flame Stability in Augmentor Flows," Combustion Science and Technology, submitted, 2010; and a second, computational entry by the same authors, truncated in the source.
Muñoz-Neira, Carlos; López, Oscar L; Riveros, Rodrigo; Núñez-Huasaf, Javier; Flores, Patricia; Slachevsky, Andrea
2012-01-01
Information and communication technology (ICT) has become an increasingly important part of daily life. The ability to use technology is becoming essential for autonomous functioning in society. Current functional scales for patients with cognitive impairment do not evaluate the use of technology. The objective of this study was to develop and validate a new version of the Activities of Daily Living Questionnaire (ADLQ) that incorporates an ICT subscale. A new technology-based subscale was incorporated into the Spanish version of the ADLQ (SV-ADLQ), entitled the Technology version of the ADLQ (T-ADLQ). The T-ADLQ was administered to 63 caregivers of dementia patients, 21 proxies of mild cognitive impairment patients and 44 proxies of normal elderly subjects (mean age of the sample ± SD: 73.5 ± 8.30 years). We analysed the convergent validity, internal consistency, reliability cut-off point, sensitivity and specificity of the T-ADLQ. The results of the T-ADLQ were compared to the SV-ADLQ. The T-ADLQ showed significant correlations with the Mini-Mental State Examination (MMSE), the Frontal Assessment Battery (FAB) and other measures of functional impairment and dementia severity (MMSE: r = -0.70; FAB: r = -0.65; Functional Assessment Questionnaire: r = 0.77; Instrumental Activities of Daily Living Scale: r = -0.75; Clinical Dementia Rating Scale: r = 0.72; p < 0.001). The T-ADLQ showed good reliability, with a relatively high Cronbach's α coefficient (α = 0.861). When considering a functional impairment cut-off point greater than 29.25%, the sensitivity and specificity of the T-ADLQ were 82% and 90%, respectively. The area under the receiver-operating characteristic curve was 0.937 for the T-ADLQ and 0.932 for the original version of the test. The T-ADLQ revealed adequate indicators of validity and reliability for the functional assessment of activities of daily living in dementia patients. However, the inclusion of technology items in the T-ADLQ did not improve the performance of the scale, which may reflect the lack of widespread use of technology by elderly individuals. Thus, although it appeared reasonable to add technology-use questions to the ADLQ, our experience suggested that this has to be done cautiously, since the sensitivity of these additional items could vary in different populations. The T-ADLQ needs to be validated in a different population of dementia subjects.
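The internal-consistency figure quoted (α = 0.861) is Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores); a minimal sketch over an item-score matrix (illustrative data, not T-ADLQ responses):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    x = np.asarray(scores, float)
    k = x.shape[1]
    return (k / (k - 1.0)) * (1.0 - x.var(axis=0, ddof=1).sum()
                              / x.sum(axis=1).var(ddof=1))

# Illustrative 5-respondent, 4-item data.
data = [[3, 2, 3, 3], [1, 1, 2, 1], [2, 2, 2, 3], [3, 3, 3, 2], [1, 2, 1, 1]]
print(round(cronbach_alpha(data), 3))
```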
NASA Astrophysics Data System (ADS)
Legnani, Elena; Cavalieri, Sergio; Pinto, Roberto; Dotti, Stefano
In the current competitive environment, companies need to extensively exploit advanced technologies in order to develop a sustainable advantage, enhance their operational efficiency and better serve customers. In this context, RFID technology has emerged as a valid support for company progress, and its value is becoming more and more apparent. In particular, the textile and clothing industry, characterised by short life-cycles, quick-response production, fast distribution, erratic customer preferences and impulsive purchasing, is one of the sectors that can benefit extensively from RFID technology. However, actual applications are still very limited, especially on the upstream side of the supply network. This chapter provides an insight into the main benefits and potentials of this technology and highlights the main issues currently inhibiting its large-scale adoption in the textile and clothing industry. The experience of two industry-academia projects and their outcomes are reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendt, Fabian F; Yu, Yi-Hsiang; Nielsen, Kim
This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5, conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.
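For the heaving-sphere test case, the simplest of the numerical experiments listed is the heave decay test, which in its most reduced form (linear damping assumed, added mass and memory effects neglected) is a damped oscillator with hydrostatic stiffness k = ρgA_wp; a hedged sketch with illustrative parameters:

```python
import numpy as np
from scipy.integrate import solve_ivp

RHO, G = 1025.0, 9.81                 # sea water density, gravity
r = 2.5                               # sphere radius (m), illustrative
m = RHO * (2.0 / 3.0) * np.pi * r**3  # mass of a half-submerged, neutrally buoyant sphere
k = RHO * G * np.pi * r**2            # hydrostatic stiffness rho * g * Awp
c = 0.05 * 2.0 * np.sqrt(k * m)       # assumed 5% linear damping ratio

def heave(t, y):                      # y = [z, z_dot]; added mass neglected
    z, zd = y
    return [zd, -(c * zd + k * z) / m]

sol = solve_ivp(heave, (0.0, 30.0), [1.0, 0.0], max_step=0.01)
print("undamped natural period ~", 2 * np.pi * np.sqrt(m / k), "s")  # ~2.6 s
```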
Da Cruz, M J; Francis, H W
2015-07-01
To assess the face and content validity of a novel synthetic, three-dimensional printed temporal bone for surgical skills development and training. A synthetic temporal bone was printed using composite materials and three-dimensional printing technology. Surgical trainees were asked to complete three structured temporal bone dissection exercises. Attitudes and impressions were then assessed using a semi-structured questionnaire. Previous cadaver and real operating experiences were used as a reference. Trainees' experiences of the synthetic temporal bone were analysed in terms of four domains: anatomical realism, usefulness as a training tool, task-based usefulness and overall reactions. Responses across all domains indicated a high degree of acceptance, suggesting that the three-dimensional printed temporal bone was a useful tool in skills development. A sophisticated three-dimensional printed temporal bone that demonstrates face and content validity was developed. The efficiency in cost savings coupled with low associated biohazards makes it likely that the printed temporal bone will be incorporated into traditional temporal bone skills development programmes in the near future.
[Validation of two brief scales for Internet addiction and mobile phone problem use].
Beranuy Fargues, Marta; Chamarro Lusar, Andrés; Graner Jordania, Carla; Carbonell Sánchez, Xavier
2009-08-01
This study describes the construction and validation process of two questionnaires designed to assess the addictive use of the Internet and mobile phones. The scales were applied to a sample of 1,879 students. Results support a two-factor model, presenting acceptable internal consistency and indices of convergent and discriminant validity. The Questionnaire of Experiences Related to the Internet was found to assess intra- and interpersonal conflicts related to Internet use. The Questionnaire of Experiences Related to the Mobile Phone was found to assess conflicts related to mobile phone abuse and to maladaptive emotional and communicational patterns. Our results indicate that the mobile phone does not produce the same degree of addictive behavior as the Internet; its use could rather be interpreted as problematic. Men displayed more addictive Internet use, whilst women seemed to use the mobile phone as a means of emotional communication. It seems that the use of both technologies is more problematic during adolescence and normalizes with age toward a more professional and less playful use, with fewer negative consequences.
Advanced Acid Gas Separation Technology for Clean Power and Syngas Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amy, Fabrice; Hufton, Jeffrey; Bhadra, Shubhra
2015-06-30
Air Products has developed an acid gas removal technology based on adsorption (Sour PSA) that compares favorably with incumbent AGR technologies. During this DOE-sponsored study, Air Products was able to increase the Sour PSA technology readiness level by successfully operating a two-bed test system on coal-derived sour syngas at the NCCC, validating the lifetime and performance of the adsorbent material. Both proprietary simulation and data obtained during the testing at NCCC were used to further refine the estimate of the performance of the Sour PSA technology when expanded to a commercial scale. In-house experiments on sweet syngas, combined with simulation work, allowed Air Products to develop new PSA cycles that allowed for further reduction in capital expenditure. Finally, our techno-economic analysis of the use of the Sour PSA technology for both IGCC and coal-to-methanol applications suggests significant improvement in the unit cost of electricity and methanol compared to incumbent AGR technologies.
ACTS Ka-Band Earth Stations: Technology, Performance, and Lessons Learned
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Struharik, Steven J.; Diamond, John J.; Stewart, David
2000-01-01
The Advanced Communications Technology Satellite (ACTS) Project invested heavily in prototype Ka-band satellite ground terminals to conduct an experiments program with the ACTS satellite. The ACTS experiments program proposed to validate Ka-band satellite and ground station technology, demonstrate future telecommunication services, demonstrate commercial viability and market acceptability of these new services, evaluate system networking and processing technology, and characterize Ka-band propagation effects, including development of techniques to mitigate signal fading. This paper presents a summary of the fixed ground terminals developed by the NASA Glenn Research Center and its industry partners, emphasizing the technology and performance of the terminals (Part 1) and the lessons learned throughout their six-year operation, including the inclined-orbit phase of operations (Full Report). An overview of the Ka-band technology and components developed for the ACTS ground stations is presented. Next, the performance of the ground station technology and its evolution during the ACTS campaign are discussed to illustrate the technical tradeoffs made during the program and highlight technical advances by industry to support the ACTS experiments program and terminal operations. Finally, lessons learned during development and operation of the user terminals are discussed for consideration of commercial adoption into future Ka-band systems. The fixed ground stations used for experiments by government, academic, and commercial entities used reflector-based offset-fed antenna systems ranging in size from 0.35 m to 3.4 m in antenna diameter. Gateway earth stations included two systems, referred to as the NASA Ground Station (NGS) and the Link Evaluation Terminal (LET). The NGS provides tracking, telemetry, and control (TT&C) and Time Division Multiple Access (TDMA) network control functions. The LET supports technology verification and high data rate experiments. The ground stations successfully demonstrated many services and applications at Ka-band in three different modes of operation: circuit-switched TDMA using the satellite on-board processor, satellite-switched TDMA (SS-TDMA) applications using the on-board Microwave Switch Matrix (MSM), and conventional transponder (bent-pipe) operation. Data rates ranged from 4.8 kbps up to 622 Mbps. Experiments included: 1) low-rate (4.8 kbps to 100s of kbps) remote data acquisition and control using small earth stations, 2) moderate-rate (1-45 Mbps) experiments including full-duplex voice and video conferencing and both full-duplex and asymmetric data rate protocol and network evaluation using mid-size ground stations, and 3) link characterization experiments and high data rate (155-622 Mbps) terrestrial and satellite interoperability application experiments conducted by a consortium of experimenters using the large transportable ground stations.
Multi-object detection and tracking technology based on hexagonal opto-electronic detector
NASA Astrophysics Data System (ADS)
Song, Yong; Hao, Qun; Li, Xiang
2008-02-01
A novel multi-object detection and tracking technology based on a hexagonal opto-electronic detector is proposed, in which (1) a new hexagonal detector, composed of six linear CCDs, was developed to achieve a 360-degree field of view, and (2) to achieve high-speed detection and tracking of multiple objects, the object recognition criteria of the Object Signal Width Criterion (OSWC) and the Horizontal Scale Ratio Criterion (HSRC) are proposed. In this paper, simulated experiments were carried out to verify the validity of the proposed technology. They show that high-speed detection and tracking of multiple objects can be achieved using the proposed hexagonal detector together with the OSWC and HSRC criteria, indicating that the technology offers significant advantages in photo-electric detection, computer vision, virtual reality, augmented reality, etc.
Framework for the quality assurance of 'omics technologies considering GLP requirements.
Kauffmann, Hans-Martin; Kamp, Hennicke; Fuchs, Regine; Chorley, Brian N; Deferme, Lize; Ebbels, Timothy; Hackermüller, Jörg; Perdichizzi, Stefania; Poole, Alan; Sauer, Ursula G; Tollefsen, Knut E; Tralau, Tewes; Yauk, Carole; van Ravenzwaay, Ben
2017-12-01
'Omics technologies are gaining importance in the support of regulatory toxicity studies. Prerequisites for performing 'omics studies under GLP principles were discussed at the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) workshop "Applying 'omics technologies in Chemical Risk Assessment". A GLP environment comprises a standard operating procedure system, proper pre-planning and documentation, and inspections by independent quality assurance staff. To prevent uncontrolled data changes, the raw data obtained in the respective 'omics data recording systems have to be specifically defined. Further requirements include transparent and reproducible data processing steps, and safe data storage and archiving procedures. The software for data recording and processing should be validated, and data changes should be traceable or disabled. GLP-compliant quality assurance of 'omics technologies appears feasible for many GLP requirements. However, challenges include (i) defining, storing, and archiving the raw data; (ii) transparent descriptions of data processing steps; (iii) software validation; and (iv) ensuring complete reproducibility of final results with respect to raw data. Nevertheless, 'omics studies can be supported by quality measures (e.g., GLP principles) to ensure quality control, reproducibility and traceability of experiments. This enables regulators to use 'omics data in a fit-for-purpose context, which enhances their applicability for risk assessment. Copyright © 2017 Elsevier Inc. All rights reserved.
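Challenges (i) and (iv) above, pinning down what the raw data are and keeping final results traceable to them, are often addressed by fingerprinting every raw file and recording the hashes in an append-only audit log. A minimal sketch of that idea; the directory layout and JSON-lines log format are assumptions, not workshop recommendations:

```python
import hashlib
import json
import time
from pathlib import Path

def log_raw_data(raw_dir: str, audit_log: str) -> None:
    """Record a SHA-256 fingerprint and timestamp for every raw data file."""
    entries = []
    for f in sorted(Path(raw_dir).glob("**/*")):
        if f.is_file():
            digest = hashlib.sha256(f.read_bytes()).hexdigest()
            entries.append({
                "file": str(f),
                "sha256": digest,
                "logged_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            })
    # Append-only log: earlier entries are never rewritten, so later results
    # can always be checked against the originally recorded fingerprints.
    with open(audit_log, "a") as log:
        for e in entries:
            log.write(json.dumps(e) + "\n")
```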
1981-01-01
predicted that "an enlightened attitude towards the use of personal records for research should lead to a far greater proportion of our experience being...damage in animals tested for lifetime exposure. Use of primates will expand for the validation of neurophysiological and psychological tests designed...technicians require further training. Individuals trained in neurophysiological and tissue culture techniques are also in short supply. Behavioral
2015-03-26
Engineering and Management Air Force Institute of Technology Air University Air Education and Training Command In Partial Fulfillment of the...Human Universal Measurement and Assessment Network (HUMAN) Lab human performance experiment trials were used to train, validate and test the...calming music to ease the individual before the start of the study [8]. EEG data contains noise ranging from muscle twitches, blinking and other functions
HOST payload for STS-95 being moved into SSPF
NASA Technical Reports Server (NTRS)
1998-01-01
The Hubble Space Telescope Orbiting Systems Test (HOST) is checked out by technicians in the Space Station Processing Facility. One of the payloads on the STS-95 mission, the HOST platform is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry three other payloads: the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker, and the SPACEHAB single module with experiments on space flight and the aging process.
HOST payload for STS-95 being moved into SSPF
NASA Technical Reports Server (NTRS)
1998-01-01
Workers watch as the Hubble Space Telescope Orbiting Systems Test (HOST) is moved inside the Space Station Processing Facility. The HOST platform, one of the payloads on the STS-95 mission, is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry three other payloads: the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker, and the SPACEHAB single module with experiments on space flight and the aging process.
1998-09-04
The Hubble Space Telescope Orbiting Systems Test (HOST) is being raised to a workstand by technicians in the Space Station Processing Facility. One of the payloads on the STS-95 mission, the HOST platform is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry three other payloads: the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker, and the SPACEHAB single module with experiments on space flight and the aging process.
1998-09-23
KENNEDY SPACE CENTER, FLA. -- The Hubble Space Telescope Orbiting Systems Test (HOST) is suspended above its work stand in the Space Station Processing Facility before being moved to its payload canister. The HOST platform is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry other payloads such as the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker (IEH-3), and the SPACEHAB single module with experiments on space flight and the aging process.
1998-09-04
KENNEDY SPACE CENTER, FLA. -- The Hubble Space Telescope Orbiting Systems Test (HOST) is checked out by technicians in the Space Station Processing Facility. One of the payloads on the STS-95 mission, the HOST platform is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry three other payloads: the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker, and the SPACEHAB single module with experiments on space flight and the aging process.
Using LGI experiments to achieve better understanding of pedestal-edge coupling in NSTX-U
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Zhehui
2015-02-23
PowerPoint presentation. Latest advances in granule and dust injection technologies and fast, high-resolution imaging, together with micro-/nano-structured material fabrication, provide new opportunities to examine plasma-material interaction (PMI) in a magnetic fusion environment. Some of our previous work in these areas is summarized. The upcoming LGI experiments in NSTX-U will shed new light on granular matter transport in the pedestal-edge region. In addition to particle control, these results can also be used for code validation and for achieving a better understanding of pedestal-edge coupling in fusion plasmas in NSTX-U and other devices.
2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation
NASA Technical Reports Server (NTRS)
Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.
2009-01-01
A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. A collaboration in this project includes work by NASA research engineers, whereas CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies that include Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.
Forward Technology Solar Cell Experiment First On-Orbit Data
NASA Technical Reports Server (NTRS)
Walters, R. J.; Garner, J. C.; Lam, S. N.; Vazquez, J. A.; Braun, W. R.; Ruth, R. E.; Warner, J. H.; Lorentzen, J. R.; Messenger, S. R.; Bruninga, R.;
2007-01-01
This paper presents the first on-orbit measured data from the Forward Technology Solar Cell Experiment (FTSCE). FTSCE is a space experiment housed within the 5th Materials on the International Space Station Experiment (MISSE-5). MISSE-5 was launched aboard the Shuttle return-to-flight mission (STS-114) on July 26, 2005 and deployed on the exterior of the International Space Station (ISS). The experiment will remain in orbit for nominally one year, after which it will be returned to Earth for post-flight testing and analysis. While on orbit, the experiment is designed to measure a 36-point current vs. voltage (IV) curve on each of the experimental solar cells, and the data is continuously telemetered to Earth. The experiment also measures the solar cell temperature and the orientation of the solar cells to the sun. A range of solar cell technologies are included in the experiment, including state-of-the-art triple-junction InGaP/GaAs/Ge solar cells from several vendors, thin film amorphous Si and CuIn(Ga)Se2 cells, and next-generation technologies like single-junction GaAs cells grown on Si wafers and metamorphic InGaP/InGaAs/Ge triple-junction cells. In addition to FTSCE, MISSE-5 also contains a Thin-Film Materials experiment. This is a passive experiment that will provide data on the effect of the space environment on more than 200 different materials. FTSCE was initially conceived in response to various on-orbit and ground test anomalies associated with space power systems. The Department of Defense (DoD) required a method of rapidly obtaining on-orbit validation data for new space solar cell technologies, and NRL was tasked to devise an experiment to meet this requirement. Rapid access to space was provided by the MISSE Program, which is a NASA Langley Research Center program. MISSE-5 is a completely self-contained experiment system with its own power generation and storage system and communications system. The communications system, referred to as PCSat, transmits and receives in the Amateur Radio band, providing a node on the Amateur Radio Satellite Service. This paper presents an overview of the various aspects of MISSE-5 and a sample of the first measured on-orbit data.
de Vries, Peter W; van den Berg, Stéphanie M; Midden, Cees
2015-12-01
The present research addresses the question of how trust in systems is formed when unequivocal information about system accuracy and reliability is absent, and focuses on the interaction of indirect information (others' evaluations) and direct (experiential) information stemming from the interaction process. Trust in decision-supporting technology, such as route planners, is important for satisfactory user interactions. Little is known, however, about trust formation in the absence of outcome feedback, that is, when users have not yet had the opportunity to verify actual outcomes. Three experiments manipulated others' evaluations ("endorsement cues") and various forms of experience-based information ("process feedback") in interactions with a route planner and measured resulting trust using rating scales and credits staked on the outcome. Subsequently, an overall analysis was conducted. Study 1 showed that the effectiveness of endorsement cues on trust is moderated by mere process feedback. In Study 2, consistent (i.e., nonrandom) process feedback overruled the effect of endorsement cues on trust, whereas inconsistent process feedback did not. Study 3 showed that although the effects of consistent and inconsistent process feedback largely remained regardless of face validity, high face validity in process feedback caused higher trust than low face validity. An overall analysis confirmed these findings. Experiential information impacts trust even if outcome feedback is not available and, moreover, overrules indirect trust cues, depending on the nature of the former. Designing systems so that they allow novice users to make inferences about their inner workings may foster initial trust. © 2015, Human Factors and Ergonomics Society.
Validating Remotely Sensed Land Surface Evapotranspiration Based on Multi-scale Field Measurements
NASA Astrophysics Data System (ADS)
Jia, Z.; Liu, S.; Xu, Z.; Liang, S.
2012-12-01
The land surface evapotranspiration plays an important role in the surface energy balance and the water cycle. There have been significant technical and theoretical advances in our knowledge of evapotranspiration over the past two decades. Acquisition of the temporally and spatially continuous distribution of evapotranspiration using remote sensing technology has attracted the widespread attention of researchers and managers. However, remote sensing technology still has many uncertainties, arising from model mechanisms, model inputs, parameterization schemes, and scaling issues in regional estimation. Producing remotely sensed evapotranspiration (RS_ET) estimates with quantified certainty is required but difficult. As a result, it is indispensable to develop validation methods to quantitatively assess the accuracy and error sources of regional RS_ET estimations. This study proposes an innovative validation method based on multi-scale evapotranspiration acquired from field measurements, with the validation results including the accuracy assessment, error source analysis, and uncertainty analysis of the validation process. It is a potentially useful approach to evaluate the accuracy and analyze the spatio-temporal properties of RS_ET at both the basin and local scales, and is appropriate for validating RS_ET at diverse resolutions and different time-scales. An independent RS_ET validation using this method was presented over the Hai River Basin, China, for 2002-2009 as a case study. Validation at the basin scale showed good agreement between the 1 km annual RS_ET and the validation data, such as the water-balance evapotranspiration, MODIS evapotranspiration products, precipitation, and land-use types. Validation at the local scale also had good results for monthly and daily RS_ET at 30 m and 1 km resolutions, compared to the multi-scale evapotranspiration measurements from the EC and LAS, respectively, using the footprint model over three typical landscapes. Although some validation experiments demonstrated that the models yield accurate estimates at flux measurement sites, the question remains whether they perform well over the broader landscape. Moreover, a large number of RS_ET products have been released in recent years. Thus, we also pay attention to the cross-validation of RS_ET derived from multi-source models. "The Multi-scale Observation Experiment on Evapotranspiration over Heterogeneous Land Surfaces: Flux Observation Matrix" campaign was carried out in the middle reaches of the Heihe River Basin, China, in 2012. Flux measurements from an observation matrix composed of 22 EC and 4 LAS are acquired to investigate the cross-validation of multi-source models over different landscapes. In this case, six remote sensing models, including the empirical statistical model, the one-source and two-source models, the Penman-Monteith equation based model, the Priestley-Taylor equation based model, and the complementary relationship based model, are used to perform an intercomparison. All the results from the two cases of RS_ET validation showed that the proposed validation methods are reasonable and feasible.
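At the local scale, the validation described above comes down to agreement statistics between RS_ET retrievals and collocated flux-tower (EC/LAS) measurements. A minimal sketch of that comparison step; the arrays are placeholders for matched monthly values, and the names are assumptions, not the study's code:

```python
import numpy as np

def agreement(rs_et: np.ndarray, tower_et: np.ndarray) -> dict:
    """Agreement statistics between remotely sensed and tower-measured ET (mm/month)."""
    resid = rs_et - tower_et
    rmse = float(np.sqrt(np.mean(resid**2)))   # overall error magnitude
    bias = float(np.mean(resid))               # systematic over/underestimation
    r = float(np.corrcoef(rs_et, tower_et)[0, 1])  # correlation of the two series
    return {"RMSE": rmse, "bias": bias, "r": r, "n": int(rs_et.size)}

# Example with placeholder values:
print(agreement(np.array([62.0, 80.5, 95.1, 70.2]),
                np.array([58.3, 84.0, 91.2, 73.5])))
```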
Fibre optic gyroscopes for space use
NASA Astrophysics Data System (ADS)
Faussot, Nicolas; Cottreau, Yann; Hardy, Guillaume; Simonpietri, Pascal; Gaiffe, Thierry
2017-11-01
Among the technologies available for gyroscopes usable in space, the Fibre Optic Gyroscope (FOG) technology appears to be the most suitable: no moving parts, very good lifetime, low power consumption, very low random walk, arbitrarily low angular resolution and very good behaviour under radiation and in vacuum. Benefiting from more than ten years of experience with this technology, Ixsea (formerly the Navigation Division of Photonetics) has been developing space FOGs under both CNES and ESA contracts for many years. In the 1996-1998 period, two space FOG demonstrators in the 0.01°/h class were manufactured, each including an optical head (optic and optoelectronic part) designed for space use and standard ground electronics. Beyond the demonstration of the specified FOG performance, the behaviour of the optical head has been validated for use in a typical space environment: vibrations, shocks, radiation (up to 50 krad) and thermal vacuum. Since the beginning of 1999, Ixsea has been developing space electronics in order to manufacture two complete space FOGs. The first one entered qualification in October. The second one will be delivered at the beginning of next year; it will be used in a CNES attitude measurement experiment (MAGI) onboard the French-Brazilian Microsatellite (FBM), partly dedicated to technology evaluation.
New Reactor Physics Benchmark Data in the March 2012 Edition of the IRPhEP Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; J. Blair Briggs; Jim Gulliford
2012-11-01
The International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data for nuclear energy and technology applications. Numerous experiments that have been performed worldwide represent a large investment of infrastructure, expertise, and cost, and are valuable resources of data for present and future research. These valuable assets provide the basis for recording, development, and validation of methods. If the experimental data are lost, the high cost to repeat many of these measurements may be prohibitive. The purpose of the IRPhEP is to provide an extensively peer-reviewed set of reactor physics-related integral data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and establish the safety basis for operation of these reactors. Contributors from around the world collaborate in the evaluation and review of selected benchmark experiments for inclusion in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook) [1]. Several new evaluations have been prepared for inclusion in the March 2012 edition of the IRPhEP Handbook.
Flux-Level Transit Injection Experiments with NASA Pleiades Supercomputer
NASA Astrophysics Data System (ADS)
Li, Jie; Burke, Christopher J.; Catanzarite, Joseph; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division
2016-06-01
Flux-Level Transit Injection (FLTI) experiments are executed with NASA's Pleiades supercomputer for the Kepler Mission. The latest release (9.3, January 2016) of the Kepler Science Operations Center Pipeline is used in the FLTI experiments. Their purpose is to validate the Analytic Completeness Model (ACM), which can be computed for all Kepler target stars, thereby enabling exoplanet occurrence rate studies. Pleiades, a facility of NASA's Advanced Supercomputing Division, is one of the world's most powerful supercomputers and represents NASA's state-of-the-art technology. We discuss the details of implementing the FLTI experiments on the Pleiades supercomputer. For example, taking into account that ~16 injections are generated by one core of the Pleiades processors in an hour, the "shallow" FLTI experiment, in which ~2000 injections are required per target star, can be done for 16% of all Kepler target stars in about 200 hours. Stripping down the transit search to bare bones, i.e. only searching adjacent high/low periods at high/low pulse durations, makes the computationally intensive FLTI experiments affordable. The design of the FLTI experiments and the analysis of the resulting data are presented in "Validating an Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments" by Catanzarite et al. (#2494058). Kepler was selected as the 10th mission of the Discovery Program. Funding for the Kepler Mission has been provided by the NASA Science Mission Directorate.
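The throughput claim above can be sanity-checked with back-of-envelope arithmetic. The sketch below assumes roughly 200,000 Kepler target stars; the abstract gives only the 16% figure, so the absolute star count is an assumption:

```python
# Resource arithmetic for the "shallow" FLTI experiment described above.
stars = 0.16 * 200_000           # ~32,000 target stars (assumed total of ~200k)
injections = stars * 2_000       # ~2,000 injections per star -> 6.4e7 injections
core_hours = injections / 16     # ~16 injections per core-hour -> 4.0e6 core-hours
cores_needed = core_hours / 200  # spread over a ~200-hour run -> ~20,000 cores

print(f"{injections:.1e} injections, {core_hours:.1e} core-hours, "
      f"~{cores_needed:,.0f} cores for a 200-hour campaign")
```

Twenty thousand cores is well within the capacity of Pleiades, which is consistent with the abstract's statement that stripping the search to bare bones makes the experiment affordable.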
Validation of Safety-Critical Systems for Aircraft Loss-of-Control Prevention and Recovery
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.
2012-01-01
Validation of technologies developed for loss of control (LOC) prevention and recovery poses significant challenges. Aircraft LOC can result from a wide spectrum of hazards, often occurring in combination, which cannot be fully replicated during evaluation. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of hazardous and uncertain conditions, and the validation framework must provide some measure of assurance that the new vehicle safety technologies do no harm (i.e., that they themselves do not introduce new safety risks). This paper summarizes a proposed validation framework for safety-critical systems, provides an overview of validation methods and tools developed by NASA to date within the Vehicle Systems Safety Project, and develops a preliminary set of test scenarios for the validation of technologies for LOC prevention and recovery.
Han, Bomie; Higgs, Richard E
2008-09-01
High-throughput HPLC-mass spectrometry (HPLC-MS) is routinely used to profile biological samples for potential protein markers of disease, drug efficacy and toxicity. The discovery technology has advanced to the point where translating hypotheses from proteomic profiling studies into clinical use is the bottleneck to realizing the full potential of these approaches. The first step in this translation is the development and analytical validation of a higher throughput assay with improved sensitivity and selectivity relative to typical profiling assays. Multiple reaction monitoring (MRM) assays are an attractive approach for this stage of biomarker development given their improved sensitivity and specificity, the speed at which the assays can be developed and the quantitative nature of the assay. While the profiling assays are performed with ion trap mass spectrometers, MRM assays are traditionally developed in quadrupole-based mass spectrometers. Development of MRM assays from the same instrument used in the profiling analysis enables a seamless and rapid transition from hypothesis generation to validation. This report provides guidelines for rapidly developing an MRM assay using the same mass spectrometry platform used for profiling experiments (typically ion traps) and reviews methodological and analytical validation considerations. The analytical validation guidelines presented are drawn from existing practices on immunological assays and are applicable to any mass spectrometry platform technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burge, S.W.
Erosion has been identified as one of the significant design issues in fluid beds. A cooperative R&D venture of industry, research, and government organizations was recently formed to meet the industry need for a better understanding of erosion in fluid beds. Research focussed on bed hydrodynamics, which are considered to be the primary erosion mechanism. As part of this work, ANL developed an analytical model (FLUFIX) for bed hydrodynamics. Partial validation was performed using data from experiments sponsored by the research consortium. Development of a three-dimensional fluid bed hydrodynamic model was part of Asea-Babcock's in-kind contribution to the R&D venture. This model, FORCE2, was developed by Babcock & Wilcox's Research and Development Division and was based on an existing B&W program and on the gas-solids modeling technology developed by ANL and others. FORCE2 contains many of the features needed to model plant-size beds and therefore can be used along with the erosion technology to assess metal wastage in industrial equipment. As part of the development efforts, FORCE2 was partially validated using ANL's two-dimensional model, FLUFIX, and experimental data. Time constraints, as well as the lack of good hydrodynamic data, particularly at the plant scale, prohibited a complete validation of FORCE2. This report describes this initial validation of FORCE2.
NASA Astrophysics Data System (ADS)
Johnson, Maike; Hübner, Stefan; Reichmann, Carsten; Schönberger, Manfred; Fiß, Michael
2017-06-01
Energy storage systems are a key technology for developing a more sustainable energy supply system and lowering overall CO2 emissions. Among the variety of storage technologies, high-temperature phase change material (PCM) storage is a promising option with a wide range of applications. PCM storage systems using an extended finned-tube storage concept have been designed and techno-economically optimized for solar thermal power plant operation. These finned-tube components were experimentally tested in order to validate the optimized design and the simulation models used. Analysis of the charging and discharging characteristics of the storage at the pilot scale gives insight into the heat distribution both axially and radially in the storage material, thereby allowing for a realistic validation of the design. The design was optimized for discharging of the storage, as this is the more critical operation mode in power plant applications. The data show good agreement between the model and the experiments for discharging.
NASA EEE Parts and Advanced Interconnect Program (AIP)
NASA Technical Reports Server (NTRS)
Gindorf, T.; Garrison, A.
1996-01-01
From the program objectives: I. Accelerate the readiness of new technologies through development of validation, assessment and test methods/tools. II. Provide NASA projects infusion paths for emerging technologies. III. Provide NASA projects technology selection, application and validation guidelines for hardware and processes. IV. Disseminate quality assurance, reliability, validation, tools and availability information to the NASA community.
Weinstock, Peter; Rehder, Roberta; Prabhu, Sanjay P; Forbes, Peter W; Roussin, Christopher J; Cohen, Alan R
2017-07-01
OBJECTIVE Recent advances in optics and miniaturization have enabled the development of a growing number of minimally invasive procedures, yet innovative training methods for the use of these techniques remain lacking. Conventional teaching models, including cadavers and physical trainers as well as virtual reality platforms, are often expensive and ineffective. Newly developed 3D printing technologies can recreate patient-specific anatomy, but the stiffness of the materials limits fidelity to real-life surgical situations. Hollywood special effects techniques can create ultrarealistic features, including lifelike tactile properties, to enhance accuracy and effectiveness of the surgical models. The authors created a highly realistic model of a pediatric patient with hydrocephalus via a unique combination of 3D printing and special effects techniques and validated the use of this model in training neurosurgery fellows and residents to perform endoscopic third ventriculostomy (ETV), an effective minimally invasive method increasingly used in treating hydrocephalus. METHODS A full-scale reproduction of the head of a 14-year-old adolescent patient with hydrocephalus, including external physical details and internal neuroanatomy, was developed via a unique collaboration of neurosurgeons, simulation engineers, and a group of special effects experts. The model contains "plug-and-play" replaceable components for repetitive practice. The appearance of the training model (face validity) and the reproducibility of the ETV training procedure (content validity) were assessed by neurosurgery fellows and residents of different experience levels based on a 14-item Likert-like questionnaire. The usefulness of the training model for evaluating the performance of the trainees at different levels of experience (construct validity) was measured by blinded observers using the Objective Structured Assessment of Technical Skills (OSATS) scale for the performance of ETV. RESULTS A combination of 3D printing technology and casting processes led to the creation of realistic surgical models that include high-fidelity reproductions of the anatomical features of hydrocephalus and allow for the performance of ETV for training purposes. The models reproduced the pulsations of the basilar artery, ventricles, and cerebrospinal fluid (CSF), thus simulating the experience of performing ETV on an actual patient. The results of the 14-item questionnaire showed limited variability among participants' scores, and the neurosurgery fellows and residents gave the models consistently high ratings for face and content validity. The mean score for the content validity questions (4.88) was higher than the mean score for face validity (4.69) (p = 0.03). On construct validity scores, the blinded observers rated performance of fellows significantly higher than that of residents, indicating that the model provided a means to distinguish between novice and expert surgical skills. CONCLUSIONS A plug-and-play lifelike ETV training model was developed through a combination of 3D printing and special effects techniques, providing both anatomical and haptic accuracy. Such simulators offer opportunities to accelerate the development of expertise with respect to new and novel procedures as well as iterate new surgical approaches and innovations, thus allowing novice neurosurgeons to gain valuable experience in surgical techniques without exposing patients to risk of harm.
Bridging the Technology Readiness "Valley of Death" Utilizing Nanosats
NASA Technical Reports Server (NTRS)
Bauer, Robert A.; Millar, Pamela S.; Norton, Charles D.
2015-01-01
Incorporating new technology is a hallmark of space missions. Missions demand ever-improving tools and techniques to allow them to meet the mission science requirements. In Earth Science, these technologies are normally expressed in new instrument capabilities that can enable new measurement concepts, extended capabilities of existing measurement techniques, or totally new detection capabilities, and also in information systems technologies that can enhance data analysis or enable new data analyses to advance modeling and prediction capabilities. Incorporating new technologies has never been easy. There is a large development step beyond demonstration in a laboratory or on an airborne platform to the eventual space environment that is sometimes referred to as the "technology valley of death." Studies have shown that non-validated technology is a primary cause of NASA and DoD mission delays and cost overruns. With the demise of the New Millennium Program within NASA, opportunities for demonstrating technologies in space have been rare. Many technologies are suitable for a flight project after only ground testing. However, some require validation in a relevant or a space flight environment, which cannot be fully tested on the ground or in airborne systems. NASA's Earth Science Technology Program has initiated a nimble program to provide a fairly rapid turnaround of space-validated technologies, thereby reducing future mission risk in incorporating new technologies. The program, called In-Space Validation of Earth Science Technology (InVEST), now has five tasks in development. Each is a 3U CubeSat, and they are targeted for launch opportunities in the 2016 time period. Prior to formalizing an InVEST program, the technology program office was asked to demonstrate how the program would work and what sort of technologies could benefit from space validation. Three projects were developed and launched, and have demonstrated the technologies that they set out to validate. This paper will provide a brief status of the pre-InVEST CubeSats and discuss the development and status of the InVEST program.
Field testing of thermal canopy models in a spruce-fir forest
NASA Technical Reports Server (NTRS)
1990-01-01
Recent advances in remote sensing technology allow the use of the thermal infrared region to gain information about vegetative surfaces. Extending existing models to account for thermal radiance transfers within rough forest canopies is of paramount importance, since all processes of interest in the physical climate system and biogeochemical cycles are thermally mediated. Model validation experiments were conducted at a well-established boreal forest/northern hardwood forest ecotone research site located in central Maine. Data were collected to allow spatial and temporal validation of thermal models. Emphasis was placed primarily upon enhancing submodels of stomatal behavior, and secondarily upon enhancing boundary-layer resistance submodels and accounting for thermal storage in soil and vegetation.
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.
1989-01-01
The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental database for the basic model and each control concept, and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.
Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results
NASA Technical Reports Server (NTRS)
Burken, John J.; Larson, Richard R.
2009-01-01
F-15 IFCS project goals are: a) demonstrate control approaches that can efficiently optimize aircraft performance in both normal and failure conditions (type [A] and [B] failures); b) advance neural-network-based flight control technology for new aerospace system designs with a pilot in the loop. Gen II objectives include: a) implement and fly a direct adaptive neural-network-based flight controller; b) demonstrate the ability of the system to adapt to simulated system failures: 1) suppress transients associated with failure, 2) re-establish sufficient control and handling of the vehicle for safe recovery; c) provide flight experience for development of verification and validation processes for flight-critical neural network software.
Development and Validation of Information Technology Mentor Teacher Attitude Scale: A Pilot Study
ERIC Educational Resources Information Center
Saltan, Fatih
2015-01-01
The aim of this study was the development and validation of a teacher attitude scale toward Information Technology Mentor Teachers (ITMTs). ITMTs give technological support to other teachers for the integration of technology in their lessons. In the literature, many instruments have been developed to measure teachers' attitudes towards the technological tools…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-19
... of Wuxi CR Semiconductor Wafers & Chips Co., Ltd. and CSMC Technologies Fab 1 Co., Ltd., which is... Validated End-User: CSMC Technologies Corporation. Eligible Destinations: CSMC Technologies Fab 1 Co., Ltd., 14 Liangxi Road, Wuxi, Jiangsu 214061, China. CSMC Technologies Fab 2 Co., Ltd., 8 Xinzhou Rd., Wuxi...
Electrolysis Performance Improvement Concept Study (EPICS) flight experiment phase C/D
NASA Technical Reports Server (NTRS)
Schubert, F. H.; Lee, M. G.
1995-01-01
The overall purpose of the Electrolysis Performance Improvement Concept Study flight experiment is to demonstrate and validate in a microgravity environment the Static Feed Electrolyzer concept as well as investigate the effect of microgravity on water electrolysis performance. The scope of the experiment includes variations in microstructural characteristics of electrodes and current densities in a static feed electrolysis cell configuration. The results of the flight experiment will be used to improve efficiency of the static feed electrolysis process and other electrochemical regenerative life support processes by reducing power and expanding the operational range. Specific technologies that will benefit include water electrolysis for propulsion, energy storage, life support, extravehicular activity, in-space manufacturing and in-space science in addition to other electrochemical regenerative life support technologies such as electrochemical carbon dioxide and oxygen separation, electrochemical oxygen compression and water vapor electrolysis. The Electrolysis Performance Improvement Concept Study flight experiment design incorporates two primary hardware assemblies: the Mechanical/Electrochemical Assembly and the Control/Monitor Instrumentation. The Mechanical/Electrochemical Assembly contains three separate integrated electrolysis cells along with supporting pressure and temperature control components. The Control/Monitor Instrumentation controls the operation of the experiment via the Mechanical/Electrochemical Assembly components and provides for monitoring and control of critical parameters and storage of experimental data.
Immersive virtual reality as a teaching tool for neuroanatomy.
Stepan, Katelyn; Zeiger, Joshua; Hanchuk, Stephanie; Del Signore, Anthony; Shrivastava, Raj; Govindaraj, Satish; Iloreta, Alfred
2017-10-01
Three-dimensional (3D) computer modeling and interactive virtual reality (VR) simulation are validated teaching techniques used throughout medical disciplines. Little objective data exists supporting their use in teaching clinical anatomy. Learner motivation is thought to limit the rate of adoption of such novel technologies. The purpose of this study is to evaluate the effectiveness, satisfaction, and motivation associated with immersive VR simulation in teaching medical students neuroanatomy. Images of normal cerebral anatomy were reconstructed from human Digital Imaging and Communications in Medicine (DICOM) computed tomography (CT) imaging and magnetic resonance imaging (MRI) into 3D VR formats compatible with the Oculus Rift VR System, a head-mounted display with tracking capabilities allowing for an immersive VR experience. The ventricular system and cerebral vasculature were highlighted and labeled to create a focused interactive model. We conducted a randomized controlled study with 66 medical students (33 in each of the control and experimental groups). Pertinent neuroanatomical structures were studied using either online textbooks or the VR interactive model, respectively. We then evaluated the students' anatomy knowledge, educational experience, and motivation (using the Instructional Materials Motivation Survey [IMMS], a previously validated assessment). There was no significant difference in anatomy knowledge between the 2 groups on preintervention, postintervention, or retention quizzes. The VR group found the learning experience to be significantly more engaging, enjoyable, and useful (all p < 0.01) and scored significantly higher on the motivation assessment (p < 0.01). Immersive VR educational tools afforded a more positive learner experience and enhanced student motivation. However, the technology was as effective as traditional textbooks in teaching neuroanatomy. © 2017 ARS-AAOA, LLC.
Vissers, Lisenka E. L. M. ; de Vries, Bert B. A. ; Osoegawa, Kazutoyo ; Janssen, Irene M. ; Feuth, Ton ; Choy, Chik On ; Straatman, Huub ; van der Vliet, Walter ; Huys, Erik H. L. P. G. ; van Rijk, Anke ; Smeets, Dominique ; van Ravenswaaij-Arts, Conny M. A. ; Knoers, Nine V. ; van der Burgt, Ineke ; de Jong, Pieter J. ; Brunner, Han G. ; van Kessel, Ad Geurts ; Schoenmakers, Eric F. P. M. ; Veltman, Joris A.
2003-01-01
Microdeletions and microduplications, not visible by routine chromosome analysis, are a major cause of human malformation and mental retardation. Novel high-resolution, whole-genome technologies can improve the diagnostic detection rate of these small chromosomal abnormalities. Array-based comparative genomic hybridization allows such a high-resolution screening by hybridizing differentially labeled test and reference DNAs to arrays consisting of thousands of genomic clones. In this study, we tested the diagnostic capacity of this technology using ∼3,500 fluorescent in situ hybridization-verified clones selected to cover the genome with an average of 1 clone per megabase (Mb). The sensitivity and specificity of the technology were tested in normal-versus-normal control experiments and through the screening of patients with known microdeletion syndromes. Subsequently, a series of 20 cytogenetically normal patients with mental retardation and dysmorphisms suggestive of a chromosomal abnormality were analyzed. In this series, three microdeletions and two microduplications were identified and validated. Two of these genomic changes were identified also in one of the parents, indicating that these are large-scale genomic polymorphisms. Deletions and duplications as small as 1 Mb could be reliably detected by our approach. The percentage of false-positive results was reduced to a minimum by use of a dye-swap-replicate analysis, all but eliminating the need for laborious validation experiments and facilitating implementation in a routine diagnostic setting. This high-resolution assay will facilitate the identification of novel genes involved in human mental retardation and/or malformation syndromes and will provide insight into the flexibility and plasticity of the human genome. PMID:14628292
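The dye-swap-replicate analysis mentioned above exploits the fact that re-hybridizing with the dye assignment reversed flips the sign of the true copy-number signal while leaving dye bias unchanged. A minimal sketch of how such replicates might be combined and called; the threshold and function names are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

def dye_swap_calls(log2_fwd: np.ndarray, log2_swap: np.ndarray,
                   threshold: float = 0.4) -> np.ndarray:
    """Combine forward and dye-swapped log2 ratios per clone; call gains/losses.

    log2_fwd  : log2(test/reference) from the forward hybridization
    log2_swap : log2 ratio from the dye-swapped replicate (signal sign inverted)
    Returns +1 (gain), -1 (loss), or 0 (normal) per clone.
    """
    # Subtracting the swapped ratio cancels additive dye bias and recovers
    # the true signal: ((s + bias) - (-s + bias)) / 2 = s.
    combined = (log2_fwd - log2_swap) / 2.0
    # Require the replicates to agree in (sign-flipped) direction before calling.
    consistent = np.sign(log2_fwd) == np.sign(-log2_swap)
    return np.where(consistent & (np.abs(combined) > threshold),
                    np.sign(combined), 0)
```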
AMTEC flight experiment progress and plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Underwood, M.L.; Dobbs, M.; Giglio, J.
1997-12-31
An experiment is being developed to validate the performance of AMTEC technology in the space microgravity environment. A group of AMTEC cells has been fabricated, assembled into an experiment module, and instrumented for operation. The experiment is manifested as a Hitchhiker payload on STS-88, now planned for flight in July 1998. The AMTEC cells will be operated in space for up to ten days. The microgravity-developed distribution of the sodium working fluid will be frozen in place before the cells are returned to Earth. Upon return, the cells will be destructively evaluated to determine the location of the sodium and to assure that the sodium has been properly controlled by the sodium control elements. This paper describes the experiment purpose, status, and plans for the flight operations and data analysis. An overview of how this experiment fits into the overall AMTEC development is also provided.
Technology Transfer Challenges for High-Assurance Software Engineering Tools
NASA Technical Reports Server (NTRS)
Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.
2003-01-01
In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.
Face and content validity of the virtual reality simulator 'ScanTrainer®'.
Alsalamah, Amal; Campo, Rudi; Tanos, Vasilios; Grimbizis, Gregoris; Van Belle, Yves; Hood, Kerenza; Pugh, Neil; Amso, Nazar
2017-01-01
Ultrasonography is a first-line imaging modality in the investigation of women's irregular bleeding and other gynaecological pathologies, e.g. ovarian cysts and early pregnancy problems. However, teaching ultrasound, especially transvaginal scanning, remains a challenge for health professionals. New technology such as simulation may potentially facilitate and expedite the process of learning ultrasound. Simulation may prove to be realistic, very close to the real patient-scanning experience for the sonographer, and objectively able to assist the development of basic skills such as image manipulation, hand-eye coordination and examination technique. The aim of this study was to determine the face and content validity of a virtual reality simulator (ScanTrainer®, MedaPhor plc, Cardiff, Wales, UK) as reflective of real transvaginal ultrasound (TVUS) scanning. A questionnaire with 14 simulator-related statements was distributed to a number of participants with differing levels of sonography experience in order to determine the level of agreement between the use of the simulator in training and real practice. There were 36 participants: novices (n = 25) and experts (n = 11) who rated the simulator. Median scores of face validity statements between experts and non-experts using 10-point visual analogue scale (VAS) ratings ranged between 7.5 and 9.0 (p > 0.05), indicating a high level of agreement. Experts' median scores of content validity statements ranged from 8.4 to 9.0. The findings confirm that the simulator has the feel and look of real-time scanning with high face validity. Similarly, its tutorial structures and learning steps confirm the content validity.
Schnall, Rebecca; Cho, Hwayoung; Liu, Jianfang
2018-01-05
Mobile technology has become ubiquitous and can be particularly useful in the delivery of health interventions. This technology can allow us to deliver interventions at scale, cover broad geographic areas, and deliver technologies in highly tailored ways based on the preferences or characteristics of users. The broad use of mobile technologies supports the need for usability assessments of these tools. Although a number of usability assessment instruments have been developed, none have been validated for use with mobile technologies. The goal of this work was to validate the Health Information Technology Usability Evaluation Scale (Health-ITUES), a customizable usability assessment instrument, in a sample of community-dwelling adults who were testing the use of a new mobile health (mHealth) technology. A sample of 92 community-dwelling adults living with HIV used a new mobile app for symptom self-management and completed the Health-ITUES to assess the usability of the app. They also completed the Post-Study System Usability Questionnaire (PSSUQ), a widely used and well-validated usability assessment tool. Correlations between these scales and each of the subscales were assessed. The subscales of the Health-ITUES showed high internal consistency reliability (Cronbach alpha=.85-.92). Each of the Health-ITUES subscales and the overall scale was moderately to strongly correlated with the PSSUQ scales (r=.46-.70), demonstrating the criterion validity of the Health-ITUES. The Health-ITUES has demonstrated reliability and validity for use in assessing the usability of mHealth technologies in community-dwelling adults living with a chronic illness. ©Rebecca Schnall, Hwayoung Cho, Jianfang Liu. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 05.01.2018.
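The two statistics reported above, internal consistency (Cronbach's alpha) and criterion validity (correlation with the PSSUQ), are straightforward to compute from item-level data. A minimal sketch with placeholder inputs; the item matrix and criterion totals are not the study data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: n_respondents x n_items matrix of item scores for one subscale."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_var / total_var)

def criterion_validity(scale_total: np.ndarray, criterion_total: np.ndarray) -> float:
    """Pearson correlation between a scale total and a criterion (e.g., PSSUQ)."""
    return float(np.corrcoef(scale_total, criterion_total)[0, 1])

# Placeholder example: 92 respondents, 5 items scored 1-5
rng = np.random.default_rng(1)
items = rng.integers(1, 6, size=(92, 5)).astype(float)
print(cronbach_alpha(items), criterion_validity(items.sum(1), items.sum(1) + rng.normal(0, 2, 92)))
```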
Cho, Hwayoung; Liu, Jianfang
2018-01-01
Background: Mobile technology has become ubiquitous and can be particularly useful in the delivery of health interventions. It allows us to deliver interventions at scale, cover broad geographic areas, and tailor delivery to the preferences or characteristics of users. The broad use of mobile technologies supports the need for usability assessments of these tools. Although a number of usability assessment instruments have been developed, none have been validated for use with mobile technologies. Objective: The goal of this work was to validate the Health Information Technology Usability Evaluation Scale (Health-ITUES), a customizable usability assessment instrument, in a sample of community-dwelling adults who were testing a new mobile health (mHealth) technology. Methods: A sample of 92 community-dwelling adults living with HIV used a new mobile app for symptom self-management and completed the Health-ITUES to assess the usability of the app. They also completed the Post-Study System Usability Questionnaire (PSSUQ), a widely used and well-validated usability assessment tool. Correlations between these scales and each of the subscales were assessed. Results: The subscales of the Health-ITUES showed high internal consistency reliability (Cronbach alpha=.85-.92). Each of the Health-ITUES subscales and the overall scale was moderately to strongly correlated with the PSSUQ scales (r=.46-.70), demonstrating the criterion validity of the Health-ITUES. Conclusions: The Health-ITUES has demonstrated reliability and validity for use in assessing the usability of mHealth technologies in community-dwelling adults living with a chronic illness. PMID:29305343
NASA Technical Reports Server (NTRS)
Mandl, Dan; Howard, Joseph
2000-01-01
The New Millennium Program's first Earth-observing mission (EO-1) is a technology validation mission. It is managed by the NASA Goddard Space Flight Center in Greenbelt, Maryland, and is scheduled for launch in the summer of 2000. The purpose of this mission is to flight-validate revolutionary technologies that will contribute to reducing the cost and increasing the capabilities of future land imaging missions. The EO-1 mission carries five instrument technologies, five spacecraft technologies, and three supporting technologies to flight-validate during a year of operations. EO-1 operations and the accompanying ground system were intended to be simple in order to maintain low operational costs. For purposes of formulating operations, EO-1 was initially modeled as a small science mission. However, it quickly evolved into a more complex mission due to the difficulties in effectively integrating all of the validation plans of the individual technologies. As a consequence, more operational support was required to confidently complete the on-orbit validation of the new technologies. This paper outlines the issues and lessons learned applicable to future technology validation missions. Examples include the following: (1) the operational complexity encountered in integrating all of the validation plans into a coherent operational plan, (2) the initial desire to run single-shift operations subsequently growing into "around-the-clock" operations, (3) managing changes in the technologies that ultimately affected operations, (4) the necessity for better team communications within the project to offset the effects of change on the Ground System Developers, Operations Engineers, Integration and Test Engineers, S/C Subsystem Engineers, and Scientists, and (5) the need for a more experienced Flight Operations Team to achieve the necessary operational flexibility. The discussion concludes with several cost comparisons between operations development on previous missions and on EO-1, and with some things that might be done differently for future technology validation missions.
How Developments in Psychology and Technology Challenge Validity Argumentation
ERIC Educational Resources Information Center
Mislevy, Robert J.
2016-01-01
Validity is the sine qua non of properties of educational assessment. While a theory of validity and a practical framework for validation have emerged over the past decades, most of the discussion has addressed familiar forms of assessment and psychological framings. Advances in digital technologies and in cognitive and social psychology have…
Potential Benefits of Manmade Opals Demonstrated for First Time (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
NREL experiments show that disordered inverse opals significantly scatter and trap near-infrared light, with possible impact on optoelectronic materials. Inverse opals, familiar in the form of brilliantly colored opal gemstones, are a class of materials with astounding optical properties. Scientists have been exploring the ability of inverse opals to manipulate light in the hopes of harnessing this capacity for advanced technologies such as displays, detectors, lasers, and photovoltaics. A research group at the National Renewable Energy Laboratory (NREL) discovered that man-made inverse opal films containing significant morphological disorder exhibit substantial light scattering, consequently trapping wavelengths in the near-infrared (NIR), which is important to a number of technologies. This discovery is the first experimental evidence to validate a 2005 theoretical model predicting the confinement of light in such structures, and it holds great promise for improving the performance of technologies that rely on careful light control. This breakthrough also makes possible optoelectronic technologies that use a range of low-cost molecular and semiconductor species that otherwise absorb light too weakly to be useful. The disordered inverse opal architecture validates the theoretical model that predicts the diffusion and confinement of light in such structures. Electrochemically deposited CdSe inverse opal films containing significant morphological disorder exhibit substantial light scattering and consequent NIR light trapping. This discovery holds promise for NIR light management in optoelectronic technologies, particularly those involving weakly absorbing molecular and semiconductor photomaterials.
Marketing Plan for Demonstration and Validation Assets
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
The National Security Preparedness Project (NSPP) is to be sustained by various programs, including technology demonstration and validation (DEMVAL). The DEMVAL project assists companies in developing technologies under the National Security Technology Incubator (NSTI) program through the demonstration and validation of national security technologies created by incubators and other sources. The NSPP will also support the creation of an integrated demonstration and validation environment. This report documents the DEMVAL marketing and visibility plan, which will focus on collecting information about, and expanding the visibility of, DEMVAL assets serving businesses with national security technology applications in southern New Mexico.
The Use of Smartphones in Norwegian Social Care Services.
Hansen, Linda Iren Mihaila; Fruhling, Ann; Fossum, Mariann
2016-01-01
This study aims to understand how smartphone technology was perceived by social workers responsible for piloting social services software, and to examine the experience of involving end-users as co-developers. The pilot resulted in an improved match between the smartphone software and the workflow, as well as mutual learning experiences among the social workers, clients, and the vendor. The pilot study revealed several graphical user interface (GUI) and functionality challenges. Implementing an ICT social service smartphone application may further improve efficiency for social workers serving citizens; however, this study validates the importance of studying end-users' experiences with communication and the real-time use of the system in order to reap the anticipated benefits of ICT capabilities in smartphone social service applications.
SMART-1 Technology and Science Experiments in Preparation of Future Missions and ESA Cornerstones
NASA Astrophysics Data System (ADS)
Marini, A. E.; Racca, G. D.; Foing, B. H.; SMART-1 Project
1999-12-01
SMART-1 is the first ESA Small Mission for Advanced Research in Technology, aimed at the demonstration of enabling technologies for future scientific missions. SMART-1's prime technology objective is the demonstration of solar-powered primary electric propulsion, a key technology for future interplanetary missions. SMART-1 will use a Stationary Plasma Thruster engine, cruising for 15 months before capture into a polar orbit around the Moon. A gallery of images of the spacecraft is available at the web site: http://www.estec.esa.nl/spdwww/smart1/html/11742.html The SMART-1 payload aims at monitoring the electric propulsion system and its spacecraft environment and at testing novel instrument technologies. The diagnostic instruments include SPEDE, a spacecraft-potential, plasma and charged-particle detector that characterises both the spacecraft and planetary environments, together with EPDP, a suite of sensors monitoring secondary thrust ions, charging and deposition effects. Innovative spacecraft technologies will also be tested on SMART-1: lithium batteries, and KATE, an experimental X/Ka-band deep-space transponder that supports radio science, monitors the accelerations produced by the electric propulsion and tests turbo-code techniques to enhance the return of scientific data. The scientific instruments for imaging and spectrometry are: D-CIXS, a compact X-ray spectrometer based on novel SCD detectors and micro-structure optics, to observe celestial X-ray objects and to perform lunar chemistry measurements; SIR, a miniaturised quasi-monolithic point spectrometer operating in the near-IR (0.9-2.4 micron), to survey the lunar crust in previously uncovered optical regions; and AMIE, a miniature camera based on 3-D integrated electronics, imaging the Moon and other bodies and supporting LASER-LINK and RSIS. RSIS and LASER-LINK are investigations performed with the SMART-1 payload. RSIS is a radio-science experiment to validate the in-orbit determination of the libration of the celestial target, based on high-accuracy Ka-band tracking and imaging of a surface landmark. LASER-LINK is a demonstration of the acquisition of a deep-space laser link from the ESA Optical Ground Station at Tenerife, also validating the novel sub-apertured telescope designed to mitigate atmospheric scintillation disturbances.
Fallavollita, Pascal; Brand, Alexander; Wang, Lejing; Euler, Ekkehard; Thaller, Peter; Navab, Nassir; Weidert, Simon
2016-11-01
Determination of lower limb alignment is a prerequisite for successful orthopedic surgical treatment. Traditional methods include the electrocautery cord, alignment rod, or axis board, which rely solely on C-arm fluoroscopy for navigation and are radiation-intensive. The aim of this study was to assess a new augmented reality technology for determining lower limb alignment. A camera-augmented mobile C-arm (CamC) technology was used to create a panorama image consisting of hip, knee, and ankle X-rays. Twenty-five human cadaver legs with random varus or valgus deformations were used for validation. Five clinicians performed experiments that consisted of achieving an acceptable mechanical axis deviation. The applicability of the CamC technology was assessed by direct comparison with ground-truth CT. A t test, Pearson's correlation, and ANOVA were used to determine statistical significance. Pearson's correlation coefficient R was 0.979, which demonstrates a strong positive correlation between the CamC and ground-truth CT data. The analysis of variance produced a p value of 0.911, signifying that differences in clinician expertise were not significant with regard to the type of system used to assess mechanical axis deviation. All described measurements demonstrated valid measurement of lower limb alignment. With minimal effort, clinicians required only 3 X-ray image acquisitions using the augmented reality technology to achieve a reliable mechanical axis deviation.
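A minimal sketch of the two reported analyses, run on synthetic numbers rather than the study's measurements: Pearson correlation between CamC and ground-truth CT mechanical-axis-deviation (MAD) values, and a one-way ANOVA across clinicians. The data layout and the per-clinician split are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
ct = rng.uniform(0, 15, size=25)              # 25 cadaver legs, MAD in mm (invented)
camc = ct + rng.normal(scale=0.5, size=25)    # CamC readings tracking CT closely

r, p = stats.pearsonr(camc, ct)
print(f"Pearson R = {r:.3f} (the study reported 0.979)")

# One-way ANOVA: does measured MAD differ by clinician? A large p value,
# as in the study (p = 0.911), means expertise differences are not significant.
by_clinician = [camc[i::5] for i in range(5)]  # toy split across 5 clinicians
f, p_anova = stats.f_oneway(*by_clinician)
print(f"ANOVA p = {p_anova:.3f}")
```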
Animal Models of Depression: Molecular Perspectives
Krishnan, Vaishnav; Nestler, Eric J.
2012-01-01
Much of the current understanding about the pathogenesis of altered mood, impaired concentration and neurovegetative symptoms in major depression has come from animal models. However, because of the unique and complex features of human depression, the generation of valid and insightful depression models has been less straightforward than modeling other disabling diseases like cancer or autoimmune conditions. Today’s popular depression models creatively merge ethologically valid behavioral assays with the latest technological advances in molecular biology and automated video-tracking. This chapter reviews depression assays involving acute stress (e.g., forced swim test), models consisting of prolonged physical or social stress (e.g., social defeat), models of secondary depression, genetic models, and experiments designed to elucidate the mechanisms of antidepressant action. These paradigms are critically evaluated in relation to their ease, validity and replicability, the molecular insights that they have provided, and their capacity to offer the next generation of therapeutics for depression. PMID:21225412
NASA Astrophysics Data System (ADS)
Jarboe, Thomas; Marklin, George; Nelson, Brian; Sutherland, Derek; HIT Team
2013-10-01
A proof of principle experiment to study closed-flux energy confinement of a spheromak sustained by imposed dynamo current drive is described. A validated two-fluid NIMROD code has simulated closed-flux sustainment of a stable spheromak using imposed dynamo current drive (IDCD), demonstrating that dynamo current drive is compatible with closed flux (submitted for publication; see adjacent poster). HIT-SI, with a = 0.25 m, has achieved 90 kA of toroidal current, current gains of nearly 4, and operation from 5.5 kHz to 68 kHz, demonstrating the robustness of the method. Finally, a reactor design study using fusion technology developed for ITER and modern nuclear technology shows a design that is economically superior to coal. The spheromak reactor and its development path are about a factor of 10 less expensive than those of the tokamak/stellarator. These exciting results justify a proof of principle (PoP) confinement experiment on a spheromak sustained by IDCD. Such an experiment (R = 1.5 m, a = 1 m, I_tor = 3.2 MA, n = 4e19 m^-3, T = 3 keV) is described in detail.
HOST payload for STS-95 being moved into SSPF
NASA Technical Reports Server (NTRS)
1998-01-01
The Hubble Space Telescope Orbiting Systems Test (HOST) is being raised to a workstand by technicians in the Space Station Processing Facility (SSPF). One of the payloads on the STS-95 mission, the HOST platform is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry three other payloads: the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker, and the SPACEHAB single module with experiments on space flight and the aging process.
Potential Follow on Experiments for the Zero Boil Off Tank Experiment
NASA Technical Reports Server (NTRS)
Chato, David; Kassemi, Mohammad
2014-01-01
Cryogenic storage and transfer are enabling propulsion technologies on the direct path of nearly all future human or robotic missions, and the area has been identified by NASA as having the greatest potential for cost savings. This proposal aims at resolving the fundamental scientific issues behind the engineering development of the storage tanks. We propose to use the ISS laboratory to generate and collect archival scientific data, to raise our current state-of-the-art understanding of the transport and phase-change issues affecting storage-tank cryogenic fluid management (CFM), and to develop and validate state-of-the-art CFD models to innovate, optimize, and advance future engineering designs.
QuEST: Qualifying Environmentally Sustainable Technologies. Volume 2
NASA Technical Reports Server (NTRS)
Brown, Christina (Editor)
2007-01-01
TEERM focuses its validation efforts on technologies that have shown promise in laboratory testing but lack testing under realistic or field environments. Mature technologies have advantages over those that are still in the developmental stage, such as being more likely to be transitioned into a working environment. One way TEERM begins to evaluate the suitability of technologies is through Technology Readiness Levels (TRLs). TRLs are a systematic metric/measurement system that supports assessments of the maturity of a particular technology and the consistent comparison of maturity between different types of technology. TEERM generally works on demonstrating/validating alternatives that fall within TRLs 5-9. In instances where a mature technology does not exist for a particular Agency application, TEERM works with technology development groups and programs such as NASA's Innovative Partnerships Program (IPP). The IPP's purpose is to identify and document available technologies in light of NASA's needs, evaluate and prioritize those technologies, and reach out to find new partners. All TEERM projects involve multiple partners. Partnering reduces the duplication of effort that otherwise might occur if individuals worked their problems alone. Partnering also helps reduce individual contributors' shares of the total cost of technology validation. Through collaboration and financial commitment from project stakeholders and third-party sources, it is possible to fully fund expensive demonstration/validation efforts.
Comparison of Requirements for Composite Structures for Aircraft and Space Applications
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Elliott, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.
2010-01-01
In this paper, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry, and to adopt methodology that is applicable for space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.
X-37 Flight Demonstrator Project: Capabilities for Future Space Transportation System Development
NASA Technical Reports Server (NTRS)
Dumbacher, Daniel L.
2004-01-01
The X-37 Approach and Landing Test Vehicle (ALTV) is an automated (unmanned) spacecraft designed to reduce technical risk in the descent and landing phases of flight. ALTV mission requirements and Orbital Vehicle (OV) technology research and development (R&D) goals are formulated to validate and mature high-payoff ground and flight technologies such as Thermal Protection Systems (TPS). It has been more than three decades since the Space Shuttle was designed and built. Real-world hardware experience gained through the multitude of X-37 Project activities has expanded both Government and industry knowledge of the challenges involved in developing new generations of spacecraft that can fulfill the Vision for Space Exploration.
ERIC Educational Resources Information Center
Liou, Pey-Yan; Kuo, Pei-Jung
2014-01-01
Background: Few studies have examined students' attitudinal perceptions of technology. There is no appropriate instrument to measure senior high school students' motivation and self-regulation toward technology learning among the current existing instruments in the field of technology education. Purpose: The present study is to validate an…
Tabacu, Stefan
2015-01-01
In this paper, a methodology for the development and validation of a numerical model of the human head using generic procedures is presented. All of the required steps, starting with model generation and continuing through model validation and applications, are discussed. The proposed model may be considered a dual model due to its capability to switch from a deformable to a rigid body according to the application's requirements. The first step is to generate the numerical model of the human head using geometry files or medical images. The stiffness and damping required for the elastic connection used in the rigid-body model are identified by performing a natural frequency analysis. The presented applications for model validation are related to impact analysis. The first case relates to Nahum's experiments (Nahum and Smith 1970), with pressure data evaluated and a pressure map generated using the results from discrete elements. For the second case, the relative displacement between the brain and the skull is evaluated according to Hardy's experiments (Hardy WH, Foster CD, Mason MJ, Yang KH, King A, Tashman S. 2001. Investigation of head injury mechanisms using neutral density technology and high-speed biplanar X-ray. Stapp Car Crash J. 45:337-368, SAE Paper 2001-22-0016). The main objective is to validate the rigid model as a quick and versatile tool for acquiring the input data for specific brain analyses.
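The identification step can be pictured with the standard single-degree-of-freedom relations; this is a simplifying assumption for illustration (the paper's identification may involve multiple modes), with m the lumped mass, f_n the measured natural frequency, and ζ an assumed damping ratio:

```latex
% SDOF idealization: stiffness and damping from a measured natural frequency
\omega_n = 2\pi f_n, \qquad
k = m\,\omega_n^2, \qquad
c = 2\zeta\sqrt{k\,m}
```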
NASA Aerospace Flight Battery Systems Program Update
NASA Technical Reports Server (NTRS)
Manzo, Michelle; ODonnell, Patricia
1997-01-01
The objectives of NASA's Aerospace Flight Battery Systems Program are to: develop, maintain, and provide tools for the validation and assessment of aerospace battery technologies; accelerate the readiness of technology advances and provide infusion paths for emerging technologies; provide NASA projects with the required database and validation guidelines for technology selection of hardware and processes relating to aerospace batteries; disseminate validation and assessment tools, quality assurance, reliability, and availability information to the NASA and aerospace battery communities; and ensure that safe, reliable batteries are available for NASA's future missions.
Combustion Integration Rack (CIR) Testing
2015-02-18
Fluids and Combustion Facility (FCF), Combustion Integration Rack (CIR) during testing in the Structural Dynamics Laboratory (SDL). The Fluids and Combustion Facility (FCF) is a set of two International Space Station (ISS) research facilities designed to support physical and biological experiments in support of technology development and validation in space. The FCF consists of two modular, reconfigurable racks called the Combustion Integration Rack (CIR) and the Fluids Integration Rack (FIR). The CIR and FIR were developed at NASA's Glenn Research Center.
Vázquez, Enrique
2017-01-01
Internet of Things platforms for Smart Cities are technologically complex and deploying them at large scale involves high costs and risks. Therefore, pilot schemes that allow validating proof of concepts, experimenting with different technologies and services, and fine-tuning them before migrating them to actual scenarios, are especially important in this context. The IoT platform deployed across the engineering schools of the Universidad Politécnica de Madrid in the Moncloa Campus of International Excellence represents a good example of a test bench for experimentation with Smart City services. This paper presents the main features of this platform, putting special emphasis on the technological challenges faced and on the solutions adopted, as well as on the functionality, services and potential that the platform offers. PMID:29292790
Alvarez-Campana, Manuel; López, Gregorio; Vázquez, Enrique; Villagrá, Víctor A; Berrocal, Julio
2017-12-08
Internet of Things platforms for Smart Cities are technologically complex and deploying them at large scale involves high costs and risks. Therefore, pilot schemes that allow validating proof of concepts, experimenting with different technologies and services, and fine-tuning them before migrating them to actual scenarios, are especially important in this context. The IoT platform deployed across the engineering schools of the Universidad Politécnica de Madrid in the Moncloa Campus of International Excellence represents a good example of a test bench for experimentation with Smart City services. This paper presents the main features of this platform, putting special emphasis on the technological challenges faced and on the solutions adopted, as well as on the functionality, services and potential that the platform offers.
Deep Space Networking Experiments on the EPOXI Spacecraft
NASA Technical Reports Server (NTRS)
Jones, Ross M.
2011-01-01
NASA's Space Communications & Navigation Program within the Space Operations Directorate is operating a program to develop and deploy Disruption Tolerant Networking (DTN) technology for a wide variety of mission types by the end of 2011. DTN is an enabling element of the Interplanetary Internet, where terrestrial networking protocols are generally unsuitable because they rely on timely and continuous end-to-end delivery of data and acknowledgments. In the fall of 2008, 2009, and 2011, the Jet Propulsion Laboratory installed and tested essential elements of DTN technology on the Deep Impact spacecraft. These experiments, called the Deep Impact Network Experiments (DINET), were performed in close cooperation with the EPOXI project, which has responsibility for the spacecraft. For DINET 1, the software was installed on the backup software partition of the backup flight computer, and the spacecraft was at a distance of about 15 million miles (24 million kilometers) from Earth. During DINET 1, 300 images were transmitted from the JPL nodes to the spacecraft. They were then automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. The first experiment, DINET 1, successfully validated many of the essential elements of the DTN protocols. DINET 2: 1) demonstrated additional DTN functionality, 2) automated certain tasks that were manually implemented in DINET 1, and 3) installed the ION software on nodes outside of JPL. DINET 3 plans to: 1) upgrade the LTP convergence-layer adapter to conform to the international LTP CL specification, 2) add convergence-layer "stewardship" procedures, and 3) add the BSP security elements (PIB & PCB). This paper describes the planning and execution of the flight experiment and the validation results.
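To make the store-and-forward idea concrete, here is a toy sketch, not the ION flight software: bundles wait in a node's storage while a contact is closed and move only when it opens, which is the behavior the 27-day experiment exercised at interplanetary scale. All names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    storage: list = field(default_factory=list)   # bundles awaiting a contact

def forward(src: Node, dst: Node, contact_open: bool) -> None:
    """Move queued bundles only while the contact is open; otherwise hold."""
    if not contact_open:
        return                                    # disruption: keep custody locally
    while src.storage:
        dst.storage.append(src.storage.pop(0))

ground, spacecraft = Node("JPL"), Node("EPOXI")
ground.storage = [f"image-{i}" for i in range(3)]

forward(ground, spacecraft, contact_open=False)   # link down: nothing moves
print(spacecraft.storage)                         # []
forward(ground, spacecraft, contact_open=True)    # contact window opens
print(spacecraft.storage)                         # ['image-0', 'image-1', 'image-2']
```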
NASA Astrophysics Data System (ADS)
Dufaux, Frederic
2011-06-01
The issue of privacy in video surveillance has drawn a lot of interest lately. However, thorough performance analysis and validation is still lacking, especially regarding the fulfillment of privacy-related requirements. In this paper, we first review recent Privacy Enabling Technologies (PET). Next, we discuss pertinent evaluation criteria for effective privacy protection. We then put forward a framework to assess the capacity of PET solutions to hide distinguishing facial information and to conceal identity. We conduct comprehensive and rigorous experiments to evaluate the performance of face recognition algorithms applied to images altered by PET. Results show the ineffectiveness of naïve PET such as pixelization and blur. Conversely, they demonstrate the effectiveness of more sophisticated scrambling techniques to foil face recognition.
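The two "naive" PETs the paper evaluates are easy to state in code; below is a minimal sketch applying pixelization and Gaussian blur to a grayscale face region held in a numpy array. The block size, sigma, and the random stand-in image are illustrative assumptions, not the paper's parameters.

```python
import numpy as np
from scipy import ndimage

def pixelize(img: np.ndarray, block: int = 8) -> np.ndarray:
    """Replace each block x block tile by its mean value."""
    h, w = img.shape
    out = img[: h - h % block, : w - w % block].copy()
    for i in range(0, out.shape[0], block):
        for j in range(0, out.shape[1], block):
            out[i : i + block, j : j + block] = out[i : i + block, j : j + block].mean()
    return out

def blur(img: np.ndarray, sigma: float = 3.0) -> np.ndarray:
    return ndimage.gaussian_filter(img, sigma)

face = np.random.default_rng(2).uniform(0, 255, size=(64, 64))  # stand-in face region
pixelated = pixelize(face)
blurred = blur(face)
# The paper's finding: enough facial structure survives these filters for face
# recognition to succeed, so stronger scrambling is needed to conceal identity.
```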
Christou, Nicolas; Dinov, Ivo D
2010-09-01
Many modern technological advances have a direct impact on the format, style, and efficacy of delivering and consuming educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive, and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources (SOCR) in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations, and virtual experiments were used in different courses as the treatment and compared to matched control classes utilizing traditional pedagogical approaches. The qualitative and quantitative data we collected for all courses included the Felder-Silverman-Soloman index of learning styles, a background assessment, pre and post surveys of attitude towards the subject, an end-point satisfaction survey, and a variety of quiz, laboratory, and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum of courses in the science education curriculum. The two critical components of improving science education via blended instruction are instructor training, and the development of appropriate activities, simulations, and interactive resources.
Christou, Nicolas; Dinov, Ivo D.
2011-01-01
Many modern technological advances have a direct impact on the format, style, and efficacy of delivering and consuming educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive, and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources (SOCR) in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations, and virtual experiments were used in different courses as the treatment and compared to matched control classes utilizing traditional pedagogical approaches. The qualitative and quantitative data we collected for all courses included the Felder-Silverman-Soloman index of learning styles, a background assessment, pre and post surveys of attitude towards the subject, an end-point satisfaction survey, and a variety of quiz, laboratory, and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum of courses in the science education curriculum. The two critical components of improving science education via blended instruction are instructor training, and the development of appropriate activities, simulations, and interactive resources. PMID:21603097
ERIC Educational Resources Information Center
Scherer, Marcia J.; McKee, Barbara G.
Validity and reliability data are presented for two instruments for assessing the predispositions that people have toward the use of assistive and educational technologies. The two instruments, the Assistive Technology Device Predisposition Assessment (ATDPA) and the Educational Technology Predisposition Assessment (ETPA), are self-report…
ERIC Educational Resources Information Center
Romine, William; Sadler, Troy D.; Presley, Morgan; Klosterman, Michelle L.
2014-01-01
This study presents the systematic development, validation, and use of a new instrument for measuring student interest in science and technology. The Student Interest in Technology and Science (SITS) survey is composed of 5 sub-sections assessing the following dimensions: interest in learning science, using technology to learn science, science…
Overview of a Proposed Flight Validation of Aerocapture System Technology for Planetary Missions
NASA Technical Reports Server (NTRS)
Keys, Andrew S.; Hall, Jeffery L.; Oh, David; Munk, Michelle M.
2006-01-01
Aerocapture System Technology for Planetary Missions is being proposed to NASA's New Millennium Program for flight aboard the Space Technology 9 (ST9) flight opportunity. The proposed ST9 aerocapture mission is a system-level flight validation of the aerocapture maneuver as performed by an instrumented, high-fidelity flight vehicle within a true in-space and atmospheric environment. Successful validation of the aerocapture maneuver will be enabled through the flight validation of an advanced guidance, navigation, and control system as developed by Ball Aerospace and two advanced Thermal Protection System (TPS) materials, Silicon Refined Ablative Material-20 (SRAM-20) and SRAM-14, as developed by Applied Research Associates (ARA) Ablatives Laboratory. The ST9 aerocapture flight validation will be sufficient for immediate infusion of these technologies into NASA science missions being proposed for flight to a variety of Solar System destinations possessing a significant planetary atmosphere.
2018-01-01
Artificial intelligence (AI) is projected to substantially influence clinical practice in the foreseeable future. However, despite the excitement around the technologies, it is yet rare to see examples of robust clinical validation of the technologies and, as a result, very few are currently in clinical use. A thorough, systematic validation of AI technologies using adequately designed clinical research studies before their integration into clinical practice is critical to ensure patient benefit and safety while avoiding any inadvertent harms. We would like to suggest several specific points regarding the role that peer-reviewed medical journals can play, in terms of study design, registration, and reporting, to help achieve proper and meaningful clinical validation of AI technologies designed to make medical diagnosis and prediction, focusing on the evaluation of diagnostic accuracy efficacy. Peer-reviewed medical journals can encourage investigators who wish to validate the performance of AI systems for medical diagnosis and prediction to pay closer attention to the factors listed in this article by emphasizing their importance. Thereby, peer-reviewed medical journals can ultimately facilitate translating the technological innovations into real-world practice while securing patient safety and benefit. PMID:29805337
Park, Seong Ho; Kressel, Herbert Y
2018-05-28
Artificial intelligence (AI) is projected to substantially influence clinical practice in the foreseeable future. However, despite the excitement around the technologies, it is yet rare to see examples of robust clinical validation of the technologies and, as a result, very few are currently in clinical use. A thorough, systematic validation of AI technologies using adequately designed clinical research studies before their integration into clinical practice is critical to ensure patient benefit and safety while avoiding any inadvertent harms. We would like to suggest several specific points regarding the role that peer-reviewed medical journals can play, in terms of study design, registration, and reporting, to help achieve proper and meaningful clinical validation of AI technologies designed to make medical diagnosis and prediction, focusing on the evaluation of diagnostic accuracy efficacy. Peer-reviewed medical journals can encourage investigators who wish to validate the performance of AI systems for medical diagnosis and prediction to pay closer attention to the factors listed in this article by emphasizing their importance. Thereby, peer-reviewed medical journals can ultimately facilitate translating the technological innovations into real-world practice while securing patient safety and benefit.
Corvi, Raffaella; Ahr, Hans-Jürgen; Albertini, Silvio; Blakey, David H.; Clerici, Libero; Coecke, Sandra; Douglas, George R.; Gribaldo, Laura; Groten, John P.; Haase, Bernd; Hamernik, Karen; Hartung, Thomas; Inoue, Tohru; Indans, Ian; Maurici, Daniela; Orphanides, George; Rembges, Diana; Sansone, Susanna-Assunta; Snape, Jason R.; Toda, Eisaku; Tong, Weida; van Delft, Joost H.; Weis, Brenda; Schechtman, Leonard M.
2006-01-01
This is the report of the first workshop “Validation of Toxicogenomics-Based Test Systems” held 11–12 December 2003 in Ispra, Italy. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and organized jointly by ECVAM, the U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM). The primary aim of the workshop was for participants to discuss and define principles applicable to the validation of toxicogenomics platforms as well as validation of specific toxicologic test methods that incorporate toxicogenomics technologies. The workshop was viewed as an opportunity for initiating a dialogue between technologic experts, regulators, and the principal validation bodies and for identifying those factors to which the validation process would be applicable. It was felt that to do so now, as the technology is evolving and associated challenges are identified, would be a basis for the future validation of the technology when it reaches the appropriate stage. Because of the complexity of the issue, different aspects of the validation of toxicogenomics-based test methods were covered. The three focus areas include a) biologic validation of toxicogenomics-based test methods for regulatory decision making, b) technical and bioinformatics aspects related to validation, and c) validation issues as they relate to regulatory acceptance and use of toxicogenomics-based test methods. In this report we summarize the discussions and describe in detail the recommendations for future direction and priorities. PMID:16507466
Corvi, Raffaella; Ahr, Hans-Jürgen; Albertini, Silvio; Blakey, David H; Clerici, Libero; Coecke, Sandra; Douglas, George R; Gribaldo, Laura; Groten, John P; Haase, Bernd; Hamernik, Karen; Hartung, Thomas; Inoue, Tohru; Indans, Ian; Maurici, Daniela; Orphanides, George; Rembges, Diana; Sansone, Susanna-Assunta; Snape, Jason R; Toda, Eisaku; Tong, Weida; van Delft, Joost H; Weis, Brenda; Schechtman, Leonard M
2006-03-01
This is the report of the first workshop "Validation of Toxicogenomics-Based Test Systems" held 11-12 December 2003 in Ispra, Italy. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and organized jointly by ECVAM, the U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM). The primary aim of the workshop was for participants to discuss and define principles applicable to the validation of toxicogenomics platforms as well as validation of specific toxicologic test methods that incorporate toxicogenomics technologies. The workshop was viewed as an opportunity for initiating a dialogue between technologic experts, regulators, and the principal validation bodies and for identifying those factors to which the validation process would be applicable. It was felt that to do so now, as the technology is evolving and associated challenges are identified, would be a basis for the future validation of the technology when it reaches the appropriate stage. Because of the complexity of the issue, different aspects of the validation of toxicogenomics-based test methods were covered. The three focus areas include a) biologic validation of toxicogenomics-based test methods for regulatory decision making, b) technical and bioinformatics aspects related to validation, and c) validation issues as they relate to regulatory acceptance and use of toxicogenomics-based test methods. In this report we summarize the discussions and describe in detail the recommendations for future direction and priorities.
Content-based VLE designs improve learning efficiency in constructivist statistics education.
Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward
2011-01-01
We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that they would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective, we provide empirical evidence about the usefulness of PR as a constructivist learning activity that supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Both VLE systems were tested within a two-year quasi-experiment based on a reliable nonequivalent group design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects of the experiment's internal validity are explained extensively. The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a content-based design outperforms the traditional VLE-based design.
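In a nonequivalent group design the two cohorts can differ before treatment, so one standard analysis is ANCOVA: regress the outcome on a pretest covariate plus a group indicator, so that preexisting differences are adjusted for. The sketch below is a generic illustration on synthetic data, not the paper's actual analysis; all variable names and effect sizes are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 200
pre = rng.normal(60, 10, size=n)                  # pretest / prior-ability proxy
group = np.repeat(["vle", "sle"], n // 2)         # the two competing designs
post = pre + (group == "sle") * 4 + rng.normal(0, 5, size=n)  # SLE simulated +4

df = pd.DataFrame({"pre": pre, "group": group, "post": post})
model = smf.ols("post ~ pre + C(group)", data=df).fit()
# The C(group)[T.vle] coefficient estimates the design effect after adjusting
# for pretest differences (about -4 here, since "sle" was simulated as better).
print(model.params)
```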
Validation of Air-Backed Underwater Explosion Experiments with ALE3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leininger, L D
2005-02-04
This paper summarizes an exercise carried out to validate the process of implementing LLNL's ALE3D to predict the permanent deformation and rupture of an air-backed steel plate subjected to underwater shock. Experiments were performed in a shock tank at the Naval Science and Technology Laboratory in Visakhapatnam, India, and the results are documented in the reference. A consistent set of air-backed plates is subjected to shocks from increasing weights of explosive, ranging from 10 g to 80 g. At 40 g and above, rupture is recorded in the experiment and, since fracture mechanics is not implemented in ALE3D, only the 10 g, 20 g, and 30 g cases are presented here. The methodology applies the Jones-Wilkins-Lee (JWL) equation of state (EOS) to predict the pressure of the expanding detonation products and the Grüneisen EOS for water under highly dynamic compressible flow, both on 1-point integrated 3-D continuum elements. The steel plates are given a bilinear elastic-plastic response with failure and are simulated with 3-point integrated shell elements. Failure for this exercise is based on effective (or equivalent) plastic strain.
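For reference, the JWL equation of state gives the detonation-product pressure as a function of relative volume V and internal energy per unit volume E, with A, B, R_1, R_2, and ω as explosive-specific constants (the paper's parameter values are not reproduced here):

```latex
P = A\left(1 - \frac{\omega}{R_1 V}\right)e^{-R_1 V}
  + B\left(1 - \frac{\omega}{R_2 V}\right)e^{-R_2 V}
  + \frac{\omega E}{V}
```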
Graduate students' teaching experiences improve their methodological research skills.
Feldon, David F; Peugh, James; Timmerman, Briana E; Maher, Michelle A; Hurst, Melissa; Strickland, Denise; Gilmore, Joanna A; Stiegelmeyer, Cindy
2011-08-19
Science, technology, engineering, and mathematics (STEM) graduate students are often encouraged to maximize their engagement with supervised research and minimize teaching obligations. However, the process of teaching students engaged in inquiry provides practice in the application of important research skills. Using a performance rubric, we compared the quality of methodological skills demonstrated in written research proposals for two groups of early career graduate students (those with both teaching and research responsibilities and those with only research responsibilities) at the beginning and end of an academic year. After statistically controlling for preexisting differences between groups, students who both taught and conducted research demonstrate significantly greater improvement in their abilities to generate testable hypotheses and design valid experiments. These results indicate that teaching experience can contribute substantially to the improvement of essential research skills.
Two Cultures in Modern Science and Technology: For Safety and Validity Does Medicine Have to Update?
Becker, Robert E
2016-01-11
Two different scientific cultures go unreconciled in modern medicine. Each culture accepts that scientific knowledge and technologies are vulnerable to, and easily invalidated by, the methods and conditions of acquisition, interpretation, and application. How these vulnerabilities are addressed separates the 2 cultures and potentially explains medicine's difficulties eradicating errors. A traditional culture, dominant in medicine, leaves error control in the hands of individual and group investigators and practitioners. A competing modern scientific culture accepts errors as inevitable, pernicious, and pervasive sources of adverse events throughout medical research and patient care, too malignant for individuals or groups to control. Error risks to the validity of scientific knowledge and to safety in patient care require systemwide programming able to support a culture in medicine grounded in tested, continually updated, widely promulgated, and uniformly implemented standards of practice for research and patient care. Experiences from successes in other sciences and industries strongly support the need for leadership from the Institute of Medicine's recommended Center for Patient Safety within the Federal Executive branch of government.
NASA Technical Reports Server (NTRS)
Ziemer, John; Marrese-Reading, Colleen; Dunn, Charley; Romero-Wolf, Andrew; Cutler, Curt; Javidnia, Shahram; Li, Thanh; Li, Irena; Franklin, Garth; Barela, Phil;
2017-01-01
Space Technology 7 Disturbance Reduction System (ST7-DRS) is a NASA technology demonstration payload on the ESA LISA Pathfinder (LPF) mission, which launched on December 3, 2015. The ST7-DRS payload includes colloid microthrusters as part of a drag-free dynamic control system (DCS) hosted on an integrated avionics unit (IAU), with spacecraft attitude and test mass position provided by the LPF spacecraft computer and the highly sensitive gravitational reference sensor (GRS) that is part of the LISA Technology Package (LTP). The objective of the DRS was to validate two technologies: colloid micro-Newton thrusters (CMNT) to provide low-noise control capability of the spacecraft, and drag-free flight control. The CMNTs were developed by Busek Co., Inc., in partnership with the NASA Jet Propulsion Laboratory (JPL), and the DCS algorithms and flight software were developed at NASA Goddard Space Flight Center (GSFC). ST7-DRS demonstrated drag-free operation with 10 nm/√Hz-level precision spacecraft position control along the primary axis of the LTP, using eight CMNTs that provided 5-30 μN each with 0.1 μN precision. The DCS and CMNTs performed as required and as expected from ground test results, meeting all Level 1 requirements based on on-orbit data and analysis. The DRS microthrusters operated for 2400 hours in flight during commissioning activities, a 90-day experiment, and the extended mission. This mission represents the first validated demonstration of electrospray thrusters in space, providing precision spacecraft control and drag-free operation in a flight environment, with applications to future gravitational wave observatories such as LISA.
Autonomous formation flying based on GPS — PRISMA flight results
NASA Astrophysics Data System (ADS)
D'Amico, Simone; Ardaens, Jean-Sebastien; De Florio, Sergio
2013-01-01
This paper presents flight results from the early harvest of the Spaceborne Autonomous Formation Flying Experiment (SAFE) conducted in the frame of the Swedish PRISMA technology demonstration mission. SAFE represents one of the first demonstrations in low Earth orbit of an advanced guidance, navigation and control system for dual-spacecraft formations. Innovative techniques based on differential GPS-based navigation and relative orbital elements control are validated and tuned in orbit to fulfill the typical requirements of future distributed scientific instruments for remote sensing.
1989-09-25
Orders and test specifications. Some mandatory replacements of high-failure items are directed by Technical Orders to extend MTBF. Precision bearing and...Experience is very high but natural attrition is reducing the numbers faster than training is furnishing younger mechanics. Surge conditions would be...model validation run output revealed that utilization of equipment is very low and manpower is high. Based on this analysis and the brainstorming
PHM for Ground Support Systems Case Study: From Requirements to Integration
NASA Technical Reports Server (NTRS)
Teubert, Chris
2015-01-01
This session will detail the experience of members of the NASA Ames Prognostics Center of Excellence (PCoE) in producing PHM tools for NASA Advanced Ground Support Systems, including the challenges of applying their research in a production environment. Specifically, we will 1) go over the systems engineering and review process used; 2) discuss the challenges and pitfalls in this process; 3) discuss software architecting, documentation, and verification and validation activities; and 4) discuss challenges in communicating the benefits and limitations of PHM technologies.
High-speed inlet research program and supporting analysis
NASA Technical Reports Server (NTRS)
Coltrin, Robert E.
1990-01-01
The technology challenges faced by the high-speed inlet designer are discussed by describing the considerations that went into the design of the Mach 5 research inlet. It is shown that the emerging three-dimensional viscous computational fluid dynamics (CFD) flow codes, together with small-scale experiments, can be used to guide larger-scale full inlet systems research. In turn, the results of the large-scale research, if properly instrumented, can be used to validate or at least calibrate the CFD codes.
Validating a Technology Enhanced Student-Centered Learning Model
ERIC Educational Resources Information Center
Kang, Myunghee; Hahn, Jungsun; Chung, Warren
2015-01-01
The Technology Enhanced Student Centered Learning (TESCL) Model in this study presents the core factors that ensure the quality of learning in a technology-supported environment. Although the model was conceptually constructed using a student-centered learning framework and drawing upon previous studies, it should be validated through real-world…
Consequential Validity of an Assistive Technology Supplement for the School Function Assessment
ERIC Educational Resources Information Center
Silverman, Michelle Kaye; Smith, Roger O.
2006-01-01
Educators and therapists implement assistive technology to maximize educational outcomes of students with disabilities. However, few measure the outcomes of interventions because of a lack of valid measurement tools. This study investigated whether an assistive technology supplement for the School Function Assessment demonstrates an important…
SenSyF Experience on Integration of EO Services in a Generic, Cloud-Based EO Exploitation Platform
NASA Astrophysics Data System (ADS)
Almeida, Nuno; Catarino, Nuno; Gutierrez, Antonio; Grosso, Nuno; Andrade, Joao; Caumont, Herve; Goncalves, Pedro; Villa, Guillermo; Mangin, Antoine; Serra, Romain; Johnsen, Harald; Grydeland, Tom; Emsley, Stephen; Jauch, Eduardo; Moreno, Jose; Ruiz, Antonio
2016-08-01
SenSyF is a cloud-based data processing framework for EO-based services. It has been a pioneer in addressing Big Data issues from the Earth Observation point of view, and is a precursor of several of the technologies and methodologies that will be deployed in ESA's Thematic Exploitation Platforms and other related systems. The SenSyF system focuses on developing fully automated data management, together with access to a processing and exploitation framework that includes Earth Observation-specific tools. SenSyF is both a development and validation platform for data-intensive applications using Earth Observation data. With SenSyF, scientific, institutional, or commercial organizations developing EO-based applications and services can take advantage of distributed computational and storage resources tailored for applications dependent on big Earth Observation data, without resorting to deep infrastructure and technological investments. This paper describes the integration process and the experience gathered from different EO service providers during the project.
MICROROC: MICRO-mesh gaseous structure Read-Out Chip
NASA Astrophysics Data System (ADS)
Adloff, C.; Blaha, J.; Chefdeville, M.; Dalmaz, A.; Drancourt, C.; Dulucq, F.; Espargilière, A.; Gaglione, R.; Geffroy, N.; Jacquemier, J.; Karyotakis, Y.; Martin-Chassard, G.; Prast, J.; Seguin-Moreau, N.; de La Taille, Ch; Vouters, G.
2012-01-01
MICRO MEsh GAseous Structure (MICROMEGAS) and Gas Electron Multiplier (GEM) detectors are two candidates for the active medium of a Digital Hadronic CALorimeter (DHCAL) as part of a high energy physics experiment at a future linear collider (ILC/CLIC). Physics requirements lead to a highly granular hadronic calorimeter with up to thirty million channels, probably carrying only hit information (a digital readout calorimeter). To validate the concept of digital hadronic calorimetry with such a small cell size, the construction and test of a cubic-meter technological prototype, made of 40 planes of one square meter each, is necessary. This technological prototype would contain about 400,000 electronic channels, thus requiring the development of a front-end ASIC. Based on the experience gained with previous ASICs that were mounted on detectors and tested in particle beams, a new ASIC called MICROROC has been developed. This paper summarizes the characterization campaign conducted on this new chip, as well as its integration into a large-area Micromegas chamber of one square meter.
Implementation of quantum key distribution network simulation module in the network simulator NS-3
NASA Astrophysics Data System (ADS)
Mehic, Miralem; Maurhart, Oliver; Rass, Stefan; Voznak, Miroslav
2017-10-01
As research in quantum key distribution (QKD) technology grows larger and becomes more complex, highly accurate and scalable simulation technologies become important for assessing practical feasibility and foreseeing difficulties in the practical implementation of theoretical achievements. Because the QKD link requires both an optical and an Internet connection between the network nodes, deploying a complete testbed containing multiple network hosts and links to validate and verify a given network algorithm or protocol would be very costly. Network simulators in these circumstances save vast amounts of money and time in accomplishing such a task. The simulation environment offers the creation of complex network topologies, a high degree of control, and repeatable experiments, which in turn allows researchers to conduct experiments and confirm their results. In this paper, we describe the design of a QKD network simulation module developed for version 3 of the network simulator NS-3. The module supports simulation of the QKD network in an overlay mode or in a single TCP/IP mode, and it can therefore also be used to simulate other network technologies independently of QKD.
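What makes a QKD link different from an ordinary link is that encrypted traffic consumes secret key material, which is replenished only at the (comparatively slow) quantum key rate. The toy sketch below illustrates that buffer dynamic; it is a didactic illustration with invented rates, not the NS-3 module's API.

```python
def simulate_link(key_rate_bps: float, traffic_bps: float,
                  buffer_bits: float = 0.0, seconds: int = 10) -> list:
    """Return per-second key-buffer levels; traffic stalls when keys run out."""
    levels = []
    for _ in range(seconds):
        buffer_bits += key_rate_bps                # quantum channel refills the buffer
        consumed = min(traffic_bps, buffer_bits)   # can't encrypt without key material
        buffer_bits -= consumed
        levels.append(buffer_bits)
    return levels

print(simulate_link(key_rate_bps=1e3, traffic_bps=5e3))  # starved link: buffer pinned at 0
print(simulate_link(key_rate_bps=1e4, traffic_bps=5e3))  # healthy link: buffer grows
```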
Attentional models of multitask pilot performance using advanced display technology.
Wickens, Christopher D; Goh, Juliana; Helleberg, John; Horrey, William J; Talleur, Donald A
2003-01-01
In the first part of the reported research, 12 instrument-rated pilots flew a high-fidelity simulation, in which air traffic control presentation of auditory (voice) information regarding traffic and flight parameters was compared with advanced display technology presentation of equivalent information regarding traffic (cockpit display of traffic information) and flight parameters (data link display). Redundant combinations were also examined while pilots flew the aircraft simulation, monitored for outside traffic, and read back communications messages. The data suggested a modest cost for visual presentation over auditory presentation, a cost mediated by head-down visual scanning, and no benefit for redundant presentation. The effects in Part 1 were modeled by multiple-resource and preemption models of divided attention. In the second part of the research, visual scanning in all conditions was fit by an expected value model of selective attention derived from a previous experiment. This model accounted for 94% of the variance in the scanning data and 90% of the variance in a second validation experiment. Actual or potential applications of this research include guidance on choosing the appropriate modality for presenting in-cockpit information and understanding task strategies induced by introducing new aviation technology.
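The abstract does not spell out the model's form; as a generic illustration, an expected-value account predicts dwell proportions from the product of channel bandwidth (event rate) and task value, which can then be scored against observed scan data with R^2. All numbers below are invented, not the study's.

```python
import numpy as np

bandwidth = np.array([0.8, 0.4, 0.2, 0.6])    # event rate per display channel
value = np.array([0.9, 0.5, 0.3, 0.7])        # task-importance weights

predicted = bandwidth * value
predicted /= predicted.sum()                   # normalize to dwell proportions

observed = np.array([0.42, 0.12, 0.05, 0.41])  # hypothetical eye-tracking shares
ss_res = ((observed - predicted) ** 2).sum()
ss_tot = ((observed - observed.mean()) ** 2).sum()
print("R^2 =", round(1 - ss_res / ss_tot, 2))  # the study reports ~0.90-0.94 fits
```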
NASA Technical Reports Server (NTRS)
Maul, William A.; Chicatelli, Amy; Fulton, Christopher E.; Balaban, Edward; Sweet, Adam; Hayden, Sandra Claire; Bajwa, Anupa
2005-01-01
The Propulsion IVHM Technology Experiment (PITEX) has been an on-going research effort conducted over several years. PITEX has developed and applied a model-based diagnostic system for the main propulsion system of the X-34 reusable launch vehicle, a space-launch technology demonstrator. The application was simulation-based using detailed models of the propulsion subsystem to generate nominal and failure scenarios during captive carry, which is the most safety-critical portion of the X-34 flight. Since no system-level testing of the X-34 Main Propulsion System (MPS) was performed, these simulated data were used to verify and validate the software system. Advanced diagnostic and signal processing algorithms were developed and tested in real-time on flight-like hardware. In an attempt to expose potential performance problems, these PITEX algorithms were subject to numerous real-world effects in the simulated data including noise, sensor resolution, command/valve talkback information, and nominal build variations. The current research has demonstrated the potential benefits of model-based diagnostics, defined the performance metrics required to evaluate the diagnostic system, and studied the impact of real-world challenges encountered when monitoring propulsion subsystems.
Propulsion IVHM Technology Experiment
NASA Technical Reports Server (NTRS)
Chicatelli, Amy K.; Maul, William A.; Fulton, Christopher E.
2006-01-01
The Propulsion IVHM Technology Experiment (PITEX) successfully demonstrated real-time fault detection and isolation of a virtual reusable launch vehicle (RLV) main propulsion system (MPS). Specifically, the PITEX research project developed and applied a model-based diagnostic system for the MPS of the X-34 RLV, a space-launch technology demonstrator. The demonstration was simulation-based using detailed models of the propulsion subsystem to generate nominal and failure scenarios during captive carry, which is the most safety-critical portion of the X-34 flight. Since no system-level testing of the X-34 Main Propulsion System (MPS) was performed, these simulated data were used to verify and validate the software system. Advanced diagnostic and signal processing algorithms were developed and tested in real time on flight-like hardware. In an attempt to expose potential performance problems, the PITEX diagnostic system was subjected to numerous realistic effects in the simulated data including noise, sensor resolution, command/valve talkback information, and nominal build variations. In all cases, the PITEX system performed as required. The research demonstrated potential benefits of model-based diagnostics, defined performance metrics required to evaluate the diagnostic system, and studied the impact of real-world challenges encountered when monitoring propulsion subsystems.
Commercial involvement in the development of space-based plant growing technology
NASA Astrophysics Data System (ADS)
Bula, R. J.; Tibbitts, T. W.; Morrow, R. C.; Dinauer, W. R.
1992-07-01
Considerable technological progress has been made in the development of controlled environment facilities for plant growth. Although not all of the technology used for terrestrial facilities is applicable to space-based plant growth facilities, the information resident in the commercial organizations that market these facilities can provide a significant resource for the development of the plant growing component of a CELSS. In 1985, NASA initiated an effort termed the Centers for the Commercial Development of Space (CCDS). This program endeavors to develop cooperative research and technology development programs with industrial companies that capitalize on the strengths of industry-university working relationships. One of these CCDSs, the Wisconsin Center for Space Automation and Robotics (WCSAR), deals with developing automated plant growth facilities for space, in cooperation with several industrial partners. Concepts have been developed with industrial partners for the irradiation, water and nutrient delivery, nutrient composition control and automation and robotics subsystems of plant growing units. Space flight experiments are planned for validation of the concepts in a space environment.
Commercial involvement in the development of space-based plant growing technology.
Bula, R J; Tibbitts, T W; Morrow, R C; Dinauer, W R
1992-01-01
Considerable technological progress has been made in the development of controlled environment facilities for plant growth. Although not all of the technology used for terrestrial facilities is applicable to space-based plant growth facilities, the information resident in the commercial organizations that market these facilities can provide a significant resource for the development of the plant growing component of a CELSS. In 1985, NASA initiated an effort termed the Centers for the Commercial Development of Space (CCDS). This program endeavors to develop cooperative research and technology development programs with industrial companies that capitalize on the strengths of industry-university working relationships. One of these CCDSs, the Wisconsin Center for Space Automation and Robotics (WCSAR), deals with developing automated plant growth facilities for space, in cooperation with several industrial partners. Concepts have been developed with industrial partners for the irradiation, water and nutrient delivery, nutrient composition control and automation and robotics subsystems of plant growing units. Space flight experiments are planned for validation of the concepts in a space environment.
Experience with The Use of Warm Mix Asphalt Additives in Bitumen Binders
NASA Astrophysics Data System (ADS)
Cápayová, Silvia; Unčík, Stanislav; Cihlářová, Denisa
2018-03-01
In most European countries, Hot Mix Asphalt (HMA) technology is still used as the standard for the production and processing of bituminous mixtures. However, from the perspective of environmental acceptability, global warming and greenhouse gas production, Slovakia is making an effort to put into practice modern technologies characterized by lower energy consumption and reduced negative impacts on the environment. Warm mix asphalt (WMA) technologies, which have been verified at the Department of Transportation Engineering laboratory, Faculty of Civil Engineering, Slovak University of Technology (FCE, SUT), can provide the required mixture properties and can be used not only for the construction of new roads, but also for their renovation and reconstruction. The paper was created in cooperation with the Technical University of Ostrava, Czech Republic, which also deals with the addition of additives to asphalt mixtures and binders. It describes a comparison of the impact of some organic and chemical additives on the properties of commonly used bitumen binders in accordance with valid standards and technical regulations.
ERIC Educational Resources Information Center
Deng, Feng; Chai, Ching Sing; So, Hyo-Jeong; Qian, Yangyi; Chen, Lingling
2017-01-01
While various quantitative measures for assessing teachers' technological pedagogical content knowledge (TPACK) have developed rapidly, few studies to date have comprehensively validated the structure of TPACK through various criteria of validity especially for content specific areas. In this paper, we examined how the TPACK survey measure is…
Opportunistic Mobility Support for Resource Constrained Sensor Devices in Smart Cities
Granlund, Daniel; Holmlund, Patrik; Åhlund, Christer
2015-01-01
A multitude of wireless sensor devices and technologies are being developed and deployed in cities all over the world. Sensor applications in city environments may include highly mobile installations that span large areas which necessitates sensor mobility support. This paper presents and validates two mechanisms for supporting sensor mobility between different administrative domains. Firstly, EAP-Swift, an Extensible Authentication Protocol (EAP)-based sensor authentication protocol is proposed that enables light-weight sensor authentication and key generation. Secondly, a mechanism for handoffs between wireless sensor gateways is proposed. We validate both mechanisms in a real-life study that was conducted in a smart city environment with several fixed sensors and moving gateways. We conduct similar experiments in an industry-based anechoic Long Term Evolution (LTE) chamber with an ideal radio environment. Further, we validate our results collected from the smart city environment against the results produced under ideal conditions to establish best and real-life case scenarios. Our results clearly validate that our proposed mechanisms can facilitate efficient sensor authentication and handoffs while sensors are roaming in a smart city environment. PMID:25738767
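To make the authentication-and-key-generation pattern concrete, here is a minimal sketch of a pre-shared-key challenge-response with session-key derivation, which is the general cryptographic shape of lightweight EAP methods. It is emphatically not the EAP-Swift wire protocol; the message contents and derivation labels are assumptions for illustration.

```python
# Minimal sketch of lightweight challenge-response authentication with
# session-key derivation, in the spirit of the EAP-based scheme above.
# Uses only the Python standard library; NOT the EAP-Swift specification.
import hmac, hashlib, os

PSK = os.urandom(32)  # pre-shared key provisioned on sensor and server

def auth_proof(psk, server_nonce, sensor_nonce):
    # proof of possession of the PSK without ever transmitting it
    return hmac.new(psk, server_nonce + sensor_nonce, hashlib.sha256).digest()

def derive_session_key(psk, server_nonce, sensor_nonce):
    # both sides derive the same per-session key for subsequent traffic
    return hmac.new(psk, b"session" + server_nonce + sensor_nonce,
                    hashlib.sha256).digest()

server_nonce, sensor_nonce = os.urandom(16), os.urandom(16)
proof = auth_proof(PSK, server_nonce, sensor_nonce)
assert hmac.compare_digest(proof, auth_proof(PSK, server_nonce, sensor_nonce))
key = derive_session_key(PSK, server_nonce, sensor_nonce)
print("authenticated; session key:", key.hex()[:16], "...")
# On handoff to a new gateway only the session key (not the PSK) need move,
# which is one way such schemes keep gateway-to-gateway handoffs cheap.
```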
Dozier, Samantha; Brown, Jeffrey; Currie, Alistair
2011-11-29
In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.
Opportunistic mobility support for resource constrained sensor devices in smart cities.
Granlund, Daniel; Holmlund, Patrik; Åhlund, Christer
2015-03-02
A multitude of wireless sensor devices and technologies are being developed and deployed in cities all over the world. Sensor applications in city environments may include highly mobile installations that span large areas which necessitates sensor mobility support. This paper presents and validates two mechanisms for supporting sensor mobility between different administrative domains. Firstly, EAP-Swift, an Extensible Authentication Protocol (EAP)-based sensor authentication protocol is proposed that enables light-weight sensor authentication and key generation. Secondly, a mechanism for handoffs between wireless sensor gateways is proposed. We validate both mechanisms in a real-life study that was conducted in a smart city environment with several fixed sensors and moving gateways. We conduct similar experiments in an industry-based anechoic Long Term Evolution (LTE) chamber with an ideal radio environment. Further, we validate our results collected from the smart city environment against the results produced under ideal conditions to establish best and real-life case scenarios. Our results clearly validate that our proposed mechanisms can facilitate efficient sensor authentication and handoffs while sensors are roaming in a smart city environment.
1998-09-04
Workers watch as the Hubble Space Telescope Orbiting Systems Test (HOST) is lowered onto a workstand in the Space Station Processing Facility. To the right can be seen the Rack Insertion Device and Leonardo, a Multi-Purpose Logistics Module. The HOST platform, one of the payloads on the STS-95 mission, is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry three other payloads: the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker, and the SPACEHAB single module with experiments on space flight and the aging process
1998-09-23
KENNEDY SPACE CENTER, FLA. -- The Hubble Space Telescope Orbiting Systems Test (HOST), one of the payloads on the STS-95 mission, is placed inside its payload canister in the Space Station Processing Facility. The canister is 65 feet long, 18 feet wide and 18 feet, 7 inches high. The HOST platform is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry other payloads such as the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker (IEH-3), and the SPACEHAB single module with experiments on space flight and the aging process
1998-09-23
KENNEDY SPACE CENTER, FLA. -- The Hubble Space Telescope Orbiting Systems Test (HOST), one of the payloads on the STS-95 mission, is suspended above its payload canister in the Space Station Processing Facility. The canister is 65 feet long, 18 feet wide and 18 feet, 7 inches high. The HOST platform is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry other payloads such as the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker (IEH-3), and the SPACEHAB single module with experiments on space flight and the aging process
1998-09-23
KENNEDY SPACE CENTER, FLA. -- The Hubble Space Telescope Orbiting Systems Test Platform (HOST) is lifted off its work stand in the Space Station Processing Facility before moving it to its payload canister. One of the payloads on the STS-95 mission, the HOST platform is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an Earth-orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry other payloads such as the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker (IEH-3), and the SPACEHAB single module with experiments on space flight and the aging process
The Zero Boil-Off Tank Experiment Contributions to the Development of Cryogenic Fluid Management
NASA Technical Reports Server (NTRS)
Chato, David J.; Kassemi, Mohammad
2015-01-01
The Zero Boil-Off Tank (ZBOT) experiment is a small-scale ISS experiment to study tank pressurization and pressure control in microgravity. It consists of a vacuum-jacketed test tank filled with an inert fluorocarbon simulant liquid. Heaters and thermoelectric coolers are used in conjunction with an axial jet mixer flow loop to study a range of thermal conditions within the tank. The objective is to provide a high-quality database of low-gravity fluid motions and thermal transients for validating Computational Fluid Dynamics (CFD) models, which can then be used to predict behavior in larger systems with cryogens. This paper discusses the current status of the ZBOT experiment as it approaches flight and installation on the International Space Station, how its findings can be scaled to larger and more ambitious cryogenic fluid management experiments, and ideas for follow-on investigations using ZBOT-like hardware to study other aspects of cryogenic fluid management.
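As context for why a validated CFD model is needed, the sketch below shows the kind of zeroth-order lumped estimate of tank self-pressurization that ignores thermal stratification entirely; experiments like ZBOT supply the data to quantify how far such simplified models are from reality. The heat leak and all fluid properties below are assumed round numbers, not ZBOT parameters.

```python
# Back-of-the-envelope lumped ("homogeneous") model of self-pressurization
# of a closed tank under a constant heat leak: the bulk liquid warms
# uniformly and pressure follows the saturation curve via the
# Clausius-Clapeyron relation. All values are illustrative assumptions.
Q_leak = 1.0        # W, heat leak into the tank (assumed)
m_liq = 2.0         # kg of liquid (assumed)
cp = 1100.0         # J/(kg K), liquid specific heat (assumed)
h_fg = 90e3         # J/kg, latent heat of vaporization (assumed)
R_spec = 42.0       # J/(kg K), specific gas constant of the vapor (assumed)
T, P = 300.0, 50e3  # K, Pa: initial saturation state (assumed)

dt, t_end, t = 10.0, 6 * 3600.0, 0.0
while t < t_end:
    dT = Q_leak * dt / (m_liq * cp)        # uniform bulk heating
    # Clausius-Clapeyron: dP/dT = P * h_fg / (R_spec * T^2) on saturation
    P += P * h_fg / (R_spec * T**2) * dT
    T += dT
    t += dt
print(f"after {t_end/3600:.0f} h: T = {T:.1f} K, P = {P/1e3:.1f} kPa")
```

In a real tank, buoyancy-driven (or, in microgravity, surface-tension- and jet-driven) stratification concentrates heat near the interface, so measured pressure histories depart from this uniform-temperature estimate; that departure is precisely what the CFD must capture.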
Hypersonic Inflatable Aerodynamic Decelerator (HIAD) Technology Development Overview
NASA Technical Reports Server (NTRS)
Hughes, Stephen J.; Cheatwood, F. McNeil; Calomino, Anthony M.; Wright, Henry S.
2013-01-01
The successful flight of the Inflatable Reentry Vehicle Experiment (IRVE)-3 has further demonstrated the potential value of Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This technology development effort is funded by NASA's Space Technology Mission Directorate (STMD) Game Changing Development Program (GCDP). This paper provides an overview of a multi-year HIAD technology development effort, detailing the projects completed to date and the additional testing planned for the future. The effort was divided into three areas: Flexible Systems Development (FSD), Advanced Entry Concepts (AEC), and Flight Validation. FSD consists of a Flexible Thermal Protection Systems (FTPS) element, which is investigating high temperature materials, coatings, and additives for use in the bladder, insulator, and heat shield layers, and an Inflatable Structures (IS) element, which includes manufacture and testing (laboratory and wind tunnel) of inflatable structures and their associated structural elements. AEC consists of the Mission Applications element, which develops concepts (including payload interfaces) for missions at multiple destinations to demonstrate the benefits of and need for HIAD technology, as well as the Next Generation Subsystems element. Ground test development has been pursued in parallel with the Flight Validation IRVE-3 flight test. A larger scale (6 m diameter) HIAD inflatable structure was constructed and aerodynamically tested in the National Full-Scale Aerodynamics Complex (NFAC) 40 ft by 80 ft test section along with a duplicate of the IRVE-3 3 m article. Both the 6 m and 3 m articles were tested with instrumented aerodynamic covers which incorporated an array of pressure taps to capture the surface pressure distribution and validate Computational Fluid Dynamics (CFD) model predictions. The 3 m article also had a duplicate IRVE-3 Thermal Protection System (TPS) to test in addition to the Aerocover configuration. Both the Aerocovers and the TPS were populated with high contrast targets so that photogrammetric solutions of the loaded surface could be created. These solutions both refined the aerodynamic shape for CFD modeling and provided a deformed shape to validate structural Finite Element Analysis (FEA) models. Extensive aerothermal testing has been performed on the TPS candidates in several facilities across the country, the majority of it in the Boeing Large Core Arc Tunnel (LCAT). HIAD is continuing to mature testing methodology in this facility and is developing new test sample fixtures and control methodologies to improve understanding and quality of the environments to which the samples are subjected. Additional testing has been and continues to be performed in the NASA LaRC 8-Foot High Temperature Tunnel, where samples up to 2 ft by 2 ft are being tested over representative underlying structures incorporating construction features such as sewn seams and through-thickness quilting. With the successful completion of the IRVE-3 flight demonstration, mission planning efforts are ramping up on the development of the HIAD Earth Atmospheric Reentry Test (HEART), which will demonstrate a relevant scale vehicle in relevant environments via a large-scale aeroshell (approximately 8.5 m) entering at orbital velocity (approximately 7 km/sec) with an entry mass on the order of 4 MT.
Also, the Build to Print (BTP) hardware, built as risk mitigation for the IRVE-3 project so that a "spare" would be ready to go in the event of a launch vehicle delivery failure, is now available for an additional sub-orbital flight experiment. Mission planning is underway to define a mission that can utilize this existing hardware and help the HIAD project further mature this technology.
OECD-NEA Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valentine, Timothy; Rohatgi, Upendra S.
High-fidelity, multi-physics modeling and simulation (M&S) tools are being developed and utilized for a variety of applications in nuclear science and technology and show great promise in their abilities to reproduce observed phenomena for many applications. Even with the increasing fidelity and sophistication of coupled multi-physics M&S tools, the underpinning models and data still need to be validated against experiments that may require a more complex array of validation data because of the great breadth of the time, energy and spatial domains of the physical phenomena that are being simulated. The Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation (MPEBV) of the Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) was formed to address the challenges with the validation of such tools. The work of the MPEBV expert group is shared among three task forces to fulfill its mandate, and specific exercises are being developed to demonstrate validation principles for common industrial challenges. This paper describes the overall mission of the group, the specific objectives of the task forces, the linkages among the task forces, and the development of a validation exercise that focuses on a specific reactor challenge problem.
Comparison of Requirements for Composite Structures for Aircraft and Space Applications
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Elliot, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.
2010-01-01
In this report, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry and to adopt methodology that is applicable for space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from aircraft and other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.
Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)
NASA Technical Reports Server (NTRS)
Folta, David C.; Hawkins, Albin; Bauer, Frank H. (Technical Monitor)
2001-01-01
NASA's first autonomous formation flying mission completed its primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center (GSFC) implemented a universal 3-axis formation flying algorithm in an autonomous executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard flight design and presents the validation results of this unique system. Results from functionality assessment through fully autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a standalone algorithm.
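For orientation, formation-flying guidance of this kind is usually designed against the Clohessy-Wiltshire (Hill) linearized relative-motion equations. The sketch below propagates those equations for a leader-follower geometry; it is a textbook illustration, not EO-1's AutoCon flight code, and the orbit rate and offsets are assumed values.

```python
# Illustrative propagation of the Clohessy-Wiltshire (Hill) equations,
# the standard linearized model of relative motion between two spacecraft
# in near-circular orbits. Nothing here is EO-1 flight software.
import numpy as np

n = 0.00106  # rad/s, approximate mean motion of a ~700 km LEO orbit

def cw_step(state, dt, accel=np.zeros(3)):
    """One Euler step of the CW equations.
    state = [x, y, z, vx, vy, vz] in the rotating LVLH frame
    (x radial, y along-track, z cross-track)."""
    x, y, z, vx, vy, vz = state
    ax = 3 * n**2 * x + 2 * n * vy + accel[0]
    ay = -2 * n * vx + accel[1]
    az = -(n**2) * z + accel[2]
    return state + dt * np.array([vx, vy, vz, ax, ay, az])

# Pure along-track separation is an equilibrium of the CW dynamics,
# which is why leader-follower formations station-keep along-track:
state = np.array([0.0, 450e3, 0.0, 0.0, 0.0, 0.0])  # ~450 km behind leader
for _ in range(int(5400 / 0.5)):                    # ~one orbit, dt = 0.5 s
    state = cw_step(state, 0.5)
print("relative position after one orbit [m]:", np.round(state[:3], 1))
```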
Kuiper, RuthAnne
2010-01-01
The utility of personal digital assistants (PDAs) as a point-of-care resource in health care practice and education presents new challenges for nursing faculty. While there is a plethora of PDA resources available, little is known about the variables that affect student learning and technology adoption. In this study nursing students used PDA software programs, including a drug guide, a medical dictionary, a laboratory manual, and a nursing diagnosis manual, during acute care clinical experiences. Analyses of students' comparative reflective journal statements about the PDA as an adjunct to other available resources in clinical practice are presented. The benefits of having a PDA included readily available data, validation of thinking processes, and facilitation of care plan re-evaluation. Students reported increased frequency of use and independence. Significant correlations between user perceptions and computer self-efficacy suggested greater confidence in abilities with technology, resulting in increased self-awareness and achievement of learning outcomes.
Development of stimulation diagnostic technology. Annual report, May 1990--December 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.; Lorenz, J.C.
The objective of this project is to apply Sandia's expertise and technology towards the development of stimulation diagnostic technology in the areas of in situ stress, natural fracturing, stimulation processes and instrumentation systems. Initial work has concentrated on experiment planning for a site where hydraulic fracturing could be evaluated and design models and fracture diagnostics could be validated and improved. Important issues have been defined and new diagnostics, such as inclinometers, identified. In the area of in situ stress, circumferential velocity analysis is proving to be a useful diagnostic for stress orientation. Natural fracture studies of the Frontier formation are progressing; two fracture sets have been found and their relation to tectonic events has been hypothesized. Analyses of stimulation data have been performed for several sites, primarily for in situ stress information. Some new ideas in stimulation diagnostics have been proposed; these ideas may significantly improve fracture diagnostic capabilities.
Development of stimulation diagnostic technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.; Lorenz, J.C.
The objective of this project is to apply Sandia's expertise and technology towards the development of stimulation diagnostic technology in the areas of in situ stress, natural fracturing, stimulation processes and instrumentation systems. Initial work has concentrated on experiment planning for a site where hydraulic fracturing could be evaluated and design models and fracture diagnostics could be validated and improved. Important issues have been defined and new diagnostics, such as inclinometers, identified. In the area of in situ stress, circumferential velocity analysis is proving to be a useful diagnostic for stress orientation. Natural fracture studies of the Frontier formation are progressing; two fracture sets have been found and their relation to tectonic events has been hypothesized. Analyses of stimulation data have been performed for several sites, primarily for in situ stress information. Some new ideas in stimulation diagnostics have been proposed; these ideas may significantly improve fracture diagnostic capabilities.
Chang, Chiung-Sui
2008-01-01
The purpose of this study was to describe the development and validation of an instrument to identify various dimensions of the computer technology literacy self-assessment scale (CTLS) for elementary school students. The instrument included five CTLS dimensions (subscales): technology operation skills, computer usage concepts, attitudes toward computer technology, learning with technology, and Internet operation skills. Participants were 1,539 elementary school students in Taiwan. Data analysis indicated that the instrument developed in the study had satisfactory validity and reliability. Correlation analysis supported the legitimacy of using multiple dimensions in representing students' computer technology literacy. Significant differences were found between male and female students, and between grades, on some CTLS dimensions. Suggestions are made for use of the instrument to examine the complicated interplay between students' computer behaviors and their computer technology literacy.
Fernandez, Patrick G; Brockel, Megan A; Lipscomb, Lisa L; Ing, Richard J; Tailounie, Muayyad
2017-07-15
Effective communication with patients is essential to quality care. Language barriers significantly impede such communication and can increase the risk of poor patient outcomes. Smartphones and mobile health technology are valuable resources that are beginning to break down language barriers in health care. We present a case of a challenging language barrier in which successful perioperative communication was achieved using mobile technology. Although quite beneficial, use of technology that is not validated exposes providers to unnecessary medicolegal risk. We hope to highlight the need for validation of such technology to ensure that these tools are an effective way to accurately communicate with patients in the perioperative setting.
In vivo performance of a microelectrode neural probe with integrated drug delivery
Rohatgi, Pratik; Langhals, Nicholas B.; Kipke, Daryl R.; Patil, Parag G.
2014-01-01
Object: The availability of sophisticated neural probes is a key prerequisite in the development of future brain-machine interfaces (BMI). In this study, we developed and validated a neural probe design capable of simultaneous drug delivery and electrophysiology recordings in vivo. Focal drug delivery has promise to dramatically extend the recording lives of neural probes, a limiting factor to clinical adoption of BMI technology. Methods: To form the multifunctional neural probe, we affixed a 16-channel microfabricated silicon electrode array to a fused silica catheter. Three experiments were conducted to characterize the performance of the device. Experiment 1 examines cellular damage from probe insertion and the drug distribution in tissue. Experiment 2 measures the effects of saline infusions delivered through the probe on concurrent electrophysiology. Experiment 3 demonstrates that a physiologically relevant amount of drug can be delivered in a controlled fashion. For these experiments, Hoechst and propidium iodide were used to assess insertion trauma and the tissue distribution of the infusate. Artificial cerebrospinal fluid and tetrodotoxin were injected to determine the efficacy of drug delivery. Results: The newly developed multifunctional neural probes were successfully inserted into rat cortex and were able to deliver fluids and drugs that resulted in the expected electrophysiological and histological responses. The damage from insertion of the device into brain tissue was substantially less than the volume of drug dispersion in tissue. Electrophysiological activity, including both individual spikes as well as local field potentials, was successfully recorded with this device during real-time drug delivery. No significant changes were seen in response to delivery of artificial cerebrospinal fluid as a control experiment, whereas delivery of tetrodotoxin produced the expected result of suppressing all spiking activity in the vicinity of the catheter outlet. Conclusions: Multifunctional neural probes such as the ones developed and validated within this study have great potential to help further understand the design space and criteria for the next generation of neural probe technology. By incorporating integrated drug delivery functionality into the probes, new treatment options for neurological disorders and regenerative neural interfaces utilizing localized and feedback-controlled delivery of drugs can be realized in the near future. PMID:19569896
Test and Demonstration Assets of New Mexico
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This document was developed by the Arrowhead Center of New Mexico State University as part of the National Security Preparedness Project (NSPP), funded by a DOE/NNSA grant. The NSPP has three primary components: business incubation, workforce development, and technology demonstration and validation. The document contains a survey of test and demonstration assets in New Mexico available for external users such as small businesses with security technologies under development. Demonstration and validation of national security technologies created by incubator sources, as well as other sources, are critical phases of technology development. The NSPP will support the utilization of an integrated demonstration and validation environment.
Adapting the Media and Technology Usage and Attitudes Scale to Turkish
ERIC Educational Resources Information Center
Özgür, Hasan
2016-01-01
Due to the requirement of a current, valid, and reliable assessment instrument for determining usage frequencies of technology-based media and the attitudes towards these, this study intends to determine the validity and reliability of the Media and Technology Usage and Attitudes Scale, developed by researchers from California State University,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Irminger, Philip; Starke, Michael R; Dimitrovski, Aleksandar D
2014-01-01
Power system equipment manufacturers and researchers continue to experiment with novel overhead electric conductor designs that support better conductor performance and address congestion issues. To address the technology gap in testing these novel designs, Oak Ridge National Laboratory constructed the Powerline Conductor Accelerated Testing (PCAT) facility to evaluate the performance of novel overhead conductors in an accelerated fashion in a field environment. Additionally, PCAT has the capability to test advanced sensors and measurement methods for assessing overhead conductor performance and condition. Equipped with extensive measurement and monitoring devices, PCAT provides a platform to improve/validate conductor computer models and assess the performance of novel conductors. The PCAT facility and its testing capabilities are described in this paper.
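As an indication of the kind of conductor model such a facility can validate, the sketch below solves a simplified steady-state heat balance (ohmic plus solar heating against convective plus radiative cooling) for the allowable current. IEEE 738 defines the full treatment; the lumped convection coefficient and all numbers here are illustrative assumptions, not PCAT data.

```python
# Simplified steady-state conductor heat balance of the kind conductor
# rating models implement: I^2 R + q_solar = q_conv + q_rad, solved for I.
# All coefficients and dimensions are illustrative assumptions.
import math

def ampacity(T_cond, T_amb, R_ac, diameter, emissivity=0.8,
             q_solar=15.0, h_conv=25.0):
    """Current (A) that holds the conductor at T_cond (deg C), per meter.
    h_conv is a lumped convection coefficient in W/(m^2 K)."""
    sigma = 5.670e-8                  # Stefan-Boltzmann, W/(m^2 K^4)
    area = math.pi * diameter         # surface area per meter, m^2
    q_conv = h_conv * area * (T_cond - T_amb)
    q_rad = emissivity * sigma * area * ((T_cond + 273.15)**4 -
                                         (T_amb + 273.15)**4)
    return math.sqrt(max(q_conv + q_rad - q_solar, 0.0) / R_ac)

# e.g. a ~28 mm conductor with R_ac = 7.3e-5 ohm/m at a 100 C limit:
print(f"ampacity ~ {ampacity(100.0, 40.0, 7.3e-5, 0.028):.0f} A")
```

Facilities like PCAT exist precisely because the convective term above is the weak link: it depends strongly on wind, conductor surface condition, and aging, which only instrumented field testing can pin down.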
Technology Readiness of the NEXT Ion Propulsion System
NASA Technical Reports Server (NTRS)
Benson, Scott W.; Patterson, Michael J.
2008-01-01
NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system has been in advanced technology development under the NASA In-Space Propulsion Technology project. The highest fidelity hardware planned has now been completed by the government/industry team, including: a flight prototype model (PM) thruster, an engineering model (EM) power processing unit, EM propellant management assemblies, a breadboard gimbal, and control unit simulators. Subsystem and system level technology validation testing is in progress. To achieve the objective of Technology Readiness Level 6, environmental testing is being conducted to qualification levels in ground facilities simulating the space environment. Additional tests have been conducted to characterize the performance range and life capability of the NEXT thruster. This paper presents the status and results of technology validation testing accomplished to date, the validated subsystem and system capabilities, and the plans for completion of this phase of NEXT development. The next round of competed planetary science mission announcements of opportunity, and directed mission decisions, are anticipated to occur in 2008 and 2009. Progress to date, and the success of on-going technology validation, indicate that the NEXT ion propulsion system will be a primary candidate for mission consideration in these upcoming opportunities.
Emmons, Karen M; Doubeni, Chyke A; Fernandez, Maria E; Miglioretti, Diana L; Samet, Jonathan M
2018-06-05
On 5 and 6 December 2017, the National Institutes of Health (NIH) convened the Pathways to Prevention Workshop: Methods for Evaluating Natural Experiments in Obesity to identify the status of methods for assessing natural experiments to reduce obesity, areas in which these methods could be improved, and research needs for advancing the field. This article considers findings from a systematic evidence review on methods for evaluating natural experiments in obesity, workshop presentations by experts and stakeholders, and public comment. Research gaps are identified, and recommendations related to 4 key issues are provided. Recommendations on population-based data sources and data integration include maximizing use and sharing of existing surveillance and research databases and ensuring significant effort to integrate and link databases. Recommendations on measurement include use of standardized and validated measures of obesity-related outcomes and exposures, systematic measurement of co-benefits and unintended consequences, and expanded use of validated technologies for measurement. Study design recommendations include improving guidance, documentation, and communication about methods used; increasing use of designs that minimize bias in natural experiments; and more carefully selecting control groups. Cross-cutting recommendations target activities that the NIH and other funders might undertake to improve the rigor of natural experiments in obesity, including training and collaboration on modeling and causal inference, promoting the importance of community engagement in the conduct of natural experiments, ensuring maintenance of relevant surveillance systems, and supporting extended follow-up assessments for exemplar natural experiments. To combat the significant public health threat posed by obesity, researchers should continue to take advantage of natural experiments. The recommendations in this report aim to strengthen evidence from such studies.
NASA Astrophysics Data System (ADS)
Liou, Pey-Yan; Kuo, Pei-Jung
2014-05-01
Background: Few studies have examined students' attitudinal perceptions of technology, and among existing instruments in the field of technology education there is no appropriate instrument to measure senior high school students' motivation and self-regulation toward technology learning. Purpose: The present study aims to validate an instrument for assessing senior high school students' motivation and self-regulation towards technology learning. Sample: A total of 1822 Taiwanese senior high school students (1020 males and 802 females) responded to the newly developed instrument. Design and method: The Motivation and Self-regulation towards Technology Learning (MSRTL) instrument was developed based on previous instruments measuring students' motivation and self-regulation towards science learning. Exploratory and confirmatory factor analyses were utilized to investigate the structure of the items. Cronbach's alpha was applied for measuring the internal consistency of each scale. Furthermore, multivariate analysis of variance was used to examine gender differences. Results: Seven scales, including 'Technology learning self-efficacy,' 'Technology learning value,' 'Technology active learning strategies,' 'Technology learning environment stimulation,' 'Technology learning goal-orientation,' 'Technology learning self-regulation-triggering,' and 'Technology learning self-regulation-implementing,' were confirmed for the MSRTL instrument. Moreover, the results also showed that male and female students did not present the same degree of preference in all of the scales. Conclusions: The MSRTL instrument, composed of seven scales corresponding to 39 items, was shown to be valid based on validity and reliability analyses. While male students tended to express more positive and active performance in the motivation scales, no gender differences were found in the self-regulation scales.
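For readers checking the reliability statistic mentioned above, the snippet below computes Cronbach's alpha from a respondents-by-items score matrix using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The Likert data are fabricated solely to demonstrate the computation.

```python
# Worked example of Cronbach's alpha, the internal-consistency statistic
# reported per scale in studies like the one above. Data are fabricated.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total score
    return k / (k - 1) * (1 - item_var / total_var)

scores = [[4, 5, 4, 4, 5],   # 6 respondents x 5 Likert items (invented)
          [3, 3, 4, 3, 3],
          [5, 5, 5, 4, 5],
          [2, 3, 2, 3, 2],
          [4, 4, 3, 4, 4],
          [3, 2, 3, 3, 3]]
print(f"alpha = {cronbach_alpha(scores):.2f}")   # high: consistent scale
```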
ERIC Educational Resources Information Center
Gogus, Aytac; Nistor, Nicolae; Riley, Richard W.; Lerche, Thomas
2012-01-01
The Unified Theory of Acceptance and Use of Technology (UTAUT; Venkatesh et al., 2003, 2012) proposes a major model of educational technology acceptance (ETA) which has so far been validated in only a few languages and cultures. Therefore, this study aims at extending the applicability of UTAUT to Turkish culture. Based on acceptance and cultural data…
Pitteri, Sharon J.; Amon, Lynn M.; Buson, Tina Busald; Zhang, Yuzheng; Johnson, Melissa M.; Chin, Alice; Kennedy, Jacob; Wong, Chee-Hong; Zhang, Qing; Wang, Hong; Lampe, Paul D.; Prentice, Ross L.; McIntosh, Martin W.; Hanash, Samir M.; Li, Christopher I.
2010-01-01
Applying advanced proteomic technologies to prospectively collected specimens from large studies is one means of identifying preclinical changes in plasma proteins that are potentially relevant to the early detection of diseases like breast cancer. We conducted fourteen independent quantitative proteomics experiments comparing pooled plasma samples collected from 420 estrogen receptor positive (ER+) breast cancer patients ≤17 months prior to their diagnosis and matched controls. Based on the over 3.4 million tandem mass spectra collected in the discovery set, 503 proteins were quantified of which 57 differentiated cases from controls with a p-value<0.1. Seven of these proteins, for which quantitative ELISA assays were available, were assessed in an independent validation set. Of these candidates, epidermal growth factor receptor (EGFR) was validated as a predictor of breast cancer risk in an independent set of preclinical plasma samples for women overall [odds ratio (OR)=1.44, p-value=0.0008], and particularly for current users of estrogen plus progestin (E+P) menopausal hormone therapy (OR=2.49, p-value=0.0001). Among current E+P users EGFR's sensitivity for breast cancer risk was 31% with 90% specificity. While EGFR's sensitivity and specificity are insufficient for a clinically useful early detection biomarker, this study suggests that proteins that are elevated preclinically in women who go on to develop breast cancer can be discovered and validated using current proteomic technologies. Further studies are warranted to both examine the role of EGFR and to discover and validate other proteins that could potentially be used for breast cancer early detection. PMID:20959476
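The two figures of merit quoted above, an odds ratio and sensitivity at 90% specificity, can be reproduced on simulated data as follows; the marker distributions below are invented for illustration and bear no relation to the study's EGFR measurements.

```python
# Sketch of the two summary statistics quoted above: an odds ratio from a
# 2x2 case/control table, and sensitivity at a fixed 90% specificity
# cutoff. Marker values are simulated, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
controls = rng.normal(1.0, 0.3, 420)    # simulated marker levels
cases = rng.normal(1.25, 0.3, 420)      # cases shifted upward

# Sensitivity at 90% specificity: threshold = 90th percentile of controls
cut = np.quantile(controls, 0.90)
sensitivity = (cases > cut).mean()

# Odds ratio for "marker above cutoff" (2x2 table: exposed = above cut)
a, b = (cases > cut).sum(), (cases <= cut).sum()
c, d = (controls > cut).sum(), (controls <= cut).sum()
odds_ratio = (a * d) / (b * c)
print(f"sensitivity at 90% specificity: {sensitivity:.2f}, "
      f"OR = {odds_ratio:.2f}")
```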
Munn, Zachary; Jordan, Zoe
Background: When presenting to an imaging department, the person who is to be imaged is often in a vulnerable state, and out of their comfort zone. It is the role of the medical imaging technician to produce a high quality image and facilitate patient care throughout the imaging process. Qualitative research is necessary to better inform the medical imaging technician and to help them to understand the experience of the person being imaged. Some issues that have been identified in the literature include fear, claustrophobia, dehumanisation, and an uncomfortable or unusual experience. There is now a small but worthwhile qualitative literature base focusing on the patient experience in high technology imaging, but no current qualitative synthesis of that literature. It is therefore timely and worthwhile to produce a systematic review to identify and summarise the existent literature exploring the patient experience of high technology imaging. Objective: To identify the patient experience of high technology medical imaging. Inclusion criteria: Studies of a qualitative design that explored the phenomenon of interest, the patient experience of high technology medical imaging. Participants included anyone who had undergone one of these procedures. Search strategy: The search strategy aimed to find both published and unpublished studies, and was conducted over a period from June to September 2010. No time limits were imposed on this search strategy. A three-step search strategy was utilised in this review. Methodological quality: All studies that met the criteria were selected for retrieval. They were then assessed by two independent reviewers for methodological validity prior to inclusion in the review using standardised critical appraisal instruments from the Joanna Briggs Institute Qualitative Assessment and Review Instrument. Data extraction and synthesis: Data were extracted from papers included in the review using the standardised data extraction tool from the Joanna Briggs Institute Qualitative Assessment and Review Instrument, and research findings were pooled using the Qualitative Assessment and Review Instrument. Results: Following the search and critical appraisal processes, 15 studies were identified that were deemed of suitable quality to be included in the review. From these 15 studies, 127 findings were extracted, forming 33 categories and 11 synthesised findings. These synthesised findings related to the patient experience, the emotions patients felt (whether negative or positive), and the need for support and information, and highlighted the importance of imaging to the patient. Conclusions: The synthesised findings in this review highlight the diverse, unique and challenging ways in which people experience imaging with MRI and CT scanners. All health professionals involved in imaging need to be aware of the different ways each patient may experience imaging, and provide them with ongoing support and information. The implications for practice are derived directly from the results of the meta-synthesis and each of the 11 synthesised findings. There is still scope for further methodologically rigorous qualitative studies in this field, particularly in nuclear medicine imaging and Positron Emission Tomography, and in particular patient groups and age ranges. No studies were found assessing the experience of children undergoing high technology imaging.
Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.
2009-01-01
Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software, which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.
An industrial approach to design compelling VR and AR experience
NASA Astrophysics Data System (ADS)
Richir, Simon; Fuchs, Philippe; Lourdeaux, Domitile; Buche, Cédric; Querrec, Ronan
2013-03-01
The convergence of technologies currently observed in the fields of VR, AR, robotics and consumer electronics reinforces the trend of new applications appearing every day. But when transferring knowledge acquired from research to businesses, research laboratories are often at a loss because of a lack of knowledge of the design and integration processes involved in creating an industrial scale product. In fact, the innovation approaches that take a good idea from the laboratory to a successful industrial product are often little known to researchers. The objective of this paper is to present the results of the work of several research teams that have finalized a working method for researchers and manufacturers that allows them to design virtual or augmented reality systems and enable their users to enjoy "a compelling VR experience". That approach, called "the I2I method", presents 11 phases, from "Establishing technological and competitive intelligence and industrial property" to "Improvements" through the "Definition of the Behavioral Interface, Virtual Environment and Behavioral Software Assistance". As a result of the experience gained by various research teams, this design approach benefits from contributions from current VR and AR research. Our objective is to validate and continuously move such multidisciplinary design team methods forward.
How to Build a Hybrid Neurofeedback Platform Combining EEG and fMRI
Mano, Marsel; Lécuyer, Anatole; Bannier, Elise; Perronnet, Lorraine; Noorzadeh, Saman; Barillot, Christian
2017-01-01
Multimodal neurofeedback estimates brain activity using information acquired with more than one neurosignal measurement technology. In this paper we describe how to set up and use a hybrid platform based on simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI), then we illustrate how to use it for conducting bimodal neurofeedback experiments. The paper is intended for those willing to build a multimodal neurofeedback system, to guide them through the different steps of the design, setup, and experimental applications, and help them choose a suitable hardware and software configuration. Furthermore, it reports practical information from bimodal neurofeedback experiments conducted in our lab. The platform presented here has a modular parallel processing architecture that promotes real-time signal processing performance and simple future addition and/or replacement of processing modules. Various unimodal and bimodal neurofeedback experiments conducted in our lab showed high performance and accuracy. Currently, the platform is able to provide neurofeedback based on electroencephalography and functional magnetic resonance imaging, but the architecture and the working principles described here are valid for any other combination of two or more real-time brain activity measurement technologies. PMID:28377691
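To suggest what a "modular parallel processing architecture" can mean in practice, the sketch below runs two stand-in acquisition modules at different sample rates and fuses their latest outputs into a single feedback value. The module boundaries and the weighted-sum fusion rule are assumptions for illustration, not the authors' implementation.

```python
# Toy modular real-time pipeline: each modality runs in its own worker
# thread at its own rate, and a fusion stage combines the most recent
# estimates into one feedback value. Illustrative only; random numbers
# stand in for real EEG/fMRI activity estimates.
import queue, threading, time, random

def acquisition(name, rate_hz, out_q, stop):
    """Stand-in for an EEG or fMRI stream: one estimate per sample period."""
    while not stop.is_set():
        out_q.put((name, random.random()))
        time.sleep(1.0 / rate_hz)

eeg_q, fmri_q, stop = queue.Queue(), queue.Queue(), threading.Event()
threading.Thread(target=acquisition, args=("EEG", 8, eeg_q, stop),
                 daemon=True).start()
threading.Thread(target=acquisition, args=("fMRI", 1, fmri_q, stop),
                 daemon=True).start()

latest = {"EEG": 0.0, "fMRI": 0.0}
t0 = time.time()
while time.time() - t0 < 3.0:            # 3 s demo run
    for q in (eeg_q, fmri_q):
        try:
            name, value = q.get_nowait()
            latest[name] = value         # keep only the freshest estimate
        except queue.Empty:
            pass
    feedback = 0.5 * latest["EEG"] + 0.5 * latest["fMRI"]  # bimodal fusion
    print(f"feedback gauge: {feedback:.2f}", end="\r")
    time.sleep(0.1)
stop.set()
```

The design point this illustrates is the one the abstract emphasizes: because each modality is an isolated module behind a queue, a third measurement technology could be added, or fMRI swapped out, without touching the rest of the pipeline.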
Horsting, Julie M H; Dlouhy, Stephen R; Hanson, Katelyn; Quaid, Kimberly; Bai, Shaochun; Hines, Karrie A
2014-06-01
First identified in 1997, cell-free fetal DNA (cffDNA) has just recently been used to detect fetal aneuploidy of chromosomes 13, 18, and 21, showing its potential to revolutionize prenatal genetic testing as a non-invasive screening tool. Although this technological advancement is exciting and has certain medical applications, it has been unclear how it will be implemented in a clinical setting. Genetic counselors will likely be instrumental in answering that question, but to date, there is no published research regarding prenatal counselors' implementation of and experiences with cffDNA testing. We developed a 67-question survey to gather descriptive information from counselors regarding their personal opinions, experiences, thoughts, and concerns regarding the validity, usefulness, and implementation of this new technology. A total of 236 individuals completed a portion of the survey; not all respondents answered all questions. Qualitative questions complemented quantitative survey items, allowing respondents to voice their thoughts directly. Results indicate that counselors value cffDNA testing as a screening option but are concerned regarding how some obstetricians and patients make use of this testing. Further results, discussion, and practice implications are presented.
Validated simulator for space debris removal with nets and other flexible tethers applications
NASA Astrophysics Data System (ADS)
Gołębiowski, Wojciech; Michalczyk, Rafał; Dyrek, Michał; Battista, Umberto; Wormnes, Kjetil
2016-12-01
In the context of active debris removal technologies and preparation activities for the e.Deorbit mission, a simulator for the dynamics of net-shaped elastic bodies and their interactions with rigid bodies has been developed. Its main application is to aid net design and to test scenarios for space debris deorbitation. The simulator can model all the phases of the debris capturing process: net launch, flight and wrapping around the target. It handles coupled simulation of rigid and flexible body dynamics. Flexible bodies were implemented using the Cosserat rod model, which allows the simulation of flexible threads or wires with elasticity and damping for stretching, bending and torsion. Threads may be combined into structures of any topology, so the software is able to simulate nets, pure tethers, tether bundles, cages, trusses, etc. Full contact dynamics was implemented. Programmatic interaction with the simulation is possible, e.g. for control implementation. The underlying model has been experimentally validated; due to the significant influence of gravity, the experiment had to be performed in microgravity conditions. The validation experiment, flown on a parabolic flight, was a downscaled process of capturing Envisat: the prepacked net was launched towards the satellite model, expanded, hit the model and wrapped around it. The whole process was recorded with two fast stereographic camera sets for full 3D trajectory reconstruction. The trajectories were used to compare the net dynamics to the respective simulations and thus to validate the simulation tool. The experiments were performed on board a Falcon-20 aircraft operated by the National Research Council in Ottawa, Canada. Validation results show that the model reflects the physics of the phenomenon accurately enough that it may be used for scenario evaluation and mission design purposes. The functionalities of the simulator are described in detail in the paper, as well as its underlying model, sample cases and the methodology behind the validation. Results are presented and typical use cases are discussed, showing that the software may be used to design throw-nets for space debris capturing, but also to simulate the deorbitation process, the chaser control system or general interactions between rigid and elastic bodies, all in a convenient and efficient way. The presented work was led by SKA Polska under ESA contract, within the CleanSpace initiative.
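To give a flavor of the discretization such simulators rest on, the sketch below integrates a lumped-mass tether with spring-damper segments; a full Cosserat rod model adds bending, torsion, and contact on top of this stretching-only picture. All parameters are illustrative assumptions, not values from the simulator described above.

```python
# Toy lumped-mass tether in free (micro-g) motion: neighboring nodes are
# coupled by spring-dampers that carry only stretching, the simplest
# subset of a Cosserat rod's stretch/bend/twist behavior. Values assumed.
import numpy as np

N = 20                                 # nodes along the tether
L0, k, c, m = 1.0, 500.0, 2.0, 0.05    # rest length, stiffness, damping, node mass
pos = np.zeros((N, 3)); pos[:, 0] = np.arange(N) * L0
vel = np.zeros((N, 3))
vel[-1] = np.array([0.0, 5.0, 0.0])    # tip node thrown sideways (launch-like)

dt = 1e-3
for _ in range(5000):                  # 5 s of motion
    force = np.zeros((N, 3))
    for i in range(N - 1):             # spring-damper between neighbors
        d = pos[i + 1] - pos[i]
        length = np.linalg.norm(d)
        direction = d / length
        rel_v = np.dot(vel[i + 1] - vel[i], direction)
        f = (k * (length - L0) + c * rel_v) * direction
        force[i] += f
        force[i + 1] -= f
    vel += force / m * dt              # symplectic Euler: velocities first,
    pos += vel * dt                    # then positions
print("tip position after 5 s [m]:", np.round(pos[-1], 2))
```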
Workshop Report: Crystal City VI-Bioanalytical Method Validation for Biomarkers.
Arnold, Mark E; Booth, Brian; King, Lindsay; Ray, Chad
2016-11-01
With the growing focus on translational research and the use of biomarkers to drive drug development and approvals, biomarkers have become a significant area of research within the pharmaceutical industry. However, until the US Food and Drug Administration's (FDA) 2013 draft guidance on bioanalytical method validation included consideration of biomarker assays using LC-MS and LBA, those assays were created, validated, and used without standards of performance. This lack of expectations resulted in the FDA receiving data from assays of varying quality in support of efficacy and safety claims. The AAPS Crystal City VI (CC VI) Workshop in 2015 was held as the first forum for industry-FDA discussion around the general issues of biomarker measurements (e.g., endogenous levels) and specific technology strengths and weaknesses. The 2-day workshop served to develop a common understanding within the industrial scientific community of the issues around biomarkers, informed the FDA of the current state of the science, and will serve as a basis for further dialogue as both groups gain experience with biomarkers.
NASA Electronic Parts and Packaging Program
NASA Technical Reports Server (NTRS)
Kayali, Sammy
2000-01-01
NEPP program objectives are to: (1) Assess the reliability of newly available electronic parts and packaging technologies for usage on NASA projects through validations, assessments, and characterizations, and the development of test methods/tools; (2) Expedite infusion paths for advanced (emerging) electronic parts and packaging technologies by evaluations of readiness for manufacturability and project usage consideration; (3) Provide NASA projects with technology selection, application, and validation guidelines for electronic parts and packaging hardware and processes; and (4) Retain and disseminate electronic parts and packaging quality assurance, reliability validations, tools, and availability information to the NASA community.
Nuclear Systems Kilopower Overview
NASA Technical Reports Server (NTRS)
Palac, Don; Gibson, Marc; Mason, Lee; Houts, Michael; McClure, Patrick; Robinson, Ross
2016-01-01
The Nuclear Systems Kilopower Project was initiated by NASA's Space Technology Mission Directorate Game Changing Development Program in fiscal year 2015 to demonstrate subsystem-level technology readiness of small space fission power in a relevant environment (Technology Readiness Level 5) for space science and human exploration power needs. The Nuclear Systems Kilopower Project consists of two elements. The primary element is the Kilopower Prototype Test, also called the Kilopower Reactor Using Stirling Technology (KRUSTY) Test. It consists of the development and testing of a ground technology demonstrator of a 1 kWe fission power system. A 1 kWe system matches the requirements of some robotic precursor exploration systems and potential future deep space science missions, and also allows a nuclear ground technology demonstration in existing nuclear test facilities at low cost. The second element, the Mars Kilopower Scalability Study, consists of the analysis and design of a version of the 1 kWe reference concept scaled up to 10 kWe for projected Mars surface power requirements, and validation of the applicability of the KRUSTY experiment to the key technology challenges of a 10 kWe system. If successful, these two elements will lead to the initiation of planning for a technology demonstration of a 10 kWe fission power capability for Mars surface outpost power.
Data preservation at the Fermilab Tevatron
NASA Astrophysics Data System (ADS)
Amerio, S.; Behari, S.; Boyd, J.; Brochmann, M.; Culbertson, R.; Diesburg, M.; Freeman, J.; Garren, L.; Greenlee, H.; Herner, K.; Illingworth, R.; Jayatilaka, B.; Jonckheere, A.; Li, Q.; Naymola, S.; Oleynik, G.; Sakumoto, W.; Varnes, E.; Vellidis, C.; Watts, G.; White, S.
2017-04-01
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. These efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
Multiparametric Experiments and Multiparametric Setups for Metering Explosive Eruptions
NASA Astrophysics Data System (ADS)
Taddeucci, J.; Scarlato, P.; Del Bello, E.
2016-12-01
Explosive eruptions are multifaceted processes best studied by integrating a variety of observational perspectives. This need marries well with the continuous stream of new means that technological progress provides to volcanologists to parameterize these eruptions. For decades, new technologies have been tested and integrated approaches attempted during so-called multiparametric experiments, i.e., short field campaigns in which many different instruments (and scientists) target natural laboratory volcanoes. Recently, portable multiparametric setups have been developed, including a few highly complementary instruments that can be rapidly deployed at any erupting volcano. Multiparametric experiments and setups share most of their challenges, like technical issues, site logistics, and data processing and interpretation. Our FAMoUS (FAst MUltiparametric Setup) pivots around coupled high-speed imaging (visible and thermal) and acoustic (infrasonic to audible) recording, plus occasional seismic recording and sample collection. FAMoUS has provided new insights into pyroclast ejection and settling and jet-noise dynamics at volcanoes worldwide. In recent years we have conducted a series of BAcIO (Broadband ACquisition and Imaging Operation) experiments at Stromboli (Italy). These hosted state-of-the-art and prototype eruption-metering technologies, including: multiple high-speed, high-definition cameras for 3-D imaging; combined visible-infrared-ultraviolet imaging; in-situ and remote gas measurements; UAV aerial surveys; Doppler radar; and microphone arrays. This combined approach provides new understanding of the fundamental controls on Strombolian-style activity and allows for crucial cross-validation of instruments and techniques. Several documentary expeditions have participated in the BAcIO, attesting to its tremendous potential for public outreach. Finally, shared field work promotes interdisciplinary discussion and cooperation like nothing else.
Integrated Cryogenic Satellite Communications Cross-Link Receiver Experiment
NASA Technical Reports Server (NTRS)
Romanofsky, R. R.; Bhasin, K. B.; Downey, A. N.; Jackson, C. J.; Silver, A. H.; Javadi, H. H. S.
1995-01-01
An experiment has been devised which will validate, in space, a miniature, high-performance receiver. The receiver blends three complementary technologies: high temperature superconductivity (HTS), pseudomorphic high electron mobility transistor (PHEMT) monolithic microwave integrated circuits (MMIC), and a miniature pulse tube cryogenic cooler. Specifically, an HTS band pass filter, InP MMIC low noise amplifier, HTS-sapphire resonator stabilized local oscillator (LO), and a miniature pulse tube cooler will be integrated into a complete 20 GHz receiver downconverter. This cooled downconverter will be interfaced with customized signal processing electronics and integrated onto the space shuttle's 'HitchHiker' carrier. A pseudorandom data sequence will be transmitted to the receiver, which is in low Earth orbit (LEO), via the Advanced Communication Technology Satellite (ACTS) on a 20 GHz carrier. The modulation format is QPSK and the data rate is 2.048 Mbps. The bit error rate (BER) will be measured in situ. The receiver is also equipped with a radiometer mode so that experiment success is not totally contingent upon the BER measurement. In this mode, the receiver uses the Earth and deep space as a hot and cold calibration source, respectively. The experiment closely simulates an actual cross-link scenario. Since the receiver performance depends on channel conditions, its true characteristics would be masked in a terrestrial measurement by atmospheric absorption and background radiation. Furthermore, the receiver's performance depends on its physical temperature, which is a sensitive function of platform environment, thermal design, and cryocooler performance. These empirical data are important for building confidence in the technology.
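The radiometer mode described above, with the Earth and deep space as hot and cold loads, is conventionally reduced to a receiver noise temperature via the Y-factor method. A minimal sketch follows; the power readings and load temperatures are illustrative assumptions, not flight data.

```python
# Y-factor estimate of receiver noise temperature, the standard reduction for
# a two-load radiometer calibration (all values below are invented).
T_hot, T_cold = 290.0, 2.7           # Earth and deep-space brightness temps (K), assumed
P_hot, P_cold = 1.25e-12, 0.45e-12   # hypothetical detected powers (W)

Y = P_hot / P_cold                   # Y-factor
T_rx = (T_hot - Y * T_cold) / (Y - 1)
print(f"Y = {Y:.2f}, receiver noise temperature ~ {T_rx:.0f} K")
```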
ERIC Educational Resources Information Center
Cavanagh, Robert F.; Koehler, Matthew J.
2013-01-01
The impetus for this paper stems from a concern about directions and progress in the measurement of the Technological Pedagogical Content Knowledge (TPACK) framework for effective technology integration. In this paper, we develop the rationale for using a seven-criterion lens, based upon contemporary validity theory, for critiquing empirical…
Validity and Appropriate Uses of the Revised Technology Uses and Perceptions Survey (TUPS)
ERIC Educational Resources Information Center
Ritzhaupt, Albert D.; Huggins-Manley, A. Corinne; Dawson, Kara; Agaçli-Dogan, Nihan; Dogan, Selcuk
2017-01-01
The purpose of this article is to explore validity evidence and appropriate uses of the revised Technology Uses and Perceptions Survey (TUPS) designed to measure in-service teacher perspectives about technology integration in K-12 schools and classrooms. The revised TUPS measures 10 domains, including Access and Support; Preparation of Technology…
Electrostatic Microactuators for Precise Positioning of Neural Microelectrodes
Muthuswamy, Jit; Okandan, Murat; Jain, Tilak; Gilletti, Aaron
2006-01-01
Microelectrode arrays used for monitoring single and multineuronal action potentials often fail to record from the same population of neurons over a period of time, likely due to micromotion of neurons away from the microelectrode, gliosis around the recording site, and brain movement due to behavior. We report here novel electrostatically microactuated microelectrodes that will enable precise repositioning of the microelectrodes within the brain tissue. Electrostatic comb-drive microactuators and associated microelectrodes are fabricated using the SUMMiT V™ (Sandia's Ultraplanar Multilevel MEMS Technology) process, a five-layer polysilicon micromachining technology of Sandia National Laboratories, NM. The microfabricated microactuators enable precise bidirectional positioning of the microelectrodes in the brain with accuracy on the order of 1 μm. The microactuators allow for a linear translation of the microelectrodes of up to 5 mm in either direction, making them suitable for positioning microelectrodes in deep structures of a rodent brain. The overall translation was reduced to approximately 2 mm after insulation of the microelectrodes with epoxy for monitoring multiunit activity. The microactuators are capable of driving the microelectrodes in the brain tissue with forces on the order of several micro-Newtons. Single unit recordings were obtained from the somatosensory cortex of adult rats in acute experiments demonstrating the feasibility of this technology. Further optimization of the insulation, packaging and interconnect issues will be necessary before this technology can be validated in long-term experiments. PMID:16235660
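For scale, the standard parallel-plate model of a lateral comb drive reproduces the micro-Newton forces quoted above. The sketch below assumes generic finger counts and dimensions, not the published device geometry.

```python
# Back-of-envelope electrostatic comb-drive force (standard parallel-plate
# model; the dimensions are illustrative, not the device's actual values).
eps0 = 8.854e-12            # vacuum permittivity (F/m)
N, t, g = 100, 12e-6, 2e-6  # finger count, structure thickness (m), gap (m), assumed
V = 40.0                    # drive voltage (V), assumed

# Each engaged finger contributes two sidewall capacitors: F = N * eps0 * t * V^2 / g
F = N * eps0 * t * V**2 / g
print(f"comb-drive force ~ {F*1e6:.1f} micro-Newtons")  # order of a few uN, as in the abstract
```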
Imaging Exoplanets with the Exo-S Starshade Mission: Key Enabling Technologies
NASA Astrophysics Data System (ADS)
Kasdin, N. Jeremy; Lisman, Doug; Shaklan, Stuart; Thomson, Mark; Webb, David; Cady, Eric; Exo-S Science; Technology Definition Team, Exoplanet Program Probe Study Design Team
2015-01-01
There is increasing interest in the use of a starshade, a spacecraft employing a large screen flying in formation with a space telescope, for providing the starlight suppression needed to detect and characterize exoplanets. In particular, Exo-S is a NASA study directed at designing a probe-scale exoplanet mission employing a starshade. In this poster we present the enabling technologies needed to make a starshade mission a reality: flight-like petals, a deployable truss to support the petals, optical edges, optical diffraction studies, and formation sensing and control. We show the status of each technology gap and summarize our progress over the past 5 years, with plans for the next 3 years, in demonstrating feasibility in all these areas. In particular, since no optical end-to-end test is possible, it is necessary both to show that a starshade can be built and deployed to the required accuracy and, via laboratory experiments at smaller scale, to validate the optical modeling upon which the accuracy requirements are based. We show our progress verifying key enabling technologies, including demonstrating that a starshade petal made from flight-like materials can be manufactured to the needed accuracy and that a central truss with attached petals can be deployed with the needed precision. We also summarize our sub-scale lab experiments that demonstrate we can achieve the contrast predicted by our optical models.
Study on embedding fiber Bragg grating sensor into the 3D printing structure for health monitoring
NASA Astrophysics Data System (ADS)
Li, Ruiya; Tan, Yuegang; Zhou, Zude; Fang, Liang; Chen, Yiyang
2016-10-01
3D printing technology is a rapidly developing manufacturing technology, known as a core technology of the third industrial revolution. As 3D-printed products find wider application, health monitoring of 3D-printed structures becomes particularly important. Fiber Bragg grating (FBG) sensing is a new type of optical sensing technology with unique advantages over traditional sensing technologies, and it has great application prospects in structural health monitoring. In this paper, FBG sensors embedded in the internal structure of a 3D print were used to monitor the static and dynamic strain variation of the 3D-printed structure during loading. The theoretical and experimental results show good consistency, and the characteristic frequency detected by the FBG sensor in the dynamic experiment matches the results of a traditional accelerometer. The results of this paper preliminarily validate that an FBG embedded in a 3D-printed structure can effectively detect static and dynamic strain changes, providing guidance for the health monitoring of 3D-printed structures.
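The static strain measurement rests on the standard Bragg-wavelength-shift relation. A minimal conversion sketch, assuming a typical photo-elastic coefficient and an invented wavelength shift rather than the paper's calibration:

```python
# Converting an FBG Bragg-wavelength shift to axial strain via the generic
# relation d_lambda / lambda_B = (1 - p_e) * strain (values are typical, not
# the paper's calibration data).
lambda_B = 1550.0   # nominal Bragg wavelength (nm)
d_lambda = 0.12     # measured wavelength shift (nm), hypothetical
p_e = 0.22          # effective photo-elastic coefficient for silica fibre

strain = d_lambda / (lambda_B * (1.0 - p_e))   # dimensionless strain
print(f"strain ~ {strain*1e6:.0f} microstrain")
```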
NASA Technical Reports Server (NTRS)
Culbert, Chris; French, Scott W.; Hamilton, David
1994-01-01
Knowledge-based systems (KBSs) are in general use in a wide variety of domains, both commercial and government. As reliance on these types of systems grows, the need to assess their quality and validity reaches critical importance. As with any software, the reliability of a KBS can be directly attributed to the application of disciplined programming and testing practices throughout the development life-cycle. However, there are some essential differences between conventional software and KBSs, both in construction and use. The identification of these differences affects the verification and validation (V&V) process and the development of techniques to handle them. The recognition of these differences is the basis of considerable on-going research in this field. For the past three years IBM (Federal Systems Company - Houston) and the Software Technology Branch (STB) of NASA/Johnson Space Center have been working to improve the 'state of the practice' in V&V of knowledge-based systems. This work was motivated by the need to maintain NASA's ability to produce high quality software while taking advantage of new KBS technology. To date, the primary accomplishment has been the development and teaching of a four-day workshop on KBS V&V. With the hope of improving the impact of these workshops, we also worked directly with NASA KBS projects to employ concepts taught in the workshop. This paper describes two projects that were part of this effort. In addition to describing each project, this paper describes problems encountered and solutions proposed in each case, with particular emphasis on implications for transferring KBS V&V technology beyond the NASA domain.
In-situ Testing of the EHT High Gain and Frequency Ultra-Stable Integrators
NASA Astrophysics Data System (ADS)
Miller, Kenneth; Ziemba, Timothy; Prager, James; Slobodov, Ilia; Lotz, Dan
2014-10-01
Eagle Harbor Technologies (EHT) has developed a long-pulse integrator that exceeds the ITER specification for integration error and pulse duration. During the Phase I program, EHT improved the RPPL short-pulse integrators, added a fast digital reset, and demonstrated that the new integrators exceed the ITER integration error and pulse duration requirements. In Phase II, EHT developed Field Programmable Gate Array (FPGA) software that allows for integrator control and real-time signal digitization and processing. In the second year of Phase II, the EHT integrator will be tested at a validation platform experiment (HIT-SI) and tokamak (DIII-D). In the Phase IIB program, EHT will continue development of the EHT integrator to reduce overall cost per channel. EHT will test lower cost components, move to surface mount components, and add an onboard Field Programmable Gate Array and data acquisition to produce a stand-alone system with lower cost per channel and increased channel density. EHT will test the Phase IIB integrator at a validation platform experiment (HIT-SI) and tokamak (DIII-D). Work supported by the DOE under Contract Number (DE-SC0006281).
Construction of Virtual-Experiment Systems for Information Science Education
NASA Astrophysics Data System (ADS)
She, Jin-Hua; Amano, Naoki
Practice is very important in education because it not only stimulates the motivation to learn but also deepens the understanding of theory. However, due to limitations on time and experiment resources, experiments cannot simply be introduced in every lesson. To make the best use of multimedia technology, this paper designs five virtual experiment systems, based on high-school-level physics, to improve the effectiveness of teaching data processing. The systems are designed by employing the cognitive theory of multimedia learning and the inner-game principle to ensure ease of use and to reduce cognitive load. The learning process is divided into two stages: the first stage teaches the basic concepts of data processing; the second stage practices the techniques taught in the first stage and uses them to build a linear model and carry out estimation. The virtual experiment systems have been tested in a university's data-processing course and have demonstrated their validity.
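The second-stage task of building a linear model and estimating from it reduces to an ordinary least-squares fit; a minimal sketch with invented measurements is shown below.

```python
# Ordinary least-squares line fit, the kind of linear modeling exercised in
# the second learning stage (the data points are made up for illustration).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # e.g. applied input, hypothetical
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])   # e.g. measured response, hypothetical

slope, intercept = np.polyfit(x, y, 1)      # fit y = a*x + b
print(f"model: y = {slope:.2f}x + {intercept:.2f}")
print("estimate at x = 6:", slope * 6 + intercept)
```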
Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue
NASA Astrophysics Data System (ADS)
Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey; Bagliesi, Giuseppe; Belforte, Stephano; Campana, Simone; Dimou, Maria; Flix, Jose; Forti, Alessandra; di Girolamo, A.; Karavakis, Edward; Lammel, Stephan; Litmaath, Maarten; Sciaba, Andrea; Valassi, Andrea
2017-10-01
The Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not make it easy to validate topology and configuration information. Moreover, information in the various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges. Experiments increasingly rely on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue), which collects information from various information providers, performs validation, and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC should be able to be quickly adapted to new types of computing resources and new information sources, and allow for new data structures to be implemented easily following the evolution of the computing models and operations of the experiments.
NASA Technical Reports Server (NTRS)
Stone, Noble H.
2007-01-01
The Tethered Satellite System Space Shuttle missions, TSS-1 in 1993 and TSS-1R in 1996, were the height of space tether technology development. Since NASA's investment of some $200M and two Shuttle missions in those two pioneering missions, there have been several smaller tether flight experiments, but interest in this promising technology has waned within NASA as well as the DOD agencies. This is curious in view of the unique capabilities of space tether systems and the fact that they have been flight validated and shown to perform as well as, or better than, expected in earth orbit. While it is true that the TSS-1, TSS-1R and SEDS-2 missions experienced technical difficulties, the causes of these early developmental problems are now known to be design or materials flaws that are (1) unrelated to the basic viability of space tether technology, and (2) readily corrected. The purpose of this paper is to review the dynamic and electrodynamic fundamentals of space tethers and the unique capabilities they afford (which are enabling to certain types of space missions); to elucidate the nature, cause, and solution of the early developmental problems; and to provide an update on progress made in developing the technology. Finally, it is shown that (1) all problems experienced during early development of the technology now have solutions; and (2) the technology has been matured by advances in the strength and robustness of tether materials, high-voltage engineering in the space environment, tether health and status monitoring, and the elimination of the broken-tether hazard. In view of this, it is inexplicable why this flight-validated technology has not been utilized in the past decade, considering the powerful and unique capabilities that space tethers afford, capabilities that are not only required to carry out otherwise unobtainable missions but can also greatly reduce the cost of certain ongoing space operations.
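For a sense of the electrodynamic capabilities reviewed above, the standard motional-EMF and Lorentz-force relations give order-of-magnitude figures for a TSS-class tether. The inputs below are generic LEO values, not mission data.

```python
# Order-of-magnitude electrodynamic tether figures from the textbook relations
# EMF = B*L*v and F = B*I*L (all inputs are generic LEO assumptions, not
# TSS flight data).
B = 3.0e-5   # geomagnetic field in LEO (T), assumed
L = 20e3     # conducting tether length (m), TSS-class
v = 7.5e3    # orbital velocity (m/s)
I = 0.5      # tether current (A), hypothetical

emf = B * L * v      # motional EMF available across the tether (V)
force = B * I * L    # Lorentz thrust/drag force on the tether (N)
print(f"EMF ~ {emf/1e3:.1f} kV, force ~ {force:.2f} N")
```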
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sabharwall, Piyush; O'Brien, James E.; McKellar, Michael G.
2015-03-01
Hybrid energy system research has the potential to expand the application for nuclear reactor technology beyond electricity. The purpose of this research is to reduce both technical and economic risks associated with energy systems of the future. Nuclear hybrid energy systems (NHES) mitigate the variability of renewable energy sources, provide opportunities to produce revenue from different product streams, and avoid capital inefficiencies by matching electrical output to demand by using excess generation capacity for other purposes when it is available. An essential step in the commercialization and deployment of this advanced technology is scaled testing to demonstrate integrated dynamic performance of advanced systems and components when risks cannot be mitigated adequately by analysis or simulation. Further testing in a prototypical environment is needed for validation and higher confidence. This research supports the development of advanced nuclear reactor technology and NHES, and their adaptation to commercial industrial applications that will potentially advance U.S. energy security, economy, and reliability and further reduce carbon emissions. Experimental infrastructure development for testing and feasibility studies of coupled systems can similarly support other projects having similar developmental needs and can generate data required for validation of models in thermal energy storage and transport, energy, and conversion process development. Experiments performed in the Systems Integration Laboratory will acquire performance data, identify scalability issues, and quantify technology gaps and needs for various hybrid or other energy systems. This report discusses detailed scaling (component and integrated system) and heat transfer figures of merit that will establish the experimental infrastructure for component, subsystem, and integrated system testing to advance the technology readiness of components and systems to the level required for commercial application and demonstration under NHES.
FlySec: a risk-based airport security management system based on security as a service concept
NASA Astrophysics Data System (ADS)
Kyriazanos, Dimitris M.; Segou, Olga E.; Zalonis, Andreas; Thomopoulos, Stelios C. A.
2016-05-01
Complementing the ACI/IATA efforts, the FLYSEC European H2020 Research and Innovation project (http://www.fly-sec.eu/) aims to develop and demonstrate an innovative, integrated, end-to-end airport security process for passengers, enabling a guided and streamlined procedure from landside to airside and into the boarding gates, and offering an operationally validated innovative concept for end-to-end aviation security. Through a well-structured work plan, FLYSEC's ambition translates into: (i) innovative processes facilitating risk-based screening; (ii) deployment and integration of new technologies and repurposing of existing solutions towards a risk-based security paradigm shift; (iii) improvement of passenger facilitation and customer service, bringing security as a real service in the airport of tomorrow; (iv) achievement of measurable throughput improvement and a whole new level of quality of service; and (v) validation of the results through advanced "in-vitro" simulation and "in-vivo" pilots. On the technical side, FLYSEC achieves its ambitious goals by integrating new technologies for video surveillance, intelligent remote image processing, and biometrics, combined with big data analysis, open-source intelligence, and crowdsourcing. Repurposing existing technologies is also among the FLYSEC objectives, such as mobile application technologies for improved passenger experience and positive boarding applications (i.e., services to facilitate boarding and landside/airside wayfinding) as well as RFID for carry-on luggage tracking and quick unattended-luggage handling. In this paper, the authors describe the risk-based airport security management system which powers FLYSEC intelligence and serves as the backend on top of which FLYSEC's front-end technologies reside for security services management and behaviour and risk analysis.
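As a conceptual illustration only, risk-based screening of the kind described can be thought of as aggregating weighted indicator scores into a lane decision. The factor names, weights, and threshold below are invented; FLYSEC's actual behaviour and risk analytics are not specified in this abstract.

```python
# Conceptual weighted-sum illustration of risk-based screening. Factor names,
# weights, and the routing threshold are all hypothetical, not FLYSEC's model.
risk_factors = {                 # normalized 0..1 indicator scores, invented
    "watchlist_match": 0.0,
    "anomalous_track": 0.6,      # e.g. from video surveillance analytics
    "unattended_luggage": 0.0,   # e.g. from RFID carry-on tracking
}
weights = {"watchlist_match": 0.6, "anomalous_track": 0.25, "unattended_luggage": 0.15}

score = sum(weights[k] * v for k, v in risk_factors.items())
lane = "enhanced screening" if score > 0.2 else "standard lane"
print(f"risk score {score:.2f} -> {lane}")
```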
New Millennium Program Serving Earth and Space Sciences
NASA Technical Reports Server (NTRS)
Li, Fuk
1999-01-01
A cross-Enterprise program to identify and validate breakthrough flight technologies that will significantly benefit future space science and earth science missions. The breakthrough technologies enable new capabilities to meet earth and space science needs while reducing the costs of future missions. Flight validation mitigates risks to first users and enables rapid technology infusion into future missions.
HOST payload for STS-95 being moved into SSPF
NASA Technical Reports Server (NTRS)
1998-01-01
Workers watch as the Hubble Space Telescope Orbiting Systems Test (HOST) is lowered onto a workstand in the Space Station Processing Facility. To the right can be seen the Rack Insertion Device and Leonardo, a Multi-Purpose Logistics Module. The HOST platform, one of the payloads on the STS-95 mission, is carrying four experiments to validate components planned for installation during the third Hubble Space Telescope servicing mission and to evaluate new technologies in an earth orbiting environment. The STS-95 mission is scheduled to launch Oct. 29. It will carry three other payloads: the Spartan solar-observing deployable spacecraft, the International Extreme Ultraviolet Hitchhiker, and the SPACEHAB single module with experiments on space flight and the aging process.
Thermal Vacuum Testing of a Multi-Evaporator Miniature Loop Heat Pipe
NASA Technical Reports Server (NTRS)
Ku, Jentung; Ottenstein, Laura; Nagano, Hosei
2008-01-01
Under NASA's New Millennium Program Space Technology 8 Project, four experiments are being developed for future small-system applications requiring low mass, low power, and compactness. GSFC is responsible for developing the Thermal Loop experiment, an advanced thermal control system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and condensers. The objective is to validate the operation of an MLHP, including reliable start-ups, steady operation, heat load sharing, and tight temperature control over the range of 273 K to 308 K. An MLHP breadboard has been built and tested for 1200 hours in the laboratory environment and 500 hours in a thermal vacuum chamber. Results of the thermal vacuum (TV) tests are presented here.
IVHM for the 3rd Generation RLV Program: Technology Development
NASA Technical Reports Server (NTRS)
Kahle, Bill
2000-01-01
The objective behind the Integrated Vehicle Health Management (IVHM) project is to develop and integrate technologies which can provide a continuous, intelligent, and adaptive health state of a vehicle and use this information to improve safety and reduce the costs of operations. Technological areas discussed include: developing, validating, and transferring next-generation IVHM technologies to near-term industry and government reusable launch systems; focusing NASA on next-generation, highly advanced sensor and software technologies; and validating the IVHM systems engineering design process for future programs.
von Barnekow, Ariel; Bonet-Codina, Núria; Tost, Dani
2017-03-23
To investigate whether 3D gamified simulations can be valid vocational training tools for persons with intellectual disability. A 3D gamified simulation composed of a set of training tasks for cleaning in hostelry was developed in collaboration with professionals of a real hostel and pedagogues of a special-needs school. The learning objectives focus on the acquisition of vocabulary skills, work procedures, social abilities and risk prevention. Several accessibility features were developed to make the tasks technologically easy to perform. A pilot experiment was conducted to test the pedagogical efficacy of the tool with intellectually disabled workers and students. User scores in the gamified simulation followed a curve of increasing progression. When confronted with reality, participants recognized the scenario and tried to reproduce what they had learned in the simulation. Finally, they were interested in the tool, showed a strong feeling of immersion and engagement, and reported having fun. On the basis of this experiment we believe that 3D gamified simulations can be efficient tools for training the social and professional skills of persons with intellectual disabilities, thus contributing to their social inclusion through work.
NASA Technical Reports Server (NTRS)
Smith-Taylor, Rudeen; Tanner, Sharon E.
1993-01-01
The NASA Controls-Structures Interaction (CSI) Guest Investigator program is described in terms of its support of the development of CSI technologies. The program is based on introducing CSI researchers from industry and academia to available test facilities for experimental validation of technologies and methods. Phase 1 experimental results are reviewed, with attention given to their use of the Mini-MAST test facility and the facility for the Advanced Control Evaluation of Structures. Experiments were conducted on the following topics: collocated/noncollocated controllers, nonlinear math modeling, controller design, passive/active suspension systems design, and system identification and fault isolation. The results demonstrate that significantly enhanced performance from the control techniques can be achieved by integrating knowledge of the structural dynamics under consideration into the approaches.
Preliminary Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)
NASA Technical Reports Server (NTRS)
Folta, David; Hawkins, Albin
2001-01-01
NASA's first autonomous formation flying mission is completing a primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center has implemented an autonomous universal three-axis formation flying algorithm in executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard design and presents the preliminary validation results of this unique system. Results from functionality assessment and autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a stand-alone algorithm.
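The relative-motion dynamics underlying formation flying in low Earth orbit are conventionally linearized as the Clohessy-Wiltshire equations; the sketch below propagates them for a leader-follower geometry. This is generic textbook dynamics offered for context, not the AutoCon flight algorithm, and the orbit values are approximate.

```python
# Clohessy-Wiltshire relative-motion propagation (textbook linearized dynamics
# for formation flying; illustrative only, not the AutoCon flight software).
import numpy as np

mu, a = 3.986e14, 7.08e6   # Earth GM (m^3/s^2), EO-1-like orbit radius (m), approximate
n = np.sqrt(mu / a**3)     # mean motion (rad/s)

# state = [x, y, z, vx, vy, vz]; x radial, y along-track, z cross-track
state = np.array([0.0, 1000.0, 0.0, 0.0, 0.0, 0.0])  # 1 km along-track offset
dt = 1.0
for _ in range(int(2 * np.pi / n)):   # propagate ~one orbit
    x, y, z, vx, vy, vz = state
    ax = 3 * n**2 * x + 2 * n * vy
    ay = -2 * n * vx
    az = -n**2 * z
    state += dt * np.array([vx, vy, vz, ax, ay, az])

# A pure along-track offset is an equilibrium of the linearized dynamics
# (leader-follower formation), so the offset should remain ~1000 m.
print("relative position after one orbit (m):", state[:3].round(1))
```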
Pseudoisochromatic test plate colour representation dependence on printing technology
NASA Astrophysics Data System (ADS)
Luse, K.; Fomins, S.; Ozolinsh, M.
2012-08-01
The aim of the study is to determine the best printing technology for creating colour vision deficiency tests. Valid tests for protanopia and deuteranopia were created from colour-matching experiments in which colour-deficient individuals judged printed colour samples. A calibrated Epson Stylus Pro 7800 printer was used for ink prints and a Noritsu HD 3701 digital printer for photographic prints. Analysis of multispectral imagery (acquired with the CRI Nuance Vis 07 tunable liquid-crystal filter system) shows that for ink prints, the measured pixel colour coordinate dispersion (in the CIE xy colour diagram) of similar colour arrays is smaller than for photographic printing. The print quality, in terms of colour coordinate dispersion, is much higher for the printing methods used than for commercially available colour vision deficiency tests.
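The dispersion metric used in the study is the spread of CIE xy chromaticity coordinates across nominally identical colour patches; the sketch below computes it from tristimulus values (the XYZ triplets are invented).

```python
# CIE xy chromaticity and its dispersion across repeated colour patches, the
# comparison metric described above (the XYZ measurements are made up).
import numpy as np

XYZ = np.array([           # tristimulus values of one nominally uniform array
    [41.2, 21.3, 1.9],
    [40.8, 21.1, 2.1],
    [41.5, 21.6, 1.8],
])
# x = X / (X + Y + Z), y = Y / (X + Y + Z)
xy = XYZ[:, :2] / XYZ.sum(axis=1, keepdims=True)

print("mean xy:", xy.mean(axis=0).round(4))
print("dispersion (std):", xy.std(axis=0).round(5))
```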
NASA Technical Reports Server (NTRS)
Tang, Chun; Muppidi, Suman; Bose, Deepak; Van Norman, John W.; Tanimoto, Rebekah; Clark, Ian
2015-01-01
NASA's Low Density Supersonic Decelerator Program is developing new technologies that will enable the landing of heavier payloads in low density environments, such as Mars. A recent flight experiment conducted high above the Hawaiian Islands has demonstrated the performance of several decelerator technologies. In particular, the deployment of the Robotic class Supersonic Inflatable Aerodynamic Decelerator (SIAD-R) was highly successful, and valuable data were collected during the test flight. This paper outlines the Computational Fluid Dynamics (CFD) analysis used to estimate the aerodynamic and aerothermal characteristics of the SIAD-R. Pre-flight and post-flight predictions are compared with the flight data, and a very good agreement in aerodynamic force and moment coefficients is observed between the CFD solutions and the reconstructed flight data.
Verification of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.
NASA Technical Reports Server (NTRS)
Ryan, Margaret A.; Shevade, A. V.; Taylor, C. J.; Homer, M. L.; Jewell, A. D.; Kisor, A.; Manatt, K. S .; Yen, S. P. S.; Blanco, M.; Goddard, W. A., III
2006-01-01
An array-based sensing system based on polymer/carbon composite conductometric sensors is under development at JPL for use as an environmental monitor in the International Space Station. Sulfur dioxide has been added to the analyte set for this phase of development. Using molecular modeling techniques, the interaction energy between SO2 and polymer functional groups has been calculated, and polymers have been selected as potential SO2 sensors. Experiments have validated the model, and two selected polymers have been shown to be promising materials for SO2 detection.
NASA Astrophysics Data System (ADS)
Prasad, K.; Thorpe, A. K.; Duren, R. M.; Thompson, D. R.; Whetstone, J. R.
2016-12-01
The National Institute of Standards and Technology (NIST) has supported the development and demonstration of a measurement capability to accurately locate greenhouse gas sources and measure their flux to the atmosphere over urban domains. However, uncertainties in transport models, which form the basis of all top-down approaches, can significantly affect our capability to attribute sources and predict their flux to the atmosphere. Reducing discrepancies between bottom-up and top-down models will require high-resolution transport models as well as validation and verification of dispersion models over an urban domain. Tracer experiments involving the release of perfluorocarbon tracers (PFTs) at known flow rates offer the best approach for validating dispersion/transport models. However, tracer experiments are limited by cost, the ability to make continuous measurements, and environmental concerns. Natural tracer experiments, such as the leak from the Aliso Canyon underground storage facility, offer a unique opportunity to improve and validate high-resolution transport models, test leak hypotheses, and estimate the amount of methane released. High-spatial-resolution (10 m) Large Eddy Simulations (LES) coupled with WRF atmospheric transport models were performed to simulate the dynamics of the Aliso Canyon methane plume and to quantify the source. High-resolution forward simulation results were combined with aircraft and tower based in-situ measurements as well as data from NASA airborne imaging spectrometers. Comparison of simulation results with measurement data demonstrates the capability of the LES models to accurately model transport and dispersion of methane plumes over urban domains.
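As a simple point of reference for the transport modeling described above, a steady-state Gaussian plume gives a closed-form concentration estimate; the emission rate, wind, and dispersion parameters below are placeholders, and the LES runs in the study resolve far more structure than this.

```python
# Steady-state Gaussian plume with ground reflection, a classical baseline for
# dispersion modeling (all inputs are placeholder values, not Aliso Canyon data).
import numpy as np

Q = 50.0                         # emission rate (kg/s), hypothetical
u = 5.0                          # mean wind speed (m/s)
sigma_y, sigma_z = 200.0, 80.0   # dispersion widths (m) at the receptor distance
y, z, H = 0.0, 2.0, 100.0        # receptor offset, height, effective release height (m)

C = (Q / (2 * np.pi * u * sigma_y * sigma_z)
     * np.exp(-y**2 / (2 * sigma_y**2))
     * (np.exp(-(z - H)**2 / (2 * sigma_z**2)) + np.exp(-(z + H)**2 / (2 * sigma_z**2))))
print(f"centerline near-ground concentration ~ {C*1e6:.0f} mg/m^3")
```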
Serial album validation for promotion of infant body weight control
Saraiva, Nathalia Costa Gonzaga; Medeiros, Carla Campos Muniz; de Araujo, Thelma Leite
2018-01-01
ABSTRACT Objective: to validate the content and appearance of a serial album for children aged 7 to 10 years addressing the prevention and control of body weight. Method: methodological, descriptive study. The validation process involved 33 specialists in educational technologies and/or childhood excess weight. An agreement index of 80% was the minimum required to validate the material. Results: most of the specialists had a doctoral degree and a graduate degree in nursing. Regarding content, illustrations, layout, and relevance, all items were validated, and 69.7% of the experts rated the album as excellent. The overall agreement validation index for the educational technology was 0.88. Only script-sheet 3 did not reach the cutoff point of the content validation index. Changes were made to the material, such as a title change, inclusion of the school context, and insertion of a nutritionist and a physical educator into the story narrated in the album. Conclusion: the proposed serial album was considered valid by experts regarding content and appearance, suggesting that this technology has the potential to contribute to health education by promoting healthy weight in children aged 7 to 10 years. PMID:29791665
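The agreement criterion described above corresponds to an item-level content validity index: the fraction of experts rating an item favorably, checked against the 80% floor. A minimal sketch with invented ratings:

```python
# Item-level content validity index (CVI): fraction of experts endorsing an
# item, compared with the study's 80% agreement floor (ratings are invented).
ratings = [1, 1, 1, 0, 1, 1, 1, 1, 1, 1]   # 1 = agree, 0 = disagree, per expert
cvi = sum(ratings) / len(ratings)
print(f"CVI = {cvi:.2f} ->", "validated" if cvi >= 0.80 else "needs revision")
```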
An NCI-FDA Interagency Oncology Task Force (IOTF) Molecular Diagnostics Workshop was held on October 30, 2008 in Cambridge, MA, to discuss requirements for the analytical validation of protein-based multiplex technologies in the context of their intended use. This workshop, developed through NCI's Clinical Proteomic Technologies for Cancer initiative and the FDA, focused on technology-specific analytical validation processes to be addressed prior to use in clinical settings. To make the workshop unique, a case-study approach was used to discuss issues related to
Validation of a Novel Virtual Reality Simulator for Robotic Surgery
Schreuder, Henk W. R.; Persson, Jan E. U.; Wolswijk, Richard G. H.; Ihse, Ingmar; Schijven, Marlies P.; Verheijen, René H. M.
2014-01-01
Objective. With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for the use in training of robot-assisted surgery. Methods. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Results. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were “time to complete” and “economy of motion” (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, visual graphics, movements of instruments, interaction with objects, and the depth perception were all rated as being realistic. The simulator is considered to be a very useful training tool for residents and medical specialist starting with robotic surgery. Conclusions. Face and construct validity for the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery. PMID:24600328
Ben-Moussa, Maher; Rubo, Marius; Debracque, Coralie; Lange, Wolf-Gero
2017-01-01
The present paper explores the benefits and the capabilities of various emerging state-of-the-art interactive 3D and Internet of Things technologies and investigates how these technologies can be exploited to develop a more effective technology supported exposure therapy solution for social anxiety disorder. “DJINNI” is a conceptual design of an in vivo augmented reality (AR) exposure therapy mobile support system that exploits several capturing technologies and integrates the patient’s state and situation by vision-based, audio-based, and physiology-based analysis as well as by indoor/outdoor localization techniques. DJINNI also comprises an innovative virtual reality exposure therapy system that is adaptive and customizable to the demands of the in vivo experience and therapeutic progress. DJINNI follows a gamification approach where rewards and achievements are utilized to motivate the patient to progress in her/his treatment. The current paper reviews the state of the art of technologies needed for such a solution and recommends how these technologies could be integrated in the development of an individually tailored and yet feasible and effective AR/virtual reality-based exposure therapy. Finally, the paper outlines how DJINNI could be part of classical cognitive behavioral treatment and how to validate such a setup. PMID:28503155
NASA Astrophysics Data System (ADS)
Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.
2017-06-01
Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and these are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.
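The reliability assessment cited above reduces to false positive and false negative rates over screening outcomes; a minimal computation with invented confusion-matrix counts:

```python
# False positive/negative rates from screening counts, the reliability metrics
# assessed in the validation study (the counts below are illustrative).
tp, fp, tn, fn = 188, 3, 205, 4   # hypothetical confusion-matrix counts

fpr = fp / (fp + tn)              # false positive rate
fnr = fn / (fn + tp)              # false negative rate
print(f"FPR = {fpr:.1%}, FNR = {fnr:.1%}")
```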
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Taiping; Khangaonkar, Tarang; Long, Wen
2014-02-07
In recent years, with the rapid growth of global energy demand, interest in extracting uranium from seawater for nuclear energy has been renewed. While extracting seawater uranium is not yet commercially viable, it serves as a "backstop" to conventional uranium resources and provides an essentially unlimited supply of uranium. With recent advances in seawater uranium extraction technology, extracting uranium from seawater could be economically feasible when the extraction devices are deployed at a large scale (e.g., several hundred km2). There is concern, however, that the large-scale deployment of adsorbent farms could result in potential impacts to the hydrodynamic flow field in an oceanic setting. In this study, a kelp-type structure module was incorporated into a coastal ocean model to simulate the blockage effect of uranium extraction devices on the flow field. The module was quantitatively validated against laboratory flume experiments for both velocity and turbulence profiles. The model-data comparison showed an overall good agreement and validated the approach of applying the model to assess the potential hydrodynamic impact of uranium extraction devices or other underwater structures in coastal oceans.
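Kelp-type blockage modules of this kind typically enter the momentum equations as a quadratic canopy drag term; a sketch of that source term with assumed coefficients follows.

```python
# Quadratic canopy drag of a kelp-like structure on the flow, the standard
# form of a vegetation blockage term in hydrodynamic models (coefficients are
# assumed, not the study's calibrated values).
rho = 1025.0   # seawater density (kg/m^3)
Cd = 1.2       # canopy drag coefficient, hypothetical
a = 0.5        # frontal area density of the canopy (m^2/m^3), hypothetical
u = 0.3        # local flow speed (m/s)

f_drag = 0.5 * rho * Cd * a * u * abs(u)   # momentum sink per unit volume (N/m^3)
print(f"canopy drag ~ {f_drag:.1f} N/m^3")
```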
WaferOptics® mass volume production and reliability
NASA Astrophysics Data System (ADS)
Wolterink, E.; Demeyer, K.
2010-05-01
The Anteryon WaferOptics® technology platform combines imaging optics designs, materials, and metrologies with wafer-level Semicon & MEMS production methods. WaferOptics® first required completely new system engineering. This system closes the loop between application requirement specifications, Anteryon product specifications, Monte Carlo analysis, process windows, process controls, and supply reject criteria. For the Anteryon product Integrated Lens Stack (ILS), new design rules, test methods, and control systems were assessed, implemented, validated, and released to customers for mass production. This includes novel reflowable materials, the mastering process, replication, bonding, dicing, assembly, metrology, reliability programs, and quality assurance systems. Many Design of Experiments runs were performed to assess correlations between optical performance parameters and machine settings across all process steps. Lens metrologies such as FFL, BFL, and MTF were adapted for wafer-level production, and wafer mapping was introduced for yield management. Test methods for screening and validating suitable optical materials were designed. Critical failure modes such as delamination and popcorning were assessed and modeled with FEM. Anteryon successfully integrated the different technologies, progressing from single prototypes to high-yield mass-volume production. These parallel efforts resulted in a steep yield increase from 30% to over 90% within an 8-month period.
USDA-ARS?s Scientific Manuscript database
The NASA SMAP (Soil Moisture Active Passive) mission conducted the SMAP Validation Experiment 2015 (SMAPVEX15) in order to support the calibration and validation activities of the SMAP soil moisture data product. The main goals of the experiment were to address issues regarding the spatial disaggregation...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Racine, E; Hautvast, G; Binnekamp, D
Purpose: To report on preliminary results validating the performance of a specially designed LDR brachytherapy needle prototype possessing both electromagnetic (EM) tracking and seed drop detection abilities. Methods: An EM hollow needle prototype has been designed and constructed in collaboration with research partner Philips Healthcare. The needle possesses conventional 3D tracking capabilities, along with a novel seed drop detection mechanism exploiting local changes of electromagnetic properties generated by the passage of seeds through the needle's embedded sensor coils. These two capabilities are exploited by proprietary engineering and signal processing techniques to generate seed drop position estimates during real-time treatment delivery. The electromagnetic tracking system (EMTS) used for the experiment is the NDI Aurora Planar Field Generator. The experiment consisted of dropping a total of 35 seeds into a prismatic agarose phantom and comparing the 3D seed drop positions from the EMTS to those obtained by image analysis of subsequent micro-CT scans. Drop position error computations and statistical analysis were performed after a 3D registration of the two seed distributions. Results: Of the 35 seeds dropped in the phantom, 32 were properly detected by the needle prototype. Absolute drop position errors among the detected seeds ranged from 0.5 to 4.8 mm, with mean and standard deviation values of 1.6 and 0.9 mm, respectively. Error measurements also include undesirable and uncontrollable effects such as seed motion upon deposition; the true accuracy of the needle prototype is therefore underestimated. Conclusion: This preliminary study demonstrates the potential benefits of EM technologies in detecting the passage of seeds through a hollow needle as a means of generating drop position estimates during real-time treatment delivery. Such tools could represent a potentially interesting addition to existing brachytherapy protocols for rapid dosimetry validation. Equipment and funding for this project were provided by Philips Medical.
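The reported error statistics follow from comparing registered 3D seed positions; the sketch below reproduces that computation on synthetic stand-ins for the EM and micro-CT coordinates.

```python
# Drop-position error statistics after registration, as described above; the
# positions here are synthetic stand-ins for EM and micro-CT seed coordinates.
import numpy as np

rng = np.random.default_rng(0)
ct = rng.uniform(0, 40, size=(32, 3))         # micro-CT seed positions (mm), synthetic
em = ct + rng.normal(0, 1.0, size=ct.shape)   # EM estimates with ~1 mm noise, synthetic

errors = np.linalg.norm(em - ct, axis=1)      # 3D absolute errors (mm)
print(f"mean {errors.mean():.1f} mm, SD {errors.std():.1f} mm, max {errors.max():.1f} mm")
```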
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Spencer; Rodrigues, George, E-mail: george.rodrigues@lhsc.on.ca; Department of Epidemiology/Biostatistics, University of Western Ontario, London
2013-01-01
Purpose: To perform a rigorous technological assessment and statistical validation of a software technology for anatomic delineations of the prostate on MRI datasets. Methods and Materials: A 3-phase validation strategy was used. Phase I consisted of anatomic atlas building using 100 prostate cancer MRI data sets to provide training data sets for the segmentation algorithms. In phase II, 2 experts contoured 15 new MRI prostate cancer cases using 3 approaches (manual, N points, and region of interest). In phase III, 5 new physicians with variable MRI prostate contouring experience segmented the same 15 phase II datasets using 3 approaches: manual, N points with no editing, and full autosegmentation with user editing allowed. Statistical analyses for time and accuracy (using the Dice similarity coefficient) endpoints used traditional descriptive statistics, analysis of variance, analysis of covariance, and the pooled Student t test. Results: In phase I, average (SD) total and per-slice contouring times for the 2 physicians were 228 (75), 17 (3.5), 209 (65), and 15 (3.9) seconds, respectively. In phase II, statistically significant differences in physician contouring time were observed based on physician, type of contouring, and case sequence. The N points strategy resulted in superior segmentation accuracy when initial autosegmented contours were compared with final contours. In phase III, statistically significant differences in contouring time were again observed based on physician, type of contouring, and case sequence. The average relative time savings for N points and autosegmentation were 49% and 27%, respectively, compared with manual contouring. The N points and autosegmentation strategies resulted in average Dice values of 0.89 and 0.88, respectively. Pre- and postedited autosegmented contours demonstrated a higher average Dice similarity coefficient of 0.94. Conclusion: The software provided robust contours with minimal editing required. Time savings were observed for all physicians irrespective of experience level and baseline manual contouring speed.
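For reference, the accuracy endpoint above is the Dice similarity coefficient, DSC = 2|A∩B|/(|A|+|B|). A minimal sketch for binary contour masks (illustrative, not the study's software):

```python
# Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    # Convention: two empty masks count as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# usage: dice(manual_mask, autoseg_mask) -> values like the 0.88-0.94 above
```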
Space Technology 5 - A Successful Micro-Satellite Constellation Mission
NASA Technical Reports Server (NTRS)
Carlisle, Candace; Webb, Evan H.
2007-01-01
The Space Technology 5 (ST5) constellation of three micro-satellites was launched March 22, 2006. During the three-month flight demonstration phase, the ST5 team validated key technologies that will make future low-cost micro-sat constellations possible, demonstrated operability concepts for future micro-sat science constellation missions, and demonstrated the utility of a micro-satellite constellation to perform research-quality science. The ST5 mission was successfully completed in June 2006, demonstrating high-quality science and technology validation results.
New Source Code: Spelman Women Transforming the Grid of Science and Technology
NASA Astrophysics Data System (ADS)
Okonkwo, Holly
From a seminary for newly freedwomen in the 19th century "Deep South" of the United States to a "Model Institution for Excellence" in undergraduate science, technology, engineering, and math education, the narrative of Spelman College is a critical piece to understanding the overall history and socially constructed nature of science and higher education in the U.S. Making a place for science at Spelman College disrupts and redefines the presumed and acceptable roles of African American women in science and their social, political and economic engagements in U.S. society as a whole. Over the course of 16 months, I explore the narrative experiences of members of the Spelman campus community and immerse myself in the environment to experience becoming a member of a scientific community that asserts a place for women of African descent in science and technology and perceives this positionality as positive, powerful and the locus of agency. My intention is to offer this research as an in-depth ethnographic presentation of intentional science learning, knowledge production and practice as lived experiences at the multiple intersections of the constructs of race, gender, positionality and U.S. science itself. In this research, I am motivated to move the contemporary discourse on diversifying science, technology, engineering and mathematics fields in the U.S. academy beyond the chronicling of women of African descent as statistical rarities over time and beyond the deficit frameworks that theoretically encapsulate their narratives. The findings of this research demonstrate that Spelman students, staff and alumni are themselves the cultural capital that validates Spelman's identity as a place and its institutional mission, and they are at the core of the institutional success of the college. It is a personal mission as much as it is an institutional mission, which is precisely what makes it powerful.
ERIC Educational Resources Information Center
Teo, Timothy; Tan, Lynde
2012-01-01
This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance. It is also interested in examining its validity when used for this purpose. It has found evidence that the TPB is a valid model to explain pre-service…
The uses of cognitive training technologies in the treatment of autism spectrum disorders.
Wass, Sam V; Porayska-Pomsta, Kaska
2014-11-01
In this review, we focus on research that has used technology to provide cognitive training - i.e. to improve performance on some measurable aspect of behaviour - in individuals with autism spectrum disorders. We review technology-enhanced interventions that target three different cognitive domains: (a) emotion and face recognition, (b) language and literacy, and (c) social skills. The interventions reviewed allow for interaction through different modes, including point-and-click and eye-gaze contingent software, and are delivered through diverse implementations, including virtual reality and robotics. In each case, we examine the evidence of the degree of post-training improvement observed following the intervention, including evidence of transfer to altered behaviour in ecologically valid contexts. We conclude that for a number of technological interventions, observed improvements within the computerised training paradigm fail to generalise to altered behaviour in more naturalistic settings, which may result from problems that people with autism spectrum disorders experience in generalising and extrapolating knowledge. However, we also point to several promising findings in this area. We discuss possible directions for future work. © The Author(s) 2013.
Data Analysis for the LISA Pathfinder Mission
NASA Technical Reports Server (NTRS)
Thorpe, James Ira
2009-01-01
The LTP (LISA Technology Package) is the core part of the Laser Interferometer Space Antenna (LISA) Pathfinder mission. The main goal of the mission is to study the sources of any disturbances that perturb the motion of the freely-falling test masses from their geodesic trajectories, as well as to test various technologies needed for LISA. The LTP experiment is designed as a sequence of experimental runs in which the performance of the instrument is studied and characterized under different operating conditions. In order to best optimize subsequent experimental runs, each run must be promptly analysed to ensure that the following ones make best use of the available knowledge of the instrument. In order to do this, all analyses must be designed and tested in advance of the mission and have sufficient built-in flexibility to account for unexpected results or behaviour. To support this activity, a robust and flexible data analysis software package is also required. This poster presents two of the main components that make up the data analysis effort: the data analysis software and the mock-data challenges used to validate analysis procedures and experiment designs.
NASA Astrophysics Data System (ADS)
Jumisko-Pyykkö, Satu; Häkkinen, Jukka
2008-02-01
In the product development of services it is important to adjust mobile video quality according to the quality requirements of potential users. Therefore, careful participant selection is very important. However, in the literature the details of participant selection are often reported only cursorily. This is also reflected in the handling of experimental results, where the impact of psychographic factors on quality is rarely reported. As user attributes potentially have a large effect on the results, we investigated the role of various psychographic variables on the subjective evaluation of audiovisual video quality in two different experiments. The studied variables were age, gender, education, professionalism, television consumption, experience of different digital video qualities, and attitude towards technology. The results showed that quality evaluations were affected by almost all background factors. The most significant variables were age, professionalism, knowledge of digital quality features and attitude towards technology. Knowledge of these factors can be exploited in careful participant selection, which will in turn increase the validity of results, as the subjective evaluations will better reflect the requirements of potential users.
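The style of analysis reported, testing whether background factors shift subjective quality ratings, can be sketched as an ANOVA; the file name and columns below ('ratings.csv', 'age_group', 'tech_attitude') are placeholders, not the authors' data:

```python
# Hedged sketch: do background factors affect subjective quality ratings?
# Two categorical factors tested with a type-II ANOVA via statsmodels.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("ratings.csv")  # assumed columns: quality, age_group, tech_attitude
model = ols("quality ~ C(age_group) + C(tech_attitude)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F-test per background factor
```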
Overview of Active Flow Control at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Pack, L. G.; Joslin, R. D.
1998-01-01
The paper summarizes Active Flow Control projects currently underway at the NASA Langley Research Center. Technology development is being pursued within a multidisciplinary, cooperative approach, involving the classical disciplines of fluid mechanics, structural mechanics, material science, acoustics, and stability and control theory. Complementing the companion papers in this session, the present paper will focus on projects that have the goal of extending the state-of-the-art in the measurement, prediction, and control of unsteady, nonlinear aerodynamics. Toward this goal, innovative actuators, micro and macro sensors, and control strategies are considered for high payoff flow control applications. The target payoffs are outlined within each section below. Validation of the approaches ranges from bench-top experiments to wind-tunnel experiments to flight tests. Obtaining correlations for future actuator and sensor designs is implicit in the discussion. The products of the demonstration projects and design tool development from the fundamental NASA R&D level technology will then be transferred to the Applied Research components within NASA, DOD, and US Industry. Keywords: active flow control, separation control, MEMS, review
Preparing for Harvesting Radioisotopes from FRIB
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lapi, Suzanne
2015-11-30
In the second quarter of this grant, work has progressed smoothly at all three collaborating institutions. We have recently completed our first experiment at the NSCL under this grant successfully, where 79Kr was collected by cryotrapping from our water target apparatus. The three PIs, one undergraduate (Boone Marois), two graduate students (Stacy Queern and Matt Scott) and one post-doc (Aranh Pen) were assisted by Dave Morrissey at the NSCL to perform this experiment. The experiment also provided the opportunity for a collaboration meeting of the PIs to discuss future work on this proposal. Significant progress has been made on both novel radiochemical separations technology at the University of Missouri and validating a radiochemical separation procedure for 48V at Washington University. The only change in the work scope of the original proposal is the transition of the Washington University PI to the University of Alabama at Birmingham.
Data preservation at the Fermilab Tevatron
Amerio, S.; Behari, S.; Boyd, J.; ...
2017-01-22
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. Lastly, these efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
Data preservation at the Fermilab Tevatron
Boyd, J.; Herner, K.; Jayatilaka, B.; ...
2015-12-23
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have nearly 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 or beyond. To achieve this, we are implementing a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology as well as leveraging resources available from currently-running experiments at Fermilab. Furthermore, these efforts will provide useful lessons in ensuring long-term data access for numerous experiments throughout high-energy physics, and provide a roadmap for high-quality scientific output for years to come.
Data preservation at the Fermilab Tevatron
NASA Astrophysics Data System (ADS)
Boyd, J.; Herner, K.; Jayatilaka, B.; Roser, R.; Sakumoto, W.
2015-12-01
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have nearly 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 or beyond. To achieve this, we are implementing a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology as well as leveraging resources available from currently-running experiments at Fermilab. These efforts will provide useful lessons in ensuring long-term data access for numerous experiments throughout high-energy physics, and provide a roadmap for high-quality scientific output for years to come.
Creating the environment for driver distraction: A thematic framework of sociotechnical factors.
Parnell, Katie J; Stanton, Neville A; Plant, Katherine L
2018-04-01
As modern society becomes more reliant on technology, its use within the vehicle is becoming a concern for road safety, with both portable and built-in devices offering sources of distraction. While the effects of distracting technologies are well documented, little is known about the causal factors that lead to drivers' engagement with technological devices. The relevance of the sociotechnical system within which the behaviour occurs requires further research. This paper presents two experiments. The first aims to assess drivers' self-reported decisions to engage with technological tasks while driving and their reasoning for doing so with respect to the wider sociotechnical system. This utilised a semi-structured interview method, conducted with 30 drivers, to initiate a discussion on their likelihood of engaging with 22 different tasks across 7 different road types. Inductive thematic analysis provided a hierarchical thematic framework that detailed the self-reported causal factors that influence drivers' use of technology whilst driving. The second experiment assessed the relevance of the hierarchical framework to a model of distraction that was established from the literature on drivers' use of distracting technologies while driving. The findings provide validation for some relationships studied in the literature, as well as providing insights into relationships that require further study. The role of the sociotechnical system in the engagement with distractions while driving is highlighted, with the causal factors reported by drivers suggesting the importance of considering the wider system within which the behaviour is occurring and how it may be creating the conditions for distraction to occur. This supports previous claims made within the literature-based model. Recommendations are proposed that encourage a movement away from individually focused countermeasures towards systemic actors. Copyright © 2017 Elsevier Ltd. All rights reserved.
CFD validation experiments for hypersonic flows
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.
1992-01-01
A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given, and gaps are identified where future experiments could provide new validation data.
Inlet Flow Control and Prediction Technologies for Embedded Propulsion Systems
NASA Technical Reports Server (NTRS)
McMillan, Michelle L.; Mackie, Scott A.; Gissen, Abe; Vukasinovic, Bojan; Lakebrink, Matthew T.; Glezer, Ari; Mani, Mori; Mace, James L.
2011-01-01
Fail-safe, hybrid flow control (HFC) is a promising technology for meeting high-speed cruise efficiency, low-noise signature, and reduced fuel-burn goals for future Hybrid-Wing-Body (HWB) aircraft with embedded engines. This report details the development of HFC technology that enables improved inlet performance in HWB vehicles with highly integrated inlets and embedded engines without adversely affecting vehicle performance. In addition, new test techniques for evaluating Boundary-Layer-Ingesting (BLI)-inlet flow-control technologies developed and demonstrated through this program are documented, including the ability to generate a BLI-like inlet-entrance flow in a direct-connect wind-tunnel facility, as well as the use of D-optimal, statistically designed experiments to optimize test efficiency and enable interpretation of results. Validated improvements in numerical analysis tools and methods accomplished through this program are also documented, including Reynolds-Averaged Navier-Stokes CFD simulations of steady-state flow physics for baseline BLI-inlet diffuser flow, as well as that created by flow-control devices. Finally, numerical methods were employed in a ground-breaking attempt to directly simulate dynamic distortion. The advances in inlet technologies and prediction tools will help to meet and exceed "N+2" project goals for future HWB aircraft.
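As an illustration of the D-optimal designed experiments mentioned above, the following sketch picks a run subset from a candidate grid by greedily improving the determinant of the information matrix. It is a generic exchange heuristic under a linear-model assumption, not the program's actual DOE tooling:

```python
# Illustrative greedy exchange for a D-optimal run selection.
import numpy as np
from itertools import product

# Candidate set: full factorial over three hypothetical settings (3 levels each)
levels = [-1.0, 0.0, 1.0]
X = np.array(list(product(levels, repeat=3)))
X = np.hstack([np.ones((len(X), 1)), X])  # linear model with intercept

def d_optimal(X, n_runs, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), n_runs, replace=False)
    logdet = lambda S: np.linalg.slogdet(X[S].T @ X[S])[1]
    for _ in range(iters):  # swap one chosen run for one candidate if it helps
        i, j = rng.integers(n_runs), rng.integers(len(X))
        trial = idx.copy()
        trial[i] = j
        if logdet(trial) > logdet(idx):
            idx = trial
    return idx

print(X[d_optimal(X, n_runs=10)])  # selected design points
```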
Yee, Kwang Chien; Wong, Ming Chao; Turner, Paul
2017-01-01
Considerable effort and resources have been dedicated to improving the quality and safety of patient care through health information systems, but there is still significant scope for improvement. One contributing factor to the lack of progress in patient safety improvement, especially where technology has been deployed, relates to an over-reliance on purely objective, quantitative, positivist research paradigms as the basis for generating and validating evidence of improvement. This paper argues the need for greater recognition and accommodation of evidence of improvement generated through more subjective, qualitative and pragmatic research paradigms to aid patient safety, especially where technology is deployed. This paper discusses how acknowledging the role and value of more subjective ontologies and pragmatist epistemologies can support improvement science research. This paper illustrates some challenges and benefits of adopting qualitative research methods in patient safety improvement projects, particularly focusing on challenges in the technological era. While adopting methods that can more readily capture, analyse and interpret direct user experiences, attitudes, insights and behaviours in their contextual settings can enhance patient safety 'on the ground' and reduce and/or mitigate errors, the challenges of using these methods with younger, "technologically-centred" healthcare professionals and patients need to be recognised.
NASA Technical Reports Server (NTRS)
Newsom, Jerry R.
1991-01-01
Control-Structures Interaction (CSI) technology embraces the understanding of the interaction between the spacecraft structure and the control system, and the creation and validation of concepts, techniques, and tools for enabling the interdisciplinary design of an integrated structure and control system, rather than the integration of a structural design and a control system design. The goal of this program is to develop validated CSI technology for integrated design/analysis and qualification of large flexible space systems and precision space structures. A description of the CSI technology program is presented.
Pain assessment: subjectivity, objectivity, and the use of neurotechnology.
Giordano, James; Abramson, Kim; Boswell, Mark V
2010-01-01
The pain clinician is confronted with the formidable task of objectifying the subjective phenomenon of pain so as to determine the right treatments for both the pain syndrome and the patient in whom the pathology is expressed. However, the experience of pain - and its expression - remains enigmatic. Can currently available evaluative tools, questionnaires, and scales actually provide adequately objective information about the experiential dimensions of pain? Can, or will, current and future iterations of biotechnology - whether used singularly or in combination (with other technologies as well as observational-behavioral methods) - afford objective validation of pain? And what of the clinical, ethical, legal and social issues that arise in and from the use - and potential misuse - of these approaches? Subsequent trajectories of clinical care depend upon the findings gained through the use of these techniques and their inappropriate employment - or misinterpretation of the results they provide - can lead to misdiagnoses and incorrect treatment. This essay is the first of a two-part series that explicates how the intellectual tasks of knowing about pain and the assessment of its experience and expression in the pain patient are constituent to the moral responsibility of pain medicine. Herein, we discuss the problem of pain and its expression, and those methods, techniques, and technologies available to bridge the gap between subjective experience and objective evaluation. We address how these assessment approaches are fundamental to apprehend both pain as an objective, neurological event, and its impact upon the subjective experience, existence, and expectations of the person in pain. In this way, we argue that the right use of technology - together with inter-subjectivity, compassion, and insight - can sustain the good of pain care as both a therapeutic and moral enterprise.
NASA Astrophysics Data System (ADS)
Wright, M. W.; Wilkerson, M. W.; Tang, R. R.
2017-11-01
Qualification testing of fiber based laser transmitters is required for NASA's Deep Space Optical Communications program to mature the technology for space applications. In the absence of fully space qualified systems, commercial systems have been investigated in order to demonstrate the robustness of the technology. To this end, a 2.5 W fiber based laser source was developed as the transmitter for an optical communications experiment flown aboard the ISS as a part of a technology demonstration mission. The low cost system leveraged Mil Standard design principles and Telcordia certified components to the extent possible and was operated in a pressure vessel with active cooling. The laser was capable of high rate modulation but was limited by the mission requirements to 50 Mbps for downlinking stored video from the OPALS payload, externally mounted on the ISS. Environmental testing and space qualification of this unit will be discussed along with plans for a fully space qualified laser transmitter.
Li, Jia; Gao, Bei; Xu, Zhenming
2014-05-06
New recycling technologies have been developed lately to enhance the value of the fiberglass powder-resin powder fraction (FRP) from waste printed circuit boards. The aim of the present paper is to present some novel methods that use image forces for the separation of the resin powder and fiberglass powder generated from FRP during the corona electrostatic separating process. Particle shape characterization and particle trajectory simulation were performed on samples of mixed non-metallic particles. The simulation results pointed out that particles of resin powder and particles of fiberglass powder had different detachment trajectories for the same particle size and given device parameters. An experiment carried out using a corona electrostatic separator validated the possibility of sorting these particles based on the differences in their shape characteristics. The differences in the physical properties of the different types of particles provided the technical basis for the development of electrostatic separation technologies for the recycling industry.
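A highly simplified version of the trajectory idea: once a particle leaves the drum, it follows Newton's law under gravity plus the electric force, with material and shape entering through the effective charge-to-mass ratio. The uniform-field approximation and all parameter values below are invented for illustration and are not the paper's field or image-force model:

```python
# Toy 2D trajectory integration: different q/m ratios yield distinct landing
# points, which is the basis of separability. All numbers are illustrative.
import numpy as np

def trajectory(q_over_m, v0, E=(-2.0e5, 0.0), g=9.81, dt=1e-4, t_end=0.2):
    pos, vel = np.zeros(2), np.array(v0, float)
    path = [pos.copy()]
    for _ in range(int(t_end / dt)):
        acc = q_over_m * np.array(E) + np.array([0.0, -g])  # electric + gravity
        vel += acc * dt
        pos += vel * dt
        path.append(pos.copy())
    return np.array(path)

resin = trajectory(q_over_m=-5e-4, v0=(0.8, 0.0))  # invented q/m values
glass = trajectory(q_over_m=-2e-4, v0=(0.8, 0.0))
print(resin[-1], glass[-1])  # distinct end points -> separable streams
```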
A Framework to Expand and Advance Probabilistic Risk Assessment to Support Small Modular Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis Smith; David Schwieder; Robert Nourgaliev
2012-09-01
During the early development of nuclear power plants, researchers and engineers focused on many aspects of plant operation, two of which were getting the newly-found technology to work and minimizing the likelihood of perceived accidents through redundancy and diversity. As time, and our experience, has progressed, the realization of plant operational risk/reliability has entered into the design, operation, and regulation of these plants. But, to date, we have only dabbled at the surface of risk and reliability technologies. For the next generation of small modular reactors (SMRs), it is imperative that these technologies evolve into an accepted, encompassing, validated, and integral part of the plant in order to reduce costs and to demonstrate safe operation. Further, while it is presumed that safety margins are substantial for proposed SMR designs, the depiction and demonstration of these margins needs to be better understood in order to optimize the licensing process.
Rucio, the next-generation Data Management system in ATLAS
NASA Astrophysics Data System (ADS)
Serfon, C.; Barisits, M.; Beermann, T.; Garonne, V.; Goossens, L.; Lassnig, M.; Nairz, A.; Vigne, R.; ATLAS Collaboration
2016-04-01
Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quixote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 160 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio addresses these issues by relying on new technologies to ensure system scalability, cover new user requirements and employ a new automation framework to reduce operational overheads. This paper presents the key concepts of Rucio, details the Rucio design and the technology it employs, describes the tests that were conducted to validate it, and finally describes the migration steps that were taken to move from DQ2 to Rucio.
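Rucio's central abstraction is the declarative replication rule: keep N copies of a data identifier on storage matching an RSE expression. A minimal sketch using the Rucio Python client, assuming a configured client environment; the scope, dataset name, and RSE expression are placeholders:

```python
# Sketch of a declarative replication rule via the Rucio Python client.
# Values below are placeholders, not real ATLAS datasets or site tags.
from rucio.client import Client

client = Client()
client.add_replication_rule(
    dids=[{"scope": "data16", "name": "dataset.example"}],  # placeholder DID
    copies=2,                     # number of replicas Rucio must maintain
    rse_expression="tier=1",      # any storage element tagged tier=1
    lifetime=86400 * 30,          # optional: rule expires after 30 days
)
```

The point of the design is that users state the desired end state and the system continuously enforces it, rather than users issuing imperative transfer commands as in DQ2.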
Research and application on imaging technology of line structure light based on confocal microscopy
NASA Astrophysics Data System (ADS)
Han, Wenfeng; Xiao, Zexin; Wang, Xiaofen
2009-11-01
In 2005, the theory of line structure light confocal microscopy was first put forward in China by Xingyu Gao and Zexin Xiao at the Institute of Opt-mechatronics of Guilin University of Electronic Technology. Although the lateral resolution of line confocal microscopy can only reach or approach that of traditional point confocal microscopy, it has two advantages: first, by substituting line scanning for point scanning, plane imaging requires only one-dimensional scanning, which greatly improves imaging speed and simplifies the scanning mechanism; second, light throughput is greatly improved by substituting a detection slit for the detection pinhole, so a low-illumination CCD can be used directly to collect images instead of a photomultiplier. In order to apply line confocal microscopy to a practical system, and based on further research on the theory of line confocal microscopy, an imaging technology for line structure light is put forward under the condition that confocal microscopy is implemented. Its validity and reliability are also verified by experiments.
Content-Based VLE Designs Improve Learning Efficiency in Constructivist Statistics Education
Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward
2011-01-01
Background We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific–purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. Results The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a content–based design outperforms the traditional VLE–based design. PMID:21998652
Using CFD as a Rocket Injector Design Tool: Recent Progress at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Tucker, Kevin; West, Jeff; Williams, Robert; Lin, Jeff; Canabal, Francisco; Rocker, Marvin; Robles, Bryan; Garcia, Robert; Chenoweth, James
2005-01-01
New programs are forcing American propulsion system designers into unfamiliar territory. For instance, industry's answer to the cost and reliability goals set out by the Next Generation Launch Technology Program is engine concepts based on the Oxygen-Rich Staged Combustion (ORSC) Cycle. Historical injector design tools are not well suited for this new task. The empirical correlations do not apply directly to the injector concepts associated with the ORSC cycle. These legacy tools focus primarily on performance, with environment evaluation a secondary objective. Additionally, the environmental capability of these tools is usually one-dimensional while the actual environments are at least two- and often three-dimensional. CFD has the potential to calculate performance and multi-dimensional environments, but its use in the injector design process has been retarded by long solution turnaround times and insufficient demonstrated accuracy. This paper has documented the parallel paths of program support and technology development currently employed at Marshall Space Flight Center in an effort to move CFD to the forefront of injector design. MSFC has established a long-term goal for the use of CFD for combustion devices design. The work on injector design is the heart of that vision, and the Combustion Devices CFD Simulation Capability Roadmap focuses the vision. The SRL concept, combining solution fidelity, robustness and accuracy, has been established as a quantitative gauge of current and desired capability. Three examples of current injector analysis for program support have been presented and discussed. These examples are used to establish the current capability at MSFC for these problems. Shortcomings identified from this experience are being used as inputs to the Roadmap process. The SRL evaluation identified lack of demonstrated solution accuracy as a major issue. Accordingly, the MSFC view of code validation and current MSFC-funded validation efforts were discussed in some detail. The objectives of each effort were noted. Issues relative to code validation for injector design were discussed in some detail. The requirement for CFD support during the design of the experiment was noted and discussed in terms of instrumentation placement and experimental rig uncertainty. In conclusion, MSFC has made significant progress in the last two years in advancing CFD toward the goal of application to injector design. A parallel effort focused on program support and technology development via the SCIT Task has enabled this progress.
NASA Astrophysics Data System (ADS)
Gerhard, J.; Zanoni, M. A. B.; Torero, J. L.
2017-12-01
Smouldering (i.e., flameless combustion) underpins the technology Self-sustaining Treatment for Active Remediation (STAR). STAR achieves the in situ destruction of nonaqueous phase liquids (NAPLs) by generating a self-sustained smouldering reaction that propagates through the source zone. This research explores the nature of the travelling reaction and the influence of key in situ and engineered characteristics. A novel one-dimensional numerical model was developed (in COMSOL) to simulate the smouldering remediation of bitumen-contaminated sand. This model was validated against laboratory column experiments. Achieving model validation depended on correctly simulating the energy balance at the reaction front, including properly accounting for heat transfer, smouldering kinetics, and heat losses. Heat transfer between soil and air was demonstrated to be generally not at equilibrium. Moreover, existing heat transfer correlations were found to be inappropriate for the low air flow Reynolds numbers (Re < 30) relevant in this and similar thermal remediation systems. Therefore, a suite of experiments was conducted to generate a new heat transfer correlation, which produced correct simulations of convective heat flow through soil. Moreover, it was found that, for most cases of interest, a simple two-step pyrolysis/oxidation set of kinetic reactions was sufficient. Arrhenius parameters, calculated independently from thermogravimetric experiments, allowed the reaction kinetics to be validated in the smouldering model. Furthermore, a simple heat loss term sufficiently accounted for radial heat losses from the column. Altogether, these advances allow this simple model to reasonably predict the self-sustaining process, including the peak reaction temperature, the reaction velocity, and the complete destruction of bitumen behind the front. Simulations with the validated model revealed numerous unique insights, including how the system inherently recycles energy, how air flow rate and NAPL saturation dictate contaminant destruction rates, and the extremes that lead to extinction. Overall, this research provides unique insights into the complex interplay of thermochemical processes that govern the success of smouldering as well as other thermal remediation approaches.
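A minimal sketch of the two-step kinetics and the gas-solid (local thermal non-equilibrium) heat exchange described above; the Arrhenius parameters are placeholders, not the paper's thermogravimetrically fitted values:

```python
# Sketch of the model's source terms: Arrhenius rates for pyrolysis and
# oxidation, plus an interphase heat exchange term h*a*(Tg - Ts).
import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius(T, A, Ea):
    """k(T) = A * exp(-Ea / (R T))"""
    return A * np.exp(-Ea / (R * T))

def reaction_rates(T, m_fuel, m_o2, A_p=1e6, Ea_p=9e4, A_o=1e8, Ea_o=1.3e5):
    r_pyr = arrhenius(T, A_p, Ea_p) * m_fuel          # pyrolysis: fuel -> char
    r_oxi = arrhenius(T, A_o, Ea_o) * m_fuel * m_o2   # exothermic oxidation
    return r_pyr, r_oxi

def interphase_heat(T_gas, T_solid, h_sg, a_s):
    """Non-equilibrium gas-solid exchange; h_sg comes from a Re correlation."""
    return h_sg * a_s * (T_gas - T_solid)

r_p, r_o = reaction_rates(T=700.0, m_fuel=1.0, m_o2=0.2)
print(f"pyrolysis rate {r_p:.3e}, oxidation rate {r_o:.3e}")
```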
Design of the protoDUNE raw data management infrastructure
Fuess, S.; Illingworth, R.; Mengel, M.; ...
2017-10-01
The Deep Underground Neutrino Experiment (DUNE) will employ a set of Liquid Argon Time Projection Chambers (LArTPC) with a total mass of 40 kt as the main components of its Far Detector. In order to validate this technology and characterize the detector performance at full scale, an ambitious experimental program (called "protoDUNE") has been initiated which includes a test of the large-scale prototypes for the single-phase and dual-phase LArTPC technologies, which will run in a beam at CERN. The total raw data volume that is slated to be collected during the scheduled 3-month beam run is estimated to be in excess of 2.5 PB for each detector. This data volume will require that the protoDUNE experiment carefully design the DAQ, data handling, and data quality monitoring systems to be capable of dealing with challenges inherent in peta-scale data management while simultaneously fulfilling the requirements of disseminating the data to a worldwide collaboration and DUNE-associated computing sites. In this paper, we present our approach to solving these problems by leveraging the design, expertise, and components created for the LHC and Intensity Frontier experiments into a unified architecture that is capable of meeting the needs of protoDUNE.
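A back-of-envelope check of what the stated volume implies for sustained throughput; the 3-month and 2.5 PB figures come from the abstract, the averaging is ours:

```python
# Average data rate implied by 2.5 PB per detector over a ~3-month beam run.
PB = 1e15
volume = 2.5 * PB             # bytes, per detector
seconds = 90 * 24 * 3600      # ~3-month run
print(f"{volume / seconds / 1e6:.0f} MB/s average per detector")  # ~320 MB/s
print(f"{2 * volume / PB:.1f} PB total for both prototypes")      # 5.0 PB
```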
NASA Technical Reports Server (NTRS)
Saunders, J. D.; Stueber, T. J.; Thomas, S. R.; Suder, K. L.; Weir, L. J.; Sanders, B. W.
2012-01-01
Status is given on an effort to develop Turbine Based Combined Cycle (TBCC) propulsion. This propulsion technology can enable reliable and reusable space launch systems. TBCC propulsion offers improved performance and safety over rocket propulsion. The potential to realize aircraft-like operations and reduced maintenance are additional benefits. Among the most critical TBCC enabling technologies are: 1) mode transition from turbine to scramjet propulsion, 2) high Mach turbine engines, and 3) TBCC integration. To address these TBCC challenges, the effort is centered on a propulsion mode transition experiment and includes analytical research. The test program, the Combined-Cycle Engine Large Scale Inlet Mode Transition Experiment (CCE LIMX), was conceived to integrate TBCC propulsion with proposed hypersonic vehicles. The goals address: (1) dual inlet operability and performance, (2) mode-transition sequences enabling a switch between turbine and scramjet flow paths, and (3) turbine engine transients during transition. Four test phases are planned, from which a database can be used both to validate design and analysis codes and to characterize operability and integration issues for TBCC propulsion. In this paper we discuss the research objectives, features of the CCE hardware and test plans, and the status of the parametric inlet characterization testing which began in 2011. This effort is sponsored by the NASA Fundamental Aeronautics Hypersonics project.
Campbell, Jeffrey I; Aturinda, Isaac; Mwesigwa, Evans; Burns, Bridget; Santorino, Data; Haberer, Jessica E; Bangsberg, David R; Holden, Richard J; Ware, Norma C; Siedner, Mark J
2017-11-01
Although mobile health (mHealth) technologies have shown promise in improving clinical care in resource-limited settings (RLS), they are infrequently brought to scale. One limitation to the success of many mHealth interventions is inattention to end-user acceptability, which is an important predictor of technology adoption. We conducted in-depth interviews with 43 people living with HIV in rural Uganda who had participated in a clinical trial of a short messaging system (SMS)-based intervention designed to prompt return to clinic after an abnormal laboratory test. Interviews focused on established features of technology acceptance models, including perceived ease of use and perceived usefulness, and included open-ended questions to gain insight into unexplored issues related to the intervention's acceptability. We used conventional (inductive) and direct content analysis to derive categories describing use behaviors and acceptability. Interviews guided development of a proposed conceptual framework, the technology acceptance model for resource-limited settings (TAM-RLS). This framework incorporates both classic technology acceptance model categories as well as novel factors affecting use in this setting. Participants described how SMS message language, phone characteristics, and experience with similar technologies contributed to the system's ease of use. Perceived usefulness was shaped by the perception that the system led to augmented HIV care services and improved access to social support from family and colleagues. Emergent themes specifically related to mHealth acceptance among PLWH in Uganda included (1) the importance of confidentiality, disclosure, and stigma, and (2) the barriers and facilitators downstream from the intervention that impacted achievement of the system's target outcome. The TAM-RLS is a proposed model of mHealth technology acceptance based upon end-user experiences in rural Uganda. Although the proposed model requires validation, the TAM-RLS may serve as a useful tool to guide design and implementation of mHealth interventions.
Hydrography for the non-Hydrographer: A Paradigm shift in Data Processing
NASA Astrophysics Data System (ADS)
Malzone, C.; Bruce, S.
2017-12-01
Advancements in technology have led to overall systematic improvements, including hardware design, software architecture, and data transmission/telepresence. Historically, utilization of this technology has required a high level of knowledge obtained through many years of experience, training and/or education. High training costs are incurred to achieve and maintain an acceptable level of proficiency within an organization. Recently, engineers have developed off-the-shelf software technology called Qimera that has simplified the processing of hydrographic data. The core technology is centered around the isolation of tasks within the workflow, capitalizing on advances in computing technology to automate the mundane, error-prone tasks so that human attention goes to the stages where it adds the most value. Key design features include: guided workflow, transcription automation, processing state management, real-time QA, dynamic workflow for validation, collaborative cleaning and production line processing. Since Qimera is designed to guide the user, it allows expedition leaders to focus on science while providing an educational opportunity for students to quickly learn the hydrographic processing workflow, including ancillary data analysis, trouble-shooting, calibration and cleaning. This paper provides case studies on how Qimera is currently implemented in scientific expeditions, the benefits of implementation, and how it is directing the future of on-board research for the non-hydrographer.
Emerging Technologies for the Diagnosis of Perihilar Cholangiocarcinoma.
Rizvi, Sumera; Eaton, John; Yang, Ju Dong; Chandrasekhara, Vinay; Gores, Gregory J
2018-05-01
The diagnosis of malignant biliary strictures remains problematic, especially in the perihilar region and in primary sclerosing cholangitis (PSC). Conventional cytology obtained during endoscopic retrograde cholangiography (ERC)-guided brushings of biliary strictures is suboptimal due to limited sensitivity, although it remains the gold standard with a high specificity. Emerging technologies are being developed and validated to address this pressing unmet patient need. Such technologies include enhanced visualization of the biliary tree by cholangioscopy, intraductal ultrasound, and confocal laser endomicroscopy. Conventional cytology can be aided by employing complementary and advanced cytologic techniques such as fluorescence in situ hybridization (FISH), and this technique should be widely adopted. Interrogation of bile and serum by examining extracellular vesicle number and cargo, and exploiting next-generation sequencing and proteomic technologies, is also being explored. Examination of circulating cell-free deoxyribonucleic acid (cfDNA) for differentially methylated regions is a promising test which is being rigorously validated. The special expertise required for these analyses has to date hampered their validation and adoption. Herein, we review these emerging technologies to inform the reader of the progress made and to encourage further studies, as well as adoption of validated approaches.
FuGEFlow: data model and markup language for flow cytometry.
Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R
2009-06-16
Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer-reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) formalizes common aspects of comprehensive and high-throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt-compliant experiment description. The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets. Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to markup language in XML. Extending FuGE required significant effort, but in our experience the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately in facilitating data exchange, including with public flow cytometry repositories currently under development.
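The schema-validation step described, checking a generated Flow-ML document against an XSD, can be sketched generically with lxml; both file names below are placeholders:

```python
# Generic XSD validation of a generated XML document using lxml.
from lxml import etree

schema = etree.XMLSchema(etree.parse("fugeflow.xsd"))   # placeholder schema file
doc = etree.parse("experiment_miflowcyt.xml")           # placeholder document
if schema.validate(doc):
    print("document conforms to the schema")
else:
    for err in schema.error_log:
        print(err.line, err.message)  # line numbers and validation messages
```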
Hyper-X: Flight Validation of Hypersonic Airbreathing Technology
NASA Technical Reports Server (NTRS)
Rausch, Vincent L.; McClinton, Charles R.; Crawford, J. Larry
1997-01-01
This paper provides an overview of NASA's focused hypersonic technology program, i.e. the Hyper-X program. This program is designed to move hypersonic, air breathing vehicle technology from the laboratory environment to the flight environment, the last stage preceding prototype development. This paper presents some history leading to the flight test program, research objectives, approach, schedule and status. Substantial experimental data base and concept validation have been completed. The program is concentrating on Mach 7 vehicle development, verification and validation in preparation for wind tunnel testing in 1998 and flight testing in 1999. It is also concentrating on finalization of the Mach 5 and 10 vehicle designs. Detailed evaluation of the Mach 7 vehicle at the flight conditions is nearing completion, and will provide a data base for validation of design methods once flight test data are available.
Development and Validation of the Educational Technologist Multimedia Competency Survey
ERIC Educational Resources Information Center
Ritzhaupt, Albert D.; Martin, Florence
2014-01-01
The purpose of this research study was to identify the multimedia competencies of an educational technologist by creating a valid and reliable survey instrument to administer to educational technology professionals. The educational technology multimedia competency survey developed through this research is based on a conceptual framework that…
About the Cancer Biomarkers Research Group | Division of Cancer Prevention
The Cancer Biomarkers Research Group promotes research to identify, develop, and validate biological markers for early cancer detection and cancer risk assessment. Activities include development and validation of promising cancer biomarkers, collaborative databases and informatics systems, and new technologies or the refinement of existing technologies.
Validation of the Proficiency Examination for Diagnostic Radiologic Technology. Final Report.
ERIC Educational Resources Information Center
Educational Testing Service, Princeton, NJ.
The validity of the Proficiency Examination for Diagnostic Radiologic Technology was investigated, using 140 radiologic technologists who took both the written Proficiency Examination and a performance test. As an additional criterion measure of job proficiency, supervisors' assessments were obtained for 128 of the technologists. The resulting…
Radon mitigation for the SuperCDMS SNOLAB dark matter experiment
NASA Astrophysics Data System (ADS)
Street, J.; Bunker, R.; Miller, E. H.; Schnee, R. W.; Snyder, S.; So, J.
2018-01-01
A potential background for the SuperCDMS SNOLAB dark matter experiment is from radon daughters that have plated out onto detector surfaces. To reach desired backgrounds, understanding plate-out rates during detector fabrication as well as mitigating radon in surrounding air is critical. A radon mitigated cleanroom planned at SNOLAB builds upon a system commissioned at the South Dakota School of Mines & Technology (SD Mines). The ultra-low radon cleanroom at SD Mines has air supplied by a vacuum-swing-adsorption radon mitigation system that has achieved >1000× reduction for a cleanroom activity consistent with zero and <0.067 Bq m-3 at 90% confidence. Our simulation of this system, validated against calibration data, provides opportunity for increased understanding and optimization for this and future systems.
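The quoted 90% confidence upper limit is the standard Poisson construction: for n observed counts with expected background b, the limit mu solves P(N ≤ n; mu + b) = 1 - 0.90. A sketch with illustrative inputs rather than the experiment's actual counts:

```python
# Classical Poisson 90% CL upper limit on a signal rate given observed counts.
from scipy.stats import poisson
from scipy.optimize import brentq

def upper_limit(n_obs, bkg, cl=0.90):
    """Smallest mu such that P(N <= n_obs; mu + bkg) = 1 - cl."""
    f = lambda mu: poisson.cdf(n_obs, mu + bkg) - (1.0 - cl)
    return brentq(f, 0.0, 100.0)

print(upper_limit(n_obs=0, bkg=0.0))  # ~2.30 counts at 90% CL for zero observed
```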
Recent technology products from Space Human Factors research
NASA Technical Reports Server (NTRS)
Jenkins, James P.
1991-01-01
The goals of the NASA Space Human Factors program and the research carried out concerning human factors are discussed, with emphasis given to the development of human performance models, data, and tools. The major products from this program are described, which include the Laser Anthropometric Mapping System; a model of the human body for evaluating the kinematics and dynamics of human motion and strength in a microgravity environment; an operational experience data base for verifying and validating the data repository of manned space flights; the Operational Experience Database Taxonomy; and a human-computer interaction laboratory whose products are display software and requirements, and guideline documents and standards for human-computer interaction applications. Special attention is given to the 'Convoltron', a prototype version of a signal processor for synthesizing head-related transfer functions.
NASA Stennis Space Center integrated system health management test bed and development capabilities
NASA Astrophysics Data System (ADS)
Figueroa, Fernando; Holland, Randy; Coote, David
2006-05-01
Integrated System Health Management (ISHM) capability for rocket propulsion testing is rapidly evolving and promises substantial reduction in the time and cost of propulsion systems development, with substantially reduced operational costs and evolutionary improvements in launch system operational robustness. NASA Stennis Space Center (SSC), along with partners that include NASA, contractors, and academia, is investigating and developing technologies to enable ISHM capability in SSC's rocket engine test stands (RETS). This will enable validation and experience capture over a broad range of rocket propulsion systems of varying complexity. This paper describes key components that constitute necessary ingredients to make possible the implementation of credible ISHM capability in RETS, other NASA ground test and operations facilities, and ultimately spacecraft and space platforms and systems: (1) core technologies for ISHM, (2) RETS as ISHM testbeds, and (3) RETS systems models.
Review on the EFDA programme on tungsten materials technology and science
NASA Astrophysics Data System (ADS)
Rieth, M.; Boutard, J. L.; Dudarev, S. L.; Ahlgren, T.; Antusch, S.; Baluc, N.; Barthe, M.-F.; Becquart, C. S.; Ciupinski, L.; Correia, J. B.; Domain, C.; Fikar, J.; Fortuna, E.; Fu, C.-C.; Gaganidze, E.; Galán, T. L.; García-Rosales, C.; Gludovatz, B.; Greuner, H.; Heinola, K.; Holstein, N.; Juslin, N.; Koch, F.; Krauss, W.; Kurzydlowski, K. J.; Linke, J.; Linsmeier, Ch.; Luzginova, N.; Maier, H.; Martínez, M. S.; Missiaen, J. M.; Muhammed, M.; Muñoz, A.; Muzyk, M.; Nordlund, K.; Nguyen-Manh, D.; Norajitra, P.; Opschoor, J.; Pintsuk, G.; Pippan, R.; Ritz, G.; Romaner, L.; Rupp, D.; Schäublin, R.; Schlosser, J.; Uytdenhouwen, I.; van der Laan, J. G.; Veleva, L.; Ventelon, L.; Wahlberg, S.; Willaime, F.; Wurster, S.; Yar, M. A.
2011-10-01
All the recent DEMO design studies for helium-cooled divertors utilize tungsten materials and alloys, mainly due to their high-temperature strength, good thermal conductivity, low erosion, and comparatively low activation under neutron irradiation. The long-term objective of the EFDA fusion materials programme is to develop structural as well as armor materials, together with the necessary production and fabrication technologies, for future divertor concepts. The programmatic roadmap is structured into four engineering research lines, comprising fabrication process development, structural material development, armor material optimization, and irradiation performance testing, complemented by a fundamental research programme on "Materials Science and Modeling". This paper presents the current status of the EFDA experimental and testing investigations and gives a detailed overview of the latest results on fabrication, joining, high heat flux testing, plasticity, modeling, and validation experiments.
Teleglaucoma: improving access and efficiency for glaucoma care.
Kassam, Faazil; Yogesan, Kanagasingam; Sogbesan, Enitan; Pasquale, Louis R; Damji, Karim F
2013-01-01
Teleglaucoma is the application of telemedicine for glaucoma. We review and present the current literature on teleglaucoma; present our experience with teleglaucoma programs in Alberta, Canada, and Western Australia; and discuss the challenges and opportunities in this emerging field. Teleglaucoma is a novel area that was first explored a little over a decade ago, and early studies highlighted the technical challenges of delivering glaucoma care remotely. Advanced technologies have since emerged that show great promise in providing access to underserviced populations. Additionally, these technologies can improve the efficiency of healthcare systems burdened with an increasing number of patients with glaucoma and a limited supply of ophthalmologists. Additional benefits of teleglaucoma systems include e-learning and e-research. Further work is needed to fully validate this approach and to study its cost and comparative effectiveness relative to traditional models of healthcare.
4D pressure MRI: validation through in-vitro experiments and simulations
NASA Astrophysics Data System (ADS)
Schiavazzi, Daniele; Amili, Omid; Coletti, Filippo
2017-11-01
Advances in MRI scan technology and recently developed acquisition sequences have led to 4D flow MRI, a protocol capable of characterizing in-vivo hemodynamics in patients. The availability of phase-averaged, time-resolved, three-dimensional blood velocities has opened new opportunities for computing a wide spectrum of fully non-invasive hemodynamic indicators. In this regard, relative pressures play a particularly important role, as they are routinely employed in the clinic to detect cardiovascular abnormalities (e.g., in peripheral artery disease, valve stenosis, hypertension, etc.). In the first part of the talk, we discuss how relative pressures can be robustly computed through the solution of a pressure Poisson equation and how noise in the velocities affects their estimate. Routine clinical application of these techniques requires, however, thorough validation on multiple patients/anatomies and systematic comparison with in-vitro and simulated representations. Thus, the second part of the talk illustrates the use of numerical simulation and in-vitro experimental protocols to validate these indicators with reference to aortic and cerebral vascular anatomies.
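As a rough sketch of the pressure Poisson step described above, the toy code below recovers a relative pressure field from a steady, incompressible 2-D velocity field, neglecting viscous and transient terms; the grid, density, and iteration count are illustrative, and this is not the talk's actual solver.

```python
import numpy as np

def pressure_poisson(u, v, dx, rho=1060.0, n_iter=5000):
    """Toy relative-pressure solver: lap(p) = -rho * div((u.grad)u)."""
    du_dy, du_dx = np.gradient(u, dx)       # axis 0 = y, axis 1 = x
    dv_dy, dv_dx = np.gradient(v, dx)
    ax = u * du_dx + v * du_dy              # convective acceleration
    ay = u * dv_dx + v * dv_dy
    dax_dy, dax_dx = np.gradient(ax, dx)
    day_dy, day_dx = np.gradient(ay, dx)
    rhs = -rho * (dax_dx + day_dy)
    p = np.zeros_like(u)
    for _ in range(n_iter):                 # Jacobi relaxation
        p[1:-1, 1:-1] = 0.25 * (p[1:-1, :-2] + p[1:-1, 2:]
                                + p[:-2, 1:-1] + p[2:, 1:-1]
                                - dx**2 * rhs[1:-1, 1:-1])
        p[:, 0], p[:, -1] = p[:, 1], p[:, -2]   # homogeneous Neumann edges
        p[0, :], p[-1, :] = p[1, :], p[-2, :]
    return p - p.mean()                     # pressure is relative only

# Toy demo: solid-body-rotation-like field on a 64x64 grid.
n, dx = 64, 1e-3
y, x = np.meshgrid(np.arange(n) * dx, np.arange(n) * dx, indexing="ij")
u, v = -(y - y.mean()), (x - x.mean())
p = pressure_poisson(u, v, dx)
```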
Optical Closed-Loop Propulsion Control System Development
NASA Technical Reports Server (NTRS)
Poppel, Gary L.
1998-01-01
The overall objective of this program was to design and fabricate the components required for optical closed-loop control of a F404-400 turbofan engine, building on the experience of the NASA Fiber Optic Control System Integration (FOCSI) program. Evaluating the performance of fiber optic technology at the component and system levels will help validate its use on aircraft engines. This report includes descriptions of three test plans. The EOI Acceptance Test is designed to demonstrate satisfactory functionality of the EOI: primarily, fail-safe throughput of the F404 sensor signals in the normal mode, and validation, switching, and output of the five analog sensor signals generated from validated optical sensor inputs in the optical mode. The EOI System Test is designed to demonstrate acceptable F404 ECU functionality as interfaced with the EOI, making use of a production ECU test stand. The Optical Control Engine Test Request describes the planned hardware installation, optical signal calibrations, data system coordination, test procedures, and data signal comparisons for an engine test demonstration of the optical closed-loop control.
Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd
2016-03-01
The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) in monitoring process plants. A companion paper describes how the measure was developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback from participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected from process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on the monitoring of process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) in monitoring process plants in representative settings.
Experimental characterization of an adaptive aileron: lab tests and FE correlation
NASA Astrophysics Data System (ADS)
Amendola, Gianluca; Dimino, Ignazio; Amoroso, Francesco; Pecora, Rosario
2016-04-01
Like any other technology, morphing has to demonstrate system-level performance benefits prior to implementation on a real aircraft. Current morphing structures research efforts (such as those sponsored by the European Union) involve the design of several subsystems that have to be individually tested in order to consolidate their general performance in view of the final integration into a flyable device. This requires a fundamental understanding of the interaction between aerodynamics, structure, and control systems. Major worldwide research collaborations have been established to exchange acquired experience and better investigate innovative technologies for morphing structures. The "Adaptive Aileron" project is a joint cooperation between Canadian and Italian research centers and leading industries. In this framework, an overview of the design, manufacturing, and testing of a variable-camber aileron for a regional aircraft is presented. The key enabling technology for the presented morphing aileron is the actuation structural system, integrating a suitable motor and a load-bearing architecture. The paper describes the lab test campaign of the developed device. The implementation of a distributed actuation system reflects the current trend in aeronautical research toward the use of electrical power to supply non-propulsive systems. The aileron design features are validated by targeted experimental tests, demonstrating its adaptive capability, its robustness under operative loads, and its dynamic behavior for further aeroelastic analyses. The experimental results show satisfactory correlation with the numerical expectations, thus validating the design approach.
NASA Technical Reports Server (NTRS)
Jeracki, Robert J. (Technical Monitor); Topol, David A.; Ingram, Clint L.; Larkin, Michael J.; Roche, Charles H.; Thulin, Robert D.
2004-01-01
This report presents results of the work completed on the preliminary design of Fan 3 of NASA's 22-inch Fan Low Noise Research project. Fan 3 was intended to build on the experience gained from Fans 1 and 2 by demonstrating noise reduction technology that surpasses 1992 levels by 6 dB. The work was performed as part of NASA's Advanced Subsonic Technology (AST) program. Work on this task was conducted in the areas of CFD code validation, acoustic prediction and validation, rotor parametric studies, and fan exit guide vane (FEGV) studies, up to the time when a NASA decision was made to cancel the design, fabrication, and testing phases of the work. The scope of the program changed accordingly to concentrate on two subtasks: (1) rig data analysis and CFD code validation and (2) fan and FEGV optimization studies. The results of the CFD code validation work showed that this tool predicts 3D flowfield features well from the blade trailing edge to about a chord downstream. The CFD tool loses accuracy as the distance from the trailing edge increases beyond a blade chord. The comparisons of noise predictions to rig test data showed that both the tone noise tool and the broadband noise tool demonstrated reasonable agreement with the data, to the degree that these tools can reliably be used for design work. The section on rig airflow and inlet separation analysis describes the method used to determine total fan airflow, shows the good agreement of predicted boundary layer profiles with measured profiles, and shows separation angles of attack ranging from 29.5 to 27 deg for the range of airflows tested. The results of the rotor parametric studies were significant in leading to the decision not to pursue a new rotor design for Fan 3 and resulted in recommendations to concentrate efforts on FEGV stator designs. The ensuing parametric study on FEGV designs showed the potential for 8 to 10 EPNdB noise reduction relative to the baseline.
Adjustable ECR Ion Source Control System: Ion Source Hydrogen Positive Project
NASA Astrophysics Data System (ADS)
Arredondo, I.; Eguiraun, M.; Jugo, J.; Piso, D.; del Campo, M.; Poggi, T.; Varnasseri, S.; Feuchtwanger, J.; Bilbao, J.; Gonzalez, X.; Harper, G.; Muguira, L.; Miracoli, R.; Corres, J.; Belver, D.; Echevarria, P.; Garmendia, N.; Gonzalez, P.; Etxebarria, V.
2015-06-01
The ISHP (Ion Source Hydrogen Positive) project consists of a highly versatile ECR-type ion source. It was built for several purposes: in the first stages, to serve as a workbench for testing accelerator-related technologies and validating in-house developments; and, later, to provide an ion source suitable as the first stage of an actual LINAC. Since this paper focuses on the control system of ISHP, all the hardware and its control architecture, besides the ion source itself, is presented. The ion source can currently generate pulses of positive hydrogen ions from 2 μs to a few ms, with a repetition rate from 1 Hz to 50 Hz and a maximum current of 45 mA. Furthermore, the first experiments with the White Rabbit (WR) synchronization system are presented.
A holistic approach to SIM platform and its application to early-warning satellite system
NASA Astrophysics Data System (ADS)
Sun, Fuyu; Zhou, Jianping; Xu, Zheyao
2018-01-01
This study proposes a new simulation platform named Simulation Integrated Management (SIM) for the analysis of parallel and distributed systems. The platform eases the process of designing and testing both applications and architectures. The main characteristics of SIM are flexibility, scalability, and expandability. To improve the efficiency of project development, new models of an early-warning satellite system were designed based on the SIM platform. Finally, through a series of experiments, the correctness of the SIM platform and the early-warning satellite models was validated, and systematic analyses of the orbit-determination precision for a ballistic missile over its entire flight were presented, along with the deviation of the launch/landing point. Furthermore, the causes of the deviation and methods for its prevention are fully explained. The simulation platform and the models lay the foundations for further validation of autonomy technology in space attack-defense architecture research.
NASA Astrophysics Data System (ADS)
Deng, Dong; Cheng, Jiang-ping; Zhang, Sheng-chang; Ge, Fang-gen
2017-08-01
As a clean fuel, LNG is widely used in heavy vehicles in China. Before reaching the engine for combustion, LNG is stored in a high-vacuum multi-layer insulated tank and must be evaporated from its cryogenic state into natural gas. The cold energy available during this evaporation has been calculated. A concept is proposed in which separated-type heat pipe technology is employed to use this available cold energy for automotive air-conditioning. An experiment has been conducted to validate the proposal. It is found to be feasible to use a separated-type heat pipe to convey cold energy from the LNG to the automotive air-conditioning, and the cooling capacity of the air-conditioning increases with increasing LNG consumption and air flow rate.
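For orientation, a back-of-envelope estimate of the cold energy per kilogram of LNG can be written down directly; the property values below (latent heat, vapor heat capacity) are rounded textbook figures, not the paper's measured data.

```python
# Rough cold-energy estimate for LNG regasification (illustrative values;
# a real calculation would use equation-of-state property data).
latent_heat = 510.0                  # kJ/kg, vaporization near 1 atm (approx.)
cp_gas = 2.2                         # kJ/(kg*K), methane vapor (approx.)
t_storage, t_ambient = 111.0, 298.0  # K

cold_energy = latent_heat + cp_gas * (t_ambient - t_storage)  # kJ/kg
lng_rate = 10.0 / 3600.0             # kg/s, hypothetical 10 kg/h consumption
cooling_power = cold_energy * lng_rate                        # kW
print(f"~{cold_energy:.0f} kJ/kg available; ~{cooling_power:.2f} kW at 10 kg/h")
```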
FastaValidator: an open-source Java library to parse and validate FASTA formatted sequences.
Waldmann, Jost; Gerken, Jan; Hankeln, Wolfgang; Schweer, Timmy; Glöckner, Frank Oliver
2014-06-14
Advances in sequencing technologies challenge the efficient importing and validation of FASTA formatted sequence data, which is still a prerequisite for most bioinformatic tools and pipelines. Comparative analysis of commonly used Bio*-frameworks (BioPerl, BioJava and Biopython) shows that their scalability and accuracy are limited. FastaValidator represents a platform-independent, standardized, lightweight software library written in the Java programming language. It targets computer scientists and bioinformaticians writing software that needs to parse large amounts of sequence data quickly and accurately. For end-users, FastaValidator includes an interactive out-of-the-box validation of FASTA formatted files, as well as a non-interactive mode designed for high-throughput validation in software pipelines. The accuracy and performance of the FastaValidator library qualify it for large data sets such as those commonly produced by massively parallel (NGS) technologies. It offers scientists a fast, accurate and standardized method for parsing and validating FASTA formatted sequence data.
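FastaValidator itself is a Java library; the Python sketch below only illustrates the kind of streaming checks such a validator performs (header/sequence alternation, a restricted residue alphabet) and is not the library's API.

```python
import re

# Permissive nucleotide/ambiguity alphabet; a protein validator would differ.
VALID_SEQ = re.compile(r"^[ACGTUNRYSWKMBDHVacgtunryswkmbdhv\-\*]+$")

def validate_fasta(path):
    """Stream a FASTA file and return (ok, message). Minimal illustration."""
    has_record, expecting_seq = False, False
    with open(path) as fh:
        for lineno, line in enumerate(fh, 1):
            line = line.rstrip("\n")
            if not line:
                continue
            if line.startswith(">"):
                if expecting_seq:
                    return False, f"line {lineno}: header with no sequence"
                has_record, expecting_seq = True, True
            elif VALID_SEQ.match(line):
                if not has_record:
                    return False, f"line {lineno}: sequence before any header"
                expecting_seq = False
            else:
                return False, f"line {lineno}: illegal character in sequence"
    if expecting_seq:
        return False, "file ends with an empty record"
    return (True, "valid") if has_record else (False, "no FASTA records found")
```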
Shavers, M R; Cucinotta, F A; Miller, J; Zeitlin, C; Heilbronn, L; Wilson, J W; Singleterry, R C
2001-01-01
Radiological assessment of the many cosmic ion species with widely distributed energies requires the use of theoretical transport models to accurately describe the diverse physical processes related to nuclear reactions in spacecraft structures, planetary atmospheres and surfaces, and tissues. Heavy-ion transport models designed to characterize shielded radiation fields have been validated through comparison with data from thick-target irradiation experiments at particle accelerators. The RTD Mission brings a unique opportunity to validate existing radiation transport models and guide the development of tools for shield design. For the first time, transport properties will be measured in free space to characterize the shielding effectiveness of materials likely to be aboard interplanetary space missions. Target materials composed of aluminum, advanced composite spacecraft structure and other shielding materials, helium (a propellant), and tissue-equivalent matrices will be evaluated. Large solid-state detectors will provide kinetic energy and charge identification for incident heavy ions and for secondary ions created in the target material. Transport calculations using the HZETRN model suggest that targets 8 g cm⁻² thick would be adequate to evaluate shielding effectiveness during solar-minimum activity conditions over a period of 30 days or more.
‘My Virtual Dream’: Collective Neurofeedback in an Immersive Art Environment
Kovacevic, Natasha; Ritter, Petra; Tays, William; Moreno, Sylvain; McIntosh, Anthony Randal
2015-01-01
While human brains are specialized for complex and variable real world tasks, most neuroscience studies reduce environmental complexity, which limits the range of behaviours that can be explored. Motivated to overcome this limitation, we conducted a large-scale experiment with electroencephalography (EEG) based brain-computer interface (BCI) technology as part of an immersive multi-media science-art installation. Data from 523 participants were collected in a single night. The exploratory experiment was designed as a collective computer game where players manipulated mental states of relaxation and concentration with neurofeedback targeting modulation of relative spectral power in alpha and beta frequency ranges. Besides validating robust time-of-night effects, gender differences and distinct spectral power patterns for the two mental states, our results also show differences in neurofeedback learning outcome. The unusually large sample size allowed us to detect unprecedented speed of learning changes in the power spectrum (~ 1 min). Moreover, we found that participants' baseline brain activity predicted subsequent neurofeedback beta training, indicating state-dependent learning. Besides revealing these training effects, which are relevant for BCI applications, our results validate a novel platform engaging art and science and fostering the understanding of brains under natural conditions. PMID:26154513
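A minimal sketch of the quantity the neurofeedback targeted, relative spectral power in a frequency band, computed here with Welch's method on a synthetic one-channel epoch; the band edges and parameters are conventional choices, not necessarily those of the study.

```python
import numpy as np
from scipy.signal import welch

def relative_band_power(eeg, fs, band, total=(1.0, 40.0)):
    """Relative power of `band` within `total`, for one EEG channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    in_band = (freqs >= band[0]) & (freqs < band[1])
    in_total = (freqs >= total[0]) & (freqs < total[1])
    return np.trapz(psd[in_band], freqs[in_band]) / np.trapz(
        psd[in_total], freqs[in_total])

fs = 256.0
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # toy 10 Hz
alpha = relative_band_power(eeg, fs, (8.0, 12.0))
beta = relative_band_power(eeg, fs, (13.0, 30.0))
print(f"relative alpha {alpha:.2f}, beta {beta:.2f}")
```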
An image-based automatic recognition method for the flowering stage of maize
NASA Astrophysics Data System (ADS)
Yu, Zhenghong; Zhou, Huabing; Li, Cuina
2018-03-01
In this paper, we propose an image-based approach for automatically recognizing the flowering stage of maize. A modified HOG/SVM detection framework is first adopted to detect the ears of maize. Then, low-rank matrix recovery is used to precisely extract the ears at the pixel level. Finally, a new feature called the color gradient histogram is proposed as an indicator to determine the flowering stage. Comparative experiments have been carried out to verify the validity of our method, and the results indicate that it can meet the demands of practical observation.
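The paper only names the color gradient histogram feature, so the sketch below is one plausible construction, a histogram of gradient magnitudes per color channel; the binning and normalization are guesses for illustration.

```python
import numpy as np

def color_gradient_histogram(img, bins=16):
    """Histogram of gradient magnitudes per RGB channel (one plausible
    reading of a 'color gradient histogram'; the paper's exact recipe
    may differ). `img` is an HxWx3 float array in [0, 1]."""
    feats = []
    for c in range(3):
        gy, gx = np.gradient(img[..., c])
        mag = np.hypot(gx, gy)
        hist, _ = np.histogram(mag, bins=bins, range=(0.0, 1.0), density=True)
        feats.append(hist)
    return np.concatenate(feats)          # 3*bins feature vector

rng = np.random.default_rng(0)
patch = rng.random((64, 64, 3))           # stand-in for an extracted ear region
print(color_gradient_histogram(patch).shape)   # (48,)
```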
Trust Building in Virtual Communities
NASA Astrophysics Data System (ADS)
Mezgár, István
By using different types of communication networks, various groups of people can come together according to their private or business interests, forming a Virtual Community. In these communities, cooperation and collaboration play an important role. As trust is the basis of all human interactions, this is even more true in the case of virtual communities. According to different experiments, the level of trust in virtual communities is highly influenced by the mode of communication and by the duration of contact. The paper discusses ways of building trust, focusing on communication technologies and security aspects in virtual communities.
Deep Space 1 is prepared for launch
NASA Technical Reports Server (NTRS)
1998-01-01
Workers in the Payload Hazardous Servicing Facility prepare Deep Space 1 for launch aboard a Boeing Delta 7326 rocket in October. The first flight in NASA's New Millennium Program, Deep Space 1 is designed to validate 12 new technologies for scientific space missions of the next century. Onboard experiments include an ion propulsion engine and software that tracks celestial bodies so the spacecraft can make its own navigation decisions without the intervention of ground controllers. Most of its mission objectives will be completed within the first two months. A near-Earth asteroid, 1992 KD, has also been selected for a possible flyby.
High-speed civil transport issues and technology program
NASA Technical Reports Server (NTRS)
Hewett, Marle D.
1992-01-01
A strawman program plan is presented, consisting of the technology developments and demonstrations required to support the construction of a high-speed civil transport. The plan includes a compilation of technology issues related to the development of such a transport. The issues represent technical areas in which research and development are required before airframe manufacturers can pursue an HSCT development. The vast majority of the technical issues presented require flight-demonstrated and validated solutions before a transport development will be undertaken by industry. The author believes that NASA is the agency best suited to address flight demonstration issues in a concentrated effort. The new Integrated Test Facility at NASA Dryden Flight Research Facility is considered ideally suited to the task of supporting ground validations of proof-of-concept and prototype system demonstrations before flight demonstrations. An elaborate ground hardware-in-the-loop (iron bird) simulation supported in this facility provides a viable alternative to developing an expensive full-scale prototype transport technology demonstrator. Dryden's SR-71 assets, modified appropriately, are a suitable test-bed for supporting flight demonstrations and validations of certain transport technology solutions. A subscale, manned or unmanned flight demonstrator is suitable for flight validation of transport technology solutions if appropriate structural similarity relationships can be established. The author contends that developing a full-scale prototype transport technology demonstrator is the best way to ensure that a positive decision to develop a transport is reached by the United States aerospace industry.
Engine Validation of Noise and Emission Reduction Technology Phase I
NASA Technical Reports Server (NTRS)
Weir, Don (Editor)
2008-01-01
This final report has been prepared by Honeywell Aerospace, Phoenix, Arizona, a unit of Honeywell International, Inc., documenting work performed during the period December 2004 through August 2007 for the NASA Glenn Research Center, Cleveland, Ohio, under the Revolutionary Aero-Space Engine Research (RASER) Program, Contract No. NAS3-01136, Task Order 8, Engine Validation of Noise and Emission Reduction Technology Phase I. The NASA Task Manager was Dr. Joe Grady of the NASA Glenn Research Center. The NASA Contract Officer was Mr. Albert Spence of the NASA Glenn Research Center. This report is for a test program in which NASA funded engine validations of integrated technologies that reduce aircraft engine noise. These technologies address the reduction of engine fan and jet noise, and noise associated with propulsion/airframe integration. The results of these tests will be used by NASA to identify the engineering tradeoffs associated with the technologies that are needed to enable advanced engine systems to meet stringent goals for the reduction of noise. The objectives of this program are to (1) conduct system engineering and integration efforts to define the engine test-bed configuration; (2) develop selected noise reduction technologies to a technical maturity sufficient to enable engine testing and validation of those technologies in the FY06-07 time frame; (3) conduct engine tests designed to gain insight into the sources, mechanisms and characteristics of noise in the engines; and (4) establish baseline engine noise measurements for subsequent use in the evaluation of noise reduction.
Performance of technology-driven simulators for medical students--a systematic review.
Michael, Michael; Abboudi, Hamid; Ker, Jean; Shamim Khan, Mohammed; Dasgupta, Prokar; Ahmed, Kamran
2014-12-01
Simulation-based education has evolved as a key training tool in high-risk industries such as aviation and the military. In parallel with these industries, the benefits of incorporating specialty-oriented simulation training within medical schools are vast. Adoption of simulators into medical school education programs has shown great promise and has the potential to revolutionize modern undergraduate education. An English-literature search was carried out using the MEDLINE, EMBASE, and PsycINFO databases to identify all randomized controlled studies pertaining to "technology-driven" simulators used in undergraduate medical education. A validity framework incorporating the "framework for technology enhanced learning" report by the Department of Health, United Kingdom, was used to evaluate the capabilities of each technology-driven simulator. Information was collected regarding the simulator type, characteristics, and brand name. Where possible, we extracted information from the studies on the simulators' performance with respect to validity status, reliability, feasibility, educational impact, acceptability, and cost effectiveness. We identified 19 studies, analyzing simulators for medical students across a variety of procedure-based specialties, including cardiovascular (n = 2), endoscopy (n = 3), laparoscopic surgery (n = 8), vascular access (n = 2), ophthalmology (n = 1), obstetrics and gynecology (n = 1), anesthesia (n = 1), and pediatrics (n = 1). Incorporation of simulators has so far been on an institutional level; no national or international trends have yet emerged. Simulators are capable of providing a highly educational and realistic experience for medical students within a variety of specialty-oriented teaching sessions. Further research is needed to establish how best to incorporate simulators into a more primary stage of medical education: preclinical and clinical undergraduate medicine. Copyright © 2014 Elsevier Inc. All rights reserved.
Modeling fission product vapor transport in the Falcon facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shepherd, I.M.; Drossinos, Y.; Benson, C.G.
1995-05-01
An extensive database of aerosol experiments exists and has been used for checking aerosol transport codes. Data for fission product vapor transport are harder to find. Some qualitative data are available, but the Falcon thermal gradient tube tests carried out at AEA Technology's laboratories in Winfrith, England, mark the first serious attempt to provide a set of experiments suitable for the validation of codes that predict the transport and condensation of realistic mixtures of fission product vapors. Four of these have been analyzed to check how well the computer code VICTORIA can predict the most important phenomena. Of the four experiments studied, two are reference cases (FAL-17 and FAL-19), one is a case without boric acid (FAL-18), and the other was run in a reducing atmosphere (FAL-20). The results show that once the vapors condense onto aerosols, VICTORIA can predict their deposition rather well. The dominant mechanism is thermophoresis, and each element deposits with more or less the same deposition velocity. The behavior of the vapors is harder to interpret. Essentially, it is important to know the temperature at which each element condenses. It is clear from the measurements that this temperature changed from test to test, caused mostly by differing speciation as the composition of the carrier gas and the relative concentrations of other fission products changed. Only in the test with a steam atmosphere and without boric acid was the assumption valid that most of the iodine is cesium iodide and most of the cesium is cesium hydroxide. In general, VICTORIA predicts that, with the exception of cesium, there will be less variation in the speciation, and hence in the deposition, between tests than is in fact observed. VICTORIA underpredicts the volatility of most elements; this is partly a consequence of the ideal-solution assumption and partly an overestimation of vapor/aerosol interactions.
Commercial Supersonics Technology Project - Status of Airport Noise
NASA Technical Reports Server (NTRS)
Bridges, James
2016-01-01
The Commercial Supersonic Technology Project has been developing databases, computational tools, and system models in preparation for a level 1 milestone, the Low Noise Propulsion Tech Challenge, to be delivered in September 2016. The steps taken to prepare for the final validation test are given, including system analysis, code validation, and risk reduction testing.
Space Technology 5: Changing the Mission Design without Changing the Hardware
NASA Technical Reports Server (NTRS)
Carlisle, Candace C.; Webb, Evan H.; Slavin, James A.
2005-01-01
The Space Technology 5 (ST-5) Project is part of NASA's New Millennium Program. The validation objectives are to demonstrate the research-quality science capability of the ST-5 spacecraft; to operate the three spacecraft as a constellation; and to design, develop, test and flight-validate three capable micro-satellites with new technologies. A three-month flight demonstration phase is planned, beginning in March 2006. This year, the mission was re-planned for a dedicated Pegasus XL launch into an elliptical polar orbit (instead of the originally planned geosynchronous transfer orbit). The re-plan allows the mission to achieve the same high-level technology validation objectives with a different launch vehicle. The new mission design involves a revised science validation strategy, a new orbit, and a different communication strategy, while minimizing changes to the ST-5 spacecraft itself. The constellation operations concepts have also been refined. While the system engineers, orbit analysts, and operations teams were re-planning the mission, the implementation team continued to make progress on the flight hardware. Most components have been delivered, and the first spacecraft is well into integration and test.
The EGS Collab Project: Stimulation Investigations for Geothermal Modeling Analysis and Validation
NASA Astrophysics Data System (ADS)
Blankenship, D.; Kneafsey, T. J.
2017-12-01
The US DOE's EGS Collab project team is establishing a suite of intermediate-scale (10-20 m) field test beds for coupled stimulation and interwell flow tests. The multi-laboratory and university team is designing the tests to compare measured data to models, to improve the measurement and modeling toolsets available for use in field sites and investigations such as DOE's Frontier Observatory for Research in Geothermal Energy (FORGE) Project. Our tests will be well-controlled, in situ experiments focused on rock fracture behavior, seismicity, and permeability enhancement. Pre- and post-test modeling will allow for model prediction and validation. High-quality, high-resolution geophysical and other fracture characterization data will be collected, analyzed, and compared with models and field observations to further elucidate the basic relationships between stress, induced seismicity, and permeability enhancement. Coring through the stimulated zone after the tests will provide fracture characteristics that can be compared to monitoring data and model predictions. We will also observe and quantify other key governing parameters that impact permeability, and attempt to understand how these parameters might change throughout the development and operation of an Enhanced Geothermal System (EGS) project, with the goal of enabling commercial viability of EGS. The Collab team will perform three major experiments over the three-year project duration. Experiment 1, intended to investigate hydraulic fracturing, will be performed in the Sanford Underground Research Facility (SURF) at 4,850 feet depth and will build on kISMET Project findings. Experiment 2 will be designed to investigate hydroshearing. Experiment 3 will investigate changes in fracturing strategies and will be further specified as the project proceeds. The tests will provide quantitative insights into the nature of stimulation (e.g., hydraulic fracturing, hydroshearing, mixed-mode fracturing, thermal fracturing) in crystalline rock under reservoir-like stress conditions and will generate high-quality, high-resolution, diverse data sets for simulation, allowing model validation. Monitoring techniques will also be evaluated under controlled conditions, identifying technologies appropriate for deeper full-scale EGS sites.
Dykes, Patricia C; Hurley, Ann C; Brown, Suzanne; Carr, Robyn; Cashen, Margaret; Collins, Rita; Cook, Robyn; Currie, Leanne; Docherty, Charles; Ensio, Anneli; Foster, Joanne; Hardiker, Nicholas R; Honey, Michelle L L; Killalea, Rosaleen; Murphy, Judy; Saranto, Kaija; Sensmeier, Joyce; Weaver, Charlotte
2009-01-01
In 2005, the Healthcare Information Management Systems Society (HIMSS) Nursing Informatics Community developed a survey, the I-HIT Scale, to measure the impact of health information technology (HIT) on the role of nurses and on interdisciplinary communication in hospital settings. In 2007, nursing informatics colleagues from Australia, England, Finland, Ireland, New Zealand, Scotland and the United States formed a research collaborative to validate the I-HIT across countries. All teams have completed construct and face validation in their countries. Five of the six teams have initiated reliability testing by practicing nurses. This paper reports the international collaborative's validation of the I-HIT Scale completed to date.
Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition
NASA Technical Reports Server (NTRS)
Ewing, Anthony; Adams, Charles
2004-01-01
Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficiently to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.
Lessons Learned During Implementation and Early Operations of the DS1 Beacon Monitor Experiment
NASA Technical Reports Server (NTRS)
Sherwood, Rob; Wyatt, Jay; Hotz, Henry; Schlutsmeyer, Alan; Sue, Miles
1998-01-01
A new approach to mission operations will be flight-validated on NASA's New Millennium Program Deep Space One (DS1) mission, which launched in October 1998. The Beacon Monitor Operations Technology is aimed at decreasing the total volume of downlinked engineering telemetry by reducing the frequency of downlink and the volume of data received per pass. Cost savings are achieved by reducing the amount of routine telemetry processing and analysis performed by ground staff. The technology is required for upcoming NASA missions to Pluto, Europa, and possibly other missions. With beacon monitoring, the spacecraft will assess its own health and will transmit one of four beacon messages, each represented by a unique frequency tone, to inform the ground how urgently the spacecraft needs to be tracked for telemetry. If all conditions are nominal, the tone provides periodic assurance to ground personnel that the mission is proceeding as planned, without their having to receive and analyze downlinked telemetry. If there is a problem, the tone will indicate that tracking is required, and the resulting telemetry will contain a concise summary of what has occurred since the last telemetry pass. The primary components of the technology are a tone monitoring technology, AI-based software for onboard engineering data summarization, and a ground response system. In addition, there is a ground visualization system for telemetry summaries. This paper includes a description of the beacon monitor concept, the trade-offs in adapting that concept as a technology experiment, the current state of the resulting implementation on DS1, and our lessons learned during the initial checkout phase of the mission. Applicability to future missions is also included.
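A toy sketch of the beacon concept: onboard summarization reduces health assessments to a single one-of-four urgency tone. The enum names and severity interface below are hypothetical, not the DS1 flight software.

```python
from enum import Enum

class BeaconTone(Enum):
    NOMINAL = 1        # all is well, no tracking needed
    INTERESTING = 2    # non-urgent event recorded; track when convenient
    IMPORTANT = 3      # degraded condition; track soon
    URGENT = 4         # anomaly requires ground intervention

def select_tone(alarms):
    """Map onboard health assessments to a single urgency tone.

    `alarms` is an iterable of severity levels 0-3 produced by the
    onboard summarization software (hypothetical interface)."""
    worst = max(alarms, default=0)
    return BeaconTone(min(worst + 1, 4))

print(select_tone([0, 0, 1]))   # BeaconTone.INTERESTING
```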
Telepressure and College Student Employment: The Costs of Staying Connected Across Social Contexts.
Barber, Larissa K; Santuzzi, Alecia M
2017-02-01
Telepressure is a psychological state consisting of the preoccupation and urge to respond quickly to message-based communications from others. Telepressure has been linked with negative stress and health outcomes, but the existing measure focuses on experiences specific to the workplace. The current study explores whether an adapted version of the workplace telepressure measure is relevant to general social interactions that rely on information and communication technologies. We validated a general telepressure measure in a sample of college students and found psychometric properties similar to the original workplace measure. Also, general telepressure was related to, but distinct from, the fear of missing out, self-control and technology use. Using a predictive validity design, we also found that telepressure at the beginning of the semester was related to student reports of burnout, perceived stress and poor sleep hygiene 1 month later (but not work-life balance or general life satisfaction). Moreover, telepressure was more strongly related to more negative outcomes (burnout, stress and poor sleep hygiene) and less positive outcomes (work-life balance and life satisfaction) among employed compared with non-employed students. Thus, the costs of staying connected to one's social network may be more detrimental to college students with additional employment obligations. Copyright © 2016 John Wiley & Sons, Ltd.
Xia, Dunzhu; Yao, Yanhong; Cheng, Limei
2017-06-15
In this paper, we aimed to achieve the indoor tracking control of a two-wheeled inverted pendulum (TWIP) vehicle. The attitude data are acquired from a low cost micro inertial measurement unit (IMU), and the ultra-wideband (UWB) technology is utilized to obtain an accurate estimation of the TWIP's position. We propose a dual-loop control method to realize the simultaneous balance and trajectory tracking control for the TWIP vehicle. A robust adaptive second-order sliding mode control (2-RASMC) method based on an improved super-twisting (STW) algorithm is investigated to obtain the control laws, followed by several simulations to verify its robustness. The outer loop controller is designed using the idea of backstepping. Moreover, three typical trajectories, including a circle, a trifolium and a hexagon, have been designed to prove the adaptability of the control combinations. Six different combinations of inner and outer loop control algorithms have been compared, and the characteristics of inner and outer loop algorithm combinations have been analyzed. Simulation results demonstrate its tracking performance and thus verify the validity of the proposed control methods. Trajectory tracking experiments in a real indoor environment have been performed using our experimental vehicle to further validate the feasibility of the proposed algorithm in practice.
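For reference, the standard super-twisting algorithm that 2-RASMC builds on can be sketched in a few lines; the gains, disturbance, and first-order sliding dynamics below are illustrative and do not reproduce the paper's adaptive variant.

```python
import numpy as np

def super_twisting(s, v, k1, k2, dt):
    """One Euler step of the standard super-twisting algorithm.

    s : sliding variable, v : integrator state. Returns (u, v).
    The paper's 2-RASMC adds adaptive gains on this basic structure."""
    u = -k1 * np.sqrt(abs(s)) * np.sign(s) + v
    v = v - k2 * np.sign(s) * dt
    return u, v

# Toy demo: drive a first-order sliding variable s_dot = u + disturbance.
s, v, dt = 1.0, 0.0, 1e-3
for step in range(5000):
    u, v = super_twisting(s, v, k1=2.0, k2=4.0, dt=dt)
    s += (u + 0.3 * np.sin(2 * np.pi * step * dt)) * dt
print(f"|s| after 5 s: {abs(s):.4f}")   # should settle near zero
```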
Quantitative fluorescence angiography for neurosurgical interventions.
Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute
2013-06-01
Present methods for the quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--an established method to visualize blood flow in brain vessels--enhanced by a quantifying perfusion software tool. For this purpose, the fluorescent dye indocyanine green is given intravenously, and after excitation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurements, phantom experiments, and computer simulations under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.
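As an illustration of the kind of quantities such a tool extracts, the sketch below pulls two simple descriptors (time to peak, maximum inflow slope) from a synthetic indocyanine-green time-intensity curve; the clinical software's actual parameter definitions may differ.

```python
import numpy as np

def perfusion_parameters(intensity, fps):
    """Simple descriptors of an ICG time-intensity curve (illustrative)."""
    t = np.arange(intensity.size) / fps
    i_peak = int(np.argmax(intensity))
    ttp = t[i_peak]                               # time to peak, s
    slope = np.max(np.gradient(intensity, t))     # max inflow slope, 1/s
    return ttp, slope

fps = 25.0
t = np.arange(0, 20, 1 / fps)
curve = np.clip(1 - np.exp(-(t - 3) / 2), 0, None) * np.exp(-t / 15)  # toy bolus
print(perfusion_parameters(curve, fps))
```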
SatCom Systems for Health and Medical Care
NASA Astrophysics Data System (ADS)
Diez, Hubert
2002-01-01
Convinced since 1997 that the satellite is capable of providing real added value, either autonomously or as a complement to terrestrial infrastructures, CNES (the French Space Agency) began a determined programme of study, validation and demonstration of new satellite services. Several experiments and projects have been set up at the national, European and worldwide levels. Each of them (tele-consultation, distance education, tele-epidemiology, tele-echography, assistance to people, training and therapeutic assistance, disaster telemedicine, etc.) uses well-suited satcom services: telecommunications for broadcasting, multicasting and downloading; localization and positioning; and low, medium and high data rate bidirectional systems. Medical reference people are associated with each pilot project, first to define the needs but also to manage the medical validation aspects. Our aim is always to test, validate and adapt these new services to bring them into line with the users' expectations. The added value of these technologies in sustainable healthcare services is systematically demonstrated in real situations, with real users, both in terms of quality of service and of economic validity. For several projects, CNES has developed dedicated hardware or technical platforms. The main projects and their relevant satcom systems are presented and discussed: tele-consultation, distance learning and training, assistance to people, tele-epidemiology, and disaster telemedicine.
Inertial Measurement Units for Clinical Movement Analysis: Reliability and Concurrent Validity
Nicholas, Kevin; Sparkes, Valerie; Sheeran, Liba; Davies, Jennifer L
2018-01-01
The aim of this study was to investigate the reliability and concurrent validity of a commercially available Xsens MVN BIOMECH inertial-sensor-based motion capture system during clinically relevant functional activities. A clinician with no prior experience of motion capture technologies and an experienced clinical movement scientist each assessed 26 healthy participants within each of two sessions using a camera-based motion capture system and the MVN BIOMECH system. Participants performed overground walking, squatting, and jumping. Sessions were separated by 4 ± 3 days. Reliability was evaluated using intraclass correlation coefficient and standard error of measurement, and validity was evaluated using the coefficient of multiple correlation and the linear fit method. Day-to-day reliability was generally fair-to-excellent in all three planes for hip, knee, and ankle joint angles in all three tasks. Within-day (between-rater) reliability was fair-to-excellent in all three planes during walking and squatting, and poor-to-high during jumping. Validity was excellent in the sagittal plane for hip, knee, and ankle joint angles in all three tasks and acceptable in frontal and transverse planes in squat and jump activity across joints. Our results suggest that the MVN BIOMECH system can be used by a clinician to quantify lower-limb joint angles in clinically relevant movements. PMID:29495600
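For readers unfamiliar with the statistic, a compact implementation of one common ICC form, ICC(2,1) (two-way random effects, absolute agreement, single measurement), is sketched below on synthetic day-to-day data; the abstract does not state which ICC form was used, so this is illustrative.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `data` is an (n_subjects, k_raters) array."""
    n, k = data.shape
    grand = data.mean()
    ms_r = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_c = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    ss_e = ((data - data.mean(axis=1, keepdims=True)
             - data.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = ss_e / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(1)
true = rng.normal(30, 5, size=(20, 1))               # e.g., peak joint angle
sessions = true + rng.normal(0, 1.5, size=(20, 2))   # two measurement days
print(f"ICC(2,1) = {icc_2_1(sessions):.2f}")
```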
The New Millennium Program Space Technology 5 (ST-5) Mission
NASA Technical Reports Server (NTRS)
Webb, Evan H.; Carlisle, Candace C.; Slavin, James A.
2005-01-01
The Space Technology 5 (ST-5) Project is part of NASA's New Millennium Program. ST-5 will consist of a constellation of three 25kg microsatellites. The mission goals are to demonstrate the research-quality science capability of the ST-5 spacecraft; to operate the three spacecraft as a constellation; and to design, develop and flight-validate three capable microsatellites with new technologies. ST-5 will be launched by a Pegasus XL into an elliptical polar (sun-synchronous) orbit. The three-month flight demonstration phase, beginning in March 2006, will validate the ability to perform science measurements, as well as the technologies and constellation operations. ST-5's technologies and concepts will enable future microsatellite science missions.
Development and Validation of an Internet Use Attitude Scale
ERIC Educational Resources Information Center
Zhang, Yixin
2007-01-01
This paper describes the development and validation of a new 40-item Internet Attitude Scale (IAS), a one-dimensional inventory for measuring Internet attitudes. The first experiment initiated a generic Internet attitude questionnaire, ensured construct validity, and examined factorial validity and reliability. The second experiment further…
Teaching "Instant Experience" with Graphical Model Validation Techniques
ERIC Educational Resources Information Center
Ekstrøm, Claus Thorn
2014-01-01
Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach that provides "instant experience" in looking at a graphical model validation plot, so that it becomes easier to judge whether any of the underlying assumptions are violated.
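One way to provide that "instant experience" is to show many validation plots generated under a model whose assumptions hold, so the reader calibrates what "no pattern" looks like. The sketch below builds such a panel of residual-vs-fitted plots; it follows the spirit of the approach, not the article's exact tool.

```python
import numpy as np
import matplotlib.pyplot as plt

# A 3x3 panel of residual-vs-fitted plots simulated under a correct linear
# model, so a student learns what acceptable residual scatter looks like.
rng = np.random.default_rng(42)
fig, axes = plt.subplots(3, 3, figsize=(9, 9), sharex=True, sharey=True)
for ax in axes.flat:
    x = rng.uniform(0, 10, 50)
    y = 2.0 + 0.5 * x + rng.normal(0, 1, 50)   # assumptions hold by design
    beta = np.polyfit(x, y, 1)
    fitted = np.polyval(beta, x)
    ax.scatter(fitted, y - fitted, s=10)
    ax.axhline(0.0, linewidth=0.8)
fig.suptitle("Residuals vs fitted under a valid model")
plt.show()
```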
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fort, James A.; Pfund, David M.; Sheen, David M.
2007-04-01
The MFDRC was formed in 1998 to advance the state-of-the-art in simulating multiphase turbulent flows by developing advanced computational models for gas-solid flows that are experimentally validated over a wide range of industrially relevant conditions. The goal was to transfer the resulting validated models to interested US commercial CFD software vendors, who would then propagate the models as part of new code versions to their customers in the US chemical industry. Since the lack of detailed data sets at industrially relevant conditions is the major roadblock to developing and validating multiphase turbulence models, a significant component of the work involved flow measurements on an industrial-scale riser contributed by Westinghouse, which was subsequently installed at SNL. Model comparisons were performed against these datasets by LANL. A parallel Office of Industrial Technology (OIT) project within the consortium made similar comparisons between riser measurements and models at NETL. Measured flow quantities of interest included volume fraction, velocity, and velocity-fluctuation profiles for both gas and solid phases at various locations in the riser. Some additional techniques were required for these measurements beyond what was currently available. PNNL's role on the project was to work with the SNL experimental team to develop and test two new measurement techniques, acoustic tomography and millimeter-wave velocimetry. Acoustic tomography is a promising technique for gas-solid flow measurements in risers, and PNNL has substantial related experience in this area. PNNL is also active in developing millimeter-wave imaging techniques, and this technology presents an additional approach to making the desired measurements. PNNL supported the advanced-diagnostics development part of this project by evaluating these techniques, adapting the selected technology to bulk gas-solids flows, and implementing it for testing in the SNL riser testbed.
ERIC Educational Resources Information Center
Hidiroglu, Çaglar Naci; Bukova Güzel, Esra
2013-01-01
The aim of the present study is to conceptualize the approaches used for model validation and the thought processes involved in mathematical modeling carried out in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…
ERIC Educational Resources Information Center
Awofala, Adeneye O. A.; Fatade, Alfred O.
2015-01-01
Introduction: Investigation into the factor structure of the Domains of Creativity Scale has been ongoing for some time. The purpose of this study was to test the validity of the Kaufman Domains of Creativity Scale on Nigerian preservice science, technology, and mathematics teachers. Method: Exploratory and confirmatory factor analyses were performed…
[Care with the child's health and validation of an educational technology for riverside families].
Teixeira, Elizabeth; de Almeida Siqueira, Aldo; da Silva, Joselice Pereira; Lavor, Lília Cunha
2011-01-01
This study aimed to assess knowledge and ways of caring for the health of children aged 0-5 years among riverside families (Phase 1) and to validate an educational technology (Phase 2). A descriptive qualitative study was carried out. Focus groups and content analysis were used with the mothers, and forms were used with specialist judges and the target public. The study revealed that concern with the care of a child among riverside families permeates daily adversity, with dedication and commitment by these families in maintaining their children's health. Sensitive listening to the mothers indicated the need for a closer relationship between nursing professionals and the family. The validation of the educational technology was convergent, within parameters considered adequate.
NASA's Microgravity Science Program
NASA Technical Reports Server (NTRS)
Salzman, Jack A.
1994-01-01
Since the late 1980s, the NASA Microgravity Science Program has implemented a systematic effort to expand microgravity research. In 1992, 114 new investigators were selected to enter the program and more US microgravity experiments were conducted in space than in all the years combined since Skylab (1973-74). The use of NASA Research Announcements (NRA's) to solicit research proposals has proven to be highly successful in building a strong base of high-quality peer-reviewed science in both the ground-based and flight experiment elements of the program. The ground-based part of the program provides facilities for low gravity experiments including drop towers and aircraft for making parabolic flights. Program policy is that investigations should not proceed to the flight phase until all ground-based investigative capabilities have been exhausted. In the space experiments program, the greatest increase in flight opportunities has been achieved through dedicated or primary payload Shuttle missions. These missions will continue to be augmented by both mid-deck and GAS-Can accommodated experiments. A US-Russian cooperative flight program envisioned for 1995-97 will provide opportunities for more microgravity research as well as technology demonstration and systems validation efforts important for preparing for experiment operations on the Space Station.
Reproducibility of Differential Proteomic Technologies in CPTAC Fractionated Xenografts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tabb, David L.; Wang, Xia; Carr, Steven A.
2016-03-04
The NCI Clinical Proteomic Tumor Analysis Consortium (CPTAC) employed a pair of reference xenograft proteomes for initial platform validation and ongoing quality control of its data collection for The Cancer Genome Atlas (TCGA) tumors. These two xenografts, representing basal and luminal-B human breast cancer, were fractionated and analyzed on six mass spectrometers in a total of 46 replicates divided between iTRAQ and label-free technologies, spanning a total of 1095 LC-MS/MS experiments. These data represent a unique opportunity to evaluate the stability of proteomic differentiation by mass spectrometry over many months of time for individual instruments or across instruments running dissimilar workflows. We evaluated iTRAQ reporter ions, label-free spectral counts, and label-free extracted ion chromatograms as strategies for data interpretation. From these assessments we found that differential genes from a single replicate were confirmed by other replicates on the same instrument 61-93% of the time. When comparing across different instruments and quantitative technologies, differential genes were reproduced by other data sets 67-99% of the time. Projecting gene differences onto biological pathways and networks increased the similarities. These overlaps send an encouraging message about the maturity of technologies for proteomic differentiation.
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.
1995-01-01
NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.
Designing communication and remote controlling of virtual instrument network system
NASA Astrophysics Data System (ADS)
Lei, Lin; Wang, Houjun; Zhou, Xue; Zhou, Wenjian
2005-01-01
In this paper, a virtual instrument network over a LAN, with remote control of the virtual instruments, is realized based on virtual instrument technology and the LabWindows/CVI software platform. The virtual instrument network system is made up of three subsystems: a server subsystem, a telnet client subsystem, and a local instrument control subsystem. The paper describes the structure of the LabWindows-based virtual instrument network in detail and introduces the essential techniques: the application design of the network communication, the client/server programming model, the realization of communication between a remote PC and the server, the handover of control authority among workstations, and the server program. The virtual instrument network can also be connected to the wider Internet. The technology has been verified through its application in an electronic-measurement virtual instrument network that is already in operation; experiment and application validate the practical value and effectiveness of this design.
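The abstract describes the client/server design only at a high level. As an illustration of the command/response pattern such a system relies on, here is a minimal sketch; all names (INSTRUMENT_PORT, handle_command) and the command strings are hypothetical, and the original system was implemented in LabWindows/CVI rather than Python.

```python
# Minimal sketch of the command/response pattern in a remote instrument
# control server. Hypothetical names; the original used LabWindows/CVI.
import socket
import threading

INSTRUMENT_PORT = 5025  # hypothetical choice; any free TCP port works

def handle_command(cmd: str) -> str:
    """Dispatch a text command to the (here, simulated) local instrument."""
    if cmd == "MEAS:VOLT?":
        return "1.234"              # stand-in for a real measurement
    if cmd.startswith("CONF:"):
        return "OK"                 # acknowledge a configuration change
    return "ERR:unknown command"

srv = socket.create_server(("", INSTRUMENT_PORT))   # server subsystem

def serve() -> None:
    """Accept one client and answer its commands line by line."""
    conn, _ = srv.accept()
    with conn, conn.makefile("rw") as f:
        for line in f:
            f.write(handle_command(line.strip()) + "\n")
            f.flush()

threading.Thread(target=serve, daemon=True).start()

# Client subsystem: send a command over the LAN and read the reply.
with socket.create_connection(("localhost", INSTRUMENT_PORT)) as c:
    with c.makefile("rw") as f:
        f.write("MEAS:VOLT?\n")
        f.flush()
        print(f.readline().strip())  # -> 1.234
```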
González-Ferrer, Arturo; Valcárcel, María Ángel
2018-04-01
The Cohesion and Quality Act of the National Health System promotes the use of new technologies to make it possible for health professionals to put scientific evidence into practice. To this end, there are technological tools, known as computer-interpretable guidelines, which can help achieve this goal from an innovation perspective. They can be adopted using an iterative process and have great initial potential as tools for education, for patient quality and safety, and for decision making; optionally, once rigorously validated, they can be integrated with the electronic health record. This article presents updates on these tools, reviews international projects and first-hand experiences in which they have demonstrated their value, and highlights the advantages, risks, and limitations they present from a clinical point of view. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
Verification of NASA Emergent Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study with which to experiment with and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.
Field Analysis of Microbial Contamination Using Three Molecular Methods in Parallel
NASA Technical Reports Server (NTRS)
Morris, H.; Stimpson, E.; Schenk, A.; Kish, A.; Damon, M.; Monaco, L.; Wainwright, N.; Steele, A.
2010-01-01
Advanced technologies with the capability of detecting microbial contamination remain an integral tool for the next stage of space agency proposed exploration missions. To maintain a clean, operational spacecraft environment with minimal potential for forward contamination, such technology is a necessity, particularly, the ability to analyze samples near the point of collection and in real-time both for conducting biological scientific experiments and for performing routine monitoring operations. Multiple molecular methods for detecting microbial contamination are available, but many are either too large or not validated for use on spacecraft. Two methods, the adenosine triphosphate (ATP) and Limulus Amebocyte Lysate (LAL) assays, have been approved by the NASA Planetary Protection Office for the assessment of microbial contamination on spacecraft surfaces. We present the first parallel field analysis of microbial contamination pre- and post-cleaning using these two methods as well as universal primer-based polymerase chain reaction (PCR).
NGNP Data Management and Analysis System Analysis and Web Delivery Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cynthia D. Gentillon
2010-09-01
Projects for the Very High Temperature Reactor Technology Development Office provide data in support of Nuclear Regulatory Commission licensing of the very high temperature reactor. Fuel and materials to be used in the reactor are tested and characterized to quantify performance in high-temperature and high-fluence environments. In addition, thermal-hydraulic experiments are conducted to validate codes used to assess reactor safety. The Very High Temperature Reactor Technology Development Office has established the NGNP Data Management and Analysis System (NDMAS) at the Idaho National Laboratory to ensure that very high temperature reactor data are (1) qualified for use, (2) stored in a readily accessible electronic form, and (3) analyzed to extract useful results. This document focuses on the third NDMAS objective. It describes capabilities for displaying the data in meaningful ways and for data analysis to identify useful relationships among the measured quantities.
The Weak Stability Boundary, A Gateway for Human Exploration of Space
NASA Technical Reports Server (NTRS)
Mendell, Wendell W.
2000-01-01
NASA plans for future human exploration of the Solar System describe only missions to Mars. Before such missions can be initiated, much study remains to be done in technology development, mission operations and human performance. While, for example, technology validation and operational experience could be gained in the context of lunar exploration missions, a NASA lunar program is seen as a competitor to a Mars mission rather than a step towards it. The recently characterized Weak Stability Boundary in the Earth-Moon gravitational field may provide an operational approach to all types of planetary exploration, and infrastructure developed for a gateway to the Solar System may be a programmatic solution for exploration that avoids the fractious bickering between Mars and Moon advocates. This viewpoint proposes utilizing the concept of Greater Earth to educate policy makers, opinion makers and the public about these subtle attributes of our space neighborhood.
NASA Astrophysics Data System (ADS)
Lopez-Baeza, Ernesto; Geraldo Ferreira, A.; Saleh-Contell, Kauzar
Space technology provides humanity and science with a revolutionary global view of the Earth through the acquisition of Earth Observation satellite data. Satellites capture information over different spatial and temporal scales and assist in understanding natural climate processes and in detecting and explaining climate change. Accurate Earth Observation data are needed to describe climate processes by improving the parameterisations of different climate elements. Algorithms to produce geophysical parameters from raw satellite observations should go through selection processes or participate in inter-comparison programmes to ensure performance reliability. Geophysical parameter datasets, obtained from satellite observations, should pass a quality control before they are accepted in global databases for impact, diagnostic or sensitivity studies. Calibration and Validation, or simply "Cal/Val", is the activity that endeavours to ensure that remote sensing products are highly consistent and reproducible. This is an evolving scientific activity that is becoming increasingly important as more long-term studies on global change are undertaken, and new satellite missions are launched. Calibration is the process of quantitatively defining the system responses to known, controlled signal inputs. Validation refers to the process of assessing, by independent means, the quality of the data products derived from the system outputs. These definitions are generally accepted and most often used in the remote sensing context to refer specifically and respectively to sensor radiometric calibration and geophysical parameter validation. Anchor Stations are carefully selected locations at which instruments measure quantities that are needed to run, calibrate or validate models and algorithms. These are needed to quantitatively evaluate satellite data and convert it into geophysical information. The instruments collect measurements of basic quantities over a long timescale. Measurements are made of meteorological and hydrological background data, and of quantities not readily assessed at operational stations. Anchor Stations also offer infrastructure to undertake validation experiments. These are more detailed measurements over shorter intensive observation periods. The Valencia Anchor Station is showing its capabilities and conditions as a reference validation site in the framework of low spatial resolution remote sensing missions such as CERES, GERB and SMOS. The Alacant Anchor Station is a reference site in studies on the interactions between desertification and climate. This paper presents the activities so far carried out at both Anchor Stations, the precise and detailed ground and aircraft experiments carefully designed to develop a specific methodology to validate low spatial resolution satellite data and products, and the knowledge exchange currently being exercised between the University of Valencia, Spain, and FUNCEME, Brazil, in common objectives of mutual interest.
PEPE is installed on Deep Space 1 in the PHSF
NASA Technical Reports Server (NTRS)
1998-01-01
The Plasma Experiment for Planetary Exploration (PEPE), one of two advanced science experiments flying on the Deep Space 1 mission, is prepared for installation on the spacecraft in the Payload Hazardous Servicing Facility. PEPE combines several instruments that study space plasma in one compact 13-pound (6-kilogram) package. Space plasma is composed of charged particles, most of which flow outward from the Sun. The first flight in NASA's New Millennium Program, Deep Space 1 is designed to validate 12 new technologies for scientific space missions of the next century. The spacecraft is scheduled to launch during a period opening Oct. 15 and closing Nov. 10, 1998. Most of its mission objectives will be completed within the first two months. A near-earth asteroid, 1992 KD, has also been selected for a possible flyby.
EVALUATING ROBOT TECHNOLOGIES AS TOOLS TO EXPLORE RADIOLOGICAL AND OTHER HAZARDOUS ENVIRONMENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis W. Nielsen; David I. Gertman; David J. Bruemmer
2008-03-01
There is a general consensus that robots could be beneficial in performing tasks within hazardous radiological environments. Most control of robots in hazardous environments involves master-slave or teleoperation relationships between the human and the robot. While teleoperation-based solutions keep humans out of harm's way, they also change the training requirements to accomplish a task. In this paper we present a research methodology that allowed scientists at Idaho National Laboratory to identify, develop, and prove a semi-autonomous robot solution for search and characterization tasks within a hazardous environment. Two experiments are summarized that validated the use of semi-autonomy and show that robot autonomy can help mitigate some of the performance differences between operators who have different levels of robot experience, and can improve performance over teleoperated systems.
The ATLAS Data Acquisition System: from Run 1 to Run 2
NASA Astrophysics Data System (ADS)
Panduro Vazquez, William; ATLAS Collaboration
2016-04-01
The experience gained during the first period of very successful data taking of the ATLAS experiment (Run 1) has inspired a number of ideas for improvement of the Data Acquisition (DAQ) system that are being put in place during the so-called Long Shutdown 1 of the Large Hadron Collider (LHC), in 2013/14. We have updated the data-flow architecture, rewritten an important fraction of the software and replaced hardware, profiting from state-of-the-art technologies. This paper summarizes the main changes that have been applied to the ATLAS DAQ system and highlights the expected performance and functional improvements that will be available for the LHC Run 2. Particular emphasis is put on explaining the reasons for our architectural and technical choices, as well as on the simulation and testing approach used to validate this system.
NASA Astrophysics Data System (ADS)
Yu, Siyuan; Wu, Feng; Wang, Qiang; Tan, Liying; Ma, Jing
2017-11-01
Acquisition and recognition of the beacon is the core technology for establishing a satellite optical link. In order to acquire the beacon correctly, the beacon image must first be recognized, excluding the influence of the background light. In this process, many factors influence the recognition precision of the beacon. This paper studies the constraint boundary conditions for acquiring the beacon from the perspectives of theory and experiment, and an adaptive segmentation method for satellite-ground laser communications is also proposed. Finally, a long-distance laser communication experiment (11.16 km) verifies the validity of this method; the tracking error with the method is the smallest compared with the traditional approaches. The method helps to greatly improve the tracking precision in satellite-ground laser communications.
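The abstract does not detail the segmentation rule. As generic background only, the sketch below shows one common form of adaptive segmentation for a bright beacon spot: a per-frame threshold derived from the image statistics, followed by an intensity-weighted centroid. The threshold rule and the parameter k are assumptions, not the paper's method.

```python
# Illustrative sketch (not the paper's algorithm): adaptive threshold
# segmentation of a beacon spot followed by a centroid estimate.
import numpy as np

def detect_beacon(img: np.ndarray, k: float = 3.0):
    """Segment pixels brighter than mean + k*std of the frame, then
    return the intensity-weighted centroid (x, y) of the beacon spot.
    The threshold rule and k are assumptions for illustration."""
    thr = img.mean() + k * img.std()     # adaptive: derived from this frame
    mask = img > thr
    if not mask.any():
        return None                      # no beacon found
    ys, xs = np.nonzero(mask)
    w = img[ys, xs].astype(float)
    return (xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()

# Synthetic frame: noisy background plus a bright spot near (x=40, y=25).
rng = np.random.default_rng(0)
frame = rng.normal(100, 5, (64, 64))
frame[24:27, 39:42] += 120
print(detect_beacon(frame))              # approximately (40.0, 25.0)
```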
Illness as a Crisis of Meaning.
Hauskeller, Michael
2018-05-21
In Phenomenological Bioethics: Medical Technologies, Human Suffering, and the Meaning of Being Alive, the Swedish philosopher Fredrik Svenaeus aims to show how the continental tradition of phenomenology can enrich bioethical debates by adding important but often-ignored perspectives, namely, that of lived experience. Phenomenology focuses not on supposedly objective, scientifically validated facts, but on the "life world" of the individuals affected by a situation. Individuals' life worlds consist of their experience of their own lived bodies (or Leiber) and the meaning structures of their everyday worlds. A phenomenologically informed and oriented bioethics would seek to take those life worlds into account when considering what should be done in a particular ethically challenging situation. The fundamental insight that Svenaeus develops in his new book is that our illnesses are often, if not always, crises of meaning. © 2018 The Hastings Center.
Malhotra, Karan; Buraimoh, Olatunbosun; Thornton, James; Cullen, Nicholas; Singh, Dishan; Goldberg, Andrew J
2016-06-20
Objectives: To determine whether an entirely electronic system can be used to capture both patient-reported outcomes (electronic Patient-Reported Outcome Measures, ePROMs) and clinician-validated diagnostic and complexity data in an elective surgical orthopaedic outpatient setting, and to examine patients' experience of this system and the factors impacting their experience. Design: Retrospective analysis of prospectively collected data from a single-centre series. Setting: Outpatient clinics at an elective foot and ankle unit in the UK. Participants: All new adult patients attending elective orthopaedic outpatient clinics over a 32-month period. Interventions: All patients were invited to complete ePROMs prior to attending their outpatient appointment. At their appointment, those patients who had not completed ePROMs were offered the opportunity to complete them on a tablet device with technical support. Matched diagnostic and complexity data were captured by the treating consultant during the appointment. Outcome measures: Capture rates of patient-reported and clinician-reported data; all information technology (IT) failures, language and disability barriers; and patients' ratings of their experience of using ePROMs. The scoring systems used included EQ-5D-5L, the Manchester-Oxford Foot Questionnaire (MOxFQ) and the Visual Analogue Scale (VAS) pain score. Results: Of 2534 new patients, 2176 (85.9%) completed ePROMs, of whom 1090 (50.09%) completed ePROMs at home/work prior to their appointment; 31.5% used a mobile (smartphone/tablet) device. Clinician-reported data were captured on 2491 patients (98.3%). The mean patient experience score of using Patient-Reported Outcome Measures (PROMs) was 8.55±1.85 out of 10, and 666 patients (30.61%) left comments. Of patients leaving comments, 214 (32.13%) felt ePROMs did not adequately capture their symptoms, and these patients had significantly lower patient experience scores (p<0.001). Conclusions: This study demonstrates the successful implementation of technology into a service improvement programme. Excellent capture rates of ePROMs and clinician-validated diagnostic data can be achieved within a National Health Service setting. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Flexible structure control laboratory development and technology demonstration
NASA Technical Reports Server (NTRS)
Vivian, H. C.; Blaire, P. E.; Eldred, D. B.; Fleischer, G. E.; Ih, C.-H. C.; Nerheim, N. M.; Scheid, R. E.; Wen, J. T.
1987-01-01
An experimental structure is described which was constructed to demonstrate and validate recent emerging technologies in the active control and identification of large flexible space structures. The configuration consists of a large, 20 foot diameter antenna-like flexible structure in the horizontal plane with a gimballed central hub, a flexible feed-boom assembly hanging from the hub, and 12 flexible ribs radiating outward. Fourteen electrodynamic force actuators mounted to the hub and to the individual ribs provide the means to excite the structure and exert control forces. Thirty permanently mounted sensors, including optical encoders and analog induction devices, provide measurements of structural response at widely distributed points. An experimental remote optical sensor provides sixteen additional sensing channels. A computer samples the sensors, computes the control updates and sends commands to the actuators in real time, while simultaneously displaying selected outputs on a graphics terminal and saving them in memory. Several control experiments have been conducted thus far and are documented. These include implementation of distributed parameter system control, model reference adaptive control, and static shape control. These experiments have demonstrated the successful implementation of state-of-the-art control approaches using actual hardware.
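As a sketch of the sample-compute-command cycle described above (generic structure only; all names are hypothetical and this is not the laboratory's software):

```python
# Generic real-time control loop of the sample -> compute -> command type
# described above. All names are hypothetical.
import time

def control_loop(read_sensors, compute_update, send_commands,
                 period_s: float = 0.01, steps: int = 1000):
    """Sample sensors, compute the control update, command the actuators,
    and log selected outputs, once per fixed period."""
    log = []
    next_t = time.monotonic()
    for _ in range(steps):
        y = read_sensors()          # 30+ channels in the real system
        u = compute_update(y)       # e.g. an adaptive or shape-control law
        send_commands(u)            # 14 electrodynamic force actuators
        log.append((y, u))          # saved in memory for later analysis
        next_t += period_s
        time.sleep(max(0.0, next_t - time.monotonic()))
    return log

# Stub demo: a simple damping law driving a fake one-channel plant to zero.
state = {"y": 1.0}
log = control_loop(lambda: state["y"],
                   lambda y: -0.5 * y,
                   lambda u: state.update(y=state["y"] + u),
                   period_s=0.001, steps=5)
print([round(y, 3) for y, _ in log])   # [1.0, 0.5, 0.25, 0.125, 0.062]
```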
Mandujano-Ramírez, Humberto J; González-Vázquez, José P; Oskam, Gerko; Dittrich, Thomas; Garcia-Belmonte, Germa; Mora-Seró, Iván; Bisquert, Juan; Anta, Juan A
2014-03-07
Many recent advances in novel solar cell technologies are based on charge separation in disordered semiconductor heterojunctions. In this work we use the Random Walk Numerical Simulation (RWNS) method to model the dynamics of electrons and holes in two disordered semiconductors in contact. Miller-Abrahams hopping rates and a tunnelling distance-dependent electron-hole annihilation mechanism are used to model transport and recombination, respectively. To test the validity of the model, three numerical "experiments" have been devised: (1) in the absence of constant illumination, charge separation has been quantified by computing surface photovoltage (SPV) transients. (2) By applying a continuous generation of electron-hole pairs, the model can be used to simulate a solar cell under steady-state conditions. This has been exploited to calculate open-circuit voltages and recombination currents for an archetypical bulk heterojunction solar cell (BHJ). (3) The calculations have been extended to nanostructured solar cells with inorganic sensitizers to study, specifically, non-ideality in the recombination rate. The RWNS model, in combination with exponential disorder and an activated tunnelling mechanism for transport and recombination, is shown to correctly reproduce charge separation parameters in these three "experiments". This provides a theoretical basis to study relevant features of novel solar cell technologies.
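For reference, the Miller-Abrahams hopping rate named above has the standard form (a textbook expression, not quoted from this paper):

```latex
\nu_{ij} = \nu_0 \exp\!\left(-\frac{2 r_{ij}}{\alpha}\right) \times
\begin{cases}
\exp\!\left(-\dfrac{E_j - E_i}{k_B T}\right), & E_j > E_i,\\[1.2ex]
1, & E_j \le E_i,
\end{cases}
```

where \nu_0 is an attempt frequency, r_{ij} the hop distance, \alpha the localization length, and E_i, E_j the site energies; hops downward in energy carry no thermal penalty.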
Theory and experiments in model-based space system anomaly management
NASA Astrophysics Data System (ADS)
Kitts, Christopher Adam
This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms was developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite, which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.
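The core step in model-based fault detection, on which the dissertation's anomaly-management extensions build, can be reduced to a residual test; the toy sketch below illustrates only that generic idea (values and names are hypothetical, not from the Sapphire system):

```python
# Toy illustration of the core idea in model-based fault detection:
# compare a measurement against a model prediction and flag an anomaly
# when the residual exceeds a threshold. Values are hypothetical.
def detect_anomaly(measured: float, predicted: float, threshold: float) -> bool:
    """Return True when the model residual indicates an anomalous condition."""
    return abs(measured - predicted) > threshold

# Example: a battery-voltage model predicts 12.1 V; telemetry reads 10.4 V.
print(detect_anomaly(10.4, 12.1, threshold=0.5))  # True -> diagnose further
```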
The Stratospheric Aerosol and Gas Experiment (SAGE) IV Pathfinder
NASA Astrophysics Data System (ADS)
Hill, C. A.; Damadeo, R. P.; Gasbarre, J. F.
2017-12-01
Stratospheric ozone has been the subject of observation and research for decades. Measurements from satellites have provided data ranging from the initial decline in the late 1970s and early 1980s, which supported the adoption of the Montreal Protocol, to current observations hinting at potential recovery. Adequate determination of that recovery requires continuous and, in the case of multiple instruments, overlapping data records. However, most current satellite systems are well beyond their expected lifetimes and thus, with only a few "younger" instruments available, we look towards the future of satellite observations of stratospheric ozone to develop the Stratospheric Aerosol and Gas Experiment (SAGE) IV Pathfinder. The SAGE IV Pathfinder project will develop and validate a technology demonstration that will pave the way for a future SAGE IV mission. Utilizing solar occultation imaging, SAGE IV will be capable of measuring ozone, aerosol, and other trace gas species with the same quality as previous SAGE instruments but with greatly improved pointing knowledge. Furthermore, current technological advancements allow SAGE IV to fit within a CubeSat framework and make use of commercial hardware, significantly reducing the size and cost when compared with traditional missions and enabling sustainability of future measurements.
Practical use of a framework for network science experimentation
NASA Astrophysics Data System (ADS)
Toth, Andrew; Bergamaschi, Flavio
2014-06-01
In 2006, the US Army Research Laboratory (ARL) and the UK Ministry of Defence (MoD) established a collaborative research alliance with academia and industry, called the International Technology Alliance (ITA) in Network and Information Sciences, to address fundamental issues concerning Network and Information Sciences that will enhance decision making for coalition operations, enable rapid, secure formation of ad hoc teams in coalition environments, and enhance US and UK capabilities to conduct coalition warfare. Research conducted under the ITA was extended through collaboration between ARL and IBM UK to characterize and define a software stack and tooling that has become the reference framework for network science experimentation in support of validation of theoretical research. This paper discusses the composition of the reference framework for experimentation resulting from the ARL/IBM UK collaboration and its use, by the Network Science Collaborative Technology Alliance (NS CTA), in a recent network science experiment conducted at ARL. It also discusses how the experiment was modeled using the reference framework, the integration of two new components, the Apollo Fact-Finder tool and the Medusa Crowd Sensing application, the limitations identified, and how they shall be addressed in future work.
The Role of Structural Models in the Solar Sail Flight Validation Process
NASA Technical Reports Server (NTRS)
Johnston, John D.
2004-01-01
NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scalable solar sails for future space science missions. There are six validation objectives planned for the ST9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in-space structural characteristics (the focus of this poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.
Risk Management in ETS-8 Project
NASA Astrophysics Data System (ADS)
Homma, M.
2002-01-01
Engineering Test Satellite 8 (ETS-8) is Japan's largest geosynchronous satellite, with a mass of 3 tons; its mission is mobile communications and navigation experiments. It is now in the flight model manufacturing phase. This paper introduces the risk management undertaken in this project as a reference. The mission success criteria of ETS-8 are described first; all risk management activities are planned with these criteria in mind. ETS-8 incorporates many new technologies, such as the large deployable antenna (19m x 17m), a 64-bit MPU, and a 100 V solar paddle, so attention must be paid to controlling these risks through each phase of development. In the system design of ETS-8, almost all components are redundant, and back-up functions exist to avoid fatal failures. Which back-up functions to adopt was one of the hot issues in this project, and the decision process is described as an actual case. In addition to conventional risk management procedures, such as FMEA and identification of critical items, we conducted a validation experiment in space using a scale model launched on Ariane 5. The decision to conduct this kind of experiment was taken after weighing risk against cost, because such an experiment consumes substantial project resources; the effect of this experiment is also presented. Failure detection, isolation and reconfiguration in the flight software become more important as satellite systems grow large and complicated. We performed independent verification and validation of the software, and some remarks are noted with respect to its effectiveness.
Challenges in verification and validation of autonomous systems for space exploration
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Jonsson, Ari
2005-01-01
Space exploration applications offer a unique opportunity for the development and deployment of autonomous systems, due to limited communications, large distances, and the great expense of direct operation. At the same time, the risk and cost of space missions lead to reluctance to take on new, complex and difficult-to-understand technology. A key issue in addressing these concerns is the validation of autonomous systems. In recent years, higher-level autonomous systems have been applied in space applications. In this presentation, we will highlight those autonomous systems, and discuss issues in validating these systems. We will then look to future demands on validating autonomous systems for space, identify promising technologies, and note open issues.
LISA on Table: an optical simulator for LISA
NASA Astrophysics Data System (ADS)
Halloin, H.; Jeannin, O.; Argence, B.; Bourrier, V.; de Vismes, E.; Prat, P.
2017-11-01
LISA, the first space project for detecting gravitational waves, rests on two main technical challenges: the free-falling masses and an outstanding precision in phase shift measurements (a few pm over 5 Mkm in the LISA band). The technology of the free-falling masses, i.e. their isolation from forces other than gravity and the capability of the spacecraft to precisely follow the test masses, will soon be tested with the technological LISA Pathfinder mission. The performance of the phase measurement will be achieved by at least two stabilization stages: a pre-stabilisation of the laser frequency at a level of 10^-13 (relative frequency stability) will be further improved by using numerical algorithms, such as Time Delay Interferometry, which have been theoretically and numerically demonstrated to reach the required performance level (10^-21). Nevertheless, these algorithms, though already tested with numerical models of LISA, require experimental validation, including 'realistic' hardware elements. Such an experiment would allow evaluation of the expected noise level and the possible interactions between subsystems. To this end, the APC is currently developing an optical benchtop experiment, called LISA On Table (LOT), which is representative of the three LISA spacecraft. A first module of the LOT experiment has been mounted and is being characterized. After completion, this facility may be used by the LISA community to test hardware (photodiodes, phasemeters) or software (reconstruction algorithms) components.
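Time Delay Interferometry is named but not spelled out above; as standard background (the first-generation unequal-arm Michelson combination, not a result of this paper), the laser phase noise p(t) cancels in

```latex
X(t) = \left[ s_1(t) - s_1\!\left(t - \tfrac{2L_2}{c}\right) \right]
     - \left[ s_2(t) - s_2\!\left(t - \tfrac{2L_1}{c}\right) \right],
\qquad
s_i(t) = p\!\left(t - \tfrac{2L_i}{c}\right) - p(t) + \text{signal},
```

because every laser-noise term then appears twice with identical delays and opposite signs, while a gravitational-wave signal does not cancel.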
NASA Astrophysics Data System (ADS)
Yerdelen-Damar, Sevda; Boz, Yezdan; Aydın-Günbatar, Sevgi
2017-08-01
This study examined the relations of preservice science teachers' attitudes towards technology use, technology ownership, technology competencies, and experiences to their self-efficacy beliefs about technological pedagogical content knowledge (TPACK). The present study also investigated interrelations among preservice teachers' attitudes towards technology use, technology ownership, technology competencies, and experiences. The participants of the study were 665 elementary preservice science teachers (467 females, 198 males) from 7 colleges in Turkey. The proposed model, based on the educational technology literature, was tested using structural equation modeling. The model testing results revealed that preservice teachers' technology competencies and experiences mediated the relation of technology ownership to their TPACK self-efficacy beliefs. The direct relation of their possession of technology to their TPACK self-efficacy beliefs was insignificant, while the indirect relation through their technology competencies and experiences was significant. The results also indicated there were significant direct effects of preservice teachers' attitudes towards technology use, technology competencies, and experiences on their TPACK self-efficacy beliefs.
Initial operation of the Lockheed Martin T4B experiment
NASA Astrophysics Data System (ADS)
Garrett, M. L.; Blinzer, A.; Ebersohn, F.; Gucker, S.; Heinrich, J.; Lohff, C.; McGuire, T.; Montecalvo, N.; Raymond, A.; Rhoads, J.; Ross, P.; Sommers, B.; Strandberg, E.; Sullivan, R.; Walker, J.
2017-10-01
The T4B experiment is a linear, encapsulated ring cusp confinement device, designed to develop a physics and technology basis for a follow-on high beta (β ~ 1) machine. The experiment consists of 13 magnetic field coils (11 external, 2 internal) that produce a series of on-axis field nulls surrounded by modest magnetic fields of up to 0.3 T. The primary plasma source used on T4B is a lanthanum hexaboride (LaB6) cathode, capable of coupling over 100 kW into the plasma. Initial testing focused on commissioning of components and integration of diagnostics. Diagnostics include both long and short wavelength interferometry, bolometry, visible and X-ray spectroscopy, Langmuir and B-dot probes, Thomson scattering, flux loops, and fast camera imagery. Low energy discharges were used to begin validation of physics models and simulation efforts. Following the initial machine check-out, neutral beam injection (NBI) was integrated onto the device. Detailed results will be presented. © 2017 Lockheed Martin Corporation. All Rights Reserved.
Reconstructing the Pupils Attitude towards Technology-Survey
ERIC Educational Resources Information Center
Ardies, Jan; De Maeyer, Sven; Gijbels, David
2013-01-01
In knowledge based economies technological literacy is gaining interest. Technological literacy correlates with attitude towards technology. When measuring technological literacy as an outcome of education, the attitudinal dimension has to be taken into account. This requires a valid, reliable instrument that should be as concise as possible, in…
Validating Innovative Renewable Energy Technologies: ESTCP Demonstrations at Two DoD Facilities
2012-05-01
Southern Research Institute, Advanced Energy Department, 2000 Ninth Avenue South, Birmingham, AL 35205-5305. Approved for public release; distribution unlimited. Presented at the NDIA Environment, Energy Security…
ERIC Educational Resources Information Center
Yurdakul, Isil Kabakci; Odabasi, Hatice Ferhan; Kilicer, Kerem; Coklar, Ahmet Naci; Birinci, Gurkay; Kurt, Adile Askim
2012-01-01
The purpose of this study is to develop a TPACK (technological pedagogical content knowledge) scale based on the centered component of TPACK framework in order to measure preservice teachers' TPACK. A systematic and step-by-step approach was followed for the development of the scale. The validity and reliability studies of the scale were carried…
ERIC Educational Resources Information Center
Aquino, Cesar A.
2014-01-01
This study represents a research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)--Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…
ERIC Educational Resources Information Center
Corn, Jenifer O.
2010-01-01
Schools and districts should use a well-designed needs assessment to inform important decisions about a range of technology program areas. Presently, there is a lack of valid and reliable instruments available and accessible to schools to effectively assess their educational needs to better design and evaluate their projects and initiatives. The…
NASA Astrophysics Data System (ADS)
Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven
2016-04-01
VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors, aligned with the EURO-CORDEX experiment, and 3) pseudo-reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a European-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contributions to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including both data (downscaled values) and metadata (characterizing different aspects of the downscaling methods). This constitutes the largest and most comprehensive intercomparison of statistical downscaling methods to date. Here, we present an overall validation, analyzing marginal and temporal aspects to assess the intrinsic performance and added value of statistical downscaling methods at both annual and seasonal levels. This validation takes into account the different properties and limitations of different approaches and techniques (as reported in the provided metadata) in order to perform a fair comparison. It is pointed out that this experiment alone is not sufficient to evaluate the limitations of (MOS) bias correction techniques. Moreover, it also does not fully validate PP, since we do not learn whether we have the right predictors and whether the PP assumption is valid. These problems will be analyzed in the subsequent community-open VALUE experiments 2) and 3), which will be open for participation during the present year.
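The splitting scheme of Experiment 1 is concrete enough to sketch; the following illustrates only the fold construction (the downscaling methods themselves are out of scope here):

```python
# Sketch of the Experiment 1 splitting scheme described above: a 5-fold
# cross-validation over 1979-2008 using consecutive 6-year blocks.
YEARS = list(range(1979, 2009))            # 30 years -> five 6-year folds
FOLDS = [YEARS[i:i + 6] for i in range(0, 30, 6)]

for k, test_years in enumerate(FOLDS):
    train_years = [y for y in YEARS if y not in test_years]
    # train the statistical downscaling method on train_years,
    # then predict and validate on the held-out test_years
    print(f"fold {k}: test {test_years[0]}-{test_years[-1]}, "
          f"train on the remaining {len(train_years)} years")
```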
NASA Technical Reports Server (NTRS)
Mathews, Douglas; Bock, Larry A.; Bielak, Gerald W.; Dougherty, R. P.; Premo, John W.; Scharpf, Dan F.; Yu, Jia
2014-01-01
Major airports in the world's air transportation systems face a serious problem in providing greater capacity to meet the ever-increasing demands of air travel. This problem could be relieved if airports were allowed to increase their operating time, now restricted by curfews, and to relax present limits on takeoffs and landings. The key operational issue in extending the present curfews is noise. In response to these increasingly restrictive noise regulations, NASA has launched a program to validate, through engine testing, noise reduction concepts and technologies that have evolved from the Advanced Subsonic Technologies (AST) Noise Reduction Program. The goal of this AST program was to develop and validate technology that reduces engine noise and improves nacelle suppression effectiveness relative to 1992 technology. Contract NAS3-97144, titled "Engine Validation of Noise Reduction Concepts" (EVNRC), was awarded to P&W on August 12, 1997 to conduct full-scale noise reduction tests in two phases on a PW4098 engine. The following Section 1.2 provides a brief description of the overall program. The remainder of this report provides detailed documentation of Phase I of the program.
Taylor, Sean C; Mrkusich, Eli M
2014-01-01
In the past decade, the techniques of quantitative PCR (qPCR) and reverse transcription (RT)-qPCR have become accessible to virtually all research labs, producing valuable data for peer-reviewed publications and supporting exciting research conclusions. However, the experimental design and validation processes applied to the associated projects are the result of historical biases adopted by individual labs that have evolved and changed since the inception of the techniques and associated technologies. This has resulted in wide variability in the quality, reproducibility and interpretability of published data as a direct result of how each lab has designed their RT-qPCR experiments. The 'minimum information for the publication of quantitative real-time PCR experiments' (MIQE) was published to provide the scientific community with a consistent workflow and key considerations to perform qPCR experiments. We use specific examples to highlight the serious negative ramifications for data quality when the MIQE guidelines are not applied and include a summary of good and poor practices for RT-qPCR. © 2013 S. Karger AG, Basel.
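As context for why the design choices the MIQE guidelines govern (reference genes, calibrator samples, efficiency validation) matter so much, the relative quantification most RT-qPCR studies report rests on the standard Livak formula (general background, not specific to this paper):

```latex
\text{fold change} = 2^{-\Delta\Delta C_q}, \qquad
\Delta\Delta C_q =
\left(C_q^{\text{target}} - C_q^{\text{reference}}\right)_{\text{treated}}
- \left(C_q^{\text{target}} - C_q^{\text{reference}}\right)_{\text{control}},
```

which is valid only when the amplification efficiencies of the target and reference genes are close to 100% and approximately equal, one of the validation steps the guidelines require to be reported.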
International Collaboration on Spent Fuel Disposition in Crystalline Media: FY17 Progress Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yifeng; Hadgu, Teklu; Kainina, Elena
Active participation in international R&D is crucial for achieving the Spent Fuel Waste Science & Technology (SFWST) long-term goals of conducting "experiments to fill data needs and confirm advanced modeling approaches" and of having a "robust modeling and experimental basis for evaluation of multiple disposal system options" (by 2020). DOE's Office of Nuclear Energy (NE) has developed a strategic plan to advance cooperation with international partners. The international collaboration on the evaluation of crystalline disposal media at Sandia National Laboratories (SNL) in FY17 focused on the collaboration through the Development of Coupled Models and their Validation against Experiments (DECOVALEX-2019) project. The DECOVALEX project is an international research and model comparison collaboration, initiated in 1992, for advancing the understanding and modeling of coupled thermo-hydro-mechanical-chemical (THMC) processes in geological systems. SNL has been participating in three tasks of the DECOVALEX project: Task A, modeling gas injection experiments (ENGINEER); Task C, modeling the groundwater recovery experiment in tunnel (GREET); and Task F, fluid inclusion and movement in the tight rock (FINITO).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tyler Gray; Jeremy Diez; Jeffrey Wishart
2013-07-01
The intent of the electric Ground Support Equipment (eGSE) demonstration is to evaluate the day-to-day vehicle performance of electric baggage tractors using two advanced battery technologies to demonstrate possible replacements for the flooded lead-acid (FLA) batteries utilized throughout the industry. These advanced battery technologies have the potential to resolve barriers to the widespread adoption of eGSE deployment. Validation testing had not previously been performed within fleet operations to determine if the performance of current advanced batteries is sufficient to withstand the duty cycle of electric baggage tractors. This report summarizes the work performed and data accumulated during this demonstration in an effort to validate the capabilities of advanced battery technologies. The demonstration project also grew the relationship with Southwest Airlines (SWA), our demonstration partner at Ontario International Airport (ONT), located in Ontario, California. The results of this study have encouraged a proposal for a future demonstration project with SWA.
NASA Technical Reports Server (NTRS)
Johnson, Sandra
2001-01-01
The frequency bands being used for new satellite communication systems are constantly increasing to accommodate the requirements for additional capacity. At these higher frequencies, propagation impairments that did not significantly affect the signal at lower frequencies begin to have considerable impact. In Ka-band, the next logical commercial frequency band to be used for satellite communication, attenuation of the signal due to rain is a primary concern. An experimental satellite built by NASA, the Advanced Communication Technology Satellite (ACTS), launched in September 1993, is the first US communication satellite operating in the Ka-band. In addition to higher carrier frequencies, a number of other new technologies, including onboard baseband processing, multiple beam antennas, and rain fade detection and compensation techniques, were designed into the ACTS. Verification experiments have been conducted since the launch to characterize the new technologies. The focus of this thesis is to describe and validate the method used by the ACTS Very Small Aperture Terminal (VSAT) ground stations in detecting the presence of fade in the communication signal and to adaptively compensate for it by the addition of burst rate reduction and forward error correction. Measured data obtained from the ACTS program is used to validate the compensation technique. In this thesis, models in MATLAB are developed to statistically characterize the increased availability achieved by the compensation techniques in terms of the bit error rate time enhancement factor. Several improvements to the ACTS technique are discussed and possible implementations for future Ka-band systems are also presented.
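The fade detection and compensation scheme is described above only functionally; the sketch below illustrates the general shape of threshold-switched compensation with hysteresis (the thresholds, the hysteresis, and all names are illustrative assumptions, not ACTS parameters):

```python
# Illustrative threshold-switched rain-fade compensation of the kind the
# ACTS VSATs used: when the estimated fade exceeds a margin, reduce the
# burst rate and enable forward error correction. Values are hypothetical.
def select_mode(fade_db: float, engage_db: float = 5.0, release_db: float = 3.0,
                compensated: bool = False) -> bool:
    """Hysteresis keeps the link from toggling near the threshold."""
    if not compensated and fade_db > engage_db:
        return True    # switch to reduced burst rate + FEC coding
    if compensated and fade_db < release_db:
        return False   # clear-sky mode: full burst rate, no FEC
    return compensated

mode = False
for fade in [1.2, 4.8, 5.6, 4.1, 2.9]:      # estimated signal fade in dB
    mode = select_mode(fade, compensated=mode)
    print(f"fade {fade:4.1f} dB -> {'compensated' if mode else 'clear-sky'}")
```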
Design technology co-optimization for 14/10nm metal1 double patterning layer
NASA Astrophysics Data System (ADS)
Duan, Yingli; Su, Xiaojing; Chen, Ying; Su, Yajuan; Shao, Feng; Zhang, Recco; Lei, Junjiang; Wei, Yayi
2016-03-01
Design and technology co-optimization (DTCO) can satisfy the needs of the design, generate robust design rules, and avoid unfriendly patterns at the early stage of design to ensure a high level of manufacturability of the product within the technical capability of the present process. The DTCO methodology in this paper mainly includes design rule translation, layout analysis, model validation, hotspot classification and design rule optimization. Coupling DTCO with double patterning (DPT) can optimize the related design rules and generate a friendlier layout which meets the requirements of the 14/10nm technology node. The experiment demonstrates the methodology of DPT-compliant DTCO applied to a metal1 layer from the 14/10nm node. The DTCO workflow proposed in our work is an efficient solution for optimizing the design rules for the 14/10nm tech node metal1 layer. The paper also discusses and verifies how to tune the design rules for the U-shape and L-shape structures in a DPT-aware metal layer.
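Double patterning decomposition is, at its core, a two-coloring problem: features closer than the minimum same-mask spacing must land on different masks, which is feasible exactly when the conflict graph is bipartite. The sketch below is generic background, not the paper's flow, and checks this with a BFS coloring:

```python
# Generic illustration of DPT decomposition as graph 2-coloring: nodes are
# layout features, edges connect features spaced below the same-mask limit.
# Decomposition onto two masks succeeds iff the conflict graph is bipartite.
from collections import deque

def assign_masks(n: int, conflicts: list[tuple[int, int]]):
    adj = [[] for _ in range(n)]
    for a, b in conflicts:
        adj[a].append(b)
        adj[b].append(a)
    mask = [None] * n
    for start in range(n):
        if mask[start] is not None:
            continue
        mask[start] = 0
        q = deque([start])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if mask[v] is None:
                    mask[v] = 1 - mask[u]
                    q.append(v)
                elif mask[v] == mask[u]:
                    return None        # odd cycle: a coloring hotspot
    return mask

print(assign_masks(4, [(0, 1), (1, 2), (2, 3)]))   # [0, 1, 0, 1]
print(assign_masks(3, [(0, 1), (1, 2), (2, 0)]))   # None -> hotspot
```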
Exploration Life Support Critical Questions for Future Human Space Missions
NASA Technical Reports Server (NTRS)
Ewert, Michael K.; Barta, Daniel J.; McQuillan, Jeff
2009-01-01
Exploration Life Support (ELS) is a project under NASA's Exploration Technology Development Program. The ELS Project plans, coordinates and implements the development of advanced life support technologies for human exploration missions in space. Recent work has focused on closed-loop atmosphere and water systems for a lunar outpost, including habitats and pressurized rovers. But what are the critical questions facing life support system developers for these and other future human missions? This paper explores those questions and discusses how progress in the development of ELS technologies can help answer them. The ELS Project includes Atmosphere Revitalization Systems (ARS), Water Recovery Systems (WRS), Waste Management Systems (WMS), Habitation Engineering, Systems Integration, Modeling and Analysis (SIMA), and Validation and Testing, which includes the sub-elements Flight Experiments and Integrated Testing. Systems engineering analysis by ELS seeks to optimize the overall mission architecture by considering all the internal and external interfaces of the life support system and the potential for reduction or reuse of commodities. In particular, various sources and sinks of water and oxygen are considered along with the implications on loop closure and the resulting launch mass requirements.
The QKD network: model and routing scheme
NASA Astrophysics Data System (ADS)
Yang, Chao; Zhang, Hongqi; Su, Jinhai
2017-11-01
Quantum key distribution (QKD) technology can establish unconditionally secure keys between two communicating parties. Although this technology has some inherent constraints, such as distance and point-to-point mode limits, building a QKD network with multiple point-to-point QKD devices can overcome these constraints. Considering the development level of current technology, the trust-relaying QKD network is the first choice for building a practical QKD network. However, previous research did not address a routing method for the trust-relaying QKD network in detail. This paper focuses on the routing issues, builds a model of the trust-relaying QKD network for easier analysis and understanding of this network, and proposes a dynamic routing scheme for it. From the viewpoint of designing a dynamic routing scheme in a classical network, the proposed scheme consists of three components: a Hello protocol that helps share the network topology information, a routing algorithm that selects a set of suitable paths and establishes the routing table, and a link state update mechanism that helps keep the routing table up to date. Experiments and evaluation demonstrate the validity and effectiveness of the proposed routing scheme.
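The paper's algorithm is not reproduced in the abstract; purely as an illustration of why QKD routing differs from classical routing (a relay hop is usable only while its QKD link holds unconsumed secret key), here is a hypothetical bottleneck-key path selection. The criterion and all names are assumptions, not the proposed scheme:

```python
# Hypothetical route selection in a trust-relay QKD network: each hop is a
# point-to-point QKD link holding a pool of secret key; prefer the path
# whose scarcest link has the most key left (a widest-path criterion).
def best_path(links: dict, src: str, dst: str):
    """links maps (u, v) -> remaining key bits on that QKD link (symmetric)."""
    def key(u, v):
        return links.get((u, v), links.get((v, u), 0))
    nodes = {n for edge in links for n in edge}
    best = (0, None)

    def dfs(u, path, bottleneck):
        nonlocal best
        if u == dst:
            best = max(best, (bottleneck, path))
            return
        for v in nodes:
            if v not in path and key(u, v) > 0:   # hop needs key to be usable
                dfs(v, path + [v], min(bottleneck, key(u, v)))

    dfs(src, [src], float("inf"))
    return best

# Two relay nodes R1, R2 between endpoints A and B; values are key bits.
links = {("A", "R1"): 800, ("R1", "B"): 300, ("A", "R2"): 500, ("R2", "B"): 600}
print(best_path(links, "A", "B"))   # (500, ['A', 'R2', 'B'])
```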
Virtual reality technology and surgical training--a survey of general surgeons in Ireland.
Early, S A; Roche-Nagle, G
2006-01-01
Virtual Reality Technology (VRT) is a validated method of training in industry but only recently has found a place in the postgraduate surgical curriculum. We surveyed 143 Irish consultant surgeons to ascertain their opinions on this topical issue. The survey consisted of 22 questions to which the consultants were asked to respond by choosing from a 5-point Likert scale. Sixty-five per cent responded. A majority of 72% had seen VRT but only 47% had 'hands on' experience. Forty-six per cent believed that they were poorly informed regarding available technologies. As consultants became more informed about VRT, significant differences were seen in attitudes regarding the role of VR in surgical skills training (p<0.05) and in the ability to define teaching objectives (p<0.005). Our survey suggests that the underuse of the current offerings is not due to a perceived lack of interest on the part of the surgical trainers. Suppliers of these programmes have a responsibility to adequately educate and collaborate with all parties involved to improve the overall benefit from these simulators.
High-Power, High-Temperature Superconductor Technology Development
NASA Technical Reports Server (NTRS)
Bhasin, Kul B.
2005-01-01
Since the first discovery of high-temperature superconductors (HTS) 10 years ago, the most promising areas for their applications in microwave systems have been as passive components for communication systems. Soon after the discovery, experiments showed that passive microwave circuits made from HTS material exceeded the performance of conventional devices for low-power applications and could be 10 times as small or smaller. However, for superconducting microwave components, high-power microwave applications have remained elusive until now. In 1996, DuPont and Com Dev Ltd. developed high-power superconducting materials and components for communication applications under a NASA Lewis Research Center cooperative agreement, NCC3-344 "High Power High Temperature Superconductor (HTS) Technology Development." The agreement was cost shared between the Defense Advanced Research Projects Agency's (DARPA) Technology Reinvestment Program Office and the two industrial partners. It has the following objectives: 1) Material development and characterization for high-power HTS applications; 2) Development and validation of generic high-power microwave components; 3) Development of a proof-of-concept model for a high-power six-channel HTS output multiplexer.
Space Technology 5: Pathfinder for Future Micro-Sat Constellations
NASA Technical Reports Server (NTRS)
Carlisle, Candace; Finnegan, Eric
2004-01-01
The Space Technology 5 (ST-5) Project, currently in the implementation phase, is part of the National Aeronautics and Space Administration (NASA)'s New Millennium Program (NMP). ST-5 will consist of a constellation of three miniature satellites, each with mass less than 25 kg and size approximately 60 cm by 30 cm. ST-5 addresses technology challenges, as well as fabrication, assembly, test and operations strategies for future micro-satellite missions. ST-5 will be deployed into a highly eccentric, geo-transfer orbit (GTO). This will expose the spacecraft to a high radiation environment as well as provide a low level magnetic background. A three-month flight demonstration phase is planned to validate the technologies and demonstrate concepts for future missions. Each ST-5 spacecraft incorporates NMP competitively-selected breakthrough technologies. These include Cold Gas Micro-Thrusters for propulsion and attitude control, miniature X-band transponder for space-ground communications, Variable Emittance Coatings for dynamic thermal control, and CULPRiT ultra low power logic chip used for Reed-Solomon encoding. The ST-5 spacecraft itself is a technology that can be infused into future missions. It is a fully functional micro-spacecraft built within tight volume and mass constraints. It is built to withstand a high radiation environment, large thermal variations, and high launch loads. The spacecraft power system is low-power and low-voltage, and is designed to turn on after separation from the launch vehicle. Some of the innovations that are included in the ST-5 design are a custom spacecraft deployment structure, magnetometer deployment boom, nutation damper, X-band antenna, miniature spinning sun sensor, solar array with triple junction solar cells, integral card cage assembly containing single card Command and Data Handling and Power System Electronics, miniature magnetometer, and lithium ion battery. ST-5 will demonstrate the ability of a micro satellite to perform research-quality science. Each ST-5 spacecraft will deploy a precision magnetometer to be used both for attitude determination and as a representative science instrument. The spacecraft has been developed with a low magnetic signature to avoid interference with the magnetometer. The spacecraft will be able to detect and respond autonomously to science events, i.e. significant changes in the magnetic field measurements. The three spacecraft will be a pathfinder for future constellation missions. They will be deployed to demonstrate an appropriate geometry for scientific measurements as a constellation. They will be operationally managed as a constellation, demonstrating automation and communication strategies that will be useful for future missions. The technologies and future mission concepts will be validated both on the ground and in space. Technologies will be validated on the ground by a combination of component level and system level testing of the flight hardware in a thermal vacuum environment. In flight, specific validation runs are planned for each of the technologies. Each validation run consists of one or more orbits with a specific validation objective. This paper will describe the ST-5 mission, and the applicability of the NMP technologies, spacecraft, and mission concepts to future missions. It will also discuss the validation approach for the ST-5 technologies and mission concepts.
Psychometric evaluation of a new assessment of the ability to manage technology in everyday life.
Malinowsky, Camilla; Nygård, Louise; Kottorp, Anders
2011-03-01
Technology increasingly influences the everyday lives of most people, and the ability to manage technology can be seen as a prerequisite for participation in everyday occupations. However, knowledge of the ability and skills required to manage technology is sparse. This study aimed to validate a new observation-based assessment, the Management of Everyday Technology Assessment (META), developed to assess the ability to manage technology in everyday life. A sample of 116 older adults with and without cognitive impairment was observed and interviewed using the META while managing their everyday technology at home. The results indicate that the META demonstrates acceptable person response validity and technology goodness-of-fit. Additionally, the META can separate individuals with higher ability to manage everyday technology from individuals with lower ability. The META can be seen as a complement to existing ADL assessment techniques and is planned for use in both research and practice.
Exploration Life Support Technology Development for Lunar Missions
NASA Technical Reports Server (NTRS)
Ewert, Michael K.; Barta, Daniel J.; McQuillan, Jeffrey
2009-01-01
Exploration Life Support (ELS) is one of NASA's Exploration Technology Development Projects. ELS plans, coordinates, and implements the development of new life support technologies for human exploration missions as outlined in NASA's Vision for Space Exploration. ELS technology development currently supports three major projects of the Constellation Program - the Orion Crew Exploration Vehicle (CEV), the Altair Lunar Lander, and Lunar Surface Systems. ELS content includes Air Revitalization Systems (ARS), Water Recovery Systems (WRS), Waste Management Systems (WMS), Habitation Engineering, Systems Integration, Modeling and Analysis (SIMA), and Validation and Testing. The primary goal of the ELS project is to provide different technology options to Constellation which fill gaps or provide substantial improvements over the state-of-the-art in life support systems. Because the Constellation missions are so challenging, mass, power, and volume must be reduced relative to Space Shuttle and Space Station technologies. Systems engineering analysis also optimizes the overall architecture by considering all interfaces with the life support system and the potential for reduction or reuse of resources. For long-duration missions, technologies that aid in closure of air and water loops with increased reliability are essential, as are techniques to minimize or deal with waste. The ELS project utilizes in-house efforts at five NASA centers, aerospace industry contracts, Small Business Innovation Research contracts, and other means to develop advanced life support technologies. Testing, analysis, and reduced-gravity flight experiments are also conducted at the NASA field centers. This paper gives the current status of technologies under development by ELS and relates them to the Constellation customers who will eventually use them.
Hybrid Life Support System Technology Demonstrations
NASA Astrophysics Data System (ADS)
Morrow, R. C.; Wetzel, J. P.; Richter, R. C.
2018-02-01
Demonstration of plant-based hybrid life support technologies in deep space will validate the function of these technologies for long duration missions, such as Mars transit, while providing dietary variety to improve habitability.
Besenyi, Gina M; Diehl, Paul; Schooley, Benjamin; Turner-McGrievy, Brie M; Wilcox, Sara; Stanis, Sonja A Wilhelm; Kaczynski, Andrew T
2016-12-01
Creation of mobile technology environmental audit tools can provide a more interactive way for youth to engage with communities and facilitate participation in health promotion efforts. This study describes the development, validity testing, and reliability testing of an electronic version of the Community Park Audit Tool (eCPAT). eCPAT consists of 149 items and incorporates a variety of technology benefits. Criterion-related validity and inter-rater reliability were evaluated using data from 52 youth across 47 parks in Greenville County, SC. A large portion of items (>70 %) demonstrated either fair or moderate-to-perfect validity and reliability. All but six items demonstrated excellent percent agreement. The eCPAT app is a user-friendly tool that provides a comprehensive assessment of park environments. Given the proliferation of smartphones, tablets, and other electronic devices among both adolescents and adults, the eCPAT app has the potential to be distributed and used widely for a variety of health promotion purposes.
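The reliability statistics above rest on standard two-rater agreement measures. A minimal Python sketch of percent agreement and chance-corrected Cohen's kappa for a single dichotomous audit item follows; the ratings are invented for illustration and are not eCPAT study data.

    # Two-rater agreement for one dichotomous audit item.
    # Ratings are invented for illustration, not taken from the eCPAT study.
    from collections import Counter

    def percent_agreement(r1, r2):
        """Share of parks on which the two raters agreed."""
        return sum(a == b for a, b in zip(r1, r2)) / len(r1)

    def cohens_kappa(r1, r2):
        """Chance-corrected agreement between two raters."""
        n = len(r1)
        po = percent_agreement(r1, r2)                       # observed agreement
        c1, c2 = Counter(r1), Counter(r2)
        pe = sum((c1[k] / n) * (c2[k] / n) for k in set(r1) | set(r2))
        return (po - pe) / (1 - pe)

    rater_a = [1, 1, 0, 1, 0, 1, 1, 0]   # e.g., "playground present" per park
    rater_b = [1, 1, 0, 0, 0, 1, 1, 0]
    print(percent_agreement(rater_a, rater_b))   # 0.875
    print(cohens_kappa(rater_a, rater_b))        # ~0.75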
Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R
2015-11-01
The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research.
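As a hedged illustration of the "complete mystical experience" scoring mentioned above: the sketch below assumes the commonly cited rule of reaching at least 60% of the maximum possible score on each of the MEQ30's four factors, with items rated 0-5 and factor sizes of 15/6/6/3. These specifics are assumptions drawn from the wider MEQ30 literature, not from this abstract.

    # Hedged sketch of a "complete mystical experience" criterion for the MEQ30.
    # Assumed (not stated in the abstract): four factors of 15/6/6/3 items,
    # items rated 0-5, criterion of >= 60% of each factor's maximum score.
    FACTOR_ITEM_COUNTS = {
        "mystical": 15,
        "positive_mood": 6,
        "transcendence_time_space": 6,
        "ineffability": 3,
    }
    MAX_ITEM_RATING = 5

    def complete_mystical_experience(factor_totals):
        """True if every factor total reaches 60% of its maximum possible score."""
        return all(
            factor_totals[name] >= 0.6 * n_items * MAX_ITEM_RATING
            for name, n_items in FACTOR_ITEM_COUNTS.items()
        )

    # Illustrative factor sums for one participant.
    print(complete_mystical_experience(
        {"mystical": 55, "positive_mood": 20,
         "transcendence_time_space": 19, "ineffability": 10}))   # True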
Fuel Cell and Hydrogen Technologies Program
NREL
Through its Fuel Cell and Hydrogen Technologies Program, NREL researches, develops, analyzes, and validates fuel cell and hydrogen production, delivery, and storage technologies for transportation.
Classification Framework for ICT-Based Learning Technologies for Disabled People
ERIC Educational Resources Information Center
Hersh, Marion
2017-01-01
The paper presents the first systematic approach to the classification of inclusive information and communication technologies (ICT)-based learning technologies and ICT-based learning technologies for disabled people which covers both assistive and general learning technologies, is valid for all disabled people and considers the full range of…
Deep Space 1 is prepared for launch
NASA Technical Reports Server (NTRS)
1998-01-01
Workers in the Payload Hazardous Servicing Facility test equipment on Deep Space 1 to prepare it for launch aboard a Boeing Delta 7326 rocket in October. The first flight in NASA's New Millennium Program, Deep Space 1 is designed to validate 12 new technologies for scientific space missions of the next century. Onboard experiments include an ion propulsion engine and software that tracks celestial bodies so the spacecraft can make its own navigation decisions without the intervention of ground controllers. Most of its mission objectives will be completed within the first two months. A near-Earth asteroid, 1992 KD, has also been selected for a possible flyby.
Integrated Technology Rotor Methodology Assessment Workshop
NASA Technical Reports Server (NTRS)
Mcnulty, Michael J. (Editor); Bousman, William G. (Editor)
1988-01-01
The conference proceedings contain 14 formal papers and the results of two panel discussions. In addition, a transcript of the discussion that followed the paper presentations and panels is included. The papers are of two kinds. The first seven papers were directed specifically to the correlation of industry and government mathematical models with data for rotorcraft stability from six experiments. The remaining seven papers dealt with related topics in the prediction of rotor aeroelastic or aeromechanical stability. The first of the panels provided an evaluation of the correlation that was shown between the mathematical models and the experimental data. The second panel addressed the general problem of validating mathematical models.
Space Station Simulation Computer System (SCS) study for NASA/MSFC. Concept document
NASA Technical Reports Server (NTRS)
1990-01-01
NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve both as a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train crews for the payload experiments that will be onboard Space Station Freedom. The simulation will support the Payload Training Complex at MSFC. The purpose of this SCS study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies in order to develop candidate concepts and designs.
NASA Technical Reports Server (NTRS)
Hinson, W. F.; Keafer, L. S.
1984-01-01
It is proposed that for inflatable antenna systems, technology feasibility can be demonstrated and parametric design and scalability (scale factor 10 to 20) can be validated with an experiment using a 16-m-diameter antenna attached to the Shuttle. The antenna configuration consists of a thin film cone and paraboloid held to proper shape by internal pressure and a self-rigidizing torus. The cone and paraboloid would be made using pie-shaped gores with the paraboloid being coated with aluminum to provide reflectivity. The torus would be constructed using an aluminum polyester composite that when inflated would erect to a smooth shell that can withstand loads without internal pressure.
Experimental aeroelasticity history, status and future in brief
NASA Technical Reports Server (NTRS)
Ricketts, Rodney H.
1990-01-01
NASA conducts wind tunnel experiments to determine and understand the aeroelastic characteristics of new and advanced flight vehicles, including fixed-wing, rotary-wing and space-launch configurations. Review and assessments are made of the state-of-the-art in experimental aeroelasticity regarding available facilities, measurement techniques, and other means and devices useful in testing. In addition, some past experimental programs are described which assisted in the development of new technology, validated new analysis codes, or provided needed information for clearing flight envelopes of unwanted aeroelastic response. Finally, needs and requirements for advances and improvements in testing capabilities for future experimental research and development programs are described.
Deep Space 1 is prepared for launch
NASA Technical Reports Server (NTRS)
1998-01-01
Workers in the Payload Hazardous Servicing Facility check equipment on Deep Space 1 to prepare it for launch aboard a Boeing Delta 7326 rocket in October. The first flight in NASA's New Millennium Program, Deep Space 1 is designed to validate 12 new technologies for scientific space missions of the next century. Onboard experiments include an ion propulsion engine and software that tracks celestial bodies so the spacecraft can make its own navigation decisions without the intervention of ground controllers. Most of its mission objectives will be completed within the first two months. A near-Earth asteroid, 1992 KD, has also been selected for a possible flyby.
Derision is the sweet spot of adoption: unleashing disruptive growth.
Poll, Wayne
2011-01-01
Energetic and ambitious clinicians frequently present new disruptive technologies and growth opportunities to hospital management. Far too often, established medical staff leadership responds to these replacement services with derision, sensing that the value of their hard-fought experience is threatened. In this regard, derision is often disguised validation and may be the first indicator that the visionary physician is on to something. Truly disruptive service offerings cannot survive the scrutiny of a layered medical staff structure or traditional fiscal review. Innovative hospital CEOs should take notice when a new idea is treated with derision and consider resourcing it through an alternative pathway.
BACT Simulation User Guide (Version 7.0)
NASA Technical Reports Server (NTRS)
Waszak, Martin R.
1997-01-01
This report documents the structure and operation of a simulation model of the Benchmark Active Control Technology (BACT) Wind-Tunnel Model. The BACT system was designed, built, and tested at NASA Langley Research Center as part of the Benchmark Models Program and was developed to perform wind-tunnel experiments to obtain benchmark quality data to validate computational fluid dynamics and computational aeroelasticity codes, to verify the accuracy of current aeroservoelasticity design and analysis tools, and to provide an active controls testbed for evaluating new and innovative control algorithms for flutter suppression and gust load alleviation. The BACT system has been especially valuable as a control system testbed.
Deep Space 1 is prepared for launch
NASA Technical Reports Server (NTRS)
1998-01-01
Workers in the Payload Hazardous Servicing Facility remove a solar panel from Deep Space 1 as part of the preparations for launch aboard a Boeing Delta 7326 rocket in October. The first flight in NASA's New Millennium Program, Deep Space 1 is designed to validate 12 new technologies for scientific space missions of the next century. Onboard experiments include an ion propulsion engine and software that tracks celestial bodies so the spacecraft can make its own navigation decisions without the intervention of ground controllers. Most of its mission objectives will be completed within the first two months. A near-Earth asteroid, 1992 KD, has also been selected for a possible flyby.
Deep Space 1 is prepared for launch
NASA Technical Reports Server (NTRS)
1998-01-01
Workers in the Payload Hazardous Servicing Facility check out Deep Space 1 to prepare it for launch aboard a Boeing Delta 7326 rocket in October. The first flight in NASA's New Millennium Program, Deep Space 1 is designed to validate 12 new technologies for scientific space missions of the next century. Onboard experiments include an ion propulsion engine and software that tracks celestial bodies so the spacecraft can make its own navigation decisions without the intervention of ground controllers. Most of its mission objectives will be completed within the first two months. A near-Earth asteroid, 1992 KD, has also been selected for a possible flyby.
Risk Management of New Microelectronics for NASA: Radiation Knowledge-base
NASA Technical Reports Server (NTRS)
LaBel, Kenneth A.
2004-01-01
Contents include the following: NASA missions - implications for reliability and radiation constraints. Approach to insertion of new technologies. Technology knowledge-base development. Technology model/tool development and validation. Summary comments.
Multi-criteria development and incorporation into decision tools for health technology adoption.
Poulin, Paule; Austen, Lea; Scott, Catherine M; Waddell, Cameron D; Dixon, Elijah; Poulin, Michelle; Lafrenière, René
2013-01-01
When introducing new health technologies, decision makers must integrate research evidence with local operational management information to guide decisions about whether and under what conditions the technology will be used. Multi-criteria decision analysis can support the adoption or prioritization of health interventions by using criteria to explicitly articulate the health organization's needs, limitations, and values in addition to evaluating evidence for safety and effectiveness. This paper seeks to describe the development of a framework to create agreed-upon criteria and decision tools to enhance a pre-existing local health technology assessment (HTA) decision support program. The authors compiled a list of published criteria from the literature, consulted with experts to refine the criteria list, and used a modified Delphi process with a group of key stakeholders to review, modify, and validate each criterion. In a workshop setting, the criteria were used to create decision tools. A set of user-validated criteria for new health technology evaluation and adoption was developed and integrated into the local HTA decision support program. Technology evaluation and decision guideline tools were created using these criteria to ensure that the decision process is systematic, consistent, and transparent. This framework can be used by others to develop decision-making criteria and tools to enhance similar technology adoption programs. The development of clear, user-validated criteria for evaluating new technologies adds a critical element to improve decision-making on technology adoption, and the decision tools ensure consistency, transparency, and real-world relevance.
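The decision tools described above aggregate evidence and local priorities across explicit criteria. A minimal multi-criteria scoring sketch follows; the criteria, weights, and scores are hypothetical stand-ins, not the program's validated set.

    # Hypothetical multi-criteria scoring for a candidate health technology.
    # Criteria, weights, and scores are illustrative placeholders only.
    CRITERIA_WEIGHTS = {           # weights chosen to sum to 1.0
        "safety": 0.30,
        "effectiveness": 0.25,
        "operational_fit": 0.20,
        "cost_impact": 0.15,
        "training_burden": 0.10,
    }

    def weighted_score(scores):
        """Aggregate per-criterion scores (0-10) into one adoption score."""
        assert set(scores) == set(CRITERIA_WEIGHTS), "score every criterion"
        return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

    candidate = {"safety": 8, "effectiveness": 7, "operational_fit": 6,
                 "cost_impact": 5, "training_burden": 9}
    print(weighted_score(candidate))   # ~7.0 on a 0-10 scale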
Zeng, Liang; Proctor, Robert W; Salvendy, Gavriel
2011-06-01
This research is intended to empirically validate a general model of creative product and service development proposed in the literature. A current research gap inspired construction of a conceptual model to capture fundamental phases and pertinent facilitating metacognitive strategies in the creative design process. The model also depicts the mechanism by which design creativity affects consumer behavior. The validity and assets of this model have not yet been investigated. Four laboratory studies were conducted to demonstrate the value of the proposed cognitive phases and associated metacognitive strategies in the conceptual model. Realistic product and service design problems were used in creativity assessment to ensure ecological validity. Design creativity was enhanced by explicit problem analysis, whereby one formulates problems from different perspectives and at different levels of abstraction. Remote association in conceptual combination spawned more design creativity than did near association. Abstraction led to greater creativity in conducting conceptual expansion than did specificity, which induced mental fixation. Domain-specific knowledge and experience enhanced design creativity, indicating that design can be of a domain-specific nature. Design creativity added integrated value to products and services and positively influenced customer behavior. The validity and value of the proposed conceptual model is supported by empirical findings. The conceptual model of creative design could underpin future theory development. Propositions advanced in this article should provide insights and approaches to facilitate organizations pursuing product and service creativity to gain competitive advantage.
Schäfer, Martina Christina Marion; Sutherland, Dean; McLay, Laurie; Achmadi, Donna; van der Meer, Larah; Sigafoos, Jeff; Lancioni, Giulio E; O'Reilly, Mark F; Schlosser, Ralf W; Marschik, Peter B
2016-12-01
The social validity of different communication modalities is a potentially important variable to consider when designing augmentative and alternative communication (AAC) interventions. To assess the social validity of three AAC modes (i.e., manual signing, picture exchange, and an iPad®-based speech-generating device), we asked 59 undergraduate students (pre-service teachers) and 43 teachers to watch a video explaining each mode. They were then asked to nominate the mode they perceived to be easiest to learn as well as the most intelligible, effective, and preferred. Participants were also asked to list the main reasons for their nominations and report on their experience with each modality. Most participants (68-86%) nominated the iPad®-based speech-generating device (SGD) as easiest to learn, as well as the most intelligible, effective, and preferred. This device was perceived to be easy to understand and use and to have familiar and socially acceptable technology. Results suggest that iPad®-based SGDs were perceived as more socially valid among this sample of teachers and undergraduate students. Information of this type may have some relevance to designing AAC supports for people who use AAC and their current and future potential communication partners.
Gaggioli, Andrea
2012-01-01
What does one feel when one uses virtual reality? How does this experience differ from the experience associated with "real life" activities and situations? To answer these questions, we used the Experience Sampling Method (ESM), a procedure that allows researchers to investigate the daily fluctuations in the quality of experience through online self-reports that participants fill out during daily life. The investigation consisted of a one-week ESM observation (N = 42). During this week, participants underwent two virtual reality sessions: immediately after the exposure to virtual environments, they were asked to complete an ESM report. For data analysis, experiential variables were aggregated into four dimensions: Mood, Engagement, Confidence, and Intrinsic Motivation. Findings showed that virtual experience is characterized by a specific configuration, which comprises significantly positive values for affective and cognitive components. In particular, positive scores of Mood suggest that participants perceived VR as an intrinsically pleasurable activity, while positive values of Engagement indicate that the use of VR and the experimental task provided valid opportunities for action and high skill investment. Furthermore, results showed that virtual experience is associated with Flow, a state of consciousness characterized by narrowed focus of attention, deep concentration, positive affect and intrinsic reward. Implications for VR research and practice are discussed.
Simperingham, Kim D; Cronin, John B; Ross, Angus
2016-11-01
Advanced testing technologies enable insight into the kinematic and kinetic determinants of sprint acceleration performance, which is particularly important for field-based team-sport athletes. Establishing the reliability and validity of the data, particularly from the acceleration phase, is important for determining the utility of the respective technologies. The aim of this systematic review was to explain the utility, reliability, validity and limitations of (1) radar and laser technology, and (2) non-motorised treadmill (NMT) and torque treadmill (TT) technology for providing kinematic and kinetic measures of sprint acceleration performance. A comprehensive search of the CINAHL Plus, MEDLINE (EBSCO), PubMed, SPORTDiscus, and Web of Science databases was conducted using search terms that included radar, laser, non-motorised treadmill, torque treadmill, sprint, acceleration, kinetic, kinematic, force, and power. Studies examining the kinematics or kinetics of short (≤10 s), maximal-effort sprint acceleration in adults or children, which included an assessment of reliability or validity of the advanced technologies of interest, were included in this systematic review. Absolute reliability, relative reliability and validity data were extracted from the selected articles and tabulated. The level of acceptance of reliability was a coefficient of variation (CV) ≤10 % and an intraclass correlation coefficient (ICC) or correlation coefficient (r) ≥0.70. A total of 34 studies met the inclusion criteria and were included in the qualitative analysis. Generally acceptable validity (r = 0.87-0.99; absolute bias 3-7 %), intraday reliability (CV ≤9.5 %; ICC/r ≥0.84) and interday reliability (ICC ≥0.72) were reported for data from radar and laser. However, low intraday reliability was reported for the theoretical maximum horizontal force (ICC 0.64) within adolescent athletes, and low validity was reported for velocity during the initial 5 m of a sprint acceleration (bias up to 0.41 m/s) measured with a laser device. Acceptable reliability of results from NMT and TT was only ensured when testing protocols involved sufficient familiarisation, a high sampling rate (≥200 Hz), a 'blocked' start position, and the analysis of discrete steps rather than arbitrary time periods. Sprinting times and speeds were 20-28 % slower on a TT, 28-67 % slower on an NMT, and only 9-64 % of the variance in overground measurements of speed and time (≤30 m) was explained by results from an NMT. There have been no reports to date of criterion validity of kinetic measures of sprint acceleration performance on NMT and TT, and only limited results regarding acceptable concurrent validity of radar-derived kinetic data. Radar, laser, NMT and TT technologies can be used to reliably measure sprint acceleration performance and to provide insight into the determinants of sprinting speed. However, further research is required to establish the validity of the kinetic measurements made with NMT and TT. Radar and laser technology may not be suitable for measuring the first few steps of a sprint acceleration.
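The review's acceptance thresholds (CV ≤10 %, ICC/r ≥0.70) are standard reliability statistics, sketched below with invented 10 m sprint times; icc_1_1 implements the one-way random-effects ICC(1,1), one of several ICC forms used in such studies.

    # Worked sketch of the review's two acceptance statistics.
    # Sprint times are invented; ICC(1,1) is one of several ICC variants.
    import numpy as np

    def cv_percent(trials):
        """Coefficient of variation (%) across repeated trials for one athlete."""
        trials = np.asarray(trials, dtype=float)
        return 100.0 * trials.std(ddof=1) / trials.mean()

    def icc_1_1(data):
        """One-way random-effects ICC(1,1); rows = athletes, columns = trials."""
        data = np.asarray(data, dtype=float)
        n, k = data.shape
        grand = data.mean()
        ms_between = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    times = np.array([[1.85, 1.88, 1.84],    # athlete 1, three 10 m trials (s)
                      [1.95, 1.93, 1.96],
                      [1.78, 1.80, 1.79],
                      [2.02, 2.05, 2.03]])
    print(cv_percent(times[0]))   # ~1.1 %  -> within the CV <= 10 % threshold
    print(icc_1_1(times))         # ~0.98   -> above the ICC >= 0.70 threshold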
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, Paolo; Theiler, C.; Fasoli, A.
A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and, finally, how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
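The exact observables, normalizations, and weights are defined in the cited papers; as a hedged illustration only, one plausible composite metric normalizes each experiment-simulation discrepancy by the combined uncertainties and then takes a weighted average.

    # Illustrative composite validation metric in the spirit described above;
    # the actual weighting and normalization in Ricci et al. differ in detail.
    import math

    def discrepancy(exp_val, sim_val, exp_err, sim_err):
        """Experiment-simulation difference in units of the combined uncertainty."""
        return abs(exp_val - sim_val) / math.sqrt(exp_err**2 + sim_err**2)

    def global_agreement(discrepancies, weights):
        """Weighted mean discrepancy over all observables; lower is better."""
        return sum(w * d for w, d in zip(weights, discrepancies)) / sum(weights)

    # Hypothetical observables: (experiment, simulation, exp. error, sim. error).
    rows = [(1.00, 0.90, 0.05, 0.05), (2.40, 2.55, 0.10, 0.08), (0.30, 0.28, 0.04, 0.03)]
    d = [discrepancy(*row) for row in rows]
    print(global_agreement(d, weights=[1.0, 0.8, 0.5]))   # ~1.1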
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kilkenny, J.; Richau, G.; Sangster, C.
A major goal of the Stockpile Stewardship Program (SSP) is to deliver validated numerical models, benchmarked against experiments that address relevant and important issues and provide data that stress the codes and our understanding. DOE/NNSA has made significant investments in major facilities and high-performance computing to successfully execute the SSP. The more information obtained about the physical state of the plasmas produced, the more stringent the test of theories, models, and codes can be, leading to increased confidence in our predictive capability. To fully exploit the world-leading capabilities of the ICF program, a multi-year program to develop and deploy advanced diagnostics has been developed by the expert scientific community. To formalize these activities, NNSA's Acting Director for the Inertial Confinement Fusion Program directed the formation and duties of the National Diagnostics Working Group (NDWG) in a Memorandum of 11/3/16 (Appendix A). The NDWG identified eight transformational diagnostics, shown in Table 1, that will provide unprecedented information from experiments in support of the SSP at NIF, Z, and OMEGA. Table 1 shows how the missions of the SSP experiments, including materials, complex hydrodynamics, radiation flow and effects, and thermonuclear burn and boost, will produce new observables, which will be measured using a variety of largely new diagnostic technologies used in the eight transformational diagnostics. The data provided by these diagnostics will validate and improve the physics contained within the SSP's simulations and both uncover and quantify important phenomena that lie beyond our present understanding.
Heo, Gwanghee; Jeon, Joonryong
2017-07-12
In this paper, a data compression technology-based intelligent data acquisition (IDAQ) system was developed for structural health monitoring of civil structures, and its validity was tested using random signals (the El-Centro seismic waveform). The IDAQ system was structured to include a high-performance CPU with large dynamic memory for multi-input and output in a radio frequency (RF) manner. In addition, embedded software technology (EST) was applied to implement the diverse logics needed in the process of acquiring, processing, and transmitting data. In order to utilize the IDAQ system for the structural health monitoring of civil structures, this study developed an artificial filter bank by which structural dynamic responses (acceleration) were efficiently acquired, and optimized it on the random El-Centro seismic waveform. All techniques developed in this study have been embedded in the system. The data compression technology-based IDAQ system was shown to acquire valid signals in a compressed form.
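The artificial filter bank is the heart of the compression scheme; the sketch below is a hedged illustration (not the authors' implementation) that band-pass filters an acceleration record and keeps only the bands carrying significant response energy.

    # Illustrative band-energy filter bank for compressing an acceleration record.
    # Bands, thresholds, and the signal are assumptions, not the paper's design.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def filter_bank_compress(accel, fs, bands, keep_ratio=0.05):
        """Return the per-band signals whose energy exceeds keep_ratio of the total."""
        total_energy = np.sum(accel ** 2)
        kept = {}
        for lo, hi in bands:
            sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
            band_sig = sosfiltfilt(sos, accel)
            if np.sum(band_sig ** 2) >= keep_ratio * total_energy:
                kept[(lo, hi)] = band_sig
        return kept

    fs = 200.0                                   # Hz, assumed sampling rate
    t = np.arange(0, 10, 1 / fs)
    accel = np.sin(2 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)
    bands = [(0.5, 5.0), (5.0, 20.0), (20.0, 50.0)]
    print(list(filter_bank_compress(accel, fs, bands)))   # likely only (0.5, 5.0)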
FuGEFlow: data model and markup language for flow cytometry
Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R
2009-01-01
Background Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer-reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) formalizes common aspects of comprehensive and high-throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. Methods We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt-compliant experiment description. Results The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets. Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. Conclusion We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to markup language in XML. Extending FuGE required significant effort, but in our experience the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately facilitating data exchange, including with public flow cytometry repositories currently under development.
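The final validation step described above (checking an experiment description against the generated schema) maps onto standard XML tooling. A minimal sketch with lxml follows; the file names are placeholders, not artifacts of the FuGEFlow project.

    # Validate an XML experiment description against an XSD schema with lxml.
    # File names are placeholders, not actual FuGEFlow project artifacts.
    from lxml import etree

    schema = etree.XMLSchema(etree.parse("flow-ml.xsd"))
    doc = etree.parse("example_miflowcyt_experiment.xml")

    if schema.validate(doc):
        print("Document conforms to the Flow-ML schema.")
    else:
        for error in schema.error_log:
            print(f"line {error.line}: {error.message}")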
ERIC Educational Resources Information Center
Ross, Steven M.; Morrison, Jennifer R.
2014-01-01
In a paper published 25 years ago, Ross and Morrison ("Educ Technol Res Dev" 37(1):19-33, 1989) called for a "happy medium" in educational technology research, to be achieved by balancing high rigor of studies (internal validity) with relevance to real-world applications (external validity). In this paper, we argue that,…