Low Noise Camera for Suborbital Science Applications
NASA Technical Reports Server (NTRS)
Hyde, David; Robertson, Bryan; Holloway, Todd
2015-01-01
Low-cost, commercial-off-the-shelf- (COTS-) based science cameras are intended for lab use only and are not suitable for flight deployment as they are difficult to ruggedize and repackage into instruments. Also, COTS implementation may not be suitable since mission science objectives are tied to specific measurement requirements, and often require performance beyond that required by the commercial market. Custom camera development for each application is cost-prohibitive for the International Space Station (ISS) or midrange science payloads due to nonrecurring expenses ($2,000K) for ground-up camera electronics design. While each new science mission has a different suite of requirements for camera performance (detector noise, speed of image acquisition, charge-coupled device (CCD) size, operating temperature, packaging, etc.), the analog-to-digital conversion, power supply, and communications can be standardized to accommodate many different applications. The low noise camera for suborbital applications is a rugged standard camera platform that can accommodate a range of detector types and science requirements for use in inexpensive to midrange payloads supporting Earth science, solar physics, robotic vision, or astronomy experiments. Cameras developed on this platform have demonstrated the performance found in custom flight cameras at a price per camera more than an order of magnitude lower.
The Mast Cameras and Mars Descent Imager (MARDI) for the 2009 Mars Science Laboratory
NASA Technical Reports Server (NTRS)
Malin, M. C.; Bell, J. F.; Cameron, J.; Dietrich, W. E.; Edgett, K. S.; Hallet, B.; Herkenhoff, K. E.; Lemmon, M. T.; Parker, T. J.; Sullivan, R. J.
2005-01-01
Based on operational experience gained during the Mars Exploration Rover (MER) mission, we proposed and were selected to conduct two related imaging experiments: (1) an investigation of the geology and short-term atmospheric vertical wind profile local to the Mars Science Laboratory (MSL) landing site using descent imaging, and (2) a broadly-based scientific investigation of the MSL locale employing visible and very near infra-red imaging techniques from a pair of mast-mounted, high resolution cameras. Both instruments share a common electronics design, a design also employed for the MSL Mars Hand Lens Imager (MAHLI) [1]. The primary differences between the cameras are in the nature and number of mechanisms and specific optics tailored to each camera's requirements.
High-Resolution Mars Camera Test Image of Moon (Infrared)
2005-09-13
This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. This image was taken by the High Resolution Imaging Science Experiment camera on Sept. 8, 2005.
NASA Technical Reports Server (NTRS)
Haines, Richard F.; Chuang, Sherry L.
1993-01-01
Current plans indicate that there will be a large number of life science experiments carried out during the thirty-year-long mission of the Biological Flight Research Laboratory (BFRL) on board Space Station Freedom (SSF). Non-human life science experiments will be performed in the BFRL. Two distinct types of activities have already been identified for this facility: (1) collect, store, distribute, analyze and manage engineering and science data from the Habitats, Glovebox and Centrifuge, (2) perform a broad range of remote science activities in the Glovebox and Habitat chambers in conjunction with the remotely located principal investigator (PI). These activities require extensive video coverage, viewing and/or recording and distribution to video displays on board SSF and to the ground. This paper concentrates mainly on the second type of activity. Each of the two BFRL habitat racks is designed to be configurable for either six rodent habitats per rack, four plant habitats per rack, or a combination of the above. Two video cameras will be installed in each habitat with a spare attachment for a third camera when needed. Therefore, a video system that can accommodate up to 12-18 camera inputs per habitat rack must be considered.
Camera Ready to Install on Mars Reconnaissance Orbiter
2005-01-07
A telescopic camera called the High Resolution Imaging Science Experiment, or HiRISE, at right, was installed onto the main structure of NASA's Mars Reconnaissance Orbiter, at left, on Dec. 11, 2004, at Lockheed Martin Space Systems, Denver.
High-Resolution Mars Camera Test Image of Moon (Infrared)
NASA Technical Reports Server (NTRS)
2005-01-01
This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The mission's High Resolution Imaging Science Experiment camera took the image on Sept. 8, 2005, while at a distance of about 10 million kilometers (6 million miles) from the Moon. The dark feature on the right is Mare Crisium. From that distance, the Moon would appear as a star-like point of light to the unaided eye. The test verified the camera's focusing capability and provided an opportunity for calibration. The spacecraft's Context Camera and Optical Navigation Camera also performed as expected during the test. The Mars Reconnaissance Orbiter, launched on Aug. 12, 2005, is on course to reach Mars on March 10, 2006. After gradually adjusting the shape of its orbit for half a year, it will begin its primary science phase in November 2006. From the mission's planned science orbit about 300 kilometers (186 miles) above the surface of Mars, the high resolution camera will be able to discern features as small as one meter or yard across.
1991-04-03
The USML-1 Glovebox (GBX) is a multi-user facility supporting 16 experiments in fluid dynamics, combustion sciences, crystal growth, and technology demonstration. The GBX has an enclosed working space which minimizes the contamination risks to both Spacelab and experiment samples. The GBX supports four charge-coupled device (CCD) cameras (two of which may be operated simultaneously) with three black-and-white and three color camera CCD heads available. The GBX also has a backlight panel, a 35 mm camera, and a stereomicroscope that offers high-magnification viewing of experiment samples. Video data can also be downlinked in real-time. The GBX also provides electrical power for experiment hardware, a time-temperature display, and cleaning supplies.
1995-08-29
The USML-1 Glovebox (GBX) is a multi-user facility supporting 16 experiments in fluid dynamics, combustion sciences, crystal growth, and technology demonstration. The GBX has an enclosed working space which minimizes the contamination risks to both Spacelab and experiment samples. The GBX supports four charge-coupled device (CCD) cameras (two of which may be operated simultaneously) with three black-and-white and three color camera CCD heads available. The GBX also has a backlight panel, a 35 mm camera, and a stereomicroscope that offers high-magnification viewing of experiment samples. Video data can also be downlinked in real-time. The GBX also provides electrical power for experiment hardware, a time-temperature display, and cleaning supplies.
Characterization and optimization for detector systems of IGRINS
NASA Astrophysics Data System (ADS)
Jeong, Ueejeong; Chun, Moo-Young; Oh, Jae Sok; Park, Chan; Yuk, In-Soo; Oh, Heeyoung; Kim, Kang-Min; Ko, Kyeong Yeon; Pavel, Michael D.; Yu, Young Sam; Jaffe, Daniel T.
2014-07-01
IGRINS (Immersion GRating INfrared Spectrometer) is a high resolution wide-band infrared spectrograph developed by the Korea Astronomy and Space Science Institute (KASI) and the University of Texas at Austin (UT). This spectrograph has H-band and K-band science cameras and a slit viewing camera, all three of which use Teledyne's λc~2.5μm 2k×2k HgCdTe HAWAII-2RG CMOS detectors. The two spectrograph cameras employ science grade detectors, while the slit viewing camera includes an engineering grade detector. Teledyne's cryogenic SIDECAR ASIC boards and JADE2 USB interface cards were installed to control those detectors. We performed experiments to characterize and optimize the detector systems in the IGRINS cryostat. We present measurements and optimization of noise, dark current, and reference-level stability obtained under dark conditions. We also discuss well depth, linearity and conversion gain measurements obtained using an external light source.
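The conversion gain measurement mentioned above is commonly made with the photon-transfer pair method. The sketch below illustrates that general technique, assuming matched pairs of flat-field and bias frames; it is an illustration under our own assumptions, not the IGRINS team's actual procedure, and the function and variable names are ours.

```python
import numpy as np

def conversion_gain(flat1, flat2, bias1, bias2):
    """Estimate conversion gain (e-/ADU) and read noise (e-) from a matched
    pair of flat-field frames and a pair of bias frames (photon-transfer
    pair method)."""
    f1 = flat1.astype(float) - bias1.mean()
    f2 = flat2.astype(float) - bias2.mean()
    total_signal = f1.mean() + f2.mean()                # sum of mean signals (ADU)
    var_flat_diff = np.var(f1 - f2)                     # 2 * (shot + read) variance
    var_bias_diff = np.var(bias1.astype(float) - bias2.astype(float))  # 2 * read variance
    shot_var = 0.5 * (var_flat_diff - var_bias_diff)    # per-frame shot variance (ADU^2)
    gain = 0.5 * total_signal / shot_var                # e-/ADU
    read_noise_e = gain * np.sqrt(0.5 * var_bias_diff)  # read noise in electrons
    return gain, read_noise_e
```

Because the two flats are differenced, pixel-to-pixel sensitivity variations cancel and the remaining variance is dominated by shot noise, which is what makes the pair method robust for gain estimates.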
Making Connections with Digital Data
ERIC Educational Resources Information Center
Leonard, William; Bassett, Rick; Clinger, Alicia; Edmondson, Elizabeth; Horton, Robert
2004-01-01
State-of-the-art digital cameras open up enormous possibilities in the science classroom, especially when used as data collectors. Because most high school students are not fully formal thinkers, the digital camera can provide a much richer learning experience than traditional observation. Data taken through digital images can make the…
Full-Frame Reference for Test Photo of Moon
NASA Technical Reports Server (NTRS)
2005-01-01
This pair of views shows how little of the full image frame was taken up by the Moon in test images taken Sept. 8, 2005, by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. The Mars-bound camera imaged Earth's Moon from a distance of about 10 million kilometers (6 million miles) away -- 26 times the distance between Earth and the Moon -- as part of an activity to test and calibrate the camera. The images are very significant because they show that the Mars Reconnaissance Orbiter spacecraft and this camera can properly operate together to collect very high-resolution images of Mars. The target must move through the camera's telescope view in just the right direction and speed to acquire a proper image. The day's test images also demonstrate that the focus mechanism works properly with the telescope to produce sharp images. Out of the 20,000-pixel-by-6,000-pixel full frame, the Moon's diameter is about 340 pixels, if the full Moon could be seen. The illuminated crescent is about 60 pixels wide, and the resolution is about 10 kilometers (6 miles) per pixel. At Mars, the entire image region will be filled with high-resolution information. The Mars Reconnaissance Orbiter, launched on Aug. 12, 2005, is on course to reach Mars on March 10, 2006. After gradually adjusting the shape of its orbit for half a year, it will begin its primary science phase in November 2006. From the mission's planned science orbit about 300 kilometers (186 miles) above the surface of Mars, the high resolution camera will be able to discern features as small as one meter or yard across. The Mars Reconnaissance Orbiter mission is managed by NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, for the NASA Science Mission Directorate. Lockheed Martin Space Systems, Denver, prime contractor for the project, built the spacecraft. Ball Aerospace & Technologies Corp., Boulder, Colo., built the High Resolution Imaging Science Experiment instrument for the University of Arizona, Tucson, to provide to the mission. The HiRISE Operations Center at the University of Arizona processes images from the camera.
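The pixel-scale figures quoted in this caption are internally consistent, as a short back-of-the-envelope check shows; the lunar diameter used below (about 3,475 km) is an assumed standard value, not taken from the caption.

```python
# Rough consistency check of the caption's numbers (illustrative only).
distance_km = 10_000_000        # quoted range to the Moon during the test
km_per_pixel = 10               # quoted resolution at that range
moon_diameter_km = 3475         # assumed mean lunar diameter

moon_pixels = moon_diameter_km / km_per_pixel       # ~348 px vs. "about 340 pixels"
ifov_rad = km_per_pixel / distance_km               # ~1 microradian per pixel
m_per_pixel_at_mars = ifov_rad * 300 * 1000         # ~0.3 m from a 300 km orbit
print(moon_pixels, ifov_rad, m_per_pixel_at_mars)
```

A scale of roughly 0.3 meters per pixel from the planned 300-kilometer orbit is consistent with the caption's statement that the camera will discern features about one meter across, since resolving a feature takes a few pixels.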
GEMINI-TITAN (GT)-11 - MISC. EXPERIMENTS - MSC
1966-03-22
S66-02611 (22 March 1966) --- Gemini-11 Experiment S-13 Ultraviolet Astronomical Camera. It will be used to test the techniques of ultraviolet photography under vacuum conditions and obtain ultraviolet radiation observations of stars in the wavelength region of 2,000 to 4,000 Angstroms by spectral means. Equipment is the Maurer 70mm camera with UV lens (f3.3) and magazine, objective grating and objective prism, extended shutter actuator, and mounting bracket. For the experiment, the camera is mounted on the centerline torque box to point through the opened right-hand hatch. Propellant expenditure is estimated at 4.5 pounds per night pass. Two night passes will be used to photograph probably six star fields. Sponsors are NASA's Office of Space Science and Applications and Northwestern University. Photo credit: NASA
International Space Station: Expedition 2000
NASA Technical Reports Server (NTRS)
2000-01-01
Live footage of the International Space Station (ISS) presents an inside look at the groundwork and assembly of the ISS. Footage includes both animation and live shots of a Space Shuttle liftoff. Phil West, Engineer; Dr. Catherine Clark, Chief Scientist ISS; and Joe Edwards, Astronaut, narrate the video. The first topic of discussion is People and Communications. Good communication is a key component in our ISS endeavor. Dr. Catherine Clark uses two soup cans attached by a string to demonstrate communication. Bill Nye the Science Guy talks briefly about science aboard the ISS. Charlie Spencer, Manager of Space Station Simulators, talks about communication aboard the ISS. The second topic of discussion is Engineering. Bonnie Dunbar, Astronaut at Johnson Space Flight Center, gives a tour of the Japanese Experiment Module (JEM). She takes us inside Node 2 and the U.S. Lab Destiny. She also shows where protein crystal growth experiments are performed. Audio terminal units are used for communication in the JEM. A demonstration of solar arrays and how they are tested is shown. Alan Bell, Project Manager MRMDF (Mobile Remote Manipulator Development Facility), describes the robot arm that is used on the ISS and how it maneuvers the Space Station. The third topic of discussion is Science and Technology. Dr. Catherine Clark, using a balloon attached to a weight, drops the apparatus to the ground to demonstrate Microgravity. The bursting of the balloon is observed. Sherri Dunnette, Imaging Technologist, describes the various cameras that are used in space. The types of still cameras used are: 1) 35 mm, 2) medium format cameras, 3) large format cameras, 4) video cameras, and 5) the DV camera. Kumar Krishen, Chief Technologist ISS, explains inframetrics, infrared vision cameras and how they perform. The Short Arm Centrifuge is shown by Dr. Millard Reske, Senior Life Scientist, to subject astronauts to forces greater than 1-g. Reske is interested in the physiological effects of the eyes and the muscular system after their exposure to forces greater than 1-g.
Science and Creative Writing: An Ad(d)verse Relationship?
ERIC Educational Resources Information Center
Blake, William E.
1983-01-01
Suggests integrating creative writing activities into field trips or outdoor education experiences in science as a method of providing "right-brain" and "left-brain" activities in the same exercise. Provides instructions given to students and a poem written from student "photographs" using imaginary cameras. Also provides two student poems. (JM)
Staking out Curiosity Landing Site
2012-08-09
The geological context for the landing site of NASA's Curiosity rover is visible in this image mosaic obtained by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter.
Mountainous Crater Rim on Mars
2013-10-17
This is a screen shot from a high-definition simulated movie of Mojave Crater on Mars, based on images taken by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter.
2003-01-16
KENNEDY SPACE CENTER, FLA. - STS-107 Mission Specialist Laurel Clark waves to a camera out of view during final preparations of her launch and entry suit in the White Room. The environmentally controlled chamber is mated to Space Shuttle Columbia for entry into the Shuttle. The hatch is seen in the background right. STS-107 is a mission devoted to research and will include more than 80 experiments that will study Earth and space science, advanced technology development, and astronaut health and safety. The payload on Space Shuttle Columbia includes FREESTAR (Fast Reaction Experiments Enabling Science, Technology, Applications and Research) and the SHI Research Double Module (SHI/RDM), known as SPACEHAB. Experiments on the module range from material sciences to life sciences. Liftoff is scheduled for 10:39 a.m. EST.
Color Image of Phoenix Lander on Mars Surface
2008-05-27
This is an enhanced-color image from the Mars Reconnaissance Orbiter's High Resolution Imaging Science Experiment (HiRISE) camera. It shows NASA's Phoenix Mars Lander with its solar panels deployed on the Mars surface.
2012-09-06
Tracks from the first drives of NASA's Curiosity rover are visible in this image captured by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. The rover is seen where the tracks end.
Large, Fresh Crater Surrounded by Smaller Craters
2014-05-22
The largest crater associated with a March 2012 impact on Mars has many smaller craters around it, revealed in this image from the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter.
Wilderness experience in Rocky Mountain National Park 2002: Report to RMNP
Schuster, Elke; Johnson, S. Shea; Taylor, Jonathan G.
2004-01-01
The social science technique of Visitor Employed Photography [VEP] was used to obtain information from visitors about wilderness experiences. Visitors were selected at random from Park-designated wilderness trails, in proportion to their use, and asked to participate in the survey. Respondents were given single-use, 10-exposure cameras and photo-log diaries to record experiences. A total of 293 cameras were distributed, with a response rate of 87%. Following the development of the photos, a copy of the photos, two pertinent pages from the photo-log, and a follow-up survey were mailed to respondents. Fifty six percent of the follow-up surveys were returned. Findings from the two surveys were analyzed and compared.
Volunteers Help Decide Where to Point Mars Camera
2015-07-22
This series of images from NASA's Mars Reconnaissance Orbiter successively zooms into "spider" features -- or channels carved in the surface in radial patterns -- in the south polar region of Mars. In a new citizen-science project, volunteers will identify features like these using wide-scale images from the orbiter. Their input will then help mission planners decide where to point the orbiter's high-resolution camera for more detailed views of interesting terrain. Volunteers will start with images from the orbiter's Context Camera (CTX), which provides wide views of the Red Planet. The first two images in this series are from CTX; the top right image zooms into a portion of the image at left. The top right image highlights the geological spider features, which are carved into the terrain in the Martian spring when dry ice turns to gas. By identifying unusual features like these, volunteers will help the mission team choose targets for the orbiter's High Resolution Imaging Science Experiment (HiRISE) camera, which can reveal more detail than any other camera ever put into orbit around Mars. The final image in this series (bottom right) shows a HiRISE close-up of one of the spider features. http://photojournal.jpl.nasa.gov/catalog/PIA19823
Payload specialist Merbold performing experiment in Spacelab
1983-11-28
STS009-13-699 (28 Nov - 8 Dec 1983) --- Ulf Merbold, Spacelab 1 payload specialist, carries out one of the experiments using the gradient heating facility on the materials science double rack facility in the busy science module aboard the Earth-orbiting Space Shuttle Columbia. Representing the European Space Agency, Dr. Merbold comes from the Max-Planck Institute in Stuttgart, the Federal Republic of Germany. He is a specialist in crystal lattice defects and low temperature physics. The photograph was made with a 35mm camera.
Caught in Action: Avalanches on North Polar Scarps
2008-03-03
Amazingly, this image has captured at least four Martian avalanches, or debris falls, in action. It was taken on February 19, 2008, by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter.
Phoenix Lander Amid Disappearing Spring Ice
2010-01-11
NASA's Phoenix Mars Lander and its backshell and heatshield are visible within this enhanced-color image of the Phoenix landing site, taken on Jan. 6, 2010, by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter.
Phobos from 6,800 Kilometers (Color)
2008-04-09
The High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter took two images of the larger of Mars' two moons, Phobos, within 10 minutes of each other on March 23, 2008. This is the first.
2008-05-24
This animation zooms in on the area on Mars where NASA's Phoenix Mars Lander will touch down on May 25, 2008. The image was taken by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter.
Could This Be the Soviet Mars 3 Lander?
2013-04-11
This set of images shows what might be hardware from the Soviet Union's 1971 Mars 3 lander, seen in a pair of images from the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter.
Oblique View of Victoria Crater
2009-08-12
This image of Victoria Crater in the Meridiani Planum region of Mars was taken by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter at more of a sideways angle than earlier orbital images of this crater.
High-Tech Simulations Linked to Learning
ERIC Educational Resources Information Center
Ash, Katie
2009-01-01
To build on classroom experiments and lectures, Daniel Sweeney has his 9th grade earth science students act out scientific concepts on a 15-by-15-foot mat on the floor of the room. Object-tracking cameras mounted on scaffolding around the space collect data based on the students' movements while immersing them in the experience through a video…
Full-Frame Reference for Test Photo of Moon
2005-09-10
This pair of views shows how little of the full image frame was taken up by the Moon in test images taken Sept. 8, 2005, by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter.
NASA Mars Science Laboratory Rover
NASA Technical Reports Server (NTRS)
Olson, Tim
2017-01-01
Since August 2012, the NASA Mars Science Laboratory (MSL) rover Curiosity has been operating on the Martian surface. The primary goal of the MSL mission is to assess whether Mars ever had an environment suitable for life. MSL Science Team member Dr. Tim Olson will provide an overview of the rover's capabilities and the major findings from the mission so far. He will also share some of his experiences of what it is like to operate Curiosity's science cameras and explore Mars as part of a large team of scientists and engineers.
Science, conservation, and camera traps
Nichols, James D.; Karanth, K. Ullas; O'Connel, Allan F.; O'Connell, Allan F.; Nichols, James D.; Karanth, K. Ullas
2011-01-01
Biologists commonly perceive camera traps as a new tool that enables them to enter the hitherto secret world of wild animals. Camera traps are being used in a wide range of studies dealing with animal ecology, behavior, and conservation. Our intention in this volume is not to simply present the various uses of camera traps, but to focus on their use in the conduct of science and conservation. In this chapter, we provide an overview of these two broad classes of endeavor and sketch the manner in which camera traps are likely to be able to contribute to them. Our main point here is that neither photographs of individual animals, nor detection history data, nor parameter estimates generated from detection histories are the ultimate objective of a camera trap study directed at either science or management. Instead, the ultimate objectives are best viewed as either gaining an understanding of how ecological systems work (science) or trying to make wise decisions that move systems from less desirable to more desirable states (conservation, management). Therefore, we briefly describe here basic approaches to science and management, emphasizing the role of field data and associated analyses in these processes. We provide examples of ways in which camera trap data can inform science and management.
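As a concrete illustration of the detection-history analyses referred to above, the sketch below fits a single-season occupancy model (in the spirit of MacKenzie et al.) to toy camera-trap data; it is a generic example under our own assumptions, not code from this volume.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # inverse-logit keeps probabilities in (0, 1)

def neg_log_likelihood(params, histories):
    """Single-season occupancy model: psi = Pr(site occupied),
    p = Pr(detection on one occasion | occupied).
    histories is a 2-D 0/1 array (sites x occasions)."""
    psi, p = expit(params)
    detections = histories.sum(axis=1)
    occasions = histories.shape[1]
    # Sites with at least one detection are certainly occupied.
    ll_detected = (np.log(psi) + detections * np.log(p)
                   + (occasions - detections) * np.log(1 - p))
    # All-zero sites: either occupied but never detected, or unoccupied.
    ll_empty = np.log(psi * (1 - p) ** occasions + (1 - psi))
    return -np.where(detections > 0, ll_detected, ll_empty).sum()

# Toy detection histories: 5 camera-trap sites, 4 survey occasions.
histories = np.array([[0, 1, 0, 1], [0, 0, 0, 0], [1, 1, 0, 1],
                      [0, 0, 0, 0], [0, 0, 1, 0]])
fit = minimize(neg_log_likelihood, x0=[0.0, 0.0], args=(histories,))
psi_hat, p_hat = expit(fit.x)
```

The key point is that an all-zero history is ambiguous: the site may be unoccupied, or occupied but never photographed, and the likelihood weighs both possibilities rather than treating non-detection as absence.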
Earth and Moon as Seen from Mars
2008-03-03
The High Resolution Imaging Science Experiment (HiRISE) camera would make a great backyard telescope for viewing Mars, and we can also use it at Mars to view other planets. This is an image of Earth and the Moon, acquired on October 3, 2007.
NASA Astrophysics Data System (ADS)
Tsuruoka, Masako; Shibasaki, Ryosuke; Box, Elgene O.; Murai, Shunji; Mori, Eiji; Wada, Takao; Kurita, Masahiro; Iritani, Makoto; Kuroki, Yoshikatsu
1994-08-01
In medical rehabilitation science, quantitative understanding of patient movement in 3-D space is very important. A patient with any joint disorder will experience its influence on other body parts in daily movement, and the alignment of joints in movement can improve during the course of medical therapy. In this study, the newly developed system is composed of two non-metric CCD video cameras and a force plate sensor, which are controlled simultaneously by a personal computer. The system provides time-series digital data from 3-D image photogrammetry together with each foot's pressure and its center position, giving efficient information for biomechanical and mathematical analysis of human movement. Specific and common points can be identified in any patient movement. This study suggests a broader, more quantitative understanding in medical rehabilitation science.
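As an illustration of how the force-plate half of such a system is typically reduced to the center position of foot pressure, here is a minimal center-of-pressure calculation; the channel names, the reference-point offset z0, and the sign convention are our assumptions (conventions differ between plate manufacturers), not details taken from this paper.

```python
import numpy as np

def center_of_pressure(fx, fy, fz, mx, my, z0=0.0):
    """Center of pressure (m) from force-plate forces (N) and moments (N*m),
    assuming moments are expressed about a point a distance z0 below the
    plate's top surface (z0 = 0 if referenced to the surface itself)."""
    fz = np.where(np.abs(fz) < 1e-6, np.nan, fz)  # avoid dividing by ~0 when unloaded
    cop_x = (-my + fx * z0) / fz
    cop_y = ( mx + fy * z0) / fz
    return cop_x, cop_y
```

The resulting cop_x and cop_y time series can then be aligned frame by frame with the joint positions obtained from the two-camera photogrammetry.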
Cassini Camera Contamination Anomaly: Experiences and Lessons Learned
NASA Technical Reports Server (NTRS)
Haemmerle, Vance R.; Gerhard, James H.
2006-01-01
We discuss the contamination 'Haze' anomaly for the Cassini Narrow Angle Camera (NAC), one of two optical telescopes that comprise the Imaging Science Subsystem (ISS). Cassini is a Saturn Orbiter with a 4-year nominal mission. The incident occurred in 2001, five months after Jupiter encounter during the Cruise phase and ironically at the resumption of planned maintenance decontamination cycles. The degraded optical performance was first identified by the Instrument Operations Team with the first ISS Saturn imaging six weeks later. A distinct haze of varying size from image to image marred the images of Saturn. A photometric star calibration of the Pleiades, 4 days after the incident, showed stars with halos. Analysis showed that while the halo's intensity was only 1 - 2% of the intensity of the central peak of a star, the halo contained 30 - 70% of its integrated flux. This condition would impact science return. In a review of our experiences, we examine the contamination control plan, discuss the analysis of the limited data available and describe the one-year campaign to remove the haze from the camera. After several long conservative heating activities and interim analysis of their results, the contamination problem as measured by the camera's point spread function was essentially back to pre-anomaly size and at a point where there would be more risk to continue. We stress the importance of the flexibility of operations and instrument design, the need to do early in-flight instrument calibration and continual monitoring of instrument performance.
Experience with the UKIRT InSb array camera
NASA Technical Reports Server (NTRS)
Mclean, Ian S.; Casali, Mark M.; Wright, Gillian S.; Aspin, Colin
1989-01-01
The cryogenic infrared camera, IRCAM, has been operating routinely on the 3.8 m UK Infrared Telescope on Mauna Kea, Hawaii for over two years. The camera, which uses a 62x58 element Indium Antimonide array from Santa Barbara Research Center, was designed and built at the Royal Observatory, Edinburgh which operates UKIRT on behalf of the UK Science and Engineering Research Council. Over the past two years at least 60% of the available time on UKIRT has been allocated for IRCAM observations. Described here are some of the properties of this instrument and its detector which influence astronomical performance. Observational techniques and the power of IR arrays with some recent astronomical results are discussed.
SOFIA Science Instruments: Commissioning, Upgrades and Future Opportunities
NASA Technical Reports Server (NTRS)
Smith, Erin C.
2014-01-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is the world's largest airborne observatory, featuring a 2.5 meter telescope housed in the aft section of a Boeing 747SP aircraft. SOFIA's current instrument suite includes: FORCAST (Faint Object InfraRed CAmera for the SOFIA Telescope), a 5-40 µm dual band imager/grism spectrometer developed at Cornell University; HIPO (High-speed Imaging Photometer for Occultations), a 0.3-1.1 micron imager built by Lowell Observatory; FLITECAM (First Light Infrared Test Experiment CAMera), a 1-5 micron wide-field imager/grism spectrometer developed at UCLA; FIFI-LS (Far-Infrared Field-Imaging Line Spectrometer), a 42-210 micron IFU grating spectrograph completed by the University of Stuttgart; and EXES (Echelon-Cross-Echelle Spectrograph), a 5-28 micron high-resolution spectrometer being completed by UC Davis and NASA Ames. A second generation instrument, HAWC+ (High-resolution Airborne Wideband Camera), is a 50-240 micron imager being upgraded at JPL to add polarimetry and new detectors developed at GSFC. SOFIA will continually update its instrument suite with new instrumentation, technology demonstration experiments and upgrades to the existing instrument suite. This paper details instrument capabilities and status as well as plans for future instrumentation, including the call for proposals for 3rd generation SOFIA science instruments.
Mars Descent Imager for Curiosity
2010-07-19
A pocketknife provides scale for this image of the Mars Descent Imager camera; the camera will fly on the Curiosity rover of NASA's Mars Science Laboratory mission. Malin Space Science Systems, San Diego, Calif., supplied the camera for the mission.
Ultrafast electron microscopy in materials science, biology, and chemistry
NASA Astrophysics Data System (ADS)
King, Wayne E.; Campbell, Geoffrey H.; Frank, Alan; Reed, Bryan; Schmerge, John F.; Siwick, Bradley J.; Stuart, Brent C.; Weber, Peter M.
2005-06-01
The use of pump-probe experiments to study complex transient events has been an area of significant interest in materials science, biology, and chemistry. While the emphasis has been on laser pump with laser probe and laser pump with x-ray probe experiments, there is a significant and growing interest in using electrons as probes. Early experiments used electrons for gas-phase diffraction of photostimulated chemical reactions. More recently, scientists are beginning to explore phenomena in the solid state such as phase transformations, twinning, solid-state chemical reactions, radiation damage, and shock propagation. This review focuses on the emerging area of ultrafast electron microscopy (UEM), which comprises ultrafast electron diffraction (UED) and dynamic transmission electron microscopy (DTEM). The topics that are treated include the following: (1) The physics of electrons as an ultrafast probe. This encompasses the propagation dynamics of the electrons (space-charge effect, Child's law, Boersch effect) and extends to relativistic effects. (2) The anatomy of UED and DTEM instruments. This includes discussions of the photoactivated electron gun (also known as photogun or photoelectron gun) at conventional energies (60-200 keV) and extends to MeV beams generated by rf guns. Another critical aspect of the systems is the electron detector. Charge-coupled device cameras and microchannel-plate-based cameras are compared and contrasted. The effect of various physical phenomena on detective quantum efficiency is discussed. (3) Practical aspects of operation. This includes determination of time zero, measurement of pulse-length, and strategies for pulse compression. (4) Current and potential applications in materials science, biology, and chemistry. UEM has the potential to make a significant impact in future science and technology. Understanding of reaction pathways of complex transient phenomena in materials science, biology, and chemistry will provide fundamental knowledge for discovery-class science.
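One of the basic numbers behind the relativistic effects mentioned in the review is the electron wavelength at a given accelerating voltage. The short calculation below applies the standard relativistically corrected de Broglie formula across the 60-200 keV range quoted above; it is textbook physics, not code from the review.

```python
import math

h = 6.62607015e-34    # Planck constant, J*s
m_e = 9.1093837e-31   # electron rest mass, kg
e = 1.602176634e-19   # elementary charge, C
c = 2.99792458e8      # speed of light, m/s

def electron_wavelength(volts):
    """Relativistically corrected de Broglie wavelength (m) of an electron
    accelerated through `volts` volts."""
    ev = e * volts
    return h / math.sqrt(2 * m_e * ev * (1 + ev / (2 * m_e * c**2)))

for kv in (60, 100, 200):
    print(f"{kv:3d} kV -> {electron_wavelength(kv * 1e3) * 1e12:.2f} pm")
```

At 200 kV this gives roughly 2.5 pm, far smaller than atomic spacings, which is why temporal rather than spatial resolution is the limiting factor in ultrafast electron microscopy.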
Wageningen UR Unmanned Aerial Remote Sensing Facility - Overview of activities
NASA Astrophysics Data System (ADS)
Bartholomeus, Harm; Keesstra, Saskia; Kooistra, Lammert; Suomalainen, Juha; Mucher, Sander; Kramer, Henk; Franke, Jappe
2016-04-01
To support environmental management there is an increasing need for timely, accurate and detailed information on our land. Unmanned Aerial Systems (UAS) are increasingly used to monitor agricultural crop development, habitat quality or urban heat efficiency. An important reason is that UAS technology is maturing quickly while the flexible capabilities of UAS fill a gap between satellite based and ground based geo-sensing systems. In 2012, different groups within Wageningen University and Research Centre have established an Unmanned Airborne Remote Sensing Facility. The objective of this facility is threefold: a) To develop innovation in the field of remote sensing science by providing a platform for dedicated and high-quality experiments; b) To support high quality UAS services by providing calibration facilities and disseminating processing procedures to the UAS user community; and c) To promote and test the use of UAS in a broad range of application fields like habitat monitoring, precision agriculture and land degradation assessment. The facility is hosted by the Laboratory of Geo-Information Science and Remote Sensing (GRS) and the Department of Soil Physics and Land Management (SLM) of Wageningen University together with the team Earth Informatics (EI) of Alterra. The added value of the Unmanned Aerial Remote Sensing Facility is that, compared to for example satellite-based remote sensing, more dedicated science experiments can be prepared. This includes, for example, more frequent observations in time (e.g., diurnal observations), observations of an object under different viewing angles for characterization of BRDF, and flexibility in the use of camera and sensor types. In this way, laboratory-type setups can be tested in a field situation and the effects of up-scaling can be evaluated. In recent years we have developed and implemented different camera systems (e.g., a hyperspectral pushbroom system and multispectral frame cameras), which we have operated in projects all around the world, while new camera systems such as LiDAR and a full-frame hyperspectral camera are being planned. In the presentation we will give an overview of our activities, ranging from erosion studies, decision support for precision agriculture, determining leaf biochemistry and canopy structure in tropical forests to the mapping of coastal zones.
FBIS report. Science and technology: Japan, November 6, 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-06
Some articles are: R&D on Microfactory Technologies; MHI Develops Low Cost, Low Noise Mid-size Helicopters; Kumamoto University to Apply for Approval to Conduct Clinical Experiment for Gene Therapy; MITI To Support Private Sector to Develop Cipher Technology; and Hitachi Electronics Develops Digital Broadcasting Camera System.
2003-01-16
KENNEDY SPACE CENTER, FLA. - A closeup camera view shows Space Shuttle Columbia as it lifts off from Launch Pad 39A on mission STS-107. Following a flawless and uneventful countdown, liftoff occurred on-time at 10:39 a.m. EST. The 16-day research mission includes FREESTAR (Fast Reaction Experiments Enabling Science, Technology, Applications and Research) and the SHI Research Double Module (SHI/RDM), known as SPACEHAB. Experiments on the module range from material sciences to life sciences. Landing of Columbia is scheduled at about 8:53 a.m. EST on Saturday, Feb. 1. This mission is the first Shuttle mission of 2003. Mission STS-107 is the 28th flight of the orbiter Columbia and the 113th flight overall in NASA's Space Shuttle program.
NASA Astrophysics Data System (ADS)
Muñoz-Franco, Granada; Criado, Ana María; García-Carmona, Antonio
2018-04-01
This article presents the results of a qualitative study aimed at determining the effectiveness of the camera obscura as a didactic tool to understand image formation (i.e., how it is possible to see objects and how their image is formed on the retina, and what the image formed on the retina is like compared to the object observed) in a context of scientific inquiry. The study involved 104 prospective primary teachers (PPTs) who were being trained in science teaching. To assess the effectiveness of this tool, an open questionnaire was applied before (pre-test) and after (post-test) the educational intervention. The data were analyzed by combining methods of inter- and intra-rater analysis. The results showed that more than half of the PPTs advanced in their ideas towards the desirable level of knowledge in relation to the phenomena studied. The conclusion reached is that the camera obscura, used in a context of scientific inquiry, is a useful tool for PPTs to improve their knowledge about image formation and experience in the first person an authentic scientific inquiry during their teacher training.
Overview of Athena Microscopic Imager Results
NASA Technical Reports Server (NTRS)
Herkenhoff, K.; Squyres, S.; Arvidson, R.; Bass, D.; Bell, J., III; Bertelsen, P.; Cabrol, N.; Ehlmann, B.; Farrand, W.; Gaddis, L.
2005-01-01
The Athena science payload on the Mars Exploration Rovers (MER) includes the Microscopic Imager (MI). The MI is a fixed-focus camera mounted on an extendable arm, the Instrument Deployment Device (IDD). The MI acquires images at a spatial resolution of 31 microns/pixel over a broad spectral range (400 - 700 nm). The MI uses the same electronics design as the other MER cameras but its optics yield a field of view of 32 x 32 mm across a 1024 x 1024 pixel CCD image. The MI acquires images using only solar or skylight illumination of the target surface. The MI science objectives, instrument design and calibration, operation, and data processing were described by Herkenhoff et al. Initial results of the MI experiment on both MER rovers (Spirit and Opportunity) have been published previously. Highlights of these and more recent results are described.
PoSSUM: Polar Suborbital Science in the Upper Mesosphere
NASA Astrophysics Data System (ADS)
Reimuller, J. D.; Fritts, D. C.; Thomas, G. E.; Taylor, M. J.; Mitchell, S.; Lehmacher, G. A.; Watchorn, S. R.; Baumgarten, G.; Plane, J. M.
2013-12-01
Project PoSSUM (www.projectpossum.org) is a suborbital research project leveraging imaging and remote sensing techniques from Reusable Suborbital Launch Vehicles (rSLVs) to gather critical climate data through use of the PoSSUM Observatory and the PoSSUM Aeronomy Laboratory. An acronym for Polar Suborbital Science in the Upper Mesosphere, PoSSUM grew from the opportunity created by the Noctilucent Cloud Imagery and Tomography Experiment, selected by the NASA Flight Opportunities Program as Experiment 46-S in March 2012. This experiment will employ an rSLV (e.g. the XCOR Lynx Mark II) launched from a high-latitude spaceport (e.g. Eielson AFB, Alaska or Kiruna, Sweden) during a week-long deployment scheduled for July 2015 to address critical questions concerning noctilucent clouds (NLCs) through flights that transition the cloud layer where the clouds will be under direct illumination from the sun. The 2015 Project PoSSUM NLC campaign will use the unique capability of rSLVs to address key under-answered questions pertaining to NLCs. Specifically, PoSSUM will answer: 1) What are the small-scale dynamics of NLCs and what does this tell us about the energy and momentum deposition from the lower atmosphere? 2) What is the seasonal variability of NLCs, mesospheric dynamics, and temperatures? 3) Are structures observed in the OH layer coupled with NLC structures? 4) How do NLCs nucleate? and 5) What is the geometry of NLC particles and how do they stratify? Instrumentation will include video and still-frame visible cameras (PoSSUMCam), infrared cameras, a mesospheric temperatures experiment, a depolarization LiDAR, a mesospheric density and temperatures experiment (MCAT), a mesospheric winds experiment, and a meteoric smoke detector (MASS). The instrument suite used on PoSSUM will mature through subsequent campaigns to develop an integrated, modular laboratory (the 'PoSSUM Observatory') that will provide repeatable, low cost, in-situ NLC and aeronomy observations as well as validate a method to serve the broader Earth Observation science, atmospheric science, and aeronomy communities.
SKYLAB (SL)-4 - CREW TRAINING (ORBITAL WORKSHOP [OWS]) - JSC
1973-08-22
S73-32840 (10 Sept. 1973) --- Scientist-astronaut Edward G. Gibson, Skylab 4 science pilot, turns on a switch on the control box of the S190B camera, one of the components of the Earth Resources Experiments Package (EREP). The single lens Earth Terrain Camera takes five-inch photographs. Behind Gibson is the stowed suit of astronaut Gerald P. Carr, commander for the third manned mission. The crew's other member is astronaut William R. Pogue, pilot. The training exercise took place in the Orbital Workshop one-G trainer at Johnson Space Center. Photo credit: NASA
Color Image of Phoenix Lander on Mars Surface
NASA Technical Reports Server (NTRS)
2008-01-01
This is an enhanced-color image from Mars Reconnaissance Orbiter's High Resolution Imaging Science Experiment (HiRISE) camera. It shows the Phoenix lander with its solar panels deployed on the Mars surface. The spacecraft appears more blue than it would in reality. The blue/green and red filters on the HiRISE camera were used to make this picture. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
View of Astronaut Owen Garriott taking video of two Skylab spiders experiment
NASA Technical Reports Server (NTRS)
1973-01-01
View of Scientist-Astronaut Owen K. Garriott, Skylab 3 science pilot, taking TV footage of Arabella and Anita, the two Skylab 3 common cross spiders 'aranous diadematus,' aboard the Skylab space station cluster in Earth orbit. During the 59 day Skylab 3 mission the two spiders Arabella and Anita, were housed in an enclosure onto which a motion picture and still camera were attached to record the spiders' attempts to build a web in the weightless environment. Note the automatic data acquisition camera (DAC) about 3.5 feet to Garriott's right (about waist level).
Opportunity Science Using the Juno Magnetometer Investigation Star Trackers
NASA Astrophysics Data System (ADS)
Joergensen, J. L.; Connerney, J. E.; Bang, A. M.; Denver, T.; Oliversen, R. J.; Benn, M.; Lawton, P.
2013-12-01
The magnetometer experiment onboard Juno is equipped with four non-magnetic star tracker camera heads, two of which reside on each of the magnetometer sensor optical benches. These are located 10 and 12 m from the spacecraft body at the end of one of the three solar panel wings. The star tracker, collectively referred to as the Advanced Stellar Compass (ASC), provides high accuracy attitude information for the magnetometer sensors throughout science operations. The star tracker camera heads are pointed +/- 13 deg off the spin vector, in the anti-sun direction, imaging a 13 x 20 deg field of view every ¼ second as Juno rotates at 1 or 2 rpm. The ASC is a fully autonomous star tracker, producing a time series of attitude quaternions for each camera head, utilizing a suite of internal support functions. These include imaging capabilities, autonomous object tracking, automatic dark-sky monitoring, and related capabilities; these internal functions may be accessed via telecommand. During Juno's cruise phase, this capability can be tapped to provide unique science and engineering data available along the Juno trajectory. We present a few examples of the JUNO ASC opportunity science here. As the Juno spacecraft approached the Earth-Moon system for the close encounter with the Earth on October 9, 2013, one of the ASC camera heads obtained imagery of the Earth-Moon system while the other three remained in full science (attitude determination) operation. This enabled the first movie of the Earth and Moon obtained by a spacecraft flying past the Earth in gravity assist. We also use the many artificial satellites in orbit about the Earth as calibration targets for the autonomous asteroid detection system inherent to the ASC autonomous star tracker. We shall also profile the zodiacal dust disk, using the interstellar image data, and present the outlook for small asteroid body detection and distribution being performed during Juno's passage from Earth flyby to Jovian orbit insertion.
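As a small illustration of what a quarter-second quaternion time series like the ASC's can yield, the sketch below recovers a spin rate from two consecutive attitude quaternions; the scalar-first quaternion layout and the 0.25 s sample spacing are assumptions made for the example, not ASC interface details.

```python
import numpy as np

def spin_rate_rpm(q1, q2, dt=0.25):
    """Angular rate (rpm) implied by two unit attitude quaternions
    (scalar-first, [w, x, y, z]) separated by dt seconds."""
    w1, v1 = q1[0], np.asarray(q1[1:])
    w2, v2 = q2[0], np.asarray(q2[1:])
    # Scalar part of the relative rotation q_rel = conj(q1) * q2.
    w = w1 * w2 + np.dot(v1, v2)
    angle = 2.0 * np.arccos(np.clip(abs(w), 0.0, 1.0))  # radians rotated in dt
    return angle / dt * 60.0 / (2.0 * np.pi)

# Example: a body spinning at 2 rpm turns 3 degrees in 0.25 s about z.
half = np.deg2rad(3.0) / 2.0
q_a = np.array([1.0, 0.0, 0.0, 0.0])
q_b = np.array([np.cos(half), 0.0, 0.0, np.sin(half)])
print(spin_rate_rpm(q_a, q_b))  # ~2.0
```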
Pilot Kent Rominger floats in tunnel
1995-10-24
STS073-E-5053 (26 Oct. 1995) --- Astronaut Kent V. Rominger, STS-73 pilot, floats through a tunnel connecting the space shuttle Columbia's cabin and its science module. Rominger is one of seven crewmembers in the midst of a 16-day multi-faceted mission aboard Columbia. For the next week and a half, the crew will continue working in shifts around the clock on a diverse assortment of United States Microgravity Laboratory (USML-2) experiments located in the science module. Fields of study include fluid physics, materials science, biotechnology, combustion science and commercial space processing technologies. The frame was exposed with an Electronic Still Camera (ESC).
NASA Astrophysics Data System (ADS)
2005-01-01
Einstein year: Einstein is brought back to life for a year of educational events Workshop: Students reach out for the Moon Event: Masterclasses go with a bang Workshop: Students search for asteroids on Einstein's birthday Scotland: Curriculum for Excellence takes holistic approach Conference: Reporting from a mattress in Nachod Conference: 'Change' is key objective at ICPE conference 2005 Lecture: Institute of Physics Schools Lecture series Conference: Experience showcase science in Warwick National network: Science Learning Centre opens Meeting: 30th Stirling Physics Meeting breaks records Competition: Win a digital camera! Forthcoming Events
View of the Columbia's remote manipulator system
1982-03-30
STS003-09-444 (22-30 March 1982) --- The darkness of space provides the backdrop for this scene of the plasma diagnostics package (PDP) experiment in the grasp of the end effector or 'hand' of the remote manipulator system (RMS) arm, and other components of the Office of Space Sciences (OSS-1) package in the aft section of the Columbia's cargo hold. The PDP is a compact, comprehensive assembly of electromagnetic and particle sensors that will be used to study the interaction of the orbiter with its surrounding environment; to test the capabilities of the shuttle's remote manipulator system; and to carry out experiments in conjunction with the fast pulse electron generator of the vehicle charging and potential experiment, another experiment on the OSS-1 payload pallet. This photograph was exposed by the astronaut crew of STS-3 with a 70mm handheld camera aimed through the flight deck's aft window. Photo credit: NASA
Science Highlights from the First Year of Advanced Camera for Surveys
NASA Technical Reports Server (NTRS)
Clampin, M.; Ford, H. C.; Illingworth, G. D.; Hartig, G.; Ardila, D. R.; Blakeslee, J. P.; Bouwens, R. J.; Cross, N. J. G.; Feldman, P. D.; Golimowski, D. A.
2003-01-01
The Advanced Camera for Surveys (ACS) is a deep imaging camera installed on the Hubble Space Telescope during the fourth HST servicing mission. ACS recently entered its second year of science operations and continues to perform beyond pre-launch expectations. We present science highlights from the ACS Science Team's GTO program. These highlights include the evolution of Z approx. 6 galaxies from deep imaging observations; deep imaging of strongly lensed clusters which have been used to determine cluster mass, and independently constrain the geometry of the Universe; and coronagraphic observations of debris disks.
Real-time imaging of quantum entanglement.
Fickler, Robert; Krenn, Mario; Lapkiewicz, Radek; Ramelow, Sven; Zeilinger, Anton
2013-01-01
Quantum entanglement is widely regarded as one of the most prominent features of quantum mechanics and quantum information science. Although photonic entanglement is routinely studied in many experiments nowadays, its signature has been out of the grasp for real-time imaging. Here we show that modern technology, namely triggered intensified charge coupled device (ICCD) cameras, is fast and sensitive enough to image in real-time the effect of the measurement of one photon on its entangled partner. To quantitatively verify the non-classicality of the measurements we determine the detected photon number and error margin from the registered intensity image within a certain region. Additionally, the use of the ICCD camera allows us to demonstrate the high flexibility of the setup in creating any desired spatial-mode entanglement, which suggests as well that visual imaging in quantum optics not only provides a better intuitive understanding of entanglement but will improve applications of quantum science.
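A minimal sketch of the kind of region-of-interest photon counting described here: sum the counts in a region, convert to photons with the camera gain, and take the Poisson square root as the error margin. The gain value, region, and dark level below are placeholders, not the authors' numbers.

```python
import numpy as np

def photons_in_region(image, rows, cols, counts_per_photon, dark_level=0.0):
    """Detected photon number and Poisson error inside a rectangular region
    of an intensified-camera intensity image."""
    roi = image[rows, cols].astype(float) - dark_level
    n_photons = roi.sum() / counts_per_photon   # camera counts -> photons
    return n_photons, np.sqrt(max(n_photons, 0.0))

# Toy usage with a synthetic 100x100 frame (placeholder numbers).
rng = np.random.default_rng(0)
frame = rng.poisson(5.0, size=(100, 100)) * 20.0   # ~5 photons/px, 20 counts/photon
n, err = photons_in_region(frame, slice(40, 60), slice(40, 60), counts_per_photon=20.0)
print(f"{n:.0f} +/- {err:.0f} photons")
```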
Magnetospheric Multiscale Mission Micrometeoroid/Orbital Debris Impacts
NASA Technical Reports Server (NTRS)
Williams, Trevor; Sedlak, Joseph; Shulman, Seth
2017-01-01
The MMS spacecraft are highly instrumented (accelerometers, star cameras, Sun sensors, science experiments for plasmas etc.). This presentation will discuss how data from these systems has allowed two micrometeoroid/orbital debris events to be studied: the Feb. 2, 2016 impact with an MMS4 shunt resistor, and the June 12, 2016 impact with an MMS4 wire boom.
SKYLAB (SL)-2 - EXPERIMENTS (M-114)
1973-06-05
S73-27509 (6 June 1973) --- Scientist-astronaut Joseph P. Kerwin (right), Skylab 2 science pilot and a doctor of medicine, takes a blood sample from astronaut Charles Conrad Jr., Skylab 2 commander, as seen in this reproduction taken from a color television transmission made by a TV camera aboard the Skylab 1 and 2 space station cluster in Earth orbit. The blood sampling was part of the Skylab Hematology and Immunology Experiment M110 series. Photo credit: NASA
NASA Technical Reports Server (NTRS)
Bothwell, Mary
2004-01-01
My division was charged with building a suite of cameras for the Mars Exploration Rover (MER) project. We were building the science cameras on the mast assembly, the microscope camera, and the hazard and navigation cameras for the rovers. Not surprisingly, a lot of folks were paying attention to our work - because there's really no point in landing on Mars if you can't take pictures. In Spring 2002 things were not looking good. The electronics weren't coming in, and we had to go back to the vendors. The vendors would change the design, send the boards back, and they wouldn't work. On our side, we had an instrument manager in charge who I believe has the potential to become a great manager, but when things got behind schedule he didn't have the experience to know what was needed to catch up. As division manager, I was ultimately responsible for seeing that all my project and instrument managers delivered their work. I had to make the decision whether or not to replace him.
View of Astronaut Owen Garriott taking video of two Skylab spiders experiment
1973-08-16
SL3-109-1345 (August 1973) --- View of scientist-astronaut Owen K. Garriott, Skylab 3 science pilot, taking TV footage of Arabella and Anita, the two Skylab 3 common cross spiders "aranous diadematus," aboard the Skylab space station cluster in Earth orbit. During the 59-day Skylab 3 mission the two spiders Arabella and Anita, were housed in an enclosure onto which a motion picture and still camera were attached to record the spiders' attempts to build a web in the weightless environment. Note the automatic data acquisition camera (DAC) about 3.5 feet to Garriott's right (about waist level). Photo credit: NASA
In this medium close-up view, captured by an Electronic Still Camera (ESC), the Spartan 207
NASA Technical Reports Server (NTRS)
1996-01-01
STS-77 ESC VIEW --- In this medium close-up view, captured by an Electronic Still Camera (ESC), the Spartan 207 free-flyer is held in the grasp of the Space Shuttle Endeavour's Remote Manipulator System (RMS) following its re-capture on May 21, 1996. The six-member crew has spent a portion of the early stages of the mission in various activities involving the Spartan 207 and the related Inflatable Antenna Experiment (IAE). The Spartan project is managed by NASA's Goddard Space Flight Center (GSFC) for NASA's Office of Space Science, Washington, D.C. GMT: 09:38:05.
Exploring the dark energy biosphere, 15 seconds at a time
NASA Astrophysics Data System (ADS)
Petrone, C.; Tossey, L.; Biddle, J.
2016-12-01
Science communication often suffers from numerous pitfalls including jargon, complexity, a general lack of (science) education of the audience, and short attention spans. With the Center for Dark Energy Biosphere Investigations (C-DEBI), Delaware Sea Grant is expanding its collection of 15 Second Science videos, which deliver complex science topics with visually stimulating footage and succinct audio. Featuring a diverse cast of scientists and educators in front of the camera, we are expanding our reach into the public and classrooms. We are also experimenting with smartphone-based virtual reality, for a more immersive experience into the deep! We will show you the process for planning, producing, and posting our #15secondscience videos and VR segments, and how we are evaluating effectiveness.
NASA Technical Reports Server (NTRS)
Brook, M.
1986-01-01
An optical lightning detector was constructed and flown, along with Vinton cameras and a Fairchild Line Scan Spectrometer, on a U-2 during the summer of 1979. The U-2 lightning data was obtained in daylight, and was supplemented with ground truth taken at Langmuir Laboratory. Simulations were prepared as required to establish experiment operating procedures and science training for the astronauts who would operate the Night/Day Optical Survey of Thunderstorm Lightning (NOSL) equipment during the STS-2 NOSL experiment on the Space Shuttle. Data was analyzed and papers were prepared for publication.
NASA Technical Reports Server (NTRS)
Shirazi, Yasaman; Choi, S.; Harris, C.; Gong, C.; Fisher, R. J.; Beegle, J. E.; Stube, K. C.; Martin, K. J.; Nevitt, R. G.; Globus, R. K.
2017-01-01
Animal models, particularly rodents, are the foundation of pre-clinical research to understand human diseases and evaluate new therapeutics, and play a key role in advancing biomedical discoveries both on Earth and in space. The National Research Council's Decadal Survey emphasized the importance of expanding NASA's life sciences research to perform long-duration rodent experiments on the International Space Station (ISS) to study effects of the space environment on the musculoskeletal and neurological systems of mice as model organisms of human health and disease, particularly in areas of muscle atrophy, bone loss, and fracture healing. To accomplish this objective, flight hardware, operations, and science capabilities were developed at NASA Ames Research Center (ARC) to enhance science return for both commercial (CASIS) and government-sponsored rodent research. The Rodent Research Project at NASA ARC has pioneered a new research capability on the International Space Station and has progressed toward translating research to the ISS utilizing commercial rockets, collaborating with academia and science industry, while training crewmembers to assist in performing research on orbit. The Rodent Research Habitat provides a living environment for animals on ISS according to standard animal welfare requirements, and daily health checks can be performed using the habitat's camera system. Results from these studies contribute to the science community via both the primary investigation and banked samples that are shared in publicly available data repositories such as GeneLab. Following each flight, through the Biospecimen Sharing Program (BSP), numerous tissues and thousands of samples will be harvested and distributed from the Space Life and Physical Sciences (SLPS) to Principal Investigators (PIs) through the Ames Life Science Data Archive (ALSDA). Every completed mission sets a foundation to build and design greater complexity into future research and answer questions about common human diseases. Together, the hardware improvements (enrichment, telemetry sensors, cameras), new capabilities (live animal return), and experience that the Rodent Research team has gained working with principal investigator teams and ISS crew to conduct complex experiments on orbit are expanding capabilities for long duration rodent research on the ISS to achieve both basic science and biomedical research objectives.
Autonomous Rock Tracking and Acquisition from a Mars Rover
NASA Technical Reports Server (NTRS)
Maimone, Mark W.; Nesnas, Issa A.; Das, Hari
1999-01-01
Future Mars exploration missions will perform two types of experiments: science instrument placement for close-up measurement, and sample acquisition for return to Earth. In this paper we describe algorithms we developed for these tasks, and demonstrate them in field experiments using a self-contained Mars Rover prototype, the Rocky 7 rover. Our algorithms perform visual servoing on an elevation map instead of image features, because the latter are subject to abrupt scale changes during the approach. This allows us to compensate for the poor odometry that results from motion on loose terrain. We demonstrate the successful grasp of a 5 cm long rock over 1 m away using 103-degree field-of-view stereo cameras, and placement of a flexible mast on a rock outcropping over 5 m away using 43-degree FOV stereo cameras.
ASPIRE - Airborne Spectro-Polarization InfraRed Experiment
NASA Astrophysics Data System (ADS)
DeLuca, E.; Cheimets, P.; Golub, L.; Madsen, C. A.; Marquez, V.; Bryans, P.; Judge, P. G.; Lussier, L.; McIntosh, S. W.; Tomczyk, S.
2017-12-01
Direct measurements of coronal magnetic fields are critical for taking the next step in active region and solar wind modeling and for building the next generation of physics-based space-weather models. We are proposing a new airborne instrument to make these key observations. Building on the successful Airborne InfraRed Spectrograph (AIR-Spec) experiment for the 2017 eclipse, we will design and build a spectro-polarimeter to measure coronal magnetic field during the 2019 South Pacific eclipse. The new instrument will use the AIR-Spec optical bench and the proven pointing, tracking, and stabilization optics. A new cryogenic spectro-polarimeter will be built focusing on the strongest emission lines observed during the eclipse. The AIR-Spec IR camera, slit jaw camera and data acquisition system will all be reused. The poster will outline the optical design and the science goals for ASPIRE.
MRO's High Resolution Imaging Science Experiment (HiRISE): Education and Public Outreach Plans
NASA Technical Reports Server (NTRS)
Gulick, V.; McEwen, A.; Delamere, W. A.; Eliason, E.; Grant, J.; Hansen, C.; Herkenhoff, K.; Keszthelyi, L.; Kirk, R.; Mellon, M.
2003-01-01
The High Resolution Imaging Experiment, described by McEwen et al. and Delamere et al., will fly on the Mars 2005 Orbiter. In conjunction with the NASA Mars E/PO program, the HiRISE team plans an innovative and aggressive E/PO effort to complement the unique high-resolution capabilities of the camera. The team is organizing partnerships with existing educational outreach programs and museums and plans to develop its own educational materials. In addition to other traditional E/PO activities and a strong web presence, opportunities will be provided for the public to participate in image targeting and science analysis. The main aspects of our program are summarized.
2000-12-06
KENNEDY SPACE CENTER, FLA. -- Members of the STS-107 crew take part in In-Flight Maintenance training for their mission. Looking over the OSTEO experiment and paperwork are (at left) Mission Specialists David M. Brown and Laurel Clark and Payload Specialist Ilan Ramon of Israel; Pilot William C. “Willie” McCool; and Commander Rick D. Husband. Looking on are project engineers and scientists. On the right are Mission Specialists Michael Anderson (back to camera) and Kalpana Chawla. As a research mission, STS-107 will carry the SPACEHAB Double Module in its first research flight into space and a broad collection of experiments ranging from material science to life science. It is scheduled to launch July 19, 2001.
VUV testing of science cameras at MSFC: QE measurement of the CLASP flight cameras
NASA Astrophysics Data System (ADS)
Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.
2015-08-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint MSFC, National Astronomical Observatory of Japan (NAOJ), Instituto de Astrofisica de Canarias (IAC) and Institut D'Astrophysique Spatiale (IAS) sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512 × 512 detector, dual channel analog readout and an internally mounted cold block. At the flight CCD temperature of -20 C, the CLASP cameras exceeded the low-noise performance requirements (<= 25 e- read noise and <= 10 e-/sec/pixel dark current), in addition to maintaining a stable gain of ≈ 2.0 e-/DN. The e2v CCD57-10 detectors were coated with Lumogen-E to improve quantum efficiency (QE) at the Lyman-α wavelength. A vacuum ultra-violet (VUV) monochromator and a NIST calibrated photodiode were employed to measure the QE of each camera. Three flight cameras and one engineering camera were tested in a high-vacuum chamber, which was configured to operate several tests intended to verify the QE, gain, read noise and dark current of the CCD. We present and discuss the QE measurements performed on the CLASP cameras. We also discuss the high-vacuum system outfitted for testing of UV, EUV and X-ray science cameras at MSFC.
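As a rough, illustrative companion to the gain and noise figures quoted above (a stable gain near 2.0 e-/DN, read noise of 25 e- or less), the sketch below shows the standard photon-transfer estimate of gain and read noise from a pair of flat-field frames and a pair of bias frames. This is not the CLASP test code; the synthetic frames and their noise levels are hypothetical, chosen only to land near the quoted performance.

    import numpy as np

    def photon_transfer(flat1, flat2, bias1, bias2):
        """Estimate gain (e-/DN) and read noise (e- rms) from two flats and two biases."""
        # Total signal above bias across the two flats, in DN
        signal = (flat1.mean() + flat2.mean()) - (bias1.mean() + bias2.mean())
        # Differencing frame pairs removes fixed-pattern structure; variances add
        var_flat_diff = np.var(flat1.astype(float) - flat2.astype(float))
        var_bias_diff = np.var(bias1.astype(float) - bias2.astype(float))
        gain = signal / (var_flat_diff - var_bias_diff)        # e- per DN
        read_noise = gain * np.sqrt(var_bias_diff / 2.0)       # e- rms per read
        return gain, read_noise

    # Hypothetical 512 x 512 frames standing in for real CCD57-10 data
    rng = np.random.default_rng(0)
    bias1, bias2 = (rng.normal(500.0, 12.0, (512, 512)) for _ in range(2))
    flat1, flat2 = (rng.normal(10500.0, 75.0, (512, 512)) for _ in range(2))
    print(photon_transfer(flat1, flat2, bias1, bias2))   # roughly (1.8 e-/DN, 22 e-)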
NIRCam/NGST Education and Public Outreach: ``Linking Girls with the Sky"
NASA Astrophysics Data System (ADS)
McCarthy, D. W., Jr.; Lebofsky, L. A.; Slater, T. F.; Rieke, M. J.; Pompea, S. M.
2002-09-01
Astronomical images can inspire a new generation. The clarity of the Next Generation Space Telescope (NGST), combined with the near-infrared camera's (NIRCam) ability to see farther back in time and through murky regions of space, may unveil the ``First Light" from a newborn Universe and the origins of planetary systems. The NIRCam science team, led by Dr. Marcia Rieke, unites scientists from across the U.S., Canada, and Lockheed Martin's Advanced Technology Center with prominent science educators. The E/PO program especially targets K-14 girls in a partnership with the Girl Scouts of the USA, to address such specific needs as (1) the review of existing badge programs for younger girls, (2) new, community-based activities and research experiences for older girls, (3) interaction experiences in person and on-line with inspiring mentors and role-models, and (4) leadership and training experiences for adult trainers. New activities will be inquiry-based and appropriate in both formal and informal settings. They will also be used for training future teachers of science. Topics such as ``Light pollution" can be related thematically to such NGST concepts as a ``low thermal background". The Astronomy Camp facilities on historic Mt. Lemmon will be used to ``train the trainers" by providing Girl Scouts and their adult leaders hands-on experiences with 8- to 60-inch telescopes, CCD and infrared cameras, and image processing techniques. NIRCam scientists will also be involved in developing authentic research-based projects using NIRCam datasets for in-class use by middle and high school teachers. The NIRCam E/PO program is funded by NASA under prime contract, NAS502105, with Goddard Space Flight Center to The University of Arizona.
VUV Testing of Science Cameras at MSFC: QE Measurement of the CLASP Flight Cameras
NASA Technical Reports Server (NTRS)
Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtain, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike
2015-01-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512x512 detector, dual channel analog readout electronics and an internally mounted cold block. At the flight operating temperature of -20 C, the CLASP cameras achieved the low-noise performance requirements (less than or equal to 25 e- read noise and less than or equal to 10 e-/sec/pix dark current), in addition to maintaining a stable gain of approximately equal to 2.0 e-/DN. The e2v CCD57-10 detectors were coated with Lumogen-E to improve quantum efficiency (QE) at the Lyman-alpha wavelength. A vacuum ultra-violet (VUV) monochromator and a NIST calibrated photodiode were employed to measure the QE of each camera. Four flight-like cameras were tested in a high-vacuum chamber, which was configured to operate several tests intended to verify the QE, gain, read noise, dark current and residual non-linearity of the CCD. We present and discuss the QE measurements performed on the CLASP cameras. We also discuss the high-vacuum system outfitted for testing of UV and EUV science cameras at MSFC.
Deployable Camera (DCAM3) System for Observation of Hayabusa2 Impact Experiment
NASA Astrophysics Data System (ADS)
Sawada, Hirotaka; Ogawa, Kazunori; Shirai, Kei; Kimura, Shinichi; Hiromori, Yuichi; Mimasu, Yuya
2017-07-01
An asteroid exploration probe, "Hayabusa2", developed by the Japan Aerospace Exploration Agency (JAXA), was launched on December 3rd, 2014 to carry out complicated and precise operations during the mission phase around the C-type asteroid 162173 Ryugu (1999 JU3) (Tsuda et al. in Acta Astron. 91:356-362, 2013). An impact experiment on the surface of the asteroid will be conducted using the Small Carry-on Impactor (SCI) system, which will be the world's first artificial crater creation experiment on an asteroid (Saiki et al. in Proc. International Astronautical Congress, IAC-12.A3.4.8, 2012, Acta Astron. 84:227-236, 2013a; Proc. International Symposium on Space Technology and Science, 2013b). We developed a new micro Deployable CAMera (DCAM3) system for remote observation of the impact phenomenon, applying our conventional DCAM technology, which produced one of the smallest probes flown on space missions and achieved great success in the past Japanese IKAROS (Interplanetary Kite-craft Accelerated by Radiation Of the Sun) mission. DCAM3 is a miniaturized separable unit that contains two cameras and radio communication devices for transmitting image data to the mothership "Hayabusa2", and it observes the impact experiment from an unsafe region where "Hayabusa2" cannot remain because of the risk of being hit by debris from the explosion and impact. In this paper, we report details of the DCAM3 system and development results as well as our mission plan for the DCAM3 observation during the SCI experiment.
Curiosity: How to Boldly Go...
NASA Technical Reports Server (NTRS)
Pyrzak, Guy
2013-01-01
Operating a one-ton rover on the surface of Mars requires more than just a joystick and an experiment. With 10 science instruments, 17 cameras, a radioisotope thermoelectric generator and lasers, Curiosity is the largest and most complex rover NASA has sent to Mars. Combined with a one-way light time of 4 to 20 minutes and a distributed international science and engineering team, it takes a lot of work to operate this mega-rover. The Mars Science Laboratory's operations team has developed an organization and process that maximizes science return and safety of the spacecraft. These are the voyages of the rover Curiosity, its two-year mission: to determine the habitability of Gale Crater, to understand the role of water, to study the climate and geology of Mars.
Seeing Earth Through the Eyes of an Astronaut
NASA Technical Reports Server (NTRS)
Dawson, Melissa
2014-01-01
The Human Exploration Science Office within the ARES Directorate has undertaken a new class of handheld camera photographic observations of the Earth as seen from the International Space Station (ISS). For years, astronauts have attempted to describe their experience in space and how they see the Earth roll by below their spacecraft. Thousands of crew photographs have documented natural features as diverse as the dramatic clay colors of the African coastline, the deep blues of the Earth's oceans, or the swirling Aurora Australis over Australia in the upper atmosphere. Dramatic recent improvements in handheld digital single-lens reflex (DSLR) camera capabilities are now allowing a new field of crew photography: night time-lapse imagery.
Experiential learning in soil science: Use of an augmented reality sandbox
NASA Astrophysics Data System (ADS)
Vaughan, Karen; Vaughan, Robert; Seeley, Janel; Brevik, Eric
2017-04-01
It is widely known that greater learning occurs when students are active participants. Novel technologies allow instructors the opportunity to create interactive activities for undergraduate students to gain comprehension of complex landscape processes. We incorporated the use of an Augmented Reality (AR) Sandbox in the Introductory Soil Science course at the University of Wyoming to facilitate an experiential learning experience in pedology. The AR Sandbox was developed by researchers at the University of California, Davis as part of a project on informal science education in freshwater lakes and watershed science. It is a hands-on display that allows users to create topography models by shaping sand that is augmented in real time by a colored elevation map, topographic contour lines, and simulated water. It uses a 3-dimensional motion sensing camera that detects changes to the distance between the sand surface and the camera sensor. A short-throw projector then displays the elevation model and contour lines in real time. Undergraduate students enrolled in the Introductory Soil Science course were tasked with creating a virtual landscape and then predicting where particular soils would form on the various landforms. All participants reported a greater comprehension of surface water flow, erosion, and soil formation as a result of this exercise. They provided suggestions for future activities using the AR Sandbox including its incorporation into lessons of watershed hydrology, land management, soil water, and soil genesis.
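The depth-to-projection step described above is simple enough to caricature in a few lines. The following sketch (not the UC Davis AR Sandbox software; the sensor geometry and the synthetic depth frame are assumptions) converts a depth image in millimeters into a colored elevation map with contour lines, which is essentially what the projector displays back onto the sand.

    import numpy as np
    import matplotlib.pyplot as plt

    def render_sandbox(depth_mm, camera_height_mm=1000.0, n_levels=15):
        """Turn a sensor depth image (distance to the sand, mm) into a colored
        elevation map with contour lines, mimicking the AR Sandbox display."""
        elevation = camera_height_mm - depth_mm        # higher sand = shorter distance
        fig, ax = plt.subplots()
        ax.imshow(elevation, cmap="terrain")           # hypsometric-style coloring
        ax.contour(elevation, levels=n_levels, colors="k", linewidths=0.5)
        ax.set_title("Simulated AR Sandbox projection")
        return fig

    # Hypothetical depth frame: a smooth mound of sand in the middle of the box
    y, x = np.mgrid[0:240, 0:320]
    depth = 1000.0 - 120.0 * np.exp(-(((x - 160) / 60.0) ** 2 + ((y - 120) / 45.0) ** 2))
    render_sandbox(depth)
    plt.show()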
Keszthelyi, L.; Jaeger, W.; McEwen, A.; Tornabene, L.; Beyer, R.A.; Dundas, C.; Milazzo, M.
2008-01-01
In the first 6 months of the Mars Reconnaissance Orbiter's Primary Science Phase, the High Resolution Imaging Science Experiment (HiRISE) camera has returned images sampling the diversity of volcanic terrains on Mars. While many of these features were noted in earlier imaging, they are now seen with unprecedented clarity. We find that some volcanic vents produced predominantly effusive products while others generated mostly pyroclastics. Flood lavas were emplaced in both turbulent and gentle eruptions, producing roofed channels and inflation features. However, many areas on Mars are too heavily mantled to allow meter-scale volcanic features to be discerned. In particular, the major volcanic edifices are extensively mantled, though it is possible that some of the mantle is pyroclastic material rather than atmospheric dust. Support imaging by the Context Imager (CTX) and topographic information derived from stereo imaging are both invaluable in interpreting the HiRISE data. Copyright 2008 by the American Geophysical Union.
Nomad rover field experiment, Atacama Desert, Chile 1. Science results overview
NASA Astrophysics Data System (ADS)
Cabrol, N. A.; Thomas, G.; Witzke, B.
2001-04-01
Nomad was deployed for a 45 day traverse in the Atacama Desert, Chile, during the summer of 1997. During this traverse, 1 week was devoted to science experiments. The goal of the science experiments was to test different planetary surface exploration strategies that included (1) a Mars mission simulation, (2) a science-on-the-fly experiment, where the rover was kept moving 75% of the operation time (the goal of this operation was to determine whether or not successful interpretation of the environment is related to the time spent on a target; the role of mobility in helping the interpretation was also assessed), (3) a meteorite search using visual and instrumental methods to remotely identify meteorites in extreme environments, and (4) a time-delay experiment with and without using the panospheric camera. The results were as follows: the remote science team positively identified the main characteristics of the test site geological environment. The science-on-the-fly experiment showed that the selection of appropriate targets might be even more critical than the time spent on a study area to reconstruct the history of a site. During the same operation the science team members identified and sampled a rock from a Jurassic outcrop that they proposed to be a fossil. The presence of paleolife indicators in this rock was confirmed later by laboratory analysis. Both visual and instrumental modes demonstrated the feasibility, in at least some conditions, of carrying out a field search for meteorites by using remote-controlled vehicles. Finally, metrics collected from the observation of the science team operations, and the use team members made of mission data, provided critical information on what operation sequences could be automated on board rovers in future planetary surface explorations.
Capabilities, performance, and status of the SOFIA science instrument suite
NASA Astrophysics Data System (ADS)
Miles, John W.; Helton, L. Andrew; Sankrit, Ravi; Andersson, B. G.; Becklin, E. E.; De Buizer, James M.; Dowell, C. D.; Dunham, Edward W.; Güsten, Rolf; Harper, Doyal A.; Herter, Terry L.; Keller, Luke D.; Klein, Randolf; Krabbe, Alfred; Marcum, Pamela M.; McLean, Ian S.; Reach, William T.; Richter, Matthew J.; Roellig, Thomas L.; Sandell, Göran; Savage, Maureen L.; Smith, Erin C.; Temi, Pasquale; Vacca, William D.; Vaillancourt, John E.; Van Cleve, Jeffery E.; Young, Erick T.; Zell, Peter T.
2013-09-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne observatory, carrying a 2.5 m telescope onboard a heavily modified Boeing 747SP aircraft. SOFIA is optimized for operation at infrared wavelengths, much of which is obscured for ground-based observatories by atmospheric water vapor. The SOFIA science instrument complement consists of seven instruments: FORCAST (Faint Object InfraRed CAmera for the SOFIA Telescope), GREAT (German Receiver for Astronomy at Terahertz Frequencies), HIPO (High-speed Imaging Photometer for Occultations), FLITECAM (First Light Infrared Test Experiment CAMera), FIFI-LS (Far-Infrared Field-Imaging Line Spectrometer), EXES (Echelon-Cross-Echelle Spectrograph), and HAWC (High-resolution Airborne Wideband Camera). FORCAST is a 5-40 μm imager with grism spectroscopy, developed at Cornell University. GREAT is a heterodyne spectrometer providing high-resolution spectroscopy in several bands from 60-240 μm, developed at the Max Planck Institute for Radio Astronomy. HIPO is a 0.3-1.1 μm imager, developed at Lowell Observatory. FLITECAM is a 1-5 μm wide-field imager with grism spectroscopy, developed at UCLA. FIFI-LS is a 42-210 μm integral field imaging grating spectrometer, developed at the University of Stuttgart. EXES is a 5-28 μm high-resolution spectrograph, developed at UC Davis and NASA ARC. HAWC is a 50-240 μm imager, developed at the University of Chicago, and undergoing an upgrade at JPL to add polarimetry capability and substantially larger GSFC detectors. We describe the capabilities, performance, and status of each instrument, highlighting science results obtained using FORCAST, GREAT, and HIPO during SOFIA Early Science observations conducted in 2011.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here, students from Sycamore High School in Cincinnati, Ohio, help a NASA technician prepare their experiment. This image is from a digital still camera; higher resolution is not available.
NASA Technical Reports Server (NTRS)
2007-01-01
Artist's concept of the New Horizons spacecraft as it approaches Pluto and its largest moon, Charon, in July 2015. The craft's miniature cameras, radio science experiment, ultraviolet and infrared spectrometers and space plasma experiments will characterize the global geology and geomorphology of Pluto and Charon, map their surface compositions and temperatures, and examine Pluto's atmosphere in detail. The spacecraft's most prominent design feature is a nearly 7-foot (2.1-meter) dish antenna, through which it will communicate with Earth from as far as 4.7 billion miles (7.5 billion kilometers) away.
At the Crossroads of Art and Science: A New Course for University Non-Science Majors
NASA Astrophysics Data System (ADS)
Blatt, S. Leslie
2004-03-01
How much did Seurat know about the physics, physiology, and perceptual science of color mixing when he began his experiments in pointillism? Did Vermeer have a camera obscura built into his studio to create the perfect perspective and luminous effects of his canvases? Early in the 20th century, consequences of the idea that "no single reference point is to be preferred above any other" were worked out in physics by Einstein (special and general relativity), in art by Picasso (early cubism), and in music by Schoenberg (12-tone compositions); did this same paradigm-shifting concept arise, in three disparate fields, merely by coincidence? We are developing a new course, aimed primarily at non-science majors, that addresses questions like these through a combination of hands-on experiments on the physics of light, investigations in visual perception, empirical tests of various drawing and painting techniques, and field trips to nearby museums. We will show a few examples of the kinds of art/science intersections our students will be exploring, and present a working outline for the course.
Mars Science Laboratory Engineering Cameras
NASA Technical Reports Server (NTRS)
Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.
2012-01-01
NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.
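A quick consistency check on the numbers above: for a square field of view spread across a 1024-pixel detector, the mean pixel scale is simply the FOV (in radians) divided by the pixel count. The few lines below are an illustrative calculation, not flight software; they show that the quoted 0.82 and 2.1 mrad/pixel figures are close to that ideal value, with the remaining difference attributable to optical distortion, especially in the wide-angle Hazcams.

    import math

    def mean_pixel_scale_mrad(fov_deg, n_pixels=1024):
        """Average angular pixel scale for a square FOV, ignoring lens distortion."""
        return math.radians(fov_deg) / n_pixels * 1000.0    # mrad per pixel

    for name, fov_deg, quoted in [("Navcam", 45.0, 0.82), ("Hazcam", 124.0, 2.1)]:
        print(f"{name}: {mean_pixel_scale_mrad(fov_deg):.2f} mrad/px (quoted {quoted})")
    # Navcam: 0.77 mrad/px (quoted 0.82); Hazcam: 2.11 mrad/px (quoted 2.1)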
Communities, Cameras, and Conservation
ERIC Educational Resources Information Center
Patterson, Barbara
2012-01-01
Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…
The ISS Fluids Integrated Rack (FIR): a Summary of Capabilities
NASA Astrophysics Data System (ADS)
Gati, F.; Hill, M. E.
2002-01-01
The Fluids Integrated Rack (FIR) is a modular, multi-user scientific research facility that will fly in the U.S. laboratory module, Destiny, of the International Space Station (ISS). The FIR will be one of the two racks that will make up the Fluids and Combustion Facility (FCF) - the other being the Combustion Integrated Rack (CIR). The ISS will provide the FCF with the necessary resources, such as power and cooling. While the ISS crew will be available for experiment operations, their time will be limited. The FCF is, therefore, being designed for autonomous operations and remote control operations. Control of the FCF will be primarily through the Telescience Support Center (TSC) at the Glenn Research Center. The FCF is being designed to accommodate a wide range of combustion and fluids physics experiments within the ISS resources and constraints. The primary mission of the FIR, however, is to accommodate experiments from four major fluids physics disciplines: Complex Fluids; Multiphase Flow and Heat Transfer; Interfacial Phenomena; and Dynamics and Stability. The design of the FIR is flexible enough to accommodate experiments from other science disciplines such as Biotechnology. The FIR flexibility is a result of the large volume dedicated for experimental hardware, easily re-configurable diagnostics that allow for unique experiment configurations, and its customizable software. The FIR will utilize six major subsystems to accommodate this broad scope of fluids physics experiments. The major subsystems are: structural, environmental, electrical, gaseous, command and data management, and imagers and illumination. Within the rack, the FIR's structural subsystem provides an optics bench type mechanical interface for the precise mounting of experimental hardware, including optical components. The back of the bench is populated with FIR avionics packages and light sources. The interior of the rack is isolated from the cabin through two rack doors that are hinged near the top and bottom of the rack. Transmission of micro-gravity disturbances to and from the rack is minimized through the Active Rack Isolation System (ARIS). The environmental subsystem will utilize air and water to remove heat generated by facility and experimental hardware. The air will be circulated throughout the rack and will be cooled by an air-water heat exchanger. Water will be used directly to cool some of the FIR components and will also be available to cool experiment hardware as required. The electrical subsystem includes the Electrical Power Control Unit (EPCU), which provides 28 VDC and 120 VDC power to the facility and the experiment hardware. The EPCU will also provide power management and control functions, as well as fault protection capabilities. The FIR will provide access to the ISS gaseous nitrogen and vacuum systems. These systems are available to support experiment operations such as the purging of experimental cells, creating flows within experimental cells and providing dry conditions where needed. The FIR Command and Data Management subsystem (CDMS) provides command and data handling for both facility and experiment hardware. The Input Output Processor (IOP) provides the overall command and data management functions for the rack including downlinking or writing data to removable drives. The IOP will also monitor the health and status of the rack subsystems. The Image Processing and Storage Units (IPSU) will perform diagnostic control and image data acquisition functions. An IPSU will be able to control a digital camera, receive image data from that camera and process/compress image data as necessary. The Fluids Science and Avionics Package (FSAP) will provide the primary control over an experiment. The FSAP contains various computer boards/cards that will perform data and control functions. To support the imaging needs, cameras and illumination sources will be available to the investigator. Both color analog and black and white digital cameras with lenses are expected. These cameras will be capable of high resolution and, separately, frame rates up to 32,000 frames per second. Lenses for these cameras will provide both microscopic and macroscopic views. The FIR will provide two illumination sources, a 532 nm Nd:YAG laser and a white light source, both with adjustable power output. The FIR systems are being designed to maximize the amount of science that can be done on-orbit. Experiments will be designed and efficiently operated. Each individual experiment must determine the best configuration of utilizing facility capabilities and resources with augmentation of specific experiment hardware. Efficient operations will be accomplished via a combination of on-orbit physical component change-outs or processing by the crew, and software updates via ground commanding or by the crew. Careful coordination by ground and on-orbit personnel regarding the on-orbit storage and downlinking of image data will also be very important.
Evidence for Basinwide Mud Volcanism in Acidalia Planitia, Mars
NASA Technical Reports Server (NTRS)
Oehler, Dorothy Z.; Allen, Carlton C.
2010-01-01
High-albedo mounds in Acidalia Planitia occur in enormous numbers. They have been variously interpreted as pseudocraters, cinder cones, tuff cones, pingos, ice disintegration features, or mud volcanoes. Our work uses regional mapping, basin analysis, and new data from the Context Camera (CTX), High Resolution Imaging Science Experiment (HiRISE), and Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) to re-assess the origin and significance of these structures.
Windy Mars: A Dynamic Planet as Seen by the HiRISE Camera
NASA Technical Reports Server (NTRS)
Bridges, N. T.; Geissler, P. E.; McEwen, A. S.; Thomson, B. J.; Chuang, F. C.; Herkenhoff, K. E.; Keszthelyi, L. P.; Martnez-Alonso, S.
2007-01-01
With a dynamic atmosphere and a large supply of particulate material, the surface of Mars is heavily influenced by wind-driven, or aeolian, processes. The High Resolution Imaging Science Experiment (HiRISE) camera on the Mars Reconnaissance Orbiter (MRO) provides a new view of Martian geology, with the ability to see decimeter-size features. Current sand movement, and evidence for recent bedform development, is observed. Dunes and ripples generally exhibit complex surfaces down to the limits of resolution. Yardangs have diverse textures, with some being massive at HiRISE scale, others having horizontal and cross-cutting layers of variable character, and some exhibiting blocky and polygonal morphologies. 'Reticulate' (fine polygonal texture) bedforms are ubiquitous in the thick mantle at the highest elevations.
An Educational PET Camera Model
ERIC Educational Resources Information Center
Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.
2006-01-01
Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.
Color Camera for Curiosity Robotic Arm
2010-11-16
The Mars Hand Lens Imager (MAHLI) camera will fly on NASA's Mars Science Laboratory mission, launching in late 2011. This photo of the camera was taken before MAHLI's November 2010 installation onto the robotic arm of the mission's Mars rover, Curiosity.
Photon collider: a four-channel autoguider solution
NASA Astrophysics Data System (ADS)
Hygelund, John C.; Haynes, Rachel; Burleson, Ben; Fulton, Benjamin J.
2010-07-01
The "Photon Collider" uses a compact array of four off axis autoguider cameras positioned with independent filtering and focus. The photon collider is two way symmetric and robustly mounted with the off axis light crossing the science field which allows the compact single frame construction to have extremely small relative deflections between guide and science CCDs. The photon collider provides four independent guiding signals with a total of 15 square arc minutes of sky coverage. These signals allow for simultaneous altitude, azimuth, field rotation and focus guiding. Guide cameras read out without exposure overhead increasing the tracking cadence. The independent focus allows the photon collider to maintain in focus guide stars when the main science camera is taking defocused exposures as well as track for telescope focus changes. Independent filters allow auto guiding in the science camera wavelength bandpass. The four cameras are controlled with a custom web services interface from a single Linux based industrial PC, and the autoguider mechanism and telemetry is built around a uCLinux based Analog Devices BlackFin embedded microprocessor. Off axis light is corrected with a custom meniscus correcting lens. Guide CCDs are cooled with ethylene glycol with an advanced leak detection system. The photon collider was built for use on Las Cumbres Observatory's 2 meter Faulks telescopes and currently used to guide the alt-az mount.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Students from Sycamore High School in Cincinnati, Ohio (girls), and the COSI Academy, Columbus, Ohio (boys), participated. This image is from a digital still camera; higher resolution is not available.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Meredith Mendenhall of Sycamore High School, Cincinnati, Ohio, flips on a tape recorder in preparation for a drop. This image is from a digital still camera; higher resolution is not available.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Sandi Thompson of the National Center for Microgravity Research GRC makes a final adjustment to the drop package. This image is from a digital still camera; higher resolution is not available.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here, students are briefed by NASA engineer Daniel Dietrich at the top of the drop tower. This image is from a digital still camera; higher resolution is not available.
1992-06-01
The first United States Microgravity Laboratory (USML-1) provided scientific research in materials science, fluid dynamics, biotechnology, and combustion science in a weightless environment inside the Spacelab module. This photograph is a close-up view of the Glovebox in operation during the mission. The Spacelab Glovebox, provided by the European Space Agency, offers experimenters new capabilities to test and develop science procedures and technologies in microgravity. It enables crewmembers to handle, transfer, and otherwise manipulate materials in ways that are impractical in the open Spacelab. The facility is equipped with three doors: a central port through which experiments are placed in the Glovebox and two glovedoors on both sides with an attachment for gloves or adjustable cuffs and adapters for cameras. The Glovebox has an enclosed compartment that offers a clean working space and minimizes the contamination risks to both Spacelab and experiment samples. Although fluid containment and ease of cleanup are major benefits provided by the facility, it can also contain powders and bioparticles; toxic, irritating, or potentially infectious materials; and other debris produced during experiment operations. The facility is equipped with photographic/video capabilities and permits mounting a microscope. For the USML-1 mission, the Glovebox experiments fell into four basic categories: fluid dynamics, combustion science, crystal growth, and technology demonstration. The USML-1 flew aboard the STS-50 mission in June 1992.
Red ball ranging optimization based on dual camera ranging method
NASA Astrophysics Data System (ADS)
Kuang, Lei; Sun, Weijia; Liu, Jiaming; Tang, Matthew Wai-Chung
2018-05-01
In this paper, the process of positioning and moving to a target red ball by the NAO robot through its camera system is analyzed and improved using a dual camera ranging method. The single camera ranging method, which is adopted by the NAO robot, was first studied and experimented with. Since the existing error of the current NAO robot is not a single variable, the experiments were divided into two parts to obtain more accurate single camera ranging data: forward ranging and backward ranging. Moreover, two USB cameras were used in our experiments, which adopted the Hough circle method to identify the ball, while the HSV color space model was used to identify the red color. Our results showed that the dual camera ranging method reduced the variance of error in ball tracking from 0.68 to 0.20.
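The abstract describes the detection and ranging chain (HSV thresholding for red, Hough circle detection, then ranging from two cameras) without giving code; a rough OpenCV sketch of that chain follows. The hue thresholds, focal length, baseline, and file names are placeholders, and the final step is a plain pinhole-stereo range rather than the authors' exact method.

    import cv2
    import numpy as np

    def find_red_ball(bgr):
        """Return (x, y, radius) of the most prominent red circle, or None."""
        hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
        # Red wraps around hue 0 in HSV, so combine two hue ranges
        mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
               cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
        mask = cv2.medianBlur(mask, 5)
        circles = cv2.HoughCircles(mask, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                                   param1=100, param2=18, minRadius=5, maxRadius=120)
        return None if circles is None else circles[0][0]       # (x, y, r)

    def stereo_range(x_left, x_right, focal_px=600.0, baseline_m=0.12):
        """Pinhole stereo range from the ball's horizontal image coordinates."""
        disparity = x_left - x_right                             # pixels
        return focal_px * baseline_m / disparity                 # metres

    left, right = cv2.imread("left.jpg"), cv2.imread("right.jpg")
    ball_l, ball_r = find_red_ball(left), find_red_ball(right)
    if ball_l is not None and ball_r is not None:
        print(f"estimated range: {stereo_range(ball_l[0], ball_r[0]):.2f} m")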
Space telescope phase B definition study. Volume 2A: Science instruments, f48/96 planetary camera
NASA Technical Reports Server (NTRS)
Grosso, R. P.; Mccarthy, D. J.
1976-01-01
The analysis and preliminary design of the f48/96 planetary camera for the space telescope are discussed. The camera design is for application to the axial module position of the optical telescope assembly.
Dropping In a Microgravity Environment (DIME) contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here, students from Sycamore High School in Cincinnati, Ohio, help a NASA technician prepare their experiment. This image is from a digital still camera; higher resolution is not available.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. This is the interior of the Sycamore High School (Cincinnati, Ohio) students' experiment to observe the flame spreading on a 100 percent cotton T-shirt under low-g. This image is from a digital still camera; higher resolution is not available.
NASA Astrophysics Data System (ADS)
Close, Laird M.; Males, Jared R.; Kopon, Derek A.; Gasho, Victor; Follette, Katherine B.; Hinz, Phil; Morzinski, Katie; Uomoto, Alan; Hare, Tyson; Riccardi, Armando; Esposito, Simone; Puglisi, Alfio; Pinna, Enrico; Busoni, Lorenzo; Arcidiacono, Carmelo; Xompero, Marco; Briguglio, Runa; Quiros-Pacheco, Fernando; Argomedo, Javier
2012-07-01
The heart of the 6.5 Magellan AO system (MagAO) is a 585 actuator adaptive secondary mirror (ASM) with <1 msec response times (0.7 ms typically). This adaptive secondary will allow low emissivity and high-contrast AO science. We fabricated a high order (561 mode) pyramid wavefront sensor (similar to that now successfully used at the Large Binocular Telescope). The relatively high actuator count (and small projected ~23 cm pitch) allows moderate Strehls to be obtained by MagAO in the “visible” (0.63-1.05 μm). To take advantage of this we have fabricated an AO CCD science camera called "VisAO". Complete “end-to-end” closed-loop lab tests of MagAO achieve a solid, broad-band, 37% Strehl (122 nm rms) at 0.76 μm (i’) with the VisAO camera in 0.8” simulated seeing (13 cm r0 at V) with fast 33 mph winds and a 40 m L0, locked on an R=8 mag artificial star. These relatively high visible wavelength Strehls are enabled by our powerful combination of a next generation ASM and a Pyramid WFS with 400 controlled modes and 1000 Hz sample speeds (similar to that used successfully on-sky at the LBT). Currently only the VisAO science camera is used for lab testing of MagAO, but this high level of measured performance (122 nm rms) promises even higher Strehls with our IR science cameras. On bright (R=8 mag) stars we should achieve very high Strehls (>70% at H) in the IR with the existing MagAO Clio2 (λ=1-5.3 μm) science camera/coronagraph, or even higher (~98% Strehl) in the mid-IR (8-26 microns) with the existing BLINC/MIRAC4 science camera in the future. To eliminate non-common path vibrations, dispersions, and optical errors the VisAO science camera is fed by a common path advanced triplet ADC and is piggy-backed on the Pyramid WFS optical board itself. Also a high-speed shutter can be used to block periods of poor correction. The entire system passed CDR in June 2009, and we finished the closed-loop system level testing phase in December 2011. Final system acceptance (“pre-ship” review) was passed in February 2012. In May 2012 the entire AO system was successfully shipped to Chile and fully tested/aligned. It is now in storage in the Magellan telescope clean room in anticipation of “First Light” scheduled for December 2012. An overview of the design, attributes, performance, and schedule for the Magellan AO system and its two science cameras is briefly presented here.
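The quoted lab result, 37% Strehl at 0.76 μm with 122 nm rms residual wavefront error, is consistent with the extended Marechal approximation S ≈ exp[-(2πσ/λ)²]. The check below simply evaluates that formula (it is not part of the MagAO software), and the H-band number is a rough extrapolation that ignores error terms that only appear on sky.

    import math

    def marechal_strehl(rms_wfe_nm, wavelength_nm):
        """Extended Marechal approximation for the Strehl ratio."""
        return math.exp(-(2.0 * math.pi * rms_wfe_nm / wavelength_nm) ** 2)

    print(f"{marechal_strehl(122.0, 760.0):.2f}")    # ~0.36, matching the ~37% quoted at i'
    print(f"{marechal_strehl(122.0, 1650.0):.2f}")   # ~0.81 at H band, consistent with >70%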
Astronaut Joseph Kerwin takes blood sample from Astronaut Charles Conrad
NASA Technical Reports Server (NTRS)
1973-01-01
Scientist-Astronaut Joseph P. Kerwin (right), Skylab 2 science pilot and a doctor of medicine, takes a blood sample from Astronaut Charles Conrad Jr., Skylab 2 commander, as seen in this reproduction taken from a color television transmission made by a TV camera aboard the Skylab 1 and 2 space station cluster in Earth orbit. The blood sampling was part of the Skylab Hematology and Immunology Experiment M110 series.
CIRCE: The Canarias InfraRed Camera Experiment for the Gran Telescopio Canarias
NASA Astrophysics Data System (ADS)
Eikenberry, Stephen S.; Charcos, Miguel; Edwards, Michelle L.; Garner, Alan; Lasso-Cabrera, Nestor; Stelter, Richard D.; Marin-Franch, Antonio; Raines, S. Nicholas; Ackley, Kendall; Bennett, John G.; Cenarro, Javier A.; Chinn, Brian; Donoso, H. Veronica; Frommeyer, Raymond; Hanna, Kevin; Herlevich, Michael D.; Julian, Jeff; Miller, Paola; Mullin, Scott; Murphey, Charles H.; Packham, Chris; Varosi, Frank; Vega, Claudia; Warner, Craig; Ramaprakash, A. N.; Burse, Mahesh; Punnadi, Sunjit; Chordia, Pravin; Gerarts, Andreas; Martín, Héctor De Paz; Calero, María Martín; Scarpa, Riccardo; Acosta, Sergio Fernandez; Sánchez, William Miguel Hernández; Siegel, Benjamin; Pérez, Francisco Francisco; Martín, Himar D. Viera; Losada, José A. Rodríguez; Nuñez, Agustín; Tejero, Álvaro; González, Carlos E. Martín; Rodríguez, César Cabrera; Sendra, Jordi Molgó; Rodriguez, J. Esteban; Cáceres, J. Israel Fernádez; García, Luis A. Rodríguez; Lopez, Manuel Huertas; Dominguez, Raul; Gaggstatter, Tim; Lavers, Antonio Cabrera; Geier, Stefan; Pessev, Peter; Sarajedini, Ata; Castro-Tirado, A. J.
The Canarias InfraRed Camera Experiment (CIRCE) is a near-infrared (1-2.5 μm) imager, polarimeter and low-resolution spectrograph operating as a visitor instrument for the Gran Telescopio Canarias (GTC) 10.4-m telescope. It was designed and built largely by graduate students and postdocs, with help from the University of Florida (UF) astronomy engineering group, and is funded by the UF and the US National Science Foundation. CIRCE is intended to help fill the gap in near-infrared capabilities prior to the arrival of the Espectrografo Multiobjeto Infra-Rojo (EMIR) at the GTC and will also provide the following scientific capabilities to complement EMIR after its arrival: high-resolution imaging, narrowband imaging, high-time-resolution photometry, imaging polarimetry, and low resolution spectroscopy. In this paper, we review the design, fabrication, integration, lab testing, and on-sky performance results for CIRCE. These include a novel approach to the opto-mechanical design, fabrication, and alignment.
NASA Technical Reports Server (NTRS)
1997-01-01
The Sojourner rover's front right camera imaged Pop-tart, a small rock or indurated soil material which was pushed out of the surrounding drift material by Sojourner's front left wheel during a soil mechanics experiment.
Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech). The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.
Particle and heat flux estimates in Proto-MPEX in Helicon Mode with IR imaging
NASA Astrophysics Data System (ADS)
Showers, M. A.; Biewer, T. M.; Caughman, J. B. O.; Donovan, D. C.; Goulding, R. H.; Rapp, J.
2016-10-01
The Prototype Material Plasma Exposure eXperiment (Proto-MPEX) at Oak Ridge National Laboratory (ORNL) is a linear plasma device developing the plasma source concept for the Material Plasma Exposure eXperiment (MPEX), which will address plasma material interaction (PMI) science for future fusion reactors. To better understand how and where energy is being lost from the Proto-MPEX plasma during ``helicon mode'' operations, particle and heat fluxes are quantified at multiple locations along the machine length. Relevant diagnostics include infrared (IR) cameras, four double Langmuir probes (LPs), and in-vessel thermocouples (TCs). The IR cameras provide temperature measurements of Proto-MPEX's plasma-facing dump and target plates, located on either end of the machine. The change in surface temperature is measured over the duration of the plasma shot to determine the heat flux hitting the plates. The IR cameras additionally provide 2-D thermal load distribution images of these plates, highlighting Proto-MPEX plasma behaviors, such as hot spots. The LPs and TCs provide additional plasma measurements required to determine particle and heat fluxes. Quantifying axial variations in fluxes will help identify machine operating parameters that will improve Proto-MPEX's performance, increasing its PMI research capabilities. This work was supported by the U.S. D.O.E. contract DE-AC05-00OR22725.
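One common way to convert an IR-camera surface-temperature rise into a heat flux (not necessarily the analysis used on Proto-MPEX) is the one-dimensional semi-infinite-solid solution for a constant flux applied over the shot length, ΔT = 2q·sqrt(t / (π·k·ρ·c_p)). The sketch below inverts that relation; the temperature rise, shot length, and stainless-steel material properties are placeholder values.

    import math

    def heat_flux_from_dT(delta_T_K, shot_s, k_W_mK, rho_kg_m3, cp_J_kgK):
        """Invert dT = 2*q*sqrt(t / (pi*k*rho*cp)) to estimate q in W/m^2
        (semi-infinite solid, constant surface flux)."""
        return delta_T_K / (2.0 * math.sqrt(shot_s / (math.pi * k_W_mK * rho_kg_m3 * cp_J_kgK)))

    # Placeholder numbers: an 80 K rise on a stainless-steel target over a 0.5 s shot
    q = heat_flux_from_dT(delta_T_K=80.0, shot_s=0.5,
                          k_W_mK=16.0, rho_kg_m3=7900.0, cp_J_kgK=500.0)
    print(f"~{q / 1e6:.1f} MW/m^2")    # ~0.8 MW/m^2 with these assumed values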
STS-99 MS Kavandi works on OV-105's flight deck
2000-04-05
STS099-329-019 (11-22 February 2000) --- Astronaut Janet L. Kavandi, mission specialist, appears joyous over the success of the Shuttle Radar Topography Mission (SRTM) and other experiments on the flight deck of the Space Shuttle Endeavour. The Red Team member is standing beneath an electronic still camera (ESC) mounted in Endeavour's overhead windows. The camera stayed busy throughout the 11-day mission taking vertical imagery of Earth points of opportunity for the EarthKAM project. Students across the United States and in France, Germany and Japan took photos throughout the STS-99 mission. And they are using these new photos, plus all the images already available in the EarthKAM system, to enhance their classroom learning in Earth and space science, social studies, geography, mathematics and more.
Windy Mars: A dynamic planet as seen by the HiRISE camera
Bridges, N.T.; Geissler, P.E.; McEwen, A.S.; Thomson, B.J.; Chuang, F.C.; Herkenhoff, K. E.; Keszthelyi, L.P.; Martinez-Alonso, S.
2007-01-01
With a dynamic atmosphere and a large supply of particulate material, the surface of Mars is heavily influenced by wind-driven, or aeolian, processes. The High Resolution Imaging Science Experiment (HiRISE) camera on the Mars Reconnaissance Orbiter (MRO) provides a new view of Martian geology, with the ability to see decimeter-size features. Current sand movement, and evidence for recent bedform development, is observed. Dunes and ripples generally exhibit complex surfaces down to the limits of resolution. Yardangs have diverse textures, with some being massive at HiRISE scale, others having horizontal and cross-cutting layers of variable character, and some exhibiting blocky and polygonal morphologies. "Reticulate" (fine polygonal texture) bedforms are ubiquitous in the thick mantle at the highest elevations. Copyright 2007 by the American Geophysical Union.
In Situ Strategy of the 2011 Mars Science Laboratory to Investigate the Habitability of Ancient Mars
NASA Technical Reports Server (NTRS)
Mahaffy, Paul R.
2011-01-01
The ten science investigations of the 2011 Mars Science Laboratory (MSL) Rover named "Curiosity" seek to provide a quantitative assessment of habitability through chemical and geological measurements from a highly capable robotic platform. This mission seeks to understand if the conditions for life on ancient Mars are preserved in the near-surface geochemical record. The substantial payload resources enabled by MSL's new entry, descent and landing (EDL) system have allowed the inclusion of instrument types new to the Mars surface, including those that can accept delivered samples from rocks and soils and perform a wide range of chemical, isotopic, and mineralogical analyses. The Chemistry and Mineralogy (CheMin) experiment that is located in the interior of the rover is a powder X-ray Diffraction (XRD) and X-ray Fluorescence (XRF) instrument that provides elemental and mineralogical information. The Sample Analysis at Mars (SAM) suite of instruments complements this experiment by analyzing the volatile component of identically processed samples and by analyzing atmospheric composition. Other MSL payload tools such as the Mast Camera (Mastcam) and the Chemistry & Camera (ChemCam) instruments are utilized to identify targets for interrogation first by the arm tools and subsequent ingestion into SAM and CheMin using the Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem. The arm tools include the Mars Hand Lens Imager (MAHLI) and the Alpha Particle X-ray Spectrometer (APXS). The Dynamic Albedo of Neutrons (DAN) instrument provides subsurface identification of hydrogen such as that contained in hydrated minerals.
NASA Technical Reports Server (NTRS)
Ramesham, Rajeshuni; Maki, Justin N.; Cucullu, Gordon C.
2008-01-01
Package Qualification and Verification (PQV) of advanced electronic packaging and interconnect technologies and various other types of qualification hardware for the Mars Exploration Rover/Mars Science Laboratory flight projects has been performed to enhance mission assurance. The qualification of hardware (Engineering Camera and Platinum Resistance Thermometer, PRT) under extreme cold temperatures has been performed with reference to various project requirements. The flight-like packages, sensors, and subassemblies have been selected for the study to survive three times (3x) the total number of expected temperature cycles resulting from all environmental and operational exposures occurring over the life of the flight hardware, including all relevant manufacturing, ground operations and mission phases. Qualification has been performed by subjecting the above flight-like qualification hardware to the environmental temperature extremes and assessing any structural failures or degradation in electrical performance due to either overstress or thermal cycle fatigue. Qualification test results for this flight-like hardware are described in this paper.
NASA Technical Reports Server (NTRS)
2006-01-01
This portion of an image acquired by the Mars Reconnaissance Orbiter's High Resolution Imaging Science Experiment camera shows the Spirit rover's winter campaign site. Spirit was parked on a slope tilted 11 degrees to the north to maximize sunlight during the southern winter season. 'Tyrone' is an area where the rover's wheels disturbed light-toned soils. Remote sensing and in-situ analyses found the light-toned soil at Tyrone to be sulfate rich and hydrated. The original picture is catalogued as PSP_001513_1655_red and was taken on Sept. 29, 2006. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The High Resolution Imaging Science Experiment is operated by the University of Arizona, Tucson, and the instrument was built by Ball Aerospace and Technology Corp., Boulder, Colo.
New Mars Camera's First Image of Mars from Mapping Orbit (Full Frame)
NASA Technical Reports Server (NTRS)
2006-01-01
The high resolution camera on NASA's Mars Reconnaissance Orbiter captured its first image of Mars in the mapping orbit, demonstrating the full resolution capability, on Sept. 29, 2006. The High Resolution Imaging Science Experiment (HiRISE) acquired this first image at 8:16 AM (Pacific Time). With the spacecraft at an altitude of 280 kilometers (174 miles), the image scale is 25 centimeters per pixel (10 inches per pixel). If a person were located on this part of Mars, he or she would just barely be visible in this image. The image covers a small portion of the floor of Ius Chasma, one branch of the giant Valles Marineris system of canyons. The image illustrates a variety of processes that have shaped the Martian surface. There are bedrock exposures of layered materials, which could be sedimentary rocks deposited in water or from the air. Some of the bedrock has been faulted and folded, perhaps the result of large-scale forces in the crust or from a giant landslide. The image resolves rocks as small as 90 centimeters (3 feet) in diameter. It includes many dunes or ridges of windblown sand. This image (TRA_000823_1720) was taken by the High Resolution Imaging Science Experiment camera onboard the Mars Reconnaissance Orbiter spacecraft on Sept. 29, 2006. Shown here is the full image, centered at minus 7.8 degrees latitude, 279.5 degrees east longitude. The image is oriented such that north is to the top. The range to the target site was 297 kilometers (185.6 miles). At this distance the image scale is 25 centimeters (10 inches) per pixel (with one-by-one binning) so objects about 75 centimeters (30 inches) across are resolved. The image was taken at a local Mars time of 3:30 PM and the scene is illuminated from the west with a solar incidence angle of 59.7 degrees, thus the sun was about 30.3 degrees above the horizon. The season on Mars is northern winter, southern summer. [Photojournal note: Due to the large sizes of the high-resolution TIFF and JPEG files, some systems may experience extremely slow downlink time while viewing or downloading these images; some systems may be incapable of handling the download entirely.] NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The HiRISE camera was built by Ball Aerospace & Technologies Corporation, Boulder, Colo., and is operated by the University of Arizona, Tucson.
NASA Technical Reports Server (NTRS)
Vaughan, O. H., Jr.
1990-01-01
Information on the data obtained from the Mesoscale Lightning Experiment flown on STS-26 is provided. The experiment used onboard TV cameras and a 35 mm film camera to obtain data. Data from the 35 mm camera are presented. During the mission, the crew had difficulty locating the various targets of opportunity with the TV cameras. To obtain as much data as possible in the short observational timeline allowed due to other commitments, the crew opted to use the hand-held 35 mm camera.
Students' Framing of Laboratory Exercises Using Infrared Cameras
ERIC Educational Resources Information Center
Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.
2015-01-01
Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the…
ERIC Educational Resources Information Center
Caldwell, Andy
2005-01-01
In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites.…
View From Camera Not Used During Curiosity's First Six Months on Mars
2017-12-08
This view of Curiosity's left-front and left-center wheels and of marks made by wheels on the ground in the "Yellowknife Bay" area comes from one of six cameras used on Mars for the first time more than six months after the rover landed. The left Navigation Camera (Navcam) linked to Curiosity's B-side computer took this image during the 223rd Martian day, or sol, of Curiosity's work on Mars (March 22, 2013). The wheels are 20 inches (50 centimeters) in diameter. Curiosity carries a pair of main computers, redundant to each other, in order to have a backup available if one fails. Each of the computers, A-side and B-side, also has other redundant subsystems linked to just that computer. Curiosity operated on its A-side from before the August 2012 landing until Feb. 28, when engineers commanded a switch to the B-side in response to a memory glitch on the A-side. One set of activities after switching to the B-side computer has been to check the six engineering cameras that are hard-linked to that computer. The rover's science instruments, including five science cameras, can each be operated by either the A-side or B-side computer, whichever is active. However, each of Curiosity's 12 engineering cameras is linked to just one of the computers. The engineering cameras are the Navigation Camera (Navcam), the Front Hazard-Avoidance Camera (Front Hazcam) and Rear Hazard-Avoidance Camera (Rear Hazcam). Each of those three named cameras has four cameras as part of it: two stereo pairs of cameras, with one pair linked to each computer. Only the pairs linked to the active computer can be used, and the A-side computer was active from before landing, in August, until Feb. 28. All six of the B-side engineering cameras have been used during March 2013 and checked out OK. Image Credit: NASA/JPL-Caltech
4K Video of Colorful Liquid in Space
2015-10-09
Once again, astronauts on the International Space Station dissolved an effervescent tablet in a floating ball of water, and captured images using a camera capable of recording four times the resolution of normal high-definition cameras. The higher resolution images and higher frame rate videos can reveal more information when used on science investigations, giving researchers a valuable new tool aboard the space station. This footage is one of the first of its kind. The cameras are being evaluated for capturing science data and vehicle operations by engineers at NASA's Marshall Space Flight Center in Huntsville, Alabama.
Dropping In a Microgravity Environment (DIME) Contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Sandi Thompson of the National Center for Microgravity Research GRC makes a final adjustment to the drop package. This image is from a digital still camera; higher resolution is not available.
Dropping In a Microgravity Environment (DIME) Contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Pictured are students from COSI Academy, Columbus, Ohio and their teacher. The other team was from Sycamore High School in Cincinnati, Ohio. This image is from a digital still camera; higher resolution is not available.
Dropping In a Microgravity Environment (DIME) contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here, students are briefed by NASA engineer Daniel Dietrich at the top of the drop tower. This image is from a digital still camera; higher resolution is not available.
Dropping In a Microgravity Environment (DIME) Contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Meredith Mendenhall of Sycamore High School, Cincinnati, Ohio, flips on a tape recorder in preparation for a drop. This image is from a digital still camera; higher resolution is not available.
1997-07-01
Onboard Space Shuttle Columbia (STS-94) Mission Specialist Donald A. Thomas observes an experiment in the glovebox aboard the Spacelab Science Module. Thomas is looking through an eye-piece of a camcorder and recording his observations on tape for post-flight analysis. Other cameras inside the glovebox are also recording other angles of the experiment or downlinking video to the experiment teams on the ground. The glovebox serves as a safety cabinet with a closed front and a negative pressure differential, preventing spillage and contamination while allowing manipulation of the experiment sample when its containment has to be opened for observation, microscopy, and photography. Although not visible in this view, the glovebox is equipped with windows on the top and each side for these observations.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Pictured are students from COSI Academy, Columbus, Ohio and their teacher. The other team was from Sycamore High School in Cincinnati, Ohio. This image is from a digital still camera; higher resolution is not available.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here students from Sycamore High School in Cincinnati, Ohio, talk with Dr. Dennis Stocker, one of Glenn's lead microgravity scientists, about the uses of the drop tower. This image is from a digital still camera; higher resolution is not available.
Dropping In a Microgravity Environment (DIME) Contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Students from Sycamore High School in Cincinnati, Ohio (girls), and the COSI Academy, Columbus, Ohio (boys), participated. This image is from a digital still camera; higher resolution is not available.
NASA Astrophysics Data System (ADS)
Hansen, Candice; Bolton, S.; Caplinger, M.; Dyches, P.; Jensen, E.; Levin, S.; Ravine, M.
2012-10-01
The camera on the Juno spacecraft is part of the payload specifically for public outreach. Juno’s JunoCam camera team will rely on public participation to accomplish our goals. Our theme is “science in a fishbowl”: camera operations include several amateur communities playing essential roles and the public helping to make decisions. JunoCam is a push-frame imager with 4 filters, built by Malin Space Science Systems (MSSS). It uses the Juno spacecraft rotation to sweep its field of view across the planet. Its wide field of view (58 deg) is optimized to take advantage of Juno’s polar orbit, yielding images of the poles with 50 km spatial scale. At perijove of Juno’s elliptical orbit, images will have 3 km spatial scale. Jupiter is a dynamic planet; timely images of its cloudtops from amateur astronomers will be used to simulate what may be in the camera field of view at a given time. We are developing a website to organize contributions from amateur astronomers and tools to predict where storms will be. Students will lead blog discussions (or the 2016 equivalent) on the merits of imaging any given target, and the entire public is invited to weigh in on both the merits and the actual decision of what images to acquire. Images will be available within days for the public to process. The JunoCam team is relying on the amateur image processing community for color products, maps, and movies. When JunoCam acquires images of the Earth in October 2013, we will use the opportunity to gain experience operating the instrument with public involvement. Although we will have a professional ops team at MSSS, the tiny size of the team overall means that public participation is not just an extra - it is essential to our success.
Foale takes photographs of a BCAT SGSM in the U.S. Lab during Expedition 8
2004-04-05
ISS008-E-20610 (5 April 2004) --- Astronaut C. Michael Foale, Expedition 8 commander and NASA ISS science officer, uses a digital still camera to photograph a Slow Growth Sample Module (SGSM) for the Binary Colloidal Alloy Test-3 (BCAT) experiment. The SGSM is on a mounting bracket attached to the Maintenance Work Area (MWA) table set up in the Destiny laboratory of the International Space Station (ISS).
DPM and Glovebox, Payload Commander Kathy Thornton and Payload Specialist Albert Sacco in Spacelab
1995-10-21
STS073-E-5003 (23 Oct. 1995) --- Astronaut Kathryn C. Thornton, STS-73 payload commander, works at the Drop Physics Module (DPM) on the portside of the science module aboard the Space Shuttle Columbia in Earth orbit. Payload specialist Albert Sacco Jr. conducts an experiment at the Glovebox. This frame was exposed with the color Electronic Still Camera (ESC) assigned to the 16-day United States Microgravity Laboratory (USML-2) mission.
2003-09-08
KENNEDY SPACE CENTER, FLA. - The Window Observational Research Facility (WORF), seen in the Space Station Processing Facility, was designed and built by the Boeing Co. at NASA’s Marshall Space Flight Center in Huntsville, Ala. WORF will be delivered to the International Space Station and placed in the rack position in front of the Destiny lab window, providing locations for attaching cameras, multi-spectral scanners and other instruments. WORF will support a variety of scientific and commercial experiments in areas of Earth systems and processes, global ecological changes in Earth’s biosphere, lithosphere, hydrosphere and climate system, Earth resources, natural hazards, and education. After installation, it will become a permanent focal point for Earth Science research aboard the space station.
2003-09-08
KENNEDY SPACE CENTER, FLA. - Workers in the Space Station Processing Facility check out the Window Observational Research Facility (WORF), designed and built by the Boeing Co. at NASA’s Marshall Space Flight Center in Huntsville, Ala. WORF will be delivered to the International Space Station and placed in the rack position in front of the Destiny lab window, providing locations for attaching cameras, multi-spectral scanners and other instruments. WORF will support a variety of scientific and commercial experiments in areas of Earth systems and processes, global ecological changes in Earth’s biosphere, lithosphere, hydrosphere and climate system, Earth resources, natural hazards, and education. After installation, it will become a permanent focal point for Earth Science research aboard the space station.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here Jose Carrion, a lab mechanic with AKAC, starts the orange-colored drag shield, and the experiment apparatus inside, on the hoist upward to the control station at the top of the drop tower. This image is from a digital still camera; higher resolution is not available.
Dropping In a Microgravity Environment (DIME) contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. This is the interior of the Sycamore High School (Cincinnati, Ohio) students' experiment to observe the flame spreading on a 100 percent cotton T-shirt under low-g. This image is from a digital still camera; higher resolution is not available.
NASA Astrophysics Data System (ADS)
Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, D.; Beabout, B.; Stewart, M.
2014-07-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-α and to detect the Hanle effect in the line core. Due to the nature of Lyman-α polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1% in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1% polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. The CLASP cameras were designed to operate with ≤ 10 e-/pixel/second dark current, ≤ 25 e- read noise, a gain of 2.0 ± 0.5 and ≤ 1.0% residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; dark current, read noise, camera gain and residual non-linearity.
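Gain and read-noise figures such as those above are commonly verified with a mean-variance (photon-transfer) analysis of paired flat-field and bias frames. The following Python sketch illustrates that general method on synthetic frames; the frame size, signal level, and noise values are illustrative assumptions, not measured CLASP values.

```python
# Minimal mean-variance (photon-transfer) sketch on synthetic frames.
# Illustrative numbers only; not CLASP measurements.
import numpy as np

rng = np.random.default_rng(0)
true_gain = 2.0        # e-/DN, illustrative
read_noise_e = 25.0    # e- rms, illustrative
signal_e = 20000.0     # mean flat-field level in electrons
shape = (512, 512)

def bias_frame():
    """Zero-illumination frame in DN: read noise only."""
    return rng.normal(0.0, read_noise_e, shape) / true_gain

def flat_frame():
    """Uniformly illuminated frame in DN: shot noise plus read noise."""
    return (rng.poisson(signal_e, shape) + rng.normal(0.0, read_noise_e, shape)) / true_gain

# The variance of the difference of two identical exposures is twice the
# per-frame temporal variance and is free of fixed-pattern structure.
b1, b2 = bias_frame(), bias_frame()
f1, f2 = flat_frame(), flat_frame()

var_bias_dn = np.var(b1 - b2) / 2.0
var_flat_dn = np.var(f1 - f2) / 2.0
mean_dn = 0.5 * (f1.mean() + f2.mean()) - 0.5 * (b1.mean() + b2.mean())

gain_est = mean_dn / (var_flat_dn - var_bias_dn)     # e-/DN
read_noise_est = gain_est * np.sqrt(var_bias_dn)     # e- rms

print(f"estimated gain:       {gain_est:.2f} e-/DN")
print(f"estimated read noise: {read_noise_est:.1f} e- rms")
```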
Space telescope phase B definition study. Volume 2A: Science instruments, f24 field camera
NASA Technical Reports Server (NTRS)
Grosso, R. P.; Mccarthy, D. J.
1976-01-01
The analysis and design of the F/24 field camera for the space telescope are discussed. The camera was designed for application to the radial bay of the optical telescope assembly and has an on axis field of view of 3 arc-minutes by 3 arc-minutes.
Scientists Must Not Film but Must Appear on Screen!
NASA Astrophysics Data System (ADS)
Gerdes, A.; Madlener, S.
2013-12-01
Film production in science has affected its subjects in a truly remarkable way. Where scientists were once perceived to be poor communicators with an overwhelming aptitude for numbers and figures, audiences now have access to scientists they can understand and even relate to. Over the years, scientists have grown accustomed to involving and using the media in their research and exposing their science to wider audiences, making them better communicators. This is a huge development, and one that is especially noticeable at MARUM, the Center for Marine Environmental Sciences at the University of Bremen/Germany. Over time, the collaboration between the scientists and public relations staff has taught us all to be better at what we do. A unique characteristic of MARUM TV is that more or less all videos are produced 'in house'; we have established the small yet effective infrastructure necessary to develop, execute, and distribute semi-professional videos to access broader audiences and increase world-wide visibility. MARUM TV relies on our research scientists to operate cameras and capture important moments offshore on expedition, and to cooperate with us as we shoot footage of them and conduct interviews onshore in the lab. In turn, we promote their research and help increase their accessibility. At the forefront of our success is the relatively recent implementation of HD cameras on MARUM's fleet of remotely operated vehicles, which capture stunning video footage of the deep sea. Furthermore, sustained collaborations with national TV stations, online media portals, and large production companies help inform our process and increase MARUM's visibility. The result is an extensive suite of about 70 short and long format science videos with some of the highest view counts on YouTube compared to other marine institutes. In the session PA011 'Scientists must film!' we intend to address issues regarding roadblocks to bridging science and media: a) Science communication varies across cultures, posing different challenges for film production depending on the country or even the research institute. b) Audiences expect a high production value in video, which is difficult to achieve when using non-professionals behind the camera. c) Producers must have an acute awareness of both the outside world and the internal workings of a particular institute or research group. We aim to address these challenges and to share our experiences in successful science filmmaking with this presentation.
A Control System and Streaming DAQ Platform with Image-Based Trigger for X-ray Imaging
NASA Astrophysics Data System (ADS)
Stevanovic, Uros; Caselle, Michele; Cecilia, Angelica; Chilingaryan, Suren; Farago, Tomas; Gasilov, Sergey; Herth, Armin; Kopmann, Andreas; Vogelgesang, Matthias; Balzer, Matthias; Baumbach, Tilo; Weber, Marc
2015-06-01
High-speed X-ray imaging applications play a crucial role for non-destructive investigations of the dynamics in material science and biology. On-line data analysis is necessary for quality assurance and data-driven feedback, leading to a more efficient use of beam time and increased data quality. In this article we present a smart camera platform with embedded Field Programmable Gate Array (FPGA) processing that is able to stream and process data continuously in real-time. The setup consists of a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, an FPGA readout card, and a readout computer. It is seamlessly integrated in a new custom experiment control system called Concert that provides a more efficient way of operating a beamline by integrating device control, experiment process control, and data analysis. The potential of the embedded processing is demonstrated by implementing an image-based trigger. It records the temporal evolution of physical events with increased speed while maintaining the full field of view. The complete data acquisition system, with Concert and the smart camera platform, was successfully integrated and used for fast X-ray imaging experiments at KIT's synchrotron radiation facility ANKA.
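To make the image-based trigger idea concrete, the Python/NumPy sketch below emulates the logic in software: compare each frame against a slowly adapting background and fire when enough pixels change. In the system described above this logic runs in FPGA firmware; the thresholds, frame size, and function names here are illustrative assumptions.

```python
# Software emulation of an image-based trigger on synthetic frames.
# Thresholds and sizes are illustrative assumptions.
import numpy as np

def make_trigger(threshold=30.0, changed_fraction=0.01, alpha=0.05):
    background = None

    def process(frame):
        nonlocal background
        frame = frame.astype(np.float32)
        if background is None:
            background = frame.copy()
            return False
        # Fire when the fraction of strongly changed pixels exceeds the limit.
        fired = np.mean(np.abs(frame - background) > threshold) > changed_fraction
        # Slowly adapt the background so illumination drift does not trigger.
        background = (1.0 - alpha) * background + alpha * frame
        return fired

    return process

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    trigger = make_trigger()
    for i in range(20):
        frame = rng.normal(100.0, 5.0, (256, 256))
        if i == 12:                      # simulate a sudden physical event
            frame[100:150, 100:150] += 80.0
        if trigger(frame):
            print(f"trigger fired on frame {i}")
```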
NASA Astrophysics Data System (ADS)
Ahn, Y.; Box, J. E.; Balog, J.; Lewinter, A.
2008-12-01
Monitoring Greenland outlet glaciers using remotely sensed data has drawn great attention in the Earth science community for decades, and time-series analysis of sensor data has provided important information on the variability of glacier flow by detecting speed and thickness changes, tracking features, and acquiring model input. Thanks to advances in commercial digital camera technology and increased solid-state storage, we activated automatic ground-based time-lapse camera stations with high spatial/temporal resolution at west Greenland outlets and collected data at one-hour intervals, continuously for more than one year at some but not all sites. We believe that important information on ice dynamics is contained in these data and that terrestrial mono-/stereo-photogrammetry, along with digital image processing techniques, can provide the theoretical and practical fundamentals for data processing. Time-lapse images over these periods in west Greenland show various phenomena. Problems include rain, snow, fog, shadows, freezing of water on the camera enclosure window, image over-exposure, camera motion, sensor platform drift, foxes chewing instrument cables, and ravens pecking the plastic window. Other problems include feature identification, camera orientation, image registration, feature matching in image pairs, and feature tracking. Another obstacle is that non-metric digital cameras contain large distortions that must be compensated for precise photogrammetric use. Further, a massive number of images needs to be processed in a way that is sufficiently computationally efficient. We meet these challenges by 1) identifying problems in possible photogrammetric processes, 2) categorizing them based on feasibility, and 3) clarifying limitations and alternatives, while emphasizing displacement computation and analyzing regional/temporal variability. We experiment with mono and stereo photogrammetric techniques with the aid of automatic correlation matching for efficiently handling the enormous data volumes.
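One common form of the automatic correlation matching mentioned above is template matching by normalized cross-correlation: a small window around a feature in one time-lapse frame is searched for in the next frame, and the correlation peak gives the pixel displacement. The Python sketch below recovers a known shift between two synthetic frames; the window size, search range, and synthetic data are illustrative assumptions, not the authors' implementation.

```python
# Brute-force normalized cross-correlation displacement tracking on synthetic frames.
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def track(frame0, frame1, center, half=16, search=10):
    """Return (dy, dx) displacement of the template around `center` between frames."""
    cy, cx = center
    tmpl = frame0[cy - half:cy + half, cx - half:cx + half]
    best, best_shift = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = frame1[cy - half + dy:cy + half + dy, cx - half + dx:cx + half + dx]
            score = ncc(tmpl, win)
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift, best

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    frame0 = rng.normal(size=(200, 200))
    frame1 = np.roll(frame0, shift=(3, -5), axis=(0, 1))   # known displacement
    shift, score = track(frame0, frame1, center=(100, 100))
    print(f"recovered displacement (dy, dx) = {shift}, correlation = {score:.3f}")
```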
Dropping In a Microgravity Environment (DIME) contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here Carol Hodanbosi of the National Center for Microgravity Research and Jose Carrion, a lab mechanic with AKAC, prepare a student experiment package (inside the silver-colored frame) inside the orange-colored drag shield that encloses all experiment hardware. This image is from a digital still camera; higher resolution is not available.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here Carol Hodanbosi of the National Center for Microgravity Research and Jose Carrion, a lab mechanic with AKAC, prepare a student experiment package (inside the silver-colored frame) inside the orange-colored drag shield that encloses all experiment hardware. This image is from a digital still camera; higher resolution is not available.
SR-71 Ship #1 - Ultraviolet Experiment
NASA Technical Reports Server (NTRS)
1994-01-01
NASA's SR-71 streaks into the twilight on a night/science flight from the Dryden Flight Research Center, Edwards, California. Mounted in the nose of the SR-71 was an ultraviolet video camera aimed skyward to capture images of stars, asteroids and comets. The science portion of the flight is a project of the Jet Propulsion Laboratory, Pasadena, California. Two SR-71 aircraft have been used by NASA as test beds for high-speed and high-altitude aeronautical research. One early research project flown on one of Dryden's SR-71s consisted of a proposal for a series of flights using the SR-71 as a science camera platform for the Jet Propulsion Laboratory (JPL) of the California Institute of Technology, which operates under contract to NASA in much the way that NASA centers do. In March 1993, an upward-looking ultraviolet (UV) video camera placed in the SR-71's nosebay studied a variety of celestial objects in the ultraviolet light spectrum. The SR-71 was proposed as a test bed for the experiment because it is capable of flying at altitudes above 80,000 feet for an extended length of time. Observation of ultraviolet radiation is not possible from the Earth's surface because the atmosphere's ozone layer absorbs UV rays. Study of UV radiation is important because it is known to cause skin cancer with prolonged exposure. UV radiation is also valuable to study from an astronomical perspective. Satellite study of ultraviolet radiation is very expensive. As a result, the South West Research Institute (SWRI) in Texas developed the hypothesis of using a high-flying aircraft such as the SR-71 to conduct UV observations. The SR-71 is capable of flying above 90 percent of the Earth's atmosphere. The flight program was also designed to test the stability of the aircraft as a test bed for UV observation. A joint flight program was developed between the JPL and NASA's Ames-Dryden Flight Research Facility (redesignated the Dryden Flight Research Center, Edwards, California, in 1994) in conjunction with SWRI to test the hypothesis. Dryden modified the nosebay of the SR-71, creating an upward-observing window to carry SWRI's ultraviolet CCD camera so it could make observations. According to Dryden's SR-71 Project Manager Dave Lux, a single flight of the aircraft confirmed the aircraft's capability and stability as a test bed for UV observations. SWRI's principal investigator was Dr. Allen Stern.
Phootprint - A Phobos sample return mission study
NASA Astrophysics Data System (ADS)
Koschny, Detlef; Svedhem, Håkan; Rebuffat, Denis
Introduction ESA is currently studying a mission to return a sample from Phobos, called Phootprint. This study is performed as part of ESA’s Mars Robotic Exploration Programme. Part of the mission goal is to prepare technology needed for a sample return mission from Mars itself; the mission should also have a strong scientific justification, which is described here. 1. Science goal The main science goal of this mission will be to understand the formation of the Martian moon Phobos and put constraints on the evolution of the solar system. Currently, there are several possibilities for explaining the formation of the Martian moons: (a) co-formation with Mars (b) capture of objects coming close to Mars (c) impact of a large body onto Mars and formation from the impact ejecta The main science goal of this mission is to find out which of the three scenarios is the most probable one. To do this, samples from Phobos would be returned to Earth and analyzed with extremely high precision in ground-based laboratories. An on-board payload is foreseen to provide information to put the sample into the necessary geological context. 2. Mission Spacecraft and payload will be based on experience gained from previous studies of Martian moons and asteroids. In particular the Marco Polo and MarcoPolo-R asteroid sample return mission studies performed at ESA were used as a starting point. Currently, industrial studies are ongoing. The initial starting assumption was to use a Soyuz launcher. Unlike the initial Marco Polo and MarcoPolo-R studies of an asteroid, a transfer stage will be needed. Another main difference from an asteroid mission is the fact that the spacecraft actually orbits Mars, not Phobos or Deimos. It is possible to select a spacecraft orbit, which in a Phobos- or Deimos-centred reference system would give an ellipse around the moon. The following model payload is currently foreseen: - Wide Angle Camera, - Narrow Angle Camera, - Close-Up Camera, - Context camera for sampling context, - visible-IR spectrometer - thermal IR spectrometer - and a Radio Science investigation. It is expected that with these instruments the necessary context for the sample can be provided. The paper will focus on the current status of the mission study.
NASA Technical Reports Server (NTRS)
Trauger, John T.
2005-01-01
Eclipse is a proposed NASA Discovery mission to perform a sensitive imaging survey of nearby planetary systems, including a survey for jovian-sized planets orbiting Sun-like stars to distances of 15 pc. We outline the science objectives of the Eclipse mission and review recent developments in the key enabling technologies. Eclipse is a space telescope concept for high-contrast visible-wavelength imaging and spectrophotometry. Its design incorporates a telescope with an unobscured aperture of 1.8 meters, a coronagraphic camera for suppression of diffracted light, and precise active wavefront correction for the suppression of scattered background light. For reference, Eclipse is designed to reduce the diffracted and scattered starlight between 0.33 and 1.5 arcseconds from the star by three orders of magnitude compared to any HST instrument. The Eclipse mission provides precursor science exploration and technology experience in support of NASA's Terrestrial Planet Finder (TPF) program.
An ordinary camera in an extraordinary location: Outreach with the Mars Webcam
NASA Astrophysics Data System (ADS)
Ormston, T.; Denis, M.; Scuka, D.; Griebel, H.
2011-09-01
The European Space Agency's Mars Express mission was launched in 2003 and was Europe's first mission to Mars. On-board was a small camera designed to provide ‘visual telemetry’ of the separation of the Beagle-2 lander. After achieving its goal it was shut down while the primary science mission of Mars Express got underway. In 2007 this camera was reactivated by the flight control team of Mars Express for the purpose of providing public education and outreach—turning it into the ‘Mars Webcam’. The camera is a small, 640×480 pixel colour CMOS camera with a wide-angle 30°×40° field of view. This makes it very similar in almost every way to the average home PC webcam. The major difference is that this webcam is not in an average location but is instead in orbit around Mars. On a strict basis of non-interference with the primary science activities, the camera is turned on to provide unique wide-angle views of the planet below. A highly automated process ensures that the observations are scheduled on the spacecraft and then uploaded to the internet as rapidly as possible. There is no intermediate stage, so that visitors to the Mars Webcam blog serve as ‘citizen scientists’. Full raw datasets and processing instructions are provided along with a mechanism to allow visitors to comment on the blog. Members of the public are encouraged to use this in either a personal or an educational context and work with the images. We then take their excellent work and showcase it back on the blog. We even apply techniques developed by them to improve the data and webcam experience for others. The accessibility and simplicity of the images also makes the data ideal for educational use, especially as educational projects can then be showcased on the site as inspiration for others. The oft-neglected target audience of space enthusiasts is also important as this allows them to participate as part of an interplanetary instrument team. This paper will cover the history of the project and the technical background behind using the camera and linking the results to an accessible blog format. It will also cover the outreach successes of the project, some of the contributions from the Mars Webcam community, opportunities to use and work with the Mars Webcam and plans for future uses of the camera.
Student designed experiments to learn fluids
NASA Astrophysics Data System (ADS)
Stern, Catalina
2013-11-01
Lasers and high speed cameras are a wonderful tool to visualize the very complex behavior of fluids, and to help students grasp concepts like turbulence, surface tension and vorticity. In this work we present experiments done by physics students in their senior year at the School of Science of the National University of Mexico as a final project in the continuum mechanics course. Every semester, the students make an oral presentation of their work and videos and images are kept in the web page ``Pasión por los Fluidos''. I acknowledge support from the Physics Department of Facultad de Ciencias, Universidad Nacional Autónoma de México.
1973-07-01
SL3-108-1288 (July-Sept. 1973) --- Astronaut Owen K. Garriott, science pilot of the Skylab 3 mission, is stationed at the Apollo Telescope Mount (ATM) console in the Multiple Docking Adapter (MDA) of the Skylab space station in Earth orbit. This picture was taken with a handheld 35mm Nikon camera. Astronauts Garriott, Alan L. Bean and Jack R. Lousma remained with the Skylab space station cluster in orbit for 59 days conducting numerous medical, scientific and technological experiments. In orbit the MDA functions as a major experiment control center for solar observations. From this console the astronauts actively control the ATM solar physics telescopes. Photo credit: NASA
NASA Technical Reports Server (NTRS)
Over, Ann P.
2001-01-01
The Combustion Module-1 (CM-1) was a large, state-of-the-art space shuttle Spacelab facility that was designed, built, and operated on STS-83 and STS-94 by a team from the NASA Glenn Research Center composed of civil servants and local support contractors (Analex and Zin Technologies). CM-1 accomplished the incredible task of providing a safe environment to support flammable and toxic gases while providing a suite of diagnostics for science measurements more extensive than any prior shuttle experiment (or anything since). Finally, CM-1 proved that multiple science investigations can be accommodated in one facility, a crucial step for Glenn's Fluids and Combustion Facility developed for the International Space Station. However, the story does not end with CM-1. In 1998, CM-2 was authorized to take the CM-1 accomplishments a big step further by completing three major steps: (1) converting the entire experiment to operate in a SPACEHAB module; (2) conducting an extensive hardware refurbishment and upgrading diagnostics (e.g., cameras, gas chromatograph, and numerous sensors); and (3) adding a new, completely different combustion experiment.
Portable Long-Wavelength Infrared Camera for Civilian Application
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Krabach, T. N.; Bandara, S. V.; Liu, J. K.
1997-01-01
In this paper, we discuss the performance of this portable long-wavelength infrared camera in quantum efficiency, NEΔT, minimum resolvable temperature difference (MRTD), uniformity, etc., and its application in science, medicine and defense.
Filters for Color Imaging and for Science
2013-03-18
The color cameras on NASA's Mars rover Curiosity, including the pair that make up the rover's Mastcam instrument, use the same type of Bayer pattern RGB filter as found in typical commercial color cameras.
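For readers unfamiliar with Bayer-pattern sensors, the short Python sketch below (not from the source) shows how an RGGB mosaic samples only one color per pixel and how a crude interpolation rebuilds a full-color image; real cameras use more sophisticated demosaicing.

```python
# Toy Bayer RGGB mosaic and nearest-neighbor demosaic; purely illustrative.
import numpy as np

def mosaic_rggb(rgb):
    """Simulate a Bayer RGGB sensor from a full-color image (H and W even)."""
    h, w, _ = rgb.shape
    raw = np.zeros((h, w), dtype=float)
    raw[0::2, 0::2] = rgb[0::2, 0::2, 0]   # red sites
    raw[0::2, 1::2] = rgb[0::2, 1::2, 1]   # green sites
    raw[1::2, 0::2] = rgb[1::2, 0::2, 1]   # green sites
    raw[1::2, 1::2] = rgb[1::2, 1::2, 2]   # blue sites
    return raw

def demosaic_nearest(raw):
    """Rebuild RGB by spreading each 2x2 cell's samples over the whole cell."""
    h, w = raw.shape
    out = np.zeros((h, w, 3), dtype=float)
    out[..., 0] = np.repeat(np.repeat(raw[0::2, 0::2], 2, axis=0), 2, axis=1)
    out[..., 2] = np.repeat(np.repeat(raw[1::2, 1::2], 2, axis=0), 2, axis=1)
    g = 0.5 * (raw[0::2, 1::2] + raw[1::2, 0::2])       # average the two greens
    out[..., 1] = np.repeat(np.repeat(g, 2, axis=0), 2, axis=1)
    return out

if __name__ == "__main__":
    scene = np.random.default_rng(3).uniform(0, 1, (8, 8, 3))
    print(demosaic_nearest(mosaic_rggb(scene)).shape)    # (8, 8, 3)
```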
Surveillance Cameras and Their Use as a Dissecting Microscope in the Teaching of Biological Sciences
ERIC Educational Resources Information Center
Vale, Marcus R.
2016-01-01
Surveillance cameras are prevalent in various public and private areas, and they can also be coupled to optical microscopes and telescopes with excellent results. They are relatively simple cameras without sophisticated technological features and are much less expensive and more accessible to many people. These features enable them to be used in…
Improving accuracy of Plenoptic PIV using two light field cameras
NASA Astrophysics Data System (ADS)
Thurow, Brian; Fahringer, Timothy
2017-11-01
Plenoptic particle image velocimetry (PIV) has recently emerged as a viable technique for acquiring three-dimensional, three-component velocity field data using a single plenoptic, or light field, camera. The simplified experimental arrangement is advantageous in situations where optical access is limited and/or it is not possible to set up the four or more cameras typically required in a tomographic PIV experiment. A significant disadvantage of a single-camera plenoptic PIV experiment, however, is that the accuracy of the velocity measurement along the optical axis of the camera is significantly worse than in the two lateral directions. In this work, we explore the accuracy of plenoptic PIV when two plenoptic cameras are arranged in a stereo imaging configuration. It is found that the addition of a second camera improves the accuracy in all three directions and nearly eliminates any differences between them. This improvement is illustrated using both synthetic and real experiments conducted on a vortex ring using both one and two plenoptic cameras.
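The benefit of the second camera can be illustrated numerically: each camera's displacement estimate is least accurate along its own optical axis, and fusing the two estimates by inverse-covariance weighting nearly equalizes the error in all three components. The Python sketch below demonstrates this with synthetic measurements; the noise levels and camera geometry are illustrative assumptions, not values from the study.

```python
# Toy demonstration of fusing two anisotropic 3D displacement measurements.
# Noise levels and geometry are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(6)
true_disp = np.array([0.5, -0.2, 0.8])          # ground-truth particle displacement

def camera_covariance(axis, sigma_lateral=0.01, sigma_axial=0.10):
    """Anisotropic measurement covariance: worst along the camera's optical axis."""
    axis = axis / np.linalg.norm(axis)
    P_axial = np.outer(axis, axis)
    P_lat = np.eye(3) - P_axial
    return sigma_axial**2 * P_axial + sigma_lateral**2 * P_lat

# Two cameras in a stereo arrangement, optical axes 90 degrees apart.
cov1 = camera_covariance(np.array([0.0, 0.0, 1.0]))
cov2 = camera_covariance(np.array([1.0, 0.0, 0.0]))

m1 = rng.multivariate_normal(true_disp, cov1, size=2000)
m2 = rng.multivariate_normal(true_disp, cov2, size=2000)

# Inverse-covariance (weighted least squares) fusion of the two measurements.
w1, w2 = np.linalg.inv(cov1), np.linalg.inv(cov2)
fused = np.linalg.solve(w1 + w2, (w1 @ m1.T + w2 @ m2.T)).T

for name, m in [("camera 1 alone", m1), ("two-camera fusion", fused)]:
    rms = np.sqrt(np.mean((m - true_disp) ** 2, axis=0))
    print(f"{name}: per-axis RMS error = {np.round(rms, 4)}")
```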
Dropping In a Microgravity Environment (DIME) contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here students from Sycamore High School in Cincinnati, Ohio, talk with Dr. Dennis Stocker, one of Glenn's lead microgravity scientists, about the uses of the drop tower. This image is from a digital still camera; higher resolution is not available.
NASA Technical Reports Server (NTRS)
Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtain, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike
2014-01-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1% in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1% polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E coating allows for a relatively high (30%) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with ≤ 10 e-/pixel/second dark current, ≤ 25 e- read noise, a gain of 2.0 ± 0.5, and ≤ 1.0% residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; dark current, read noise, camera gain and residual non-linearity.
NASA Technical Reports Server (NTRS)
Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, D.; Beabout, B.; Stewart, M.
2014-01-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1 percent in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1 percent polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E coating allows for a relatively high (30 percent) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with ≤ 10 e-/pixel/second dark current, ≤ 25 e- read noise, a gain of 2.0 +/- 0.5, and ≤ 1.0 percent residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; dark current, read noise, camera gain and residual non-linearity.
Camera Test on Curiosity During Flight to Mars
2012-05-07
An in-flight camera check produced this out-of-focus image when NASA's Mars Science Laboratory spacecraft turned on illumination sources that are part of the Curiosity rover's Mars Hand Lens Imager (MAHLI) instrument.
Dropping In a Microgravity Environment (DIME) contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School, designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Here Jose Carrion, a lab mechanic with AKAC, starts the orange-colored drag shield, and the experiment apparatus inside, on the hoist upward to the control station at the top of the drop tower. This image is from a digital still camera; higher resolution is not available.
Development of the FPI+ as facility science instrument for SOFIA cycle four observations
NASA Astrophysics Data System (ADS)
Pfüller, Enrico; Wiedemann, Manuel; Wolf, Jürgen; Krabbe, Alfred
2016-08-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a heavily modified Boeing 747SP aircraft, accommodating a 2.5m infrared telescope. This airborne observation platform takes astronomers to flight altitudes of up to 13.7 km (45,000 ft) and therefore allows an unobstructed view of the infrared universe at wavelengths between 0.3 µm and 1600 µm. SOFIA is currently completing its fourth cycle of observations and utilizes eight different imaging and spectroscopic science instruments. New instruments for SOFIA's cycle 4 observations are the High-resolution Airborne Wideband Camera-plus (HAWC+) and the Focal Plane Imager (FPI+). The latter is an integral part of the telescope assembly and is used on every SOFIA flight to ensure precise tracking on the desired targets. The FPI+ is used as a visual-light photometer in its role as facility science instrument. Since the upgrade of the FPI camera and electronics in 2013, it uses a thermo-electrically cooled science grade EM-CCD sensor inside a commercial-off-the-shelf Andor camera. The back-illuminated sensor has a peak quantum efficiency of 95% and the dark current is as low as 0.01 e-/pix/sec. With this new hardware the telescope has successfully tracked on 16th magnitude stars and thus the sky coverage, i.e., the area of sky that has suitable tracking stars, has increased to 99%. Before its use as an integrated tracking imager, the same type of camera had been used as a standalone diagnostic tool to analyze the telescope pointing stability at frequencies up to 200 Hz (imaging with 400 fps). These measurements help to improve the telescope pointing control algorithms and therefore reduce the image jitter in the focal plane. Science instruments benefit from this improvement with smaller image sizes for longer exposure times. The FPI has also been used to support astronomical observations like stellar occultations by the dwarf planet Pluto and a number of exoplanet transits. The occultation observations especially benefit from the high camera sensitivity, fast readout capability, and low read noise, and it was possible to achieve high time resolution in the photometric light curves. This paper will give an overview of the development from the standalone diagnostic camera to the upgraded guiding/tracking camera, fully integrated into the telescope while still offering the diagnostic capabilities, and finally to its use as a facility science instrument on SOFIA.
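Occultation light curves of the kind mentioned above are commonly built by simple aperture photometry on each frame: sum the counts in a circular aperture around the star and subtract a sky level estimated from a surrounding annulus. The Python sketch below illustrates the idea on synthetic frames; the aperture radii, star model, and simulated dip are illustrative assumptions, not FPI+ parameters.

```python
# Toy aperture-photometry light curve for an occultation-like event.
# All parameters are illustrative assumptions.
import numpy as np

def aperture_photometry(frame, center, r_ap=5.0, r_in=8.0, r_out=12.0):
    yy, xx = np.indices(frame.shape)
    r = np.hypot(yy - center[0], xx - center[1])
    sky = np.median(frame[(r >= r_in) & (r < r_out)])    # per-pixel background
    ap = r < r_ap
    return float(frame[ap].sum() - sky * ap.sum())        # background-subtracted flux

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    yy, xx = np.indices((64, 64))
    psf = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 2.0 ** 2))  # star profile

    light_curve = []
    for i in range(100):
        drop = 0.4 if 40 <= i < 60 else 1.0        # simulated occultation dip
        frame = 50.0 + 1000.0 * drop * psf + rng.normal(0, 3.0, (64, 64))
        light_curve.append(aperture_photometry(frame, (32, 32)))

    print(f"out-of-event flux ~ {np.mean(light_curve[:30]):.0f}, "
          f"in-event flux ~ {np.mean(light_curve[45:55]):.0f}")
```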
Evaluation of the Use of Remote Laboratories for Secondary School Science Education
NASA Astrophysics Data System (ADS)
Lowe, David; Newcombe, Peter; Stumpers, Ben
2013-06-01
Laboratory experimentation is generally considered central to science-based education. Allowing students to "experience" science through various forms of carefully designed practical work, including experimentation, is often claimed to support their learning and motivate their engagement while fulfilling specific curriculum requirements. However, logistical constraints (most especially related to funding) place significant limitations on the ability of schools to provide and maintain high-quality science laboratory experiences and equipment. One potential solution that has recently been the subject of growing interest is the use of remotely accessible laboratories to either supplant, or more commonly to supplement, conventional hands-on laboratories. Remote laboratories allow students and teachers to use high-speed networks, coupled with cameras, sensors, and controllers, to carry out experiments on real physical laboratory apparatus that is located remotely from the student. Research has shown that when used appropriately this can bring a range of potential benefits, including the ability to share resources across multiple institutions, support access to facilities that would otherwise be inaccessible for cost or technical reasons, and provide augmentation of the experimental experience. Whilst there has been considerable work on evaluating the use of remote laboratories within tertiary education, consideration of their role within secondary school science education is much more limited. This paper describes trials of the use of remote laboratories within secondary schools, reporting on the student and teacher reactions to their interactions with the laboratories. The paper concludes that remote laboratories can be highly beneficial, but considerable care must be taken to ensure that their design and delivery address a number of critical issues identified in this paper.
Cameras Monitor Spacecraft Integrity to Prevent Failures
NASA Technical Reports Server (NTRS)
2014-01-01
The Jet Propulsion Laboratory contracted Malin Space Science Systems Inc. to outfit Curiosity with four of its cameras using the latest commercial imaging technology. The company parlayed the knowledge gained while working with NASA to develop an off-the-shelf line of cameras, along with a digital video recorder, designed to help troubleshoot problems that may arise on satellites in space.
NASA Technical Reports Server (NTRS)
2008-01-01
[figure removed for brevity, see original site] Click on the image for movie of Zooming in on Landing Site This animation zooms in on the area on Mars where NASA's Phoenix Mars Lander will touch down on May 25, 2008. The image was taken by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. The first shot shows the spacecraft's landing ellipse in green, the area where Phoenix has a high probability of landing. It then zooms in to show the region's arctic terrain. This polar landscape is relatively free of rocks, with only about 1 to 2 rocks 1.5 meters (4.9 feet) or larger in an area about as big as two football fields. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The High Resolution Imaging Science Experiment is operated by the University of Arizona, Tucson, and the instrument was built by Ball Aerospace & Technologies Corp., Boulder, Colo.
Spirit's Tracks around 'Home Plate'
NASA Technical Reports Server (NTRS)
2006-01-01
[figure removed for brevity, see original site] Annotated Version This portion of an image acquired by the Mars Reconnaissance Orbiter's High Resolution Imaging Science Experiment camera shows the Spirit rover's winter campaign site. The rover is visible. So is the 'Low Ridge' feature where Spirit was parked with an 11-degree northerly tilt to maximize sunlight on the solar panels during the southern winter season. Tracks made by Spirit on the way to 'Home Plate' and to and from 'Tyrone,' an area of light-toned soils exposed by rover wheel motions, are also evident. The original image is catalogued as PSP_001513_1655_red and was taken Sept. 29, 2006. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The High Resolution Imaging Science Experiment is operated by the University of Arizona, Tucson, and the instrument was built by Ball Aerospace and Technology Corp., Boulder, Colo.
Test Image of Earth Rocks by Mars Camera Stereo
2010-11-16
This stereo view of terrestrial rocks combines two images taken by a testing twin of the Mars Hand Lens Imager (MAHLI) camera on NASA's Mars Science Laboratory. 3D glasses are necessary to view this image.
Spatial EPR entanglement in atomic vapor quantum memory
NASA Astrophysics Data System (ADS)
Parniak, Michal; Dabrowski, Michal; Wasilewski, Wojciech
Spatially structured quantum states of light are starting to play a key role in modern quantum science with the rapid development of single-photon sensitive cameras. In particular, the spatial degree of freedom holds promise for enhancing continuous-variable quantum memories. Here we present the first demonstration of spatial entanglement between an atomic spin-wave and a photon measured with an I-sCMOS camera. The system is realized in a warm atomic vapor quantum memory based on rubidium atoms immersed in inert buffer gas. In the experiment we create and characterize a 12-dimensional entangled state exhibiting quantum correlations between a photon and an atomic ensemble in position and momentum bases. This state allows us to demonstrate the Einstein-Podolsky-Rosen paradox in its original version, with an unprecedented delay time of 6 μs between generation of entanglement and detection of the atomic state.
Noise and sensitivity of x-ray framing cameras at Nike (abstract)
NASA Astrophysics Data System (ADS)
Pawley, C. J.; Deniz, A. V.; Lehecka, T.
1999-01-01
X-ray framing cameras are the most widely used tool for radiographing density distributions in laser and Z-pinch driven experiments. The x-ray framing cameras that were developed specifically for experiments on the Nike laser system are described. One of these cameras has been coupled to a CCD camera and was tested for resolution and image noise using both electrons and x rays. The largest source of noise in the images was found to be due to low quantum detection efficiency of x-ray photons.
Illustrating MastCam Capabilities with a Terrestrial Scene
2011-11-28
This set of views illustrates capabilities of the Mast Camera (MastCam) instrument on NASA's Mars Science Laboratory Curiosity rover, using a scene on Earth as an example of what MastCam's two cameras can see from different distances.
3D animation model with augmented reality for natural science learning in elementary school
NASA Astrophysics Data System (ADS)
Hendajani, F.; Hakim, A.; Lusita, M. D.; Saputra, G. E.; Ramadhana, A. P.
2018-05-01
Many primary school students regard Natural Science as a difficult lesson. Many subjects are not easily understood, especially material that teaches theories about natural processes such as the rain cycle, condensation, and many others. The difficulty students experience is that they cannot picture what the material describes. Although some material can be practiced directly, hands-on practice is quite limited, and existing media are typically videos or simulations based on 2D animated images, so the underlying concepts in Natural Science lessons remain poorly understood. This Natural Science learning medium uses 3-dimensional (3D) animation models with augmented reality technology to visualize processes from the Natural Science subject matter. The aim of the application is to improve students' conceptual understanding. The application runs on a personal computer equipped with a webcam, and it displays the 3D animation when the camera recognizes the marker.
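To make the display logic concrete — show the 3D animation only while the webcam sees the printed marker — a minimal Python sketch of the main loop is given below. The OpenCV capture and display calls are standard; detect_marker and render_animation are hypothetical placeholders standing in for the application's own marker-recognition and 3D-rendering routines, which the abstract does not name.

    # Minimal sketch of a marker-triggered AR display loop (assumed structure,
    # not the authors' actual implementation).
    import cv2  # OpenCV, used here only for webcam capture and display

    def detect_marker(frame):
        # Hypothetical placeholder: return True when the printed marker is visible.
        return False

    def render_animation(frame, topic="rain_cycle"):
        # Hypothetical placeholder: overlay the 3D animation for the chosen topic.
        return frame

    def run_ar_lesson():
        cap = cv2.VideoCapture(0)              # the PC's webcam
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if detect_marker(frame):           # marker recognized -> show animation
                frame = render_animation(frame)
            cv2.imshow("Natural Science AR", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
        cap.release()
        cv2.destroyAllWindows()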
Using Wide-Field Meteor Cameras to Actively Engage Students in Science
NASA Astrophysics Data System (ADS)
Kuehn, D. M.; Scales, J. N.
2012-08-01
Astronomy has always afforded teachers an excellent topic for developing students' interest in science. New technology makes it inexpensive to outfit local school districts with sensitive, wide-field video cameras that can detect and track brighter meteors and other objects. While the data collection and analysis process can be mostly automated by software, substantial human involvement is still necessary in rejecting spurious detections, performing dynamics and orbital calculations, and the rare recovery and analysis of fallen meteorites. The continuous monitoring allowed by dedicated wide-field surveillance cameras can provide students with a better understanding of the behavior of the night sky, including meteors and meteor showers, stellar motion, the motion of the Sun, Moon, and planets, phases of the Moon, meteorological phenomena, etc. Additionally, some students intrigued by the possibility of UFOs and "alien visitors" may find that actual monitoring data can help them develop methods for identifying "unknown" objects. We currently have two ultra-low-light-level surveillance cameras coupled to fish-eye lenses that are actively obtaining data. We have developed curricula suitable for middle or high school students in astronomy and earth science courses and are in the process of testing and revising our materials.
The Chang'e 3 Mission Overview
NASA Astrophysics Data System (ADS)
Li, Chunlai; Liu, Jianjun; Ren, Xin; Zuo, Wei; Tan, Xu; Wen, Weibin; Li, Han; Mu, Lingli; Su, Yan; Zhang, Hongbo; Yan, Jun; Ouyang, Ziyuan
2015-07-01
The Chang'e 3 (CE-3) mission was implemented as the first lander/rover mission of the Chinese Lunar Exploration Program (CLEP). After its successful launch at 01:30 local time on December 2, 2013, CE-3 was inserted into an eccentric polar lunar orbit on December 6, and landed to the east of a 430 m crater in northwestern Mare Imbrium (19.51°W, 44.12°N) at 21:11 on December 14, 2013. The Yutu rover separated from the lander at 04:35, December 15, and traversed for a total of 0.114 km. Acquisition of science data began during the descent of the lander and will continue for 12 months during the nominal mission. The CE-3 lander and rover each carry four science instruments. Instruments on the lander are: Landing Camera (LCAM), Terrain Camera (TCAM), Extreme Ultraviolet Camera (EUVC), and Moon-based Ultraviolet Telescope (MUVT). The four instruments on the rover are: Panoramic Camera (PCAM), VIS-NIR Imaging Spectrometer (VNIS), Active Particle induced X-ray Spectrometer (APXS), and Lunar Penetrating Radar (LPR). The science objectives of the CE-3 mission include: (1) investigation of the morphological features and geological structures of and near the landing area; (2) integrated in-situ analysis of mineral and chemical composition of and near the landing area; and (3) exploration of the terrestrial-lunar space environment and lunar-based astronomical observations. This paper describes the CE-3 objectives and measurements that address the science objectives outlined by the Comprehensive Demonstration Report of Phase II of CLEP. The CE-3 team has archived the initial science data, and we describe data accessibility by the science community.
Pinhole Cameras: For Science, Art, and Fun!
ERIC Educational Resources Information Center
Button, Clare
2007-01-01
A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for a short amount of time by means of a hand-operated shutter. The pinhole admits only a very narrow beam of light, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and…
Camera traps and activity signs to estimate wild boar density and derive abundance indices.
Massei, Giovanna; Coats, Julia; Lambert, Mark Simon; Pietravalle, Stephane; Gill, Robin; Cowan, Dave
2018-04-01
Populations of wild boar and feral pigs are increasing worldwide, in parallel with their significant environmental and economic impact. Reliable methods of monitoring trends and estimating abundance are needed to measure the effects of interventions on population size. The main aims of this study, carried out in five English woodlands, were: (i) to compare wild boar abundance indices obtained from camera trap surveys and from activity signs; and (ii) to assess the precision of density estimates in relation to different densities of camera traps. For each woodland, we calculated a passive activity index (PAI) based on camera trap surveys, rooting activity and wild boar trails on transects, and estimated absolute densities based on camera trap surveys. PAIs obtained using different methods showed similar patterns. We found significant between-year differences in abundance of wild boar using PAIs based on camera trap surveys and on trails on transects, but not on signs of rooting on transects. The density of wild boar from camera trap surveys varied between 0.7 and 7 animals/km². Increasing the density of camera traps above nine per km² did not increase the precision of the estimate of wild boar density. PAIs based on number of wild boar trails and on camera trap data appear to be more sensitive to changes in population size than PAIs based on signs of rooting. For wild boar densities similar to those recorded in this study, nine camera traps per km² are sufficient to estimate the mean density of wild boar. © 2017 Crown copyright. Pest Management Science © 2017 Society of Chemical Industry.
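The abstract does not give the exact formulation of the passive activity index; a common choice (assumed here purely for illustration) is the mean number of trigger events per camera per trap-night, which can be computed as in the short sketch below.

    # Illustrative passive activity index (PAI) from camera-trap records.
    # Assumed formulation: mean detections per camera-trap-night; the paper's
    # exact index may differ.
    from collections import defaultdict

    def passive_activity_index(records, nights_per_camera):
        """records: iterable of (camera_id, timestamp) detection events.
        nights_per_camera: dict mapping camera_id -> active trap-nights."""
        counts = defaultdict(int)
        for camera_id, _ in records:
            counts[camera_id] += 1
        rates = [counts[c] / n for c, n in nights_per_camera.items() if n > 0]
        return sum(rates) / len(rates) if rates else 0.0

    # Example: two cameras run for 10 nights each.
    records = [("cam1", "2016-03-01T22:14"), ("cam1", "2016-03-02T01:03"),
               ("cam2", "2016-03-01T23:40")]
    print(passive_activity_index(records, {"cam1": 10, "cam2": 10}))  # -> 0.15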
Assessing the Potential of Low-Cost 3D Cameras for the Rapid Measurement of Plant Woody Structure
Nock, Charles A; Taugourdeau, Olivier; Delagrange, Sylvain; Messier, Christian
2013-01-01
Detailed 3D plant architectural data have numerous applications in plant science, but many existing approaches for 3D data collection are time-consuming and/or require costly equipment. Recently, there has been rapid growth in the availability of low-cost 3D cameras and related open source software applications. 3D cameras may provide measurements of key components of plant architecture such as stem diameters and lengths; however, few tests of 3D cameras for the measurement of plant architecture have been conducted. Here, we measured Salix branch segments ranging from 2–13 mm in diameter with an Asus Xtion camera to quantify the limits and accuracy of branch diameter measurement with a 3D camera. By scanning at a variety of distances we also quantified the effect of scanning distance. We also tested the ability of the program KinFu, for continuous 3D object scanning and modeling, and of other similar software to accurately record stem diameters and capture plant form (<3 m in height). Given its ability to accurately capture the diameter of branches >6 mm, the Asus Xtion may provide a novel method for the collection of 3D data on the branching architecture of woody plants. Improvements in camera measurement accuracy and available software are likely to further improve the utility of 3D cameras for the plant sciences in the future. PMID:24287538
Camera systems in human motion analysis for biomedical applications
NASA Astrophysics Data System (ADS)
Chin, Lim Chee; Basah, Shafriza Nisha; Yaacob, Sazali; Juan, Yeap Ewe; Kadir, Aida Khairunnisaa Ab.
2015-05-01
Human Motion Analysis (HMA) has been one of the major interests among researchers in the fields of computer vision, artificial intelligence, and biomedical engineering and sciences. This is due to its wide and promising biomedical applications, namely bio-instrumentation for human-computer interfacing and surveillance systems for monitoring human behaviour, as well as analysis of biomedical signals and images for diagnosis and rehabilitation applications. This paper provides an extensive review of the camera systems used in HMA, including their taxonomy, camera types, camera calibration and camera configuration. The review focuses on evaluating camera system considerations for HMA systems intended specifically for biomedical applications. This review is important as it provides guidelines and recommendations for researchers and practitioners in selecting a camera system for an HMA system for biomedical applications.
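Camera calibration is one of the topics the review covers; as a generic illustration (a textbook OpenCV chessboard calibration, not a procedure taken from any of the reviewed HMA systems), the intrinsics and distortion coefficients of a single camera can be estimated as sketched below.

    # Generic pinhole-camera calibration from chessboard images (standard OpenCV
    # recipe, shown only to illustrate the calibration step discussed in the review).
    import glob
    import cv2
    import numpy as np

    pattern = (9, 6)  # inner-corner count of the chessboard (assumed example)
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_points, img_points, image_size = [], [], None
    for path in glob.glob("calib_images/*.png"):   # hypothetical image folder
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_points.append(objp)
            img_points.append(corners)
            image_size = gray.shape[::-1]

    if obj_points:
        # Intrinsic matrix K and lens-distortion coefficients of the camera model.
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_points, img_points, image_size, None, None)
        print("RMS reprojection error:", rms)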
NASA Technical Reports Server (NTRS)
Heizer, Barbara L.
1992-01-01
The Crystals by Vapor Transport Experiment (CVTE) and Space Experiments Facility (SEF) are materials processing facilities designed and built for use on the Space Shuttle mid deck. The CVTE was built as a commercial facility owned by the Boeing Company. The SEF was built under contract to the UAH Center for Commercial Development of Space (CCDS). Both facilities include up to three furnaces capable of reaching 850 °C minimum, stand-alone electronics and software, and independent cooling control. In addition, the CVTE includes a dedicated stowage locker for cameras, a laptop computer, and other ancillary equipment. Both systems are designed to fly in a Middeck Accommodations Rack (MAR), though the SEF is currently being integrated into a Spacehab rack. The CVTE hardware includes two transparent furnaces capable of achieving temperatures in the 850 to 870 °C range. The transparent feature allows scientists/astronauts to directly observe and affect crystal growth both on the ground and in space. Cameras mounted to the rack provide photodocumentation of the crystal growth. The basic design of the furnace allows for modification to accommodate techniques other than vapor crystal growth. Early in the CVTE program, the decision was made to assign a principal scientist to develop the experiment plan, affect the hardware/software design, run the ground and flight research effort, and interface with the scientific community. The principal scientist is responsible to the program manager and is a critical member of the engineering development team. As a result of this decision, the hardware/experiment requirements were established in such a way as to balance the engineering and science demands on the equipment. Program schedules for hardware development, experiment definition and material selection, flight operations development, and crew training for both ground support personnel and astronauts were all planned and carried out with the understanding that the success of the program science was as important as the hardware functionality. How the CVTE payload was designed and what it is capable of, the philosophy of including the scientists in design and operations decisions, and the lessons learned during the integration process are discussed.
Baseball caps of the Atlanta Braves and Cleveland Indians in the flight deck
1995-10-25
STS073-E-5135 (26 Oct. 1995) --- Baseball caps from the two 1995 World Series representative franchises float near the cabin windows of the Earth-orbiting space shuttle Columbia, with the Earth in the background. The American League champion Cleveland Indians and their National League counterpart Atlanta Braves were engaged in a scheduled best-of-seven World Series throughout the first portion of the scheduled 16-day mission in space. Off-duty crewmembers came out of a rest period to set up the scene in tribute to the October classic. The crew will continue working in shifts around the clock on a diverse assortment of United States Microgravity Laboratory (USML-2) experiments located in the science module. Fields of study include fluid physics, materials science, biotechnology, combustion science and commercial space processing technologies. The frame was exposed with an Electronic Still Camera (ESC).
Skylab mission report, third visit
NASA Technical Reports Server (NTRS)
1974-01-01
An evaluation is presented of the operational and engineering aspects of the third Skylab visit, including information on the performance of the command and service module and the experiment hardware, the crew's evaluation of the visit, and other visit-related areas of interest such as biomedical observations. The specific areas discussed are contained in the following: (1) solar physics and astrophysics investigations; (2) Comet Kohoutek experiments; (3) medical experiments; (4) earth observations, including data for the multispectral photographic facility, the earth terrain camera, and the microwave radiometer/scatterometer and altimeter; (5) engineering and technology experiments; (6) food and medical operational equipment; (7) hardware and experiment anomalies; and (8) mission support, mission objectives, flight planning, and launch phase summary. Conclusions discussed as a result of the third visit to Skylab involve the advancement of the sciences, practical applications, the durability of man and systems in space, and spaceflight effectiveness and economy.
2001-05-02
John Henson (grade 12) and Suzi Bryce (grade 10) from DuPont Manual High School in Louisville, Kentucky, conduct a drop with NASA's Microgravity Demonstrator. A camera and a TV/VCR unit let students play back recordings of how different physical devices behave differently during freefall as compared to 1-g. The activity was part of the education outreach segment of the Pan-Pacific Basin Workshop on Microgravity Sciences held in Pasadena, California. The event originated at the California Science Center in Los Angeles. The DuPont Manual students patched in to the event through the distance learning lab at the Louisville Science Center. This image is from a digital still camera; higher resolution is not available.
The Role of APEX as a Pathfinder for AtLAST
NASA Astrophysics Data System (ADS)
Wyrowski, Friedrich
2018-01-01
Now more than 12 years in operation, the Atacama Pathfinder Experiment (APEX) 12 m submillimeter telescope has contributed significantly to a wide variety of submillimeter astronomy science areas, ranging from the discovery of new molecules to large and deep imaging of the submillimeter sky. While ALMA operation is in full swing, APEX is strengthening its role not only as a pathfinder for studying large source samples and spatial scales to prepare detailed high angular resolution ALMA follow-ups, but also as a fast-response instrument to complement new results from ALMA. Furthermore, APEX ensures southern-hemisphere access for submillimeter projects complementing archival Herschel research as well as new SOFIA science. With new broadband and multipixel receivers as well as large cameras for wide-field continuum imaging, APEX will pave the way towards the science envisioned with AtLAST. In this contribution, the current status and ongoing upgrades of APEX are discussed, with an emphasis on the importance of continuous cutting-edge science and state-of-the-art instrumentation that will bridge the gap towards AtLAST.
System of Programmed Modules for Measuring Photographs with a Gamma-Telescope
NASA Technical Reports Server (NTRS)
Averin, S. A.; Veselova, G. V.; Navasardyan, G. V.
1978-01-01
Physical experiments using tracking cameras have produced hundreds of thousands of stereo photographs of events. To process such a large volume of information, automatic and semiautomatic measuring systems are required. At the Institute of Space Research of the Academy of Sciences of the USSR, a system for processing film information from the spark gamma-telescope was developed. The system is based on a BPS-75 projector operating in line with the Elektronika 1001 minicomputer. The report describes this system, and the various computer programs available to the operators are discussed.
2001-01-24
The Laminar Soot Processes (LSP) experiment under way during the Microgravity Sciences Lab-1 mission in 1997. LSP-2 will fly in the STS-107 Research 1 mission in 2001. The principal investigator is Dr. Gerard Faeth of the University of Michigan. LSP uses a small jet burner, similar to a classroom butane lighter, that produces flames up to 60 mm (2.3 in) long. Diagnostics include color TV cameras, a temperature sensor, and laser images whose darkness indicates the quantity of soot produced in the flame. Glenn Research Center in Cleveland, OH, manages the project.
2001-01-24
Image of soot (smoke) plume made for the Laminar Soot Processes (LSP) experiment during the Microgravity Sciences Lab-1 mission in 1997. LSP-2 will fly in the STS-107 Research 1 mission in 2002. The principal investigator is Dr. Gerard Faeth of the University of Michigan. LSP uses a small jet burner, similar to a classroom butane lighter, that produces flames up to 60 mm (2.3 in) long. Diagnostics include color TV cameras, a temperature sensor, and laser images whose darkness indicates the quantity of soot produced in the flame. Glenn Research Center in Cleveland, OH, manages the project.
NASA Technical Reports Server (NTRS)
2001-01-01
The Laminar Soot Processes (LSP) experiment under way during the Microgravity Sciences Lab-1 mission in 1997. LSP-2 will fly in the STS-107 Research 1 mission in 2001. The principal investigator is Dr. Gerard Faeth of the University of Michigan. LSP uses a small jet burner, similar to a classroom butane lighter, that produces flames up to 60 mm (2.3 in) long. Diagnostics include color TV cameras, a temperature sensor, and laser images whose darkness indicates the quantity of soot produced in the flame. Glenn Research Center in Cleveland, OH, manages the project.
Payload Specialist Taylor Wang performs repairs on Drop Dynamics Module
1985-05-01
51B-03-035 (29 April-6 May 1985) --- Payload specialist Taylor G. Wang performs a repair task on the Drop Dynamics Module (DDM) in the Science Module aboard the Earth-orbiting Space Shuttle Challenger. The photo was taken with a 35mm camera. Dr. Wang is principal investigator for the first time-to-fly experiment, developed by his team at NASA's Jet Propulsion Laboratory (JPL), Pasadena, California. This photo was among the first to be released by NASA upon return to Earth by the Spacelab 3 crew.
NASA Astrophysics Data System (ADS)
Adams, M.; Smith, J. A.; Kloostra, E.; Knupp, K. R.; Taylor, K.; Anderson, S.; Baskauf, C. J.; Buckner, S.; DiMatties, J.; Fry, C. D.; Gaither, B.; Galben, C. W.; Gallagher, D. L.; Heaston, M. P.; Kraft, J.; Meisch, K.; Mills, R.; Nations, C.; Nielson, D.; Oelgoetz, J.; Rawlins, L. P.; Sudbrink, D. L.; Wright, A.
2017-12-01
For the August 2017 eclipse, NASA's Marshall Space Flight Center partnered with the U.S. Space and Rocket Center (USSRC), Austin Peay State University (APSU) in Clarksville, Tennessee, the University of Alabama in Huntsville (UAH), the Interactive NASA Space Physics Ionosphere Radio Experiments (INSPIRE) Project, and the local school systems of Montgomery County, Tennessee, and Christian County, Kentucky. Multiple site visits and workshops were carried out during the first eight months of 2017 to prepare local teachers and students for the eclipse. A special curriculum was developed to prepare USSRC Space Camp and INSPIRE students to observe and participate in science measurements during the eclipse. Representatives from the Christian County school system and APSU carried out observations for the Citizen Continental-America Telescopic Eclipse (CATE) Experiment at two separate locations. UAH and APSU, as part of the Montana State Ballooning Project, launched balloons carrying video cameras and other instruments. USSRC Space Camp students and counselors and INSPIRE students conducted science experiments that included the following: atmospheric science investigations of the atmospheric boundary layer, very-low-frequency and Ham radio observations to investigate ionospheric responses to the eclipse, animal and insect observations, solar-coronal observations, and observations of eclipse shadow bands. We report on the results of all these investigations.
United States Naval Academy Polar Science Program's Visual Arctic Observing Buoys; The IceGoat
NASA Astrophysics Data System (ADS)
Woods, J. E.; Clemente-Colon, P.; Nghiem, S. V.; Rigor, I.; Valentic, T. A.
2012-12-01
The U.S. Naval Academy Oceanography Department currently has a curriculum-based Polar Science Program (USNA PSP). Within the PSP there is an Arctic Buoy Program (ABP) student research component that includes the design, build, testing and deployment of Arctic buoys. Establishing an active, field-research program in polar science will greatly enhance Midshipman education and research, as well as introduce future Naval Officers to the Arctic environment. The Oceanography Department has engaged the USNA Ocean Engineering, Systems Engineering, Aerospace Engineering, and Computer Science Departments and developed a USNA visual Arctic observing buoy, IceGoat1, which was designed, built, and deployed by midshipmen. The experience gained through polar field studies and data derived from these buoys will be used to enhance course materials and laboratories and will also be used directly in Midshipman independent research projects. The USNA PSP successfully deployed IceGoat1 during the BROMEX 2012 field campaign out of Barrow, AK in March 2012. This buoy reports near-real-time observations of air temperature, sea temperature, atmospheric pressure, position, and images from two mounted webcams. The importance of inserting this unique type of buoy into the U.S. Interagency Arctic Buoy Program and the International Arctic Buoy Programme (USIABP/IABP) array lies in cross-validating satellite observations of sea ice cover in the Arctic against the buoy's webcams. We also propose to develop multiple sensor packages for the IceGoat, to include a more robust weather suite and a passive acoustic hydrophone. Remote cameras on buoys have provided crucial qualitative information that complements the quantitative measurements of geophysical parameters. For example, the mechanical anemometers on the IABP Polar Arctic Weather Station at the North Pole Environmental Observatory (NPEO) have at times reported zero wind speeds, and inspection of the images from the NPEO cameras has shown frosting on the camera during these same periods, indicating that the anemometer had temporarily frozen up. Later, when the camera lens cleared, the anemometers resumed providing reasonable wind speeds. The cameras have also provided confirmation of the onset of melt and freeze, and indications of cloudy and clear skies. USNA PSP will monitor meteorological and oceanographic parameters of the Arctic environment remotely via its own buoys. Web cameras will provide near-real-time visual observations of the buoys' current positions, allowing for instant validation of other remote sensors and modeled data. Each buoy will be developed with, at a minimum, a meteorological sensor package in accordance with IABP protocol (2 m air temperature, SLP). Platforms will also be developed with new sensor packages, possibly including wind speed, ice temperature, sea ice thickness, underwater acoustics, and new communications suites (Iridium, radio). The uniqueness of the IceGoat is that it is based on the new AXIB buoy designed by LBI, Inc., which has a proven record of surviving the harsh marginal ice zone environment. IceGoat1 will be deployed in the High Arctic during the USCGC HEALY cruise in late August 2012.
Mars Color Imager (MARCI) on the Mars Climate Orbiter
Malin, M.C.; Bell, J.F.; Calvin, W.; Clancy, R.T.; Haberle, R.M.; James, P.B.; Lee, S.W.; Thomas, P.C.; Caplinger, M.A.
2001-01-01
The Mars Color Imager, or MARCI, experiment on the Mars Climate Orbiter (MCO) consists of two cameras with unique optics and identical focal plane assemblies (FPAs), Data Acquisition System (DAS) electronics, and power supplies. Each camera is characterized by small physical size and mass (~6 x 6 x 12 cm, including baffle; <500 g), low power requirements (<2.5 W, including power supply losses), and high science performance (1000 x 1000 pixel, low noise). The Wide Angle (WA) camera will have the capability to map Mars in five visible and two ultraviolet spectral bands at a resolution of better than 8 km/pixel under the worst case downlink data rate. Under better downlink conditions the WA will provide kilometer-scale global maps of atmospheric phenomena such as clouds, hazes, dust storms, and the polar hood. Limb observations will provide additional detail on atmospheric structure at 1/3 scale-height resolution. The Medium Angle (MA) camera is designed to study selected areas of Mars at regional scale. From 400 km altitude its 6° FOV, which covers ~40 km at 40 m/pixel, will permit all locations on the planet except the poles to be accessible for image acquisitions every two mapping cycles (roughly 52 sols). Eight spectral channels between 425 and 1000 nm provide the ability to discriminate both atmospheric and surface features on the basis of composition. The primary science objectives of MARCI are to (1) observe Martian atmospheric processes at synoptic scales and mesoscales, (2) study details of the interaction of the atmosphere with the surface at a variety of scales in both space and time, and (3) examine surface features characteristic of the evolution of the Martian climate over time. MARCI will directly address two of the three high-level goals of the Mars Surveyor Program: Climate and Resources. Life, the third goal, will be addressed indirectly through the environmental factors associated with the other two goals. Copyright 2001 by the American Geophysical Union.
The Mars Color Imager (MARCI) on the Mars Climate Orbiter
NASA Astrophysics Data System (ADS)
Malin, M. C.; Calvin, W.; Clancy, R. T.; Haberle, R. M.; James, P. B.; Lee, S. W.; Thomas, P. C.; Caplinger, M. A.
2001-08-01
The Mars Color Imager, or MARCI, experiment on the Mars Climate Orbiter (MCO) consists of two cameras with unique optics and identical focal plane assemblies (FPAs), Data Acquisition System (DAS) electronics, and power supplies. Each camera is characterized by small physical size and mass (~6 × 6 × 12 cm, including baffle; <500 g), low power requirements (<2.5 W, including power supply losses), and high science performance (1000 × 1000 pixel, low noise). The Wide Angle (WA) camera will have the capability to map Mars in five visible and two ultraviolet spectral bands at a resolution of better than 8 km/pixel under the worst case downlink data rate. Under better downlink conditions the WA will provide kilometer-scale global maps of atmospheric phenomena such as clouds, hazes, dust storms, and the polar hood. Limb observations will provide additional detail on atmospheric structure at 1/3 scale-height resolution.
Key and Driving Requirements for the Juno Payload of Instruments
NASA Technical Reports Server (NTRS)
Dodge, Randy; Boyles, Mark A.; Rasbach, Chuck E.
2007-01-01
The Juno Mission was selected in the summer of 2005 via NASA's New Frontiers competitive AO process (refer to http://www.nasa.gov/home/hqnews/2005/jun/HQ_05138_New_Frontiers_2.html). The Juno project is led by a Principal Investigator based at Southwest Research Institute [SwRI] in San Antonio, Texas, with project management based at the Jet Propulsion Laboratory [JPL] in Pasadena, California, while the spacecraft design and flight system integration are under contract to Lockheed Martin Space Systems Company [LM-SSC] in Denver, Colorado. The payload suite consists of a large number of instruments covering a wide spectrum of experimentation. The science team includes a lead Co-Investigator for each of the following experiments: a Magnetometer experiment (consisting of both a FluxGate Magnetometer (FGM) built at Goddard Space Flight Center [GSFC] and a Scalar Helium Magnetometer (SHM) built at JPL), a MicroWave Radiometer (MWR) also built at JPL, a Gravity Science experiment (GS) implemented via the telecom subsystem, two complementary particle instruments (the Jovian Auroral Distribution Experiment, JADE, developed by SwRI, and the Juno Energetic-particle Detector Instrument, JEDI, from the Applied Physics Lab [APL]; JEDI and JADE both measure electrons and ions), an Ultraviolet Spectrometer (UVS) also developed at SwRI, and a radio and plasma (WAVES) experiment from the University of Iowa. In addition, a visible camera (JunoCam) is included in the payload to facilitate education and public outreach (designed and fabricated by Malin Space Science Systems [MSSS]).
NASA Astrophysics Data System (ADS)
Ivanov, A. B.; Rossi, A.
2009-04-01
Studies of layered terrains in polar regions, as well as inside craters and other areas on Mars, often require knowledge of local topography at much finer resolution than the global MOLA topography allows. For example, in the polar layered deposits, spatial relationships are important for understanding the unconformities that are observed at the edges of the layered terrains [15,3]. Their formation process is not understood at this point, yet fine-scale topography, together with ground-penetrating radars such as SHARAD and MARSIS, may shed light on their 3D structure. Landing site analysis also requires knowledge of local slopes and roughness at scales from 1 to 10 m [1,2]. The Mars Orbiter Camera [10] has taken stereo images at these scales; however, interpretation was difficult due to unstable behavior of the Mars Global Surveyor spacecraft during image acquisition (a wobbling effect). Mars Reconnaissance Orbiter (MRO) is much better stabilized, since this is required for optimal operation of its high-resolution camera. In this work we have utilized data from MRO sensors (the CTX camera [11] and the HiRISE camera [12]) in order to derive digital elevation models (DEMs) from images targeted as stereo pairs. We employed methods and approaches developed for Mars Orbiter Camera (MOC) stereo data [4,5]. CTX data vary in resolution, and the stereo pairs analyzed in this work can be processed at approximately 10 m scale. HiRISE images allow DEM post spacing of around 1 meter. The latter are very large images, and our computing infrastructure was only able either to process reduced-resolution images covering a larger surface or to work with smaller patches at the original resolution. We employed the stereo matching technique described in [5,9], in conjunction with radiometric and geometric image processing in ISIS3 [16]. This technique is capable of deriving tiepoint co-registration at subpixel precision and has proven itself when used for Pathfinder and MER operations [8]. A considerable part of this work was to accommodate CTX and HiRISE image processing in the existing data processing pipeline and to improve the pipeline at the same time. Currently the workflow is not finished: DEM units are relative and are not in elevation. We have been able to derive successful DEMs from CTX data over Becquerel [14] and Crommelin craters, as well as for some areas in the North Polar Layered Terrain. Due to its tremendous resolution, HiRISE data show great surface detail, allowing better correlation than the other sensors considered in this work. In all cases the DEMs showed considerable potential for exploration of terrain characteristics. Next steps include cross-validating results with DEMs produced by other teams and sensors (HRSC [6], HiRISE [7]) and providing elevation in terms of absolute height over the MOLA areoid. MRO imaging data allow us an unprecedented look at Martian terrain. This work provides a step toward derivation of DEMs from HiRISE and CTX datasets and is currently undergoing validation against other existing datasets. We will present our latest results for layering structures in both the North and South Polar Layered Deposits, as well as layered structures inside Becquerel and Crommelin craters. Digital elevation models derived from the CTX sensor can also be utilized effectively as an input for clutter reduction models, which are in turn used for the ground-penetrating SHARAD instrument [13]. References. [1] R. Arvidson, et al. Mars exploration program 2007 Phoenix landing site selection and characteristics. Journal of Geophysical Research-Planets, 113, JUN 19 2008. [2] M.
Golombek, et al. Assessment of Mars Exploration Rover landing site predictions. Nature, 436(7047):44-48, JUL 7 2005. [3] K. E. Herkenhoff, et al. Meter-scale morphology of the north polar region of Mars. Science, 317(5845):1711-1715, SEP 21 2007. [4] A. B. Ivanov. Ten-Meter Scale Topography and Roughness of Mars Exploration Rovers Landing Sites and Martian Polar Regions. Volume 34 of Lunar and Planetary Inst. Technical Report, pages 2084-+, Mar. 2003. [5] A. B. Ivanov and J. J. Lorre. Analysis of Mars Orbiter Camera Stereo Pairs. In Lunar and Planetary Institute Conference Abstracts, volume 33 of Lunar and Planetary Inst. Technical Report, pages 1845-+, Mar. 2002. [6] R. Jaumann, et al. The high-resolution stereo camera (HRSC) experiment on Mars Express: Instrument aspects and experiment conduct from interplanetary cruise through the nominal mission. Planetary and Space Science, 55(7-8):928-952, MAY 2007. [7] R. L. Kirk, et al. Ultrahigh resolution topographic mapping of Mars with MRO HiRISE stereo images: Meter-scale slopes of candidate Phoenix landing sites. Journal of Geophysical Research-Planets, 113, NOV 15 2008. [8] S. Lavoie, et al. Processing and analysis of Mars Pathfinder science data at the Jet Propulsion Laboratory's Science Data Processing Systems Section. Journal of Geophysical Research-Planets, 104(E4):8831-8852, APR 25 1999. [9] J. J. Lorre, et al. Recent developments at JPL in the application of image processing to astronomy. In D. L. Crawford, editor, Instrumentation in Astronomy III, volume 172 of Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, pages 394-402, 1979. [10] M. Malin, et al. Early views of the Martian surface from the Mars Orbiter Camera of Mars Global Surveyor. Science, 279(5357):1681-1685, MAR 13 1998. [11] M. C. Malin, et al. Context Camera investigation on board the Mars Reconnaissance Orbiter. Journal of Geophysical Research-Planets, 112(E5), MAY 18 2007. [12] A. S. McEwen, et al. Mars Reconnaissance Orbiter's High Resolution Imaging Science Experiment (HiRISE). Journal of Geophysical Research-Planets, 112(E5), MAY 17 2007. [13] A. Rossi, et al. Multi-spacecraft synergy with MEX HRSC and MRO SHARAD: Light-Toned Deposits in crater bulges. AGU Fall Meeting Abstracts, pages B1371+, Dec. 2008. [14] A. P. Rossi, et al. Stratigraphic architecture and structural control on sediment emplacement in Becquerel crater (Mars). Volume 40, Lunar and Planetary Science Institute, 2009. [15] K. L. Tanaka, et al. North polar region of Mars: Advances in stratigraphy, structure, and erosional modification. Icarus, AUG 2008. [16] USGS. Planetary image processing software: ISIS3. http://isis.astrogeology.usgs.gov/
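The pipeline above depends on tiepoint co-registration at subpixel precision. As a generic illustration of that idea (normalized cross-correlation with parabolic peak refinement, not the actual matcher referenced in [5,9]), a single tiepoint could be matched as in the following sketch.

    # Generic subpixel tiepoint matching by normalized cross-correlation (NCC).
    # Illustrative only; the pipeline described above uses its own matcher [5,9].
    import numpy as np

    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    def match_tiepoint(ref, tgt, r, c, half=7, search=10):
        """Offset of the patch centered at (r, c) of `ref` within `tgt`."""
        patch = ref[r - half:r + half + 1, c - half:c + half + 1]
        scores = np.full((2 * search + 1, 2 * search + 1), -1.0)
        for dr in range(-search, search + 1):
            for dc in range(-search, search + 1):
                win = tgt[r + dr - half:r + dr + half + 1,
                          c + dc - half:c + dc + half + 1]
                if win.shape == patch.shape:
                    scores[dr + search, dc + search] = ncc(patch, win)
        pr, pc = np.unravel_index(np.argmax(scores), scores.shape)

        def parabolic(sm, s0, sp):   # 1-D parabolic refinement of a sampled peak
            d = sm - 2 * s0 + sp
            return 0.5 * (sm - sp) / d if d != 0 else 0.0

        sub_r = parabolic(scores[pr - 1, pc], scores[pr, pc], scores[pr + 1, pc]) \
            if 0 < pr < 2 * search else 0.0
        sub_c = parabolic(scores[pr, pc - 1], scores[pr, pc], scores[pr, pc + 1]) \
            if 0 < pc < 2 * search else 0.0
        # Subpixel (row, column) offset of the reference patch within the target image.
        return pr - search + sub_r, pc - search + sub_c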
Lambert, Mark; Bellamy, Fiona; Budgey, Richard; Callaby, Rebecca; Coats, Julia; Talling, Janet
2018-01-01
Indices of rodent activity are used as indicators of population change during field evaluations of rodenticides. We investigated the potential for using camera traps to determine activity indices for commensal rodents living in and around farm buildings, and sought to compare these indices against previously calibrated survey methods. We recorded 41,263 images of 23 species, including Norway rats (Rattus norvegicus Berk.) and house mice (Mus musculus L.). We found a positive correlation between activity indices from camera traps and activity indices from a method (footprint tracking) previously shown to have a linear relationship with population size for Norway rats. Filtering the camera trap data to simulate a 30-s delay between camera trigger events removed 59.9% of the data and did not adversely affect the correlation between activity indices from camera traps and footprint tracking. The relationship between activity indices from footprint tracking and Norway rat population size is known from a previous study; from this, we determined the relationship between activity indices from camera traps and population size for Norway rats living in and around farm buildings. Systematic use of camera traps thus yielded activity indices for Norway rats living in and around farm buildings that were positively correlated with those derived from a method previously calibrated against known population size for this species in this context. © 2017 Crown copyright. Pest Management Science © 2017 Society of Chemical Industry.
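The 30-s delay between camera trigger events described above can be simulated in post-processing by discarding, per camera, any image recorded less than 30 s after the previously retained image; a minimal sketch (with an assumed record format) follows.

    # Simulate a 30-second refractory period between camera-trap trigger events,
    # per camera, by discarding images that follow a retained image too quickly.
    # The (camera_id, datetime) record format is assumed for illustration.
    from datetime import datetime, timedelta

    def apply_trigger_delay(records, delay=timedelta(seconds=30)):
        """records: list of (camera_id, datetime); returns retained events."""
        retained, last_kept = [], {}
        for cam, t in sorted(records, key=lambda r: (r[0], r[1])):
            if cam not in last_kept or t - last_kept[cam] >= delay:
                retained.append((cam, t))
                last_kept[cam] = t
        return retained

    events = [("camA", datetime(2016, 5, 1, 22, 0, 0)),
              ("camA", datetime(2016, 5, 1, 22, 0, 12)),   # 12 s later -> dropped
              ("camA", datetime(2016, 5, 1, 22, 0, 45))]   # 45 s later -> kept
    print(len(apply_trigger_delay(events)))   # -> 2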
NASA Technical Reports Server (NTRS)
Karcz, J. S.; Bowling, D.; Cornelison, C.; Parrish, A.; Perez, A.; Raiche, G.; Wiens, J.-P.
2016-01-01
The Ames Vertical Gun Range (AVGR) is a national facility for conducting laboratory-scale investigations of high-speed impact processes. It provides a set of light-gas, powder, and compressed gas guns capable of accelerating projectiles to speeds up to 7 km/s. The AVGR has a unique capability to vary the angle between the projectile-launch and gravity vectors between 0 and 90 deg. The target resides in a large chamber (diameter approximately 2.5 m) that can be held at vacuum or filled with an experiment-specific atmosphere. The chamber provides a number of viewing ports and feed-throughs for data, power, and fluids. Impacts are observed via high-speed digital cameras along with investigation-specific instrumentation, such as spectrometers. Use of the range is available via grant proposals through any Planetary Science Research Program element of the NASA Research Opportunities in Space and Earth Sciences (ROSES) calls. Exploratory experiments (one to two days) are additionally possible in order to develop a new proposal.
NASA Technical Reports Server (NTRS)
Ostrogorsky, A.; Marin, C.; Volz, M. P.; Bonner, W. A.
2005-01-01
Solidification Using a Baffle in Sealed Ampoules (SUBSA) is the first investigation conducted in the Microgravity Science Glovebox (MSG) Facility at the International Space Station (ISS) Alpha. Eight single crystals of InSb, doped with Te and Zn, were directionally solidified in microgravity. The experiments were conducted in a furnace with a transparent gradient section and a video camera sending images to Earth. The real-time images (i) helped seeding and (ii) allowed a direct measurement of the solidification rate. The post-flight characterization of the crystals includes computed x-ray tomography, Secondary Ion Mass Spectroscopy (SIMS), Hall measurements, Atomic Absorption (AA), and four-point probe analysis. For the first time in microgravity, several crystals having nearly identical initial transients were grown. Reproducible initial transients were obtained with Te-doped InSb. Furthermore, the diffusion-controlled end-transient was demonstrated experimentally (SUBSA 02). From the initial transients, the diffusivity of Te and Zn in InSb was determined.
NASA Astrophysics Data System (ADS)
Vaidya, Ashwin; Munakata, Mika
2014-03-01
The Art of Science project at Montclair State University strives to communicate the creativity inherent in the sciences to students and the general public alike. The project uses connections between the arts and sciences to show the underlying unity and interdependence of the two. The project is planned as one big 'performance' bringing together the two disciplines around the theme of sustainability. In the first phase, physics students learned about and built human-powered generators, including hand cranks and bicycle units. In the second phase, using the generators to power video cameras, art students worked with a visiting artist to make short films on the subject of sustainability, science, and art. The generators and films were showcased at an annual university Physics and Art exhibition open to the university and local community. In the final phase, yet to be conducted, K-12 teachers will learn about the project through a professional development workshop and will be encouraged to adapt the experiment for their own classrooms. The last phase will also combine the university and K-12 projects for an exhibition to be displayed on Earth Day, 2014. Project funded by the APS Outreach Grant.
Pettit runs a drill while looking through a camera mounted on the Nadir window in the U.S. Lab
2003-04-05
ISS006-E-44305 (5 April 2003) --- Astronaut Donald R. Pettit, Expedition Six NASA ISS science officer, runs a drill while looking through a camera mounted on the nadir window in the Destiny laboratory on the International Space Station (ISS). The device is called a barn door tracker. The drill turns the screw, which moves the camera and its spotting scope.
The Citizen CATE Experiment for the 2017 Total Solar Eclipse
NASA Astrophysics Data System (ADS)
Penn, M. J.
2015-12-01
The path of the total solar eclipse of 21 August 2017 passes over about 10 million homes in the USA. Tens of millions more people will travel to the path of totality to view the eclipse first-hand. Through TV and internet broadcasts, hundreds of millions of people will watch the eclipse, making the event the most viewed astronomical event in the history of mankind. The Citizen Continental-America Telescopic Eclipse (CATE) Experiment for 2017 is being developed at the National Solar Observatory in partnership with universities, schools, astronomy clubs, and corporations. The CATE experiment will use more than 60 identical telescopes equipped with digital cameras positioned from Oregon to South Carolina to image the solar corona. The project will then splice these images together to show the corona during a 90-minute period, revealing for the first time the plasma dynamics of the inner solar corona. The goals for the highly leveraged CATE experiment are diverse and range from providing an authentic STEM research experience for students and lifelong learners, to making state-of-the-art solar coronal observations of the plasma dynamics of coronal polar plumes, to increasing US scientific literacy. A key goal of this experiment is to donate the telescope and camera system to the volunteer who collects data with it during the total eclipse. The instrument will then be used for a variety of follow-up citizen science projects in astronomy, ranging from solar to cometary to variable star observations. For this reason no government funding is being sought for the equipment costs; rather, private and corporate sources are being developed. The data collected for the 2017 eclipse will be freely available to the scientific, education and amateur astronomy communities. Crowd sourcing the data collection is an essential part of this project, as there are not enough solar physicists in this country to collect these observations. Finally, each site is expected to collect about 10 Gbytes of science data and 10 Gbytes of calibration data, resulting in about 1.2 Tbytes of data for the project.
ERIC Educational Resources Information Center
Ruiz, Michael J.
1982-01-01
The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.…
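The central relation such camera experiments exercise is the thin-lens equation, 1/f = 1/d_o + 1/d_i. A short numerical illustration is given below; the example values are assumed for illustration and are not taken from the article.

    # Thin-lens equation illustration for a single-lens reflex camera lens.
    # Example values are assumed for illustration; they are not from the article.
    def image_distance(f_mm, object_distance_mm):
        """Solve 1/f = 1/d_o + 1/d_i for the image distance d_i (thin-lens model)."""
        return 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)

    f = 50.0                      # 50 mm lens
    d_o = 2000.0                  # subject 2 m away
    d_i = image_distance(f, d_o)  # ~51.3 mm: lens moves ~1.3 mm beyond f to focus
    m = -d_i / d_o                # lateral magnification (negative: inverted image)
    print(round(d_i, 1), round(m, 4))   # -> 51.3 -0.0256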
Developing Short Films of Geoscience Research
NASA Astrophysics Data System (ADS)
Shipman, J. S.; Webley, P. W.; Dehn, J.; Harrild, M.; Kienenberger, D.; Salganek, M.
2015-12-01
Given today's prevalence of social media and networking, video products are becoming increasingly useful for communicating research quickly and effectively to a diverse audience, including outreach activities as well as within the research community and to funding agencies. Due to the observational nature of geoscience, researchers often take photos and video footage to document fieldwork or to record laboratory experiments. Here we present how researchers can become more effective storytellers by collaborating with filmmakers to produce short documentary films of their research. We will focus on the use of traditional high-definition (HD) camcorders and HD DSLR cameras to record the scientific story, while our research topic focuses on the use of remote sensing techniques, specifically thermal infrared imaging that is often used to analyze time-varying natural processes such as volcanic hazards. By capturing the story in the thermal infrared wavelength range, in addition to the traditional red-green-blue (RGB) color space, the audience is able to experience the world differently. We will develop a short film specifically designed using thermal infrared cameras that illustrates how visual storytellers can use these new tools to capture unique and important aspects of their research, convey their passion for earth systems science, and engage and captivate the viewer.
NASA Astrophysics Data System (ADS)
Waltham, N.; Beardsley, S.; Clapp, M.; Lang, J.; Jerram, P.; Pool, P.; Auker, G.; Morris, D.; Duncan, D.
2017-11-01
Solar Dynamics Observatory (SDO) is imaging the Sun in many wavelengths near simultaneously and with a resolution ten times higher than the average high-definition television. In this paper we describe our innovative systems approach to the design of the CCD cameras for two of SDO's remote sensing instruments, the Atmospheric Imaging Assembly (AIA) and the Helioseismic and Magnetic Imager (HMI). Both instruments share use of a custom-designed 16 million pixel science-grade CCD and common camera readout electronics. A prime requirement was for the CCD to operate with significantly lower drive voltages than before, motivated by our wish to simplify the design of the camera readout electronics. Here, the challenge lies in the design of circuitry to drive the CCD's highly capacitive electrodes and to digitize its analogue video output signal with low noise and to high precision. The challenge is greatly exacerbated when forced to work with only fully space-qualified, radiation-tolerant components. We describe our systems approach to the design of the AIA and HMI CCD and camera electronics, and the engineering solutions that enabled us to comply with both mission and instrument science requirements.
STS-42 MS/PLC Norman E. Thagard adjusts Rack 10 FES equipment in IML-1 module
1992-01-30
STS042-05-006 (22-30 Jan 1992) --- Astronaut Norman E. Thagard, payload commander, works with the Fluids Experiment System (FES) in the International Microgravity Laboratory (IML-1) science module. The FES is a NASA-developed facility that produces optical images of fluid flows during the processing of materials in space. The system's sophisticated optics consist of a laser to make holograms of samples and a video camera to record images of flows in and around samples. Thagard was joined by six fellow crewmembers for eight days of scientific research aboard Discovery in Earth orbit. Most of their on-duty time was spent in this IML-1 science module, positioned in the cargo bay and attached via a tunnel to Discovery's airlock.
Architecture of PAU survey camera readout electronics
NASA Astrophysics Data System (ADS)
Castilla, Javier; Cardiel-Sas, Laia; De Vicente, Juan; Illa, Joseph; Jimenez, Jorge; Maiorino, Marino; Martinez, Gustavo
2012-07-01
PAUCam is a new camera for studying the physics of the accelerating universe. The camera will consist of eighteen 2Kx4K HPK CCDs: sixteen for science and two for guiding. The camera will be installed at the prime focus of the WHT (William Herschel Telescope). In this contribution, the architecture of the readout electronics system is presented, and the Back-End and Front-End electronics are described. The Back-End consists of clock, bias and video processing boards mounted on Monsoon crates. The Front-End is based on patch panel boards. These boards are plugged in outside the camera feed-through panel for signal distribution. Inside the camera, individual preamplifier boards plus kapton cables complete the path to connect to each CCD. The overall signal distribution and grounding scheme is shown in this paper.
Multi-Purpose Crew Vehicle Camera Asset Planning: Imagery Previsualization
NASA Technical Reports Server (NTRS)
Beaulieu, K.
2014-01-01
Using JSC-developed and other industry-standard off-the-shelf 3D modeling, animation, and rendering software packages, the Image Science Analysis Group (ISAG) supports Orion Project imagery planning efforts through dynamic 3D simulation and realistic previsualization of ground-, vehicle-, and air-based camera output.
2001-05-02
John Henson (grade 12) and Suzi Bryce (grade 10) from DuPont Manual High School in Louisville, Kentucky, conduct a drop with NASA's Microgravity Demonstrator. A camera and a TV/VCR unit let students play back recordings of how different physical devices behave during freefall as compared to 1-g. The activity was part of the education outreach segment of the Pan-Pacific Basin Workshop on Microgravity Sciences held in Pasadena, California. The event originated at the California Science Center in Los Angeles. The DuPont Manual students patched in to the event through the distance learning lab at the Louisville Science Center. This image is from a digital still camera; higher resolution is not available.
NASA Technical Reports Server (NTRS)
Kuebert, E. J.
1977-01-01
A Laser Altimeter and Mapping Camera System was included in the Apollo Lunar Orbital Experiment Missions. The backup system, never used in the Apollo Program, is available for use in the Lidar Test Experiments on the STS Orbital Flight Tests 2 and 4. Studies were performed to assess the problems associated with installation and operation of the Mapping Camera System in the STS. They covered the photographic capabilities of the Mapping Camera System, its mechanical and electrical interfaces with the STS, documentation, operation and survivability in the expected environments, ground support equipment, and test and field support.
NASA Astrophysics Data System (ADS)
Muller, Jan-Peter; Tao, Yu; Sidiropoulos, Panagiotis; Gwinner, Klaus; Willner, Konrad; Fanara, Lida; Waehlisch, Marita; van Gasselt, Stephan; Walter, Sebastian; Steikert, Ralf; Schreiner, Bjoern; Ivanov, Anton; Cantini, Federico; Wardlaw, Jessica; Morley, Jeremy; Sprinks, James; Giordano, Michele; Marsh, Stuart; Kim, Jungrack; Houghton, Robert; Bamford, Steven
2016-06-01
Understanding planetary atmosphere-surface exchange and extra-terrestrial surface formation processes within our Solar System is one of the fundamental goals of planetary science research. There has been a revolution in planetary surface observations over the last 15 years, especially in 3D imaging of surface shape. This has led to the ability to overlay image data and derived information from different epochs, back in time to the mid 1970s, to examine changes through time, such as the recent discovery of mass movement, the tracking of inter-year seasonal changes, and the search for fresh craters. Within the EU FP-7 iMars project, we have developed a fully automated multi-resolution DTM processing chain, called the Coregistration ASP-Gotcha Optimised (CASP-GO) chain, based on the open source NASA Ames Stereo Pipeline (ASP) [Tao et al., this conference], which is being applied to the production of planetwide DTMs and ORIs (OrthoRectified Images) from CTX and HiRISE. Alongside the production of individual strip CTX and HiRISE DTMs and ORIs, DLR [Gwinner et al., 2015] have processed HRSC mosaics of ORIs and DTMs for complete areas in a consistent manner using photogrammetric bundle block adjustment techniques. A novel automated co-registration and orthorectification chain has been developed by [Sidiropoulos & Muller, this conference]. Using the HRSC map products (both mosaics and orbital strips) as a map-base, it is being applied to many of the 400,000 level-1 EDR images taken by four NASA orbital cameras: the Viking Orbiter camera (VO), the Mars Orbiter Camera (MOC), the Context Camera (CTX), and the High Resolution Imaging Science Experiment (HiRISE), extending back to 1976. A webGIS has been developed [van Gasselt et al., this conference] for displaying this time sequence of imagery and will be demonstrated with an example from one of the HRSC quadrangle map-sheets. Automated quality control techniques [Sidiropoulos & Muller, 2015] are applied to screen for suitable images, and these are extended to detect temporal changes in surface features such as mass movements, streaks, spiders, impact craters, CO2 geysers and Swiss Cheese terrain. For verification of the results, these data mining techniques are then being employed within a citizen science project in the Zooniverse family. Examples of data mining and its verification will be presented.
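A simple baseline for the kind of change detection the chain above automates is per-pixel differencing of two co-registered orthoimages after a crude radiometric normalization; the sketch below is generic and is not the iMars data-mining algorithm.

    # Baseline change flagging between two co-registered, same-size orthoimages.
    # Illustrative only; the iMars project uses its own, more sophisticated data mining.
    import numpy as np

    def change_mask(epoch1, epoch2, k=3.0):
        """Boolean mask of pixels whose normalized difference exceeds k standard
        deviations of the difference image (a simple global threshold)."""
        a = (epoch1 - epoch1.mean()) / epoch1.std()    # crude radiometric normalization
        b = (epoch2 - epoch2.mean()) / epoch2.std()
        diff = b - a
        return np.abs(diff - diff.mean()) > k * diff.std()

    # Example with synthetic data: a bright "fresh" feature appears in the second epoch.
    rng = np.random.default_rng(0)
    img1 = rng.normal(100.0, 5.0, (200, 200))
    img2 = img1 + rng.normal(0.0, 1.0, (200, 200))
    img2[90:110, 90:110] += 40.0                        # simulated new surface feature
    print(change_mask(img1, img2).sum())                # flagged pixels concentrate in that box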
Wilderness experience in Rocky Mountain National Park 2002; report to respondents
Schuster, Elke; Johnson, S. Shea; Taylor, Jonathan G.
2003-01-01
A substantial amount of backcountry (about 250,000 acres) in Rocky Mountain National Park [RMNP or the Park] may be designated as wilderness areas in the coming years. Currently, over 3 million visitors drive through the park on Trail Ridge Road, camp in designated campgrounds, day hike, etc. each year. Many of those visitors also report using the backcountry-wilderness areas that are not easily accessible by roads or trails. Use of the backcountry is growing at RMNP and is accompanied by changing visitor expectations and preferences for wilderness management. For these reasons it is of great importance for the Park to periodically assess what types of environments and conditions wilderness users seek to facilitate a quality experience. To assist in this effort, the Political Analysis and Science Assistance [PSAS] program / Fort Collins Center / U.S. Geological Survey, in close collaboration with personnel and volunteers from RMNP, as well as the Natural Resource Recreation and Tourism [NRRT] Department at Colorado State University, launched a research effort in the summer of 2002 to investigate visitors' wilderness experiences in the Park. Specifically, the purpose of this research was: (1) to determine what constitutes a wilderness experience; (2) to identify important places, visual features, and sounds essential to a quality wilderness experience; and (3) to determine what aspects may detract from the wilderness experience. Thus, answers to these questions should provide insight for Park managers about visitors' expectations for wilderness recreation and the conditions they seek for quality wilderness experiences. Ultimately, this information can be used to support wilderness management decisions within RMNP. The social science technique of Visitor Employed Photography [VEP] was used to obtain information from visitors about wilderness experiences. Visitors were selected at random from Park-designated wilderness trails, in proportion to their use, and asked to participate in the survey. Respondents were given single-use, 10-exposure cameras and photo-log diaries to record experiences. A total of 293 cameras were distributed, with a response rate of 87%. Following the development of the photos, a copy of the photos, two pertinent pages from the photo-log, and a follow-up survey were mailed to respondents. Fifty-six percent of the follow-up surveys were returned. Findings from the two surveys were analyzed and compared…
Aoki, Hisae; Yamashita, Hiromasa; Mori, Toshiyuki; Fukuyo, Tsuneo; Chiba, Toshio
2014-11-01
We developed a new ultrahigh-sensitive CMOS camera using a specific sensor that has a wide range of spectral sensitivity characteristics. The objective of this study is to present our updated endoscopic technology, which successfully integrates two innovative functions: ultrasensitive imaging and advanced fluorescence viewing. Two different experiments were conducted. One was carried out to evaluate the function of the ultrahigh-sensitive camera; the other tested the availability of the newly developed sensor and its performance as a fluorescence endoscope. In both studies, the distance from the endoscope tip to the target was varied, and endoscopic images were taken at each setting for comparison. In the first experiment, the 3-CCD camera failed to display clear images under low illumination, and the target was hardly visible. In contrast, the CMOS camera was able to display the targets regardless of the camera-target distance under low illumination. Under high illumination, the image quality of the two cameras was very similar. In the second experiment, as a fluorescence endoscope, the CMOS camera was capable of clearly showing the fluorescence-activated organs. The ultrahigh-sensitivity CMOS HD endoscopic camera is expected to provide clear images under low illumination, in addition to fluorescence images under high illumination, in the field of laparoscopic surgery.
Science-Filters Study of Martian Rock Sees Hematite
2017-11-01
This false-color image demonstrates how use of special filters available on the Mast Camera (Mastcam) of NASA's Curiosity Mars rover can reveal the presence of certain minerals in target rocks. It is a composite of images taken through three "science" filters chosen for making hematite, an iron-oxide mineral, stand out as exaggerated purple. This target rock, called "Christmas Cove," lies in an area on Mars' "Vera Rubin Ridge" where Mastcam reconnaissance imaging (see PIA22065) with science filters suggested a patchy distribution of exposed hematite. Bright lines within the rocks are fractures filled with calcium sulfate minerals. Christmas Cove did not appear to contain much hematite until the rover team conducted an experiment on this target: Curiosity's wire-bristled brush, the Dust Removal Tool, scrubbed the rock, and a close-up with the Mars Hand Lens Imager (MAHLI) confirmed the brushing. The brushed area is about 2.5 inches (6 centimeters) across. The next day -- Sept. 17, 2017, on the mission's Sol 1819 -- this observation with Mastcam and others with the Chemistry and Camera (ChemCam) showed a strong hematite presence that had been subdued beneath the dust. The team is continuing to explore whether the patchiness in the reconnaissance imaging results more from variations in the amount of dust cover than from variations in hematite content. Curiosity's Mastcam combines two cameras: one with a telephoto lens and the other with a wider-angle lens. Each camera has a filter wheel that can be rotated in front of the lens for a choice of eight different filters. One filter for each camera is clear to all visible light, for regular full-color photos, and another is specifically for viewing the Sun. Some of the other filters were selected to admit wavelengths of light that are useful for identifying iron minerals. Each of the filters used for this image admits light from a narrow band of wavelengths, extending only about 5 nanometers longer or shorter than the filter's central wavelength. Three observations are combined for this image, each through one of the filters centered at 751 nanometers (in the near-infrared part of the spectrum just beyond red light), 527 nanometers (green) and 445 nanometers (blue). Usual color photographs from digital cameras -- such as a Mastcam one of this same place (see PIA22067) -- also combine information from red, green and blue filtering, but the filters are in a microscopic grid in a "Bayer" filter array situated directly over the detector behind the lens, with wider bands of wavelengths. Mastcam's narrow-band filters used for this view help to increase spectral contrast, making blues bluer and reds redder, particularly with the processing used to boost contrast in each of the component images of this composite. Fine-grained hematite preferentially absorbs sunlight in the green portion of the spectrum, around 527 nanometers. That gives it the purple look from a combination of red and blue light reflected by the hematite and reaching the camera through the other two filters. https://photojournal.jpl.nasa.gov/catalog/PIA22066
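The caption above describes combining three narrow-band observations (751 nm, 527 nm, and 445 nm) into a contrast-stretched false-color composite. A hedged sketch of that general kind of processing is below; it is not the Mastcam team's calibration pipeline, and the single-band input file names are hypothetical.

```python
# Sketch: build a contrast-stretched false-color composite from three
# narrow-band images, mapping 751 nm -> red, 527 nm -> green, 445 nm -> blue.
# Per-band percentile stretching exaggerates spectral contrast, as described above.
import numpy as np
from skimage import io

def stretch(band, lo_pct=1, hi_pct=99):
    """Linearly rescale a band so its 1st-99th percentiles span 0..1."""
    lo, hi = np.percentile(band, [lo_pct, hi_pct])
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

bands = [io.imread(f).astype(float) for f in
         ("mastcam_751nm.tif", "mastcam_527nm.tif", "mastcam_445nm.tif")]  # hypothetical

composite = np.dstack([stretch(b) for b in bands])   # shape (rows, cols, 3)
io.imsave("false_color_composite.png", (composite * 255).astype(np.uint8))
```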
NASA Technical Reports Server (NTRS)
2006-01-01
This full-frame image from the High Resolution Imaging Science Experiment camera on NASA's Mars Reconnaissance Orbiter shows faults and pits in Mars' north polar residual cap that have not been previously recognized. The faults and depressions between them are similar to features seen on Earth where the crust is being pulled apart. Such tectonic extension must have occurred very recently because the north polar residual cap is very young, as indicated by the paucity of impact craters on its surface. Alternatively, the faults and pits may be caused by collapse due to removal of material beneath the surface. The pits are aligned along the faults, either because material has drained into the subsurface along the faults or because gas has escaped from the subsurface through them. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The High Resolution Imaging Science Experiment is operated by the University of Arizona, Tucson, and the instrument was built by Ball Aerospace and Technology Corp., Boulder, Colo.
CUVE - Cubesat UV Experiment: Unveil Venus' UV Absorber with Cubesat UV Mapping Spectrometer
NASA Astrophysics Data System (ADS)
Cottini, V.; Aslam, S.; D'Aversa, E.; Glaze, L.; Gorius, N.; Hewagama, T.; Ignatiev, N.; Piccioni, G.
2017-09-01
Our Venus mission concept Cubesat UV Experiment (CUVE) is one of ten proposals selected for funding by the NASA PSDS3 Program - Planetary Science Deep Space SmallSat Studies. The CUVE concept is to insert a CubeSat spacecraft into a Venusian orbit and perform remote sensing of the UV spectral region using a high-spectral-resolution point spectrometer to resolve UV molecular bands, observe nightglow, and characterize the unidentified main UV absorber. The UV spectrometer is complemented by an imaging UV camera with multiple bands in the main UV-absorber band range for contextual imaging. The CUVE science objectives are: the nature of the "unknown" UV absorber; the abundances and distributions of SO2 and SO at and above Venus's cloud tops and their correlation with the UV absorber; the atmospheric dynamics at the cloud tops, the structure of the upper clouds, and wind measurements from cloud tracking; and the nightglow emissions of NO, CO, and O2. This mission would therefore be an excellent platform to study Venus' cloud-top atmospheric properties, where UV absorption drives the planet's energy balance. CUVE would complement past, current, and future Venus missions with conventional spacecraft, and address critical science questions cost-effectively.
CATE 2016 Indonesia: Camera, Software, and User Interface
NASA Astrophysics Data System (ADS)
Kovac, S. A.; Jensen, L.; Hare, H. S.; Mitchell, A. M.; McKay, M. A.; Bosh, R.; Watson, Z.; Penn, M.
2016-12-01
The Citizen Continental-America Telescopic Eclipse (Citizen CATE) Experiment will use a fleet of 60 identical telescopes across the United States to image the inner solar corona during the 2017 total solar eclipse. As a proof of concept, five sites were hosted along the path of totality during the 2016 total solar eclipse in Indonesia. Tanjung Pandan, Belitung, Indonesia was the first site to experience totality. This site had the best seeing conditions and focus, resulting in the highest quality images, and it demonstrated that the chosen equipment is capable of recording high-quality images of the solar corona. Because 60 sites will be funded, each setup needs to be cost-effective. This requires an inexpensive camera, which consequently has a small dynamic range. To compensate for the corona's intensity drop-off factor of 1,000, images are taken at seven frames per second at exposures of 0.4 ms, 1.3 ms, 4.0 ms, 13 ms, 40 ms, 130 ms, and 400 ms. Using MATLAB software, we are able to capture a high dynamic range with an Arduino that controls the 2448 x 2048 CMOS camera. A major component of this project is training average citizens to use the software, so it needs to be as user-friendly as possible. The CATE team is currently working with MathWorks to create a graphical user interface (GUI) that will make data collection run smoothly. This interface will include tabs for alignment, focus, calibration data, drift data, GPS, totality, and a quick-look function. This work was made possible through the National Solar Observatory Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation (NSF). The NSO Training for 2017 Citizen CATE Experiment, funded by NASA (NASA NNX16AB92A), also provided support for this project. The National Solar Observatory is operated by the Association of Universities for Research in Astronomy, Inc. (AURA) under a cooperative agreement with the NSF.
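The abstract above describes covering the corona's factor-of-1,000 brightness range by bracketing seven exposures per sequence; the team's actual merging is done in MATLAB and is not reproduced here. As a hedged illustration of one common way to merge such brackets (written in Python for consistency with the other examples in this document), the sketch below averages exposure-normalized pixel values over the exposures in which each pixel is neither saturated nor lost in the noise floor; the file names, saturation level, and noise floor are assumptions.

```python
# Sketch: merge bracketed exposures (0.4-400 ms) into one high-dynamic-range
# frame by averaging each pixel's exposure-normalized values over the
# exposures in which it is neither saturated nor buried in read noise.
import numpy as np
from skimage import io

exposures_ms = [0.4, 1.3, 4.0, 13.0, 40.0, 130.0, 400.0]
frames = [io.imread(f"eclipse_{t}ms.tif").astype(float) for t in exposures_ms]  # hypothetical

SATURATED = 60000.0   # assumed ADC/full-well limit for a 16-bit camera
FLOOR = 200.0         # assumed noise floor in counts

num = np.zeros_like(frames[0])
den = np.zeros_like(frames[0])
for frame, t in zip(frames, exposures_ms):
    good = (frame > FLOOR) & (frame < SATURATED)   # usable pixels in this exposure
    num[good] += frame[good] / t                   # counts per millisecond
    den[good] += 1.0

hdr = np.where(den > 0, num / np.maximum(den, 1), 0.0)  # HDR radiance map (counts/ms)
```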
German activities in optical space instrumentation
NASA Astrophysics Data System (ADS)
Hartmann, G.
2018-04-01
In the years of space exploration since the mid-sixties, extensive experience in optical space instrumentation has been developed in Germany. This experience ranges from large telescopes in the 1 m and larger category, with the accompanying focal-plane detectors and spectrometers for all regimes of the electromagnetic spectrum (infrared, visible, ultraviolet, X-rays), to miniature cameras for cometary and planetary exploration. The technologies originally developed for space science are now also utilized in the fields of Earth observation and even optical telecommunication. The presentation will cover all these areas, with examples of specific technological or scientific highlights. Special emphasis will be given to the current state-of-the-art instrumentation technologies in scientific institutions and industry, and to the future perspective in approved and planned projects.
Explosives Instrumentation Group Trial 6/77-Propellant Fire Trials (Series Two).
1981-10-01
frames/s. A 19 mm Sony U-Matic video cassette recorder (VCR) and camera were used to view the hearth from a tower 100 m from ground-zero (GZ). Normal...camera started. This procedure permitted increased recording time of the event. A 19 mm Sony U-Matic VCR and camera was used to view the container...
A hidden view of wildlife conservation: How camera traps aid science, research and management
O'Connell, Allan F.
2015-01-01
Camera traps — remotely activated cameras with infrared sensors — first gained measurable popularity in wildlife conservation in the early 1990s. Today, they’re used for a variety of activities, from species-specific research to broad-scale inventory or monitoring programs that, in some cases, attempt to detect biodiversity across vast landscapes. As this modern tool continues to evolve, it’s worth examining its uses and benefits for wildlife management and conservation.
1996-01-01
used to locate and characterize a magnetic dipole source, and this finding accelerated the development of superconducting tensor gradiometers for... superconducting magnetic field gradiometer, two-color infrared camera, synthetic aperture radar, and a visible spectrum camera. The combination of these...
The prototype cameras for trans-Neptunian automatic occultation survey
NASA Astrophysics Data System (ADS)
Wang, Shiang-Yu; Ling, Hung-Hsu; Hu, Yen-Sang; Geary, John C.; Chang, Yin-Chang; Chen, Hsin-Yo; Amato, Stephen M.; Huang, Pin-Jie; Pratlong, Jerome; Szentgyorgyi, Andrew; Lehner, Matthew; Norton, Timothy; Jorden, Paul
2016-08-01
The Transneptunian Automated Occultation Survey (TAOS II) is a three-telescope robotic project to detect stellar occultation events generated by Trans-Neptunian Objects (TNOs). The TAOS II project aims to monitor about 10,000 stars simultaneously at 20 Hz to achieve a statistically significant event rate. The TAOS II camera is designed to cover the 1.7-degree-diameter field of view of the 1.3 m telescope with a mosaic of 10 CMOS sensors of 4.5k×2k pixels each. The new CMOS sensor (CIS 113) has a back-illuminated, thinned structure and high sensitivity, providing performance similar to that of back-illuminated thinned CCDs. Because of the requirements of high performance and high speed, development of the new CMOS sensor is still in progress. Before the science arrays are delivered, a prototype camera has been developed to help with the commissioning of the robotic telescope system. The prototype camera uses the small-format e2v CIS 107 device but with the same dewar and similar control electronics as the TAOS II science camera. The sensors, mounted on a single Invar plate, are cooled by a cryogenic cooler to the same operating temperature as the science array, about 200 K. The Invar plate is connected to the dewar body through a supporting ring with three G10 bipods. The control electronics consist of an analog part and a Xilinx FPGA-based digital circuit. One FPGA is needed to control and process the signal from each CMOS sensor for 20 Hz region-of-interest (ROI) readout.
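The TAOS II design reads out small regions of interest around target stars at 20 Hz rather than full frames. As a hedged illustration of what that ROI data flow supports downstream (not the project's actual pipeline), the sketch below cuts postage-stamp windows around catalogued star positions in each frame and accumulates simple aperture light curves, in which an occultation would appear as a brief dip; the window size and the synthetic frames are assumptions.

```python
# Sketch: extract fixed regions of interest (ROIs) around known star positions
# from a stream of 20 Hz frames and build aperture-sum light curves, in which a
# TNO occultation would show up as a short dip lasting a few frames.
import numpy as np

HALF = 8  # ROI half-width in pixels (assumed)

def roi_flux(frame, x, y, half=HALF):
    """Background-subtracted aperture sum in a (2*half+1)^2 window."""
    win = frame[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    background = np.median(win)
    return float(np.sum(win - background))

def light_curves(frames, star_xy):
    """frames: iterable of 2-D arrays; star_xy: list of (x, y) pixel positions."""
    curves = {xy: [] for xy in star_xy}
    for frame in frames:
        for xy in star_xy:
            curves[xy].append(roi_flux(frame, *xy))
    return {xy: np.array(c) for xy, c in curves.items()}

# Example with synthetic data: 200 frames of noise plus two fake stars.
rng = np.random.default_rng(0)
frames = rng.normal(100.0, 5.0, size=(200, 256, 256))
for f in frames:
    f[120, 60] += 5000.0   # star 1 at (x=60, y=120)
    f[200, 180] += 3000.0  # star 2 at (x=180, y=200)
curves = light_curves(frames, [(60, 120), (180, 200)])
```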
ERIC Educational Resources Information Center
Vollmer, Michael; Mollmann, Klaus-Peter
2011-01-01
A selection of hands-on experiments from different fields of physics, which happen too fast for the eye or video cameras to properly observe and analyse the phenomena, is presented. They are recorded and analysed using modern high speed cameras. Two types of cameras were used: the first were rather inexpensive consumer products such as Casio…
2009-10-06
NASA Conducts Airborne Science Aboard Zeppelin Airship: the airship is equipped with two imaging instruments enabling remote sensing and atmospheric science measurements not previously practical, a hyperspectral imager and a large-format camera mounted inside the Zeppelin nose fairing.
High Energy Replicated Optics to Explore the Sun: Hard X-Ray Balloon-Borne Telescope
NASA Technical Reports Server (NTRS)
Gaskin, Jessica; Apple, Jeff; StevensonChavis, Katherine; Dietz, Kurt; Holt, Marlon; Koehler, Heather; Lis, Tomasz; O'Connor, Brian; RodriquezOtero, Miguel; Pryor, Jonathan;
2013-01-01
Set to fly in the Fall of 2013 from Ft. Sumner, NM, the High Energy Replicated Optics to Explore the Sun (HEROES) mission is a collaborative effort between the NASA Marshall Space Flight Center and the Goddard Space Flight Center to upgrade an existing payload, the High Energy Replicated Optics (HERO) balloon-borne telescope, to make unique scientific measurements of the Sun and astrophysical targets during the same flight. The HEROES science payload consists of 8 mirror modules, housing a total of 109 grazing-incidence optics. These modules are mounted on a carbon-fiber and aluminum optical bench 6 m from a matching array of high-pressure xenon gas scintillation proportional counters, which serve as the focal-plane detectors. The HERO gondola uses a differential GPS system (backed by a magnetometer) for coarse pointing in azimuth, and a shaft angle encoder plus inclinometer provides the coarse elevation. The HEROES payload will incorporate a new solar aspect system to supplement the existing star camera, for fine pointing during both the day and night. A mechanical shutter will be added to the star camera to protect it during solar observations. HEROES will also implement two novel alignment monitoring systems that will measure the alignment between the optical bench and the star camera and between the optics and detectors, for improved pointing and post-flight data reconstruction. The overall payload will also be discussed. This mission is funded by the NASA HOPE (Hands On Project Experience) Training Opportunity awarded by the NASA Academy of Program/Project and Engineering Leadership, in partnership with NASA's Science Mission Directorate, Office of the Chief Engineer and Office of the Chief Technologist.
A Mars Rover Mission Simulation on Kilauea Volcano
NASA Technical Reports Server (NTRS)
Stoker, Carol; Cuzzi, Jeffery N. (Technical Monitor)
1995-01-01
A field experiment to simulate a rover mission on Mars was performed using the Russian Marsokhod rover deployed on Kilauea Volcano, HI, in February 1995. The Russian Marsokhod rover chassis was equipped with American avionics equipment, stereo cameras on a pan-and-tilt platform, a digital high-resolution body-mounted camera, and a manipulator arm on which was mounted a camera with a close-up lens. The six-wheeled rover is 2 meters long and has a mass of 120 kg. The imaging system was designed to simulate that used on the planned "Mars Together" mission. The rover was operated from NASA Ames by a team of planetary geologists and exobiologists. Two modes of mission operations were simulated for three days each: (1) long time delay, low data bandwidth (simulating a Mars mission), and (2) live video, wide-bandwidth data (allowing active control, simulating a Lunar rover mission or a Mars rover mission controlled from on or near the Martian surface). Simulated descent images (aerial photographs) were used to plan traverses to address a detailed set of science questions. The actual route taken was determined by the science team, and the traverse path was frequently changed in response to the data acquired and to unforeseen operational issues. Traverses were thereby optimized to efficiently answer scientific questions. During the Mars simulation, the rover traversed a distance of 800 m. Based on the time delay between Earth and Mars, we estimate that the same operation would have taken 30 days to perform on Mars. This paper will describe the mission simulation and make recommendations about incorporating rovers into the Mars Surveyor program.
Slight Blurring in Newer Image from Mars Orbiter
2018-02-09
These two frames were taken of the same place on Mars by the same orbiting camera before (left) and after some images from the camera began showing unexpected blur. The images are from the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. They show a patch of ground about 500 feet (150 meters) wide in Gusev Crater. The one on the left, from HiRISE observation ESP_045173_1645, was taken March 16, 2016. The one on the right was taken Jan. 9, 2018. Gusev Crater, at 15 degrees south latitude and 176 degrees east longitude, was the landing site of NASA's Spirit Mars rover in 2004 and is a candidate landing site for a rover to be launched in 2020. HiRISE images provide important information for evaluating potential landing sites. The smallest boulders with measurable diameters in the left image are about 3 feet (90 centimeters) wide. In the blurred image, the smallest measurable boulders are about double that width. As of early 2018, most full-resolution images from HiRISE are not blurred, and the cause of the blur is still under investigation. Even before blurred images were first seen in 2017, observations with HiRISE commonly used a technique that covers more ground area at half the resolution. This still shows features smaller than can be distinguished with any other camera orbiting Mars, and little blurring has appeared in these images. https://photojournal.jpl.nasa.gov/catalog/PIA22215
More than Meets the Eye - Infrared Cameras in Open-Ended University Thermodynamics Labs
NASA Astrophysics Data System (ADS)
Melander, Emil; Haglund, Jesper; Weiszflog, Matthias; Andersson, Staffan
2016-12-01
Educational research has found that students have challenges understanding thermal science. Undergraduate physics students have difficulties differentiating basic thermal concepts, such as heat, temperature, and internal energy. Engineering students have been found to have difficulties grasping surface emissivity as a thermal material property. One potential source of students' challenges with thermal science is the lack of opportunity to visualize energy transfer in intuitive ways with traditional measurement equipment. Thermodynamics laboratories have typically depended on point measures of temperature by use of thermometers (detecting heat conduction) or pyrometers (detecting heat radiation). In contrast, thermal imaging by means of an infrared (IR) camera provides a real-time, holistic image. Here we provide some background on IR cameras and their uses in education, and summarize five qualitative investigations that we have used in our courses.
ERIC Educational Resources Information Center
Squibb, Matt
2009-01-01
This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)
2001-01-24
The potential for investigating combustion at the limits of flammability, and the implications for spacecraft fire safety, led to the Structures Of Flame Balls At Low Lewis-number (SOFBALL) experiment flown twice aboard the Space Shuttle in 1997. The success there led to a reflight on the STS-107 Research 1 mission planned for 2002. Shown here are video frames captured during the Microgravity Sciences Lab-1 mission in 1997. Flame balls are intrinsically dim, thus requiring the use of image intensifiers on video cameras. The principal investigator is Dr. Paul Ronney of the University of Southern California, Los Angeles. Glenn Research Center in Cleveland, OH, manages the project.
Baby Spider: Growth of a Martian Trough Network
2016-12-20
This image from NASA's Mars Reconnaissance Orbiter shows the growth of a branching network of troughs carved by thawing carbon dioxide over the span of three Martian years. This process is believed to also form the larger, radially patterned channel features known as Martian "spiders." The image is one of three taken by the High Resolution Imaging Science Experiment (HiRISE) camera on the orbiter and included in a report on the first detection of such troughs persisting and growing from one Mars year to the next. An animation is available at http://photojournal.jpl.nasa.gov/catalog/PIA21257
NASA Technical Reports Server (NTRS)
2001-01-01
Interior of the Equipment Module for the Laminar Soot Processes (LSP-2) experiment that will fly on the STS-107 Research 1 mission in 2002 (LSP-1 flew on the Microgravity Sciences Lab-1 mission in 1997). The principal investigator is Dr. Gerard Faeth of the University of Michigan. LSP uses a small jet burner (yellow ellipse), similar to a classroom butane lighter, that produces flames up to 60 mm (2.3 in) long. Measurements include color TV camera images, readings from a radiometer or heat sensor (blue circle), and laser images whose darkness indicates the quantity of soot produced in the flame. Glenn Research Center in Cleveland, OH, manages the project.
NASA Technical Reports Server (NTRS)
2001-01-01
Image of a soot (smoke) plume made for the Laminar Soot Processes (LSP) experiment during the Microgravity Sciences Lab-1 mission in 1997. LSP-2 will fly on the STS-107 Research 1 mission in 2002. The principal investigator is Dr. Gerard Faeth of the University of Michigan. LSP uses a small jet burner, similar to a classroom butane lighter, that produces flames up to 60 mm (2.3 in) long. Measurements include color TV camera images, readings from a temperature sensor, and laser images whose darkness indicates the quantity of soot produced in the flame. Glenn Research Center in Cleveland, OH, manages the project.
Imaging experiment: The Viking Lander
Mutch, T.A.; Binder, A.B.; Huck, F.O.; Levinthal, E.C.; Morris, E.C.; Sagan, C.; Young, A.T.
1972-01-01
The Viking Lander Imaging System will consist of two identical facsimile cameras. Each camera has a high-resolution mode with an instantaneous field of view of 0.04°, and survey and color modes with instantaneous fields of view of 0.12°. Cameras are positioned one meter apart to provide stereoscopic coverage of the near field. The Imaging Experiment will provide important information about the morphology, composition, and origin of the Martian surface and atmospheric features. In addition, lander pictures will provide supporting information for other experiments in biology, organic chemistry, meteorology, and physical properties. © 1972.
SOFIA science instruments: commissioning, upgrades and future opportunities
NASA Astrophysics Data System (ADS)
Smith, Erin C.; Miles, John W.; Helton, L. Andrew; Sankrit, Ravi; Andersson, B. G.; Becklin, Eric E.; De Buizer, James M.; Dowell, C. D.; Dunham, Edward W.; Güsten, Rolf; Harper, Doyal A.; Herter, Terry L.; Keller, Luke D.; Klein, Randolf; Krabbe, Alfred; Logsdon, Sarah; Marcum, Pamela M.; McLean, Ian S.; Reach, William T.; Richter, Matthew J.; Roellig, Thomas L.; Sandell, Göran; Savage, Maureen L.; Temi, Pasquale; Vacca, William D.; Vaillancourt, John E.; Van Cleve, Jeffrey E.; Young, Erick T.
2014-07-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is the world's largest airborne observatory, featuring a 2.5 meter effective aperture telescope housed in the aft section of a Boeing 747SP aircraft. SOFIA's current instrument suite includes: FORCAST (Faint Object InfraRed CAmera for the SOFIA Telescope), a 5-40 μm dual-band imager/grism spectrometer developed at Cornell University; HIPO (High-speed Imaging Photometer for Occultations), a 0.3-1.1 μm imager built by Lowell Observatory; GREAT (German Receiver for Astronomy at Terahertz Frequencies), a multichannel heterodyne spectrometer from 60-240 μm, developed by a consortium led by the Max Planck Institute for Radio Astronomy; FLITECAM (First Light Infrared Test Experiment CAMera), a 1-5 μm wide-field imager/grism spectrometer developed at UCLA; FIFI-LS (Far-Infrared Field-Imaging Line Spectrometer), a 42-200 μm IFU grating spectrograph completed by the University of Stuttgart; and EXES (Echelon-Cross-Echelle Spectrograph), a 5-28 μm high-resolution spectrometer designed at the University of Texas and being completed by UC Davis and NASA Ames Research Center. HAWC+ (High-resolution Airborne Wideband Camera) is a 50-240 μm imager that was originally developed at the University of Chicago as a first-generation instrument (HAWC), and is being upgraded at JPL to add polarimetry and new detectors developed at Goddard Space Flight Center (GSFC). SOFIA will continually update its instrument suite with new instrumentation, technology demonstration experiments and upgrades to the existing instrument suite. This paper details the current instrument capabilities and status, as well as the plans for future instrumentation.
2007-09-28
KENNEDY SPACE CENTER, FLA. -- In the Orbiter Processing Facility, members of the STS-122 crew look over cameras that will be used during the mission. From left are Mission Specialists Hans Schlegel and Rex Walheim. Schlegel represents the European Space Agency. The crew is at Kennedy Space Center to take part in a crew equipment interface test, which helps familiarize them with equipment and payloads for the mission. Among the activities standard to a CEIT are harness training, inspection of the thermal protection system and camera operation for planned extravehicular activities, or EVAs. The mission will carry and install the Columbus Lab, a multifunctional, pressurized laboratory that will be permanently attached to Node 2 of the space station to carry out experiments in materials science, fluid physics and biosciences, as well as to perform a number of technological applications. It is Europe’s largest contribution to the construction of the International Space Station and will support scientific and technological research in a microgravity environment. STS-122 is targeted for launch in December. Photo credit: NASA/Kim Shiflett
Flammability Limits of Gases Under Low Gravity Conditions
NASA Technical Reports Server (NTRS)
Strehlow, R. A.
1985-01-01
The purpose of this combustion science investigation is to determine the effect of zero, fractional, and super gravity on the flammability limits of a premixed methane air flame in a standard 51 mm diameter flammability tube and to determine, if possible, the fluid flow associated with flame passage under zero-g conditions and the density (and hence, temperature) profiles associated with the flame under conditions of incipient extinction. This is accomplished by constructing an appropriate apparatus for placement in NASA's Lewis Research Center Lear Jet facility and flying the prescribed g-trajectories while the experiment is being performed. Data is recorded photographically using the visible light of the flame. The data acquired is: (1) the shape and propagation velocity of the flame under various g-conditions for methane compositions that are inside the flammable limits, and (2) the effect of gravity on the limits. Real time accelerometer readings for the three orthogonal directions are displayed in full view of the cameras and the framing rate of the cameras is used to measure velocities.
NASA Astrophysics Data System (ADS)
Muraishi, Hiroshi; Hara, Hidetake; Abe, Shinji; Yokose, Mamoru; Watanabe, Takara; Takeda, Tohoru; Koba, Yusuke; Fukuda, Shigekazu
2016-03-01
We have developed a heavy-ion computed tomography (IonCT) system using a scintillation screen and an electron-multiplying charge-coupled device (EMCCD) camera that can measure a large object such as a human head. In this study, the objective of developing the system was to investigate the possibility of applying it to heavy-ion treatment planning, from the point of view of spatial resolution in the reconstructed image. Experiments were carried out on a rotating phantom using 12C ions accelerated up to 430 MeV/u by the Heavy-Ion Medical Accelerator in Chiba (HIMAC) at the National Institute of Radiological Sciences (NIRS). We demonstrated that the reconstructed image of an object with a water-equivalent thickness (WET) of approximately 18 cm was successfully achieved with a spatial resolution of 1 mm, which would make this IonCT system worth applying to heavy-ion treatment planning for head and neck cancers.
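The abstract above reports reconstructing IonCT images from projections measured with the scintillator/EMCCD system, but does not state the reconstruction algorithm. As a generic, hedged illustration of the tomographic reconstruction step only (not the study's specific processing), the sketch below reconstructs a slice from a sinogram with filtered back-projection using scikit-image; the phantom and angle sampling are stand-ins.

```python
# Sketch: reconstruct one tomographic slice from a sinogram of projections
# using filtered back-projection. This is a generic CT step, not the specific
# IonCT processing used in the study above.
import numpy as np
from skimage.transform import iradon, radon
from skimage.data import shepp_logan_phantom

# Stand-in data: project a test phantom over 180 degrees, as a real scan of a
# rotating phantom would; each column of the sinogram is one projection angle.
phantom = shepp_logan_phantom()
angles = np.linspace(0.0, 180.0, 360, endpoint=False)
sinogram = radon(phantom, theta=angles)

# Filtered back-projection with a ramp filter.
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
```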
2007-09-28
KENNEDY SPACE CENTER, FLA. -- In the Orbiter Processing Facility, members of the STS-122 crew look over cameras that will be used during the mission. From left are Mission Specialists Stanley Love, Hans Schlegel and Rex Walheim and Pilot Alan Poindexter. The crew is at Kennedy Space Center to take part in a crew equipment interface test, which helps familiarize them with equipment and payloads for the mission. Among the activities standard to a CEIT are harness training, inspection of the thermal protection system and camera operation for planned extravehicular activities, or EVAs. The mission will carry and install the Columbus Lab, a multifunctional, pressurized laboratory that will be permanently attached to Node 2 of the space station to carry out experiments in materials science, fluid physics and biosciences, as well as to perform a number of technological applications. It is Europe’s largest contribution to the construction of the International Space Station and will support scientific and technological research in a microgravity environment. STS-122 is targeted for launch in December. Photo credit: NASA/Kim Shiflett
Lunar Satellite Snaps Image of Earth
2014-05-07
This image, captured Feb. 1, 2014, shows a colorized view of Earth from the moon-based perspective of NASA's Lunar Reconnaissance Orbiter. Credit: NASA/Goddard/Arizona State University -- NASA's Lunar Reconnaissance Orbiter (LRO) experiences 12 "earthrises" every day; however, LROC (short for LRO Camera) is almost always busy imaging the lunar surface, so only rarely does an opportunity arise for LROC to capture a view of Earth. On Feb. 1, 2014, LRO pitched forward while approaching the moon's north pole, allowing the LROC Wide Angle Camera to capture Earth rising above Rozhdestvenskiy crater (112 miles, or 180 km, in diameter). Read more: go.nasa.gov/1oqMlgu
Engineer's drawing of Skylab 4 Far Ultraviolet Electronographic camera
1973-11-19
S73-36910 (November 1973) --- An engineer's drawing of the Skylab 4 Far Ultraviolet Electronographic camera (Experiment S201). Arrows point to various features and components of the camera. As the Comet Kohoutek streams through space at speeds of 100,000 miles per hour, the Skylab 4 crewmen will use the S201 UV camera to photograph features of the comet not visible from the Earth's surface. While the comet is some distance from the sun, the camera will be pointed through the scientific airlock in the wall of the Skylab space station Orbital Workshop (OWS). By using a movable mirror system built for the Ultraviolet Stellar Astronomy (S019) Experiment and rotating the space station, the S201 camera will be able to photograph the comet around the side of the space station. Photo credit: NASA
NIRCam: Development and Testing of the JWST Near-Infrared Camera
NASA Technical Reports Server (NTRS)
Greene, Thomas; Beichman, Charles; Gully-Santiago, Michael; Jaffe, Daniel; Kelly, Douglas; Krist, John; Rieke, Marcia; Smith, Eric H.
2011-01-01
The Near Infrared Camera (NIRCam) is one of the four science instruments of the James Webb Space Telescope (JWST). Its high sensitivity, high spatial resolution images over the 0.6 - 5 microns wavelength region will be essential for making significant findings in many science areas as well as for aligning the JWST primary mirror segments and telescope. The NIRCam engineering test unit was recently assembled and has undergone successful cryogenic testing. The NIRCam collimator and camera optics and their mountings are also progressing, with a brass-board system demonstrating relatively low wavefront error across a wide field of view. The flight model's long-wavelength Si grisms have been fabricated, and its coronagraph masks are now being made. Both the short (0.6 - 2.3 microns) and long (2.4 - 5.0 microns) wavelength flight detectors show good performance and are undergoing final assembly and testing. The flight model subsystems should all be completed later this year through early 2011, and NIRCam will be cryogenically tested in the first half of 2011 before delivery to the JWST integrated science instrument module (ISIM).
Test Rover at JPL During Preparation for Mars Rover Low-Angle Selfie
2015-08-19
This view of a test rover at NASA's Jet Propulsion Laboratory, Pasadena, California, results from advance testing of arm positions and camera pointings for taking a low-angle self-portrait of NASA's Curiosity Mars rover. This rehearsal in California led to a dramatic Aug. 5, 2015, selfie of Curiosity, online at PIA19807. Curiosity's arm-mounted Mars Hand Lens Imager (MAHLI) camera took 92 component images that were assembled into that mosaic. The rover team positioned the camera lower in relation to the rover body than for any previous full self-portrait of Curiosity. This practice version was taken at JPL's Mars Yard in July 2013, using the Vehicle System Test Bed (VSTB) rover, which has a test copy of MAHLI on its robotic arm. MAHLI was built by Malin Space Science Systems, San Diego. JPL, a division of the California Institute of Technology in Pasadena, manages the Mars Science Laboratory Project for the NASA Science Mission Directorate, Washington. JPL designed and built the project's Curiosity rover. http://photojournal.jpl.nasa.gov/catalog/PIA19810
NASA Technical Reports Server (NTRS)
Volmer, Paul; Sullivan, Pam (Technical Monitor)
2003-01-01
The Advanced Camera for Surveys (ACS) was launched aboard the Space Shuttle Columbia just before dawn on March 1, 2002. After the shuttle successfully docked with the Hubble Space Telescope (HST), several components were replaced. One of those components was the Advanced Camera for Surveys, built by Ball Aerospace & Technologies Corp. (BATC) in Boulder, Colorado. Over the life of the HST contract at BATC, hundreds of employees had the pleasure of working on the concept, design, fabrication, assembly, and test of ACS. Those employees thank NASA - Goddard Space Flight Center and the science team at Johns Hopkins University (JHU) for the opportunity to participate in building a great science instrument for HST. After installation in HST, a mini-functional test was performed and later a complete functional test. ACS performed well and has continued to perform well since then. One of the greatest rewards for the BATC employees is a satisfied science team. Following is an excerpt from the JHU final report: "The foremost promise of ACS was to increase Hubble's capability for surveys in the near infrared by a factor of 10. That promise was kept."
Smart Cameras for Remote Science Survey
NASA Technical Reports Server (NTRS)
Thompson, David R.; Abbey, William; Allwood, Abigail; Bekker, Dmitriy; Bornstein, Benjamin; Cabrol, Nathalie A.; Castano, Rebecca; Estlin, Tara; Fuchs, Thomas; Wagstaff, Kiri L.
2012-01-01
Communication with remote exploration spacecraft is often intermittent and bandwidth is highly constrained. Future missions could use onboard science data understanding to prioritize downlink of critical features [1], draft summary maps of visited terrain [2], or identify targets of opportunity for follow-up measurements [3]. We describe a generic approach to classify geologic surfaces for autonomous science operations, suitable for parallelized implementations in FPGA hardware. We map these surfaces with texture channels - distinctive numerical signatures that differentiate properties such as roughness, pavement coatings, regolith characteristics, sedimentary fabrics and differential outcrop weathering. This work describes our basic image analysis approach and reports an initial performance evaluation using surface images from the Mars Exploration Rovers. Future work will incorporate these methods into camera hardware for real-time processing.
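The abstract above describes per-pixel "texture channels" fed to a surface classifier in a form amenable to FPGA implementation, without detailing the flight approach. As a hedged, much-simplified illustration of the texture-channel idea (not the authors' method), the sketch below builds a few simple channels (local standard deviation and gradient magnitude at two scales) and trains a scikit-learn random forest on assumed pixel labels.

```python
# Sketch: per-pixel texture channels plus a pixel classifier, a simplified
# stand-in for the onboard surface classification described in the abstract.
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def texture_channels(image):
    """Stack simple texture measures: local standard deviation and gradient
    magnitude, each computed at two smoothing scales (4 channels total)."""
    channels = []
    for sigma in (1.0, 4.0):
        smoothed = ndimage.gaussian_filter(image, sigma)
        local_mean = ndimage.uniform_filter(smoothed, size=9)
        local_sq = ndimage.uniform_filter(smoothed ** 2, size=9)
        local_std = np.sqrt(np.maximum(local_sq - local_mean ** 2, 0.0))
        grad_mag = ndimage.gaussian_gradient_magnitude(image, sigma)
        channels += [local_std, grad_mag]
    return np.dstack(channels)                      # (rows, cols, n_channels)

# `image` is a single-band surface image; `labels` is a same-shaped integer map
# of geologic classes for training pixels (0 = unlabeled). Both are stand-ins.
image = np.random.default_rng(1).random((128, 128))
labels = np.zeros(image.shape, dtype=int)
labels[:, :64] = 1
labels[:, 64:] = 2

features = texture_channels(image).reshape(-1, 4)
mask = labels.ravel() > 0
clf = RandomForestClassifier(n_estimators=50).fit(features[mask], labels.ravel()[mask])
class_map = clf.predict(features).reshape(image.shape)   # per-pixel class map
```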
ERIC Educational Resources Information Center
Felt, Wallace A.
2011-01-01
This qualitative case study of a rural high school examines the impact of technology tools on secondary science classrooms. Specifically, document cameras, student response systems, and probeware are examined for their effect on instructional practices in science classrooms where they are used. Observational data, student surveys, and teacher…
ERIC Educational Resources Information Center
Jeppsson, Fredrik; Frejd, Johanna; Lundmark, Frida
2017-01-01
This study focuses on investigating how students make use of their bodily experiences in combination with infrared (IR) cameras, as a way to make meaning in learning about heat, temperature, and friction. A class of 20 primary students (age 7-8 years), divided into three groups, took part in three IR camera laboratory experiments. The qualitative…
Who Goes There? Linking Remote Cameras and Schoolyard Science to Empower Action
ERIC Educational Resources Information Center
Tanner, Dawn; Ernst, Julie
2013-01-01
Taking Action Opportunities (TAO) is a curriculum that combines guided reflection, a focus on the local environment, and innovative use of wildlife technology to empower student action toward improving the environment. TAO is experientially based and uses remote cameras as a tool for schoolyard exploration. Through TAO, students engage in research…
Students' framing of laboratory exercises using infrared cameras
NASA Astrophysics Data System (ADS)
Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.
2015-12-01
Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the predict-observe-explain approach of White and Gunstone. The activities involved central thermal concepts that focused on heat conduction and dissipative processes such as friction and collisions. Students' interactions within each activity were videotaped and the analysis focuses on how a purposefully selected group of three students engaged with the exercises. As the basis for an interpretative study, a "thick" narrative description of the students' epistemological and conceptual framing of the exercises and how they took advantage of the disciplinary affordance of IR cameras in the thermal domain is provided. Findings include that the students largely shared their conceptual framing of the four activities, but differed among themselves in their epistemological framing, for instance, in how far they found it relevant to digress from the laboratory instructions when inquiring into thermal phenomena. In conclusion, the study unveils the disciplinary affordances of infrared cameras, in the sense of their use in providing access to knowledge about macroscopic thermal science.
ERIC Educational Resources Information Center
Jones, Alex D.
2010-01-01
As an amateur photographer and science teacher, the author shares how to use digital media to enhance the study of habitat and adaptation. The author also helps you to discover how to use the camera to boost science vocabulary, especially for English language learners. Through breathtaking photographs--the possibilities are endless, your students…
2005-05-02
This recent image of Titan reveals more complex patterns of bright and dark regions on the surface, including a small, dark, circular feature, completely surrounded by brighter material. During the two most recent flybys of Titan, on March 31 and April 16, 2005, Cassini captured a number of images of the hemisphere of Titan that faces Saturn. The image at the left is taken from a mosaic of images obtained in March 2005 (see PIA06222) and shows the location of the more recently acquired image at the right. The new image shows intriguing details in the bright and dark patterns near an 80-kilometer-wide (50-mile) crater seen first by Cassini's synthetic aperture radar experiment during a Titan flyby in February 2005 (see PIA07368) and subsequently seen by the imaging science subsystem cameras as a dark spot (center of the image at the left). Interestingly, a smaller, roughly 20-kilometer-wide (12-mile), dark and circular feature can be seen within an irregularly-shaped, brighter ring, and is similar to the larger dark spot associated with the radar crater. However, the imaging cameras see only brightness variations, and without topographic information, the identity of this feature as an impact crater cannot be conclusively determined from this image. The visual and infrared mapping spectrometer -- which is sensitive to longer wavelengths, where Titan's atmospheric haze is less obscuring -- observed this area simultaneously with the imaging cameras, so those data, and perhaps future observations by Cassini's radar, may help to answer the question of this feature's origin. The new image at the right consists of five images that have been added together and enhanced to bring out surface detail and to reduce noise, although some camera artifacts remain. These images were taken with the Cassini spacecraft narrow-angle camera using a filter sensitive to wavelengths of infrared light centered at 938 nanometers -- considered to be the imaging science subsystem's best spectral filter for observing the surface of Titan. This view was acquired from a distance of 33,000 kilometers (20,500 miles). The pixel scale of this image is 390 meters (0.2 miles) per pixel, although the actual resolution is likely to be several times larger. http://photojournal.jpl.nasa.gov/catalog/PIA06234
Noisy Ocular Recognition Based on Three Convolutional Neural Networks.
Lee, Min Beom; Hong, Hyung Gil; Park, Kang Ryoung
2017-12-17
In recent years, iris recognition systems have been gaining increasing acceptance for applications such as access control and smartphone security. When images of the iris are obtained under unconstrained conditions, their quality is degraded by optical and motion blur, off-angle view (the user's eyes looking somewhere other than straight into the camera), specular reflection (SR), and other factors. Such noisy iris images increase intra-individual variations and, as a result, reduce the accuracy of iris recognition. A typical iris recognition system requires a near-infrared (NIR) illuminator along with an NIR camera, which are larger and more expensive than fingerprint recognition equipment. Hence, many studies have proposed methods of using iris images captured by a visible-light camera without the need for an additional illuminator. In this research, we propose a new recognition method for noisy iris and ocular images by using one iris and two periocular regions, based on three convolutional neural networks (CNNs). Experiments were conducted using the noisy iris challenge evaluation-part II (NICE.II) training dataset (selected from the University of Beira Iris (UBIRIS).v2 database), the mobile iris challenge evaluation (MICHE) database, and the Institute of Automation, Chinese Academy of Sciences (CASIA)-Iris-Distance database. As a result, the method proposed by this study outperformed previous methods.
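The paper above fuses results from three CNNs operating on one iris and two periocular regions; the actual architectures and fusion rule are not reproduced here. As a hedged, much-simplified sketch of the three-branch score-fusion idea, the code below defines one small PyTorch CNN and averages the match scores from three such branches; the input sizes, layer widths, subject count, and averaging rule are assumptions, not the authors' design.

```python
# Sketch: three small CNN branches (iris, left periocular, right periocular),
# each producing class scores, fused by simple averaging. A toy stand-in for
# the three-CNN recognition method described above.
import torch
import torch.nn as nn

class SmallBranch(nn.Module):
    def __init__(self, n_classes):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

n_classes = 100  # assumed number of enrolled subjects
branches = [SmallBranch(n_classes) for _ in range(3)]  # iris + two periocular regions

# One grayscale crop per region (batch of 1); crop sizes are placeholders.
crops = [torch.randn(1, 1, 64, 64) for _ in range(3)]
scores = torch.stack([torch.softmax(b(c), dim=1) for b, c in zip(branches, crops)])
fused = scores.mean(dim=0)                 # score-level fusion by averaging
predicted_subject = fused.argmax(dim=1)    # best-matching enrolled subject
```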
NASA Astrophysics Data System (ADS)
Lee, Victor R.
2015-04-01
Biomechanics, and specifically the biomechanics associated with human movement, is a potentially rich backdrop against which educators can design innovative science teaching and learning activities. Moreover, the technologies associated with biomechanics research, such as high-speed cameras that can produce high-quality slow-motion video, can be deployed in ways that support students' participation in practices of scientific modeling. As participants in a classroom design experiment, fifteen fifth-grade students worked with high-speed cameras and stop-motion animation software (SAM Animation) over several days to produce dynamic models of motion and body movement. The designed series of learning activities involved iterative cycles of animation creation and critique and the use of various depictive materials. Subsequent analysis of flipbooks of human jumping movements created by the students at the beginning and end of the unit revealed a significant improvement in the epistemic fidelity of students' representations. Excerpts from classroom observations highlight the role the teacher plays in supporting students' thoughtful reflection on and attention to slow-motion video. In total, this design and research intervention demonstrates that the combination of technologies, activities, and teacher support can lead to improvements in some of the foundations associated with students' modeling.
NASA Astrophysics Data System (ADS)
Gulick, Ginny
2009-09-01
We report on the accomplishments of the HiRISE EPO program over the last two and a half years of science operations. We have focused primarily on delivering high-impact science opportunities through our various participatory science and citizen science websites. Uniquely, we have invited students from around the world to become virtual HiRISE team members by submitting target suggestions via our HiRISE Quest Image challenges, using HiWeb, the team's web-based image suggestion tool. When images are acquired, students analyze their returned images, write a report, and work with a HiRISE team member to write an image caption for release on the HiRISE website (http://hirise.lpl.arizona.edu). Another E/PO highlight has been our citizen scientist effort, HiRISE Clickworkers (http://clickworkers.arc.nasa.gov/hirise). Clickworkers enlists volunteers to identify geologic features (e.g., dunes, craters, wind streaks, gullies, etc.) in the HiRISE images and help generate searchable image databases. In addition, the large image sizes and incredible spatial resolution of the HiRISE camera can tax even the most capable computers, so we have also focused on enabling typical users to browse, pan, and zoom the HiRISE images using our HiRISE online image viewer (http://marsoweb.nas.nasa.gov/HiRISE/hirise_images/). Our educational materials available on the HiRISE EPO web site (http://hirise.seti.org/epo) include an assortment of K-through-college-level, standards-based activity books, a K-through-3 coloring/story book, a middle-school-level comic book, and several interactive educational games, including Mars jigsaw puzzles, crosswords, word searches, and flash cards.
MagAO: Status and on-sky performance of the Magellan adaptive optics system
NASA Astrophysics Data System (ADS)
Morzinski, Katie M.; Close, Laird M.; Males, Jared R.; Kopon, Derek; Hinz, Phil M.; Esposito, Simone; Riccardi, Armando; Puglisi, Alfio; Pinna, Enrico; Briguglio, Runa; Xompero, Marco; Quirós-Pacheco, Fernando; Bailey, Vanessa; Follette, Katherine B.; Rodigas, T. J.; Wu, Ya-Lin; Arcidiacono, Carmelo; Argomedo, Javier; Busoni, Lorenzo; Hare, Tyson; Uomoto, Alan; Weinberger, Alycia
2014-07-01
MagAO is the new adaptive optics system with visible-light and infrared science cameras, located on the 6.5-m Magellan "Clay" telescope at Las Campanas Observatory, Chile. The instrument locks on natural guide stars (NGS) from 0th to 16th R-band magnitude, measures turbulence with a modulating pyramid wavefront sensor binnable from 28×28 to 7×7 subapertures, and uses a 585-actuator adaptive secondary mirror (ASM) to provide flat wavefronts to the two science cameras. MagAO is a mutated clone of the similar AO systems at the Large Binocular Telescope (LBT) at Mt. Graham, Arizona. The high-level AO loop controls up to 378 modes and operates at frame rates up to 1000 Hz. The instrument has two science cameras: VisAO operating from 0.5-1 μm and Clio2 operating from 1-5 μm. MagAO was installed in 2012 and successfully completed two commissioning runs in 2012-2013. In April 2014 we had our first science run that was open to the general Magellan community. Observers from Arizona, Carnegie, Australia, Harvard, MIT, Michigan, and Chile took observations in collaboration with the MagAO instrument team. Here we describe the MagAO instrument, describe our on-sky performance, and report our status as of summer 2014.
Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography
Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.
1972-01-01
Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. The ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but it also can provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a 135- to 180-line resolution, which is approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.
MRO High Resolution Imaging Science Experiment (HiRISE): Instrument Development
NASA Technical Reports Server (NTRS)
Delamere, Alan; Becker, Ira; Bergstrom, Jim; Burkepile, Jon; Day, Joe; Dorn, David; Gallagher, Dennis; Hamp, Charlie; Lasco, Jeffrey; Meiers, Bill
2003-01-01
The primary functional requirement of the HiRISE imager is to allow identification of both predicted and unknown features on the surface of Mars at a much finer resolution and contrast than previously possible. This results in a camera with a very wide swath width, 6 km at 300 km altitude, and a high signal-to-noise ratio, >100:1. Generation of terrain maps with 30 cm vertical resolution from stereo images requires very accurate geometric calibration. The project limitations of mass, cost, and schedule make the development challenging. In addition, spacecraft stability must not be a major limitation on image quality. The nominal orbit for the science phase of the mission is a 3 pm orbit of 255 by 320 km with periapsis locked to the south pole. The ground-track velocity is approximately 3,400 m/s.
NASA Technical Reports Server (NTRS)
Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtain, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike
2014-01-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV, and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Because of the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; the science requirement for polarization measurements of Q/I and U/I is 0.1 percent in the line core. CLASP is a dual-beam spectro-polarimeter that uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1 percent polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle and thereby minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E allows a relatively high (30 percent) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with a gain of 2.0 +/- 0.5, readout noise of no more than 25 e-, dark current of no more than 10 e-/second/pixel, and residual non-linearity below 0.1 percent. We present the results of the performance characterization study performed on the CLASP prototype camera: system gain, dark current, read noise, and residual non-linearity.
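The abstract above reports characterizing system gain, read noise, dark current, and residual non-linearity for the CLASP prototype camera, but does not spell out the measurement procedure. As a hedged illustration of one standard laboratory approach (the photon-transfer, or mean-variance, technique), the sketch below estimates gain and read noise from pairs of flat-field and bias frames; the frames here are synthetic stand-ins, not CLASP data.

```python
# Sketch: estimate system gain (e-/ADU) and read noise (e-) with the
# photon-transfer (mean-variance) method, and dark current from dark frames.
# This is a generic lab procedure, not the specific CLASP analysis.
import numpy as np

def gain_from_flat_pair(flat1, flat2, bias1, bias2):
    """Gain from a pair of equally illuminated flats and a pair of biases."""
    f1, f2 = flat1 - bias1.mean(), flat2 - bias2.mean()
    signal = 0.5 * (f1.mean() + f2.mean())                  # mean signal (ADU)
    var_signal = np.var(f1 - f2) / 2.0                      # shot + read variance (ADU^2)
    var_read = np.var(bias1 - bias2) / 2.0                  # read variance (ADU^2)
    return signal / (var_signal - var_read)                 # e- per ADU

def read_noise_electrons(bias1, bias2, gain):
    return gain * np.sqrt(np.var(bias1 - bias2) / 2.0)      # e- rms

def dark_current(dark, bias, gain, exposure_s):
    """Mean dark rate from a long dark exposure and a bias frame."""
    return gain * (dark.mean() - bias.mean()) / exposure_s  # e-/pixel/second

# Synthetic stand-in frames: bias level 500 ADU, 12 ADU read noise, and flats
# with ~20,000 e- per pixel at a true gain of about 2 e-/ADU.
rng = np.random.default_rng(2)
bias1, bias2 = rng.normal(500, 12, (2, 512, 512))
flat1, flat2 = rng.poisson(20000, (2, 512, 512)) / 2.0 + rng.normal(500, 12, (2, 512, 512))

g = gain_from_flat_pair(flat1, flat2, bias1, bias2)
print("gain (e-/ADU):", g, " read noise (e-):", read_noise_electrons(bias1, bias2, g))
```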
Astronaut Owen Garriott - Test Subject - Human Vestibular Function Experiment
1973-08-09
S73-34171 (9 Aug. 1973) --- Scientist-astronaut Owen K. Garriott, Skylab 3 science pilot, serves as test subject for the Skylab "Human Vestibular Function" M131 Experiment, as seen in this photographic reproduction taken from a television transmission made by a color TV camera aboard the Skylab space station in Earth orbit. The objectives of the Skylab M131 experiment are to obtain data pertinent to establishing the validity of measurements of specific behavioral/physiological responses influenced by vestibular activity under one-g and zero-g conditions; to determine man's adaptability to unusual vestibular conditions and predict habitability of future spacecraft conditions involving reduced gravity and Coriolis forces; and to measure the accuracy and variability in man's judgment of spatial coordinates based on atypical gravity receptor cues and inadequate visual cues. Dr. Garriott is seated in the experiment's litter chair, which can rotate the test subject at a predetermined rotational velocity or a programmed acceleration/deceleration profile. Photo credit: NASA
A multiple camera tongue switch for a child with severe spastic quadriplegic cerebral palsy.
Leung, Brian; Chau, Tom
2010-01-01
The present study proposed a video-based access technology that facilitated a non-contact tongue protrusion access modality for a 7-year-old boy with severe spastic quadriplegic cerebral palsy (GMFCS level 5). The proposed system featured a centre camera and two peripheral cameras to extend coverage of the frontal face view of this user for longer durations. The child participated in a descriptive case study. The participant underwent 3 months of tongue protrusion training while the multiple camera tongue switch prototype was being prepared. Later, the participant was brought back for five experiment sessions where he worked on a single-switch picture matching activity, using the multiple camera tongue switch prototype in a controlled environment. The multiple camera tongue switch achieved an average sensitivity of 82% and specificity of 80%. In three of the experiment sessions, the peripheral cameras were associated with most of the true positive switch activations. These activations would have been missed by a centre-camera-only setup. The study demonstrated proof-of-concept of a non-contact tongue access modality implemented by a video-based system involving three cameras and colour video processing.
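The sensitivity and specificity figures reported above are simple ratios over event counts; a minimal sketch of that bookkeeping (the counts below are made up purely for illustration and are not the study's data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts of detected tongue protrusions vs. quiet intervals in one session.
sens, spec = sensitivity_specificity(tp=41, fn=9, tn=80, fp=20)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")   # 82%, 80%
```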
Lessons Learned from Crime Caught on Camera
Bernasco, Wim
2018-01-01
Objectives: The widespread use of camera surveillance in public places offers criminologists the opportunity to systematically and unobtrusively observe crime, their main subject matter. The purpose of this essay is to inform the reader of current developments in research on crimes caught on camera. Methods: We address the importance of direct observation of behavior and review criminological studies that used observational methods, with and without cameras, including the ones published in this issue. We also discuss the uses of camera recordings in other social sciences and in biology. Results: We formulate six key insights that emerge from the literature and make recommendations for future research. Conclusions: Camera recordings of real-life crime are likely to become part of the criminological tool kit that will help us better understand the situational and interactional elements of crime. Like any source, it has limitations that are best addressed by triangulation with other sources. PMID:29472728
The use of student-driven video projects as an educational and outreach tool
NASA Astrophysics Data System (ADS)
Bamzai, A.; Farrell, W.; Klemm, T.
2014-12-01
With recent technological advances, the barriers to filmmaking have been lowered, and it is now possible to record and edit video footage with a smartphone or a handheld camera and free software. Students accustomed to documenting their every-day experiences for multimedia-rich social networking sites feel excited and creatively inspired when asked to take on ownership of more complex video projects. With a small amount of guidance on shooting primary and secondary footage and an overview of basic interview skills, students are self-motivated to identify the learning themes with which they resonate most strongly and record their footage in a way that is true to their own experience. The South Central Climate Science Center (SC-CSC) is one of eight regional centers formed by the U.S. Department of the Interior in order to provide decision makers with the science, tools, and information they need to address the impacts of climate variability and change on their areas of responsibility. An important component of this mission is to innovate in the areas of translational science and science communication. This presentation will highlight how the SC-CSC used student-driven video projects to document our Early Career Researcher Workshop and our Undergraduate Internship for Underrepresented Minorities. These projects equipped the students with critical thinking and project management skills, while also providing a finished product that the SC-CSC can use for future outreach purposes.
Flow visualization by mobile phone cameras
NASA Astrophysics Data System (ADS)
Cierpka, Christian; Hain, Rainer; Buchmann, Nicolas A.
2016-06-01
Mobile smartphones have completely changed the way people communicate over the last ten years. However, these devices offer not only communication through different channels but also applications for fun and recreation. In this respect, mobile phone cameras now include relatively fast (up to 240 Hz) modes to capture high-speed videos of sport events or other fast processes. The article therefore explores the possibility of making use of this development and the widespread availability of these cameras for velocity measurements in industrial or technical applications and in fluid dynamics education in high schools and at universities. The requirements for a simplistic PIV (particle image velocimetry) system are discussed. A model experiment of a free water jet was used to prove the concept, shed some light on the achievable quality, and determine bottlenecks by comparing the results obtained with a mobile phone camera with data taken by a high-speed camera suited for scientific experiments.
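At its core, a simplistic PIV system of the kind discussed cross-correlates interrogation windows from two consecutive frames and converts the correlation-peak offset into a velocity. The sketch below illustrates only that core step with synthetic data; the pixel pitch, frame rate, and window handling are placeholder assumptions, not the authors' processing chain.

```python
import numpy as np

def window_displacement(win_a, win_b):
    """Pixel displacement between two interrogation windows via FFT cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.irfft2(np.fft.rfft2(a).conj() * np.fft.rfft2(b), s=a.shape)
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap offsets larger than half the window into negative displacements.
    dy = dy - a.shape[0] if dy > a.shape[0] // 2 else dy
    dx = dx - a.shape[1] if dx > a.shape[1] // 2 else dx
    return dy, dx

# Synthetic particle pattern shifted by (3, 5) pixels between frames.
rng = np.random.default_rng(1)
frame = rng.random((64, 64))
shifted = np.roll(frame, shift=(3, 5), axis=(0, 1))
dy, dx = window_displacement(frame, shifted)
pixel_pitch_m, dt_s = 1e-4, 1 / 240          # assumed calibration and 240 Hz frame rate
print(dy, dx, "->", dx * pixel_pitch_m / dt_s, "m/s in x")
```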
NASA Technical Reports Server (NTRS)
Brown, Robert A. (Editor)
1993-01-01
The scientific and technical basis for an Advanced Camera (AC) for the Hubble Space Telescope (HST) is discussed. In March 1992, the NASA Program Scientist for HST invited the Space Telescope Science Institute to conduct a community-based study of an AC, which would be installed on a scheduled HST servicing mission in 1999. The study had three phases: a broad community survey of views on candidate science program and required performance of the AC, an analysis of technical issues relating to its implementation, and a panel of experts to formulate conclusions and prioritize recommendations. From the assessment of the imaging tasks astronomers have proposed for or desired from HST, we believe the most valuable 1999 instrument would be a camera with both near ultraviolet/optical (NUVO) and far ultraviolet (FUV) sensitivity, and with both wide field and high resolution options.
Science observations with the IUE using the one-gyro mode
NASA Technical Reports Server (NTRS)
Imhoff, C.; Pitts, R.; Arquilla, R.; Shrader, Chris R.; Perez, M. R.; Webb, J.
1990-01-01
The International Ultraviolet Explorer (IUE) attitude control system originally included an inertial reference package containing six gyroscopes for three axis stabilization. The science instrument includes a prime and redundant Field Error Sensor (FES) camera for target acquisition and offset guiding. Since launch, four of the six gyroscopes have failed. The current attitude control system utilizes the remaining two gyros and a Fine Sun Sensor (FSS) for three axis stabilization. When the next gyro fails, a new attitude control system will be uplinked which will rely on the remaining gyro and the FSS for general three axis stabilization. In addition to the FSS, the FES cameras will be required to assist in maintaining fine attitude control during target acquisition. This has required thoroughly determining the characteristics of the FES cameras and the spectrograph aperture plate as well as devising new target acquisition procedures. The results of this work are presented.
Video at Sea: Telling the Stories of the International Ocean Discovery Program
NASA Astrophysics Data System (ADS)
Wright, M.; Harned, D.
2014-12-01
Seagoing science expeditions offer an ideal opportunity for storytelling. While many disciplines involve fieldwork, few offer the adventure of spending two months at sea on a vessel hundreds of miles from shore with several dozen strangers from all over the world. As a medium, video is nearly ideal for telling these stories; it can capture the thrill of discovery, the agony of disappointment, the everyday details of life at sea, and everything in between. At the International Ocean Discovery Program (IODP, formerly the Integrated Ocean Drilling Program), we have used video as a storytelling medium for several years with great success. Over this timeframe, camera equipment and editing software have become cheaper and easier to use, while web sites such as YouTube and Vimeo have enabled sharing with just a few mouse clicks. When it comes to telling science stories with video, the barriers to entry have never been lower. As such, we have experimented with many different approaches and a wide range of styles. On one end of the spectrum, live "ship-to-shore" broadcasts with school groups - conducted with an iPad and free videoconferencing software such as Skype and Zoom - enable curious minds to engage directly with scientists in real-time. We have also contracted with professional videographers and animators who offer the experience, skill, and equipment needed to produce polished clips of the highest caliber. Amateur videographers (including some scientists looking to make use of their free time on board) have shot and produced impressive shorts using little more than a phone camera. In this talk, I will provide a brief overview of our efforts to connect with the public using video, including a look at how effective certain tactics are for connecting to specific audiences.
NASA Astrophysics Data System (ADS)
Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu
2015-04-01
For a new kind of retina-like sensor camera and a traditional rectangular-sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and development of the retina-like sensor. Image coordinate transformation and sub-pixel interpolation need to be implemented to handle the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber, and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ in Visual Studio 2010. Experimental results show that the system realizes acquisition and display for both cameras.
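The coordinate transformation with sub-pixel interpolation mentioned above can be illustrated with a generic bilinear sampler: each output pixel is mapped to fractional source coordinates and the four neighbouring source pixels are blended. This is a hedged sketch of the general technique, not the authors' VC++/MIL implementation.

```python
import numpy as np

def bilinear_sample(img, y, x):
    """Sample an image at fractional (y, x) coordinates with bilinear interpolation."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, img.shape[0] - 1), min(x0 + 1, img.shape[1] - 1)
    fy, fx = y - y0, x - x0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bottom = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bottom

# Example: sample one fractional coordinate from a 10x10 ramp image.
img = np.arange(100.0).reshape(10, 10)
print(bilinear_sample(img, 2.5, 3.25))   # blends pixels (2,3), (2,4), (3,3), (3,4) -> 28.25
```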
Astronaut Owen Garriott lies in Lower Body Negative Pressure Device
1973-08-06
SL3-108-1278 (July-September 1973) --- Scientist-astronaut Owen K. Garriott, science pilot of the Skylab 3 mission, lies in the Lower Body Negative Pressure Device in the work and experiments area of the Orbital Workshop (OWS) crew quarters of the Skylab space station cluster in Earth orbit. This picture was taken with a hand-held 35mm Nikon camera. Astronauts Garriott, Alan L. Bean and Jack R. Lousma remained with the Skylab space station in orbit for 59 days conducting numerous medical, scientific and technological experiments. The LBNPD (MO92) Experiment is to provide information concerning the time course of cardiovascular adaptation during flight, and to provide in-flight data for predicting the degree of orthostatic intolerance and impairment of physical capacity to be expected upon return to Earth environment. The bicycle ergometer is in the right foreground. Photo credit: NASA
Astronaut Charles Conrad as test subject for Lower Body Negative Pressure
1973-06-09
S73-27707 (9 June 1973) --- Astronaut Charles Conrad Jr., Skylab 2 commander, serves as test subject for the Lower Body Negative Pressure (MO92) Experiment, as seen in this reproduction taken from a color television transmission made by a TV camera aboard the Skylab 1/2 space station cluster in Earth orbit. Scientist-astronaut Joseph P. Kerwin, Skylab 2 science pilot, assists Conrad into the LBNP device. Kerwin served as monitor for the experiment. The purpose of the MO92 experiment is to provide information concerning the time course of cardiovascular adaptation during flight, and to provide in-flight data for predicting the degree of orthostatic intolerance and impairment of physical capacity to be expected upon return to the Earth environment. The data collected in support of MO92 include blood pressure, heart rate, body temperature, vectorcardiogram, LBNPD pressure, leg volume changes, and body weight. Photo credit: NASA
SKYLAB (SL)-3 - ASTRONAUT GARRIOTT, OWEN
1973-08-09
S73-32113 (9 Aug. 1973) --- Scientist-astronaut Owen K. Garriott, Skylab 3 science pilot, serves as test subject for the Skylab "Human Vestibular Function" M131 Experiment, as seen in this photographic reproduction taken from a television transmission made by a color TV camera aboard the Skylab space station in Earth orbit. The objectives of the Skylab M131 experiment are to obtain data pertinent to establishing the validity of measurements of specific behavioral/physiological responses influenced by vestibular activity under one-g and zero-g conditions; to determine man's adaptability to unusual vestibular conditions and predict habitability of future spacecraft conditions involving reduced gravity and Coriolis forces; and to measure the accuracy and variability in man's judgment of spatial coordinates based on atypical gravity receptor cues and inadequate visual cues. Photo credit: NASA
ERIC Educational Resources Information Center
Lee, Victor R.
2015-01-01
Biomechanics, and specifically the biomechanics associated with human movement, is a potentially rich backdrop against which educators can design innovative science teaching and learning activities. Moreover, the use of technologies associated with biomechanics research, such as high-speed cameras that can produce high-quality slow-motion video,…
Cameras in the Courtroom: A U.S. Survey. Journalism Monographs No. 60.
ERIC Educational Resources Information Center
White, Frank Wm.
Changes in the prohibition against cameras in state courtrooms are examined in this report. It provides a historical sketch of camera usage in the courtroom since 1935 and reports on the states permitting still, videotape, film cameras, and other electronic equipment in courtrooms since 1978, on the states now experimenting with the matter, and on…
NASA Astrophysics Data System (ADS)
Malin, Michal C.; Ravine, Michael A.; Caplinger, Michael A.; Tony Ghaemi, F.; Schaffner, Jacob A.; Maki, Justin N.; Bell, James F.; Cameron, James F.; Dietrich, William E.; Edgett, Kenneth S.; Edwards, Laurence J.; Garvin, James B.; Hallet, Bernard; Herkenhoff, Kenneth E.; Heydari, Ezat; Kah, Linda C.; Lemmon, Mark T.; Minitti, Michelle E.; Olson, Timothy S.; Parker, Timothy J.; Rowland, Scott K.; Schieber, Juergen; Sletten, Ron; Sullivan, Robert J.; Sumner, Dawn Y.; Aileen Yingst, R.; Duston, Brian M.; McNair, Sean; Jensen, Elsa H.
2017-08-01
The Mars Science Laboratory Mast camera and Descent Imager investigations were designed, built, and operated by Malin Space Science Systems of San Diego, CA. They share common electronics and focal plane designs but have different optics. There are two Mastcams of dissimilar focal length. The Mastcam-34 has an f/8, 34 mm focal length lens, and the M-100 an f/10, 100 mm focal length lens. The M-34 field of view is about 20° × 15° with an instantaneous field of view (IFOV) of 218 μrad; the M-100 field of view (FOV) is 6.8° × 5.1° with an IFOV of 74 μrad. The M-34 can focus from 0.5 m to infinity, and the M-100 from 1.6 m to infinity. All three cameras can acquire color images through a Bayer color filter array, and the Mastcams can also acquire images through seven science filters. Images are ≤1600 pixels wide by 1200 pixels tall. The Mastcams, mounted on the 2 m tall Remote Sensing Mast, have a 360° azimuth and 180° elevation field of regard. Mars Descent Imager is fixed-mounted to the bottom left front side of the rover at 66 cm above the surface. Its fixed focus lens is in focus from 2 m to infinity, but out of focus at 66 cm. The f/3 lens has a FOV of 70° by 52° across and along the direction of motion, with an IFOV of 0.76 mrad. All cameras can acquire video at 4 frames/second for full frames or 720p HD at 6 fps. Images can be processed using lossy Joint Photographic Experts Group and predictive lossless compression.
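The quoted IFOVs are consistent with the small-angle relation IFOV ≈ pixel pitch / focal length; the sketch below checks this, assuming a detector pixel pitch of about 7.4 µm (the pitch is our assumption and is not stated in the text above).

```python
# Small-angle IFOV check: IFOV [rad] ~= pixel_pitch / focal_length.
pixel_pitch_m = 7.4e-6          # assumed detector pixel pitch

for name, focal_length_m in [("Mastcam-34", 0.034), ("Mastcam-100", 0.100)]:
    ifov_urad = pixel_pitch_m / focal_length_m * 1e6
    print(f"{name}: {ifov_urad:.0f} urad")   # ~218 and ~74 urad, matching the quoted values
```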
Presence capture cameras - a new challenge to the image quality
NASA Astrophysics Data System (ADS)
Peltoketo, Veli-Tapani
2016-04-01
Commercial presence capture cameras are coming to market, and a new era of visual entertainment is starting to take shape. Since true presence capture is still a very new technology, technical solutions have only just passed the prototyping phase and vary considerably. Presence capture cameras still face the same quality issues as earlier generations of digital imaging, but also numerous new ones. This work concentrates on the quality challenges of presence capture cameras. A camera system which can record 3D audio-visual reality as it is has to have several camera modules, several microphones, and especially technology which can synchronize the output of several sources into a seamless and smooth virtual reality experience. Several traditional quality features remain valid for presence capture cameras: color fidelity, noise removal, resolution, and dynamic range form the basis of virtual reality stream quality. However, the cooperation of several cameras brings a new dimension to these quality factors, and new quality features must also be validated. For example, how should the camera streams be stitched together into a 3D experience without noticeable errors, and how should the stitching be validated? The work describes the quality factors which remain valid for presence capture cameras and defines their importance. Moreover, new challenges of presence capture cameras are investigated from an image and video quality point of view. The work also considers how well current measurement methods can be used with presence capture cameras.
First Results of the Athena Microscopic Imager Investigation
NASA Technical Reports Server (NTRS)
Herkenhoff, K.; Squyres, S.; Archinal, B.; Arvidson, R.; Bass, D.; Barrett, J.; Becker, K.; Becker, T.; Bell, J., III; Burr, D.
2004-01-01
The Athena science payload on the Mars Exploration Rovers (MER) includes the Microscopic Imager (MI). The MI is a fixed-focus camera mounted on an extendable arm, the Instrument Deployment Device (IDD). The MI acquires images at a spatial resolution of 30 microns/pixel over a broad spectral range (400 - 700 nm). The MI uses the same electronics design as the other MER cameras but its optics yield a field of view of 31 x 31 mm across a 1024 x 1024 pixel CCD image. The MI acquires images using only solar or skylight illumination of the target surface. A contact sensor is used to place the MI slightly closer to the target surface than its best focus distance (about 69 mm), allowing concave surfaces to be imaged in good focus. Coarse focusing (approx. 2 mm precision) is achieved by moving the IDD away from a rock target after contact is sensed. The MI optics are protected from the Martian environment by a retractable dust cover. This cover includes a Kapton window that is tinted orange to restrict the spectral bandpass to 500 - 700 nm, allowing crude color information to be obtained by acquiring images with the cover open and closed. The MI science objectives, instrument design and calibration, operation, and data processing were described by Herkenhoff et al. Initial results of the MI experiment on both MER rovers ('Spirit' and 'Opportunity') are described below.
Dust Removal Target on 'Vera Rubin Ridge'
2017-11-01
This image from the Mars Hand Lens Imager (MAHLI) camera on NASA's Curiosity Mars rover shows effects of using the rover's wire-bristled Dust Removal Tool (DRT) on a rock target called "Christmas Cove." The tool brushed an area about 2.5 inches (6 centimeters) across on Sept. 16, 2017, during the 1,118th Martian day, or sol of Curiosity's work on Mars. MAHLI took this image later the same sol. Both DRT and MAHLI are on the turret of tools at the end of Curiosity's arm. The site is partway up "Vera Rubin Ridge" on lower Mount Sharp, in an area where reconnaissance imaging with science filters revealed variability in indications of the mineral hematite. Removing dust from part of the Christmas Cove target was part of an experiment to check whether dust is subduing the apparent indications of hematite in some of the area's bedrock. The brushed area's purplish tint in this MAHLI image, accentuated even more when observed with science filters of the rover's Mast Camera, is characteristic of fine-grained hematite. Brushing of this target also exposed details in the fine layering and bright veins within the bedrock of this part of Vera Rubin Ridge. The image is oriented so that sunlight comes from upper left. Layers are lower (older) toward lower right. https://photojournal.jpl.nasa.gov/catalog/PIA22064
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station, the Atlas V expendable launch vehicle with the New Horizons spacecraft rolls out of the Vertical Integration Facility on its way to the launch pad. The liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station, the Atlas V expendable launch vehicle with the New Horizons spacecraft rolls out of the Vertical Integration Facility (left) on its way to the launch pad. Liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station, the Atlas V expendable launch vehicle with the New Horizons spacecraft settles into position with the launcher umbilical tower on the pad. The liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station, the Atlas V expendable launch vehicle with the New Horizons spacecraft moves with the launcher umbilical tower to the pad. The liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station in Florida, workers take a moment to observe the Atlas V expendable launch vehicle with the New Horizons spacecraft poised for launch. The liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station, the Atlas V expendable launch vehicle with the New Horizons spacecraft is being moved from the Vertical Integration Facility to the pad. The liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station, the Atlas V expendable launch vehicle with the New Horizons spacecraft moves with the launcher umbilical tower to the pad. The liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station, the Atlas V expendable launch vehicle with the New Horizons spacecraft rolls out of the Vertical Integration Facility on its way to the pad. The liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
The Panoramic Camera (Pancam) Investigation on the NASA 2003 Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.
2003-01-01
The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover.
The Ocean in Depth - Ideas for Using Marine Technology in Science Communication
NASA Astrophysics Data System (ADS)
Gerdes, A.
2009-04-01
By deploying camera and video systems on remotely operated diving vehicles (ROVs), new and fascinating insights concerning the functioning of deep ocean ecosystems like cold-water coral reef communities can be gained. Moreover, mapping hot vents at mid-ocean ridge locations, and exploring asphalt and mud volcanoes in the Gulf of Mexico and the Mediterranean Sea with the aid of video camera systems, have illustrated the scientific value of state-of-the-art diving tools. In principle, the deployment of sophisticated marine technology on seagoing expeditions and its results - video tapes and photographs of fascinating submarine environments, publication of new scientific findings - offer unique opportunities for communicating marine sciences. Experience shows that an interest in marine technology can easily be stirred in laypersons if the deployment of underwater vehicles such as ROVs during seagoing expeditions can be presented using catchwords like "discovery", "new frontier", "groundbreaking mission", etc. On the other hand, however, a number of restrictions and challenges have to be kept in mind. Communicating marine science in general, and the achievements of marine technology in particular, can only be successful with the application of a well-defined target-audience concept. While national and international TV stations and production companies are very much interested in using high quality underwater video footage, the involvement of journalists and camera teams in seagoing expeditions entails a number of challenges: berths onboard research vessels are limited; safety aspects have to be considered; and copyright and utilisation questions concerning digitalized video and photo material have to be handled with special care. To cite one example: on-board video material produced by professional TV teams cannot be used by the research institute that operated the expedition. This presentation aims at (1) informing members of the scientific community about new opportunities related to marine technology, (2) discussing challenges and limitations in cooperative projects with media, (3) presenting new ways of marketing scientific findings, and (4) promoting the interest of the media present at the EGU09 conference in cooperating with research institutes.
2011-07-01
Cameras were installed around the test pan, and an underwater GoPro video camera with a wide-angle lens recorded the fire from below the layer of fuel. These cameras were not used for the fire suppression experiments. Two ¼-in thick stainless steel test pans were used.
SOFIA tracking image simulation
NASA Astrophysics Data System (ADS)
Taylor, Charles R.; Gross, Michael A. K.
2016-09-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) tracking camera simulator is a component of the Telescope Assembly Simulator (TASim). TASim is a software simulation of the telescope optics, mounting, and control software. Currently in its fifth major version, TASim is relied upon for telescope operator training, mission planning and rehearsal, and mission control and science instrument software development and testing. TASim has recently been extended for hardware-in-the-loop operation in support of telescope and camera hardware development and control and tracking software improvements. All three SOFIA optical tracking cameras are simulated, including the Focal Plane Imager (FPI), which has recently been upgraded to the status of a science instrument that can be used on its own or in parallel with one of the seven infrared science instruments. The simulation includes tracking camera image simulation of starfields based on the UCAC4 catalog at real-time rates of 4-20 frames per second. For its role in training and planning, it is important for the tracker image simulation to provide images with a realistic appearance and response to changes in operating parameters. For its role in tracker software improvements, it is vital to have realistic signal and noise levels and precise star positions. The design of the software simulation for precise subpixel starfield rendering (including radial distortion), realistic point-spread function as a function of focus, tilt, and collimation, and streaking due to telescope motion will be described. The calibration of the simulation for light sensitivity, dark and bias signal, and noise will also be presented.
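A minimal illustration of the kind of star rendering described (sub-pixel position, point-spread function, photon and read noise) is sketched below. The Gaussian PSF model and the noise figures are placeholders for illustration, not TASim's actual models or parameters.

```python
import numpy as np

def render_star(shape, y, x, flux, sigma, bias=100.0, read_noise=5.0, rng=None):
    """Render one star at sub-pixel (y, x) with a Gaussian PSF plus Poisson and read noise."""
    if rng is None:
        rng = np.random.default_rng()
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    psf = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2.0 * sigma ** 2))
    psf /= psf.sum()                                   # normalize so 'flux' is total electrons
    expected = flux * psf
    frame = rng.poisson(expected) + rng.normal(bias, read_noise, shape)
    return frame

frame = render_star((64, 64), y=31.3, x=40.7, flux=5e4, sigma=1.8)
print(frame.shape, frame.max())   # bright peak near pixel (31, 41)
```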
SHUTTLE - PAYLOADS (STS-41G) - KSC
1984-10-05
Payload canister transporter in Vertical Processing Facility Clean Room loaded with the Earth Radiation Budget Satellite (ERBS), Large Format Camera (LFC), and Orbital Refueling System (ORS) for the STS-41G mission. 1. STS-41G - EXPERIMENTS 2. CAMERAS - LFC KSC, FL Also available in 4x5 CN
Planning Bepicolombo MPO Science Operations to study Mercury Interior
NASA Astrophysics Data System (ADS)
De La Fuente, Sara; Carasa, Angela; Ortiz, Iñaki; Rodriguez, Pedro; Casale, Mauro; Benkhoff, Johannes; Zender, Joe
2017-04-01
BepiColombo is an Interdisciplinary Cornerstone ESA-JAXA Mission to Mercury, with two orbiters, the ESA Mercury Planetary Orbiter (MPO) and the JAXA Mercury Magnetospheric Orbiter (MMO), dedicated to the study of the planet and its magnetosphere. The MPO is a three-axis-stabilized, nadir-pointing spacecraft which will be placed in a polar orbit, providing excellent spatial resolution over the entire planet surface. The MPO's scientific payload comprises 11 instrument packages, including a laser altimeter, cameras, and the radio science experiment that will be dedicated to the study of Mercury's interior: its structure, composition, formation and evolution. The planning of the science operations to be carried out by the instruments studying Mercury's interior will be done by the Science Ground Segment (SGS) located at the European Space Astronomy Centre (ESAC), in conjunction with the scientific instrument teams. The process will always consider the complete nominal mission duration, such that the contribution of the scheduled science operations to the science objectives, the total data volume generated, and the seasonal interdependencies can be tracked. The heart of the science operations planning process is the Observations Catalogue (OC), a web-accessed database used to collect and analyse all science operations requests. From the OC, the SGS will first determine all science opportunity windows compatible with the spacecraft operational constraints. Secondly, only those compatible with the resources (power and data volume) and pointing constraints will be chosen, including slew feasibility.
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.
2014-06-01
As is well known, the application of passive THz cameras to security problems is a very promising approach: it allows a concealed object to be seen without contact, and the camera poses no danger to the person being screened. We demonstrate a new possibility of using a passive THz camera to observe a temperature difference on the human skin when this difference is caused by different temperatures inside the body. We discuss several physical experiments in which a person drinks hot, warm, or cold water, or eats. After computer processing of images captured by the TS4 passive THz camera, a pronounced temperature trace can be seen on the skin of the human body. To validate this claim, we performed a similar physical experiment using an IR camera. Our investigation broadens the use of passive THz cameras to the detection of objects concealed in the human body, because the difference in temperature between the object and the surrounding parts of the body is reflected on the human skin. However, modern passive THz cameras do not have sufficient temperature resolution to see this difference directly; that is why we use computer processing to enhance the effective camera resolution for this application. We consider images produced by passive THz cameras manufactured by Microsemi Corp. and ThruVision Corp.
The Juno Radiation Monitoring (RM) Investigation
NASA Astrophysics Data System (ADS)
Becker, H. N.; Alexander, J. W.; Adriani, A.; Mura, A.; Cicchetti, A.; Noschese, R.; Jørgensen, J. L.; Denver, T.; Sushkova, J.; Jørgensen, A.; Benn, M.; Connerney, J. E. P.; Bolton, S. J.; Allison, J.; Watts, S.; Adumitroaie, V.; Manor-Chapman, E. A.; Daubar, I. J.; Lee, C.; Kang, S.; McAlpine, W. J.; Di Iorio, T.; Pasqui, C.; Barbis, A.; Lawton, P.; Spalsbury, L.; Loftin, S.; Sun, J.
2017-11-01
The Radiation Monitoring Investigation of the Juno Mission will actively retrieve and analyze the noise signatures from penetrating radiation in the images of Juno's star cameras and science instruments at Jupiter. The investigation's objective is to profile Jupiter's >10-MeV electron environment in regions of the Jovian magnetosphere which today are still largely unexplored. This paper discusses the primary instruments on Juno which contribute to the investigation's data suite, the measurements of camera noise from penetrating particles, spectral sensitivities and measurement ranges of the instruments, calibrations performed prior to Juno's first science orbit, and how the measurements may be used to infer the external relativistic electron environment.
Science Activity Planner for the MER Mission
NASA Technical Reports Server (NTRS)
Norris, Jeffrey S.; Crockett, Thomas M.; Fox, Jason M.; Joswig, Joseph C.; Powell, Mark W.; Shams, Khawaja S.; Torres, Recaredo J.; Wallick, Michael N.; Mittman, David S.
2008-01-01
The Maestro Science Activity Planner is a computer program that assists human users in planning operations of the Mars Exploration Rover (MER) mission and visualizing scientific data returned from the MER rovers. Relative to its predecessors, this program is more powerful and easier to use. This program is built on the Java Eclipse open-source platform around a Web-browser-based user-interface paradigm to provide an intuitive user interface to Mars rovers and landers. This program affords a combination of advanced display and simulation capabilities. For example, a map view of terrain can be generated from images acquired by the High Resolution Imaging Science Experiment instrument aboard the Mars Reconnaissance Orbiter spacecraft and overlaid with images from a navigation camera (more precisely, a stereoscopic pair of cameras) aboard a rover, and an interactive, annotated rover traverse path can be incorporated into the overlay. It is also possible to construct an overhead perspective mosaic image of terrain from navigation-camera images. This program can be adapted to similar use on other outer-space missions and is potentially adaptable to numerous terrestrial applications involving analysis of data, operations of robots, and planning of such operations for acquisition of scientific data.
STS-31 crew activity on the middeck of the Earth-orbiting Discovery, OV-103
1990-04-29
STS031-05-002 (24-29 April 1990) --- A 35mm camera with a "fish eye" lens captured this high angle image on Discovery's middeck. Astronaut Kathryn D. Sullivan works with the IMAX camera in foreground, while Astronaut Steven A. Hawley consults a checklist in corner. An Arriflex motion picture camera records student ion arc experiment in apparatus mounted on stowage locker. The experiment was the project of Gregory S. Peterson, currently a student at Utah State University.
NASA Astrophysics Data System (ADS)
Voss, H. D.; Dailey, J.; Snyder, S. J.
2011-09-01
Students creating and flying experiments into near-space using a low-cost balloon High-Altitude Research Platform (HARP) greatly advance understanding in introductory astronomy and advanced classes across several disciplines. Remote sensing above 98% of the atmosphere using cameras, image intensifiers, IR, and UV sensors provides access to the heavens and large regions of the earth below. In situ and limb atmospheric gas measurements, near-space stratosphere measurements, and cosmic rays engage students in areas from planetary atmospheres to supernova acceleration. This new capability is possible by exposing students to recent advances in MEMS technology, nanotechnology, wireless telecommunication systems, GPS, DSPs and other microchip miniaturizations to build less than 4 kg payloads. The HARP program provides an engaging laboratory, gives challenging science, technology, engineering, and mathematics (STEM) field experiences, reaches students from diverse backgrounds, encourages collaboration among science faculty, and provides quantitative assessment of the learning outcomes. Over a seven-year period, Taylor University, an undergraduate liberal arts school, has successfully launched over 230 HARP systems to altitudes over 30 km (100% retrieval success with rapid recovery) with flight times between two and six hours. The HARP payloads included two GPS tracking systems, cameras and monitors, a 110 kbit down link, an uplink command capability for educational experiments (K-12 and undergraduate). Launches were conducted during the day and night, with multiple balloons, with up to 10 payloads for experiments, and under varying weather and upper atmospheric conditions. The many launches in a short period of time allowed the payload bus design to evolve toward increased performance, reliability, standardization, simplicity, and modularity for low-cost launch services. Through NSF and NASA grants, the program has expanded, leading to representatives from more than 52 universities being trained at workshops to implement high-altitude balloon launches in the classroom. A spin-off company, StratoStar Systems LLC, now sells the turn-key high-altitude balloon system, and another spin-off company, NearSpace Launch, now offers a low cost ride-for-hire into near-space.
2001-04-26
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School - designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Pictured are the NASA and contractor personnel who conducted the DIME activity with the students. Shown (L-R) are: Daniel Dietrich (NASA; mentor for Sycamore High School team), Carol Hodanbosi (National Center for Microgravity Research; DIME staff), Jose Carrion (GRC Akima, drop tower technician), Dennis Stocker (NASA; DIME staff), Richard DeLombard (NASA; DIME staff), Sandi Thompson (NCMR sabbatical teacher; DIME staff), Peter Sunderland (NCMR, mentor for COSI Academy student team), Adam Malcolm (NASA co-op student; DIME staff). This image is from a digital still camera; higher resolution is not available.
CATE 2016 Indonesia: Science goals and student training for 2017
NASA Astrophysics Data System (ADS)
Penn, M. J.; McKay, M. A.; Kovac, S. A.; Jensen, L.; Hare, H. S.; Mitchell, A. M.; Bosh, R.; Watson, Z.; Baer, R.; Pierce, M.; Gelderman, R.; Walter, D. K.
2016-12-01
The Citizen Continental-America Telescopic Eclipse (CATE) Experiment for 2017 is being developed at the National Solar Observatory in partnership with universities, schools, astronomy clubs, and corporations. The CATE experiment will use more than 60 identical telescopes equipped with digital cameras from Oregon to South Carolina to image the solar corona. The project will then splice these images together to show the corona during a 90-minute period, revealing for the first time the plasma dynamics of the inner solar corona. The goals for the CATE experiment range from providing an authentic STEM research experience for students and lifelong learners, to making state-of-the-art solar coronal observations of the plasma dynamics of coronal polar plumes, to increasing the US scientific literacy. Private funds are being raised for the CATE equipment, and so the telescopes will stay with the volunteers after the eclipse and be used in follow-on citizen science astronomy projects. The 2017 eclipse will be viewed by hundreds of millions of people. Four sets of undergraduate students in the path of the 2017 eclipse have become local experts for the eclipse and trainers for the CATE volunteers. These students traveled to the 2016 March eclipse in Indonesia and collected observations with prototype CATE telescopes; science results from these 2016 observations will be discussed. Training videos for use in 2017 were developed and tested on volunteers. Finally several high school groups along the path of totality have been engaged in the CATE project and will participate in the eclipse data collection. This work was supported by the NSO "Training for the 2017 Citizen CATE Experiment" funded by NASA (NASA NNX16AB92A). The National Solar Observatory is operated by the Association of Universities for Research in Astronomy, Inc. (AURA) under cooperative agreement with the NSF.
Innovative Video Diagnostic Equipment for Material Science
NASA Technical Reports Server (NTRS)
Capuano, G.; Titomanlio, D.; Soellner, W.; Seidel, A.
2012-01-01
Materials science experiments under microgravity increasingly rely on advanced optical systems to determine the physical properties of the samples under investigation. This includes video systems with high spatial and temporal resolution. The acquisition, handling, storage and transmission to ground of the resulting video data are very challenging. Since the available downlink data rate is limited, the capability to compress the video data significantly without compromising the data quality is essential. We report on the development of a Digital Video System (DVS) for EML (Electro Magnetic Levitator) which provides real-time video acquisition, high compression using advanced Wavelet algorithms, storage and transmission of a continuous flow of video with different characteristics in terms of image dimensions and frame rates. The DVS is able to operate with the latest generation of high-performance cameras, acquiring high-resolution video images up to 4 Mpixels at 60 fps or high-frame-rate video images up to about 1000 fps at 512x512 pixels.
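The need for heavy onboard compression follows from simple arithmetic: the raw data rates of the camera modes quoted above vastly exceed a typical downlink. A back-of-the-envelope sketch, in which the bit depth and downlink rate are illustrative assumptions rather than EML figures:

```python
# Back-of-the-envelope raw data rates for the two camera modes quoted above.
bits_per_pixel = 12                                   # assumed digitization depth

hires_bps = 4_000_000 * 60 * bits_per_pixel           # 4 Mpixels at 60 fps
fast_bps = 512 * 512 * 1000 * bits_per_pixel          # 512x512 pixels at ~1000 fps

print(f"high-resolution mode: {hires_bps / 1e9:.1f} Gbit/s raw")   # ~2.9 Gbit/s
print(f"high-frame-rate mode: {fast_bps / 1e9:.1f} Gbit/s raw")    # ~3.1 Gbit/s

# Against an assumed 10 Mbit/s downlink, compression ratios in the hundreds are required.
print(f"compression factor needed: ~{hires_bps / 10e6:.0f}x")
```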
ERIC Educational Resources Information Center
School Science Review, 1972
1972-01-01
Short articles describe techniques suitable for junior high school science, including the use of a toy "drinking bird" to start discussion, using cobalt chloride solution to demonstrate convection currents, demonstration of the relationship between freezing point and concentration, and instructions for building a simple lens camera, a circuit…
The Mars Surveyor '01 Rover and Robotic Arm
NASA Technical Reports Server (NTRS)
Bonitz, Robert G.; Nguyen, Tam T.; Kim, Won S.
1999-01-01
The Mars Surveyor 2001 Lander will carry with it both a Robotic Arm and Rover to support various science and technology experiments. The Marie Curie Rover, the twin sister to Sojourner Truth, is expected to explore the surface of Mars in early 2002. Scientific investigations to determine the elemental composition of surface rocks and soil using the Alpha Proton X-Ray Spectrometer (APXS) will be conducted along with several technology experiments including the Mars Experiment on Electrostatic Charging (MEEC) and the Wheel Abrasion Experiment (WAE). The Rover will follow uplinked operational sequences each day, but will be capable of autonomous reactions to the unpredictable features of the Martian environment. The Mars Surveyor 2001 Robotic Arm will perform rover deployment, and support various positioning, digging, and sample acquiring functions for MECA (Mars Environmental Compatibility Assessment) and Mossbauer Spectrometer experiments. The Robotic Arm will also collect its own sensor data for engineering data analysis. The Robotic Arm Camera (RAC) mounted on the forearm of the Robotic Arm will capture various images with a wide range of focal length adjustment during scientific experiments and rover deployment.
Design and fabrication of a CCD camera for use with relay optics in solar X-ray astronomy
NASA Technical Reports Server (NTRS)
1984-01-01
Configured as a subsystem of a sounding rocket experiment, a camera system was designed to record and transmit an X-ray image focused on a charge-coupled device. The camera consists of an X-ray sensitive detector and the electronics for processing and transmitting image data. The design and operation of the camera are described. Schematics are included.
General Astrophysics with the HabEx Workhorse Camera
NASA Astrophysics Data System (ADS)
Stern, Daniel; Clarke, John; Gaudi, B. Scott; Kiessling, Alina; Krause, Oliver; Martin, Stefan; Scowen, Paul; Somerville, Rachel; HabEx STDT
2018-01-01
The Habitable Exoplanet Imaging Mission (HabEx) concept has been designed to enable an extensive suite of science, broadly put under the rubric of General Astrophysics, in addition to its exoplanet direct imaging science. General astrophysics directly addresses multiple NASA programmatic branches, and HabEx will enable investigations ranging from cosmology, to galaxy evolution, to stellar population studies, to exoplanet transit spectroscopy, to Solar System studies. This poster briefly describes one of the two primary HabEx General Astrophysics instruments, the HabEx Workhorse Camera (HWC). HWC will be a dual-detector UV-to-near-IR imager and multi-object grism spectrometer with a microshutter array and a moderate (3' x 3') field-of-view. We detail some of the key science we expect HWC to undertake, emphasizing unique capabilities enabled by a large-aperture, highly stable space-borne platform at these wavelengths.
Wilkes, Thomas C; McGonigle, Andrew J S; Pering, Tom D; Taggart, Angus J; White, Benjamin S; Bryant, Robert G; Willmott, Jon R
2016-10-06
Here, we report, for what we believe to be the first time, on the modification of a low cost sensor, designed for the smartphone camera market, to develop an ultraviolet (UV) camera system. This was achieved via adaptation of Raspberry Pi cameras, which are based on back-illuminated complementary metal-oxide semiconductor (CMOS) sensors, and we demonstrated the utility of these devices for applications at wavelengths as low as 310 nm, by remotely sensing power station smokestack emissions in this spectral region. Given the very low cost of these units, ≈ USD 25, they are suitable for widespread proliferation in a variety of UV imaging applications, e.g., in atmospheric science, volcanology, forensics and surface smoothness measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birch, Gabriel Carisle; Griffin, John Clark
2015-01-01
The horizontal television lines (HTVL) metric has been the primary quantity used by division 6000 related to camera resolution for high consequence security systems. This document shows HTVL measurements are fundamentally insufficient as a metric to determine camera resolution, and proposes a quantitative, standards-based methodology by measuring the camera system modulation transfer function (MTF), the most common and accepted metric of resolution in the optical science community. Because HTVL calculations are easily misinterpreted or poorly defined, we present several scenarios in which HTVL is frequently reported, and discuss their problems. The MTF metric is discussed, and scenarios are presented with calculations showing the application of such a metric.
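As a rough sketch of the MTF-based approach referred to above (a generic edge-based estimate, not the authors' measurement procedure), the modulation transfer function can be obtained from an edge-spread function by differentiating it into a line-spread function and taking the magnitude of its Fourier transform. The synthetic blurred edge and Gaussian blur width are illustrative assumptions.

    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    def mtf_from_edge(esf):
        """Estimate MTF from a 1-D edge-spread function (ESF):
        ESF -> derivative (LSF) -> |FFT| normalised to its DC value."""
        lsf = np.gradient(esf)
        mtf = np.abs(np.fft.rfft(lsf))
        return mtf / mtf[0]

    # Synthetic blurred edge: ideal step convolved with a Gaussian PSF
    x = np.arange(256)
    edge = gaussian_filter1d((x > 128).astype(float), sigma=2.0)
    freqs = np.fft.rfftfreq(edge.size)            # spatial frequency, cycles per pixel
    print("MTF at 0.25 cyc/px:", np.interp(0.25, freqs, mtf_from_edge(edge)))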
Mapping Land and Water Surface Topography with instantaneous Structure from Motion
NASA Astrophysics Data System (ADS)
Dietrich, J.; Fonstad, M. A.
2012-12-01
Structure from Motion (SfM) has given researchers an invaluable tool for low-cost, high-resolution 3D mapping of the environment. These SfM 3D surface models are commonly constructed from many digital photographs collected with one digital camera (either handheld or attached to an aerial platform). This method works for stationary or very slow-moving objects. However, objects in motion are impossible to capture with one-camera SfM. With multiple simultaneously triggered cameras, it becomes possible to capture multiple photographs at the same time, which allows for the construction of 3D surface models of moving objects and surfaces, an instantaneous SfM (ISfM) surface model. In river science, ISfM provides a low-cost solution for measuring a number of river variables that researchers normally estimate or are unable to collect over large areas. With ISfM and sufficient coverage of the banks and RTK-GPS control it is possible to create a digital surface model of land and water surface elevations across an entire channel and water surface slopes at any point within the surface model. By setting the cameras to collect time-lapse photography of a scene it is possible to create multiple surfaces that can be compared using traditional digital surface model differencing. These water surface models could be combined with high-resolution bathymetry to create fully 3D cross sections that could be useful in hydrologic modeling. Multiple temporal image sets could also be used in 2D or 3D particle image velocimetry to create 3D surface velocity maps of a channel. Other applications in earth science include anything where researchers could benefit from temporal surface modeling, such as mass movements, lava flows, and dam-removal monitoring. The camera system that was used for this research consisted of ten pocket digital cameras (Canon A3300) equipped with wireless triggers. The triggers were constructed with an Arduino-style microcontroller and off-the-shelf handheld radios with a maximum range of several kilometers. The cameras are controlled from another microcontroller/radio combination that allows for manual or automatic triggering of the cameras. The total cost of the camera system was approximately 1500 USD.
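A minimal sketch of the surface-differencing step mentioned above, comparing two co-registered gridded surface models built from successive ISfM reconstructions. The grid values, cell size and change threshold are illustrative assumptions; the SfM reconstruction itself is assumed to be done by external software.

    import numpy as np

    def surface_change(dem_t0, dem_t1, cell_size=0.05, threshold=0.02):
        """Difference two co-registered gridded surface models (metres) and
        report the area that rose or dropped by more than `threshold`."""
        dod = dem_t1 - dem_t0                      # DEM of difference
        changed = np.abs(dod) > threshold
        area = changed.sum() * cell_size**2        # square metres
        return dod, area

    # Toy example: a 100x100 water surface that rose by 3 cm in one corner
    dem0 = np.zeros((100, 100))
    dem1 = dem0.copy()
    dem1[:30, :30] += 0.03
    dod, area = surface_change(dem0, dem1)
    print(f"area changed: {area:.2f} m^2, max change: {dod.max()*100:.1f} cm")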
Automatic calibration method for plenoptic camera
NASA Astrophysics Data System (ADS)
Luan, Yinsen; He, Xing; Xu, Bing; Yang, Ping; Tang, Guomao
2016-04-01
An automatic calibration method is proposed for a microlens-based plenoptic camera. First, all microlens images on the white image are searched and recognized automatically based on digital morphology. Then, the center points of the microlens images are rearranged according to their relative position relationships. Consequently, the microlens images are located, i.e., the plenoptic camera is calibrated without prior knowledge of camera parameters. Furthermore, this method is appropriate for all types of microlens-based plenoptic cameras, even the multifocus plenoptic camera, the plenoptic camera with arbitrarily arranged microlenses, or the plenoptic camera with different sizes of microlenses. Finally, we verify our method using raw data from a Lytro camera. The experiments show that our method achieves a higher degree of automation than previously published methods.
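A hedged sketch of the first step described above, locating microlens image centres on a white (flat-field) image with basic thresholding, morphological cleaning and connected-component labelling. The threshold, cleaning parameters and synthetic white image are illustrative assumptions, not the authors' exact algorithm.

    import numpy as np
    from scipy import ndimage

    def microlens_centers(white_image, rel_threshold=0.5):
        """Find centres of bright microlens spots in a white image."""
        mask = white_image > rel_threshold * white_image.max()
        mask = ndimage.binary_opening(mask, iterations=2)   # remove specks
        labels, n = ndimage.label(mask)
        idx = np.arange(1, n + 1)
        return np.array(ndimage.center_of_mass(white_image, labels, idx))

    # Synthetic white image: bright discs on a regular grid
    img = np.zeros((200, 200))
    yy, xx = np.mgrid[:200, :200]
    for cy in range(10, 200, 20):
        for cx in range(10, 200, 20):
            img[(yy - cy)**2 + (xx - cx)**2 < 36] = 1.0
    print(microlens_centers(img)[:3])   # first few detected centres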
Noisy Ocular Recognition Based on Three Convolutional Neural Networks
Lee, Min Beom; Hong, Hyung Gil; Park, Kang Ryoung
2017-01-01
In recent years, the iris recognition system has been gaining increasing acceptance for applications such as access control and smartphone security. When the images of the iris are obtained under unconstrained conditions, an issue of undermined quality is caused by optical and motion blur, off-angle view (the user's eyes looking somewhere else, not into the front of the camera), specular reflection (SR) and other factors. Such noisy iris images increase intra-individual variations and, as a result, reduce the accuracy of iris recognition. A typical iris recognition system requires a near-infrared (NIR) illuminator along with an NIR camera, which are larger and more expensive than fingerprint recognition equipment. Hence, many studies have proposed methods of using iris images captured by a visible light camera without the need for an additional illuminator. In this research, we propose a new recognition method for noisy iris and ocular images by using one iris and two periocular regions, based on three convolutional neural networks (CNNs). Experiments were conducted by using the noisy iris challenge evaluation-part II (NICE.II) training dataset (selected from the University of Beira Iris (UBIRIS.v2) database), the mobile iris challenge evaluation (MICHE) database, and the Institute of Automation of the Chinese Academy of Sciences (CASIA)-Iris-Distance database. As a result, the method proposed by this study outperformed previous methods. PMID:29258217
Selection of the InSight landing site
Golombek, M.; Kipp, D.; Warner, N.; Daubar, Ingrid J.; Fergason, Robin L.; Kirk, Randolph L.; Beyer, R.; Huertas, A.; Piqueux, Sylvain; Putzig, N.E.; Campbell, B.A.; Morgan, G. A.; Charalambous, C.; Pike, W. T.; Gwinner, K.; Calef, F.; Kass, D.; Mischna, M A; Ashley, J.; Bloom, C.; Wigton, N.; Hare, T.; Schwartz, C.; Gengl, H.; Redmond, L.; Trautman, M.; Sweeney, J.; Grima, C.; Smith, I. B.; Sklyanskiy, E.; Lisano, M.; Benardini, J.; Smrekar, S.E.; Lognonne, P.; Banerdt, W. B.
2017-01-01
The selection of the Discovery Program InSight landing site took over four years from initial identification of possible areas that met engineering constraints, to downselection via targeted data from orbiters (especially Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) and High-Resolution Imaging Science Experiment (HiRISE) images), to selection and certification via sophisticated entry, descent and landing (EDL) simulations. Constraints on elevation (≤ −2.5 km for sufficient atmosphere to slow the lander), latitude (initially 15°S–5°N and later 3°N–5°N for solar power and thermal management of the spacecraft), ellipse size (130 km by 27 km from ballistic entry and descent), and a load bearing surface without thick deposits of dust, severely limited acceptable areas to western Elysium Planitia. Within this area, 16 prospective ellipses were identified, which lie ∼600 km north of the Mars Science Laboratory (MSL) rover. Mapping of terrains in rapidly acquired CTX images identified especially benign smooth terrain and led to the downselection to four northern ellipses. Acquisition of nearly continuous HiRISE, additional Thermal Emission Imaging System (THEMIS), and High Resolution Stereo Camera (HRSC) images, along with radar data confirmed that ellipse E9 met all landing site constraints: with slopes <15° at 84 m and 2 m length scales for radar tracking and touchdown stability, low rock abundance (<10 %) to avoid impact and spacecraft tip over, instrument deployment constraints, which included identical slope and rock abundance constraints, a radar reflective and load bearing surface, and a fragmented regolith ∼5 m thick for full penetration of the heat flow probe. Unlike other Mars landers, science objectives did not directly influence landing site selection.
1997-05-24
This unusual view of the underside of the Space Shuttle orbiter Atlantis shortly before landing was taken by a fish-eye camera lens from KSC’s Shuttle Landing Facility. The Vehicle Assembly Building is in the background at left. The Shuttle Training Aircraft can be seen in the distance, at center. Atlantis is wrapping up its nine-day STS-84 mission, which was the sixth docking of the Space Shuttle with the Russian Space Station Mir. Atlantis was docked with the Mir for five days. STS-84 Mission Specialist C. Michael Foale replaced astronaut and Mir 23 crew member Jerry M. Linenger, who has been on the Russian space station since Jan. 15. Linenger is returning to Earth on Atlantis with the rest of the STS-84 crew, Mission Commander Charles J. Precourt, Pilot Eileen Marie Collins, and Mission Specialists Carlos I. Noriega, Edward Tsang Lu, Elena V. Kondakova of the Russian Space Agency and Jean-Francois Clervoy of the European Space Agency. Foale is scheduled to remain on the Mir for approximately four months, until he is replaced by STS-86 crew member Wendy B. Lawrence in September. Besides the docking and crew exchange, STS-84 included the transfer of more than 7,300 pounds of water, logistics and science experiments and hardware to and from the Mir. Scientific experiments conducted during the STS-84 mission, and scheduled for Foale’s stay on the Mir, are in the fields of advanced technology, Earth sciences, fundamental biology, human life sciences, International Space Station risk mitigation, microgravity sciences and space sciences
NASA Astrophysics Data System (ADS)
Balthasar, Heike; Dumke, Alexander; van Gasselt, Stephan; Gross, Christoph; Michael, Gregory; Musiol, Stefanie; Neu, Dominik; Platz, Thomas; Rosenberg, Heike; Schreiner, Björn; Walter, Sebastian
2014-05-01
Since 2003 the High Resolution Stereo Camera (HRSC) experiment on the Mars Express mission has been in orbit around Mars. First images were sent to Earth on January 14th, 2004. The goal-oriented HRSC data dissemination and the transparent representation of the associated work and results are the main aspects that contributed to the success in the public perception of the experiment. The Planetary Sciences and Remote Sensing Group at Freie Universität Berlin (FUB) offers both interactive web-based data access and browse/download options for HRSC press products [www.fu-berlin.de/planets]. Close collaborations with exhibitors as well as print and digital media representatives allow for regular and directed dissemination of, e.g., conventional imagery, orbital/synthetic surface epipolar images, video footage, and high-resolution displays. On a monthly basis we prepare press releases in close collaboration with the European Space Agency (ESA) and the German Aerospace Center (DLR) [http://www.geo.fu-berlin.de/en/geol/fachrichtungen/planet/press/index.html]. A release comprises panchromatic, colour, anaglyph, and perspective views of a scene taken from an HRSC image of the Martian surface. In addition, a context map and descriptive texts in English and German are provided. More sophisticated press releases include elaborate animations and simulated flights over the Martian surface, perspective views of stereo data combined with colour and high resolution, mosaics, and perspective views of data mosaics. Altogether 970 high quality PR products and 15 movies were created at FUB during the last decade and published via FUB/DLR/ESA platforms. We support educational outreach events, as well as permanent and special exhibitions. Examples for that are the yearly "Science Fair", where special programs for kids are offered, and the exhibition "Mars Mission and Vision", which is on tour until 2015 through 20 German towns, showing 3-D movies, surface models, and images of the HRSC camera experiment. Press and media appearances of group members, and talks to school classes and interested communities also contribute to the public outreach. For HRSC data dissemination we use digital platforms. Since 2007 HRSC image data can be viewed and accessed via the online interface HRSCview [http://hrscview.fu-berlin.de], which was built in cooperation with the DLR Institute for Planetary Research. Additionally, HRSC ortho images (level 4) have been presented in a modern MapServer setup in GIS-ready format since 2013 [http://www.geo.fu-berlin.de/en/geol/fachrichtungen/planet/projects/marsexpress/level4downloads/index.html]. All of these offerings have ensured the accessibility of HRSC data and products to the science community as well as to the general public for the last ten years and will continue to do so in the future, taking advantage of modern and user-optimized applications and networks.
Line drawing Scientific Instrument Module and lunar orbital science package
NASA Technical Reports Server (NTRS)
1970-01-01
A line drawing of the Scientific Instrument Module (SIM) with its lunar orbital science package. The SIM will be mounted in a previously vacant sector of the Apollo Service Module. It will carry specialized cameras and instrumentation for gathering lunar orbit scientific data.
The Large Synoptic Survey Telescope (LSST) Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.
EARLY SCIENCE WITH SOFIA, THE STRATOSPHERIC OBSERVATORY FOR INFRARED ASTRONOMY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, E. T.; Becklin, E. E.; De Buizer, J. M.
The Stratospheric Observatory For Infrared Astronomy (SOFIA) is an airborne observatory consisting of a specially modified Boeing 747SP with a 2.7 m telescope, flying at altitudes as high as 13.7 km (45,000 ft). Designed to observe at wavelengths from 0.3 μm to 1.6 mm, SOFIA operates above 99.8% of the water vapor that obscures much of the infrared and submillimeter. SOFIA has seven science instruments under development, including an occultation photometer, near-, mid-, and far-infrared cameras, infrared spectrometers, and heterodyne receivers. SOFIA, a joint project between NASA and the German Aerospace Center Deutsches Zentrum für Luft- und Raumfahrt, began initial science flights in 2010 December, and has conducted 30 science flights in the subsequent year. During this early science period three instruments have flown: the mid-infrared camera FORCAST, the heterodyne spectrometer GREAT, and the occultation photometer HIPO. This Letter provides an overview of the observatory and its early performance.
Camera Perspective Bias in Videotaped Confessions: Evidence that Visual Attention Is a Mediator
ERIC Educational Resources Information Center
Ware, Lezlee J.; Lassiter, G. Daniel; Patterson, Stephen M.; Ransom, Michael R.
2008-01-01
Several experiments have demonstrated a "camera perspective bias" in evaluations of videotaped confessions: videotapes with the camera focused on the suspect lead to judgments of greater voluntariness than alternative presentation formats. The present research investigated potential mediators of this bias. Using eye tracking to measure visual…
Development of biostereometric experiments. [stereometric camera system
NASA Technical Reports Server (NTRS)
Herron, R. E.
1978-01-01
The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.
NASA Astrophysics Data System (ADS)
Zoletnik, S.; Biedermann, C.; Cseh, G.; Kocsis, G.; König, R.; Szabolics, T.; Szepesi, T.; Wendelstein 7-X Team
2018-01-01
A special video camera has been developed for the 10-camera overview video system of the Wendelstein 7-X (W7-X) stellarator considering multiple application needs and limitations resulting from this complex long-pulse superconducting stellarator experiment. The event detection intelligent camera (EDICAM) uses a special 1.3 Mpixel CMOS sensor with non-destructive read capability which enables fast monitoring of smaller Regions of Interest (ROIs) even during long exposures. The camera can perform simple data evaluation algorithms (minimum/maximum, mean comparison to levels) on the ROI data which can dynamically change the readout process and generate output signals. Multiple EDICAM cameras were operated in the first campaign of W7-X and capabilities were explored in the real environment. Data prove that the camera can be used for taking long exposure (10-100 ms) overview images of the plasma while sub-ms monitoring and even multi-camera correlated edge plasma turbulence measurements of smaller areas can be done in parallel. These latter revealed that filamentary turbulence structures extend between neighboring modules of the stellarator. Considerations emerging for future upgrades of this system and similar setups on future long-pulse fusion experiments such as ITER are discussed.
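The simple ROI evaluation logic described above (minimum/maximum/mean compared against levels to drive readout and output signals) can be sketched roughly as follows. The threshold values, ROI geometry and frame source are illustrative assumptions, not the EDICAM firmware.

    import numpy as np

    def evaluate_roi(frame, roi, low=None, high=None):
        """Return (stats, triggered) for one Region of Interest.
        roi = (row0, row1, col0, col1); trigger if the ROI mean leaves [low, high]."""
        r0, r1, c0, c1 = roi
        patch = frame[r0:r1, c0:c1]
        stats = {"min": patch.min(), "max": patch.max(), "mean": patch.mean()}
        triggered = ((low is not None and stats["mean"] < low) or
                     (high is not None and stats["mean"] > high))
        return stats, triggered

    # Monitor a small ROI within a sequence of synthetic frames
    rng = np.random.default_rng(0)
    for t in range(5):
        frame = rng.normal(100.0, 5.0, size=(1280, 1024))
        stats, hit = evaluate_roi(frame, roi=(100, 132, 200, 232), high=110.0)
        if hit:
            print(f"frame {t}: ROI mean {stats['mean']:.1f} exceeded level")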
Looking ever so much like an alien spacecraft, the Altus II remotely piloted aircraft shows off some
NASA Technical Reports Server (NTRS)
2002-01-01
Looking ever so much like an alien spacecraft, the Altus II remotely piloted aircraft shows off some of the instruments and camera lenses mounted in its nose for a lightning study over Florida flown during the summer of 2002. The Altus Cumulus Electrification Study (ACES), led by Dr. Richard Blakeslee of NASA Marshall Space Flight center, focused on the collection of electrical, magnetic and optical measurements of thunderstorms. Data collected will help scientists understand the development and life cycles of thunderstorms, which in turn may allow meteorologists to more accurately predict when destructive storms may hit. The Altus II, built by General Atomics Aeronautical Systems, Inc., is one of several remotely operated aircraft developed and matured under NASA's Environmental Research Aircraft and Sensor Technology (ERAST) program. The program focused on developing airframe, propulsion, control system and communications technologies to allow unmanned aerial vehicles (UAVs) to operate at very high altitudes for long durations while carrying a variety of sensors, cameras or other instruments for science experiments, surveillance or telecommunications relay missions.
NASA Astrophysics Data System (ADS)
Johnson, Kyle; Thurow, Brian; Kim, Taehoon; Blois, Gianluca; Christensen, Kenneth
2016-11-01
Three-dimensional, three-component (3D-3C) measurements were made using a plenoptic camera on the flow around a roughness element immersed in a turbulent boundary layer. A refractive index matched approach allowed whole-field optical access from a single camera to a measurement volume that includes transparent solid geometries. In particular, this experiment measures the flow over a single hemispherical roughness element made of acrylic and immersed in a working fluid consisting of Sodium Iodide solution. Our results demonstrate that plenoptic particle image velocimetry (PIV) is a viable technique for obtaining statistically significant volumetric velocity measurements even in a complex separated flow. The boundary layer to roughness height ratio of the flow was 4.97 and the Reynolds number (based on roughness height) was 4.57×10³. Our measurements reveal key flow features such as spiraling legs of the shear layer, a recirculation region, and shed arch vortices. Proper orthogonal decomposition (POD) analysis was applied to the instantaneous velocity and vorticity data to extract these features. Supported by the National Science Foundation Grant No. 1235726.
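A minimal sketch of the snapshot POD analysis mentioned above, applied to an assumed matrix of velocity snapshots (each column one instantaneous, mean-subtracted velocity field). This is the generic SVD formulation of POD, not the authors' specific processing chain; the toy data are an assumption.

    import numpy as np

    def snapshot_pod(snapshots):
        """snapshots: (n_points, n_snapshots) matrix of velocity samples.
        Returns spatial modes, relative modal energies, and temporal coefficients."""
        mean = snapshots.mean(axis=1, keepdims=True)
        fluct = snapshots - mean                       # subtract mean flow
        modes, sing_vals, vt = np.linalg.svd(fluct, full_matrices=False)
        energy = sing_vals**2 / np.sum(sing_vals**2)   # relative mode energy
        coeffs = np.diag(sing_vals) @ vt               # temporal coefficients
        return modes, energy, coeffs

    # Toy data: 500 grid points, 40 snapshots
    rng = np.random.default_rng(1)
    u = rng.normal(size=(500, 40))
    modes, energy, coeffs = snapshot_pod(u)
    print("energy captured by first 3 modes:", energy[:3].sum())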
Final Technical Report. Training in Building Audit Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brosemer, Kathleen
In 2011, the Tribe proposed and was awarded the Training in Building Audit Technologies grant from the DOE in the amount of $55,748 to contract for training programs for infrared cameras, blower door technology applications and building systems. The coursework consisted of: Infrared Camera Training: Level I - Thermal Imaging for Energy Audits; Blower Door Analysis and Building-As-A-System Training, Building Performance Institute (BPI) Building Analyst; Building Envelope Training, Building Performance Institute (BPI) Envelope Professional; and Audit/JobFLEX Tablet Software. Competitive procurement of the training contractor resulted in lower costs, allowing the Tribe to request and receive DOE approval to additionally purchase energy audit equipment and contract for residential energy audits of 25 low-income Tribal Housing units. Sault Tribe personnel received field training to supplement the classroom instruction on proper use of the energy audit equipment. Field experience was provided through the second DOE energy audits grant, allowing Sault Tribe personnel to join the contractor, Building Science Academy, in conducting 25 residential energy audits of low-income Tribal Housing units.
Candidate cave entrances on Mars
Cushing, Glen E.
2012-01-01
This paper presents newly discovered candidate cave entrances into Martian near-surface lava tubes, volcano-tectonic fracture systems, and pit craters and describes their characteristics and exploration possibilities. These candidates are all collapse features that occur either intermittently along laterally continuous trench-like depressions or in the floors of sheer-walled atypical pit craters. As viewed from orbit, locations of most candidates are visibly consistent with known terrestrial features such as tube-fed lava flows, volcano-tectonic fractures, and pit craters, each of which forms by mechanisms that can produce caves. Although we cannot determine subsurface extents of the Martian features discussed here, some may continue unimpeded for many kilometers if terrestrial examples are indeed analogous. The features presented here were identified in images acquired by the Mars Odyssey's Thermal Emission Imaging System visible-wavelength camera, and by the Mars Reconnaissance Orbiter's Context Camera. Select candidates have since been targeted by the High-Resolution Imaging Science Experiment. Martian caves are promising potential sites for future human habitation and astrobiology investigations; understanding their characteristics is critical for long-term mission planning and for developing the necessary exploration technologies.
Where Boron? Mars Rover Detects It
2016-12-13
This map shows the route driven by NASA's Curiosity Mars rover (blue line) and locations where the rover's Chemistry and Camera (ChemCam) instrument detected the element boron (dots, colored by abundance of boron according to the key at right). The main map shows the traverse from landing day (Sol 0) in August 2012 to the rover's location in September 2016, with boron detections through September 2015. The inset at upper left shows a magnified version of the most recent portion of that traverse, with boron detections during that portion. Overlapping dots represent cases when boron was detected in multiple ChemCam observation points in the same target and non-overlapping dots represent cases where two different targets in the same location have boron. Most of the mission's detections of boron have been made in the most recent seven months (about 200 sols) of the rover's uphill traverse. The base image for the map is from the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. North is up. The scale bar at lower right represents one kilometer (0.62 mile). http://photojournal.jpl.nasa.gov/catalog/PIA21150
View of Hadley-Apennine area, looking north, photographed by Apollo 15
1971-08-25
S71-44667 (31 July-2 Aug. 1971) --- An oblique view of the Hadley-Apennine area, looking north, as photographed by the Fairchild metric camera in the Scientific Instrumentation Module (SIM) bay of the Apollo 15 Command and Service Modules (CSM) in lunar orbit. Hadley Rille meanders through the lower center of the picture. The Apennine Mountains are at lower right. The Apollo 15 Lunar Module (LM) touchdown point is on the east side of the "chicken beak" of Hadley Rille. The Caucasus Mountains are at upper right. The dark mare area at the extreme upper right is a portion of the Sea of Serenity. The Marsh of Decay is at lower left. The large crater near the horizon is Aristillus, which is about 55 kilometers (34.18 statute miles) in diameter. The crater just to the south of Aristillus is Autolycus, which is about 40 kilometers (25 statute miles) in diameter. The crater Cassini is barely visible on the horizon at upper right. The three-inch mapping camera was one of eight lunar orbital science experiments mounted in the SIM bay.
NASA Astrophysics Data System (ADS)
Harvey, Nate
2016-08-01
Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.
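A hedged sketch of the kind of auto-covariance analysis described above, applied to an assumed time series of small inter-camera misalignment angles derived from the inter-camera quaternion. The synthetic signal (a twice-per-rev term with daily amplitude modulation plus noise), the sampling rate and the orbit period are purely illustrative, not GRACE Level-1B processing.

    import numpy as np

    def autocovariance(x, max_lag):
        """Biased sample auto-covariance of a 1-D series for lags 0..max_lag."""
        x = x - x.mean()
        n = x.size
        return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

    # Illustrative signal: 5 arcsec twice-per-rev error with daily modulation
    dt = 5.0                                   # seconds between attitude samples
    t = np.arange(0, 3 * 86400, dt)            # three days of data
    orbit_period = 5670.0                      # ~94.5 min orbit, in seconds
    signal = (5.0 * (1 + 0.3 * np.sin(2 * np.pi * t / 86400))
              * np.sin(2 * np.pi * 2 * t / orbit_period))        # arcsec
    noise = np.random.default_rng(2).normal(0, 1.0, t.size)
    acov = autocovariance(signal + noise, max_lag=int(orbit_period / dt))
    print("variance (lag 0):", acov[0], "arcsec^2")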
MAHLI on Mars: lessons learned operating a geoscience camera on a landed payload robotic arm
NASA Astrophysics Data System (ADS)
Aileen Yingst, R.; Edgett, Kenneth S.; Kennedy, Megan R.; Krezoski, Gillian M.; McBride, Marie J.; Minitti, Michelle E.; Ravine, Michael A.; Williams, Rebecca M. E.
2016-06-01
The Mars Hand Lens Imager (MAHLI) is a 2-megapixel, color camera with resolution as high as 13.9 µm pixel⁻¹. MAHLI has operated successfully on the Martian surface for over 1150 Martian days (sols) aboard the Mars Science Laboratory (MSL) rover, Curiosity. During that time MAHLI acquired images to support science and science-enabling activities, including rock and outcrop textural analysis; sand characterization to further the understanding of global sand properties and processes; support of other instrument observations; sample extraction site documentation; range-finding for arm and instrument placement; rover hardware and instrument monitoring and safety; terrain assessment; landscape geomorphology; and support of rover robotic arm commissioning. Operation of the instrument has demonstrated that imaging fully illuminated, dust-free targets yields the best results, with complementary information obtained from shadowed images. The light-emitting diodes (LEDs) allow satisfactory night imaging but do not improve daytime shadowed imaging. MAHLI's combination of fine-scale, science-driven resolution, RGB color, the ability to focus over a large range of distances, and relatively large field of view (FOV), have maximized the return of science and science-enabling observations given the MSL mission architecture and constraints.
The High Definition Earth Viewing (HDEV) Payload
NASA Technical Reports Server (NTRS)
Muri, Paul; Runco, Susan; Fontanot, Carlos; Getteau, Chris
2017-01-01
The High Definition Earth Viewing (HDEV) payload enables long-term experimentation with four commercial-off-the-shelf (COTS) high definition video cameras mounted on the exterior of the International Space Station. The payload enables testing of cameras in the space environment. The HDEV cameras transmit imagery continuously to an encoder that then sends the video signal via Ethernet through the space station for downlink. The encoder, cameras, and other electronics are enclosed in a box pressurized to approximately one atmosphere, containing dry nitrogen, to provide a level of protection to the electronics from the space environment. The encoded video format supports streaming live video of Earth for viewing online. Camera sensor types include charge-coupled device and complementary metal-oxide semiconductor. Received imagery data is analyzed on the ground to evaluate camera sensor performance. Since payload deployment, minimal degradation to imagery quality has been observed. The HDEV payload continues to operate by live streaming and analyzing imagery. Results from the experiment reduce risk in the selection of cameras that could be considered for future use on the International Space Station and other spacecraft. This paper discusses the payload development, end-to-end architecture, experiment operation, resulting image analysis, and future work.
Improving depth maps of plants by using a set of five cameras
NASA Astrophysics Data System (ADS)
Kaczmarek, Adam L.
2015-03-01
Obtaining high-quality depth maps and disparity maps with the use of a stereo camera is a challenging task for some kinds of objects. The quality of these maps can be improved by taking advantage of a larger number of cameras. Research on the use of a set of five cameras to obtain disparity maps is presented. The set consists of a central camera and four side cameras. An algorithm for making disparity maps called multiple similar areas (MSA) is introduced. The algorithm was specially designed for the set of five cameras. Experiments were performed with the MSA algorithm and the stereo matching algorithm based on the sum of sum of squared differences (sum of SSD, SSSD) measure. Moreover, the following measures were included in the experiments: sum of absolute differences (SAD), zero-mean SAD (ZSAD), zero-mean SSD (ZSSD), locally scaled SAD (LSAD), locally scaled SSD (LSSD), normalized cross correlation (NCC), and zero-mean NCC (ZNCC). The algorithms presented were applied to images of plants. Making depth maps of plants is difficult because parts of leaves are similar to each other. The potential usability of the described algorithms is especially high in agricultural applications such as robotic fruit harvesting.
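A rough sketch of the baseline SSD block-matching against which measures such as those listed above are usually compared, shown for a single rectified stereo pair rather than the five-camera MSA method itself. Window size, disparity range and the toy image pair are illustrative assumptions.

    import numpy as np

    def ssd_disparity(left, right, max_disp=32, window=5):
        """Dense disparity by minimising the sum of squared differences (SSD)
        over a square window, for a rectified left/right image pair."""
        half = window // 2
        rows, cols = left.shape
        disparity = np.zeros_like(left, dtype=np.float32)
        for r in range(half, rows - half):
            for c in range(half + max_disp, cols - half):
                patch = left[r - half:r + half + 1, c - half:c + half + 1]
                costs = [np.sum((patch - right[r - half:r + half + 1,
                                               c - d - half:c - d + half + 1])**2)
                         for d in range(max_disp)]
                disparity[r, c] = np.argmin(costs)
        return disparity

    # Toy pair: the right image is the left image shifted by 4 pixels
    rng = np.random.default_rng(3)
    left = rng.random((60, 80))
    right = np.roll(left, -4, axis=1)
    print("median disparity:", np.median(ssd_disparity(left, right)[10:-10, 40:-10]))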
IET. Aerial view of snaptran destructive experiment in 1964. Camera ...
IET. Aerial view of snaptran destructive experiment in 1964. Camera facing north. Test cell building (TAN-624) is positioned away from coupling station. Weather tower in right foreground. Divided duct just beyond coupling station. Air intake structure on south side of shielded control room. Experiment is on dolly at coupling station. Date: 1964. INEEL negative no. 64-1736 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID
ERIC Educational Resources Information Center
School Science Review, 1981
1981-01-01
Presents a variety of laboratory procedures, discussions, and demonstrations including centripetal force apparatus, model ear drum, hot air balloons, air as a real substance, centering a ball, simple test tube rack, demonstration fire extinguisher, pin-hole camera, and guidelines for early primary science education (5-10 years) concepts and lesson…
Precise color images a high-speed color video camera system with three intensified sensors
NASA Astrophysics Data System (ADS)
Oki, Sachio; Yamakawa, Masafumi; Gohda, Susumu; Etoh, Takeharu G.
1999-06-01
High speed imaging systems have been used in a large field of science and engineering. Although high speed camera systems have been improved to high performance, most of their applications are only to get high speed motion pictures. However, in some fields of science and technology, it is useful to get some other information, such as the temperature of combustion flames, thermal plasma and molten materials. Recent digital high speed video imaging technology should be able to get such information from those objects. For this purpose, we have already developed a high speed video camera system with three intensified sensors and a cubic prism image splitter. The maximum frame rate is 40,500 pps (pictures per second) at 64 X 64 pixels and 4,500 pps at 256 X 256 pixels with 256 (8 bit) intensity resolution for each pixel. The camera system can store more than 1,000 pictures continuously in solid state memory. In order to get precise color images from this camera system, we need to develop a digital technique, which consists of a computer program and ancillary instruments, to adjust the displacement of images taken from two or three image sensors and to calibrate the relationship between incident light intensity and the corresponding digital output signals. In this paper, the digital technique for pixel-based displacement adjustment is proposed. Although the displacement of the corresponding circle was more than 8 pixels in the original image, the displacement was adjusted to within 0.2 pixels at most by this method.
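A minimal sketch of one generic way to estimate the displacement between two sensors' images, using phase correlation. This is offered only as an illustration of pixel-based displacement estimation; it is not the authors' adjustment technique, and the synthetic shift is an assumption (sub-pixel refinement is omitted).

    import numpy as np

    def phase_correlation_shift(ref, moving):
        """Estimate the integer (row, col) shift that aligns `moving` to `ref`."""
        cross_power = np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))
        cross_power /= np.abs(cross_power) + 1e-12     # normalised cross-power spectrum
        corr = np.abs(np.fft.ifft2(cross_power))
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        dims = np.array(corr.shape)
        shift = np.array(peak, dtype=float)
        shift[shift > dims / 2] -= dims[shift > dims / 2]   # unwrap negative shifts
        return shift

    # Toy example: the second sensor's image is displaced by (3, -5) pixels
    rng = np.random.default_rng(4)
    img = rng.random((128, 128))
    shifted = np.roll(img, (3, -5), axis=(0, 1))
    print("estimated shift:", phase_correlation_shift(shifted, img))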
Development of high-speed video cameras
NASA Astrophysics Data System (ADS)
Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk
2001-04-01
Presented in this paper is an outline of the R and D activities on high-speed video cameras, which have been carried out at Kinki University for more than ten years and are currently proceeding as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been done, (1) on users' requirements for high-speed multi-framing and video cameras by questionnaires and hearings, and (2) on the current availability of cameras of this sort by searches of journals and websites. Both support the necessity of developing a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996. The sensor is the same one as developed for the previous camera. The frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a video camera of 1 million fps with an ISIS, In-situ Storage Image Sensor, was first proposed in 1993, and has been continuously improved. A test sensor was developed in early 2000, and successfully captured images at 62,500 fps. Currently, design of a prototype ISIS is going on, and, hopefully, it will be fabricated in the near future. Epoch-making cameras in the history of the development of high-speed video cameras by others are also briefly reviewed.
Foam Experiment Hardware are Flown on Microgravity Rocket MAXUS 4
NASA Astrophysics Data System (ADS)
Lockowandt, C.; Löth, K.; Jansson, O.; Holm, P.; Lundin, M.; Schneider, H.; Larsson, B.
2002-01-01
The Foam module was developed by Swedish Space Corporation and was used for performing foam experiments on the sounding rocket MAXUS 4 launched from Esrange on 29 April 2001. The development and launch of the module have been financed by ESA. Four different foam experiments were performed: two on aqueous foams by Doctor Michele Adler from LPMDI, University of Marne la Vallée, Paris, and two on non-aqueous foams by Doctor Bengt Kronberg from YKI, Institute for Surface Chemistry, Stockholm. The foam was generated in four separate foam systems and monitored in microgravity with CCD cameras. The purpose of the experiment was to generate and study the foam in microgravity. Due to the loss of gravity there is no drainage in the foam, and the reactions in the foam can be studied without drainage. Four solutions with various stabilities were investigated. The aqueous solutions contained water, SDS (Sodium Dodecyl Sulphate) and dodecanol. The organic solutions contained ethylene glycol, a cationic surfactant, cetyl trimethyl ammonium bromide (CTAB), and decanol. Carbon dioxide was used to generate the aqueous foam and nitrogen was used to generate the organic foam. The experiment system comprised four complete independent systems with injection unit, experiment chamber and gas system. The main part of the experiment system is the experiment chamber where the foam is generated and monitored. The chamber inner dimensions are 50x50x50 mm and it has front and back walls made of glass. The front window is used for monitoring the foam and the back window is used for back illumination. The front glass has etched crosses on the inside as reference points. In the bottom of the cell is a glass frit and at the top is a gas in/outlet. The foam was generated by injecting the experiment liquid into the glass frit in the bottom of the experiment chamber. Simultaneously gas was blown through the glass frit and a small amount of foam was generated. This procedure was performed at 10 bar. Then the pressure in the experiment chamber was lowered to approximately 0.1 bar to expand the foam to a dry foam that filled the experiment chamber. The foam was regenerated during flight by pressurising the cell and repeating the foam generation procedures. The module had 4 individual experiment chambers for the four different solutions. The four experiment chambers were controlled individually with individual experiment parameters and procedures. The gas system comprises on/off valves and adjustable valves to control the pressure, the gas flow and the liquid flow during foam generation. The gas system can be divided into four sections, each section serving one experiment chamber. The sections are partly connected in two pairs with common inlet and outlet. The two pairs are each supplied with a 1 l gas bottle filled to a pressure of 40 bar and a pressure regulator lowering the pressure from 40 bar to 10 bar. Two sections are connected to the same outlet. The gas outlets from the experiment chambers are connected to two symmetrically placed outlets on the outer structure with diffusers so as not to disturb the g-levels. The foam in each experiment chamber was monitored with one tomography camera and one overview camera (8 CCD cameras in total). The tomography camera is placed on a translation table which makes it possible to move it in the depth direction of the experiment chamber. The video signals from the 8 CCD cameras were stored onboard with two DV recorders. Two video signals were also transmitted to ground for real-time evaluation and operation of the experiment. The camera signal transmitted to ground could be selected by telecommand. With the help of the tomography system it was possible to take sequences of images of the foam at different depths in the foam. These sequences of images are used for constructing a 3-D model of the foam after flight. The overview camera has a fixed position and a field of view that covers the total experiment chamber. This camera is used for monitoring the generation of foam and the overall behaviour of the foam. The experiment was performed successfully with foam generation in all 4 experiment chambers. Foam was also regenerated during flight with telecommands. The experiment data is under evaluation.
Planetary Sciences practical experiences at the Master level with small telescopes
NASA Astrophysics Data System (ADS)
Sanchez-Lavega, A.; Perez-Hoyos, S.; del Rio-Gaztelurrutia, T.; Hueso, R.; Ordonez Etxeberria, I.; Rojas, J. F.
2016-12-01
The Master in Space Science and Technology of the Basque Country University UPV/EHU in Bilbao (Spain) has been taught for 7 years (A. Sanchez-Lavega et al., Eur. J. of Eng. Education, 2014). Along the different courses, a series of practical observations and studies in planetary sciences have been conducted with Master students, using telescopes with diameters in the range 28-50 cm belonging to the Aula EspaZio Gela Observatory (http://www.ehu.eus/aula-espazio/presentacion.html). Simple instrumentation (cameras and a spectrograph) has been employed to study planetary atmospheres (dynamics and cloud structure) and orbital mechanics using the Galilean satellites. Here we present a sample of these studies, which have led to publications in refereed journals and have been presented at different meetings with the coauthoring of the students. Plans for the future include involving the master students in high-resolution observations of Solar System planets using a remotely controlled 36 cm telescope at the Calar Alto observatory in Southern Spain (about 1000 km from the teaching facilities in Bilbao).
NASA Technical Reports Server (NTRS)
Stanboli, Alice
2013-01-01
Phxtelemproc is a C/C++ based telemetry processing program that processes SFDU telemetry packets from the Telemetry Data System (TDS). It generates Experiment Data Records (EDRs) for several instruments including surface stereo imager (SSI); robotic arm camera (RAC); robotic arm (RA); microscopy, electrochemistry, and conductivity analyzer (MECA); and the optical microscope (OM). It processes both uncompressed and compressed telemetry, and incorporates unique subroutines for the following compression algorithms: JPEG Arithmetic, JPEG Huffman, Rice, LUT3, RA, and SX4. This program was in the critical path for the daily command cycle of the Phoenix mission. The products generated by this program were part of the RA commanding process, as well as the SSI, RAC, OM, and MECA image and science analysis process. Its output products were used to advance science of the near polar regions of Mars, and were used to prove that water is found in abundance there. Phxtelemproc is part of the MIPL (Multi-mission Image Processing Laboratory) system. This software produced Level 1 products used to analyze images returned by in situ spacecraft. It ultimately assisted in operations, planning, commanding, science, and outreach.
The value of citizen science for ecological monitoring of mammals.
Parsons, Arielle Waldstein; Goforth, Christine; Costello, Robert; Kays, Roland
2018-01-01
Citizen science approaches are of great interest for their potential to efficiently and sustainably monitor wildlife populations on both public and private lands. Here we present two studies that worked with volunteers to set camera traps for ecological surveys. The photographs recorded by these citizen scientists were archived and verified using the eMammal software platform, providing a professional grade, vouchered database of biodiversity records. Motivated by managers' concern with perceived high bear activity, our first example enlisted the help of homeowners in a short-term study to compare black bear activity inside a National Historic Site with surrounding private land. We found similar levels of bear activity inside and outside the NHS, and regional comparisons suggest the bear population is typical. Participants benefited from knowing their local bear population was normal and managers refocused bear management given this new information. Our second example is a continuous survey of wildlife using the grounds of a nature education center that actively manages habitat to maintain a grassland prairie. Center staff incorporated the camera traps into educational programs, involving visitors with camera setup and picture review. Over two years and 5,968 camera-nights this survey has collected 41,393 detections of 14 wildlife species. Detection rates and occupancy were higher in open habitats compared to forest, suggesting that the maintenance of prairie habitat is beneficial to some species. Over 500 volunteers of all ages participated in this project over two years. Some of the greatest benefits have been to high school students, exemplified by a student with autism who increased his communication and comfort level with others through field work with the cameras. These examples show how, with the right tools, training and survey design protocols, citizen science can be used to answer a variety of applied management questions while connecting participants with their secretive mammal neighbors.
An Overview of the HST Advanced Camera for Surveys' On-orbit Performance
NASA Astrophysics Data System (ADS)
Hartig, G. F.; Ford, H. C.; Illingworth, G. D.; Clampin, M.; Bohlin, R. C.; Cox, C.; Krist, J.; Sparks, W. B.; De Marchi, G.; Martel, A. R.; McCann, W. J.; Meurer, G. R.; Sirianni, M.; Tsvetanov, Z.; Bartko, F.; Lindler, D. J.
2002-05-01
The Advanced Camera for Surveys (ACS) was installed in the HST on 7 March 2002 during the fourth servicing mission to the observatory, and is now beginning science operations. The ACS provides HST observers with a considerably more sensitive, higher-resolution camera with wider field and polarimetric, coronagraphic, low-resolution spectrographic and solar-blind FUV capabilities. We review selected results of the early verification and calibration program, comparing the achieved performance with the advertised specifications. Emphasis is placed on the optical characteristics of the camera, including image quality, throughput, geometric distortion and stray-light performance. More detailed analyses of various aspects of the ACS performance are presented in other papers at this meeting. This work was supported by a NASA contract and a NASA grant.
2015-05-08
NASA's Curiosity Mars rover recorded this view of the sun setting at the close of the mission's 956th Martian day, or sol (April 15, 2015), from the rover's location in Gale Crater. This was the first sunset observed in color by Curiosity. The image comes from the left-eye camera of the rover's Mast Camera (Mastcam). The color has been calibrated and white-balanced to remove camera artifacts. Mastcam sees color very similarly to what human eyes see, although it is actually a little less sensitive to blue than people are. Dust in the Martian atmosphere has fine particles that permit blue light to penetrate the atmosphere more efficiently than longer-wavelength colors. That causes the blue colors in the mixed light coming from the sun to stay closer to the sun's part of the sky, compared to the wider scattering of yellow and red colors. The effect is most pronounced near sunset, when light from the sun passes through a longer path in the atmosphere than it does at mid-day. Malin Space Science Systems, San Diego, built and operates the rover's Mastcam. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, manages the Mars Science Laboratory Project for NASA's Science Mission Directorate, Washington. JPL designed and built the project's Curiosity rover. http://photojournal.jpl.nasa.gov/catalog/PIA19400
Quantifying photometric observing conditions on Paranal using an IR camera
NASA Astrophysics Data System (ADS)
Kerber, Florian; Querel, Richard R.; Hanuschik, Reinhard
2014-08-01
A Low Humidity and Temperature Profiling (LHATPRO) microwave radiometer, manufactured by Radiometer Physics GmbH (RPG), is used to monitor sky conditions over ESO's Paranal observatory in support of VLT science operations. In addition to measuring precipitable water vapour (PWV) the instrument also contains an IR camera measuring sky brightness temperature at 10.5 μm. Due to its extended operating range down to -100 °C it is capable of detecting very cold and very thin, even sub-visual, cirrus clouds. We present a set of instrument flux calibration values as compared with a detrended fluctuation analysis (DFA) of the IR camera zenith-looking sky brightness data measured above Paranal taken over the past two years. We show that it is possible to quantify photometric observing conditions and that the method is highly sensitive to the presence of even very thin clouds but robust against variations of sky brightness caused by effects other than clouds such as variations of precipitable water vapour. Hence it can be used to determine photometric conditions for science operations. About 60 % of nights are free of clouds on Paranal. More work will be required to classify the clouds using this technique. For the future this approach might become part of VLT science operations for evaluating nightly sky conditions.
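A compact sketch of a detrended fluctuation analysis of the kind referred to above, applied to an assumed one-dimensional sky-brightness time series. The window sizes, detrending order and synthetic data are illustrative assumptions; this is not ESO's operational implementation.

    import numpy as np

    def dfa(series, window_sizes):
        """Detrended fluctuation analysis: RMS fluctuation of the integrated,
        piecewise linearly detrended series as a function of window size."""
        profile = np.cumsum(series - np.mean(series))
        fluctuations = []
        for w in window_sizes:
            n_win = profile.size // w
            segs = profile[:n_win * w].reshape(n_win, w)
            x = np.arange(w)
            rms = []
            for seg in segs:
                coef = np.polyfit(x, seg, 1)           # local linear trend
                rms.append(np.mean((seg - np.polyval(coef, x))**2))
            fluctuations.append(np.sqrt(np.mean(rms)))
        return np.array(fluctuations)

    # Synthetic sky-brightness series: slow drift plus noise
    rng = np.random.default_rng(5)
    tb = 0.02 * np.arange(5000) / 5000 + rng.normal(0, 0.5, 5000)
    windows = np.array([16, 32, 64, 128, 256])
    f = dfa(tb, windows)
    alpha = np.polyfit(np.log(windows), np.log(f), 1)[0]   # scaling exponent
    print("DFA scaling exponent:", round(alpha, 2))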
Balancing Science Objectives and Operational Constraints: A Mission Planner's Challenge
NASA Technical Reports Server (NTRS)
Weldy, Michelle
1996-01-01
The Air Force Miniature Sensor Technology Integration (MSTI-3) satellite's primary mission is to characterize Earth's atmospheric background clutter. MSTI-3 will use three cameras for data collection, a mid-wave infrared imager, a short wave infrared imager, and a visible imaging spectrometer. Mission science objectives call for the collection of over 2 million images within the one year mission life. In addition, operational constraints limit camera usage to four operations of twenty minutes per day, with no more than 10,000 data and calibration images collected per day. To balance the operational constraints and science objectives, the mission planning team has designed a planning process to generate event schedules and sensor operation timelines. Each set of constraints, including spacecraft performance capabilities, the camera filters, the geographical regions, and the spacecraft-Sun-Earth geometries of interest, and remote tracking station deconflictions, has been accounted for in this methodology. To aid in this process, the mission planning team is building a series of tools from commercial off-the-shelf software. These include the mission manifest, which builds a daily schedule of events, and the MSTI Scene Simulator, which helps build geometrically correct scans. These tools provide an efficient, responsive, and highly flexible architecture that maximizes data collection while minimizing mission planning time.
Soft x-ray streak camera for laser fusion applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stradling, G.L.
This thesis reviews the development and significance of the soft x-ray streak camera (SXRSC) in the context of inertial confinement fusion energy development. A brief introduction of laser fusion and laser fusion diagnostics is presented. The need for a soft x-ray streak camera as a laser fusion diagnostic is shown. Basic x-ray streak camera characteristics, design, and operation are reviewed. The SXRSC design criteria, the requirement for a subkilovolt x-ray transmitting window, and the resulting camera design are explained. Theory and design of reflector-filter pair combinations for three subkilovolt channels centered at 220 eV, 460 eV, and 620 eV are also presented. Calibration experiments are explained and data showing a dynamic range of 1000 and a sweep speed of 134 psec/mm are presented. Sensitivity modifications to the soft x-ray streak camera for a high-power target shot are described. A preliminary investigation, using a stepped cathode, of the thickness dependence of the gold photocathode response is discussed. Data from a typical Argus laser gold-disk target experiment are shown.
Applying compressive sensing to TEM video: A substantial frame rate increase on any camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Andrew; Kovarik, Libor; Abellan, Patricia
One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.
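A hedged sketch of the coded-aperture forward model described above: several sub-frames are modulated by per-pixel binary codes and summed into one detector frame during a single exposure. The statistical CS inversion used to recover the sub-frames is not reproduced here, and the mask statistics and frame sizes are illustrative assumptions.

    import numpy as np

    def coded_exposure(subframes, rng=None, p_open=0.5):
        """Simulate one coded camera frame: each of K sub-frames is multiplied
        by its own binary per-pixel mask and the results are integrated."""
        rng = rng or np.random.default_rng(6)
        masks = rng.random(subframes.shape) < p_open   # (K, H, W) binary codes
        measurement = np.sum(masks * subframes, axis=0)
        return measurement, masks

    # K = 8 sub-frames coded into a single 256x256 camera readout
    rng = np.random.default_rng(6)
    video = rng.random((8, 256, 256))
    frame, masks = coded_exposure(video, rng)
    print("sub-frames multiplexed per readout:", video.shape[0])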
Applying compressive sensing to TEM video: A substantial frame rate increase on any camera
Stevens, Andrew; Kovarik, Libor; Abellan, Patricia; ...
2015-08-13
One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.
CAMERA: An integrated strategy for compound spectra extraction and annotation of LC/MS data sets
Kuhl, Carsten; Tautenhahn, Ralf; Böttcher, Christoph; Larson, Tony R.; Neumann, Steffen
2013-01-01
Liquid chromatography coupled to mass spectrometry is routinely used for metabolomics experiments. In contrast to the fairly routine and automated data acquisition steps, subsequent compound annotation and identification require extensive manual analysis and thus form a major bottleneck in data interpretation. Here we present CAMERA, a Bioconductor package integrating algorithms to extract compound spectra, annotate isotope and adduct peaks, and propose the accurate compound mass even in highly complex data. To evaluate the algorithms, we compared the annotation of CAMERA against a manually defined annotation for a mixture of known compounds spiked into a complex matrix at different concentrations. CAMERA successfully extracted accurate masses for 89.7% and 90.3% of the annotatable compounds in positive and negative ion mode, respectively. Furthermore, we present a novel annotation approach that combines spectral information of data acquired in opposite ion modes to further improve the annotation rate. We demonstrate the utility of CAMERA in two different, easily adoptable plant metabolomics experiments, where the application of CAMERA drastically reduced the amount of manual analysis. PMID:22111785
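The core adduct-annotation idea can be illustrated with a few lines of code: peaks that are different adducts of the same compound imply the same neutral mass. The sketch below is a simplified stand-in for CAMERA's grouping logic (CAMERA itself is an R/Bioconductor package with a much richer rule set); the adduct masses are standard values, while the tolerance and rule list are illustrative assumptions.

```python
# Hedged sketch of the adduct-grouping idea behind CAMERA: peaks from the same
# compound measured as different adducts imply the same neutral mass.
from itertools import combinations

ADDUCTS_POS = {            # observed m/z = neutral mass + adduct mass (charge 1)
    "[M+H]+": 1.007276,
    "[M+Na]+": 22.989218,
    "[M+NH4]+": 18.033823,
}

def neutral_hypotheses(mz):
    return {name: mz - delta for name, delta in ADDUCTS_POS.items()}

def group_adducts(mz_peaks, ppm_tol=10.0):
    """Return pairs of peaks whose neutral-mass hypotheses agree within ppm_tol."""
    groups = []
    for (i, mz1), (j, mz2) in combinations(enumerate(mz_peaks), 2):
        for a1, m1 in neutral_hypotheses(mz1).items():
            for a2, m2 in neutral_hypotheses(mz2).items():
                if a1 != a2 and abs(m1 - m2) / m1 * 1e6 <= ppm_tol:
                    groups.append((i, a1, j, a2, (m1 + m2) / 2))
    return groups

# Example: a hexose (neutral monoisotopic mass ~180.0634) seen as [M+H]+ and [M+Na]+.
print(group_adducts([181.0707, 203.0526]))
```

In this toy example the two peaks are recognized as [M+H]+ and [M+Na]+ of one compound and the proposed neutral mass is returned, which is the "accurate compound mass" step the abstract refers to.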
NASA Astrophysics Data System (ADS)
Ryżak, Magdalena; Beczek, Michał; Mazur, Rafał; Sochan, Agata; Bieganowski, Andrzej
2017-04-01
The phenomenon of splash, which is one of the factors causing erosion of the soil surface, is the subject of research by various scientific teams. One efficient method for observing and analysing this phenomenon is the use of high-speed cameras that can record particles at 2000 frames per second or higher. Analysis of the splash phenomenon with high-speed cameras and specialized software can reveal, among other things, the number of broken particles, their speeds, trajectories, and the distances over which they were transferred. The paper presents an attempt to evaluate the efficiency of detection of splashed particles with a set of 3 cameras (Vision Research MIRO 310) and Dantec Dynamics Studio software, using a 3D module (Volumetric PTV). In order to assess the effectiveness of estimating the number of particles, the experiment was performed on glass beads with a diameter of 0.5 mm (corresponding to the sand fraction). Water droplets with a diameter of 4.2 mm fell on a sample from a height of 1.5 m. Two types of splashed particles were observed: particles with a low range (up to 18 mm), splashed at larger angles, and particles with a high range (up to 118 mm), splashed at smaller angles. The detection efficiency, i.e. the fraction of splashed particles identified by the software, was 45-65% for particles with a large range. The effectiveness of particle detection by the software was calculated by comparison with the number of beads that fell on an adhesive surface placed around the sample. This work was partly financed by the National Science Centre, Poland; project no. 2014/14/E/ST10/00851.
2006-01-19
KENNEDY SPACE CENTER, FLA. -- Great white egrets and a great blue heron in the foreground seem to stand watch as NASA's New Horizons spacecraft leaps off the pad on time at 2 p.m. EST aboard an Atlas V rocket from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Ken Thornsley
2006-01-19
KENNEDY SPACE CENTER, FLA. — Into a blue, cloud-scattered sky, NASA’s New Horizons spacecraft lifts off on time at 2 p.m. EST aboard an Atlas V rocket from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Debbie Kiger
2006-01-19
KENNEDY SPACE CENTER, FLA. — From between lightning masts surrounding the launch pad, NASA’s New Horizons spacecraft roars into the blue sky aboard an Atlas V rocket spewing flames and smoke. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-19
KENNEDY SPACE CENTER, FLA. — NASA’s New Horizons spacecraft emerges from a cloud painted pink by the Atlas V rocket roaring through it after launch from Complex 41 on Cape Canaveral Air Force Station in Florida. Liftoff was on time at 2 p.m. EST. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Kim Shiflett
2006-01-19
KENNEDY SPACE CENTER, FLA. — Into a cloud-scattered blue sky, NASA’s New Horizons spacecraft roars off the launch pad aboard an Atlas V rocket spewing flames and smoke. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-16
KENNEDY SPACE CENTER, FLA. - With the backdrop of blue sky and blue water of the Atlantic Ocean, the Atlas V expendable launch vehicle with the New Horizons spacecraft (center) is nearly ready for launch. Surrounding the rocket are lightning masts that support the catenary wire used to provide lightning protection. The liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-19
KENNEDY SPACE CENTER, FLA. — Clouds part as NASA’s New Horizons spacecraft roars into the blue sky after an on-time liftoff at 2 p.m. EST aboard an Atlas V rocket from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Ken Thornsley
2006-01-19
KENNEDY SPACE CENTER, FLA. — Viewed from the top of the Vehicle Assembly Building at Kennedy Space Center, the blue Atlantic Ocean frames NASA’s New Horizons spacecraft as it launches from Complex 41 on Cape Canaveral Air Force Station in Florida. Liftoff was on time at 2 p.m. EST. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Kim Shiflett
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station, the Atlas V expendable launch vehicle with the New Horizons spacecraft moves with the launcher umbilical tower between lightning masts on its way to the launch pad. The liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-19
KENNEDY SPACE CENTER, FLA. — Into a blue, cloud-scattered sky, NASA’s New Horizons spacecraft lifts off on time at 2 p.m. EST aboard an Atlas V rocket from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Ken Thornsley
2006-01-19
KENNEDY SPACE CENTER, FLA. — NASA’s New Horizons spacecraft pierces a cloud as it roars toward space after an on-time liftoff at 2 p.m. EST aboard an Atlas V rocket from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Ken Thornsley
2006-01-19
KENNEDY SPACE CENTER, FLA. — Smoke and steam fill the launch pad as NASA’s New Horizons spacecraft roars into the blue sky aboard an Atlas V rocket. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-16
KENNEDY SPACE CENTER, FLA. - On Complex 41 at Cape Canaveral Air Force Station, the Atlas V expendable launch vehicle with the New Horizons spacecraft has been moved to the pad. Umbilicals have been attached. Seen near the rocket are lightning masts that support the catenary wire used to provide lightning protection. Liftoff is scheduled for 1:24 p.m. EST Jan. 17. After its launch aboard the Atlas V, the compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. A launch before Feb. 3 allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-19
KENNEDY SPACE CENTER, FLA. — Leaping into a blue, cloud-scattered sky, NASA’s New Horizons spacecraft lifts off on time at 2 p.m. EST aboard an Atlas V rocket from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Ken Thornsley
2006-01-19
KENNEDY SPACE CENTER, FLA. — From among four lightning masts surrounding the launch pad, NASA’s New Horizons spacecraft lifts off the launch pad aboard an Atlas V rocket spewing flames and smoke. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-19
KENNEDY SPACE CENTER, FLA. — With the blue Atlantic Ocean as backdrop, smoke and steam fill the launch pad, at right, as NASA’s New Horizons spacecraft roars into the sky aboard an Atlas V rocket. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft.
2006-01-19
KENNEDY SPACE CENTER, FLA. — NASA’s New Horizons spacecraft roars into the cloud-scattered sky trailing fire and smoke from the Atlas V rocket that propels it. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Kim Shiflett
Streaked Thomson Scattering on Laboratory Plasma Jets
NASA Astrophysics Data System (ADS)
Banasek, Jacob; Byvank, Tom; Rocco, Sophia; Kusse, Bruce; Hammer, David
2017-10-01
Streaked Thomson scattering measurements have been performed on plasma jets created from a 15 μm thick radial Al or Ti foil load on COBRA, a 1 MA pulsed power machine. The goal was to measure the electron temperatures inside the center of the plasma jet created by the radial foil. The laser used for these measurements had a maximum energy of 10 J at 526.5 nm in a 3 ns duration pulse. Early experiments showed that using the full energy significantly heats the 5 × 10^18 cm^-3 jet by inverse bremsstrahlung. Here we used a streak camera to record the scattered spectrum and measure the evolving electron temperature of this laser-heated jet. Analysis of the streak camera image showed that the electron temperature of the Al jet increased from about 25 eV to 80-100 eV within about 2 ns. The Ti jets showed even stronger interaction with the laser, being heated to over 150 eV, and showed some heating even when only 1 J of laser energy was used. Also, the ion-acoustic peaks in the scattered spectrum from the Ti jets were significantly narrower than those from Al jets. Initial results will also be presented with scattered spectra taken at two different times within a single experiment by splitting the probe beam. This research is supported by the NNSA Stewardship Sciences Academic Programs under DOE Cooperative Agreement DE-NA0001836.
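For context, the sketch below shows how an electron temperature maps to the separation of the ion-acoustic peaks that such a streaked Thomson-scattering spectrum records. It uses the standard collective-scattering relation; the 90-degree scattering angle, the aluminium ionization state, and the assumption T_i = T_e are illustrative choices, not values from the abstract.

```python
# Illustrative only: relation between electron temperature and the ion-acoustic
# peak separation in collective Thomson scattering. The 90-degree scattering
# angle, Z = 7 for Al, and T_i = T_e are assumptions, not values from the paper.
import numpy as np

e   = 1.602e-19          # J per eV
m_p = 1.673e-27          # proton mass, kg
c   = 3.0e8              # speed of light, m/s

lam0  = 526.5e-9         # probe wavelength from the abstract, m
theta = np.deg2rad(90.0) # assumed scattering angle
A, Z  = 27, 7            # aluminium mass number; assumed ionization state

def peak_separation(te_ev, ti_ev):
    """Wavelength separation (m) of the two ion-acoustic peaks."""
    cs = np.sqrt((Z * te_ev + 3.0 * ti_ev) * e / (A * m_p))   # ion-acoustic speed
    return 4.0 * lam0 * np.sin(theta / 2.0) * cs / c

for te in (25.0, 100.0):
    dl = peak_separation(te, te)       # assume T_i = T_e for illustration
    print(f"T_e = {te:5.1f} eV  ->  peak separation ~ {dl * 1e9:.3f} nm")
```

The sub-nanometre separations this yields are why a high-dispersion spectrometer coupled to a streak camera is needed to follow the temperature evolution on nanosecond timescales.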
Malin, Michal C; Ravine, Michael A; Caplinger, Michael A; Tony Ghaemi, F; Schaffner, Jacob A; Maki, Justin N; Bell, James F; Cameron, James F; Dietrich, William E; Edgett, Kenneth S; Edwards, Laurence J; Garvin, James B; Hallet, Bernard; Herkenhoff, Kenneth E; Heydari, Ezat; Kah, Linda C; Lemmon, Mark T; Minitti, Michelle E; Olson, Timothy S; Parker, Timothy J; Rowland, Scott K; Schieber, Juergen; Sletten, Ron; Sullivan, Robert J; Sumner, Dawn Y; Aileen Yingst, R; Duston, Brian M; McNair, Sean; Jensen, Elsa H
2017-08-01
The Mars Science Laboratory Mast camera and Descent Imager investigations were designed, built, and operated by Malin Space Science Systems of San Diego, CA. They share common electronics and focal plane designs but have different optics. There are two Mastcams of dissimilar focal length. The Mastcam-34 has an f/8, 34 mm focal length lens, and the M-100 an f/10, 100 mm focal length lens. The M-34 field of view is about 20° × 15° with an instantaneous field of view (IFOV) of 218 μrad; the M-100 field of view (FOV) is 6.8° × 5.1° with an IFOV of 74 μrad. The M-34 can focus from 0.5 m to infinity, and the M-100 from ~1.6 m to infinity. All three cameras can acquire color images through a Bayer color filter array, and the Mastcams can also acquire images through seven science filters. Images are ≤1600 pixels wide by 1200 pixels tall. The Mastcams, mounted on the ~2 m tall Remote Sensing Mast, have a 360° azimuth and ~180° elevation field of regard. Mars Descent Imager is fixed-mounted to the bottom left front side of the rover at ~66 cm above the surface. Its fixed focus lens is in focus from ~2 m to infinity, but out of focus at 66 cm. The f/3 lens has a FOV of ~70° by 52° across and along the direction of motion, with an IFOV of 0.76 mrad. All cameras can acquire video at 4 frames/second for full frames or 720p HD at 6 fps. Images can be processed using lossy Joint Photographic Experts Group and predictive lossless compression.
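As a quick consistency check of the stated optics, the sketch below reproduces the quoted IFOV and FOV figures from the focal lengths and detector format. The 7.4 μm pixel pitch is inferred from the quoted IFOVs (218 μrad × 34 mm), not a value stated in the abstract.

```python
# Consistency check of the Mastcam optics quoted above. The 7.4 micron pixel
# pitch is an inference (IFOV x focal length), not a value given in the text.
import math

pixel_pitch_m = 7.4e-6                      # inferred from 218 urad x 34 mm
detector_px = (1600, 1200)                  # maximum image size from the text

for name, focal_m, quoted_ifov_urad in (("Mastcam-34", 0.034, 218),
                                         ("Mastcam-100", 0.100, 74)):
    ifov_urad = pixel_pitch_m / focal_m * 1e6
    fov_deg = tuple(math.degrees(n * pixel_pitch_m / focal_m) for n in detector_px)
    print(f"{name}: IFOV ~ {ifov_urad:.0f} urad (quoted {quoted_ifov_urad}), "
          f"FOV ~ {fov_deg[0]:.1f} x {fov_deg[1]:.1f} deg")
```

The computed values (about 20° × 15° for the M-34 and 6.8° × 5.1° for the M-100) match the abstract, confirming that the two cameras differ only in focal length over a common focal plane.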
Ravine, Michael A.; Caplinger, Michael A.; Tony Ghaemi, F.; Schaffner, Jacob A.; Maki, Justin N.; Bell, James F.; Cameron, James F.; Dietrich, William E.; Edgett, Kenneth S.; Edwards, Laurence J.; Garvin, James B.; Hallet, Bernard; Herkenhoff, Kenneth E.; Heydari, Ezat; Kah, Linda C.; Lemmon, Mark T.; Minitti, Michelle E.; Olson, Timothy S.; Parker, Timothy J.; Rowland, Scott K.; Schieber, Juergen; Sletten, Ron; Sullivan, Robert J.; Sumner, Dawn Y.; Aileen Yingst, R.; Duston, Brian M.; McNair, Sean; Jensen, Elsa H.
2017-01-01
Abstract The Mars Science Laboratory Mast camera and Descent Imager investigations were designed, built, and operated by Malin Space Science Systems of San Diego, CA. They share common electronics and focal plane designs but have different optics. There are two Mastcams of dissimilar focal length. The Mastcam‐34 has an f/8, 34 mm focal length lens, and the M‐100 an f/10, 100 mm focal length lens. The M‐34 field of view is about 20° × 15° with an instantaneous field of view (IFOV) of 218 μrad; the M‐100 field of view (FOV) is 6.8° × 5.1° with an IFOV of 74 μrad. The M‐34 can focus from 0.5 m to infinity, and the M‐100 from ~1.6 m to infinity. All three cameras can acquire color images through a Bayer color filter array, and the Mastcams can also acquire images through seven science filters. Images are ≤1600 pixels wide by 1200 pixels tall. The Mastcams, mounted on the ~2 m tall Remote Sensing Mast, have a 360° azimuth and ~180° elevation field of regard. Mars Descent Imager is fixed‐mounted to the bottom left front side of the rover at ~66 cm above the surface. Its fixed focus lens is in focus from ~2 m to infinity, but out of focus at 66 cm. The f/3 lens has a FOV of ~70° by 52° across and along the direction of motion, with an IFOV of 0.76 mrad. All cameras can acquire video at 4 frames/second for full frames or 720p HD at 6 fps. Images can be processed using lossy Joint Photographic Experts Group and predictive lossless compression. PMID:29098171
ERIC Educational Resources Information Center
Laursen, Sandra L.; Brickley, Annette
2011-01-01
Scientists' involvement in education has increased in recent years due to mechanisms such as the National Science Foundation's "broader impacts" expectations for research projects. The best investment of their effort lies in sharing their expertise on the nature and processes of science; film is one medium by which this can be done…
2001-05-02
Students from DuPont Manual High School in Louisville, Kentucky participated in a video-teleconference during the Pan-Pacific Basin Workshop on Microgravity Sciences held in Pasadena, California. The event originated at the California Science Center in Los Angeles. The DuPont Manual students patched in to the event through the distance learning lab at the Louisville Science Center. This image is from a digital still camera; higher resolution is not available.
ERIC Educational Resources Information Center
Vollmer, Michael; Mollmann, Klaus-Peter
2012-01-01
The recent introduction of inexpensive high-speed cameras offers a new experimental approach to many simple but fast-occurring events in physics. In this paper, the authors present two simple demonstration experiments recorded with high-speed cameras in the fields of gas dynamics and thermal physics. The experiments feature vapour pressure effects…
LPT. Low power test (TAN640) interior. Basement level. Camera facing ...
LPT. Low power test (TAN-640) interior. Basement level. Camera facing north. Cable trays and conduit cross tunnel between critical experiment cell and critical experiment control room. Construction 93% complete. Photographer: Jack L. Anderson. Date: October 23, 1957. INEEL negative no. 57-5339 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Shestakov, Ivan L.; Blednov, Roman G.
2017-05-01
One of the urgent security problems is the detection of objects placed inside the human body. Obviously, for safety reasons X-rays cannot be used widely and often for such object detection. For this purpose, we propose to use a THz camera and an IR camera. Below we continue to examine the possibility of using an IR camera to detect a temperature trace on a human body. In contrast to a passive THz camera, the IR camera does not allow an object under clothing to be seen very distinctly. Of course, this is a big disadvantage for a security solution based on the IR camera. To find possible ways of overcoming this disadvantage, we performed experiments with an IR camera produced by FLIR and developed a novel approach for computer processing of the images it captures. This approach increases the effective temperature resolution of the IR camera and makes small temperature differences more perceptible to the human eye. As a consequence, it becomes possible to see changes of human body temperature through clothing. We analyze IR images of a person who drinks water and eats chocolate, and we follow the temperature trace on the skin caused by temperature changes inside the body. Some experiments were also made observing the temperature trace from objects placed behind a thick overall. The demonstrated results are important for the detection of forbidden objects concealed inside the human body by non-destructive inspection without the use of X-rays.
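The abstract does not spell out the image-processing approach. As a purely illustrative baseline, not the authors' method, the sketch below shows the most common way software can improve a thermal camera's effective temperature resolution: averaging N co-registered frames reduces random temporal noise by roughly sqrt(N). All scene and noise parameters are assumed.

```python
# Illustrative baseline only (not the authors' method): frame averaging to
# improve the effective temperature resolution (NETD) of a thermal camera.
# Assumes the frames are already co-registered and the noise is uncorrelated.
import numpy as np

rng = np.random.default_rng(1)
H, W, N = 120, 160, 64                 # image size and number of frames
netd_single = 0.05                     # assumed single-frame noise, kelvin

# Synthetic skin-temperature scene with a faint horizontal gradient.
scene = 30.0 + 0.02 * np.add.outer(np.arange(H), np.arange(W)) / W
frames = scene + rng.normal(0.0, netd_single, size=(N, H, W))   # noisy captures

avg = frames.mean(axis=0)
print("single-frame noise :", (frames[0] - scene).std())   # ~0.05 K
print("averaged noise     :", (avg - scene).std())         # ~0.05 / sqrt(64) ~ 0.006 K
print("sqrt(N) prediction :", netd_single / np.sqrt(N))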
Polygon Patterned Ground on Mars and on Earth
NASA Technical Reports Server (NTRS)
2008-01-01
Some high-latitude areas on Mars (left) and Earth (right) exhibit similarly patterned ground where shallow fracturing has drawn polygons on the surface. This patterning may result from cycles of contraction and expansion. The left image shows ground within the targeted landing area of NASA's Phoenix Mars Lander before the winter frost had entirely disappeared from the surface. The bright ice in shallow crevices accentuates the area's polygonal fracturing pattern. The polygons are a few meters (several feet) across. The image is a small portion of an exposure taken in March 2008 by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. The image on the right is an aerial view of similarly patterned ground in Antarctica. The Phoenix Mission is led by the University of Arizona on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory. Spacecraft development is by Lockheed Martin Space Systems. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The High Resolution Imaging Science Experiment is operated by the University of Arizona, Tucson, and the instrument was built by Ball Aerospace & Technologies Corp., Boulder, Colo.
2014-05-07
View of the High Definition Earth Viewing (HDEV) flight assembly installed on the exterior of the Columbus European Laboratory module. Image was released by astronaut on Twitter. The High Definition Earth Viewing (HDEV) experiment places four commercially available HD cameras on the exterior of the space station and uses them to stream live video of Earth for viewing online. The cameras are enclosed in a temperature specific housing and are exposed to the harsh radiation of space. Analysis of the effect of space on the video quality, over the time HDEV is operational, may help engineers decide which cameras are the best types to use on future missions. High school students helped design some of the cameras' components, through the High Schools United with NASA to Create Hardware (HUNCH) program, and student teams operate the experiment.
Experiments with synchronized sCMOS cameras
NASA Astrophysics Data System (ADS)
Steele, Iain A.; Jermak, Helen; Copperwheat, Chris M.; Smith, Robert J.; Poshyachinda, Saran; Soonthorntham, Boonrucksar
2016-07-01
Scientific-CMOS (sCMOS) cameras can combine low noise with high readout speeds and do not suffer from the charge multiplication noise that effectively reduces the quantum efficiency of electron-multiplying CCDs by a factor of 2. As such, they have strong potential for fast photometry and polarimetry instrumentation. In this paper we describe the results of laboratory experiments using a pair of commercial off-the-shelf sCMOS cameras based around a 4-transistor-per-pixel architecture. In particular, using both stable and pulsed light sources, we evaluate the timing precision that may be obtained when the camera readouts are synchronized either in software or electronically. We find that software synchronization can introduce an error of about 200 msec. With electronic synchronization, any error is below the limit (about 50 msec) of our simple measurement technique.
2001-03-13
Arrays of lights at left focus on solar array panels at right during illumination testing. The solar array is part of the 2001 Mars Odyssey Orbiter. Scheduled for launch April 7, 2001, the orbiter contains three science instruments: THEMIS, the Gamma Ray Spectrometer (GRS), and the Mars Radiation Environment Experiment (MARIE). THEMIS will map the mineralogy and morphology of the Martian surface using a high-resolution camera and a thermal infrared imaging spectrometer. The GRS will achieve global mapping of the elemental composition of the surface and determine the abundance of hydrogen in the shallow subsurface. The MARIE will characterize aspects of the near-space radiation environment with regards to the radiation-related risk to human explorers
2001-03-13
Workers in the Spacecraft Assembly and Encapsulation Facility (SAEF 2) reattach the solar panel on the 2001 Mars Odyssey Orbiter in order to conduct illumination testing. Scheduled for launch April 7, 2001, the orbiter contains three science instruments: THEMIS, the Gamma Ray Spectrometer (GRS), and the Mars Radiation Environment Experiment (MARIE). THEMIS will map the mineralogy and morphology of the Martian surface using a high-resolution camera and a thermal infrared imaging spectrometer. The GRS will achieve global mapping of the elemental composition of the surface and determine the abundance of hydrogen in the shallow subsurface. The MARIE will characterize aspects of the near-space radiation environment with regards to the radiation-related risk to human explorers
2001-03-13
In the Spacecraft Assembly and Encapsulation Facility (SAEF 2), workers get ready to open the panels of the solar array on the 2001 Mars Odyssey Orbiter in order to conduct illumination testing. Scheduled for launch April 7, 2001, the orbiter contains three science instruments: THEMIS, the Gamma Ray Spectrometer (GRS), and the Mars Radiation Environment Experiment (MARIE). THEMIS will map the mineralogy and morphology of the Martian surface using a high-resolution camera and a thermal infrared imaging spectrometer. The GRS will achieve global mapping of the elemental composition of the surface and determine the abundance of hydrogen in the shallow subsurface. The MARIE will characterize aspects of the near-space radiation environment with regards to the radiation-related risk to human explorers
2001-03-13
Workers in the Spacecraft Assembly and Encapsulation Facility (SAEF 2) reattach the solar panel on the 2001 Mars Odyssey Orbiter in order to conduct illumination testing. Scheduled for launch April 7, 2001, the orbiter contains three science instruments: THEMIS, the Gamma Ray Spectrometer (GRS), and the Mars Radiation Environment Experiment (MARIE). THEMIS will map the mineralogy and morphology of the Martian surface using a high-resolution camera and a thermal infrared imaging spectrometer. The GRS will achieve global mapping of the elemental composition of the surface and determine the abundance of hydrogen in the shallow subsurface. The MARIE will characterize aspects of the near-space radiation environment with regards to the radiation-related risk to human explorers
2001-03-13
In the Spacecraft Assembly and Encapsulation Facility (SAEF 2), workers stand back as the panels of the solar array on the 2001 Mars Odyssey Orbiter open. The array will undergo illumination testing. Scheduled for launch April 7, 2001, the orbiter contains three science instruments: THEMIS, the Gamma Ray Spectrometer (GRS), and the Mars Radiation Environment Experiment (MARIE). THEMIS will map the mineralogy and morphology of the Martian surface using a high-resolution camera and a thermal infrared imaging spectrometer. The GRS will achieve global mapping of the elemental composition of the surface and determine the abundance of hydrogen in the shallow subsurface. The MARIE will characterize aspects of the near-space radiation environment with regards to the radiation-related risk to human explorers
2001-03-13
A worker in the Spacecraft Assembly and Encapsulation Facility (SAEF 2) checks the underside of the extended solar array panels on the 2001 Mars Odyssey Orbiter. The array will undergo illumination testing. Scheduled for launch April 7, 2001, the orbiter contains three science instruments: THEMIS, the Gamma Ray Spectrometer (GRS), and the Mars Radiation Environment Experiment (MARIE). THEMIS will map the mineralogy and morphology of the Martian surface using a high-resolution camera and a thermal infrared imaging spectrometer. The GRS will achieve global mapping of the elemental composition of the surface and determine the abundance of hydrogen in the shallow subsurface. The MARIE will characterize aspects of the near-space radiation environment with regards to the radiation-related risk to human explorers
NASA Astrophysics Data System (ADS)
Higgins, M. L.; Lebofsky, L. A.; McCarthy, D. W.; Lebofsky, N.
2013-04-01
In 2003, the University of Arizona's (UA) NIRCam EPO team (NASA James Webb Space Telescope's Near-Infrared Camera) and the Girl Scouts of Southern Arizona began a long-term collaboration to bring STEM and astronomy activities and concepts to adult Girl Scout volunteers and staff and, in turn, their councils and girls, i.e., to train the trainers. Nationally, our goal is to reach adult volunteers and staff in all 112 councils. To date, this program has reached nearly 240 adults from 78 councils in 41 states, DC, Guam, and Japan, bringing together adult volunteers and staff, UA graduate students, and NIRCam scientists and educators to experience Arizona's dark skies.
With a cloudy horizon scene as a backdrop, the Spartan 207 free-flyer is held in the grasp of the
NASA Technical Reports Server (NTRS)
1996-01-01
STS-77 ESC VIEW --- With a cloudy horizon scene as a backdrop, the Spartan 207 free-flyer is held in the grasp of the Space Shuttle Endeavour's Remote Manipulator System (RMS) following its re-capture on May 21, 1996. The view was captured with an onboard Electronic Still Camera (ESC). The six-member crew has spent a portion of the early stages of the mission in various activities involving the Spartan 207 and the related Inflatable Antenna Experiment (IAE). The Spartan project is managed by NASA's Goddard Space Flight Center (GSFC) for NASA's Office of Space Science, Washington, D.C. GMT: 09:39:35.
2008-07-11
CAPE CANAVERAL, Fla. – In the Orbiter Processing Facility at NASA's Kennedy Space Center, STS-125 Mission Specialists Mike Massimino (left) and Michael Good (right) check out the orbiter boom sensor system and the attached camera in space shuttle Atlantis' payload bay. Equipment familiarization is part of the crew equipment interface test, which provides hands-on experience with hardware and equipment for the mission. Atlantis is targeted to launch Oct. 8 on the STS-125 mission to service the Hubble Space Telescope. The mission crew will perform history-making, on-orbit “surgery” on two important science instruments aboard the telescope. After capturing the telescope, two teams of spacewalking astronauts will perform the repairs during five planned spacewalks. Photo credit: NASA/Kim Shiflett
NASA Astrophysics Data System (ADS)
Peltoniemi, Mikko; Aurela, Mika; Böttcher, Kristin; Kolari, Pasi; Loehr, John; Karhu, Jouni; Linkosalmi, Maiju; Melih Tanis, Cemal; Tuovinen, Juha-Pekka; Nadir Arslan, Ali
2018-01-01
In recent years, monitoring of the status of ecosystems using low-cost web (IP) or time lapse cameras has received wide interest. With broad spatial coverage and high temporal resolution, networked cameras can provide information about snow cover and vegetation status, serve as ground truths to Earth observations and be useful for gap-filling of cloudy areas in Earth observation time series. Networked cameras can also play an important role in supplementing laborious phenological field surveys and citizen science projects, which also suffer from observer-dependent observation bias. We established a network of digital surveillance cameras for automated monitoring of phenological activity of vegetation and snow cover in the boreal ecosystems of Finland. Cameras were mounted at 14 sites, each site having 1-3 cameras. Here, we document the network, basic camera information and access to images in the permanent data repository (http://www.zenodo.org/communities/phenology_camera/). Individual DOI-referenced image time series consist of half-hourly images collected between 2014 and 2016 (https://doi.org/10.5281/zenodo.1066862). Additionally, we present an example of a colour index time series derived from images from two contrasting sites.
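A widely used colour index for camera networks of this kind is the green chromatic coordinate (GCC); the short sketch below shows how such a time series could be derived from the archived JPEG frames. The file pattern, region of interest, and choice of GCC are illustrative assumptions, not the authors' exact processing chain.

```python
# Sketch: green chromatic coordinate (GCC) time series from time-lapse RGB images.
# Assumes a directory of JPEGs whose sorted names give time order and a hand-picked
# region of interest (ROI); both are illustrative choices.
import glob
import numpy as np
from PIL import Image

def gcc(image_path, roi=(slice(200, 400), slice(100, 500))):
    rgb = np.asarray(Image.open(image_path), dtype=float)[roi]
    r, g, b = rgb[..., 0].mean(), rgb[..., 1].mean(), rgb[..., 2].mean()
    return g / (r + g + b)          # green chromatic coordinate

series = [(path, gcc(path)) for path in sorted(glob.glob("camera_site/*.jpg"))]
for path, value in series[:5]:
    print(path, round(value, 4))
```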
The Large Synoptic Survey Telescope (LSST) Camera
None
2018-06-13
Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy's SLAC National Accelerator Laboratory is leading the construction of the LSST camera, the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.
STS-99 Commander Kregel poses with EARTHKAM camera on OV-105's flight deck
2000-03-30
STS099-314-035 (11-22 February 2000) --- Astronaut Kevin R. Kregel, mission commander, works with camera equipment used for the EarthKAM project. The camera stayed busy throughout the 11-day mission, taking vertical imagery of Earth at points of opportunity for the project. Students across the United States and in France, Germany and Japan took photos throughout the STS-99 mission, and they are using these new photos, plus all the images already available in the EarthKAM system, to enhance their classroom learning in Earth and space science, social studies, geography, mathematics and more.
Stealth life detection instruments aboard Curiosity
NASA Astrophysics Data System (ADS)
Levin, Gilbert V.
2012-10-01
NASA has often stated (e.g., the MSL Science Corner) that its Mars Science Laboratory (MSL) "Curiosity" mission to Mars carries no life detection experiments. This is in keeping with NASA's 36-year explicit ban on such experiments, imposed immediately after the 1976 Viking Mission to Mars. The space agency attributes the ban to the "ambiguity" of that Mission's Labeled Release (LR) life detection experiment, fearing an adverse effect on the space program should a similar "inconclusive" result come from a new robotic quest. Yet, despite the NASA ban, this author, the Viking LR Experimenter, contends there are "stealth life detection instruments" aboard Curiosity. These are life detection instruments in the sense that they can free the Viking LR from the pall of ambiguity that has held it prisoner so long. Curiosity's stealth instruments are those seeking organic compounds, and the mission's high-resolution camera system. Results from any or all of these devices, coupled with the Viking LR data, can confirm the LR's life detection claim. In one possible scenario, Curiosity can, of itself, completely corroborate the finding of life on Mars. MSL has just successfully landed on Mars. Hopefully, its stealth confirmations of life will be reported shortly.
Speech versus manual control of camera functions during a telerobotic task
NASA Technical Reports Server (NTRS)
Bierschwale, John M.; Sampaio, Carlos E.; Stuart, Mark A.; Smith, Randy L.
1993-01-01
This investigation has evaluated the voice-commanded camera control concept. For this particular task, total voice control of continuous and discrete camera functions was significantly slower than manual control. There was no significant difference between voice and manual input for several types of errors. There was not a clear trend in subjective preference of camera command input modality. Task performance, in terms of both accuracy and speed, was very similar across both levels of experience.
Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera
NASA Astrophysics Data System (ADS)
Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.
2016-04-01
The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven color push-frame imager with a 90° field of view in monochrome mode and 60° field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California, and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793,000 NAC and 207,000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength-dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.
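The abstract mentions a wavelength-dependent radial distortion model; the sketch below shows the generic polynomial radial distortion correction that such camera models commonly build on. The coefficient values and the per-band dictionary are placeholders for illustration only, not the published LROC calibration.

```python
# Sketch: generic radial distortion correction of the kind used in frame/push-frame
# camera models. Coefficients below are placeholders, not calibrated LROC values.
import numpy as np

def undistort(x, y, k, xc=0.0, yc=0.0):
    """Map distorted focal-plane coords to ideal coords using r^2 and r^4 terms."""
    dx, dy = x - xc, y - yc
    r2 = dx * dx + dy * dy
    scale = 1.0 + k[0] * r2 + k[1] * r2 * r2
    return xc + dx * scale, yc + dy * scale

# Wavelength-dependent model: one coefficient set per band (values are made up).
k_per_band = {415: (-1.8e-5, 2.0e-9), 566: (-1.7e-5, 1.9e-9), 689: (-1.6e-5, 1.8e-9)}
print(undistort(10.0, -4.0, k_per_band[566]))
```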
Building and Deploying Remotely Operated Vehicles in the First-Year Experience
NASA Astrophysics Data System (ADS)
O'Brien-Gayes, A.; Fuss, K.; Gayes, P.
2007-12-01
Coastal Carolina University has committed to improving student retention and success in Mathematics and Science through a pilot program to engage first-year students in an applied and investigative project as part of the University's First-Year Experience (FYE). During the fall 2007 semester, five pilot sections of FYE classes, consisting of students from the College of Natural and Applied Sciences are building and deploying Remotely Operated Vehicles (ROVs). These ROV-based classes are designed to: accelerate exploration of the broad fields of science and mathematics; enlist interest in technology by engaging students in a multi-stepped, interdisciplinary problem solving experience; explore science and mathematical concepts; institute experiential learning; and build a culture of active learners to benefit student success across traditional departmental boundaries. Teams of three students (forty teams total) will build, based on the MIT Sea Perch design, and test ROVs in addition to collecting data with their ROVs. Various accessories attached to the vehicles for data collection will include temperature and light sensors, plankton nets and underwater cameras. The first-year students will then analyze the data, and the results will be documented as part of their capstone projects. Additionally, two launch days will take place on two campus ponds. Local middle and high school teachers and their students will be invited to observe this event. The teams of students with the most capable and successful ROVs will participate in a workshop held in November 2007 for regional elementary, middle and high school teachers. These students will give a presentation on the building of the ROVs and also provide a hands-on demonstration for the workshop participants. These activities will ensure an incorporation of service learning into the first semester of the freshmen experience. The desired outcomes of the ROV-based FYE classes are: increased retention at the postsecondary level in mathematics and science; increased student confidence to persevere through difficult courses by seeing the actual application of the science; greater self-esteem and self-efficacy through service learning; and engaging middle and high school students in mathematics and science. The innovative significance of the program is three fold: applying experiential learning through technology; integrating disciplines in a planned manner with consistent delivery; and creating an environment conducive to success.
NASA Technical Reports Server (NTRS)
1992-01-01
The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single-lens, reflex-viewing design with a 15-perforation-per-frame horizontal pull-across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.
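The quoted transport rate follows from the frame geometry: assuming the standard 65/70 mm perforation pitch of 0.187 inches (an assumption not stated in the text), 15 perforations per frame at 24 frames per second works out to roughly 336 ft/min, consistent with the figure above.

```python
# Sketch: check the quoted IMAX film transport rate from frame geometry.
# The 0.187-inch perforation pitch is the standard KS value and is assumed here.
PERF_PITCH_IN = 0.187      # inches per perforation (assumed)
PERFS_PER_FRAME = 15       # from the text (horizontal pull-across)
FRAME_RATE = 24            # frames per second

inches_per_second = PERF_PITCH_IN * PERFS_PER_FRAME * FRAME_RATE
feet_per_minute = inches_per_second * 60 / 12
print(round(feet_per_minute, 1))   # ~336.6 ft/min, matching the quoted 336 ft/min
```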
Electronic Still Camera view of Aft end of Wide Field/Planetary Camera in HST
1993-12-06
S61-E-015 (6 Dec 1993) --- A close-up view of the aft part of the new Wide Field/Planetary Camera (WFPC-II) installed on the Hubble Space Telescope (HST). WFPC-II was photographed with the Electronic Still Camera (ESC) from inside Endeavour's cabin as astronauts F. Story Musgrave and Jeffrey A. Hoffman moved it from its stowage position onto the giant telescope. Electronic still photography is a relatively new technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality. The electronic still camera has flown as an experiment on several other shuttle missions.
NASA Astrophysics Data System (ADS)
Duan, Yaxuan; Xu, Songbo; Yuan, Suochao; Chen, Yongquan; Li, Hongguang; Da, Zhengshang; Gao, Limin
2018-01-01
The ISO 12233 slanted-edge method suffers errors in camera modulation transfer function (MTF) measurement when the fast Fourier transform (FFT) is used, because tilt-angle errors in the knife edge produce nonuniform sampling of the edge spread function (ESF). To resolve this problem, a modified slanted-edge method using the nonuniform fast Fourier transform (NUFFT) for camera MTF measurement is proposed. Theoretical simulations for noisy images at different nonuniform sampling rates of the ESF are performed using the proposed modified slanted-edge method. It is shown that the proposed method successfully eliminates the error due to the nonuniform sampling of the ESF. An experimental setup for camera MTF measurement is established to verify the accuracy of the proposed method. The experimental results show that, under different nonuniform sampling rates of the ESF, the proposed modified slanted-edge method has improved accuracy for camera MTF measurement compared to the ISO 12233 slanted-edge method.
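For context, the conventional slanted-edge pipeline that this work modifies is roughly: project pixels onto an axis perpendicular to the edge to build an oversampled ESF, differentiate to obtain the line spread function (LSF), and Fourier transform to obtain the MTF. The sketch below implements that conventional, uniform-resampling baseline under the simplifying assumption that the edge angle and position are known; it does not reproduce the paper's NUFFT variant.

```python
# Sketch: baseline slanted-edge MTF estimate (uniform binning + FFT), the method
# the paper improves on. Assumes the edge passes through the image centre at a
# known angle; real implementations estimate the edge first.
import numpy as np

def slanted_edge_mtf(img, edge_angle_deg, oversample=4):
    rows, cols = img.shape
    x = np.arange(cols)[None, :].repeat(rows, axis=0).ravel()
    y = np.arange(rows)[:, None].repeat(cols, axis=1).ravel()
    theta = np.deg2rad(edge_angle_deg)
    # Signed distance of each pixel from the assumed edge line.
    dist = (x - cols / 2) * np.cos(theta) + (y - rows / 2) * np.sin(theta)
    # Bin distances onto a uniform, oversampled grid to form the ESF.
    bins = np.round(dist * oversample).astype(int)
    bins -= bins.min()
    esf = np.bincount(bins, weights=img.ravel()) / np.maximum(np.bincount(bins), 1)
    lsf = np.gradient(esf) * np.hanning(esf.size)   # differentiate and window
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]
```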
Comparison of three different techniques for camera and motion control of a teleoperated robot.
Doisy, Guillaume; Ronen, Adi; Edan, Yael
2017-01-01
This research aims to evaluate new methods for robot motion control and camera orientation control through the operator's head orientation in robot teleoperation tasks. Specifically, the use of head tracking in a non-invasive way, without immersive virtual reality devices, was combined and compared with classical control modes for robot movements and camera control. Three control conditions were tested: 1) a condition with classical joystick control of both the movements of the robot and the robot camera, 2) a condition where the robot movements were controlled by a joystick and the robot camera was controlled by the user's head orientation, and 3) a condition where the movements of the robot were controlled by hand gestures and the robot camera was controlled by the user's head orientation. Performance, workload metrics and their evolution as the participants gained experience with the system were evaluated in a series of experiments: for each participant, the metrics were recorded during four successive similar trials. Results show that the concept of robot camera control by user head orientation has the potential to improve the intuitiveness of robot teleoperation interfaces, specifically for novice users. However, more development is needed to reach a margin of progression comparable to a classical joystick interface.
Final Report for the Advanced Camera for Surveys (ACS)
NASA Technical Reports Server (NTRS)
2004-01-01
ACS was launched aboard the Space Shuttle Columbia just before dawn on March 1, 2002. At the time of liftoff, the Hubble Space Telescope (HST) was reflecting the early morning sun as it moved across the sky. After the shuttle successfully docked with HST, several components were replaced. One of the components was the Advanced Camera for Surveys, built by Ball Aerospace & Technologies Corp. (BATC) in Boulder, Colorado. Over the life of the HST contract at BATC, hundreds of employees had the pleasure of working on the concept, design, fabrication, assembly, and test of ACS. Those employees thank NASA Goddard Space Flight Center and the science team at Johns Hopkins University (JHU) for the opportunity to participate in building a great science instrument for HST.
The 360 Degree Fulldome Production "Clockwork Ocean"
NASA Astrophysics Data System (ADS)
Baschek, B.; Heinsohn, R.; Opitz, D.; Fischer, T.; Baschek, T.
2016-02-01
The investigation of submesoscale eddies and fronts is one of the leading oceanographic topics at the Ocean Sciences Meeting 2016. In order to observe these small and short-lived phenomena, planes equipped with high-resolution cameras and fast vessels were deployed during the Submesoscale Experiments (SubEx), leading to some of the first high-resolution observations of these eddies. In a future experiment, a zeppelin will be used for the first time in the marine sciences. The relevance of submesoscale processes for the oceans and the work of the eddy hunters are described in the fascinating 9-minute-long 360 degree fulldome production Clockwork Ocean. The fully animated movie is introduced in this presentation, taking the observer from the bioluminescence in the deep ocean to a view of our blue planet from space. The immersive medium is used to combine fascination for a yet unknown environment with scientific education of a broad audience. Detailed background information is available at the parallax website www.clockwork-ocean.com. The film is also available for virtual reality glasses and smartphones to reach a broader distribution. A unique mobile dome with an area of 70 m² and seats for 40 people is used for science education at events, festivals, for politicians and school classes. The spectators are also invited to participate in the experiments through 360 degree footage of the measurements. Clockwork Ocean premiered in July 2015 in Hamburg, Germany, and will be available worldwide in English and German as of fall 2015. Clockwork Ocean is a film of the Helmholtz-Zentrum Geesthacht, produced by Daniel Opitz and Ralph Heinsohn.
2014-04-18
CAPE CANAVERAL, Fla. - Remote-controlled and sound-activated cameras placed around the perimeter of the pad by media organizations capture images of the SpaceX Falcon 9 rocket as it rises off Space Launch Complex 40 at Cape Canaveral Air Force Station, sending the Dragon resupply spacecraft on its way to the International Space Station. Liftoff was during an instantaneous window at 3:25 p.m. EDT. Dragon is making its fourth trip to the space station. The SpaceX-3 mission, carrying almost 2.5 tons of supplies, technology and science experiments, is the third of 12 flights through a $1.6 billion NASA Commercial Resupply Services contract. Dragon's cargo will support more than 150 experiments that will be conducted during the station's Expeditions 39 and 40. For more information, visit http://www.nasa.gov/mission_pages/station/structure/launch/index.html. Photo credit: NASA/Tony Gray and Tim Terry
Confessions of an Accidental E/POnomer, from in Front of the Camera and Behind the Pixels
NASA Astrophysics Data System (ADS)
Durda, Daniel D.
2015-11-01
The various techniques, styles, and venues for popularizing the research outputs from our planetary science community reflect the diversity of the community itself. While some are eloquent public speakers or gifted writers, other colleagues are regularly sought for television appearances or for artwork to illustrate press releases. Whatever the method or medium, the collection of experiences we have had in ‘getting the word out’ represents a valuable resource pool of ideas and lessons learned from which we can all learn and improve. I will share some of my experiences from working with a number of television productions and magazine editors, with some thoughts on contemporary television production styles. I will also discuss life as both a planetary scientist and digital artist and share some resources that are ready, willing, and quite able to help tell the visual story of your research.
Characteristics of an Imaging Polarimeter for the Powell Observatory
NASA Astrophysics Data System (ADS)
Hall, Shannon; Henson, G.
2010-01-01
A dual-beam imaging polarimeter has been built for use on the 14-inch Schmidt-Cassegrain telescope at the ETSU Harry D. Powell Observatory. The polarimeter includes a rotating half-wave plate and a Wollaston prism to separate light into two orthogonal linearly polarized rays. A TEC-cooled CCD camera is used to detect the modulated polarized light. We present here measurements of the polarization of polarimetric standard stars. By measuring unpolarized and polarized standard stars we are able to establish the instrumental polarization and the efficiency of the instrument. The polarimeter will initially be used as a dedicated instrument in an ongoing project to monitor the eclipsing binary star Epsilon Aurigae. This project was funded by a partnership between the National Science Foundation (NSF AST-0552798), Research Experience for Undergraduates (REU), and the Department of Defense (DoD) ASSURE (Awards to Stimulate and Support Undergraduate Research Experiences) programs.
NASA Astrophysics Data System (ADS)
Sund, Per
2016-09-01
Science teachers regard practical work as important and many claim that it helps students to learn science. Besides theoretical knowledge, such as concepts and formulas, practical work is considered to be an integral and basic part of science education. As practical work is perceived and understood in different ways, comparing the results between classes and schools is difficult. One way of making the results comparable is to develop systematic inquiries to be assessed in national large-scale tests. However, introducing similar testing conditions in a laboratory environment is not always possible. Although the instructions and assessment guides for such tests are detailed, many obstacles need to be overcome if equality in the overall test situation is to be achieved. This empirical case study investigates two secondary school science teachers' assessments of 15-16 year-old students in three separate groups in the practical part of a Swedish national test in chemistry. Data are gathered using two video cameras and three pairs of spy-camera glasses. The results show that individual and independent assessments are difficult due to the social interactions that take place and the physical sources of error that occur in this type of setting.
JPRS Report, Science & Technology, Japan, 27th Aircraft Symposium
1990-10-29
screen; the relative attitude is then determined. 2) Video Sensor System: Specific patterns (grapple target, etc.) drawn on the target spacecraft, or the ... entire target spacecraft, is imaged by camera. Navigation information is obtained by on-board image processing, such as extraction of contours and ... standard figure called "grapple target" located in the vicinity of the grapple fixture on the target spacecraft is imaged by camera. Contour lines and
JPRS Report, Science & Technology, Japan, 4th Intelligent Robots Symposium, Volume 2
1989-03-16
accidents caused by strikes by robots, a quantitative model for safety evaluation, and evaluations of actual systems in order to contribute to ... Mobile Robot Position Referencing Using Map-Based Vision Systems ... Safety Evaluation of Man-Robot System ... Fuzzy Path Pattern of Automatic ... camera are made after the robot stops to prevent damage from occurring through obstacle interference. The position of the camera is indicated on the
Close Binary Star Speckle Interferometry on the McMath-Pierce 0.8-Meter Solar Telescope
NASA Astrophysics Data System (ADS)
Wiley, Edward; Harshaw, Richard; Jones, Gregory; Branston, Detrick; Boyce, Patrick; Rowe, David; Ridgely, John; Estrada, Reed; Genet, Russell
2015-09-01
Observations were made in April 2014 to assess the utility of the 0.8-meter solar telescope at the McMath-Pierce Solar Observatory at Kitt Peak National Observatory for performing speckle interferometry observations of close binary stars. Several configurations using science cameras, acquisition cameras, eyepieces, and flip mirrors were evaluated. Speckle images were obtained and recommendations for further improvement of the acquisition system are presented.
NASA Astrophysics Data System (ADS)
McIntosh, Benjamin Patrick
Blindness due to Age-Related Macular Degeneration and Retinitis Pigmentosa is unfortunately both widespread and largely incurable. Advances in visual prostheses that can restore functional vision in those afflicted by these diseases have evolved rapidly from new areas of research in ophthalmology and biomedical engineering. This thesis is focused on further advancing the state-of-the-art of both visual prostheses and implantable biomedical devices. A novel real-time system with a high performance head-mounted display is described that enables enhanced realistic simulation of intraocular retinal prostheses. A set of visual psychophysics experiments is presented using the visual prosthesis simulator that quantify, in several ways, the benefit of foveation afforded by an eye-pointed camera (such as an eye-tracked extraocular camera or an implantable intraocular camera) as compared with a head-pointed camera. A visual search experiment demonstrates a significant improvement in the time to locate a target on a screen when using an eye-pointed camera. A reach and grasp experiment demonstrates a 20% to 70% improvement in time to grasp an object when using an eye-pointed camera, with the improvement maximized when the percept is blurred. A navigation and mobility experiment shows a 10% faster walking speed and a 50% better ability to avoid obstacles when using an eye-pointed camera. Improvements to implantable biomedical devices are also described, including the design and testing of VLSI-integrable positive mobile ion contamination sensors and humidity sensors that can validate the hermeticity of biomedical device packages encapsulated by hermetic coatings, and can provide early warning of leaks or contamination that may jeopardize the implant. The positive mobile ion contamination sensors are shown to be sensitive to externally applied contamination. A model is proposed to describe sensitivity as a function of device geometry, and verified experimentally. Guidelines are provided on the use of spare CMOS oxide and metal layers to maximize the hermeticity of an implantable microchip. In addition, results are presented on the design and testing of small form factor, very low power, integrated CMOS clock generation circuits that are stable enough to drive commercial image sensor arrays, and therefore can be incorporated in an intraocular camera for retinal prostheses.
Development of a 300,000-pixel ultrahigh-speed high-sensitivity CCD
NASA Astrophysics Data System (ADS)
Ohtake, H.; Hayashida, T.; Kitamura, K.; Arai, T.; Yonai, J.; Tanioka, K.; Maruyama, H.; Etoh, T. Goji; Poggemann, D.; Ruckelshausen, A.; van Kuijk, H.; Bosiers, Jan T.
2006-02-01
We are developing an ultrahigh-speed, high-sensitivity broadcast camera that is capable of capturing clear, smooth slow-motion videos even where lighting is limited, such as at professional baseball games played at night. In earlier work, we developed an ultrahigh-speed broadcast color camera1) using three 80,000-pixel ultrahigh-speed, highsensitivity CCDs2). This camera had about ten times the sensitivity of standard high-speed cameras, and enabled an entirely new style of presentation for sports broadcasts and science programs. Most notably, increasing the pixel count is crucially important for applying ultrahigh-speed, high-sensitivity CCDs to HDTV broadcasting. This paper provides a summary of our experimental development aimed at improving the resolution of CCD even further: a new ultrahigh-speed high-sensitivity CCD that increases the pixel count four-fold to 300,000 pixels.
Expedition One CDR Shepherd with IMAX camera
2001-02-11
STS98-E-5164 (11 February 2001) --- Astronaut William M. (Bill) Shepherd documents activity onboard the newly attached Destiny laboratory using an IMAX motion picture camera. The crews of Atlantis and the International Space Station on February 11 opened the Destiny laboratory and spent the first full day of what are planned to be years of work ahead inside the orbiting science and command center. Shepherd opened the Destiny hatch, and he and Shuttle commander Kenneth D. Cockrell ventured inside at 8:38 a.m. (CST). Members of both crews went to work quickly inside the new module, activating air systems, fire extinguishers, alarm systems, computers and internal communications. The crew also continued equipment transfers from the shuttle to the station and filmed several scenes onboard the station using an IMAX camera. This scene was recorded with a digital still camera.
Operation and Performance of the Mars Exploration Rover Imaging System on the Martian Surface
NASA Technical Reports Server (NTRS)
Maki, Justin N.; Litwin, Todd; Herkenhoff, Ken
2005-01-01
This slide presentation details the Mars Exploration Rover (MER) imaging system. Over 144,000 images have been gathered from all Mars missions, with 83.5% of them gathered by MER. Each rover has 9 cameras (Navcam, front and rear Hazcams, Pancam, Microscopic Imager, Descent Camera, engineering cameras, science cameras) and produces 1024 x 1024 (1 megapixel) images in the same format. All onboard image processing code is implemented in flight software and includes extensive processing capabilities such as autoexposure, flat-field correction, image orientation, thumbnail generation, subframing, and image compression. Ground image processing is done at the Jet Propulsion Laboratory's Multimission Image Processing Laboratory using Video Image Communication and Retrieval (VICAR), while stereo processing of left/right pairs provides radiometrically corrected images, solar energy maps, triangulation (Cartesian 3-space), and slope maps from the raw images.
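Of the onboard operations listed, flat-field correction and thumbnail generation are the most self-contained; the sketch below illustrates those two steps in generic form. Array sizes, calibration frames, and the block-averaging thumbnail scheme are illustrative assumptions, not details of the MER flight software.

```python
# Sketch: two of the onboard operations named above, flat-field correction and
# thumbnail generation, in generic form. Nothing here reflects the actual MER
# flight software implementation.
import numpy as np

def flat_field_correct(raw, dark, flat):
    """Remove fixed-pattern response: (raw - dark) divided by the normalised flat."""
    flat_norm = (flat - dark) / np.mean(flat - dark)
    return (raw - dark) / np.clip(flat_norm, 1e-6, None)

def thumbnail(image, factor=8):
    """Downsample by averaging factor x factor pixel blocks."""
    h, w = (image.shape[0] // factor) * factor, (image.shape[1] // factor) * factor
    return image[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

frame = np.random.poisson(500, (1024, 1024)).astype(float)   # stand-in for a 1-Mpixel frame
corrected = flat_field_correct(frame, dark=np.full_like(frame, 20.0), flat=np.full_like(frame, 900.0))
print(thumbnail(corrected).shape)   # (128, 128)
```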
Overhead View of Area Surrounding Pathfinder
NASA Technical Reports Server (NTRS)
1997-01-01
Overhead view of the area surrounding the Pathfinder lander illustrating the Sojourner traverse. Red rectangles are rover positions at the end of sols 1-30. Locations of soil mechanics experiments, wheel abrasion experiments, and APXS measurements are shown. The A numbers refer to APXS measurements as discussed in the paper by Rieder et al. (p. 1770, Science Magazine, see image note). Coordinates are given in the LL frame.
The photorealistic, interactive, three-dimensional virtual reality (VR) terrain models were created from IMP images using a software package developed for Pathfinder by C. Stoker et al. as a participating science project. By matching features in the left and right camera, an automated machine vision algorithm produced dense range maps of the nearfield, which were projected into a three-dimensional model as a connected polygonal mesh. Distance and angle measurements can be made on features viewed in the model using a mouse-driven three-dimensional cursor and a point-and-click interface. The VR model also incorporates graphical representations of the lander and rover and the sequence and spatial locations at which rover data were taken. As the rover moved, graphical models of the rover were added for each position that could be uniquely determined using stereo images of the rover taken by the IMP. Images taken by the rover were projected into the model as two-dimensional 'billboards' to show the proper perspective of these images. NOTE: original caption as published in Science Magazine. Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech).
NASA Astrophysics Data System (ADS)
Tucker, G. E.
1997-05-01
This NSF supported program, emphasizing hands-on learning and observation with modern instruments, is described in its pilot phase, prior to being launched nationally. A group of 14-year-old students are using a small (21 cm) computer-controlled telescope and CCD camera to do: (1) a 'sky survey' of brighter celestial objects, finding, identifying, and learning about them, and accumulating a portfolio of images, (2) photometry of variable stars, reducing the data to get a light curve, and (3) learn modern computer-based communication/dissemination skills by posting images and data to a Web site they are designing (http://www.javanet.com/ sky) and contributing data to archives (e.g. AAVSO) via the Internet. To attract more interest to astronomy and science in general and have a wider impact on the school and surrounding community, peer teaching is used as a pedagogical technique and families are encouraged to participate. Students teach astronomy, software and computers, the Internet, instrumentation, and observing to other students, parents and the community by means of daytime presentations of their results (images and data) and evening public viewing at the telescope, operating the equipment themselves. Students can contribute scientifically significant data and experience the 'discovery' aspect of science through observing projects where a measurement is made. Their 'informal education' activities also help improve the perception of science in general and astronomy in particular in society at large. This program could benefit from collaboration with astronomers wanting to organize geographically distributed observing campaigns coordinated over the Internet and willing to advise on promising observational programs for small telescopes in the context of current science.
A modular positron camera for the study of industrial processes
NASA Astrophysics Data System (ADS)
Leadbeater, T. W.; Parker, D. J.
2011-10-01
Positron imaging techniques rely on the detection of the back-to-back annihilation photons arising from positron decay within the system under study. A standard technique, called positron emitting particle tracking (PEPT) [1], uses a number of these detected events to rapidly determine the position of a positron emitting tracer particle introduced into the system under study. Typical applications of PEPT are in the study of granular and multi-phase materials in the disciplines of engineering and the physical sciences. Using components from redundant medical PET scanners a modular positron camera has been developed. This camera consists of a number of small independent detector modules, which can be arranged in custom geometries tailored towards the application in question. The flexibility of the modular camera geometry allows for high photon detection efficiency within specific regions of interest, the ability to study large and bulky systems and the application of PEPT to difficult or remote processes as the camera is inherently transportable.
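The PEPT principle described above, locating a tracer from many back-to-back photon events, can be illustrated as a least-squares intersection of lines of response. The sketch below is a toy version under stated assumptions: it takes each event as an ideal line (a point and direction) and ignores the iterative rejection of corrupted events that practical PEPT algorithms apply.

```python
# Sketch: toy PEPT-style location estimate. Each detected event defines a line of
# response (point p on the line, unit direction d); the tracer position is taken as
# the point minimising the summed squared distance to all lines. Real PEPT also
# iteratively discards scattered/corrupt events, which is omitted here.
import numpy as np

def locate(points, directions):
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)     # projector onto the plane normal to d
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Synthetic events: lines passing close to a tracer at (0.1, -0.2, 0.05) m.
rng = np.random.default_rng(0)
true = np.array([0.1, -0.2, 0.05])
dirs = rng.normal(size=(200, 3))
pts = true + 0.002 * rng.normal(size=(200, 3))   # points on each line, near the tracer
print(locate(pts, dirs))                         # ~ the true tracer position
```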
The status of MUSIC: the multiwavelength sub-millimeter inductance camera
NASA Astrophysics Data System (ADS)
Sayers, Jack; Bockstiegel, Clint; Brugger, Spencer; Czakon, Nicole G.; Day, Peter K.; Downes, Thomas P.; Duan, Ran P.; Gao, Jiansong; Gill, Amandeep K.; Glenn, Jason; Golwala, Sunil R.; Hollister, Matthew I.; Lam, Albert; LeDuc, Henry G.; Maloney, Philip R.; Mazin, Benjamin A.; McHugh, Sean G.; Miller, David A.; Mroczkowski, Anthony K.; Noroozian, Omid; Nguyen, Hien Trong; Schlaerth, James A.; Siegel, Seth R.; Vayonakis, Anastasios; Wilson, Philip R.; Zmuidzinas, Jonas
2014-08-01
The Multiwavelength Sub/millimeter Inductance Camera (MUSIC) is a four-band photometric imaging camera operating from the Caltech Submillimeter Observatory (CSO). MUSIC is designed to utilize 2304 microwave kinetic inductance detectors (MKIDs), with 576 MKIDs for each observing band centered on 150, 230, 290, and 350 GHz. MUSIC's field of view (FOV) is 14' square, and the point-spread functions (PSFs) in the four observing bands have 45'', 31'', 25'', and 22'' full-widths at half maximum (FWHM). The camera was installed in April 2012 with 25% of its nominal detector count in each band, and has subsequently completed three short sets of engineering observations and one longer duration set of early science observations. Recent results from on-sky characterization of the instrument during these observing runs are presented, including achieved map- based sensitivities from deep integrations, along with results from lab-based measurements made during the same period. In addition, recent upgrades to MUSIC, which are expected to significantly improve the sensitivity of the camera, are described.
An Inexpensive Digital Infrared Camera
ERIC Educational Resources Information Center
Mills, Allan
2012-01-01
Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)
Polarization Observations of the Total Solar Eclipse of August 21, 2017
NASA Astrophysics Data System (ADS)
Burkepile, J.; Boll, A.; Casini, R.; de Toma, G.; Elmore, D. F.; Gibson, K. L.; Judge, P. G.; Mitchell, A. M.; Penn, M.; Sewell, S. D.; Tomczyk, S.; Yanamandra-Fisher, P. A.
2017-12-01
A total solar eclipse offers ideal sky conditions for viewing the solar corona. Light from the corona is composed of three components: the E-corona, made up of spectral emission lines produced by ionized elements in the corona; the K-corona, produced by photospheric light that is Thomson scattered by coronal electrons; and the F-corona, produced by sunlight scattered from dust particles in the near Sun environment and in interplanetary space. Polarized white light observations of the corona provide a way of isolating the K-corona to determine its structure, brightness, and density. This work focuses on broadband white light polarization observations of the corona during the upcoming solar eclipse from three different instruments. We compare coronal polarization brightness observations of the August 21, 2017 total solar eclipse from the NCAR/High Altitude Observatory (HAO) Rosetta Stone experiment using the 4-D Technology PolarCam camera with the two Citizen PACA_CATE17Pol telescopes that will acquire linear polarization observations of the eclipse and the NCAR/HAO K-Cor white light coronagraph observations from the Mauna Loa Solar Observatory in Hawaii. This comparison includes a discussion of the cross-calibration of the different instruments and reports the results of the coronal polarization brightness and electron density of the corona. These observations will be compared with results from previous coronal measurements taken at different phases of the solar cycle. In addition, we report on the performance of the three different polarimeters. The 4-D PolarCam uses a linear polarizer array, PACA_CATE17Pol uses a nematic liquid crystal retarder in a single beam configuration and K-Cor uses a pair of ferroelectric liquid crystal retarders in a dual-beam configuration. The use of the 4-D PolarCam camera in the Rosetta Stone experiment is to demonstrate the technology for acquiring high cadence polarization measurements. The Rosetta Stone experiment is funded through the NASA award NNH16ZDA001N-ISE. The Citizen Science approach to measuring the polarized solar corona during the eclipse is funded through NASA award NNX17AH76G. The NCAR Mauna Loa Solar Observatory is funded by the National Science Foundation.
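The abstract notes that the 4-D PolarCam uses a linear polarizer array; assuming the common 0°/45°/90°/135° micropolarizer layout, total intensity and polarization brightness follow from the textbook Stokes reduction sketched below. This is a generic illustration, not the calibrated pipeline of PolarCam, PACA_CATE17Pol, or K-Cor.

```python
# Sketch: polarization brightness (pB) from four polarizer-angle images, assuming a
# 0/45/90/135-degree layout. Textbook Stokes reduction, for illustration only.
import numpy as np

def polarization_brightness(i0, i45, i90, i135):
    I = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    Q = i0 - i90
    U = i45 - i135
    pB = np.sqrt(Q**2 + U**2)           # polarized (K-corona-dominated) brightness
    return I, pB

frames = [np.random.rand(256, 256) for _ in range(4)]   # stand-in images
I, pB = polarization_brightness(*frames)
print(float(pB.mean() / I.mean()))      # mean degree of linear polarization
```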
NASA Astrophysics Data System (ADS)
Sakon, I.; Onaka, T.; Kataza, H.; Wada, T.; Sarugaku, Y.; Matsuhara, H.; Nakagawa, T.; Kobayashi, N.; Kemper, C.; Ohyama, Y.; Matsumoto, T.; Seok, J. Y.
Mid-Infrared Camera and Spectrometers (MCS) is one of the focal-plane instruments proposed for the SPICA mission in the pre-project phase. SPICA MCS is equipped with two spectrometers with different spectral resolving powers (R = λ/δλ): a medium-resolution spectrometer (MRS), which covers 12-38 µm with R ≃ 1100-3000, and a high-resolution spectrometer (HRS), which covers 12-18 µm with R ≃ 30000. MCS is also equipped with a Wide Field Camera (WFC), which is capable of performing multi-object grism spectroscopy in addition to imaging observations. A small slit aperture for low-resolution slit spectroscopy is planned to be placed just next to the field-of-view (FOV) aperture for imaging and slit-less spectroscopic observations. MCS covers an important part of the core spectral range of SPICA and, complementary with SAFARI (SpicA FAR-infrared Instrument), can make crucial observations for a number of key science cases to revolutionize our understanding of the lifecycle of dust in the universe. In this article, the latest design specification and the expected performance of SPICA/MCS are introduced. Key science cases that should be targeted by SPICA/MCS have been discussed by the MCS science working group. Among such science cases, some of those related to dust science are briefly introduced.
ERIC Educational Resources Information Center
Donnelly, Laura
2007-01-01
When teaching science to kids, a visual approach is good. Humor is also good. And blowing things up is really, really good. At least that is what educators at the Exploratorium in San Francisco have found in the nine years since the museum began producing a live, off-the-cuff competition called Iron Science Teacher. Modeled after the Japanese cult…
Taking on the Heat--A Narrative Account of How Infrared Cameras Invite Instant Inquiry
ERIC Educational Resources Information Center
Haglund, Jesper; Jeppsson, Fredrik; Schönborn, Konrad J.
2016-01-01
Integration of technology, social learning and scientific models offers pedagogical opportunities for science education. A particularly interesting area is thermal science, where students often struggle with abstract concepts, such as heat. In taking on this conceptual obstacle, we explore how hand-held infrared (IR) visualization technology can…
The Force of Multimedia Slide Shows
ERIC Educational Resources Information Center
Santangelo, Darcy; Guy, Mark
2004-01-01
Many teachers look for a creative and engaging way to bring physical science topics of force and motion to life for their students. In this project, fourth-grade students weren't "forced" to investigate physical science topics--they were thrilled to! With the help of various technology tools--digital cameras, the Internet, computers, and…
JWST NIRCam Time Series Observations
NASA Technical Reports Server (NTRS)
Greene, Tom; Schlawin, E.
2017-01-01
We explain how to make time-series observations with the Near-Infrared Camera (NIRCam) science instrument of the James Webb Space Telescope. Both photometric and spectroscopic observations are described. We present the basic capabilities and performance of NIRCam and show examples of how to set its observing parameters using the Space Telescope Science Institute's Astronomer's Proposal Tool (APT).
NASA Technical Reports Server (NTRS)
Soffen, G.
1976-01-01
The paper reviews Viking injection into Mars orbit, the landing, and the Orbiter. The following Viking investigations are discussed: the search for life (photosynthetic analysis, metabolic analysis, and respiration), molecular analysis, inorganic chemistry, water detection, thermal mapping, radio science, and physical and seismic characteristics. Also considered are the imaging system, the lander camera, entry science, and Mars weather.
Infrared Imaging Camera Final Report CRADA No. TC02061.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roos, E. V.; Nebeker, S.
This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and Cordin Company (Cordin) to enhance the U.S. ability to develop a commercial infrared camera capable of capturing high-resolution images in a 100-nanosecond (ns) time frame. The Department of Energy (DOE), under an Initiative for Proliferation Prevention (IPP) project, funded the Russian Federation Nuclear Center All-Russian Scientific Institute of Experimental Physics (RFNC-VNIIEF) in Sarov. VNIIEF was funded to develop a prototype commercial infrared (IR) framing camera and to deliver a prototype IR camera to LLNL. LLNL and Cordin were partners with VNIIEF on this project. A prototype IR camera was delivered by VNIIEF to LLNL in December 2006. In June of 2007, LLNL and Cordin evaluated the camera, and the test results revealed that the camera exceeded presently available commercial IR cameras. Cordin believes that the camera can be sold on the international market. The camera is currently being used as a scientific tool within Russian nuclear centers. This project was originally designated as a two-year project. The project was not started on time due to changes in the IPP project funding conditions; the project funding was re-directed through the International Science and Technology Center (ISTC), which delayed the project start by over one year. The project was not completed on schedule due to changes within the Russian government export regulations. These changes were directed by export control regulations on the export of high-technology items that can be used to develop military weapons. The IR camera was on the list that export controls required. The ISTC and Russian government, after negotiations, allowed the delivery of the camera to LLNL. There were no significant technical or business changes to the original project.
LIFTING THE VEIL OF DUST TO REVEAL THE SECRETS OF SPIRAL GALAXIES
NASA Technical Reports Server (NTRS)
2002-01-01
Astronomers have combined information from the NASA Hubble Space Telescope's visible- and infrared-light cameras to show the hearts of four spiral galaxies peppered with ancient populations of stars. The top row of pictures, taken by a ground-based telescope, represents complete views of each galaxy. The blue boxes outline the regions observed by the Hubble telescope. The bottom row represents composite pictures from Hubble's visible- and infrared-light cameras, the Wide Field and Planetary Camera 2 (WFPC2) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS). Astronomers combined views from both cameras to obtain the true ages of the stars surrounding each galaxy's bulge. The Hubble telescope's sharper resolution allows astronomers to study the intricate structure of a galaxy's core. The galaxies are ordered by the size of their bulges. NGC 5838, an 'S0' galaxy, is dominated by a large bulge and has no visible spiral arms; NGC 7537, an 'Sbc' galaxy, has a small bulge and loosely wound spiral arms. Astronomers think that the structure of NGC 7537 is very similar to our Milky Way. The galaxy images are composites made from WFPC2 images taken with blue (4445 Angstroms) and red (8269 Angstroms) filters, and NICMOS images taken in the infrared (16,000 Angstroms). They were taken in June, July, and August of 1997. Credits for the ground-based images: Allan Sandage (The Observatories of the Carnegie Institution of Washington) and John Bedke (Computer Sciences Corporation and the Space Telescope Science Institute) Credits for WFPC2 and NICMOS composites: NASA, ESA, and Reynier Peletier (University of Nottingham, United Kingdom)
A detailed comparison of single-camera light-field PIV and tomographic PIV
NASA Astrophysics Data System (ADS)
Shi, Shengxian; Ding, Junfei; Atkinson, Callum; Soria, Julio; New, T. H.
2018-03-01
This paper conducts a comprehensive comparison between single-camera light-field particle image velocimetry (LF-PIV) and multi-camera tomographic particle image velocimetry (Tomo-PIV). Simulation studies were first performed using synthetic light-field and tomographic particle images, which extensively examine the difference between these two techniques by varying key parameters such as the pixel-to-microlens ratio (PMR), the light-field-camera-to-Tomo-camera pixel ratio (LTPR), particle seeding density and tomographic camera number. Simulation results indicate that single-camera LF-PIV can achieve accuracy consistent with that of multi-camera Tomo-PIV, but requires the use of a greater overall number of pixels. Experimental studies were then conducted by simultaneously measuring a low-speed jet flow with single-camera LF-PIV and four-camera Tomo-PIV systems. Experiments confirm that, given a sufficiently high pixel resolution, a single-camera LF-PIV system can indeed deliver volumetric velocity field measurements for an equivalent field of view with a spatial resolution commensurate with those of a multi-camera Tomo-PIV system, enabling accurate 3D measurements in applications where optical access is limited.
Dropping In a Microgravity Environment (DIME) Contest
NASA Technical Reports Server (NTRS)
2001-01-01
The first NASA Dropping In a Microgravity Environment (DIME) student competition pilot project came to a conclusion at the Glenn Research Center in April 2001. The competition involved high-school student teams who developed the concept for a microgravity experiment and prepared an experiment proposal. The two student teams - COSI Academy, sponsored by the Columbus Center of Science and Industry, and another team from Cincinnati, Ohio's Sycamore High School - designed a microgravity experiment, fabricated the experimental apparatus, and visited NASA Glenn to operate their experiment in the 2.2 Second Drop Tower. Also pictured are the NASA and contractor personnel who conducted the DIME activity with the students. Shown (L-R) are: Daniel Dietrich (NASA; mentor for the Sycamore High School team), Carol Hodanbosi (National Center for Microgravity Research; DIME staff), Jose Carrion (GRC Akima; drop tower technician), Dennis Stocker (NASA; DIME staff), Richard DeLombard (NASA; DIME staff), Sandi Thompson (NCMR sabbatical teacher; DIME staff), Peter Sunderland (NCMR; mentor for the COSI Academy student team), Adam Malcolm (NASA co-op student; DIME staff). This image is from a digital still camera; higher resolution is not available.
Shock tube Multiphase Experiments
NASA Astrophysics Data System (ADS)
Middlebrooks, John; Allen, Roy; Paudel, Manoj; Young, Calvin; Musick, Ben; McFarland, Jacob
2017-11-01
Shock-driven multiphase instabilities (SDMI) are unique physical phenomena that have far-reaching practical applications in engineering and science. The instability is present in high energy explosions, scramjet combustors, and supernova events. The SDMI arises when a multiphase interface is impulsively accelerated by the passage of a shockwave. It is similar in development to the Richtmyer-Meshkov (RM) instability; however, particle-to-gas coupling is the driving mechanism of the SDMI. As particle effects such as lag and phase change become more prominent, the SDMI's development begins to deviate significantly from the RM instability. We have developed an experiment for studying the SDMI in our shock tube facility. In our experiments, a multiphase interface is created using a laminar jet and flowed into the shock tube, where it is accelerated by the passage of a planar shockwave. The interface development is captured using CCD cameras synchronized with planar laser illumination. This talk will give an overview of new experiments conducted to examine the development of a shocked cylindrical multiphase interface. The effects of Atwood number, particle size, and a second acceleration (reshock) of the interface will be discussed.
The Mars Hand Lens Imager (MAHLI) for the 2009 Mars Science Laboratory
NASA Technical Reports Server (NTRS)
Edgett, K. S.; Bell, J. F., III; Herkenhoff, K. E.; Heydari, E.; Kah, L. C.; Minitti, M. E.; Olson, T. S.; Rowland, S. K.; Schieber, J.; Sullivan, R. J.
2005-01-01
The MArs Hand Lens Imager (MAHLI) is a small, RGB-color camera designed to examine geologic material at 12.5-75 microns/pixel resolution at the Mars Science Laboratory (MSL) landing site. MAHLI is a PI-led investigation competitively selected by NASA in December 2004 as part of the science payload for the MSL rover launching in 2009. The instrument is being fabricated by, and will be operated by, Malin Space Science Systems of San Diego, California.
NASA Astrophysics Data System (ADS)
Mandell, Avi M.; Groff, Tyler D.; Gong, Qian; Rizzo, Maxime J.; Lupu, Roxana; Zimmerman, Neil T.; Saxena, Prabal; McElwain, Michael W.
2017-09-01
One of the key science goals of the Coronagraph Instrument (CGI) on the WFIRST mission is to spectrally characterize the atmospheres of planets around other stars at extremely high contrast levels. To achieve this goal, the CGI instrument will include an integral field spectrograph (IFS) as one of the two science cameras. We present the current science requirements that pertain to the IFS design, describe how our design implementation flows from these requirements, and outline our current instrument design.
NASA Technical Reports Server (NTRS)
Mandell, Avi M.; Groff, Tyler D.; Gong, Qian; Rizzo, Maxime J.; Lupu, Roxana; Zimmerman, Neil T.; Saxena, Prabal; McElwain, Michael W.
2017-01-01
One of the key science goals of the Coronagraph Instrument (CGI) on the WFIRST mission is to spectrally characterize the atmospheres of planets around other stars at extremely high contrast levels. To achieve this goal, the CGI instrument will include an integral field spectrograph (IFS) as one of the two science cameras. We present the current science requirements that pertain to the IFS design, describe how our design implementation flows from these requirements, and outline our current instrument design.
High spatial resolution infrared camera as ISS external experiment
NASA Astrophysics Data System (ADS)
Eckehard, Lorenz; Frerker, Hap; Fitch, Robert Alan
A high-spatial-resolution infrared camera as an ISS external experiment for monitoring global climate changes uses ISS internal and external resources (e.g., data storage). The optical experiment will consist of an infrared camera for monitoring global climate changes from the ISS. This technology was evaluated by the German small satellite mission BIRD and further developed in different ESA projects. Compared to BIRD, the presented instrument uses proven, advanced sensor technologies (ISS external) and ISS on-board processing and storage capabilities (internal). The instrument will be equipped with a serial interface for TM/TC and several relay commands for the power supply. For data processing and storage a mass memory is required. Access to current attitude data is highly desired to produce geo-referenced maps, if possible by on-board processing.
NASA Astrophysics Data System (ADS)
Shao, Xinxing; Zhu, Feipeng; Su, Zhilong; Dai, Xiangjun; Chen, Zhenning; He, Xiaoyuan
2018-03-01
The strain errors in stereo digital image correlation (stereo-DIC) due to camera calibration were investigated using precisely controlled numerical experiments and real experiments. Three-dimensional rigid body motion tests were conducted to examine the effects of camera calibration on the measured results. For a fully accurate calibration, rigid body motion causes negligible strain errors. However, for inaccurately calibrated camera parameters and a short working distance, rigid body motion will lead to strain errors of more than 50 με, which significantly affect the measurement. In practical measurements it is impossible to obtain a fully accurate calibration; therefore, considerable attention should be paid to avoiding these types of errors, especially for high-accuracy strain measurements. It is necessary to avoid large rigid body motions in both two-dimensional DIC and stereo-DIC.
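The scale of the reported errors can be illustrated with a one-line estimate: if calibration bias makes the reconstructed positions of two points drift apart by a few micrometres over a typical gauge length during a purely rigid motion, an apparent strain of tens of microstrain results. The numbers in the sketch below are assumed for illustration and are chosen only to reproduce the order of magnitude quoted above.

```python
# Sketch: order-of-magnitude check of calibration-induced apparent strain.
# Numbers are assumed for illustration, not taken from the paper's experiments.
gauge_length_mm = 100.0    # distance between the two points considered
position_bias_um = 5.0     # differential reconstruction error after a rigid motion

apparent_strain = (position_bias_um * 1e-3) / gauge_length_mm
print(f"{apparent_strain * 1e6:.0f} microstrain")   # 50 microstrain, the scale reported
```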
Real-Time On-Board Processing Validation of MSPI Ground Camera Images
NASA Technical Reports Server (NTRS)
Pingree, Paula J.; Werne, Thomas A.; Bekker, Dmitriy L.
2010-01-01
The Earth Sciences Decadal Survey identifies a multiangle, multispectral, high-accuracy polarization imager as one requirement for the Aerosol-Cloud-Ecosystem (ACE) mission. JPL has been developing a Multiangle SpectroPolarimetric Imager (MSPI) as a candidate to fill this need. A key technology development needed for MSPI is on-board signal processing to calculate polarimetry data as imaged by each of the 9 cameras forming the instrument. With funding from NASA's Advanced Information Systems Technology (AIST) Program, JPL is solving the real-time data processing requirements to demonstrate, for the first time, how signal data at 95 Mbytes/sec over 16 channels for each of the 9 multiangle cameras in the spaceborne instrument can be reduced on board to 0.45 Mbytes/sec. This will produce the intensity and polarization data needed to characterize aerosol and cloud microphysical properties. Using the Xilinx Virtex-5 FPGA, including PowerPC440 processors, we have implemented a least squares fitting algorithm that extracts intensity and polarimetric parameters in real time, thereby substantially reducing the image data volume for spacecraft downlink without loss of science information.
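The on-board reduction is described as a least-squares fit that turns raw modulated samples into intensity and polarization parameters per pixel. The sketch below shows a generic linear least-squares fit of that kind; the simple I + Q·cos + U·sin modulation model is a placeholder assumption, since MSPI's actual photoelastic-modulator signal model is more involved.

```python
# Sketch: generic linear least-squares extraction of intensity/polarization
# parameters from modulated samples, the kind of per-pixel fit done on the FPGA.
# The I + Q*cos(2θ) + U*sin(2θ) model is a placeholder, not MSPI's actual model.
import numpy as np

theta = np.linspace(0, np.pi, 16, endpoint=False)     # 16 samples per fit window
design = np.column_stack([np.ones_like(theta), np.cos(2 * theta), np.sin(2 * theta)])

def fit_iqu(samples):
    coeffs, *_ = np.linalg.lstsq(design, samples, rcond=None)
    return coeffs                                       # [I, Q, U]

true_I, true_Q, true_U = 1000.0, 30.0, -12.0
samples = design @ np.array([true_I, true_Q, true_U]) + np.random.normal(0, 1.0, theta.size)
print(fit_iqu(samples))                                 # ~ [1000, 30, -12]
```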
General Model of Photon-Pair Detection with an Image Sensor
NASA Astrophysics Data System (ADS)
Defienne, Hugo; Reichert, Matthew; Fleischer, Jason W.
2018-05-01
We develop an analytic model that relates intensity correlation measurements performed by an image sensor to the properties of photon pairs illuminating it. Experiments using an effective single-photon counting camera, a linear electron-multiplying charge-coupled device camera, and a standard CCD camera confirm the model. The results open the field of quantum optical sensing using conventional detectors.
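As a companion to the abstract above, here is a hedged sketch of the basic measurement it refers to: estimating the intensity correlation between two pixels over a stack of camera frames. The synthetic Poisson data and the "twin pixel" injection are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Sketch: estimate the intensity correlation between two pixels from a stack of
# camera frames, Cov(I_a, I_b) = <I_a I_b> - <I_a><I_b>. Correlated photon pairs
# show up as excess covariance above the uncorrelated background.
# The synthetic data below are illustrative only.

rng = np.random.default_rng(1)
n_frames, h, w = 5000, 8, 8

frames = rng.poisson(2.0, size=(n_frames, h, w)).astype(float)

# Inject correlated counts into two "twin" pixels to mimic photon pairs
pairs = rng.poisson(0.5, size=n_frames)
frames[:, 2, 2] += pairs
frames[:, 5, 5] += pairs

def pixel_covariance(stack, a, b):
    """Covariance of intensities at pixel a and pixel b across frames."""
    ia = stack[:, a[0], a[1]]
    ib = stack[:, b[0], b[1]]
    return np.mean(ia * ib) - np.mean(ia) * np.mean(ib)

print("twin pixels:      ", pixel_covariance(frames, (2, 2), (5, 5)))
print("uncorrelated pair:", pixel_covariance(frames, (2, 2), (6, 1)))
```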
Non-invasive diagnostics of ion beams in strong toroidal magnetic fields with standard CMOS cameras
NASA Astrophysics Data System (ADS)
Ates, Adem; Ates, Yakup; Niebuhr, Heiko; Ratzinger, Ulrich
2018-01-01
A superconducting Figure-8 stellarator-type magnetostatic Storage Ring (F8SR) is under investigation at the Institute for Applied Physics (IAP) at Goethe University Frankfurt. Besides numerical simulations of an optimized design for beam transport and injection, a scaled-down (0.6 T) experiment with two 30° toroidal magnets has been set up for further investigations. A great challenge is the development of a non-destructive, magnetically insensitive and flexible detector for local investigations of an ion beam propagating through the toroidal magnetostatic field. This paper introduces a new way of measuring the beam path by residual gas monitoring. It uses a single-board camera connected to a standard single-board computer by a camera serial interface, all placed inside the vacuum chamber. First experiments were performed with one camera; in a next step, two cameras arranged at 90° to each other were installed. With the help of the two cameras, which are movable along the beam pipe, the theoretical predictions were successfully verified experimentally, confirming previous experimental results. The transport of H+ and H2+ ion beams with energies of 7 keV and beam currents of about 1 mA was successfully investigated.
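For illustration of the two-camera geometry described above, the following is a simplified sketch of reconstructing a beam path from two views at 90°: each image is reduced to an intensity-weighted centroid per column along the pipe, and the two transverse offsets are combined into a 3D trajectory. The idealized orthographic geometry, pixel scale, and synthetic images are assumptions.

```python
import numpy as np

# Sketch of residual-gas beam-path reconstruction from two cameras mounted 90
# degrees apart along the beam pipe. Each camera image is reduced to a beam
# centroid (in pixels) per column along the pipe axis; the two views supply the
# horizontal and vertical offsets of the beam. All numbers are illustrative.

rng = np.random.default_rng(2)
n_cols, n_rows = 200, 120
pixel_scale = 0.2e-3            # metres per pixel (assumed)

def synthetic_view(true_offset_px):
    """Gaussian beam stripe plus noise; rows = transverse axis, cols = beam axis."""
    rows = np.arange(n_rows)[:, None]
    img = np.exp(-0.5 * ((rows - true_offset_px) / 4.0) ** 2)
    return img + rng.normal(0.0, 0.02, size=(n_rows, n_cols))

def centroid_per_column(img):
    """Intensity-weighted centroid of each column, in pixels."""
    rows = np.arange(img.shape[0])[:, None]
    weights = np.clip(img, 0.0, None)
    return (weights * rows).sum(axis=0) / weights.sum(axis=0)

z = np.arange(n_cols) * pixel_scale                                   # along the pipe
top_view = synthetic_view(60 + 10 * np.sin(np.linspace(0, 2 * np.pi, n_cols)))
side_view = synthetic_view(60 + 5 * np.linspace(-1, 1, n_cols))

x = (centroid_per_column(top_view) - n_rows / 2) * pixel_scale        # horizontal offset
y = (centroid_per_column(side_view) - n_rows / 2) * pixel_scale       # vertical offset
trajectory = np.column_stack([x, y, z])
print("reconstructed beam centroid at pipe centre:", trajectory[n_cols // 2])
```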
Using hacked point and shoot cameras for time-lapse snow cover monitoring in an Alpine valley
NASA Astrophysics Data System (ADS)
Weijs, S. V.; Diebold, M.; Mutzner, R.; Golay, J. R.; Parlange, M. B.
2012-04-01
In Alpine environments, monitoring snow cover is essential to gain insight into hydrological processes and the water balance. Although measurement techniques based on LIDAR are available, their cost is often a restricting factor. In this research, an experiment was carried out using a distributed array of cheap consumer cameras to gain insight into the spatio-temporal evolution of the snowpack. Two experiments are planned. The first involves measuring aeolian snow transport around a hill, to validate a snow saltation model. The second monitors snowmelt during the melting season, which can then be combined with data from a wireless network of meteorological stations and discharge measurements at the outlet of the catchment. The poster describes the hardware and software setup, based on an external timer circuit and CHDK, the Canon Hack Development Kit. The latter is a flexible and actively developed software package released under a GPL license. It was developed by hackers who reverse-engineered the camera firmware and added extra functionality such as raw image output, fuller control of the camera, external triggering, motion detection, and scripting. These features make it a great tool for the geosciences. Other possible applications include aerial stereo photography and monitoring vegetation response. We are interested in sharing experiences and brainstorming about new applications. Bring your camera!
Chang'e 3 and Jade Rabbit's: observations and the landing zone
NASA Astrophysics Data System (ADS)
Ping, Jinsong
Chang’E-3 was launched and landed on the near side of the Moon in December 2013, realizing the second phase of the Chinese lunar scientific exploration program. Together with various in-situ optical observations around the landing site, the mission carried four kinds of radio science experiments covering various lunar scientific disciplines as well as lunar surface radio astronomy studies. The key payloads onboard the lander and rover include the near-ultraviolet telescope, the extreme ultraviolet camera, ground-penetrating radar, and a very low frequency radio spectrum analyzer, which had not been used in earlier lunar landing missions. An optical spectrometer, an Alpha Particle X-ray Spectrometer and a Gamma Ray Spectrometer are also used. The mission uses the extreme ultraviolet camera to observe the extreme ultraviolet radiation of the geospace plasma layer during solar activity and geomagnetic disturbances, studying the role of the plasma layer in space weather processes; the mission also carries out, for the first time, lunar-based optical astronomical observations. Most importantly, the topography, landforms and geological structure have been explored in detail. Additionally, a very precise Earth-Moon radio phase ranging technique was tested and realized for the first time in this mission; it may advance the study of lunar dynamics together with the LLR technique. Similar to the Luna-Glob landers, radio transponders are also installed on Chang’E-3, together with the VLBI radio beacons. The transponder receives uplink X-band radio waves transmitted from the two newly constructed Chinese deep space stations, where high-quality hydrogen maser atomic clocks are used as the local time and frequency standard. The radio science receivers were developed by updating the multi-channel open-loop Doppler receiver built for VLBI and Doppler tracking in the Yinghuo-1 and Phobos-Grunt Martian missions. This experiment will improve the study of lunar dynamics by precisely measuring the lunar physical librations together with LLR data.
X-ray pinhole camera setups used in the Atomki ECR Laboratory for plasma diagnostics.
Rácz, R; Biri, S; Pálinkás, J; Mascali, D; Castro, G; Caliri, C; Romano, F P; Gammino, S
2016-02-01
Imaging of electron cyclotron resonance (ECR) plasmas with a CCD camera in combination with a pinhole is a non-destructive diagnostic method for recording the strongly inhomogeneous spatial density distribution of the X-rays emitted by the plasma and by the chamber walls. This method can provide information on the location of collisions between warm electrons and multiply charged ions/atoms, opening the possibility of investigating the direct effect of the ion-source tuning parameters on the plasma structure. The first successful experiment with a pinhole X-ray camera was carried out in the Atomki ECR Laboratory more than 10 years ago. The goal of that experiment was to take the first ECR X-ray photos and to carry out simple studies on the effect of some setting parameters (magnetic field, extraction, disc voltage, gas mixing, etc.). Recently, intensive efforts were made to investigate the effect of different RF resonant modes on the plasma structure. Compared with the 2002 experiment, this campaign used a wider instrumental stock: a CCD camera with a lead pinhole was placed at the injection side, allowing X-ray imaging and beam extraction simultaneously. Additionally, Silicon Drift Detector (SDD) and High Purity Germanium (HPGe) detectors were installed to characterize the volumetric X-ray emission rate caused by the warm and hot electron domains. In this paper, a detailed comparative study of the two X-ray camera and detector setups, and of the technical and scientific goals of the experiments, is presented.
First NAC Image Obtained in Mercury Orbit
2017-12-08
NASA image acquired: March 29, 2011 This is the first image of Mercury taken from orbit with MESSENGER’s Narrow Angle Camera (NAC). MESSENGER’s camera system, the Mercury Dual Imaging System (MDIS), has two cameras: the Narrow Angle Camera and the Wide Angle Camera (WAC). Comparison of this image with MESSENGER’s first WAC image of the same region shows the substantial difference between the fields of view of the two cameras. At 1.5°, the field of view of the NAC is seven times smaller than the 10.5° field of view of the WAC. This image was taken using MDIS’s pivot. MDIS is mounted on a pivoting platform and is the only instrument in MESSENGER’s payload capable of movement independent of the spacecraft. The other instruments are fixed in place, and most point down the spacecraft’s boresight at all times, relying solely on the guidance and control system for pointing. The 90° range of motion of the pivot gives MDIS a much-needed extra degree of freedom, allowing MDIS to image the planet’s surface at times when spacecraft geometry would normally prevent it from doing so. The pivot also gives MDIS additional imaging opportunities by allowing it to view more of the surface than that at which the boresight-aligned instruments are pointed at any given time. On March 17, 2011 (March 18, 2011, UTC), MESSENGER became the first spacecraft ever to orbit the planet Mercury. The mission is currently in the commissioning phase, during which spacecraft and instrument performance are verified through a series of specially designed checkout activities. In the course of the one-year primary mission, the spacecraft's seven scientific instruments and radio science investigation will unravel the history and evolution of the Solar System's innermost planet. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington
Speed cameras for the prevention of road traffic injuries and deaths.
Wilson, Cecilia; Willis, Charlene; Hendrikz, Joan K; Le Brocque, Robyne; Bellamy, Nicholas
2010-11-10
It is estimated that by 2020, road traffic crashes will have moved from ninth to third in the world ranking of burden of disease, as measured in disability adjusted life years. The prevention of road traffic injuries is of global public health importance. Measures aimed at reducing traffic speed are considered essential to preventing road injuries; the use of speed cameras is one such measure. To assess whether the use of speed cameras reduces the incidence of speeding, road traffic crashes, injuries and deaths. We searched the following electronic databases covering all available years up to March 2010: the Cochrane Library, MEDLINE (WebSPIRS), EMBASE (WebSPIRS), TRANSPORT, IRRD (International Road Research Documentation), TRANSDOC (European Conference of Ministers of Transport databases), Web of Science (Science and Social Science Citation Index), PsycINFO, CINAHL, EconLit, WHO database, Sociological Abstracts, Dissertation Abstracts, Index to Theses. Randomised controlled trials, interrupted time series and controlled before-after studies that assessed the impact of speed cameras on speeding, road crashes, crashes causing injury and fatalities were eligible for inclusion. We independently screened studies for inclusion, extracted data, assessed methodological quality, reported study authors' outcomes and, where possible, calculated standardised results based on the information available in each study. Due to considerable heterogeneity between and within included studies, a meta-analysis was not appropriate. Thirty-five studies met the inclusion criteria. Compared with controls, the relative reduction in average speed ranged from 1% to 15% and the reduction in proportion of vehicles speeding ranged from 14% to 65%. In the vicinity of camera sites, the pre/post reductions ranged from 8% to 49% for all crashes and 11% to 44% for fatal and serious injury crashes. Compared with controls, the relative improvement in pre/post injury crash proportions ranged from 8% to 50%. Despite the methodological limitations and the variability in degree of signal to noise effect, the consistency of reported reductions in speed and crash outcomes across all studies shows that speed cameras are a worthwhile intervention for reducing the number of road traffic injuries and deaths. However, whilst the evidence base clearly demonstrates a positive direction in the effect, an overall magnitude of this effect is currently not deducible due to heterogeneity and lack of methodological rigour. More studies of a scientifically rigorous and homogeneous nature are necessary to determine the magnitude of the effect.
Into the blue: AO science with MagAO in the visible
NASA Astrophysics Data System (ADS)
Close, Laird M.; Males, Jared R.; Follette, Katherine B.; Hinz, Phil; Morzinski, Katie; Wu, Ya-Lin; Kopon, Derek; Riccardi, Armando; Esposito, Simone; Puglisi, Alfio; Pinna, Enrico; Xompero, Marco; Briguglio, Runa; Quiros-Pacheco, Fernando
2014-08-01
We review astronomical results in the visible (λ<1μm) with adaptive optics. Other than a brief period in the early 1990s, there has been little astronomical science done in the visible with AO until recently. The most productive visible AO system to date is our 6.5m Magellan telescope AO system (MagAO). MagAO is an advanced Adaptive Secondary system at the Magellan 6.5m in Chile. This secondary has 585 actuators with < 1 msec response times (0.7 ms typically). We use a pyramid wavefront sensor. The relatively small actuator pitch (~23 cm/subap) allows moderate Strehls to be obtained in the visible (0.63-1.05 microns). We use a CCD AO science camera called "VisAO". On-sky long exposures (60s) achieve <30mas resolutions, 30% Strehls at 0.62 microns (r') with the VisAO camera in 0.5" seeing with bright R < 8 mag stars. These relatively high visible wavelength Strehls are made possible by our powerful combination of a next generation ASM and a Pyramid WFS with 378 controlled modes and 1000 Hz loop frequency. We'll review the key steps to having good performance in the visible and review the exciting new AO visible science opportunities and refereed publications in both broad-band (r,i,z,Y) and at Halpha for exoplanets, protoplanetary disks, young stars, and emission line jets. These examples highlight the power of visible AO to probe circumstellar regions/spatial resolutions that would otherwise require much larger diameter telescopes with classical infrared AO cameras.
Attempt of Serendipitous Science During the Mojave Volatile Prospector Field Expedition
NASA Technical Reports Server (NTRS)
Roush, T. L.; Colaprete, A.; Heldmann, J.; Lim, D. S. S.; Cook, A.; Elphic, R.; Deans, M.; Fluckiger, L.; Fritzler, E.; Hunt, David
2015-01-01
On 23 October a partial solar eclipse occurred across parts of the southwest United States between approximately 21:09 and 23:40 (UT), with maximum obscuration, 36%, occurring at 22:29 (UT). During 21-26 October 2014 the Mojave Volatile Prospector (MVP) field expedition deployed and operated the NASA Ames Krex2 rover in the Mojave desert west of Baker, California (Fig. 1, bottom). The MVP field expedition's primary goal was to characterize the surface and sub-surface soil moisture properties within desert alluvial fans, with a secondary goal of providing mission operations simulations of the Resource Prospector (RP) mission to a lunar pole. The partial solar eclipse provided an opportunity during MVP operations to address serendipitous science. Science instruments on Krex2 included a neutron spectrometer, a near-infrared spectrometer with associated imaging camera, and an independent camera coupled with software to characterize the surface textures of the areas encountered. All of these devices are focused upon the surface and as a result are downward looking. In addition to these science instruments, two hazard cameras are mounted on Krex2. The chief device used to monitor the partial solar eclipse was the engineering development unit of the Near-Infrared Volatile Spectrometer System (NIRVSS) near-infrared spectrometer. This device uses two separate fiber-optic-fed Hadamard transform spectrometers. The short-wave and long-wave spectrometers measure the 1600-2400 and 2300-3400 nm wavelength regions with resolutions of 10 and 13 nm, respectively. Data are obtained approximately every 8 seconds. The NIRVSS stares in the direction opposite to the front of the Krex2 rover.
Autonomous Image Analysis for Future Mars Missions
NASA Technical Reports Server (NTRS)
Gulick, V. C.; Morris, R. L.; Ruzon, M. A.; Bandari, E.; Roush, T. L.
1999-01-01
To explore high priority landing sites and to prepare for eventual human exploration, future Mars missions will involve rovers capable of traversing tens of kilometers. However, the current process by which scientists interact with a rover does not scale to such distances. Specifically, numerous command cycles are required to complete even simple tasks, such as pointing the spectrometer at a variety of nearby rocks. In addition, the time required by scientists to interpret image data before new commands can be given and the limited amount of data that can be downlinked during a given command cycle constrain rover mobility and achievement of science goals. Experience with rover tests on Earth supports these concerns. As a result, traverses to science sites as identified in orbital images would require numerous science command cycles over a period of many weeks, months or even years, perhaps exceeding rover design life and other constraints. Autonomous onboard science analysis can address these problems in two ways. First, it will allow the rover to preferentially transmit "interesting" images, defined as those likely to have higher science content. Second, the rover will be able to anticipate future commands. For example, a rover might autonomously acquire and return spectra of "interesting" rocks along with a high-resolution image of those rocks in addition to returning the context images in which they were detected. Such approaches, coupled with appropriate navigational software, help to address both the data volume and command cycle bottlenecks that limit both rover mobility and science yield. We are developing fast, autonomous algorithms to enable such intelligent on-board decision making by spacecraft. Autonomous algorithms developed to date have the ability to identify rocks and layers in a scene, locate the horizon, and compress multi-spectral image data. We are currently investigating the possibility of reconstructing a 3D surface from a sequence of images acquired by a robotic arm camera. This would then allow the return of a single, completely in-focus image constructed only from those portions of individual images that lie within the camera's depth of field. Output from these algorithms could be used to autonomously obtain rock spectra, determine which images should be transmitted to the ground, or to aid in image compression. We will discuss these algorithms and their performance during a recent rover field test.
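The abstract mentions building a single, completely in-focus image from a focal sequence taken by the arm camera but does not describe the algorithm; the sketch below shows one common, generic approach (per-pixel selection of the sharpest frame by local Laplacian energy), purely as an illustration.

```python
import numpy as np

# Minimal focus-stacking sketch: from a stack of images of the same scene taken
# at different focus positions, keep each pixel from the frame where it is
# sharpest (largest local Laplacian energy). This is a generic technique, not
# necessarily the algorithm used on the rover arm camera.

def laplacian(img):
    """Simple 4-neighbour Laplacian with edge padding."""
    p = np.pad(img, 1, mode="edge")
    return p[1:-1, :-2] + p[1:-1, 2:] + p[:-2, 1:-1] + p[2:, 1:-1] - 4.0 * img

def box_filter(img, r=3):
    """Crude local averaging used to smooth the sharpness measure."""
    p = np.pad(img, r, mode="edge")
    out = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += p[r + dy:r + dy + img.shape[0], r + dx:r + dx + img.shape[1]]
    return out / (2 * r + 1) ** 2

def focus_stack(frames):
    """frames: (n, H, W) float array. Returns the all-in-focus composite."""
    sharpness = np.stack([box_filter(laplacian(f) ** 2) for f in frames])
    best = np.argmax(sharpness, axis=0)                  # index of sharpest frame
    return np.take_along_axis(frames, best[None], axis=0)[0]

# Tiny synthetic demonstration: two frames, each sharp in a different half.
rng = np.random.default_rng(3)
scene = rng.random((64, 64))
blurred = box_filter(scene, r=2)
frame_a = np.where(np.arange(64)[None, :] < 32, scene, blurred)
frame_b = np.where(np.arange(64)[None, :] < 32, blurred, scene)
composite = focus_stack(np.stack([frame_a, frame_b]))
print("RMS error vs. true scene:", float(np.sqrt(np.mean((composite - scene) ** 2))))
```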
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.
2015-05-01
As is well known, the application of passive THz cameras to security problems is a very promising approach. It allows concealed objects to be seen without contact with a person, and the camera poses no danger to the person. In previous papers, we demonstrated a new possibility of using a passive THz camera to observe a temperature difference on the human skin when this difference is caused by different temperatures inside the body. To prove the validity of our statement, we carried out a similar physical experiment using an IR camera. We show the possibility of observing a temperature trace on the human body skin caused by a change of temperature inside the human body due to water drinking. We use both the software available for treating images captured by a commercially available IR camera manufactured by FLIR Corp. and our own computer code for processing these images. Using both codes, we clearly demonstrate the change of human body skin temperature induced by water drinking. The phenomena shown are very important for the detection of forbidden samples and substances concealed inside the human body using non-destructive inspection without X-rays. Earlier, we demonstrated such a possibility using THz radiation. The experiments carried out can be applied to counter-terrorism problems. We developed original filters for computer processing of images captured by IR cameras. Their application to computer processing of images enhances the effective temperature resolution of the cameras.
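The authors' original filters are not described in the abstract, so the following is only a generic sketch of the idea of improving effective temperature resolution by filtering: temporal averaging over many IR frames followed by light spatial smoothing. The frame count, noise level, and scene are assumed values.

```python
import numpy as np

# Hedged sketch of noise-reduction filtering for an IR image sequence: temporal
# averaging over N frames (noise drops roughly as 1/sqrt(N)) followed by a light
# spatial mean filter. This only illustrates the general idea of improving the
# effective temperature resolution by filtering; it is not the authors' method.

rng = np.random.default_rng(4)

n_frames, h, w = 50, 120, 160
scene = 30.0 + 0.1 * np.tile(np.arange(w) / w, (h, 1))   # degC, faint 0.1 K gradient
netd = 0.05                                              # per-frame noise (K), assumed
frames = scene + rng.normal(0.0, netd, size=(n_frames, h, w))

temporal_avg = frames.mean(axis=0)                        # noise ~ netd / sqrt(n_frames)

def mean_filter(img, r=1):
    """Small box filter for additional spatial smoothing."""
    p = np.pad(img, r, mode="edge")
    out = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += p[r + dy:r + dy + img.shape[0], r + dx:r + dx + img.shape[1]]
    return out / (2 * r + 1) ** 2

filtered = mean_filter(temporal_avg)
print("single-frame noise   [K]:", float((frames[0] - scene).std()))
print("filtered image noise [K]:", float((filtered - mean_filter(scene)).std()))
```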
The Alfred Nobel rocket camera. An early aerial photography attempt
NASA Astrophysics Data System (ADS)
Ingemar Skoog, A.
2010-02-01
Alfred Nobel (1833-1896), mainly known for his invention of dynamite and the creation of the Nobel Prizes, was an engineer and inventor active in many fields of science and engineering, e.g. chemistry, medicine, mechanics, metallurgy, optics, armoury and rocketry. Amongst his inventions in rocketry was the smokeless solid propellant ballistite (i.e. cordite) patented for the first time in 1887. As a very wealthy person he actively supported many Swedish inventors in their work. One of them was W.T. Unge, who was devoted to the development of rockets and their applications. Nobel and Unge held several rocket patents together and also jointly worked on various rocket applications. In mid-1896 Nobel applied for patents in England and France for "An Improved Mode of Obtaining Photographic Maps and Earth or Ground Measurements" using a photographic camera carried by a "…balloon, rocket or missile…". During the remainder of 1896 the mechanical design of the camera mechanism was pursued and cameras were manufactured. In April 1897 (after the death of Alfred Nobel) the first aerial photos were taken by these cameras. These photos might be the first documented aerial photos taken by a rocket-borne camera. Cameras and photos from 1897 have been preserved. Nobel did not only develop the rocket-borne camera but also proposed methods on how to use the photographs taken for ground measurements and preparing maps.
First results from the TOPSAT camera
NASA Astrophysics Data System (ADS)
Greenway, Paul; Tosh, Ian; Morris, Nigel; Burton, Gary; Cawley, Steve
2017-11-01
The TopSat camera is a low cost remote sensing imager capable of producing 2.5 metre resolution panchromatic imagery, funded by the British National Space Centre's Mosaic programme. The instrument was designed and assembled at the Space Science & Technology Department of the CCLRC's Rutherford Appleton Laboratory (RAL) in the UK, and was launched on the 27th October 2005 from Plesetsk Cosmodrome in Northern Russia on a Kosmos-3M. The camera utilises an off-axis three mirror system, which has the advantages of excellent image quality over a wide field of view, combined with a compactness that makes its overall dimensions smaller than its focal length. Keeping the costs to a minimum has been a major design driver in the development of this camera. The camera is part of the TopSat mission, which is a collaboration between four UK organisations; QinetiQ, Surrey Satellite Technology Ltd (SSTL), RAL and Infoterra. Its objective is to demonstrate provision of rapid response high resolution imagery to fixed and mobile ground stations using a low cost minisatellite. The paper "Development of the TopSat Camera" presented by RAL at the 5th ICSO in 2004 described the opto-mechanical design, assembly, alignment and environmental test methods implemented. Now that the spacecraft is in orbit and successfully acquiring images, this paper presents the first results from the camera and makes an initial assessment of the camera's in-orbit performance.
Li, Tian-Jiao; Li, Sai; Yuan, Yuan; Liu, Yu-Dong; Xu, Chuan-Long; Shuai, Yong; Tan, He-Ping
2017-04-03
Plenoptic cameras are used for capturing flames in studies of high-temperature phenomena. However, simulations of plenoptic camera models can be used prior to the experiment to improve experimental efficiency and reduce cost. In this work, the microlens arrays of the established light-field camera model are optimized into a hexagonal structure with three types of microlenses. With this improved plenoptic camera model, light-field imaging of static objects and of a flame is simulated using the calibrated parameters of the Raytrix (R29) camera. The optimized models improve the image resolution, imaging-screen utilization, and depth-of-field range.
Extreme ultra-violet movie camera for imaging microsecond time scale magnetic reconnection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, Kil-Byoung; Bellan, Paul M.
2013-12-15
An ultra-fast extreme ultra-violet (EUV) movie camera has been developed for imaging magnetic reconnection in the Caltech spheromak/astrophysical jet experiment. The camera consists of a broadband Mo:Si multilayer mirror, a fast decaying YAG:Ce scintillator, a visible light block, and a high-speed visible light CCD camera. The camera can capture EUV images as fast as 3.3 × 10⁶ frames per second with 0.5 cm spatial resolution. The spectral range is from 20 eV to 60 eV. EUV images reveal strong, transient, highly localized bursts of EUV radiation when magnetic reconnection occurs.
NASA Astrophysics Data System (ADS)
Oschlisniok, J.; Pätzold, M.; Häusler, B.; Tellmann, S.; Bird, M.; Andert, T.; Remus, S.; Krüger, C.; Mattei, R.
2011-10-01
Earth's nearest planetary neighbour Venus is shrouded within a roughly 22 km thick three-layered cloud deck, which is located approximately 48 km above the surface and extends to an altitude of about 70 km. The clouds are mostly composed of sulfuric acid. The latter is responsible for a strong absorption of radio signals at microwave frequencies, which is observed in radio occultation experiments. The absorption of the radio signal intensity is used to determine the abundance of H2SO4. In this way, a detailed study of the H2SO4 height distribution within the cloud deck is possible. The Venus Express spacecraft has been orbiting Venus since 2006. The Radio Science Experiment VeRa onboard probes the atmosphere with radio signals at 3.4 cm (X-Band) and 13 cm (S-Band). Absorptivity profiles of the 3.4 cm radio wave and the resulting vertical sulfuric acid profiles in the cloud region of Venus' atmosphere are presented. The three-layered structure and a distinct latitudinal variation of H2SO4 are observed. Convective atmospheric motions within the equatorial latitudes, which transport absorbing material from lower to higher altitudes, are clearly visible. Results of the Venus Monitoring Camera (VMC) and the Visible and Infrared Thermal Imaging Spectrometer (VIRTIS) are compared with the VeRa results.
NASA Technical Reports Server (NTRS)
Rice, Melissa S.; Gupta, Sanjeev; Bell, James F., III; Warner, Nicholas H.
2011-01-01
Eberswalde crater was selected as a candidate landing site for the Mars Science Laboratory (MSL) mission based on the presence of a fan-shaped sedimentary deposit interpreted as a delta. We have identified and mapped five other candidate fluvio-deltaic systems in the crater, using images and digital terrain models (DTMs) derived from the Mars Reconnaissance Orbiter (MRO) High Resolution Imaging Science Experiment (HiRISE) and Context Camera (CTX). All of these systems consist of the same three stratigraphic units: (1) an upper layered unit, conformable with (2) a subpolygonally fractured unit, unconformably overlying (3) a pitted unit. We have also mapped a system of NNE-trending scarps interpreted as dip-slip faults that pre-date the fluvial-lacustrine deposits. The post-impact regional faulting may have generated the large-scale topography within the crater, which consists of a Western Basin, an Eastern Basin, and a central high. This topography subsequently provided depositional sinks for sediment entering the crater and controlled the geomorphic pattern of delta development.
Technical Communication--Taking the User into Account.
1981-08-01
…the effect of cognitive style on instructional design, it may be more cost-effective to evaluate how different instructional formats impact different… …deck, 2 SONY VCK-3210 television cameras, and a SONY Switcher/Fader SEG-1 special effects generator. One television camera was positioned next to the…
An accurate registration technique for distorted images
NASA Technical Reports Server (NTRS)
Delapena, Michele; Shaw, Richard A.; Linde, Peter; Dravins, Dainis
1990-01-01
Accurate registration of International Ultraviolet Explorer (IUE) images is crucial because the variability of the geometrical distortions that are introduced by the SEC-Vidicon cameras ensures that raw science images are never perfectly aligned with the Intensity Transfer Functions (ITFs) (i.e., graded floodlamp exposures that are used to linearize and normalize the camera response). A technique for precisely registering IUE images which uses a cross correlation of the fixed pattern that exists in all raw IUE images is described.
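A minimal sketch of the registration idea described above: the integer-pixel shift between two images is taken as the peak of their FFT-based cross-correlation. This illustrates the generic cross-correlation approach only; the actual IUE/ITF processing details are not reproduced here.

```python
import numpy as np

# Sketch of image registration by cross-correlation: the shift that maximises
# the correlation between the fixed pattern in two images is taken as the
# registration offset. Generic FFT implementation with synthetic data.

def cross_correlation_shift(ref, img):
    """Return (dy, dx) such that np.roll(img, (dy, dx), axis=(0, 1)) best aligns with ref."""
    f_ref = np.fft.fft2(ref - ref.mean())
    f_img = np.fft.fft2(img - img.mean())
    corr = np.fft.ifft2(f_ref * np.conj(f_img)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shape = np.array(corr.shape)
    shifts = np.array(peak)
    shifts = np.where(shifts > shape // 2, shifts - shape, shifts)  # wrap to signed shifts
    return tuple(int(s) for s in shifts)

rng = np.random.default_rng(5)
pattern = rng.random((128, 128))                       # stand-in for the fixed pattern
shifted = np.roll(pattern, shift=(3, -7), axis=(0, 1)) + rng.normal(0, 0.05, (128, 128))

print("recovered shift (dy, dx):", cross_correlation_shift(shifted, pattern))
```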
NASA Astrophysics Data System (ADS)
Godet, Olivier; Barret, Didier; Paul, Jacques; Sizun, Patrick; Mandrou, Pierre; Cordier, Bertrand
SVOM (Space Variable Object Monitor) is a French-Chinese mission dedicated to the study of high-redshift GRBs, which is expected to be launched in 2012. The anti-Sun pointing strategy of SVOM, along with a strong and integrated ground segment consisting of two wide-field robotic telescopes covering the near-IR and optical, will optimise the ground-based GRB follow-ups by the largest telescopes and thus the measurements of spectroscopic redshifts. The central instrument of the science payload will be an innovative wide-field coded-mask camera for X-/gamma-rays (4-250 keV) responsible for triggering and localising GRBs with an accuracy better than 10 arc-minutes. Such an instrument will be background-dominated, so it is essential to estimate, during the early phase of the instrument design, the background level expected once in orbit in order to ensure good science performance. We present our Monte-Carlo simulator enabling us to compute the background spectrum taking into account the mass model of the camera and the main components of the space environment encountered in orbit by the satellite. From that computation, we show that the current design of the CXG camera will be more sensitive to high-redshift GRBs than the Swift-BAT thanks to its low-energy threshold of 4 keV.
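The real simulator relies on a detailed mass model of the camera and on space-environment models; as a rough illustration of the Monte-Carlo idea only, the toy sketch below draws incident photon energies from an assumed power-law background, applies an assumed detection efficiency, and histograms the detected events into a background count spectrum. Every number is a placeholder.

```python
import numpy as np

# Toy Monte-Carlo background estimate: sample incident photon energies from an
# assumed power-law diffuse background, apply an assumed detection efficiency,
# and histogram detected events into a background count spectrum. This is only
# an illustration of the method; it does not reproduce the actual simulator.

rng = np.random.default_rng(6)

e_min, e_max, index = 4.0, 250.0, 2.0          # keV band and assumed spectral index
n_incident = 1_000_000

# Sample E from dN/dE ~ E^-index on [e_min, e_max] by inverse-transform sampling.
u = rng.random(n_incident)
a = 1.0 - index
energies = (e_min**a + u * (e_max**a - e_min**a)) ** (1.0 / a)

# Assumed detection efficiency: rises through the low-energy threshold region.
efficiency = np.clip((energies - e_min) / 20.0, 0.0, 1.0) * 0.8
detected = energies[rng.random(n_incident) < efficiency]

bins = np.logspace(np.log10(e_min), np.log10(e_max), 30)
counts, _ = np.histogram(detected, bins=bins)
print("total detected background events:", counts.sum())
print("counts in the first five bins:", counts[:5])
```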
W-Band Free Space Permittivity Measurement Setup for Candidate Radome Materials
NASA Technical Reports Server (NTRS)
Fralick, Dion T.
1997-01-01
This paper presents a measurement system used for W-band complex permittivity measurements performed in NASA Langley Research Center's Electromagnetics Research Branch. The system was used to characterize candidate radome materials for the passive millimeter wave (PMMW) camera experiment. The PMMW camera is a new technology sensor, with goals of all-weather landings of civilian and military aircraft. The sensor is being developed under a NASA Technology Reinvestment program with TRW, McDonnell-Douglas, Honeywell, and Composite Optics, Inc. as participants. The experiment is scheduled to be flight tested on the Air Force's 'Speckled Trout' aircraft in late 1997. The camera operates at W-band, in a radiometric capacity, and generates an image of the viewable field. Because the camera is a radiometer, the system is very sensitive to losses. Minimal transmission loss through the radome at the operating frequency, 89 GHz, was critical to the success of the experiment. This paper details the design, set-up, calibration and operation of a free space measurement system developed and used to characterize the candidate radome materials for this program.
2001-05-02
Sutta Chernubhotta (grade 10) from DuPont Manual High School in Louisville, Kentucky, asks a question of one of the on-line lecturers during the Pan-Pacific Basin Workshop on Microgravity Sciences held in Pasadena, California. The event originated at the California Science Center in Los Angeles. The DuPont Manual students patched in to the event through the distance learning lab at the Louisville Science Center. This image is from a digital still camera; higher resolution is not available.
Touch And Go Camera System (TAGCAMS) for the OSIRIS-REx Asteroid Sample Return Mission
NASA Astrophysics Data System (ADS)
Bos, B. J.; Ravine, M. A.; Caplinger, M.; Schaffner, J. A.; Ladewig, J. V.; Olds, R. D.; Norman, C. D.; Huish, D.; Hughes, M.; Anderson, S. K.; Lorenz, D. A.; May, A.; Jackman, C. D.; Nelson, D.; Moreau, M.; Kubitschek, D.; Getzandanner, K.; Gordon, K. E.; Eberhardt, A.; Lauretta, D. S.
2018-02-01
NASA's OSIRIS-REx asteroid sample return mission spacecraft includes the Touch And Go Camera System (TAGCAMS) three camera-head instrument. The purpose of TAGCAMS is to provide imagery during the mission to facilitate navigation to the target asteroid, confirm acquisition of the asteroid sample, and document asteroid sample stowage. The cameras were designed and constructed by Malin Space Science Systems (MSSS) based on requirements developed by Lockheed Martin and NASA. All three of the cameras are mounted to the spacecraft nadir deck and provide images in the visible part of the spectrum, 400-700 nm. Two of the TAGCAMS cameras, NavCam 1 and NavCam 2, serve as fully redundant navigation cameras to support optical navigation and natural feature tracking. Their boresights are aligned in the nadir direction with small angular offsets for operational convenience. The third TAGCAMS camera, StowCam, provides imagery to assist with and confirm proper stowage of the asteroid sample. Its boresight is pointed at the OSIRIS-REx sample return capsule located on the spacecraft deck. All three cameras have at their heart a 2592 × 1944 pixel complementary metal oxide semiconductor (CMOS) detector array that provides up to 12-bit pixel depth. All cameras also share the same lens design and a camera field of view of roughly 44° × 32° with a pixel scale of 0.28 mrad/pixel. The StowCam lens is focused to image features on the spacecraft deck, while both NavCam lens focus positions are optimized for imaging at infinity. A brief description of the TAGCAMS instrument and how it is used to support critical OSIRIS-REx operations is provided.
ASTRONAUT COOPER, GORDON L. - TRAINING - MERCURY-ATLAS (MA)-9 - CAMERA
1963-03-01
S63-03952 (1963) --- Astronaut L. Gordon Cooper Jr. explains the 16mm handheld spacecraft camera to his backup pilot astronaut Alan Shepard. The camera, designed by J.R. Hereford of McDonnell Aircraft Corp., will be used by Cooper during the Mercury-Atlas 9 (MA-9) mission to photograph experiments in space for M.I.T. and the Weather Bureau. Photo credit: NASA
Seeing in a different light—using an infrared camera to teach heat transfer and optical phenomena
NASA Astrophysics Data System (ADS)
Pei Wong, Choun; Subramaniam, R.
2018-05-01
The infrared camera is a useful tool in physics education to ‘see’ in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.
Seeing in a Different Light--Using an Infrared Camera to Teach Heat Transfer and Optical Phenomena
ERIC Educational Resources Information Center
Wong, Choun Pei; Subramaniam, R.
2018-01-01
The infrared camera is a useful tool in physics education to 'see' in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.
The Kaguya Mission: Science Achievements and Data Release
NASA Astrophysics Data System (ADS)
Kato, Manabu; Sasaki, Susumu; Takizawa, Yoshisada
2010-05-01
The lunar orbiter Kaguya (SELENE) impacted the Moon on July 10, 2009. The Kaguya mission completed observations of the whole Moon over a total of twenty months: a checkout term of three months, a nominal term of ten months, and an extension of seven months. In the extended mission before the impact, measurements of the magnetic field and gamma rays from lower orbits were performed successfully, in addition to low-altitude observations by the Terrain Camera, Multiband Imager, and HDTV Camera. New data on intense magnetic anomalies and GRS data with higher spatial resolution have been acquired to study the elemental distribution and magnetism of the Moon. New information and insights have been brought to lunar science in topography, gravimetry, geology, mineralogy, lithology, and plasma physics. On November 1, 2009, the Kaguya team released science data to the public, as promised internationally. The archived data can be accessed through the Kaguya homepage of JAXA. An image gallery and a 3D GIS system are also available from the same homepage.
Mobile Phone Images and Video in Science Teaching and Learning
ERIC Educational Resources Information Center
Ekanayake, Sakunthala Yatigammana; Wishart, Jocelyn
2014-01-01
This article reports a study into how mobile phones could be used to enhance teaching and learning in secondary school science. It describes four lessons devised by groups of Sri Lankan teachers all of which centred on the use of the mobile phone cameras rather than their communication functions. A qualitative methodological approach was used to…
ERIC Educational Resources Information Center
Bueno de Mesquita, Paul; Dean, Ross F.; Young, Betty J.
2010-01-01
Advances in digital video technology create opportunities for more detailed qualitative analyses of actual teaching practice in science and other subject areas. User-friendly digital cameras and highly developed, flexible video-analysis software programs have made the tasks of video capture, editing, transcription, and subsequent data analysis…
Delivering Images for Mars Rover Science Planning
NASA Technical Reports Server (NTRS)
Edmonds, Karina
2008-01-01
A methodology has been developed for delivering, via the Internet, images transmitted to Earth from cameras on the Mars Exploration Rovers, the Phoenix Mars Lander, the Mars Science Laboratory, and the Mars Reconnaissance Orbiter spacecraft. The images in question are used by geographically dispersed scientists and engineers in planning Rover scientific activities and Rover maneuvers pertinent thereto.
NASA Technical Reports Server (NTRS)
1998-01-01
Color composite of condensate clouds over Tharsis made from red and blue images with a synthesized green channel. Mars Orbiter Camera wide angle frames from Orbit 48. (Figure caption from Science Magazine.)
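As a side note on how such a composite can be assembled, the sketch below builds an RGB image from red and blue frames with a synthesized green channel taken as their mean; this is one common approximation and not necessarily the processing used for the MOC product.

```python
import numpy as np

# Sketch of building a colour composite from red and blue frames with a
# synthesized green channel. Taking green as the mean of red and blue is one
# common approximation; the actual MOC processing is not specified here.

rng = np.random.default_rng(7)
red = rng.random((480, 640)).astype(np.float32)    # stand-ins for the wide-angle frames
blue = rng.random((480, 640)).astype(np.float32)

green = 0.5 * (red + blue)                         # synthesized green channel
composite = np.clip(np.dstack([red, green, blue]), 0.0, 1.0)   # H x W x 3, RGB order

print("composite shape:", composite.shape)
```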
PRIMAS: a real-time 3D motion-analysis system
NASA Astrophysics Data System (ADS)
Sabel, Jan C.; van Veenendaal, Hans L. J.; Furnee, E. Hans
1994-03-01
The paper describes a CCD TV-camera-based system for real-time multicamera 2D detection of retro-reflective targets and software for accurate and fast 3D reconstruction. Applications of this system can be found in the fields of sports, biomechanics, rehabilitation research, and various other areas of science and industry. The new feature of real-time 3D opens an even broader perspective of application areas; animations in virtual reality are an interesting example. After presenting an overview of the hardware and the camera calibration method, the paper focuses on the real-time algorithms used for matching of the images and subsequent 3D reconstruction of marker positions. When using a calibrated setup of two cameras, it is now possible to track at least ten markers at 100 Hz. Limitations in the performance are determined by the visibility of the markers, which could be improved by adding a third camera.
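The PRIMAS matching and reconstruction algorithms are not detailed in the abstract; the following is a minimal sketch of the standard linear (DLT) triangulation step that recovers a 3D marker position from its 2D coordinates in two calibrated cameras. The camera matrices and marker position are toy values, not PRIMAS calibration data.

```python
import numpy as np

# Minimal linear (DLT) triangulation of one retro-reflective marker from its 2D
# image positions in two calibrated cameras. P1 and P2 are assumed 3x4 camera
# projection matrices; this illustrates the standard approach, not necessarily
# the PRIMAS implementation.

def triangulate(P1, P2, uv1, uv2):
    """Return the 3D point whose projections best match uv1 (camera 1) and uv2 (camera 2)."""
    A = np.array([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Two toy cameras: identical intrinsics, the second translated 0.5 m along x.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

marker = np.array([0.2, -0.1, 2.0, 1.0])                  # homogeneous ground truth
uv1 = (P1 @ marker)[:2] / (P1 @ marker)[2]
uv2 = (P2 @ marker)[:2] / (P2 @ marker)[2]

print("reconstructed marker position:", triangulate(P1, P2, uv1, uv2))
```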
2001-11-29
KENNEDY SPACE CENTER, Fla. -- Fully unwrapped, the Advanced Camera for Surveys, which is suspended by an overhead crane, is checked over by workers. Part of the payload on the Hubble Space Telescope Servicing Mission, STS-109, the ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. Tasks for the mission include replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002
Innovation in robotic surgery: the Indian scenario.
Deshpande, Suresh V
2015-01-01
Robotics is a science. In scientific terms, a "robot" is an electromechanical arm device with a computer interface, a combination of electrical, mechanical, and computer engineering. It is a mechanical arm that performs tasks in industry, space exploration, and science. One such idea was to make an automated arm - a robot - in laparoscopy to control the telescope-camera unit electromechanically, and then with a computer interface using voice control. It took us five long years from 2004 to bring it to the level of obtaining a patent. That was the birth of the Swarup Robotic Arm (SWARM), the first and only Indian contribution in the field of robotics in laparoscopy: a fully voice-controlled camera-holding robotic arm developed without any support from industry or research institutes.
Processing of Mars Exploration Rover Imagery for Science and Operations Planning
NASA Technical Reports Server (NTRS)
Alexander, Douglass A.; Deen, Robert G.; Andres, Paul M.; Zamani, Payam; Mortensen, Helen B.; Chen, Amy C.; Cayanan, Michael K.; Hall, Jeffrey R.; Klochko, Vadim S.; Pariser, Oleg;
2006-01-01
The twin Mars Exploration Rovers (MER) delivered an unprecedented array of image sensors to the Mars surface. These cameras were essential for operations, science, and public engagement. The Multimission Image Processing Laboratory (MIPL) at the Jet Propulsion Laboratory was responsible for the first-order processing of all of the images returned by these cameras. This processing included reconstruction of the original images, systematic and ad hoc generation of a wide variety of products derived from those images, and delivery of the data to a variety of customers, within tight time constraints. A combination of automated and manual processes was developed to meet these requirements, with significant inheritance from prior missions. This paper describes the image products generated by MIPL for MER and the processes used to produce and deliver them.
NASA Astrophysics Data System (ADS)
Lumpkin, A. H.; Thurman-Keup, R.; Edstrom, D.; Ruan, J.; Eddy, N.; Prieto, P.; Napoly, O.; Carlsten, B. E.; Bishofberger, K.
2018-06-01
We report the direct observations of submacropulse beam centroid oscillations correlated with higher order modes (HOMs) which were generated by off-axis electron beam steering in TESLA-type superconducting rf cavities. The experiments were performed at the Fermilab Accelerator Science and Technology (FAST) facility using its unique configuration of a photocathode rf gun injecting beam into two separated nine-cell cavities in series with corrector magnets and beam position monitors (BPMs) located before, between, and after them. Oscillations of ~100 kHz in the vertical plane and ~380 kHz in the horizontal plane with up to 600-μm amplitudes were observed in a 3-MHz micropulse repetition rate beam with charges of 100, 300, 500, and 1000 pC/b. However, the effects were much reduced at 100 pC/b. The measurements were based on HOM detector circuitry targeting the first and second dipole passbands, rf BPM bunch-by-bunch array data, imaging cameras, and a framing camera. Calculations reproduced the oscillation frequencies of the phenomena in the vertical case. In principle, these fundamental results may be scaled to cryomodule configurations of major accelerator facilities.
NASA Astrophysics Data System (ADS)
Zhang, Wei; Huang, Wei; Gao, Yubo; Qi, Yafei; Hypervelocity Impact Research Center Team
2015-06-01
Laboratory-scale oblique water-entry experiments on trajectory stability in the water column have been performed with four different projectile noses at velocities from 20 m/s to 250 m/s. The slender projectiles are designed with flat, ogival, hemispherical, and truncated-ogival noses to allow comparison of the trajectory deviation when they are launched at vertical and oblique impact angles (0°~25°). Two high-speed cameras, positioned orthogonal to each other and normal to the column, are employed to capture the entire process of the projectiles' penetration. From the experimental results, the sequential images in two planes are presented to compare the trajectory deviation of the different impact tests, and 3D trajectory models are extracted based on the locations recorded by the cameras. Considering the effects of impact velocity and projectile nose, it can be concluded that trajectory deviation is affected most by impact angle and least by impact velocity. Additionally, ogival projectiles tend to be more sensitive to oblique angle and experience the largest attitude change. National Natural Science Foundation of China (NO.: 11372088).
Lumpkin, A. H.; Thurman-Keup, R.; Edstrom, D.; ...
2018-06-04
Here, we report the direct observations of submacropulse beam centroid oscillations correlated with higher order modes (HOMs) which were generated by off-axis electron beam steering in TESLA-type superconducting rf cavities. The experiments were performed at the Fermilab Accelerator Science and Technology (FAST) facility using its unique configuration of a photocathode rf gun injecting beam into two separated nine-cell cavities in series with corrector magnets and beam position monitors (BPMs) located before, between, and after them. Oscillations of ~100 kHz in the vertical plane and ~380 kHz in the horizontal plane with up to 600-μm amplitudes were observed in a 3-MHz micropulse repetition rate beam with charges of 100, 300, 500, and 1000 pC/b. However, the effects were much reduced at 100 pC/b. The measurements were based on HOM detector circuitry targeting the first and second dipole passbands, rf BPM bunch-by-bunch array data, imaging cameras, and a framing camera. Calculations reproduced the oscillation frequencies of the phenomena in the vertical case. In principle, these fundamental results may be scaled to cryomodule configurations of major accelerator facilities.
Development of proton CT imaging system using plastic scintillator and CCD camera
NASA Astrophysics Data System (ADS)
Tanaka, Sodai; Nishio, Teiji; Matsushita, Keiichiro; Tsuneda, Masato; Kabuki, Shigeto; Uesaka, Mitsuru
2016-06-01
A proton computed tomography (pCT) imaging system was constructed for evaluation of the error of an x-ray CT (xCT)-to-WEL (water-equivalent length) conversion in treatment planning for proton therapy. In this system, the scintillation light integrated along the beam direction is obtained by photography using the CCD camera, which enables fast and easy data acquisition. The light intensity is converted to the range of the proton beam using a light-to-range conversion table made beforehand, and a pCT image is reconstructed. An experiment for demonstration of the pCT system was performed using a 70 MeV proton beam provided by the AVF930 cyclotron at the National Institute of Radiological Sciences. Three-dimensional pCT images were reconstructed from the experimental data. A thin structure of approximately 1 mm was clearly observed, with spatial resolution of pCT images at the same level as that of xCT images. The pCT images of various substances were reconstructed to evaluate the pixel value of pCT images. The image quality was investigated with regard to deterioration including multiple Coulomb scattering.
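As an illustration of the light-to-range conversion step described above, the sketch below applies a calibration table (made beforehand) by interpolation to convert an image of integrated scintillation light into water-equivalent range per pixel. The table entries and the synthetic light map are placeholders, not real calibration data.

```python
import numpy as np

# Sketch of the light-to-range conversion step: a calibration table made
# beforehand relates the integrated scintillation light to the proton range
# (water-equivalent length), and measured light values are converted by
# interpolation. All numbers below are placeholders for illustration.

# Calibration table (assumed): integrated light signal -> residual range in water [mm]
light_cal = np.array([0.10, 0.25, 0.45, 0.70, 1.00])     # normalised light intensity
range_cal = np.array([5.0, 12.0, 20.0, 30.0, 40.0])      # water-equivalent range [mm]

def light_to_range(light_image):
    """Convert an image of integrated light to water-equivalent range per pixel."""
    return np.interp(light_image, light_cal, range_cal)

# Measured light map for one projection (synthetic stand-in)
rng = np.random.default_rng(8)
light_image = np.clip(rng.normal(0.6, 0.1, size=(64, 64)), light_cal[0], light_cal[-1])

wel_projection = light_to_range(light_image)
print("mean water-equivalent range [mm]:", float(wel_projection.mean()))
```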
Recurring Lineae on Slopes at Hale Crater, Mars
2015-09-28
Dark, narrow streaks on Martian slopes such as these at Hale Crater are inferred to be formed by seasonal flow of water on contemporary Mars. The streaks are roughly the length of a football field. The imaging and topographical information in this processed, false-color view come from the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. These dark features on the slopes are called "recurring slope lineae" or RSL. Planetary scientists using observations with the Compact Reconnaissance Imaging Spectrometer on the same orbiter detected hydrated salts on these slopes at Hale Crater, corroborating the hypothesis that the streaks are formed by briny liquid water. The image was produced by first creating a 3-D computer model (a digital terrain map) of the area based on stereo information from two HiRISE observations, and then draping a false-color image over the land-shape model. The vertical dimension is exaggerated by a factor of 1.5 compared to horizontal dimensions. The camera records brightness in three wavelength bands: infrared, red and blue-green. The draped image is one product from HiRISE observation ESP_03070_1440. http://photojournal.jpl.nasa.gov/catalog/PIA19916
NASA Technical Reports Server (NTRS)
Marsh, J. G.; Douglas, B. C.; Walls, D. M.
1974-01-01
Laser and camera data taken during the International Satellite Geodesy Experiment (ISAGEX) were used in dynamical solutions to obtain center-of-mass coordinates for the Astro-Soviet camera sites at Helwan, Egypt, and Oulan Bator, Mongolia, as well as the East European camera sites at Potsdam, German Democratic Republic, and Ondrejov, Czechoslovakia. The results are accurate to about 20m in each coordinate. The orbit of PEOLE (i=15) was also determined from ISAGEX data. Mean Kepler elements suitable for geodynamic investigations are presented.
Low Cost Wireless Network Camera Sensors for Traffic Monitoring
DOT National Transportation Integrated Search
2012-07-01
Many freeways and arterials in major cities in Texas are presently equipped with video detection cameras to collect data and help in traffic/incident management. In this study, carefully controlled experiments determined the throughput and output...
Sunsat-2004 satellite and synoptic VLF payload
NASA Astrophysics Data System (ADS)
Milne, Gw; Hughes, A.; Mostert, S.; Steyn, Wh
Sunsat 2004 is a second satellite from the University of Stellenbosch, with an intended sun-synchronous launch in late 2005. The first satellite, Sunsat, was launched in February 1999 and was Africa's first satellite. The three-axis stabilised bus will normally point its main solar panel at the sun, but will rotate for imaging. The attitude determination and control system will use coarse sun sensors, magnetometers, rate gyros, and a star mapper, and will use reaction wheels and torquer rods for actuation. The payloads include a multispectral pushbroom imager with less than 5 m GSD, TV cameras, an Amateur Radio communications payload, and science experiments. The main South African science experiment is a VLF receiver. In the magnetosphere, VLF waves play an important role in energy exchange processes with energetic particles. The wave-particle interactions can lead to particle precipitation into the atmosphere or introduce additional energy into particle populations in the magnetosphere. The former is important due to its effect on terrestrial communications, while the latter is of interest as it affects the environment in which satellites operate. A full understanding of the magnetosphere and of phenomena such as the aurora, airglow and particle precipitation depends on comprehensive wave and particle models together with models of the background plasma density. The energetic particle populations and background plasma densities have been extensively modelled using data from a large number of satellite, rocket and ground-based experiments, but no comprehensive model of the wave environment exists. The proposed synoptic VLF experiment will start to address this need by locating and tracking the morphology of regions in the magnetosphere where waves are generated. The experiment would consist of a nine-channel VLF receiver with a loop antenna. The data would be recorded on board and transmitted to ground stations at appropriate times. A number of additional science payloads are also being evaluated for the mission, and will be reported on in the paper.
NASA Astrophysics Data System (ADS)
Taggart, D. P.; Gribble, R. J.; Bailey, A. D., III; Sugimoto, S.
Recently, a prototype soft x-ray pinhole camera was fielded on FRX-C/LSM at Los Alamos and TRX at Spectra Technology. The soft x-ray FRC images obtained using this camera stand out in high contrast to their surroundings. The camera was particularly useful for studying the FRC during and shortly after formation when, at certain operating conditions, flute-like structures at the edge and internal structures of the FRC were observed which other diagnostics could not resolve. Building on this early experience, a new soft x-ray pinhole camera was installed on FRX-C/LSM, which permits more rapid data acquisition and briefer exposures. It will be used to continue studying FRC formation and to look for internal structure later in time which could be a signature of instability. The initial operation of this camera is summarized.
Soft X-ray streak camera for laser fusion applications
NASA Astrophysics Data System (ADS)
Stradling, G. L.
1981-04-01
The development and significance of the soft x-ray streak camera (SXRSC) in the context of inertial confinement fusion energy development is reviewed as well as laser fusion and laser fusion diagnostics. The SXRSC design criteria, the requirement for a subkilovolt x-ray transmitting window, and the resulting camera design are explained. Theory and design of reflector-filter pair combinations for three subkilovolt channels centered at 220 eV, 460 eV, and 620 eV are also presented. Calibration experiments are explained and data showing a dynamic range of 1000 and a sweep speed of 134 psec/mm are presented. Sensitivity modifications to the soft x-ray streak camera for a high-power target shot are described. A preliminary investigation, using a stepped cathode, of the thickness dependence of the gold photocathode response is discussed. Data from a typical Argus laser gold-disk target experiment are shown.
Wrist Camera Orientation for Effective Telerobotic Orbital Replaceable Unit (ORU) Changeout
NASA Technical Reports Server (NTRS)
Jones, Sharon Monica; Aldridge, Hal A.; Vazquez, Sixto L.
1997-01-01
The Hydraulic Manipulator Testbed (HMTB) is the kinematic replica of the Flight Telerobotic Servicer (FTS). One use of the HMTB is to evaluate advanced control techniques for accomplishing robotic maintenance tasks on board the Space Station. Most maintenance tasks involve direct manipulation of the robot by a human operator, for whom high-quality visual feedback is important for precise control. An experiment was conducted in the Systems Integration Branch at the Langley Research Center to compare several configurations of the manipulator wrist camera for providing visual feedback during an Orbital Replaceable Unit changeout task. Several variables were considered, such as wrist camera angle, camera focal length, target location, and lighting. Each study participant performed the maintenance task using eight combinations of the variables based on a Latin square design. The results of this experiment and conclusions based on the data collected are presented.
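The abstract does not reproduce the Latin square itself. As a minimal sketch of how an 8x8 cyclic Latin square can counterbalance the presentation order of eight variable combinations across eight participants (the condition labels and participant assignment below are assumptions for illustration, not the study's actual design), a Python version might look like this:

    # Counterbalanced ordering via a cyclic 8x8 Latin square (illustrative only).
    CONDITIONS = [f"combo_{k}" for k in range(8)]  # eight variable combinations

    def latin_square(n):
        """Cyclic n x n Latin square: each symbol appears once per row and column."""
        return [[(i + j) % n for j in range(n)] for i in range(n)]

    def participant_orders():
        """Map each of eight participants to a presentation order of the combinations."""
        square = latin_square(len(CONDITIONS))
        return {f"participant_{i + 1}": [CONDITIONS[c] for c in row]
                for i, row in enumerate(square)}

    if __name__ == "__main__":
        for participant, order in participant_orders().items():
            print(participant, order)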
The effect of playing a science center-based mobile game: Affective outcomes and gender differences
NASA Astrophysics Data System (ADS)
Atwood-Blaine, Dana
Situated in a hands-on science center, The Great STEM Caper was a collaborative mobile game built on the ARIS platform that was designed to engage 5th-9th grade players in NGSS science and engineering practices while they interacted with various exhibits. Same-gender partners sharing one iPad would search for QR codes placed at specific exhibits; scanning a code within the game would launch a challenge for that exhibit. The primary hypothesis was that in-game victories would be equivalent to "mastery experiences" as described by Bandura (1997) and would result in increased science self-efficacy. Gender differences in gameplay behaviors and perceptions were also studied. The study included two groups, one that played the game during their visit and one that explored the science center in the traditional way. The Motivation to Learn Science Questionnaire (MLSQ) was administered to participants in both groups both before and after their visit to the science center. Participants wore head-mounted GoPro cameras to record their interactions within the physical and social environment. No differences in affective outcomes were found between the game and comparison groups or between boys and girls in the game group. The MLSQ was unable to measure any significant change in science self-efficacy, interest and enjoyment of science, or overall motivation to learn science in either group. However, girls outperformed boys on every measure of game achievement. Lazzaro's (2004) four types of fun were found to be a good fit for describing the gender differences in game perceptions and behaviors. Girls tended to enjoy hard fun and collaborative people fun while boys enjoyed easy fun and competitive people fun. While boys associated game achievement with enjoyment and victory, girls perceived their game achievement as difficult, rather than enjoyable or victorious.
2006-01-19
KENNEDY SPACE CENTER, FLA. — Viewed from a vantage point on the nearby river bank, NASA’s New Horizons spacecraft roars into the cloud-scattered sky trailing fire from the Atlas V rocket that propels it. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/George Shelton
2006-01-19
KENNEDY SPACE CENTER, FLA. — Viewed from a nearby vantage point, NASA’s New Horizons spacecraft roars into the cloud-scattered sky trailing fire and smoke from the Atlas V rocket that propels it. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/George Shelton
2006-01-19
KENNEDY SPACE CENTER, FLA. — Viewed from the NASA News Center, NASA’s New Horizons spacecraft roars into the cloud-scattered sky trailing fire and smoke from the Atlas V rocket that propels it. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Fletch Hildreth
2006-01-19
KENNEDY SPACE CENTER, FLA. — Viewed from the top of the Vehicle Assembly Building at Kennedy Space Center, NASA’s New Horizons spacecraft roars off the launch pad aboard an Atlas V rocket spewing flames and smoke. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Kim Shiflett
2006-01-19
KENNEDY SPACE CENTER, FLA. — Spectators and photographers enjoy the view as the NASA New Horizons spacecraft clears the horizon six seconds into the launch (as seen on the countdown clock at left). The spacecraft lifted off on time at 2 p.m. EST aboard an Atlas V rocket from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/Fletch Hildreth
2006-01-19
KENNEDY SPACE CENTER, FLA. — Photographers and spectators watch NASA’s New Horizons spacecraft, trailing fire and smoke from the Atlas V rocket that propels it, as it roars into the cloud-scattered sky. Liftoff was on time at 2 p.m. EST from Complex 41 on Cape Canaveral Air Force Station in Florida. This was the third launch attempt in as many days after scrubs due to weather concerns. The compact, 1,050-pound piano-sized probe will get a boost from a kick-stage solid propellant motor for its journey to Pluto. New Horizons will be the fastest spacecraft ever launched, reaching lunar orbit distance in just nine hours and passing Jupiter 13 months later. The New Horizons science payload, developed under direction of Southwest Research Institute, includes imaging infrared and ultraviolet spectrometers, a multi-color camera, a long-range telescopic camera, two particle spectrometers, a space-dust detector and a radio science experiment. The dust counter was designed and built by students at the University of Colorado, Boulder. The launch at this time allows New Horizons to fly past Jupiter in early 2007 and use the planet’s gravity as a slingshot toward Pluto. The Jupiter flyby trims the trip to Pluto by as many as five years and provides opportunities to test the spacecraft’s instruments and flyby capabilities on the Jupiter system. New Horizons could reach the Pluto system as early as mid-2015, conducting a five-month-long study possible only from the close-up vantage of a spacecraft. Photo credit: NASA/George Shelton
Timing generator of scientific grade CCD camera and its implementation based on FPGA technology
NASA Astrophysics Data System (ADS)
Si, Guoliang; Li, Yunfei; Guo, Yongfei
2010-10-01
The functions of the timing generator for a scientific-grade CCD camera are briefly presented: it generates the pulse sequences required by the TDI-CCD, the video processor, and the imaging data output, acting as the synchronous timing coordinator for the CCD imaging unit. The camera uses the IL-E2 TDI-CCD sensor produced by DALSA. The drive timing of the IL-E2 TDI-CCD sensor was examined in detail, and a timing generator was designed for the scientific-grade CCD camera. An FPGA was chosen as the hardware design platform, and the timing generator was described in VHDL. The design passed functional simulation with EDA software and was fitted into an XC2VP20-FF1152 FPGA from Xilinx. The experiments indicate that the new method raises the level of integration of the system, achieves high reliability, stability and low power consumption for the scientific-grade CCD camera, and substantially shortens the design and test cycle.
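The VHDL itself is not given in the abstract. The following Python sketch only illustrates the counter-and-compare structure such a timing generator typically uses to place pulse edges within one line period; the signal names, line length, and edge positions are invented for the example and are not taken from the paper:

    # Minimal model of a counter-based timing generator (illustrative only).
    # In the camera this logic would be written in VHDL and run in an FPGA.
    LINE_PERIOD = 100  # clock counts per line period (assumed value)

    # (rise, fall) compare values for each hypothetical pulse within one line
    PULSES = {
        "tdi_shift":    (0, 10),    # shifts TDI charge one row
        "pixel_reset":  (12, 15),   # resets the video chain
        "video_sample": (20, 95),   # window in which pixels are digitised
    }

    def timing_generator(num_lines):
        """Yield (count, signal_levels) each clock tick, like a synchronous FSM."""
        for _ in range(num_lines):
            for count in range(LINE_PERIOD):
                levels = {name: int(rise <= count < fall)
                          for name, (rise, fall) in PULSES.items()}
                yield count, levels

    if __name__ == "__main__":
        for count, levels in timing_generator(num_lines=1):
            if count < 25:  # print the start of one line period
                print(count, levels)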
InfraCAM (trade mark): A Hand-Held Commercial Infrared Camera Modified for Spaceborne Applications
NASA Technical Reports Server (NTRS)
Manitakos, Daniel; Jones, Jeffrey; Melikian, Simon
1996-01-01
In 1994, Inframetrics introduced the InfraCAM(TM), a high-resolution hand-held thermal imager. As the world's smallest, lightest and lowest-power PtSi-based infrared camera, the InfraCAM is ideal for a wide range of industrial, non-destructive testing, surveillance and scientific applications. In addition to numerous commercial applications, the light weight and low power consumption of the InfraCAM make it extremely valuable for adaptation to spaceborne applications. Consequently, the InfraCAM has been selected by NASA Lewis Research Center (LeRC) in Cleveland, Ohio, for use as part of the DARTFire (Diffusive and Radiative Transport in Fires) spaceborne experiment. In this experiment, a solid fuel is ignited in a low-gravity environment. The combustion period is recorded by both visible and infrared cameras. The infrared camera measures the emission from polymethyl methacrylate (PMMA) and combustion products in six distinct narrow spectral bands. Four cameras successfully completed all qualification tests at Inframetrics and at NASA Lewis. They are presently being used for ground-based testing in preparation for space flight in the fall of 1995.
An attentive multi-camera system
NASA Astrophysics Data System (ADS)
Napoletano, Paolo; Tisato, Francesco
2014-03-01
Intelligent multi-camera systems that integrate computer vision algorithms are not error free, and thus both false positive and false negative detections need to be reviewed by a specialized human operator. Traditional multi-camera systems usually include a control center with a wall of monitors displaying video from each camera in the network. Nevertheless, as the number of cameras increases, switching from one camera to another becomes hard for a human operator. In this work we propose a new method that dynamically selects and displays the content of one video camera from all the available contents in the multi-camera system. The proposed method is based on a computational model of human visual attention that integrates top-down and bottom-up cues. We believe that this is the first work that tries to use a model of human visual attention for the dynamic selection of the camera view of a multi-camera system. The proposed method was evaluated in a given scenario and demonstrated its effectiveness with respect to other methods and to manually generated ground truth. The effectiveness was evaluated in terms of the number of correct best views generated by the method with respect to the camera views manually selected by a human operator.
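The paper's attention model is not reproduced here. As a rough sketch of the general idea of ranking camera feeds by a combined bottom-up/top-down score and displaying the winner (the motion-energy cue and per-camera weights below are stand-ins, not the authors' model), one might write:

    import numpy as np

    def bottom_up_score(prev_frame, frame):
        """Crude bottom-up cue: mean absolute frame difference (motion energy)."""
        return float(np.mean(np.abs(frame.astype(float) - prev_frame.astype(float))))

    def select_best_view(prev_frames, frames, top_down_weights):
        """Return the index of the camera with the highest combined attention score.
        top_down_weights: per-camera priors (e.g. operator-defined importance)."""
        scores = [w * bottom_up_score(p, f)
                  for p, f, w in zip(prev_frames, frames, top_down_weights)]
        return int(np.argmax(scores)), scores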
2008-02-06
KENNEDY SPACE CENTER, FLA. -- On the flight deck of space shuttle Atlantis, STS-122 Mission Specialist Hans Schlegel handles the camera to be used during the mission. Schlegel represents the European Space Agency. The STS-122 mission to the International Space Station is scheduled to launch at 2:45 p.m. Feb. 7 with a crew of seven. Atlantis will carry the Columbus Laboratory, Europe's largest contribution to the construction of the station. Columbus will support scientific and technological research in a microgravity environment. Columbus is a multifunctional, pressurized laboratory that will be permanently attached to the Harmony module to carry out experiments in materials science, fluid physics and biosciences, as well as to perform a number of technological applications. Photo credit: NASA/Kim Shiflett
2001-03-13
Arrays of lights (left) in the Spacecraft Assembly and Encapsulation Facility (SAEF 2) are used for illumination testing on the solar array panels at right. The panels are part of the 2001 Mars Odyssey Orbiter. Scheduled for launch April 7, 2001, the orbiter carries three science instruments: THEMIS, the Gamma Ray Spectrometer (GRS), and the Mars Radiation Environment Experiment (MARIE). THEMIS will map the mineralogy and morphology of the Martian surface using a high-resolution camera and a thermal infrared imaging spectrometer. The GRS will achieve global mapping of the elemental composition of the surface and determine the abundance of hydrogen in the shallow subsurface. The MARIE will characterize aspects of the near-space radiation environment with regard to the radiation-related risk to human explorers.
2001-03-13
Workers in the Spacecraft Assembly and Encapsulation Facility (SAEF 2) stand alongside the 2001 Mars Odyssey Orbiter and behind its solar array panels during testing. The arrays of lights (right) focus on the panels during illumination testing. Scheduled for launch April 7, 2001, the orbiter carries three science instruments: THEMIS, the Gamma Ray Spectrometer (GRS), and the Mars Radiation Environment Experiment (MARIE). THEMIS will map the mineralogy and morphology of the Martian surface using a high-resolution camera and a thermal infrared imaging spectrometer. The GRS will achieve global mapping of the elemental composition of the surface and determine the abundance of hydrogen in the shallow subsurface. The MARIE will characterize aspects of the near-space radiation environment with regard to the radiation-related risk to human explorers.
Chandra High Resolution Camera (HRC). Rev. 59
NASA Technical Reports Server (NTRS)
Murray, Stephen
2004-01-01
This monthly report discusses management and general status, mission support and operations, and science activities. A technical memorandum entitled "Failure Analysis of HRC Flight Relay" is included with the report.
Low-cost digital dynamic visualization system
NASA Astrophysics Data System (ADS)
Asundi, Anand K.; Sajan, M. R.
1995-05-01
High-speed photographic systems such as the image rotation camera, the Cranz-Schardin camera and the drum camera are typically used for recording and visualizing dynamic events in stress analysis, fluid mechanics, etc. All these systems are fairly expensive and generally not simple to use. Furthermore, they are all based on photographic film recording, requiring time-consuming and tedious wet processing of the films. Digital cameras are currently replacing conventional cameras to a certain extent for static experiments. Recently, there has been considerable interest in developing and modifying CCD architectures and recording arrangements for dynamic scene analysis. Herein we report the use of a CCD camera operating in the Time Delay and Integration (TDI) mode for digitally recording dynamic scenes. Applications to solid as well as fluid impact problems are presented.
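The abstract does not detail the TDI readout. The toy model below only illustrates the shift-and-add principle of TDI operation, in which charge is shifted one stage per line period and integrated so that an object moving in step with the shifts is exposed many times; the stage count and inputs are assumptions:

    import numpy as np

    def tdi_record(scene_lines, tdi_stages=64):
        """Toy model of TDI readout.
        scene_lines: iterable of 1-D exposures (one per line period), each at
        least `tdi_stages` samples long. Charge is shifted one stage per line
        period and integrated; the last stage is read out each period."""
        accumulator = np.zeros(tdi_stages)
        output = []
        for line in scene_lines:
            output.append(accumulator[-1])        # read out the oldest stage
            accumulator = np.roll(accumulator, 1)  # shift charge one stage
            accumulator[0] = 0.0                   # fresh, empty stage at the top
            accumulator += np.asarray(line, dtype=float)[:tdi_stages]
        return np.array(output)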
Hole at Buckskin Drilled Days Before Landing Anniversary
2015-08-05
NASA's Curiosity Mars rover drilled this hole to collect sample material from a rock target called "Buckskin" on July 30, 2015, during the 1,060th Martian day, or sol, of the rover's work on Mars. The diameter is slightly smaller than a U.S. dime. Curiosity landed on Mars on Aug. 6, 2012, Universal Time (evening of Aug. 5, PDT). The rover took this image with the Mars Hand Lens Imager (MAHLI) camera, which is mounted on the same robotic arm as the sample-collecting drill. Rock powder from the collected sample was subsequently delivered to a laboratory inside the rover for analysis. During this sample collection, the rover's drill showed no sign of the intermittent short-circuiting issue that was detected earlier in 2015. The Buckskin target is in an area near "Marias Pass" on lower Mount Sharp where Curiosity had detected unusually high levels of silica and hydrogen. MAHLI was built by Malin Space Science Systems, San Diego. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Science Laboratory Project for the NASA Science Mission Directorate, Washington. JPL designed and built the project's Curiosity rover. http://photojournal.jpl.nasa.gov/catalog/PIA19804
IMAX and Nikon Camera Sensor Cleaning
2015-01-25
ISS042E182382 (01/25/2015) --- US astronaut Barry "Butch" Wilmore inspects one of the cameras aboard the International Space Station Jan. 25, 2015, in preparation for another photo session of station experiments. Wilmore is the Commander of Expedition 42.
A simple demonstration when studying the equivalence principle
NASA Astrophysics Data System (ADS)
Mayer, Valery; Varaksina, Ekaterina
2016-06-01
The paper proposes a lecture experiment that can be demonstrated when studying the equivalence principle formulated by Albert Einstein. The demonstration consists of creating stroboscopic photographs of a ball moving along a parabola in Earth's gravitational field. In the first experiment, a camera is stationary relative to Earth's surface. In the second, the camera falls freely downwards with the ball, allowing students to see that the ball moves uniformly and rectilinearly relative to the frame of reference of the freely falling camera. The equivalence principle explains this result, as it is always possible to propose an inertial frame of reference for a small region of a gravitational field, where space-time effects of curvature are negligible.
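A one-line calculation (assuming uniform gravitational acceleration g and negligible air resistance, which the demonstration implicitly relies on) shows why the ball appears force-free in the falling camera's frame. Both bodies obey

    \mathbf{r}_{\mathrm{ball}}(t) = \mathbf{r}_{\mathrm{ball}}(0) + \mathbf{v}_{\mathrm{ball}}(0)\,t + \tfrac{1}{2}\mathbf{g}\,t^{2},
    \qquad
    \mathbf{r}_{\mathrm{cam}}(t) = \mathbf{r}_{\mathrm{cam}}(0) + \mathbf{v}_{\mathrm{cam}}(0)\,t + \tfrac{1}{2}\mathbf{g}\,t^{2},

so the gravitational terms cancel in the relative position,

    \mathbf{r}_{\mathrm{ball}}(t) - \mathbf{r}_{\mathrm{cam}}(t) = \Delta\mathbf{r}(0) + \Delta\mathbf{v}(0)\,t,

which is uniform rectilinear motion, exactly what the stroboscopic photographs taken by the freely falling camera show.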
Analysis of Photogrammetry Data from ISIM Mockup
NASA Technical Reports Server (NTRS)
Nowak, Maria; Hill, Mike
2007-01-01
During ground testing of the Integrated Science Instrument Module (ISIM) for the James Webb Space Telescope (JWST), the ISIM Optics group plans to use a photogrammetry measurement system for cryogenic calibration of specific target points on the ISIM composite structure, the Science Instrument optical benches, and other GSE equipment. This testing will occur in the Space Environmental Systems (SES) chamber at Goddard Space Flight Center. Close-range photogrammetry is a 3-dimensional metrology technique that uses triangulation to locate custom targets in three coordinates from a collection of digital photographs taken at various locations and orientations. These photos are connected using coded targets (special targets recognized by the software that allow the images to be correlated into a 3-dimensional map of the targets) and are scaled via well-calibrated scale bars. Photogrammetry solves for the camera locations and the coordinates of the targets simultaneously through the bundling procedure contained in the V-STARS software, proprietary software owned by Geodetic Systems Inc. The primary objectives of the metrology performed on the ISIM mock-up were (1) to quantify the accuracy of the INCA3 photogrammetry camera on a representative full-scale version of the ISIM structure at ambient temperature by comparing the measurements obtained with this camera to measurements using the Leica laser tracker system, and (2) to empirically determine the smallest increment of target position movement that can be resolved by the PG camera in the test setup, i.e., its precision, or resolution. In addition, the geometrical details of the test setup defined during the mockup testing, such as target locations and camera positions, will contribute to the final design of the photogrammetry system to be used on the ISIM flight structure.
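The V-STARS bundle adjustment solves for all cameras and targets jointly and is not reproduced here. As a minimal sketch of the underlying two-view triangulation step only (linear DLT with hypothetical projection matrices and image points), one could write:

    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """Linear (DLT) triangulation of one target from two views.
        P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) image points."""
        A = np.vstack([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        return X[:3] / X[3]   # homogeneous -> Euclidean coordinates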
Attitude identification for SCOLE using two infrared cameras
NASA Technical Reports Server (NTRS)
Shenhar, Joram
1991-01-01
An algorithm is presented that incorporates real-time data from two infrared cameras and computes the attitude parameters of the Spacecraft COntrol Lab Experiment (SCOLE), a lab apparatus representing an offset-feed antenna attached to the Space Shuttle by a flexible mast. The algorithm uses camera position data of three miniature light emitting diodes (LEDs) mounted on the SCOLE platform, permitting arbitrary camera placement and on-line attitude extraction. The continuous nature of the algorithm allows identification of the placement of the two cameras with respect to some initial position of the three reference LEDs, followed by on-line six-degree-of-freedom attitude tracking, regardless of the attitude time history. The algorithm is described in both its camera-identification mode and its target-tracking mode. Experimental data from a reduced-size SCOLE-like lab model, reflecting the performance of the camera identification and tracking processes, are presented. Computer code for camera placement identification and SCOLE attitude tracking is listed.
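The report's own code is not reproduced here. As a compact sketch of one standard way to recover a six-degree-of-freedom pose from three tracked marker positions (the SVD-based Kabsch fit, offered as an illustration rather than the SCOLE algorithm itself), one might write:

    import numpy as np

    def pose_from_markers(ref_pts, obs_pts):
        """Best-fit rotation R and translation t taking reference LED positions
        to their observed 3-D positions, via the SVD-based Kabsch method.
        ref_pts, obs_pts: (3, 3) arrays, one LED per row."""
        ref_c = ref_pts - ref_pts.mean(axis=0)
        obs_c = obs_pts - obs_pts.mean(axis=0)
        H = ref_c.T @ obs_c
        U, _, Vt = np.linalg.svd(H)
        # Guard against a reflection in the least-squares solution
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = obs_pts.mean(axis=0) - R @ ref_pts.mean(axis=0)
        return R, t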
Real-time vehicle matching for multi-camera tunnel surveillance
NASA Astrophysics Data System (ADS)
Jelača, Vedran; Niño Castañeda, Jorge Oswaldo; Frías-Velázquez, Andrés; Pižurica, Aleksandra; Philips, Wilfried
2011-03-01
Tracking multiple vehicles with multiple cameras is a challenging problem of great importance in tunnel surveillance. One of the main challenges is accurate vehicle matching across cameras with non-overlapping fields of view. Since systems dedicated to this task can contain hundreds of cameras, each observing dozens of vehicles, computational efficiency is essential for real-time performance. In this paper, we propose a low-complexity yet highly accurate method for vehicle matching using vehicle signatures composed of Radon-transform-like projection profiles of the vehicle image. The proposed signatures can be calculated by a simple scan-line algorithm in the camera software itself and transmitted to the central server or to the other cameras in a smart camera environment. The amount of data is drastically reduced compared to the whole image, which relaxes the data-link capacity requirements. Experiments on real vehicle images, extracted from video sequences recorded in a tunnel by two distant security cameras, validate our approach.
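The exact Radon-like profiles are produced by the authors' scan-line algorithm and are not specified here. As a simplified stand-in that captures the idea of matching low-dimensional projection signatures instead of whole images (plain row/column sums and normalized correlation, both assumptions), a sketch might be:

    import numpy as np

    def signature(vehicle_img):
        """Projection-profile signature: normalized row and column sums of the
        grayscale vehicle image (a simplified stand-in for the Radon-like profiles)."""
        img = np.asarray(vehicle_img, dtype=float)
        rows = img.sum(axis=1)
        cols = img.sum(axis=0)
        return np.concatenate([rows / rows.sum(), cols / cols.sum()])

    def match_score(sig_a, sig_b):
        """Normalized cross-correlation between two signatures (1.0 = identical)."""
        n = min(len(sig_a), len(sig_b))
        a = sig_a[:n] - sig_a[:n].mean()
        b = sig_b[:n] - sig_b[:n].mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))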
Effect of image scaling on stereoscopic movie experience
NASA Astrophysics Data System (ADS)
Häkkinen, Jukka P.; Hakala, Jussi; Hannuksela, Miska; Oittinen, Pirkko
2011-03-01
Camera separation affects the perceived depth in stereoscopic movies. Through control of the separation and thereby the depth magnitudes, the movie can be kept comfortable but interesting. In addition, the viewing context has a significant effect on the perceived depth, as a larger display and longer viewing distances also contribute to an increase in depth. Thus, if the content is to be viewed in multiple viewing contexts, the depth magnitudes should be carefully planned so that the content always looks acceptable. Alternatively, the content can be modified for each viewing situation. To identify the significance of changes due to the viewing context, we studied the effect of stereoscopic camera base distance on the viewer experience in three different situations: 1) small sized video and a viewing distance of 38 cm, 2) television and a viewing distance of 158 cm, and 3) cinema and a viewing distance of 6-19 meters. We examined three different animations with positive parallax. The results showed that the camera distance had a significant effect on the viewing experience in small display/short viewing distance situations, in which the experience ratings increased until the maximum disparity in the scene was 0.34 - 0.45 degrees of visual angle. After 0.45 degrees, increasing the depth magnitude did not affect the experienced quality ratings. Interestingly, changes in the camera distance did not affect the experience ratings in the case of television or cinema if the depth magnitudes were below one degree of visual angle. When the depth was greater than one degree, the experience ratings began to drop significantly. These results indicate that depth magnitudes have a larger effect on the viewing experience with a small display. When a stereoscopic movie is viewed from a larger display, other experiences might override the effect of depth magnitudes.
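The disparities above are reported in degrees of visual angle. For reference, the standard conversion from on-screen horizontal parallax p and viewing distance d (a textbook small-angle relation, not a formula given in the paper) is

    \theta = 2\arctan\!\left(\frac{p}{2d}\right) \approx \frac{180}{\pi}\,\frac{p}{d}\ \text{degrees},

so a given physical parallax subtends a smaller angular disparity on a distant cinema screen than on a handheld display at 38 cm, which is why the depth budget must be planned, or the content re-rendered, per viewing context.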
X-ray pinhole camera setups used in the Atomki ECR Laboratory for plasma diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rácz, R., E-mail: rracz@atomki.hu; Biri, S.; Pálinkás, J.
Imaging of electron cyclotron resonance (ECR) plasmas with a CCD camera combined with a pinhole is a non-destructive diagnostic method for recording the strongly inhomogeneous spatial density distribution of the X-rays emitted by the plasma and by the chamber walls. This method can provide information on the location of collisions between warm electrons and multiply charged ions/atoms, opening the possibility of investigating the direct effect of the ion-source tuning parameters on the plasma structure. The first successful experiment with a pinhole X-ray camera was carried out in the Atomki ECR Laboratory more than 10 years ago. The goal of that experiment was to make the first ECR X-ray photos and to carry out simple studies on the effect of some setting parameters (magnetic field, extraction, disc voltage, gas mixing, etc.). Recently, intensive efforts were made to investigate the effect of different RF resonant modes on the plasma structure. Compared to the 2002 experiment, this campaign used a wider instrumental stock: a CCD camera with a lead pinhole was placed at the injection side, allowing X-ray imaging and beam extraction simultaneously. Additionally, Silicon Drift Detector (SDD) and High Purity Germanium (HPGe) detectors were installed to characterize the volumetric X-ray emission rate caused by the warm and hot electron domains. In this paper, a detailed comparison of the two X-ray camera and detector setups, and of the technical and scientific goals of the experiments, is presented.
Imaging of the electron cyclotron resonance (ECR) plasmas by using CCD camera in combination with a pinhole is a non-destructive diagnostics method to record the strongly inhomogeneous spatial density distribution of the X-ray emitted by the plasma and by the chamber walls. This method can provide information on the location of the collisions between warm electrons and multiple charged ions/atoms, opening the possibility to investigate the direct effect of the ion source tuning parameters to the plasma structure. The first successful experiment with a pinhole X-ray camera was carried out in the Atomki ECR Laboratory more than 10 years ago.more » The goal of that experiment was to make the first ECR X-ray photos and to carry out simple studies on the effect of some setting parameters (magnetic field, extraction, disc voltage, gas mixing, etc.). Recently, intensive efforts were taken to investigate now the effect of different RF resonant modes to the plasma structure. Comparing to the 2002 experiment, this campaign used wider instrumental stock: CCD camera with a lead pinhole was placed at the injection side allowing X-ray imaging and beam extraction simultaneously. Additionally, Silicon Drift Detector (SDD) and High Purity Germanium (HPGe) detectors were installed to characterize the volumetric X-ray emission rate caused by the warm and hot electron domains. In this paper, detailed comparison study on the two X-ray camera and detector setups and also on the technical and scientific goals of the experiments is presented.« less