Effects of red light camera enforcement on fatal crashes in large U.S. cities.
Hu, Wen; McCartt, Anne T; Teoh, Eric R
2011-08-01
To estimate the effects of red light camera enforcement on per capita fatal crash rates at intersections with signal lights. From the 99 large U.S. cities with more than 200,000 residents in 2008, 14 cities were identified with red light camera enforcement programs for all of 2004-2008 but not at any time during 1992-1996, and 48 cities were identified without camera programs during either period. Analyses compared the citywide per capita rate of fatal red light running crashes and the citywide per capita rate of all fatal crashes at signalized intersections during the two study periods, and rate changes were then compared for cities with and without camera programs. Poisson regression was used to model crash rates as a function of red light camera enforcement, land area, and population density. The average annual rate of fatal red light running crashes declined for both study groups, but the decline was larger for cities with red light camera enforcement programs than for cities without camera programs (35% vs. 14%). The average annual rate of all fatal crashes at signalized intersections decreased by 14% for cities with camera programs and increased slightly (2%) for cities without cameras. After controlling for population density and land area, the rate of fatal red light running crashes during 2004-2008 for cities with camera programs was an estimated 24% lower than what would have been expected without cameras. The rate of all fatal crashes at signalized intersections during 2004-2008 for cities with camera programs was an estimated 17% lower than what would have been expected without cameras. Red light camera enforcement programs were associated with a statistically significant reduction in the citywide rate of fatal red light running crashes and a smaller but still significant reduction in the rate of all fatal crashes at signalized intersections. The study adds to the large body of evidence that red light camera enforcement can prevent the most serious crashes. Communities seeking to reduce crashes at intersections should consider this evidence. Copyright © 2011 Elsevier Ltd. All rights reserved.
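A minimal sketch of the kind of Poisson rate model this abstract describes, written in Python with statsmodels; the city-level counts, populations, and covariate values below are invented for illustration and are not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per city and study period; 'fatal_rlr' is the count of fatal red
# light running crashes and 'population' supplies the per capita offset.
df = pd.DataFrame({
    "fatal_rlr":   [12, 7, 15, 14, 9, 5, 20, 16],
    "population":  [450_000, 460_000, 600_000, 610_000,
                    380_000, 385_000, 720_000, 735_000],
    "camera":      [0, 1, 0, 0, 0, 1, 0, 0],        # 1 = camera program active
    "land_area":   [120, 120, 200, 200, 95, 95, 260, 260],       # square miles
    "pop_density": [3750, 3833, 3000, 3050, 4000, 4053, 2769, 2827],
})

model = smf.glm(
    "fatal_rlr ~ camera + land_area + pop_density",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["population"]),       # models a per capita rate
).fit()

# exp(coef) on 'camera' is the rate ratio; a value of 0.76 would correspond to
# the 24% lower fatal red light running crash rate reported above.
print(np.exp(model.params["camera"]))
```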
Seeing Red: Discourse, Metaphor, and the Implementation of Red Light Cameras in Texas
ERIC Educational Resources Information Center
Hayden, Lance Alan
2009-01-01
This study examines the deployment of automated red light camera systems in the state of Texas from 2003 through late 2007. The deployment of new technologies in general, and surveillance infrastructures in particular, can prove controversial and challenging for the formation of public policy. Red light camera surveillance during this period in…
DOT National Transportation Integrated Search
2006-03-01
This report presents results from an analysis of about 47,000 red light violation records collected from 11 intersections in the City of Sacramento, California, by red light photo enforcement cameras between May 1999 and June 2003. The goal of this...
Evaluating the Impacts of Red Light Camera Deployment on Intersection Traffic Safety
DOT National Transportation Integrated Search
2018-06-01
Red-light cameras (RLC) are a popular countermeasure to reduce red-light running and improve intersection safety. Studies show that the reduction in side impact crashes at RLC intersections is often accompanied by no change or an increase in the num...
Reductions in injury crashes associated with red light camera enforcement in oxnard, california.
Retting, Richard A; Kyrychenko, Sergey Y
2002-11-01
This study estimated the impact of red light camera enforcement on motor vehicle crashes in one of the first US communities to employ such cameras: Oxnard, California. Crash data were analyzed for Oxnard and for 3 comparison cities. Changes in crash frequencies were compared for Oxnard and control cities and for signalized and nonsignalized intersections by means of a generalized linear regression model. Overall, crashes at signalized intersections throughout Oxnard were reduced by 7% and injury crashes were reduced by 29%. Right-angle crashes, those most associated with red light violations, were reduced by 32%; right-angle crashes involving injuries were reduced by 68%. Because red light cameras can be a permanent component of the transportation infrastructure, crash reductions attributed to camera enforcement should be sustainable.
Safety Evaluation of Red Light Running Camera Intersections in Illinois
DOT National Transportation Integrated Search
2017-04-01
As a part of this research, the safety performance of red light running (RLR) camera systems was evaluated for a sample of 41 intersections and 60 RLR camera approaches located on state routes under IDOT's jurisdiction in the Chicago suburbs. Compr...
Safety evaluation of red-light cameras
DOT National Transportation Integrated Search
2005-04-01
The objective of this study was to determine the effectiveness of red-light-camera (RLC) systems in reducing crashes. The study used an empirical Bayes before-and-after approach with data from seven jurisdictions across the United States at 132 t...
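A hedged sketch of the empirical Bayes before-and-after logic this line of work relies on; the safety performance function form, its coefficients, the overdispersion parameter, and the crash counts below are all assumptions made for illustration, not the study's values.

```python
import numpy as np

def spf_prediction(aadt_major, aadt_minor, years):
    """Assumed safety performance function: crashes = years * a * F1^b1 * F2^b2."""
    a, b1, b2 = 3e-4, 0.60, 0.35
    return years * a * aadt_major**b1 * aadt_minor**b2

def eb_estimate(observed_before, predicted_before, k=0.4):
    """Empirical Bayes blend of the SPF prediction and the site's own count."""
    w = 1.0 / (1.0 + k * predicted_before)        # k = assumed overdispersion
    return w * predicted_before + (1.0 - w) * observed_before

pred_before = spf_prediction(25_000, 8_000, years=3)
pred_after = spf_prediction(27_000, 8_500, years=3)     # traffic grew slightly
eb_before = eb_estimate(observed_before=18, predicted_before=pred_before)

# Expected after-period crashes had no camera been installed, vs. observed.
expected_after = eb_before * (pred_after / pred_before)
observed_after = 11
print("index of effectiveness:", observed_after / expected_after)
```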
Spillover Effect and Economic Effect of Red Light Cameras
DOT National Transportation Integrated Search
2017-04-01
"Spillover effect" of red light cameras (RLCs) refers to the expected safety improvement at intersections other than those actually treated. Such effects may be due to jurisdiction-wide publicity of RLCs and the general publics lack of knowledge o...
C-RED One and C-RED2: SWIR high-performance cameras using Saphira e-APD and Snake InGaAs detectors
NASA Astrophysics Data System (ADS)
Gach, Jean-Luc; Feautrier, Philippe; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Carmignani, Thomas; Wanwanscappel, Yann; Boutolleau, David
2018-02-01
After the development of the OCAM2 EMCCD fast visible camera dedicated to advanced adaptive optics wavefront sensing, First Light Imaging moved to SWIR fast cameras with the development of the C-RED One and the C-RED 2 cameras. First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with subelectron readout noise and very low background. C-RED One is based on the latest version of the SAPHIRA detector developed by Leonardo UK. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array, which is a real disruptive technology in imagery. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of readout techniques and processes video on-board thanks to an FPGA. We will show its performance and present its main features. In addition to this project, First Light Imaging developed an InGaAs 640x512 fast camera with unprecedented performance in terms of noise, dark current, and readout speed, based on the SNAKE SWIR detector from Sofradir. The camera was called C-RED 2. The C-RED 2 characteristics and performance will be described. The C-RED One project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944. The C-RED 2 development is supported by the "Investments for the future" program and the Provence Alpes Côte d'Azur Region, in the frame of the CPER.
Feasibility of Using Video Cameras for Automated Enforcement on Red-Light Running and Managed Lanes.
DOT National Transportation Integrated Search
2009-12-01
The overall objective of this study is to evaluate the feasibility, effectiveness, legality, and public acceptance aspects of automated enforcement on red light running and high occupancy vehicle (HOV) occupancy requirement using video cameras in Nev...
Feasibility of Using Video Camera for Automated Enforcement on Red-Light Running and Managed Lanes.
DOT National Transportation Integrated Search
2009-12-25
The overall objective of this study is to evaluate the feasibility, effectiveness, legality, and public acceptance aspects of automated enforcement on red light running and HOV occupancy requirement using video cameras in Nevada. This objective was a...
The impact of red light cameras (photo-red enforcement) on crashes in Virginia.
DOT National Transportation Integrated Search
2007-01-01
Red light running is a significant public health concern, killing more than 800 people and injuring 200,000 in the United States per year (Retting et al., 1999a; Retting and Kyrychenko, 2002). To reduce red light running in Virginia, six jurisdiction...
Leveraging traffic and surveillance video cameras for urban traffic.
DOT National Transportation Integrated Search
2014-12-01
The objective of this project was to investigate the use of existing video resources, such as traffic cameras, police cameras, red light cameras, and security cameras for the long-term, real-time collection of traffic statistics. An additional ob...
Vanlaar, Ward; Robertson, Robyn; Marcoux, Kyla
2014-01-01
The objective of this study was to evaluate the impact of Winnipeg's photo enforcement safety program on speeding, i.e., "speed on green", and red-light running behavior at intersections as well as on crashes resulting from these behaviors. ARIMA time series analyses regarding crashes related to red-light running (right-angle crashes and rear-end crashes) and crashes related to speeding (injury crashes and property damage only crashes) occurring at intersections were conducted using monthly crash counts from 1994 to 2008. A quasi-experimental intersection camera experiment was also conducted using roadside data on speeding and red-light running behavior at intersections. These data were analyzed using logistic regression analysis. The time series analyses showed that for crashes related to red-light running, there had been a 46% decrease in right-angle crashes at camera intersections, but that there had also been an initial 42% increase in rear-end crashes. For crashes related to speeding, analyses revealed that the installation of cameras was not associated with increases or decreases in crashes. Results of the intersection camera experiment show that there were significantly fewer red light running violations at intersections after installation of cameras and that photo enforcement had a protective effect on speeding behavior at intersections. However, the data also suggest photo enforcement may be less effective in preventing serious speeding violations at intersections. Overall, Winnipeg's photo enforcement safety program had a positive net effect on traffic safety. Results from both the ARIMA time series and the quasi-experimental design corroborate one another. However, the protective effect of photo enforcement is not equally pronounced across different conditions so further monitoring is required to improve the delivery of this measure. Results from this study as well as limitations are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.
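A minimal sketch of an intervention-style ARIMA analysis of monthly crash counts, in the spirit of the time series approach described above; the simulated series, camera start date, and model orders are assumptions, not the Winnipeg data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
idx = pd.date_range("1994-01-01", "2008-12-01", freq="MS")
step = pd.Series((idx >= "2003-01-01").astype(float), index=idx,
                 name="camera_step")                 # 1 after cameras switched on
crashes = pd.Series(30 + 5 * np.sin(2 * np.pi * idx.month / 12)
                    - 8 * step.values + rng.normal(0, 3, len(idx)), index=idx)

fit = SARIMAX(crashes, exog=step, order=(1, 0, 1),
              seasonal_order=(1, 0, 1, 12)).fit(disp=False)

# The coefficient on 'camera_step' estimates the post-installation level shift
# in monthly crash counts; a negative value indicates a reduction.
print(fit.params["camera_step"])
```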
DOT National Transportation Integrated Search
2005-01-01
Red light running, which is defined as the act of a motorist entering an intersection after the traffic signal has turned red, caused almost 5,000 crashes in Virginia in 2003, resulting in at least 18 deaths and more than 3,800 injuries. In response ...
Red light running camera assessment.
DOT National Transportation Integrated Search
2011-04-01
In the 2004-2007 period, the Mission Street SE and 25th Street SE intersection in Salem, Oregon showed relatively few crashes attributable to red light running (RLR) but, since a high number of RLR violations were observed, the intersection was ident...
DOT National Transportation Integrated Search
2011-11-01
Red light running (RLR) is a problem in the US that results in 165,000 injuries and 907 fatalities annually. In Iowa, RLR-related crashes make up 24.5 percent of all crashes and account for 31.7 percent of fatal and major injury crashes at signa...
Using Meta Analysis Techniques to Assess the Safety Effect of Red Light Running Cameras
DOT National Transportation Integrated Search
2002-02-01
Automated enforcement programs, including automated systems that are used to enforce red light running violations, have recently come under scrutiny regarding their value in terms of improving safety, their primary purpose. One of the major hurdles t...
Spectrally resolved laser interference microscopy
NASA Astrophysics Data System (ADS)
Butola, Ankit; Ahmad, Azeem; Dubey, Vishesh; Senthilkumaran, P.; Singh Mehta, Dalip
2018-07-01
We developed a new quantitative phase microscopy technique, namely, spectrally resolved laser interference microscopy (SR-LIM), with which it is possible to quantify multi-spectral phase information related to biological specimens without color crosstalk using a color CCD camera. It is a single-shot technique in which sequential switching on/off of red, green, and blue (RGB) light sources is not required. The method is implemented using a three-wavelength interference microscope and a customized compact grating based imaging spectrometer fitted at the output port. The results of the USAF resolution chart while employing three different light sources, namely, a halogen lamp, light emitting diodes, and lasers, are discussed and compared. Broadband light sources such as the halogen lamp and light emitting diodes lead to stretching of the spectrally decomposed images, whereas this is not observed with narrow-band light sources, i.e., lasers. The proposed technique is further successfully employed for single-shot quantitative phase imaging of human red blood cells at three wavelengths simultaneously without color crosstalk. Using the present technique, one can also use a monochrome camera, even though the experiments are performed using multi-color light sources. Finally, SR-LIM is not limited to RGB wavelengths; it can be extended to red, near infra-red, and infra-red wavelengths, which are suitable for various biological applications.
Study of plant phototropic responses to different LEDs illumination in microgravity
NASA Astrophysics Data System (ADS)
Zyablova, Natalya; Berkovich, Yuliy A.; Skripnikov, Alexander; Nikitin, Vladimir
2012-07-01
The purpose of the experiment planned for the Russian BION-M #1 (2012) biosatellite is to study the phototropic responses of Physcomitrella patens (Hedw.) B.S.G. to different light stimuli in microgravity. The moss was chosen as a small-size higher plant. The experimental design involves five lightproof culture flasks with moss gametophores fixed inside the cylindrical container (diameter 120 mm; height 240 mm). The plants in each flask are illuminated laterally by one of the following LEDs: white, blue (475 nm), red (625 nm), far red (730 nm), infrared (950 nm). The gametophores' growth and bending are captured periodically by means of five analogue video cameras and a recorder. The programmable command module controls the power supply of each camera and each light source, commutation of the cameras, and functioning of the video recorder. Every 20 minutes the recorder sequentially connects to one of the cameras. This results in a clip containing 5 sets of frames in a row. After landing, time-lapse films are automatically created. As a result we will have five time-lapse films covering transformations in each of the five culture flasks. On-ground experiments demonstrated that white light induced stronger gametophore phototropic bending compared to red and blue stimuli. The comparison of time-lapse recordings in the experiments will provide useful information to optimize lighting assemblies for space plant growth facilities.
To brake or to accelerate? Safety effects of combined speed and red light cameras.
De Pauw, Ellen; Daniels, Stijn; Brijs, Tom; Hermans, Elke; Wets, Geert
2014-09-01
The present study evaluates the traffic safety effect of combined speed and red light cameras installed between 2002 and 2007 at 253 signalized intersections in Flanders, Belgium. The adopted approach is a before-and-after study with control for the trend. The analyses showed a non-significant increase of 5% in the number of injury crashes. An almost significant decrease of 14% was found for the more severe crashes. The number of rear-end crashes turned out to have increased significantly (+44%), whereas a non-significant decrease (-6%) was found in the number of side crashes. The decrease for the severe crashes was mainly attributable to the effect on side crashes, for which a significant decrease of 24% was found. It is concluded that combined speed and red light cameras have a favorable effect on traffic safety, in particular on severe crashes. However, future research should examine the circumstances of rear-end crashes and how this increase can be managed. Copyright © 2014 National Safety Council and Elsevier Ltd. All rights reserved.
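A small sketch of the basic before-and-after comparison with a trend (comparison group) correction that underlies this kind of evaluation; the crash counts are invented, and the study's actual estimation is more elaborate.

```python
def effectiveness_index(treated_before, treated_after,
                        comparison_before, comparison_after):
    """theta < 1 means fewer crashes at treated sites than the comparison-group
    trend predicts; (1 - theta) * 100 is the estimated percentage reduction."""
    trend = comparison_after / comparison_before
    expected_after = treated_before * trend
    return treated_after / expected_after

theta = effectiveness_index(treated_before=520, treated_after=500,
                            comparison_before=4100, comparison_after=3900)
print(f"estimated change: {(theta - 1) * 100:+.1f}%")
```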
Opportunistic traffic sensing using existing video sources (phase II).
DOT National Transportation Integrated Search
2017-02-01
The purpose of the project reported on here was to investigate methods for automatic traffic sensing using traffic surveillance cameras, red light cameras, and other permanent and pre-existing video sources. Success in this direction would potentia...
NASA Astrophysics Data System (ADS)
Singh Mehta, Dalip; Srivastava, Vishal
2012-11-01
We report quantitative phase imaging of human red blood cells (RBCs) using phase-shifting interference microscopy. Five phase-shifted white light interferograms were recorded using a colour charge coupled device (CCD) camera. The white light interferograms were decomposed into red, green, and blue colour components. The phase-shifted interferograms of each colour were then processed by phase-shifting analysis and phase maps for red, green, and blue colours were reconstructed. Wavelength dependent refractive index profiles of RBCs were computed from a single set of white light interferograms. The present technique has great potential for non-invasive determination of refractive index variation and morphological features of cells and tissues.
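A brief sketch of the channel-splitting and phase-shifting step described above, using the standard five-step (Hariharan) formula on synthetic colour interferograms; the fringe pattern, per-channel dispersion factors, and pi/2 phase steps are assumptions, not the paper's processing chain.

```python
import numpy as np

# Synthetic colour interferograms: one fringe pattern per channel, with assumed
# phase steps of -pi, -pi/2, 0, +pi/2, +pi centred on the third frame.
h, w = 256, 256
yy, xx = np.mgrid[0:h, 0:w]
obj_phase = 0.05 * xx + 0.02 * yy                 # stand-in for the cell phase
frames = [np.stack([1 + np.cos(obj_phase * s + (k - 2) * np.pi / 2)
                    for s in (1.00, 0.92, 0.85)], axis=-1)   # crude dispersion
          for k in range(5)]

phase_maps = {}
for ch, name in enumerate(("red", "green", "blue")):
    I1, I2, I3, I4, I5 = (f[..., ch] for f in frames)
    # Hariharan five-step formula: tan(phi) = 2(I2 - I4) / (2*I3 - I1 - I5)
    phase_maps[name] = np.arctan2(2.0 * (I2 - I4), 2.0 * I3 - I1 - I5)

# Each wrapped map still needs unwrapping before converting phase to a
# wavelength-dependent refractive index or thickness profile.
```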
Questions Students Ask: The Red-Eye Effect.
ERIC Educational Resources Information Center
Physics Teacher, 1985
1985-01-01
Addresses the question of why a dog's eyes appear red and glow when a flash photograph is taken. Conditions for the red-eye effect, light paths involved, structure of the eye, and typical cameras and lenses are discussed. Also notes differences between the eyes of nocturnal animals and humans. (JN)
Lights, Camera, Spectroscope! The Basics of Spectroscopy Disclosed Using a Computer Screen
ERIC Educational Resources Information Center
Garrido-González, José J.; Trillo-Alcalá, María; Sánchez-Arroyo, Antonio J.
2018-01-01
The generation of secondary colors in digital devices by means of the additive red, green, and blue color model (RGB) can be a valuable way to introduce students to the basics of spectroscopy. This work has been focused on the spectral separation of secondary colors of light emitted by a computer screen into red, green, and blue bands, and how the…
Officials nationwide give a green light to automated traffic enforcement
DOT National Transportation Integrated Search
2000-03-11
There has been resistance to using cameras to automatically identify vehicles driven by motorists who run red lights and drive faster than the posted speed limits. Fairness, privacy, and "big brother" have been cited as reasons. The article examines ...
NASA Astrophysics Data System (ADS)
Dubey, Vishesh; Singh, Veena; Ahmad, Azeem; Singh, Gyanendra; Mehta, Dalip Singh
2016-03-01
We report white light phase shifting interferometry in conjunction with color fringe analysis for the detection of contaminants in water such as Escherichia coli (E. coli), Campylobacter coli and Bacillus cereus. The experimental setup is based on a common path interferometer using a Mirau interferometric objective lens. White light interferograms are recorded using a 3-chip color CCD camera based on prism technology. The 3-chip color camera has less color crosstalk and better spatial resolution than a single-chip CCD camera. A piezo-electric transducer (PZT) phase shifter is fixed to the Mirau objective, and both are attached to a conventional microscope. Five phase-shifted white light interferograms are recorded by the 3-chip color CCD camera and each phase-shifted interferogram is decomposed into its red, green and blue constituent colors, thus producing three sets of five phase-shifted interferograms for three different colors from a single set of white light interferograms. This makes the system faster and less sensitive to the surrounding environment. Initially, 3D phase maps of the bacteria are reconstructed for the red, green and blue wavelengths from these interferograms using MATLAB; from these phase maps we determine the refractive index (RI) of the bacteria. Experimental results of 3D shape measurement and RI at multiple wavelengths will be presented. These results might find applications for the detection of contaminants in water without any chemical processing or fluorescent dyes.
Fundamentals of in Situ Digital Camera Methodology for Water Quality Monitoring of Coast and Ocean
Goddijn-Murphy, Lonneke; Dailloux, Damien; White, Martin; Bowers, Dave
2009-01-01
Conventional digital cameras, the Nikon Coolpix885® and the SeaLife ECOshot®, were used as in situ optical instruments for water quality monitoring. Measured response spectra showed that these digital cameras are basically three-band radiometers. The response values in the red, green and blue bands, quantified by RGB values of digital images of the water surface, were comparable to measurements of irradiance levels at red, green and cyan/blue wavelengths of water leaving light. Different systems were deployed to capture upwelling light from below the surface, while eliminating direct surface reflection. Relationships between RGB ratios of water surface images, and water quality parameters were found to be consistent with previous measurements using more traditional narrow-band radiometers. This current paper focuses on the method that was used to acquire digital images, derive RGB values and relate measurements to water quality parameters. Field measurements were obtained in Galway Bay, Ireland, and in the Southern Rockall Trough in the North Atlantic, where both yellow substance and chlorophyll concentrations were successfully assessed using the digital camera method. PMID:22346729
NASA Astrophysics Data System (ADS)
Howett, C. J. A.; Ennico, K.; Olkin, C. B.; Buie, M. W.; Verbiscer, A. J.; Zangari, A. M.; Parker, A. H.; Reuter, D. C.; Grundy, W. M.; Weaver, H. A.; Young, L. A.; Stern, S. A.
2017-05-01
Light curves produced from color observations taken during New Horizons' approach to the Pluto-system by its Multi-spectral Visible Imaging Camera (MVIC, part of the Ralph instrument) are analyzed. Fifty-seven observations were analyzed; they were obtained between 9th April and 3rd July 2015, at a phase angle of 14.5° to 15.1°, a sub-observer latitude of 51.2°N to 51.5°N, and a sub-solar latitude of 41.2°N. MVIC has four color channels; all are discussed for completeness but only two were found to produce reliable light curves: Blue (400-550 nm) and Red (540-700 nm). The other two channels, Near Infrared (780-975 nm) and Methane-Band (860-910 nm), were found to be potentially erroneous and too noisy respectively. The Blue and Red light curves show that Charon's surface is neutral in color, but slightly brighter on its Pluto-facing hemisphere. This is consistent with previous studies made with the Johnson B and V bands, which are at shorter wavelengths than those of the MVIC Blue and Red channels, respectively.
NASA Technical Reports Server (NTRS)
Howett, C. J. A.; Ennico, K.; Olkin, C. B.; Buie, M. W.; Verbiscer, A. J.; Zangari, A. M.; Parker, A. H.; Reuter, D. C.; Grundy, W. M.; Weaver, H. A.;
2016-01-01
Light curves produced from color observations taken during New Horizons' approach to the Pluto-system by its Multi-spectral Visible Imaging Camera (MVIC, part of the Ralph instrument) are analyzed. Fifty-seven observations were analyzed; they were obtained between 9th April and 3rd July 2015, at a phase angle of 14.5 degrees to 15.1 degrees, a sub-observer latitude of 51.2 degrees North to 51.5 degrees North, and a sub-solar latitude of 41.2 degrees North. MVIC has four color channels; all are discussed for completeness but only two were found to produce reliable light curves: Blue (400-550 nm) and Red (540-700 nm). The other two channels, Near Infrared (780-975 nm) and Methane-Band (860-910 nm), were found to be potentially erroneous and too noisy respectively. The Blue and Red light curves show that Charon's surface is neutral in color, but slightly brighter on its Pluto-facing hemisphere. This is consistent with previous studies made with the Johnson B and V bands, which are at shorter wavelengths than those of the MVIC Blue and Red channels, respectively.
Miniature, mobile X-ray computed radiography system
Watson, Scott A; Rose, Evan A
2017-03-07
A miniature, portable x-ray system may be configured to scan images stored on a phosphor. A flash circuit may be configured to project red light onto a phosphor and receive blue light from the phosphor. A digital monochrome camera may be configured to receive the blue light to capture an article near the phosphor.
A Daytime Aspect Camera for Balloon Altitudes
NASA Technical Reports Server (NTRS)
Dietz, Kurt L.; Ramsey, Brian D.; Alexander, Cheryl D.; Apple, Jeff A.; Ghosh, Kajal K.; Swift, Wesley R.; Six, N. Frank (Technical Monitor)
2001-01-01
We have designed, built, and flight-tested a new star camera for daytime guiding of pointed balloon-borne experiments at altitudes around 40km. The camera and lens are commercially available, off-the-shelf components, but require a custom-built baffle to reduce stray light, especially near the sunlit limb of the balloon. This new camera, which operates in the 600-1000 nm region of the spectrum, successfully provided daytime aspect information of approximately 10 arcsecond resolution for two distinct star fields near the galactic plane. The detected scattered-light backgrounds show good agreement with the Air Force MODTRAN models, but the daytime stellar magnitude limit was lower than expected due to dispersion of red light by the lens. Replacing the commercial lens with a custom-built lens should allow the system to track stars in any arbitrary area of the sky during the daytime.
Linear CCD attitude measurement system based on the identification of the auxiliary array CCD
NASA Astrophysics Data System (ADS)
Hu, Yinghui; Yuan, Feng; Li, Kai; Wang, Yan
2015-10-01
To address high-precision attitude measurement of flying targets over a large space and a large field of view, existing measurement methods are compared and a system is proposed that uses two array CCDs to assist three linear CCDs in identifying multiple cooperative targets. This approach addresses the nonlinear system errors, the large number of calibration parameters, and the overly complicated constraints among camera positions of the existing nine-linear-CCD spectroscopic test system. The mathematical models of the binocular vision and three-linear-CCD test systems are established. Three red LED position lights form a cooperative triangle whose point coordinates are given in advance by a coordinate measuring machine; three blue LED light points are added along the sides of the triangle as auxiliaries so that the array CCDs can more easily identify the three red LED light points, and the linear CCD cameras are fitted with red filters to remove the blue LED light points while reducing stray light. The array CCDs measure the spots and identify and compute the spatial coordinates of the red LED light points, while the linear CCDs measure the three red LED spots to solve the linear CCD test system, from which 27 solutions can be drawn. Using the array CCD coordinates to assist the linear CCDs achieves spot identification and solves the difficult problem of multi-target identification with linear CCDs. Combining the imaging features of linear CCDs, a special cylindrical lens system for the linear CCDs is developed using a telecentric optical design, so that small changes of the spot's energy center position, within the depth-of-convergence range in the direction perpendicular to the optical axis, preserve high-precision image quality, and the entire test system improves the speed and precision of spatial object attitude measurement.
Atmospheric Science Data Center
2013-04-19
MISR Global Images See the Light of Day ... camera and combines data from the red, green and blue spectral bands to create a natural color image. The central view combines ...
SOLAR - ASTRONOMY (APOLLO-SATURN [AS]-16)
1972-05-09
S72-36972 (21 April 1972) --- A color enhancement of a far-ultraviolet photo of Earth taken by astronaut John W. Young, commander, with the ultraviolet camera on April 21, 1972. The original black and white photo was printed on Agfacontour film three times, each exposure recording only one light level. The three light levels were then colored blue (dimmest), green (next brightest), and red (brightest). The three auroral belts, the sunlit atmosphere and the background stars (one very close to Earth, on left) can be studied quantitatively for brightness. The UV camera was designed and built at the Naval Research Laboratory, Washington, D.C. EDITOR'S NOTE: The photographic number of the original black & white UV camera photograph from which this enhancement was made is AS16-123-19657.
Automated enforcement and highway safety : final report.
DOT National Transportation Integrated Search
2013-11-01
The objectives of the Automated Enforcement and Highway Safety Research study were to conduct a literature review of national research related to the effectiveness of Red Light Camera (RLC) programs in changing crash frequency, crash severity, cr...
Full-Field Calibration of Color Camera Chromatic Aberration using Absolute Phase Maps.
Liu, Xiaohong; Huang, Shujun; Zhang, Zonghua; Gao, Feng; Jiang, Xiangqian
2017-05-06
The refractive index of a lens varies with the wavelength of light, so the same incident light at different wavelengths exits the lens along different paths. This characteristic of lenses causes images captured by a color camera to display chromatic aberration (CA), which seriously reduces image quality. Based on an analysis of the distribution of CA, a full-field calibration method based on absolute phase maps is proposed in this paper. Red, green, and blue closed sinusoidal fringe patterns are generated, consecutively displayed on an LCD (liquid crystal display), and captured by a color camera from the front viewpoint. The phase information of each color fringe is obtained using a four-step phase-shifting algorithm and optimum fringe number selection method. CA causes the unwrapped phase of the three channels to differ. These pixel deviations can be computed by comparing the unwrapped phase data of the red, blue, and green channels in polar coordinates. CA calibration is accomplished in Cartesian coordinates. The systematic errors introduced by the LCD are analyzed and corrected. Simulated results show the validity of the proposed method and experimental results demonstrate that the proposed full-field calibration method based on absolute phase maps will be useful for practical software-based CA calibration.
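A short sketch of the four-step phase-shifting computation named in the abstract; the synthetic fringe is illustrative, and the paper's optimum fringe-number unwrapping and polar-coordinate comparison are not reproduced.

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Wrapped phase from four fringe images taken with pi/2 phase steps."""
    return np.arctan2(i3 - i1, i0 - i2)

# Toy check: recover a known phase ramp from a synthetic cosine fringe.
x = np.linspace(0, 4 * np.pi, 512)
frames = [1 + np.cos(x + k * np.pi / 2) for k in range(4)]
wrapped = four_step_phase(*frames)

# After unwrapping, the red- and blue-channel phase maps can be compared pixel
# by pixel against the green reference channel to quantify the CA shift.
```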
Science-Filters Study of Martian Rock Sees Hematite
2017-11-01
This false-color image demonstrates how use of special filters available on the Mast Camera (Mastcam) of NASA's Curiosity Mars rover can reveal the presence of certain minerals in target rocks. It is a composite of images taken through three "science" filters chosen for making hematite, an iron-oxide mineral, stand out as exaggerated purple. This target rock, called "Christmas Cove," lies in an area on Mars' "Vera Rubin Ridge" where Mastcam reconnaissance imaging (see PIA22065) with science filters suggested a patchy distribution of exposed hematite. Bright lines within the rocks are fractures filled with calcium sulfate minerals. Christmas Cove did not appear to contain much hematite until the rover team conducted an experiment on this target: Curiosity's wire-bristled brush, the Dust Removal Tool, scrubbed the rock, and a close-up with the Mars Hand Lens Imager (MAHLI) confirmed the brushing. The brushed area is about 2.5 inches (6 centimeters) across. The next day -- Sept. 17, 2017, on the mission's Sol 1819 -- this observation with Mastcam and others with the Chemistry and Camera (ChemCam) showed a strong hematite presence that had been subdued beneath the dust. The team is continuing to explore whether the patchiness in the reconnaissance imaging may result more from variations in the amount of dust cover rather than from variations in hematite content. Curiosity's Mastcam combines two cameras: one with a telephoto lens and the other with a wider-angle lens. Each camera has a filter wheel that can be rotated in front of the lens for a choice of eight different filters. One filter for each camera is clear to all visible light, for regular full-color photos, and another is specifically for viewing the Sun. Some of the other filters were selected to admit wavelengths of light that are useful for identifying iron minerals. Each of the filters used for this image admits light from a narrow band of wavelengths, extending to only about 5 nanometers longer or shorter than the filter's central wavelength. Three observations are combined for this image, each through one of the filters centered at 751 nanometers (in the near-infrared part of the spectrum just beyond red light), 527 nanometers (green) and 445 nanometers (blue). Usual color photographs from digital cameras -- such as a Mastcam one of this same place (see PIA22067) -- also combine information from red, green and blue filtering, but the filters are in a microscopic grid in a "Bayer" filter array situated directly over the detector behind the lens, with wider bands of wavelengths. Mastcam's narrow-band filters used for this view help to increase spectral contrast, making blues bluer and reds redder, particularly with the processing used to boost contrast in each of the component images of this composite. Fine-grained hematite preferentially absorbs sunlight in the green portion of the spectrum, around 527 nanometers. That gives it the purple look from a combination of red and blue light reflected by the hematite and reaching the camera through the other two filters. https://photojournal.jpl.nasa.gov/catalog/PIA22066
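A hedged sketch of how a three-filter false-colour composite like this one can be assembled: a per-band contrast stretch, then stacking the 751 nm, 527 nm, and 445 nm frames into R, G, and B; the synthetic bands and the percentile stretch are assumptions, not JPL's pipeline.

```python
import numpy as np

def stretch(band, lo=2, hi=98):
    """Linear contrast stretch between the given percentiles."""
    p_lo, p_hi = np.percentile(band, [lo, hi])
    return np.clip((band - p_lo) / (p_hi - p_lo), 0.0, 1.0)

rng = np.random.default_rng(0)
band_751, band_527, band_445 = (rng.uniform(0, 4000, (128, 128)) for _ in range(3))

# 751 nm -> red channel, 527 nm -> green, 445 nm -> blue. Hematite-rich pixels,
# which absorb near 527 nm, come out purple (red + blue) in such a composite.
composite = np.dstack([stretch(band_751), stretch(band_527), stretch(band_445)])
```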
NASA Technical Reports Server (NTRS)
1997-01-01
Lander image of rover near The Dice (three small rocks behind the rover) and Yogi on sol 22. Color (red, green, and blue filters at 6:1 compression) image shows dark rocks, bright red dust, dark red soil exposed in rover tracks, and dark (black) soil. The APXS is in view at the rear of the vehicle, and the forward stereo cameras and laser light stripers are in shadow just below the front edge of the solar panel.
NOTE: original caption as published in Science Magazine
C-RED One: the infrared camera using the Saphira e-APD detector
NASA Astrophysics Data System (ADS)
Greffe, Timothée.; Feautrier, Philippe; Gach, Jean-Luc; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Boutolleau, David; Baker, Ian
2016-08-01
First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise and very low background. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array which is a real disruptive technology in imagery. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of readout techniques and processes video on-board thanks to an FPGA. We will show its performance and present its main features. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.
Cassini First-Look Images of Jupiter
2000-10-05
This image of Jupiter was taken by the Cassini Imaging Science narrow angle camera through the blue filter (centered at 445 nanometers) on October 1, 2000, 15:26 UTC at a distance of 84.1 million km from Jupiter. The smallest features that can be seen are 500 kilometers across. The contrast between bright and dark features in this region of the spectrum is determined by the different light absorbing properties of the particles composing Jupiter's clouds. Ammonia ice particles are white, reflecting all light that falls on them. But some particles are red, and absorb mostly blue light. The composition of these red particles and the processes which determine their distribution are two of the long-standing mysteries of Jovian meteorology and chemistry. Note that the Great Red Spot contains a dark core of absorbing particles. http://photojournal.jpl.nasa.gov/catalog/PIA02666
Atmospheric Science Data Center
2013-04-15
... The images on the left are natural color (red, green, blue) images from MISR's vertical-viewing (nadir) camera. The images on the ... one of MISR's derived surface products. The radiance (light intensity) in each pixel of the so-called "top-of-atmosphere" images on ...
NASA Technical Reports Server (NTRS)
Onate, Bryan
2016-01-01
The International Space Station (ISS) will soon have a platform for conducting fundamental research of Large Plants. Plant Habitat (PH) is designed to be a fully controllable environment for high-quality plant physiological research. PH will control light quality, level, and timing, temperature, CO2, relative humidity, and irrigation, while scrubbing ethylene. Additional capabilities include leaf temperature and root zone moisture and oxygen sensing. The light cap will have red (630 nm), blue (450 nm), green (525 nm), far red (730 nm) and broad spectrum white LEDs. There will be several internal cameras (visible and IR) to monitor and record plant growth and operations.
Baker, Stokes S.; Vidican, Cleo B.; Cameron, David S.; Greib, Haittam G.; Jarocki, Christine C.; Setaputri, Andres W.; Spicuzza, Christopher H.; Burr, Aaron A.; Waqas, Meriam A.; Tolbert, Danzell A.
2012-01-01
Background and aims Studies have shown that levels of green fluorescent protein (GFP) leaf surface fluorescence are directly proportional to GFP soluble protein concentration in transgenic plants. However, instruments that measure GFP surface fluorescence are expensive. The goal of this investigation was to develop techniques with consumer digital cameras to analyse GFP surface fluorescence in transgenic plants. Methodology Inexpensive filter cubes containing machine vision dichroic filters and illuminated with blue light-emitting diodes (LED) were designed to attach to digital single-lens reflex (SLR) camera macro lenses. The apparatus was tested on purified enhanced GFP, and on wild-type and GFP-expressing arabidopsis grown autotrophically and heterotrophically. Principal findings Spectrum analysis showed that the apparatus illuminates specimens with wavelengths between ∼450 and ∼500 nm, and detects fluorescence between ∼510 and ∼595 nm. Epifluorescent photographs taken with SLR digital cameras were able to detect red-shifted GFP fluorescence in Arabidopsis thaliana leaves and cotyledons of pot-grown plants, as well as roots, hypocotyls and cotyledons of etiolated and light-grown plants grown heterotrophically. Green fluorescent protein fluorescence was detected primarily in the green channel of the raw image files. Studies with purified GFP produced linear responses to both protein surface density and exposure time (H0: β (slope) = 0 mean counts per pixel (ng s mm⁻²)⁻¹, r² > 0.994, n = 31, P < 1.75 × 10⁻²⁹). Conclusions Epifluorescent digital photographs taken with complementary metal-oxide-semiconductor and charge-coupled device SLR cameras can be used to analyse red-shifted GFP surface fluorescence using visible blue light. This detection device can be constructed with inexpensive commercially available materials, thus increasing the accessibility of whole-organism GFP expression analysis to research laboratories and teaching institutions with small budgets. PMID:22479674
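A small sketch of the linearity check reported for purified GFP, regressing mean green-channel counts per pixel on protein surface density times exposure time; the numbers are invented to show the fit, not the paper's measurements.

```python
import numpy as np
from scipy import stats

density_exposure = np.array([5, 10, 20, 40, 80, 160])           # ng s mm^-2
mean_counts = np.array([11.0, 21.5, 44.2, 86.9, 175.0, 352.1])  # counts/pixel

fit = stats.linregress(density_exposure, mean_counts)
print(f"slope = {fit.slope:.2f} counts/pixel per (ng s mm^-2), "
      f"r^2 = {fit.rvalue**2:.4f}")
```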
Johnson, Marilyn; Newstead, Stuart; Charlton, Judith; Oxley, Jennifer
2011-01-01
This study determined the rate and associated factors of red light infringement among urban commuter cyclists. A cross-sectional observational study was conducted using a covert video camera to record cyclists at 10 sites across metropolitan Melbourne, Australia from October 2008 to April 2009. In total, 4225 cyclists faced a red light and 6.9% were non-compliant. The main predictive factor for infringement was direction of travel, cyclists turning left (traffic travels on the left-side in Australia) had 28.3 times the relative odds of infringement compared to cyclists who continued straight through the intersection. Presence of other road users had a deterrent effect with the odds of infringement lower when a vehicle travelling in the same direction was present (OR=0.39, 95% CI 0.28-0.53) or when other cyclists were present (OR=0.26, 95% CI 0.19-0.36). Findings suggest that some cyclists do not perceive turning left against a red signal to be unsafe and the opportunity to ride through the red light during low cross traffic times influences the likelihood of infringement. Copyright © 2010 Elsevier Ltd. All rights reserved.
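A hedged sketch of the kind of logistic regression used to relate infringement to rider and site factors; the simulated records and effect sizes are assumptions, not the Melbourne observational data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
obs = pd.DataFrame({
    "turning_left":    rng.integers(0, 2, n),
    "vehicle_present": rng.integers(0, 2, n),
    "other_cyclists":  rng.integers(0, 2, n),
})
logit_p = (-3.0 + 2.5 * obs["turning_left"]
           - 0.9 * obs["vehicle_present"] - 1.2 * obs["other_cyclists"])
obs["ran_red"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("ran_red ~ turning_left + vehicle_present + other_cyclists",
                  data=obs).fit(disp=False)
print(np.exp(model.params))   # odds ratios, cf. the 28.3 relative odds for left turns
```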
DOT National Transportation Integrated Search
2012-08-01
Improving traffic safety is a priority transportation issue. A tremendous amount of resources has been invested in improving safety and efficiency at signalized intersections. Although programs such as driver education, red-light camera deploym...
Programmable 10 MHz optical fiducial system for hydrodiagnostic cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huen, T.
1987-07-01
A solid state light control system was designed and fabricated for use with hydrodiagnostic streak cameras of the electro-optic type. With its use, the film containing the streak images will have on it two time scales simultaneously exposed with the signal. This allows timing and cross timing. The latter is achieved with exposure modulation marking onto the time tick marks. The purpose of using two time scales will be discussed. The design is based on a microcomputer, resulting in a compact and easy to use instrument. The light source is a small red light emitting diode. Time marking can be programmed in steps of 0.1 microseconds, with a range of 255 steps. The time accuracy is based on a precision 100 MHz quartz crystal, giving a divided down 10 MHz system frequency. The light is guided by two small 100 micron diameter optical fibers, which facilitates light coupling onto the input slit of an electro-optic streak camera. Three distinct groups of exposure modulation of the time tick marks can be independently set anywhere onto the streak duration. This system has been successfully used in Fabry-Perot laser velocimeters for over four years in our Laboratory. The microcomputer control section is also being used in providing optical fids to mechanical rotor cameras.
NASA Astrophysics Data System (ADS)
Upputuri, Paul Kumar; Pramanik, Manojit
2018-02-01
Phase shifting white light interferometry (PSWLI) has been widely used for optical metrology applications because of its precision, reliability, and versatility. White light interferometry using a monochrome CCD makes the measurement process slow for metrology applications. WLI integrated with a Red-Green-Blue (RGB) CCD camera is finding imaging applications in the fields of optical metrology and bio-imaging. Wavelength dependent refractive index profiles of biological samples were computed from colour white light interferograms. In recent years, whole-field refractive index profiles of red blood cells (RBCs), onion skin, fish cornea, etc. were measured from RGB interferograms. In this paper, we discuss the bio-imaging applications of colour CCD based white light interferometry. The approach makes the measurement faster, easier, more cost-effective, and even dynamic by using single fringe analysis methods for industrial applications.
Safety impacts of red light cameras at signalized intersections based on cellular automata models.
Chai, C; Wong, Y D; Lum, K M
2015-01-01
This study applies a simulation technique to evaluate the hypothesis that red light cameras (RLCs) exert important effects on accident risks. Conflict occurrences are generated by simulation and compared at intersections with and without RLCs to assess the impact of RLCs on several conflict types under various traffic conditions. Conflict occurrences are generated through simulating vehicular interactions based on an improved cellular automata (CA) model. The CA model is calibrated and validated against field observations at approaches with and without RLCs. Simulation experiments are conducted for RLC and non-RLC intersections with different geometric layouts and traffic demands to generate conflict occurrences that are analyzed to evaluate the hypothesis that RLCs exert important effects on road safety. The comparison of simulated conflict occurrences show favorable safety impacts of RLCs on crossing conflicts and unfavorable impacts for rear-end conflicts during red/amber phases. Corroborative results are found from broad analysis of accident occurrence. RLCs are found to have a mixed effect on accident risk at signalized intersections: crossing collisions are reduced, whereas rear-end collisions may increase. The specially developed CA model is found to be a feasible safety assessment tool.
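For context, a minimal Nagel-Schreckenberg single-lane update, shown only as the generic cellular-automaton traffic-flow basis; the paper's improved CA model adds signal phases, turning movements, and conflict detection that are not reproduced here.

```python
import numpy as np

def nasch_step(pos, vel, length, v_max, p_slow, rng):
    """One parallel update of the Nagel-Schreckenberg rules on a ring road."""
    gaps = (np.roll(pos, -1) - pos - 1) % length   # empty cells to the next car
    vel = np.minimum(vel + 1, v_max)               # accelerate
    vel = np.minimum(vel, gaps)                    # brake to avoid collision
    slow = (rng.random(len(vel)) < p_slow) & (vel > 0)
    vel = np.where(slow, vel - 1, vel)             # random slowdown
    return (pos + vel) % length, vel

rng = np.random.default_rng(0)
pos = np.arange(0, 100, 10)        # 10 cars on a 100-cell ring, in ring order
vel = np.zeros_like(pos)
for _ in range(200):
    pos, vel = nasch_step(pos, vel, length=100, v_max=5, p_slow=0.3, rng=rng)
```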
An infra-red imaging system for the analysis of tropisms in Arabidopsis thaliana seedlings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orbovic, V.; Poff, K.L.
1990-05-01
Since blue and green light will induce phototropism and red light is absorbed by phytochrome, no wavelength of visible radiation should be considered safe for any study of tropisms in etiolated seedlings. For this reason, we have developed an infra-red imaging system with a video camera with which we can monitor seedlings using radiation at wavelengths longer than 800 nm. The image of the seedlings can be observed in real time, recorded on a VCR and subsequently analyzed using the Java image analysis system. The time courses for curvature of seedlings differ in shape, amplitude, and lag time. This variability accounts for much of the noise in the measurement of curvature for a population of seedlings.
History of Hubble Space Telescope (HST)
1969-01-01
This image of the Egg Nebula, also known as CRL-2688 and located roughly 3,000 light-years from us, was taken in red light with the Wide Field Planetary Camera 2 (WF/PC2) aboard the Hubble Space Telescope (HST). The image shows a pair of mysterious searchlight beams emerging from a hidden star, crisscrossed by numerous bright arcs. This image sheds new light on the poorly understood ejection of stellar matter that accompanies the slow death of Sun-like stars. The image is shown in false color.
Photobleaching of red fluorescence in oral biofilms.
Hope, C K; de Josselin de Jong, E; Field, M R T; Valappil, S P; Higham, S M
2011-04-01
Many species of oral bacteria can be induced to fluoresce due to the presence of endogenous porphyrins, a phenomenon that can be utilized to visualize and quantify dental plaque in the laboratory or clinical setting. However, an inevitable consequence of fluorescence is photobleaching, and the effects of this on longitudinal, quantitative analysis of dental plaque have yet to be ascertained. Filter membrane biofilms were grown from salivary inocula or single species (Prevotella nigrescens and Prevotella intermedia). The mature biofilms were then examined in a custom-made lighting rig comprising 405 nm light-emitting diodes capable of delivering 220 W/m² at the sample, an appropriate filter and a digital camera; a set-up analogous to quantitative light-induced fluorescence digital. Longitudinal sets of images were captured and processed to assess the degradation in red fluorescence over time. Photobleaching was observed in all instances. The highest rates of photobleaching were observed immediately after initiation of illumination, specifically during the first minute. Relative rates of photobleaching during the first minute of exposure were 19.17, 13.72 and 3.43 arbitrary units/min for P. nigrescens biofilms, microcosm biofilm and P. intermedia biofilms, respectively. Photobleaching could be problematic when making quantitative measurements of porphyrin fluorescence in situ. Reducing both light levels and exposure time, in combination with increased camera sensitivity, should be the default approach when undertaking analyses by quantitative light-induced fluorescence digital. © 2010 John Wiley & Sons A/S.
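A small sketch of one way to express a photobleaching rate: fitting a single-exponential decay to red-fluorescence readings over illumination time; the time points, intensities, and model form are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 1, 2, 5, 10, 20, 30], dtype=float)             # min of 405 nm light
signal = np.array([100.0, 82.0, 70.0, 52.0, 40.0, 30.0, 26.0])  # arbitrary units

def decay(t, a, k, c):
    """Single-exponential bleaching toward a non-bleachable baseline c."""
    return a * np.exp(-k * t) + c

(a, k, c), _ = curve_fit(decay, t, signal, p0=(80.0, 0.3, 20.0))
print(f"initial bleaching rate ~ {a * k:.1f} arbitrary units/min")
```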
C-RED one: ultra-high speed wavefront sensing in the infrared made possible
NASA Astrophysics Data System (ADS)
Gach, J.-L.; Feautrier, Philippe; Stadler, Eric; Greffe, Timothee; Clop, Fabien; Lemarchand, Stéphane; Carmignani, Thomas; Boutolleau, David; Baker, Ian
2016-07-01
First Light Imaging's CRED-ONE infrared camera is capable of capturing up to 3500 full frames per second with subelectron readout noise. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array which is a real disruptive technology in imagery. We will show the performance of the camera, its main features and compare them to other high performance wavefront sensing cameras like OCAM2 in the visible and in the infrared. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.
Pupillary response to direct and consensual chromatic light stimuli.
Traustason, Sindri; Brondsted, Adam Elias; Sander, Birgit; Lund-Andersen, Henrik
2016-02-01
To assess whether the direct and consensual postillumination (ipRGC-driven) pupil light responses to chromatic light stimuli are equal in healthy subjects. Pupil responses in healthy volunteers were recorded using a prototype binocular chromatic pupillometer (IdeaMedical, Copenhagen), which is capable of both direct and consensual pupillometry measurements. The device uses a pair of dual monochromatic narrow bandwidth LED light sources, red (660 nm) and blue (470 nm). Pupil light responses were recorded with infrared video cameras and analysed using custom-made circuitry and software. Subjects were randomized to receive light stimuli at either the right or left eye after 5 min of dark adaptation. Pupil light responses were recorded in both eyes for 10 seconds before illumination, during illumination and 50 seconds after illumination with red and blue light. Three variables were defined for the recorded pupil responses: the maximal constriction amplitude (CAmax ), the pupil response during illumination and postillumination pupil response (PIPR). No difference was found in the pupil response to blue light. With red light, the pupil response during illumination was slightly larger during consensual illumination compared to direct illumination (0.54 and 0.52, respectively, p = 0.027, paired Wilcoxon's test, n = 12), while no differences were found for CAmax or the PIPR. No difference was found between direct and consensual pupil response to either red or blue light in the postillumination period. Direct and consensual responses can readily be compared when examining the postillumination pupil response to blue light as estimation of photosensitive retinal ganglion cell activation. © 2015 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
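A brief sketch of the three pupil metrics named in the abstract, computed from a single normalized pupil-diameter trace; the frame rate, window boundaries, and toy trace are assumptions.

```python
import numpy as np

fs = 30                                    # assumed camera frame rate, Hz
t = np.arange(0, 70, 1 / fs)               # 10 s baseline, 10 s light, 50 s post
pupil = np.ones_like(t)                    # toy normalized pupil-diameter trace
pupil[(t >= 10) & (t < 20)] = 0.55         # constriction during illumination
pupil[t >= 20] = 0.75                      # sustained post-illumination response

baseline = pupil[t < 10].mean()
during = pupil[(t >= 10) & (t < 20)]
post = pupil[(t >= 20) & (t < 50)]         # assumed post-illumination window

ca_max = 1 - during.min() / baseline       # maximal constriction amplitude
resp_during = during.mean() / baseline     # response during illumination
pipr = post.mean() / baseline              # post-illumination pupil response
print(ca_max, resp_during, pipr)
```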
The impact of red light cameras on safety in Arizona.
Shin, Kangwon; Washington, Simon
2007-11-01
Red light cameras (RLCs) have been used in a number of US cities to yield a demonstrable reduction in red light violations; however, evaluating their impact on safety (crashes) has been relatively more difficult. Accurately estimating the safety impacts of RLCs is challenging for several reasons. First, many safety related factors are uncontrolled and/or confounded during the periods of observation. Second, "spillover" effects caused by drivers reacting to non-RLC equipped intersections and approaches can make the selection of comparison sites difficult. Third, sites selected for RLC installation may not be selected randomly, and as a result may suffer from the regression to the mean bias. Finally, crash severity and resulting costs need to be considered in order to fully understand the safety impacts of RLCs. Recognizing these challenges, a study was conducted to estimate the safety impacts of RLCs on traffic crashes at signalized intersections in the cities of Phoenix and Scottsdale, Arizona. Twenty-four RLC equipped intersections in both cities are examined in detail and conclusions are drawn. Four different evaluation methodologies were employed to cope with the technical challenges described in this paper and to assess the sensitivity of results based on analytical assumptions. The evaluation results indicated that both Phoenix and Scottsdale are operating cost-effective installations of RLCs; however, the variability in RLC effectiveness within jurisdictions is larger in Phoenix. Consistent with findings in other regions, angle and left-turn crashes are reduced in general, while rear-end crashes tend to increase as a result of RLCs.
Lighting up a Dead Star's Layers
NASA Technical Reports Server (NTRS)
2006-01-01
This image from NASA's Spitzer Space Telescope shows the scattered remains of an exploded star named Cassiopeia A. Spitzer's infrared detectors 'picked' through these remains and found that much of the star's original layering had been preserved. In this false-color image, the faint, blue glow surrounding the dead star is material that was energized by a shock wave, called the forward shock, which was created when the star blew up. The forward shock is now located at the outer edge of the blue glow. Stars are also seen in blue. Green, yellow and red primarily represent material that was ejected in the explosion and heated by a slower shock wave, called the reverse shock wave. The picture was taken by Spitzer's infrared array camera and is a composite of 3.6-micron light (blue); 4.5-micron light (green); and 8.0-micron light (red).
NASA Astrophysics Data System (ADS)
Martínez-González, A.; Moreno-Hernández, D.; Monzón-Hernández, D.; León-Rodríguez, M.
2017-06-01
In the schlieren method, the deflection of light by the presence of an inhomogeneous medium is proportional to the gradient of its refractive index. Such deflection, in a schlieren system, is represented by light intensity variations on the observation plane. Then, for a digital camera, the intensity level registered by each pixel depends mainly on the variation of the medium refractive index and the status of the digital camera settings. Therefore, in this study, we regulate the intensity value of each pixel by controlling the camera settings such as exposure time, gamma and gain values in order to calibrate the image obtained to the actual temperature values of a particular medium. In our approach, we use a color digital camera. The images obtained with a color digital camera can be separated into three different color-channels. Each channel corresponds to red, green, or blue color; moreover, each one has its own sensitivity. The differences in sensitivity allow us to obtain a range of temperature values for each color channel. Thus, high, medium and low sensitivity correspond to the green, blue, and red color channels respectively. Therefore, by adding up the temperature contribution of each color channel we obtain a wide range of temperature values. Hence, the basic idea in our approach to measure temperature, using a schlieren system, is to relate the intensity level of each pixel in a schlieren image to the corresponding knife-edge position measured at the exit focal plane of the system. Our approach was applied to the measurement of instantaneous temperature fields of the air convection caused by a heated rectangular metal plate and a candle flame. We found that for the metal plate temperature measurements only the green and blue color-channels were required to sense the entire phenomenon. On the other hand, for the candle case, the three color-channels were needed to obtain a complete measurement of temperature. In our study, the candle temperature was taken as a reference and it was found that the maximum temperature values obtained for the green, blue and red color-channels were ∼275.6, ∼412.9, and ∼501.3 °C, respectively.
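A hedged sketch of the calibration idea: each colour channel gets its own monotonic pixel-intensity-to-temperature curve, and a temperature field is read off by interpolation; the calibration points are invented, not the knife-edge calibration data from the study.

```python
import numpy as np

# Assumed per-channel calibration tables: pixel value (0-255) -> temperature (C)
calib = {
    "green": (np.array([40, 80, 120, 160, 200]), np.array([30, 90, 160, 230, 276])),
    "blue":  (np.array([40, 80, 120, 160, 200]), np.array([30, 130, 240, 340, 413])),
    "red":   (np.array([40, 80, 120, 160, 200]), np.array([30, 160, 290, 410, 501])),
}

def temperature_field(channel_image, channel):
    """Map a single-channel schlieren image to temperature by interpolation."""
    px, temp = calib[channel]
    flat = np.interp(channel_image.ravel().astype(float), px, temp)
    return flat.reshape(channel_image.shape)

# A full measurement combines the three channels, each covering the temperature
# range its sensitivity allows.
toy_green = np.random.default_rng(0).integers(40, 200, (120, 160))
print(temperature_field(toy_green, "green").max())
```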
Cinematic camera emulation using two-dimensional color transforms
NASA Astrophysics Data System (ADS)
McElvain, Jon S.; Gish, Walter
2015-02-01
For cinematic and episodic productions, on-set look management is an important component of the creative process, and involves iterative adjustments of the set, actors, lighting and camera configuration. Instead of using the professional motion capture device to establish a particular look, the use of a smaller form factor DSLR is considered for this purpose due to its increased agility. Because the spectral response characteristics will be different between the two camera systems, a camera emulation transform is needed to approximate the behavior of the destination camera. Recently, two-dimensional transforms have been shown to provide high-accuracy conversion of raw camera signals to a defined colorimetric state. In this study, the same formalism is used for camera emulation, whereby a Canon 5D Mark III DSLR is used to approximate the behavior of a Red Epic cinematic camera. The spectral response characteristics for both cameras were measured and used to build 2D as well as 3x3 matrix emulation transforms. When tested on multispectral image databases, the 2D emulation transforms outperform their matrix counterparts, particularly for images containing highly chromatic content.
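For comparison, the 3x3 matrix baseline mentioned above can be fit by ordinary least squares from paired raw signals of the two cameras; a minimal sketch with placeholder data (the paper's 2D transforms are more elaborate):

```python
import numpy as np

# Paired training samples: rows are raw RGB responses of the source (DSLR) and
# destination (cinema) cameras to the same training spectra. The arrays below
# are random placeholders standing in for measured data.
rng = np.random.default_rng(0)
src = rng.random((500, 3))                      # source-camera raw RGB
true_M = np.array([[1.10, -0.05, 0.00],
                   [0.02,  0.95, 0.03],
                   [0.00,  0.10, 0.90]])
dst = src @ true_M.T                            # stand-in destination responses

# Solve src @ M ~= dst in the least-squares sense (M has shape 3x3).
M, *_ = np.linalg.lstsq(src, dst, rcond=None)
emulated = src @ M                              # apply the fitted emulation matrix
rms = np.sqrt(np.mean((emulated - dst) ** 2))
print(f"RMS emulation error: {rms:.2e}")
```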
Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test.
Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno
2008-11-17
The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
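A minimal sketch of the two corrections described, flat-field vignetting removal and a normalized vegetation index, assuming a reference flat-field image is available; the exact processing chain and band assignments in the study may differ.

```python
import numpy as np

def correct_vignetting(image, flat_field):
    """Divide by a normalized flat-field image (e.g. of a uniform target) to
    remove the radial fall-off introduced by the lens."""
    flat = flat_field / flat_field.max()
    return image / np.clip(flat, 1e-6, None)

def ndvi(nir_band, red_band):
    """Normalized Difference Vegetation Index; the ratio form makes it largely
    insensitive to overall scene illumination."""
    nir = nir_band.astype(float)
    red = red_band.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Usage with hypothetical arrays from the modified (Red-Edge/NIR) camera:
# corrected_nir = correct_vignetting(raw_nir, flat_nir)
# corrected_red = correct_vignetting(raw_red, flat_red)
# index_map = ndvi(corrected_nir, corrected_red)
```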
Waiting endurance time estimation of electric two-wheelers at signalized intersections.
Huan, Mei; Yang, Xiao-bao
2014-01-01
The paper proposed a model for estimating the waiting endurance times of electric two-wheelers at signalized intersections using a survival analysis method. Waiting duration times were collected by video cameras and were assigned as censored or uncensored data to distinguish between normal crossing and red-light running behavior. A Cox proportional hazards model was introduced, and variables revealing personal characteristics and traffic conditions were defined as covariates to describe the effects of internal and external factors. Empirical results show that riders do not want to wait too long to cross intersections. As signal waiting time increases, electric two-wheelers get impatient and violate the traffic signal. Some 12.8% of electric two-wheelers crossed with negligible wait times, while 25.0% were generally non-risk-takers who obeyed the traffic rules even after waiting for 100 seconds. Half of the electric two-wheelers could not endure 49.0 seconds or longer at the red-light phase. Red-phase time, motor vehicle volume, and conformity behavior have important effects on riders' waiting times. Waiting endurance times decrease with longer red-phase times, lower traffic volumes, or a larger number of other riders running against the red light. The proposed model may be applicable to the design, management and control of signalized intersections in other developing cities.
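A minimal sketch of fitting a Cox proportional hazards model to censored waiting-time data using the lifelines library; the column names, covariates and values are placeholders, not the study's data set.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Each row is one rider: 'duration' is the observed waiting time (s); 'event' is 1
# if the rider ran the red light (uncensored) and 0 if the rider waited out the
# whole red phase (censored). Covariate values are illustrative placeholders.
df = pd.DataFrame({
    "duration":    [12, 49, 120, 120, 30, 90, 55, 20],
    "event":       [ 1,  1,   0,   0,  1,  0,  1,  1],
    "red_phase":   [90, 90, 120, 120, 60, 90, 120, 60],   # red phase length (s)
    "veh_volume":  [300, 150, 500, 200, 100, 350, 450, 90],
    "n_violators": [ 2,  3,   0,   1,  4,  1,  1,  5],    # other riders violating
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()   # hazard ratios for each covariate
```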
A reflectance model for non-contact mapping of venous oxygen saturation using a CCD camera
NASA Astrophysics Data System (ADS)
Li, Jun; Dunmire, Barbrina; Beach, Kirk W.; Leotta, Daniel F.
2013-11-01
A method of non-contact mapping of venous oxygen saturation (SvO2) is presented. A CCD camera is used to image skin tissue illuminated alternately by a red (660 nm) and an infrared (800 nm) LED light source. Low cuff pressures of 30-40 mmHg are applied to induce a venous blood volume change with negligible change in the arterial blood volume. A hybrid model combining the Beer-Lambert law and the light diffusion model is developed and used to convert the change in the light intensity to the change in the skin tissue absorption coefficient. A simulation study incorporating the full light diffusion model is used to verify the hybrid model and to correct a calculation bias. SvO2 values in the fingers, palm, and forearm of five volunteers are presented and compared with results in the published literature. Two-dimensional maps of venous oxygen saturation are given for the three anatomical regions.
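A minimal sketch of the Beer-Lambert part of such a pipeline: converting the cuff-induced intensity change at the two wavelengths into absorbance changes and solving a 2x2 system for SvO2. The extinction-coefficient matrix is left as a placeholder, and the diffusion-model correction described in the paper is omitted.

```python
import numpy as np

def delta_absorbance(i_baseline, i_cuffed):
    """Modified Beer-Lambert: absorbance change caused by the extra venous
    blood volume induced by the low cuff pressure."""
    return -np.log(i_cuffed / i_baseline)

def venous_so2(dA_660, dA_800, eps):
    """Solve eps @ [dC_HbO2, dC_Hb] = [dA_660, dA_800] per pixel and return
    SvO2 = dC_HbO2 / (dC_HbO2 + dC_Hb).

    eps is the 2x2 extinction-coefficient matrix
    [[eps_HbO2(660), eps_Hb(660)], [eps_HbO2(800), eps_Hb(800)]];
    tabulated literature values should be used (placeholder here)."""
    dA = np.stack([dA_660.ravel(), dA_800.ravel()])      # shape (2, N)
    conc = np.linalg.solve(eps, dA)                      # rows: HbO2, Hb changes
    so2 = conc[0] / np.clip(conc.sum(axis=0), 1e-9, None)
    return so2.reshape(dA_660.shape)
```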
NASA Technical Reports Server (NTRS)
2000-01-01
MISR images of tropical northern Australia acquired on June 1, 2000 (Terra orbit 2413) during the long dry season. Left: color composite of vertical (nadir) camera blue, green, and red band data. Right: multi-angle composite of red band data only from the cameras viewing 60 degrees aft, 60 degrees forward, and nadir. Color and contrast have been enhanced to accentuate subtle details. In the left image, color variations indicate how different parts of the scene reflect light differently at blue, green, and red wavelengths; in the right image color variations show how these same scene elements reflect light differently at different angles of view. Water appears in blue shades in the right image, for example, because glitter makes the water look brighter at the aft camera's view angle. The prominent inland water body is Lake Argyle, the largest human-made lake in Australia, which supplies water for the Ord River Irrigation Area and the town of Kununurra (pop. 6500) just to the north. At the top is the southern edge of Joseph Bonaparte Gulf; the major inlet at the left is Cambridge Gulf, the location of the town of Wyndham (pop. 850), the port for this region. This area is sparsely populated, and is known for its remote, spectacular mountains and gorges. Visible along much of the coastline are intertidal mudflats of mangroves and low shrubs; to the south the terrain is covered by open woodland merging into open grassland in the lower half of the pictures.
MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Analysis of crystalline lens coloration using a black and white charge-coupled device camera.
Sakamoto, Y; Sasaki, K; Kojima, M
1994-01-01
To analyze lens coloration in vivo, we used a new type of Scheimpflug camera based on a black-and-white charge-coupled device (CCD). A new methodology was proposed. Scheimpflug images of the lens were taken three times through red (R), green (G), and blue (B) filters, respectively. The three images corresponding to the R, G, and B channels were combined into one image on the cathode-ray tube (CRT) display. The spectral transmittance of the tricolor filters and the spectral sensitivity of the CCD camera were used to correct the scattered-light intensity of each image. Coloration of the lens was expressed on a CIE standard chromaticity diagram. The lens coloration of seven eyes analyzed by this method showed values almost the same as those obtained by the previous method using color film.
Galaxies Gather at Great Distances
NASA Technical Reports Server (NTRS)
2006-01-01
[Figures removed for brevity, see original site: Distant Galaxy Cluster Infrared Survey poster; bird's eye view mosaics with clusters; close-ups of clusters at 9.1, 8.7, and 8.6 billion light-years.] Astronomers have discovered nearly 300 galaxy clusters and groups, including almost 100 located 8 to 10 billion light-years away, using the space-based Spitzer Space Telescope and the ground-based Mayall 4-meter telescope at Kitt Peak National Observatory in Tucson, Ariz. The new sample represents a six-fold increase in the number of known galaxy clusters and groups at such extreme distances, and will allow astronomers to systematically study massive galaxies two-thirds of the way back to the Big Bang. A mosaic portraying a bird's eye view of the field in which the distant clusters were found is shown at upper left. It spans a region of sky 40 times larger than that covered by the full moon as seen from Earth. Thousands of individual images from Spitzer's infrared array camera instrument were stitched together to create this mosaic. The distant clusters are marked with orange dots. Close-up images of three of the distant galaxy clusters are shown in the adjoining panels. The clusters appear as a concentration of red dots near the center of each image. These images reveal the galaxies as they were over 8 billion years ago, since that's how long their light took to reach Earth and Spitzer's infrared eyes. These pictures are false-color composites, combining ground-based optical images captured by the Mosaic-I camera on the Mayall 4-meter telescope at Kitt Peak, with infrared pictures taken by Spitzer's infrared array camera. Blue and green represent visible light at wavelengths of 0.4 microns and 0.8 microns, respectively, while red indicates infrared light at 4.5 microns. Kitt Peak National Observatory is part of the National Optical Astronomy Observatory in Tucson, Ariz.
Red Aurora as Seen From the International Space Station (ISS)
NASA Technical Reports Server (NTRS)
2001-01-01
Auroras are caused when high-energy electrons pour down from the Earth's magnetosphere and collide with atoms. Red aurora, as captured here by a still digital camera aboard the International Space Station (ISS), occurs from 200 km to as high as 500 km altitude and is caused by the emission of 6300 Angstrom wavelength light from oxygen atoms. The light is emitted when the atoms return to their original unexcited state. The white spot in the image is from a light inside the ISS reflected off the inside of the window. The pale blue arch on the left side of the frame is sunlight reflecting off the atmospheric limb of the Earth. At peaks in solar activity there are more geomagnetic storms, which increases the auroral activity viewed on Earth and by astronauts from orbit.
BLUE STRAGGLERS IN GLOBULAR CLUSTER 47 TUCANAE
NASA Technical Reports Server (NTRS)
2002-01-01
The core of globular cluster 47 Tucanae is home to many blue stragglers, rejuvenated stars that glow with the blue light of young stars. A ground-based telescope image (on the left) shows the entire crowded core of 47 Tucanae, located 15,000 light-years away in the constellation Tucana. Peering into the heart of the globular cluster's bright core, the Hubble Space Telescope's Wide Field and Planetary Camera 2 separated the dense clump of stars into many individual stars (image on right). Some of these stars shine with the light of old stars; others with the blue light of blue stragglers. The yellow circles in the Hubble telescope image highlight several of the cluster's blue stragglers. Analysis for this observation centered on one massive blue straggler. Astronomers theorize that blue stragglers are formed either by the slow merger of stars in a double-star system or by the collision of two unrelated stars. For the blue straggler in 47 Tucanae, astronomers favor the slow merger scenario. This image is a 3-color composite of archival Hubble Wide Field and Planetary Camera 2 images in the ultraviolet (blue), blue (green), and violet (red) filters. Color tables were assigned and scaled so that the red giant stars appear orange, main-sequence stars are white/green, and blue stragglers are appropriately blue. The ultraviolet images were taken on Oct. 25, 1995, and the blue and violet images were taken on Sept. 1, 1995. Credit: Rex Saffer (Villanova University) and Dave Zurek (STScI), and NASA
2017-08-11
These two views of Saturn's moon Titan exemplify how NASA's Cassini spacecraft has revealed the surface of this fascinating world. Cassini carried several instruments to pierce the veil of hydrocarbon haze that enshrouds Titan. The mission's imaging cameras also have several spectral filters sensitive to specific wavelengths of infrared light that are able to make it through the haze to the surface and back into space. These "spectral windows" have enabled the imaging cameras to map nearly the entire surface of Titan. In addition to Titan's surface, images from both the imaging cameras and VIMS have provided windows into the moon's ever-changing atmosphere, chronicling the appearance and movement of hazes and clouds over the years. A large, bright and feathery band of summer clouds can be seen arcing across high northern latitudes in the view at right. These views were obtained with the Cassini spacecraft narrow-angle camera on March 21, 2017. Images taken using red, green and blue spectral filters were combined to create the natural-color view at left. The false-color view at right was made by substituting an infrared image (centered at 938 nanometers) for the red color channel. The views were acquired at a distance of approximately 613,000 miles (986,000 kilometers) from Titan. Image scale is about 4 miles (6 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21624
2015-12-09
This representation of Ceres' Occator Crater in false colors shows differences in the surface composition. Red corresponds to a wavelength range around 0.97 micrometers (near infrared), green to a wavelength range around 0.75 micrometers (red, visible light) and blue to a wavelength range of around 0.44 micrometers (blue, visible light). Occator measures about 60 miles (90 kilometers) wide. Scientists use false color to examine differences in surface materials. The color blue on Ceres is generally associated with bright material, found in more than 130 locations, and seems to be consistent with salts, such as sulfates. It is likely that silicate materials are also present. The images were obtained by the framing camera on NASA's Dawn spacecraft from a distance of about 2,700 miles (4,400 kilometers). http://photojournal.jpl.nasa.gov/catalog/PIA20180
The optical design of the G-CLEF Spectrograph: the first light instrument for the GMT
NASA Astrophysics Data System (ADS)
Ben-Ami, Sagi; Epps, Harland; Evans, Ian; Mueller, Mark; Podgorski, William; Szentgyorgyi, Andrew
2016-08-01
The GMT-Consortium Large Earth Finder (G-CLEF), the first light instrument for the GMT, is a fiber-fed, high-resolution echelle spectrograph. In the following paper, we present the optical design of G-CLEF. We emphasize the unique solutions derived for the spectrograph fiber-feed: the Mangin mirror that corrects the cylindrical field curvature, the implementation of VPH grisms as cross dispersers, and our novel solution for a multi-colored exposure meter. We describe the spectrograph blue and red cameras, comprising 7 and 8 elements respectively, with one aspheric surface in each camera, and present the expected echellogram imaged on the instrument focal planes. Finally, we present a ghost analysis and mitigation strategy that takes into account both single-reflection and double-reflection back scattering from various elements in the optical train.
NASA Astrophysics Data System (ADS)
Nishidate, Izumi; Mustari, Afrina; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu; Kokubo, Yasuaki
2017-02-01
We propose a rapid imaging method to monitor the spatial distribution of the total hemoglobin concentration (CHbT), the tissue oxygen saturation, and the scattering power b in the expression μs' = aλ^(-b) as the scattering parameter in the cerebral cortex using a digital red-green-blue camera. In the method, the RGB values are converted into tristimulus values in the CIEXYZ color space, which is compatible with the common RGB working spaces. A Monte Carlo simulation (MCS) of light transport in tissue is used to specify a relation among the tristimulus XYZ values, the concentration of oxygenated hemoglobin, that of deoxygenated hemoglobin, and the scattering power b. In the present study, we performed sequential recordings of RGB images of in vivo exposed rat brain during cortical spreading depolarization (CSD) evoked by the topical application of KCl. Changes in the total hemoglobin concentration and the tissue oxygen saturation imply a temporary change in cerebral blood flow during CSD. A decrease in the scattering power b was observed before the profound increase in the total hemoglobin concentration, which is indicative of reversible morphological changes in brain tissue during CSD. The results of this study indicate the potential of the method to evaluate pathophysiological conditions in brain tissue with a digital red-green-blue camera.
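The RGB-to-XYZ step can be illustrated with the standard linear-sRGB (D65) matrix; a minimal sketch that assumes the camera output has already been linearized and characterized as sRGB, whereas the paper derives its own camera-specific relation via Monte Carlo simulation.

```python
import numpy as np

# Standard linear sRGB (D65) to CIE XYZ conversion matrix.
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def rgb_to_xyz(rgb_image):
    """Convert an (H, W, 3) linear RGB image (values in [0, 1], no gamma) into
    tristimulus XYZ values. A real camera needs its own characterization, as
    done in the paper via simulation; this sRGB assumption is illustrative."""
    return rgb_image @ SRGB_TO_XYZ.T
```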
Young Stars Emerge from Orion Head
2007-05-17
This image from NASA's Spitzer Space Telescope shows infant stars "hatching" in the head of the hunter constellation, Orion. Astronomers suspect that shockwaves from a supernova explosion in Orion's head, nearly three million years ago, may have initiated this newfound birth. The region featured in this Spitzer image is called Barnard 30. It is located approximately 1,300 light-years away and sits on the right side of Orion's "head," just north of the massive star Lambda Orionis. Wisps of red in the cloud are organic molecules called polycyclic aromatic hydrocarbons. These molecules are formed anytime carbon-based materials are burned incompletely. On Earth, they can be found in the sooty exhaust from automobile and airplane engines. They also coat the grills where charcoal-broiled meats are cooked. This image shows infrared light captured by Spitzer's infrared array camera. Light with wavelengths of 8 and 5.8 microns (red and orange) comes mainly from dust that has been heated by starlight. Light of 4.5 microns (green) shows hot gas and dust; and light of 3.6 microns (blue) is from starlight. http://photojournal.jpl.nasa.gov/catalog/PIA09412
Robotic Arm Camera on Mars with Lights On
NASA Technical Reports Server (NTRS)
2008-01-01
This image is a composite view of NASA's Phoenix Mars Lander's Robotic Arm Camera (RAC) with its lights on, as seen by the lander's Surface Stereo Imager (SSI). This image combines images taken on the afternoon of Phoenix's 116th Martian day, or sol (September 22, 2008). The RAC is about 8 centimeters (3 inches) tall. The SSI took images of the RAC to test both the light-emitting diodes (LEDs) and cover function. Individual images were taken in three SSI filters that correspond to the red, green, and blue LEDs one at a time. When combined, it appears that all three sets of LEDs are on at the same time. This composite image is not true color. The streaks of color extending from the LEDs are an artifact from saturated exposure. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
A View of Lightning from the Space Shuttle Red Sprites and Blue Jets
NASA Technical Reports Server (NTRS)
Vaughan, Otha H., Jr.
1999-01-01
An examination and analysis of video images of lightning captured by the Low Light Level Monochrome TV cameras of the space shuttle has provided a variety of examples of new forms of lightning-like discharges that appear to move out of the top of very active thunderstorms. These images were obtained during a number of shuttle missions while conducting the Mesoscale Lightning Observational Experiment (MLE). The video images illustrate a variety of filamentary and broad discharges to the stratosphere that may be related to the intense electrical fields generated by the thunderstorm, which may somehow play a part in the Earth's global electrical circuit. A typical event is seen as a single filament or multiple filaments that can appear to occur at altitudes between 60 and 95 km above the storm top. In addition, another phenomenon, not explained at the present time, appears to move out of the top of the storm and then proceeds toward the stratosphere at speeds of about 100 km/sec. These events, much like a jet, reach an altitude of at least 33 km before they begin to spread out into a cone-like shape. More observations obtained from the ground and aircraft using low-light-level color TV cameras have confirmed that the sprites are red while the jets are blue, hence the names Red Sprites and Blue Jets. Still images and video data will be presented, illustrating these new atmospheric phenomena.
Red ball ranging optimization based on dual camera ranging method
NASA Astrophysics Data System (ADS)
Kuang, Lei; Sun, Weijia; Liu, Jiaming; Tang, Matthew Wai-Chung
2018-05-01
In this paper, the process of positioning and moving to a target red ball by a NAO robot through its camera system is analyzed and improved using the dual camera ranging method. The single camera ranging method, which is adopted by the NAO robot, was first studied and tested experimentally. Since the existing error of the current NAO robot is not a single variable, the experiments were divided into two parts to obtain more accurate single camera ranging data: forward ranging and backward ranging. Moreover, two USB cameras were used in our experiments, which adopted the Hough circle method to identify a ball, while the HSV color space model was used to identify the red color. Our results showed that the dual camera ranging method reduced the variance of the error in ball tracking from 0.68 to 0.20.
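A minimal sketch of the red-ball detection stage (HSV color segmentation plus Hough circle detection) using OpenCV; the threshold values are illustrative assumptions, and the dual-camera ranging step is only indicated in the closing comment, not necessarily the exact method used in the paper.

```python
import cv2
import numpy as np

def find_red_ball(bgr_frame):
    """Return (x, y, radius) of the most prominent red circle, or None.
    HSV thresholds are illustrative; red wraps around hue 0, so two ranges
    are combined before circle detection."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    lower = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    upper = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    mask = cv2.medianBlur(lower | upper, 5)
    circles = cv2.HoughCircles(mask, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                               param1=100, param2=20, minRadius=5, maxRadius=200)
    if circles is None:
        return None
    x, y, r = circles[0][0]
    return float(x), float(y), float(r)

# With two calibrated cameras, baseline b and focal length f (in pixels), depth
# could then be estimated from the disparity d of the ball centre:
#   Z = f * b / d
# (simple parallel-camera stereo model; an assumption for illustration only).
```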
Opportunity's Second Martian Birthday at Cape Verde
NASA Technical Reports Server (NTRS)
2007-01-01
A promontory nicknamed 'Cape Verde' can be seen jutting out from the walls of Victoria Crater in this approximate true-color picture taken by the panoramic camera on NASA's Mars Exploration Rover Opportunity. The rover took this picture on martian day, or sol, 1329 (Oct. 20, 2007), more than a month after it began descending down the crater walls -- and just 9 sols shy of its second Martian birthday on sol 1338 (Oct. 29, 2007). Opportunity landed on the Red Planet on Jan. 25, 2004. That's nearly four years ago on Earth, but only two on Mars because Mars takes longer to travel around the sun than Earth. One Martian year equals 687 Earth days. The overall soft quality of the image, and the 'haze' seen in the lower right portion, are the result of scattered light from dust on the front sapphire window of the rover's camera. This view was taken using three panoramic-camera filters, admitting light with wavelengths centered at 750 nanometers (near infrared), 530 nanometers (green) and 430 nanometers (violet).
Andreozzi, Jacqueline M; Zhang, Rongxiao; Glaser, Adam K; Jarvis, Lesley A; Pogue, Brian W; Gladstone, David J
2015-02-01
To identify achievable camera performance and hardware needs in a clinical Cherenkov imaging system for real-time, in vivo monitoring of the surface beam profile on patients, as novel visual information, documentation, and possible treatment verification for clinicians. Complementary metal-oxide-semiconductor (CMOS), charge-coupled device (CCD), intensified charge-coupled device (ICCD), and electron multiplying-intensified charge coupled device (EM-ICCD) cameras were investigated to determine Cherenkov imaging performance in a clinical radiotherapy setting, with one emphasis on the maximum supportable frame rate. Where possible, the image intensifier was synchronized using a pulse signal from the Linac in order to image with room lighting conditions comparable to patient treatment scenarios. A solid water phantom irradiated with a 6 MV photon beam was imaged by the cameras to evaluate the maximum frame rate for adequate Cherenkov detection. Adequate detection was defined as an average electron count in the background-subtracted Cherenkov image region of interest in excess of 0.5% (327 counts) of the 16-bit maximum electron count value. Additionally, an ICCD and an EM-ICCD were each used clinically to image two patients undergoing whole-breast radiotherapy to compare clinical advantages and limitations of each system. Intensifier-coupled cameras were required for imaging Cherenkov emission on the phantom surface with ambient room lighting; standalone CMOS and CCD cameras were not viable. The EM-ICCD was able to collect images from a single Linac pulse delivering less than 0.05 cGy of dose at 30 frames/s (fps) and pixel resolution of 512 × 512, compared to an ICCD which was limited to 4.7 fps at 1024 × 1024 resolution. An intensifier with higher quantum efficiency at the entrance photocathode in the red wavelengths [30% quantum efficiency (QE) vs previous 19%] promises at least 8.6 fps at a resolution of 1024 × 1024 and lower monetary cost than the EM-ICCD. The ICCD with an intensifier better optimized for red wavelengths was found to provide the best potential for real-time display (at least 8.6 fps) of radiation dose on the skin during treatment at a resolution of 1024 × 1024.
A TV Camera System Which Extracts Feature Points For Non-Contact Eye Movement Detection
NASA Astrophysics Data System (ADS)
Tomono, Akira; Iida, Muneo; Kobayashi, Yukio
1990-04-01
This paper proposes a highly efficient camera system which extracts, irrespective of background, feature points such as the pupil, corneal reflection image and dot-marks pasted on a human face in order to detect human eye movement by image processing. Two eye movement detection methods are suggested: one utilizing face orientation as well as pupil position, the other utilizing pupil and corneal reflection images. A method of extracting these feature points using LEDs as illumination devices and a new TV camera system designed to record eye movement are proposed. Two kinds of infra-red LEDs are used. These LEDs are set up a short distance apart and emit polarized light of different wavelengths. One light source beams from near the optical axis of the lens and the other is some distance from the optical axis. The LEDs are operated in synchronization with the camera. The camera includes 3 CCD image pick-up sensors and a prism system with 2 boundary layers. Incident rays are separated into 2 wavelengths by the first boundary layer of the prism. One set of rays forms an image on CCD-3. The other set is split by the half-mirror layer of the prism and forms an image including the regularly reflected component by placing a polarizing filter in front of CCD-1, or another image not including that component by not placing a polarizing filter in front of CCD-2. Thus, three images with different reflection characteristics are obtained by the three CCDs. Through the experiment, it is shown that two kinds of subtraction operations between the three images output from the CCDs accentuate three kinds of feature points: the pupil and corneal reflection images and the dot-marks. Since the S/N ratio of the subtracted image is extremely high, the thresholding process is simple and allows reducing the intensity of the infra-red illumination. A high-speed image processing apparatus using this camera system is described. Real-time processing of the subtraction, thresholding and center-of-gravity calculation of the feature points is possible.
Volga Delta and the Caspian Sea
NASA Technical Reports Server (NTRS)
2002-01-01
Russia's Volga River is the largest river system in Europe, draining over 1.3 million square kilometers of catchment area into the Caspian Sea. The brackish Caspian is Earth's largest landlocked water body, and its isolation from the world's oceans has enabled the preservation of several unique animal and plant species. The Volga provides most of the Caspian's fresh water and nutrients, and also discharges large amounts of sediment and industrial waste into the relatively shallow northern part of the sea. These images of the region were captured by the Multi-angle Imaging SpectroRadiometer on October 5, 2001, during Terra orbit 9567. Each image represents an area of approximately 275 kilometers x 376 kilometers. The left-hand image is from MISR's nadir (vertical-viewing) camera, and shows how light is reflected at red, green, and blue wavelengths. The right-hand image is a false color composite of red-band imagery from MISR's 60-degree backward, nadir, and 60-degree forward-viewing cameras, displayed as red, green, and blue, respectively. Here, color variations indicate how light is reflected at different angles of view. Water appears blue in the right-hand image, for example, because sun glitter makes smooth, wet surfaces look brighter at the forward camera's view angle. The rougher-textured vegetated wetlands near the coast exhibit preferential backscattering, and consequently appear reddish. A small cloud near the center of the delta separates into red, green, and blue components due to geometric parallax associated with its elevation above the surface. Other notable features within the images include several linear features located near the Volga Delta shoreline. These long, thin lines are artificially maintained shipping channels, dredged to depths of at least 2 meters. The crescent-shaped Kulaly Island, also known as Seal Island, is visible near the right-hand edge of the images. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Growth Chambers on the International Space Station for Large Plants
NASA Technical Reports Server (NTRS)
Massa, G. D.; Wheeler, R. M.; Morrow, R. C.; Levine, H. G.
2016-01-01
The International Space Station (ISS) now has platforms for conducting research on horticultural plant species under LED lighting, and those capabilities continue to expand. The 'Veggie' vegetable production system was deployed to the ISS as an applied research platform for food production in space. Veggie is capable of growing a wide array of horticultural crops. It was designed for low power usage, low launch mass and stowage volume, and minimal crew time requirements. The Veggie flight hardware consists of a light cap containing red (630 nm), blue (455 nm) and green (530 nm) LEDs. Interfacing with the light cap is an extendable bellows/baseplate for enclosing the plant canopy. A second large plant growth chamber, the Advanced Plant Habitat (APH), will fly to the ISS in 2017. APH will be a fully controllable environment for high-quality plant physiological research. APH will control light (quality, level, and timing), temperature, CO2, relative humidity, and irrigation, while scrubbing any cabin or plant-derived ethylene and other volatile organic compounds. Additional capabilities include sensing of leaf temperature and root zone moisture, root zone temperature, and oxygen concentration. The light cap will have red (630 nm), blue (450 nm), green (525 nm), far red (730 nm) and broad spectrum white LEDs (4100K). There will be several internal cameras (visible and IR) to monitor and record plant growth and operations. Veggie and APH are available for research proposals.
NASA Astrophysics Data System (ADS)
Hosono, Satsuki; Sato, Shun; Ishida, Akane; Suzuki, Yo; Inohara, Daichi; Nogo, Kosuke; Abeygunawardhana, Pradeep K.; Suzuki, Satoru; Nishiyama, Akira; Wada, Kenji; Ishimaru, Ichiro
2015-07-01
For blood glucose measurement in dialysis machines, we proposed AAA-battery-size ATR (attenuated total reflection) Fourier spectroscopy in the middle-infrared region. The proposed one-shot Fourier spectroscopic imaging is a near-common-path, spatial phase-shift interferometer with high time resolution. Because a large amount of spectral data (the camera frame rate, e.g. 60 Hz, multiplied by the number of pixels) can be obtained in 1 second, statistical averaging yields highly accurate spectral measurements. We evaluated the quantitative accuracy of our proposed method for measuring glucose concentration in the near-infrared region with liquid cells. We confirmed that absorbance at 1600 nm had a high correlation with glucose concentration (correlation coefficient: 0.92). To measure whole blood, however, complex light phenomena caused by red blood cells, such as scattering and multiple reflection, deteriorate the spectral data. We therefore also proposed ultrasound-assisted spectroscopic imaging that traps particles at standing-wave nodes: if the ATR prism is oscillated mechanically, an anti-node area is generated around the evanescent light field on the prism surface. By eliminating the complex light phenomena of red blood cells, glucose concentration in whole blood can be quantified with high accuracy. In this report, we successfully trapped red blood cells in normal saline solution with an ultrasonic standing wave (frequency: 2 MHz).
Robotic Arm Camera on Mars, with Lights Off
NASA Technical Reports Server (NTRS)
2008-01-01
This approximate color image is a view of NASA's Phoenix Mars Lander's Robotic Arm Camera (RAC) as seen by the lander's Surface Stereo Imager (SSI). This image was taken on the afternoon of the 116th Martian day, or sol, of the mission (September 22, 2008). The RAC is about 8 centimeters (3 inches) tall. The SSI took images of the RAC to test both the light-emitting diodes (LEDs) and cover function. Individual images were taken in three SSI filters that correspond to the red, green, and blue LEDs one at a time. This yields proper coloring when imaging Phoenix's surrounding Martian environment. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
Pai, Chih-Wei; Jou, Rong-Chang
2014-01-01
The literature has suggested that bicyclists' red-light violations (RLVs) tend not to cause accidents, although RLV is a frequent and typical bicyclist behaviour. A high association between bicyclist RLVs and accidents was, however, revealed in Taiwan. The current research explores bicyclists' RLVs by classifying crossing behaviours into three distinct manners: risk-taking, opportunistic, and law-obeying. Other variables, as well as bicyclists' crossing behaviours, were captured through the use of video cameras installed at selected intersections in Taoyuan County, Taiwan. Considering the unobserved heterogeneity, this research develops a mixed logit model of bicyclists' three distinct crossing behaviours. Several variables (pupils in uniform, speed limit of 60 km/h) appear to have heterogeneous effects, lending support to the use of mixed logit models in bicyclist RLV research. Several factors were found to significantly increase the likelihood of bicyclists' risky behaviours, most notably: intersections with short red-light durations, T/Y intersections, riders who were pupils in uniform, riders on electric bicycles, and unhelmeted riders. Implications of the research findings and concluding remarks are finally provided. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Nikitin, Vladimir; Berkovich, Yuliy A.; Skripnikov, Alexander; Zyablova, Natalya; Mukhoyan, Makar; Emelianov, Grigory
The experiment was conducted on the Russian biological satellite Bion-M No. 1 from 19.04 to 19.05.2013. Five transparent plastic culture flasks were placed in five light-isolated sections of the Biocont-B2 cylindrical container, with an inner diameter of 120 mm and a height of 230 mm. In four sections the flasks could be illuminated by top or side LEDs with wavelengths of 458 nm, 630 nm, 730 nm, and white (color temperature 5000 K, peaks 453, 559 nm). The photon flux in each variant was 15 µmol/(m² s). In the fifth section the flask with the shoots was kept in constant darkness. Each section was equipped with its own video camera module. The cameras, video recorder and lighting were managed by a microcontroller. Twelve days before launch, 5 tips of the moss shoots were explanted into each of the five flasks on agar medium with nutrient components and were cultivated under white fluorescent lamps at a 12-hour photoperiod until the launch. After entering orbit and during the next 14 days of flight the top LEDs were turned on above the flasks. Then for the following 14 days of flight the side LEDs of similar wavelength were turned on. The moss gametophores were cultivated at a 12-h photoperiod. During the experiment, video recordings of the moss were made hourly. Similar equipment was used for the ground control. After the experiment the video files were used to produce separate time-lapse films for each flask using the AviSynth program. In flight the shoots demonstrated the maximum growth speed under far-red lighting and a slower speed under white lighting. Under blue and red lighting, the growth of the shoots almost stopped after switching to the side light stimuli. In the dark the shoots continued to grow until the 13th day after the launch of the satellite, then their growth stopped. In the ground control the relation of growth rate under the various LEDs remained basically the same, with the exception of side blue lighting, where the shoots demonstrated considerable vertical growth. In flight the angle of inclination towards the light source was maximal (about 90º) under white lighting, and somewhat smaller at 730 nm. Under red and blue light the angle of phototropic inclination was difficult to measure due to the poor growth of the shoots. In the ground control the growth rate under blue light was several times higher than in flight, and the final degree of inclination of the shoot tip came to about 10º. In the ground control under side red lighting the growth was weak, while demonstrating a pronounced phototropic bend of 90º. In the ground control in the dark, vertical growth of one shoot was observed at a rate somewhat larger than in the flight variant. Data on the dynamics of inclination of the experimental and control plants are presented. The acquired data will be used to analyse the mechanisms of phototropic growth changes of moss shoots.
Ambient-Light-Canceling Camera Using Subtraction of Frames
NASA Technical Reports Server (NTRS)
Morookian, John Michael
2004-01-01
The ambient-light-canceling camera (ALCC) is a proposed near-infrared electronic camera that would utilize a combination of (1) synchronized illumination during alternate frame periods and (2) subtraction of readouts from consecutive frames to obtain images without a background component of ambient light. The ALCC is intended especially for use in tracking the motion of an eye by the pupil center corneal reflection (PCCR) method. Eye tracking by the PCCR method has shown potential for application in human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological deficiencies. In the PCCR method, an eye is illuminated by near-infrared light from a light-emitting diode (LED). Some of the infrared light is reflected from the surface of the cornea. Some of the infrared light enters the eye through the pupil and is reflected from the back of the eye out through the pupil, a phenomenon commonly observed as the red-eye effect in flash photography. An electronic camera is oriented to image the user's eye. The output of the camera is digitized and processed by algorithms that locate the two reflections. Then from the locations of the centers of the two reflections, the direction of gaze is computed. As described thus far, the PCCR method is susceptible to errors caused by reflections of ambient light. Although a near-infrared band-pass optical filter can be used to discriminate against ambient light, some sources of ambient light have enough in-band power to compete with the LED signal. The mode of operation of the ALCC would complement or supplant spectral filtering by providing more nearly complete cancellation of the effect of ambient light. In the operation of the ALCC, a near-infrared LED would be pulsed on during one camera frame period and off during the next frame period. Thus, the scene would be illuminated by both the LED (signal) light and the ambient (background) light during one frame period, and would be illuminated with only ambient (background) light during the next frame period. The camera output would be digitized and sent to a computer, wherein the pixel values of the background-only frame would be subtracted from the pixel values of the signal-plus-background frame to obtain signal-only pixel values (see figure). To prevent artifacts of motion from entering the images, it would be necessary to acquire image data at a rate greater than the standard video rate of 30 frames per second. For this purpose, the ALCC would exploit a novel control technique developed at NASA's Jet Propulsion Laboratory for advanced charge-coupled-device (CCD) cameras. This technique provides for readout from a subwindow [region of interest (ROI)] within the image frame. Because the desired reflections from the eye would typically occupy a small fraction of the area within the image frame, the ROI capability would make it possible to acquire and subtract pixel values at rates of several hundred frames per second, considerably greater than the standard video rate and sufficient to both (1) suppress motion artifacts and (2) track the motion of the eye between consecutive subtractive frame pairs.
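A minimal sketch of the frame-pair subtraction at the core of the proposed ALCC; the array names and the simple thresholding note are assumptions about how the output might be used, not the flight implementation.

```python
import numpy as np

def cancel_ambient(frame_led_on, frame_led_off):
    """Subtract the background-only frame (LED off) from the signal-plus-background
    frame (LED on) to leave only the LED-illuminated reflections, e.g. the pupil
    and the corneal glint."""
    on = frame_led_on.astype(np.int32)
    off = frame_led_off.astype(np.int32)
    return np.clip(on - off, 0, None).astype(np.uint16)

# Candidate glint/pupil pixels can then be found with a simple threshold on the
# subtracted region of interest (ROI); keeping the ROI small is what allows the
# several-hundred-fps rate that suppresses motion artifacts:
# mask = cancel_ambient(roi_on, roi_off) > threshold
```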
2015-05-08
NASA's Curiosity Mars rover recorded this view of the sun setting at the close of the mission's 956th Martian day, or sol (April 15, 2015), from the rover's location in Gale Crater. This was the first sunset observed in color by Curiosity. The image comes from the left-eye camera of the rover's Mast Camera (Mastcam). The color has been calibrated and white-balanced to remove camera artifacts. Mastcam sees color very similarly to what human eyes see, although it is actually a little less sensitive to blue than people are. Dust in the Martian atmosphere has fine particles that permit blue light to penetrate the atmosphere more efficiently than longer-wavelength colors. That causes the blue colors in the mixed light coming from the sun to stay closer to the sun's part of the sky, compared to the wider scattering of yellow and red colors. The effect is most pronounced near sunset, when light from the sun passes through a longer path in the atmosphere than it does at mid-day. Malin Space Science Systems, San Diego, built and operates the rover's Mastcam. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, manages the Mars Science Laboratory Project for NASA's Science Mission Directorate, Washington. JPL designed and built the project's Curiosity rover. http://photojournal.jpl.nasa.gov/catalog/PIA19400
Porphyrin involvement in redshift fluorescence in dentin decay
NASA Astrophysics Data System (ADS)
Slimani, A.; Panayotov, I.; Levallois, B.; Cloitre, T.; Gergely, C.; Bec, N.; Larroque, C.; Tassery, H.; Cuisinier, F.
2014-05-01
The aim of this study was to evaluate the involvement of porphyrins in the red fluorescence observed in dental caries with the Soprolife® light-induced fluorescence camera in treatment mode (SOPRO, ACTEON Group, La Ciotat, France) and the Vistacam® camera (DÜRR DENTAL AG, Bietigheim-Bissingen, Germany). The International Caries Detection and Assessment System (ICDAS) was used to rank the samples. Human teeth cross-sections, ranked from ICDAS score 0 to 6, were examined by epi-fluorescence microscopy and confocal Raman microscopy. Comparable studies were done with Protoporphyrin IX, Porphyrin I and Pentosidine solutions. An RGB analysis of Soprolife® images was performed using ImageJ software (1.46r, National Institutes of Health, USA). Fluorescence spectroscopy and micro-Raman spectroscopy revealed the presence of Protoporphyrin IX in carious enamel, dentin and dental plaque. However, the presence of Porphyrin I and pentosidine cannot be excluded. The results indicated that not only porphyrins were implicated in the red fluorescence; advanced glycation end products (AGEs) of the Maillard reaction also contributed to this phenomenon.
NASA Astrophysics Data System (ADS)
Strömberg, Tomas; Saager, Rolf B.; Kennedy, Gordon T.; Fredriksson, Ingemar; Salerud, Göran; Durkin, Anthony J.; Larsson, Marcus
2018-02-01
Spatial frequency domain imaging (SFDI) utilizes a digital light processing (DLP) projector for illuminating turbid media with sinusoidal patterns. The tissue absorption (μa) and reduced scattering coefficient (μs') are calculated by analyzing the modulation transfer function for at least two spatial frequencies. We evaluated different illumination strategies with red, green and blue light emitting diodes (LEDs) in the DLP, while imaging with a filter mosaic camera, XiSpec, with 16 different multi-wavelength-sensitive pixels in the 470-630 nm wavelength range. Data were compared to SFDI by a multispectral camera setup (MSI) consisting of four cameras with bandpass filters centered at 475, 560, 580 and 650 nm. A pointwise system for comprehensive microcirculation analysis (EPOS) was used for comparison. A 5-min arterial occlusion and release protocol on the forearm of a Caucasian male with fair skin was analyzed by fitting the absorption spectra of the chromophores HbO2, Hb and melanin to the estimated μa. The tissue fractions of red blood cells (fRBC) and melanin (fmel) and the Hb oxygenation (SO2) were calculated at baseline, end of occlusion, early after release and late after release. EPOS results showed a decrease in SO2 during the occlusion and hyperemia during release (SO2 = 40%, 5%, 80% and 51%). The fRBC showed an increase during the occlusion and release phases. The best MSI resemblance to the EPOS was for green LED illumination (SO2 = 53%, 9%, 82%, 65%). Several illumination and analysis strategies using the XiSpec gave un-physiological results (e.g. negative SO2). XiSpec with green LED illumination gave the expected change in fRBC, while the dynamics in SO2 were less than those for EPOS. These results may be explained by the calculation of modulation using an illumination and detector setup with a broad spectral transmission bandwidth, with considerable variation in μa of the included chromophores. Approaches for either reducing the effective bandwidth of the XiSpec filters or including their characteristics in a light transport model for SFDI modulation are proposed.
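For orientation, the modulation calculation at the heart of SFDI can be sketched with the standard three-phase demodulation formulas (not necessarily the exact pipeline used in this study); the image arrays are placeholders.

```python
import numpy as np

def demodulate_ac(i1, i2, i3):
    """Three-phase SFDI demodulation: i1, i2, i3 are images acquired with the
    sinusoidal pattern shifted by 0, 120 and 240 degrees. Returns the AC
    modulation amplitude per pixel."""
    return (np.sqrt(2.0) / 3.0) * np.sqrt((i1 - i2) ** 2 +
                                          (i2 - i3) ** 2 +
                                          (i3 - i1) ** 2)

def demodulate_dc(i1, i2, i3):
    """Planar (DC) component, equivalent to uniform illumination."""
    return (i1 + i2 + i3) / 3.0

# The diffuse reflectance at each spatial frequency (after calibration against a
# phantom of known optical properties) is then inverted with a light transport
# model to recover the per-pixel absorption and reduced scattering coefficients.
```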
Aurora Australis taken during Expedition Six
2003-02-16
ISS006-E-028961 (16 Feb. 2003) --- The Expedition Six crew enjoyed this green aurora dancing over the night side of the Earth just after sunset on February 16, 2003. The reds and blues of sunset light up the air layer to the west. The image was recorded with a 58 mm lens on a digital still camera. Because auroras follow Earth's magnetic field, they are observed at Earth's poles, where oxygen and nitrogen atoms glow when bombarded by charged particles coming from the sun. In a sense, auroras are the "neon lights" of the poles.
Layered Outcrops in Gusev Crater (False Color)
NASA Technical Reports Server (NTRS)
2004-01-01
One of the ways scientists collect mineralogical data about rocks on Mars is to view them through filters that allow only specific wavelengths of light to pass through the lens of the panoramic camera. NASA's Mars Exploration Rover Spirit took this false-color image of the rock nicknamed 'Tetl' at 1:05 p.m. martian time on its 270th martian day, or sol (Oct. 5, 2004) using the panoramic camera's 750-, 530-, and 430-nanometer filters. Darker red hues in the image correspond to greater concentrations of oxidized soil and dust. Bluer hues correspond to portions of rock that are not as heavily coated with soils or are not as highly oxidized.
Low Cost Night Vision System for Intruder Detection
NASA Astrophysics Data System (ADS)
Ng, Liang S.; Yusoff, Wan Azhar Wan; R, Dhinesh; Sak, J. S.
2016-02-01
The growth in production of Android devices has resulted in greater functionality as well as lower costs. This has made previously more expensive systems such as night vision affordable for more businesses and end users. We designed and implemented robust, low-cost night vision systems based on red-green-blue (RGB) colour histograms for a static camera as well as a camera on an unmanned aerial vehicle (UAV), using the OpenCV library on Intel-compatible notebook computers running the Ubuntu Linux operating system with less than 8 GB of RAM. They were tested against human intruders under low-light conditions (indoor, outdoor, night time) and were shown to have successfully detected the intruders.
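A minimal sketch of an RGB-histogram change detector of the kind described, using OpenCV; the empty-scene reference approach and the correlation threshold are assumptions, not the authors' exact pipeline.

```python
import cv2
import numpy as np

def rgb_histogram(bgr_frame, bins=32):
    """Concatenated, normalized per-channel histogram of a frame."""
    hists = [cv2.calcHist([bgr_frame], [c], None, [bins], [0, 256])
             for c in range(3)]
    return cv2.normalize(np.vstack(hists), None).flatten()

def intruder_present(reference_hist, current_frame, threshold=0.7):
    """Flag a frame whose colour distribution has drifted away from the
    empty-scene reference (histogram correlation below the threshold)."""
    score = cv2.compareHist(reference_hist.astype(np.float32),
                            rgb_histogram(current_frame).astype(np.float32),
                            cv2.HISTCMP_CORREL)
    return score < threshold
```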
NASA Technical Reports Server (NTRS)
2001-01-01
These images taken through the wide angle camera near closest approach in the deep near-infrared methane band, combined with filters which sense electromagnetic radiation of orthogonal polarization, show that the light from the poles is polarized. That is, the poles appear bright in one image, and dark in the other. Polarized light is most readily scattered by aerosols. These images indicate that the aerosol particles at Jupiter's poles are small and likely consist of aggregates of even smaller particles, whereas the particles at the equator and covering the Great Red Spot are larger. Images like these will allow scientists to ascertain the distribution, size and shape of aerosols, and consequently, the distribution of heat, in Jupiter's atmosphere.
NASA Astrophysics Data System (ADS)
Mehta, Dalip Singh; Sharma, Anuradha; Dubey, Vishesh; Singh, Veena; Ahmad, Azeem
2016-03-01
We present a single-shot white light interference microscopy technique for the quantitative phase imaging (QPI) of biological cells and tissues. A common path white light interference microscope is developed, and a colorful white light interferogram is recorded by a three-chip color CCD camera. The recorded white light interferogram is decomposed into the red, green and blue color wavelength component interferograms, which are processed to determine the refractive index (RI) for the different color wavelengths. The decomposed interferograms are analyzed using a local model fitting (LMF) algorithm developed for reconstructing the phase map from a single interferogram. LMF is a slightly off-axis interferometric QPI method that employs only a single image, so it is fast and accurate. The present method is very useful for dynamic processes where the path length changes at the millisecond level. From the single interferogram, wavelength-dependent quantitative phase images of human red blood cells (RBCs) are reconstructed and the refractive index is determined. The LMF algorithm is simple to implement and computationally efficient. The results are compared with the conventional phase shifting interferometry and Hilbert transform techniques.
Ruggedized Spectrometers Are Built for Tough Jobs
NASA Technical Reports Server (NTRS)
2015-01-01
The Mars Curiosity Chemistry and Camera instrument, or ChemCam, analyzes the elemental composition of materials on the Red Planet by using a spectrometer to measure the wavelengths of light they emit. Principal investigator Roger Wiens worked with Ocean Optics, out of Dunedin, Florida, to rework the company's spectrometer to operate in cold and rowdy conditions and also during the stresses of liftoff. Those improvements have been incorporated into the firm's commercial product line.
2015-02-09
If your eyes could only see the color red, this is how Saturn's rings would look. Many Cassini color images, like this one, are taken in red light so scientists can study the often subtle color variations of Saturn's rings. These variations may reveal clues about the chemical composition and physical nature of the rings. For example, the longer a surface is exposed to the harsh environment in space, the redder it becomes. Putting together many clues derived from such images, scientists are coming to a deeper understanding of the rings without ever actually visiting a single ring particle. This view looks toward the sunlit side of the rings from about 11 degrees above the ringplane. The image was taken in red light with the Cassini spacecraft narrow-angle camera on Dec. 6, 2014. The view was acquired at a distance of approximately 870,000 miles (1.4 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 27 degrees. Image scale is 5 miles (8 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18301
Measurement of the length of pedestrian crossings and detection of traffic lights from image data
NASA Astrophysics Data System (ADS)
Shioyama, Tadayoshi; Wu, Haiyuan; Nakamura, Naoki; Kitawaki, Suguru
2002-09-01
This paper proposes a method for measurement of the length of a pedestrian crossing and for the detection of traffic lights from image data observed with a single camera. The length of a crossing is measured from image data of white lines painted on the road at a crossing by using projective geometry. Furthermore, the state of the traffic lights, green (go signal) or red (stop signal), is detected by extracting candidates for the traffic light region with colour similarity and selecting a true traffic light from them using affine moment invariants. From the experimental results, the length of a crossing is measured with an accuracy such that the maximum relative error of measured length is less than 5% and the rms error is 0.38 m. A traffic light is efficiently detected by selecting a true traffic light region with an affine moment invariant.
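The abstract mentions selecting the true traffic-light region by colour similarity and affine moment invariants. The sketch below, which assumes OpenCV, a hypothetical image file and a crude HSV threshold for "red", computes the first Flusser-Suk affine moment invariant for each candidate region; it illustrates the kind of shape descriptor involved, not the paper's exact pipeline.

```python
# Minimal sketch: first affine moment invariant of candidate traffic-light regions.
import cv2
import numpy as np

def first_affine_invariant(mask):
    m = cv2.moments(mask, True)                 # treat the mask as a binary image
    mu00 = m["m00"]
    if mu00 == 0:
        return 0.0
    # I1 = (mu20 * mu02 - mu11^2) / mu00^4  (Flusser-Suk affine invariant)
    return (m["mu20"] * m["mu02"] - m["mu11"] ** 2) / mu00 ** 4

image = cv2.imread("crossing.jpg")              # hypothetical input image
assert image is not None, "could not read input image"
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
red_mask = cv2.inRange(hsv, (0, 120, 80), (10, 255, 255))   # crude "red" range

n_labels, labels = cv2.connectedComponents(red_mask)
for label in range(1, n_labels):
    candidate = (labels == label).astype(np.uint8)
    print(label, first_affine_invariant(candidate))
```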
Computer-aided analysis for the Mechanics of Granular Materials (MGM) experiment, part 2
NASA Technical Reports Server (NTRS)
Parker, Joey K.
1987-01-01
Computer vision based analysis for the MGM experiment is continued and expanded into new areas. Volumetric strains of granular material triaxial test specimens have been measured from digitized images. A computer-assisted procedure is used to identify the edges of the specimen, and the edges are used in a 3-D model to estimate specimen volume. The results of this technique compare favorably to conventional measurements. A simplified model of the magnification caused by diffraction of light within the water of the test apparatus was also developed. This model yields good results when the distance between the camera and the test specimen is large compared to the specimen height. An algorithm for a more accurate 3-D magnification correction is also presented. The use of composite and RGB (red-green-blue) color cameras is discussed and potentially significant benefits from using an RGB camera are presented.
NASA Technical Reports Server (NTRS)
1982-01-01
Model II Multispectral Camera is an advanced aerial camera that provides optimum enhancement of a scene by recording spectral signatures of ground objects only in narrow, preselected bands of the electromagnetic spectrum. Its photos have applications in such areas as agriculture, forestry, water pollution investigations, soil analysis, geologic exploration, water depth studies and camouflage detection. The target scene is simultaneously photographed in four separate spectral bands. A multispectral viewer, such as Spectral Data's Model 75, creates a color image from the black and white positives taken by the camera. With this optical image analysis unit, all four bands are superimposed in accurate registration and illuminated with combinations of blue, green, red, and white light. The best color combination for displaying the target object is selected and printed. Spectral Data Corporation produces several types of remote sensing equipment and also provides aerial survey, image processing and analysis, and a number of other remote sensing services.
Sugimura, Daisuke; Kobayashi, Suguru; Hamamoto, Takayuki
2017-11-01
Light field imaging is an emerging technique that is employed to realize various applications such as multi-viewpoint imaging, focal-point changing, and depth estimation. In this paper, we propose the concept of a dual-resolution light field imaging system to synthesize super-resolved multi-viewpoint images. The key novelty of this study is the use of an organic photoelectric conversion film (OPCF), a device that converts the spectral information of incoming light within a certain wavelength range into an electrical signal (pixel value), for light field imaging. In our imaging system, we place the OPCF, which has green spectral sensitivity, onto the micro-lens array of a conventional light field camera. The OPCF allows us to acquire the green spectral information only at the center viewpoint with the full resolution of the image sensor. In contrast, the optical system of the light field camera in our imaging system captures the other spectral information (red and blue) at multiple viewpoints (sub-aperture images) but with low resolution. Thus, our dual-resolution light field imaging system enables us to simultaneously capture information about the target scene at a high spatial resolution as well as the direction information of the incoming light. By exploiting these advantages of our imaging system, our proposed method enables the synthesis of full-resolution multi-viewpoint images. We perform experiments using synthetic images, and the results demonstrate that our method outperforms previous methods.
Lee, Changju; So, Jaehyun Jason; Ma, Jiaqi
2018-01-02
The conflicts among motorists entering a signalized intersection with the red light indication have become a national safety issue. Because of its sensitivity, efforts have been made to investigate the possible causes and the effectiveness of countermeasures using comparison sites and/or before-and-after studies. Nevertheless, these approaches are ineffective when comparison sites cannot be found, or crash data sets are not readily available or not reliable for statistical analysis. Considering the random nature of red light running (RLR) crashes, an approach that does not depend on such data availability is necessary to evaluate the effectiveness of countermeasures head to head. The aims of this research are to (1) review prior literature related to red light running and traffic safety models; (2) propose a practical methodology for evaluation of RLR countermeasures with a microscopic traffic simulation model and the surrogate safety assessment model (SSAM); (3) apply the proposed methodology to an actual signalized intersection in Virginia, with the most prevalent scenarios: increasing the yellow signal interval duration, installing an advance warning sign, and installing an RLR camera; and (4) analyze the relative effectiveness by RLR frequency and the number of conflicts (rear-end and crossing). All scenarios show a reduction in RLR frequency (-7.8, -45.5, and -52.4%, respectively), but only increasing the yellow signal interval duration results in a reduced total number of conflicts (-11.3%; a surrogate safety measure of possible RLR-related crashes). An RLR camera produces the greatest reduction (-60.9%) in crossing conflicts (a surrogate safety measure of possible angle crashes), whereas increasing the yellow signal interval duration results in only a 12.8% reduction of rear-end conflicts (a surrogate safety measure of possible rear-end crashes). Although increasing the yellow signal interval duration is advantageous because it reduces the total number of conflicts (a proxy for total RLR-related crashes), each countermeasure shows different effects by RLR-related conflict type that can be referred to when making a decision. Given that each intersection has different RLR crash issues, the evaluated countermeasures are directly applicable to enhance cost and time effectiveness, according to the situation of the target intersection. In addition, the proposed methodology is replicable at any site that has a dearth of crash data and/or comparison sites in order to test other countermeasures (both engineering and enforcement) for RLR crashes.
2D Measurements of the Balmer Series in Proto-MPEX using a Fast Visible Camera Setup
NASA Astrophysics Data System (ADS)
Lindquist, Elizabeth G.; Biewer, Theodore M.; Ray, Holly B.
2017-10-01
The Prototype Material Plasma Exposure eXperiment (Proto-MPEX) is a linear plasma device with densities up to 10^20 m^-3 and temperatures up to 20 eV. Broadband spectral measurements show the visible emission spectra are solely due to the Balmer lines of deuterium. Monochromatic and RGB color Sanstreak SC1 Edgertronic fast visible cameras capture high speed video of plasmas in Proto-MPEX. The color camera is equipped with a long pass 450 nm filter and an internal Bayer filter to view the Dα line at 656 nm on the red channel and the Dβ line at 486 nm on the blue channel. The monochromatic camera has a 434 nm narrow bandpass filter to view the Dγ intensity. In the setup, a 50/50 beam splitter is used so both cameras image the same region of the plasma discharge. Camera images were aligned to each other by viewing a grid, ensuring 1 pixel registration between the two cameras. A uniform-intensity calibrated white light source was used to perform a pixel-to-pixel relative and an absolute intensity calibration for both cameras. Python scripts combined the dual camera data, rendering the Dα, Dβ, and Dγ intensity ratios. Observations from Proto-MPEX discharges will be presented. This work was supported by the U.S. D.O.E. contract DE-AC05-00OR22725.
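A minimal sketch of the final combination step the abstract describes: forming Balmer-line intensity ratios from the two pixel-registered cameras after intensity calibration. The array names, calibration factors and placeholder data are assumptions for illustration only.

```python
# Minimal sketch: combine calibrated dual-camera frames into Balmer-line ratio images.
import numpy as np

# Pixel-registered, flat-field-corrected frames (placeholder data shapes).
d_alpha = np.random.rand(480, 640) * 100.0   # red channel of the colour camera
d_beta = np.random.rand(480, 640) * 40.0     # blue channel of the colour camera
d_gamma = np.random.rand(480, 640) * 10.0    # filtered monochrome camera

# Absolute intensity calibration factors from the white-light source (assumed values).
k_alpha, k_beta, k_gamma = 1.00, 1.12, 1.35
eps = 1e-9                                   # avoid division by zero

ratio_ab = (k_alpha * d_alpha) / (k_beta * d_beta + eps)    # D_alpha / D_beta
ratio_bg = (k_beta * d_beta) / (k_gamma * d_gamma + eps)    # D_beta / D_gamma
print(ratio_ab.mean(), ratio_bg.mean())
```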
Jupiter From 2.8 Million Miles
2016-08-25
This dual view of Jupiter was taken on August 23, when NASA's Juno spacecraft was 2.8 million miles (4.4 million kilometers) from the gas giant planet on the inbound leg of its initial 53.5-day capture orbit. The image on the left is a color composite taken with Junocam's visible red, green, and blue filters. The image on the right was also taken by JunoCam, but uses the camera's infrared filter, which is sensitive to the abundance of methane in the atmosphere. Bright features like the planet's Great Red Spot are higher in the atmosphere, and so have less of their light absorbed by the methane. http://photojournal.jpl.nasa.gov/catalog/PIA20884
NASA Astrophysics Data System (ADS)
Kawashima, Hayato; Yamaji, Masahiro; Suzuki, Jun'ichi; Tanaka, Shuhei
2011-03-01
We report an invisible two-dimensional (2D) barcode embedded into a synthetic fused silica by femtosecond laser processing using a computer-generated hologram (CGH) that generates a spatially extended femtosecond pulse beam in the depth direction. When we illuminate the irradiated 2D barcode pattern with a 254 nm ultraviolet (UV) light, a strong red photoluminescence (PL) is observed, and we can read it by using a complementary metal oxide semiconductor (CMOS) camera and image processing technology. This work provides a novel barcode fabrication method by femtosecond laser processing using a CGH and a barcode reading method by a red PL.
Spitzer Finds Clarity in the Inner Milky Way
NASA Technical Reports Server (NTRS)
2008-01-01
More than 800,000 frames from NASA's Spitzer Space Telescope were stitched together to create this infrared portrait of dust and stars radiating in the inner Milky Way. As inhabitants of a flat galactic disk, Earth and its solar system have an edge-on view of their host galaxy, like looking at a glass dish from its edge. From our perspective, most of the galaxy is condensed into a blurry narrow band of light that stretches completely around the sky, also known as the galactic plane. In this mosaic the galactic plane is broken up into five components: the far-left side of the plane (top image); the area just left of the galactic center (second to top); galactic center (middle); the area to the right of galactic center (second to bottom); and the far-right side of the plane (bottom). From Earth, the top two panels are visible to the northern hemisphere, and the bottom two images to the southern hemisphere. Together, these panels represent more than 50 percent of our entire Milky Way galaxy. The swaths of green represent organic molecules, called polycyclic aromatic hydrocarbons, which are illuminated by light from nearby star formation, while the thermal emission, or heat, from warm dust is rendered in red. Star-forming regions appear as swirls of red and yellow, where the warm dust overlaps with the glowing organic molecules. The blue specks sprinkled throughout the photograph are Milky Way stars. The bluish-white haze that hovers heavily in the middle panel is starlight from the older stellar population towards the center of the galaxy. This is a three-color composite that shows infrared observations from two Spitzer instruments. Blue represents 3.6-micron light and green shows light of 8 microns, both captured by Spitzer's infrared array camera. Red is 24-micron light detected by Spitzer's multiband imaging photometer. The Galactic Legacy Infrared Mid-Plane Survey Extraordinaire team (GLIMPSE) used the telescope's infrared array camera to see light from newborn stars, old stars and polycyclic aromatic hydrocarbons. A second group, the Multiband Imaging Photometer for Spitzer Galactic Plane Survey team (MIPSGAL), imaged dust in the inner galaxy with Spitzer's multiband imaging photometer.
2016-11-18
This image of Ceres approximates how the dwarf planet's colors would appear to the eye. This view of Ceres, produced by the German Aerospace Center in Berlin, combines images taken during Dawn's first science orbit in 2015 using the framing camera's red, green and blue spectral filters. The color was calculated using a reflectance spectrum, which is based on the way that Ceres reflects different wavelengths of light and the solar wavelengths that illuminate Ceres. http://photojournal.jpl.nasa.gov/catalog/PIA21079
Imaging Molecular Signatures of Breast Cancer With X-ray Activated Nano-Phosphors
2011-09-01
high resolution with a decrease in X-ray dose to healthy tissue. For the first-year training goals, this grant has provided for extensive study in... europium (red) were studied. The light emission was imaged in a clinical X-ray scanner with a cooled CCD camera and a spectrophotometer; dose... Indeed, in a preliminary study, these phosphors were targeted to the Folate receptor (commonly expressed in breast cancer) and taken up by live cells
Applying Bayesian hierarchical models to examine motorcycle crashes at signalized intersections.
Haque, Md Mazharul; Chin, Hoong Chor; Huang, Helai
2010-01-01
Motorcycles are overrepresented in road traffic crashes and particularly vulnerable at signalized intersections. The objective of this study is to identify causal factors affecting motorcycle crashes at both four-legged and T signalized intersections. Treating the data in time-series cross-section panels, this study explores different hierarchical Poisson models and finds that the model allowing an autoregressive lag-1 dependence specification in the error term is the most suitable. Results show that the number of lanes at four-legged signalized intersections significantly increases motorcycle crashes, largely because of the higher exposure resulting from higher motorcycle accumulation at the stop line. Furthermore, the presence of a wide median and an uncontrolled left-turn lane on the major roadways of four-legged intersections exacerbates this potential hazard. For T signalized intersections, the presence of an exclusive right-turn lane on both major and minor roadways and an uncontrolled left-turn lane on major roadways increase motorcycle crashes. Motorcycle crashes increase on high-speed roadways because motorcyclists are more vulnerable and less likely to react in time during conflicts. The presence of red light cameras reduces motorcycle crashes significantly at both four-legged and T intersections. With the red light camera, motorcycles are less exposed to conflicts because they are observed to be more disciplined in queuing at the stop line and less likely to jump-start at the onset of green.
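For readers unfamiliar with crash-frequency modelling, the sketch below fits a plain (non-hierarchical, non-Bayesian) Poisson regression with a log-exposure offset using statsmodels. The column names and toy data are invented; the paper's actual models are hierarchical with AR(1) error terms.

```python
# A much simpler, non-hierarchical sketch of a Poisson crash-frequency model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
data = pd.DataFrame({
    "crashes": rng.poisson(2, size=200),          # motorcycle crashes per site-year
    "lanes": rng.integers(2, 6, size=200),        # number of approach lanes
    "red_light_camera": rng.integers(0, 2, 200),  # 1 if an RLC is present
    "aadt": rng.uniform(5_000, 60_000, 200),      # traffic volume (exposure)
})

model = smf.glm(
    "crashes ~ lanes + red_light_camera",
    data=data,
    family=sm.families.Poisson(),
    offset=np.log(data["aadt"]),                  # log-exposure offset
).fit()
print(model.summary())
```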
NASA Astrophysics Data System (ADS)
Stepp, Herbert G.; Baumgartner, Reinhold; Beyer, Wolfgang; Knuechel, Ruth; Koerner, T. O.; Kriegmair, M.; Rick, Kai; Steinbach, Pia; Hofstetter, Alfons G.
1995-12-01
In a clinical pilot study performed on 104 patients suffering from bladder cancer, it could be shown that intravesical instillation of a solution of 5-aminolevulinic acid (5-ALA) induces a tumor-selective accumulation of Protoporphyrin IX (PPIX). Malignant lesions could be detected with a sensitivity of 97% and a specificity of 67%. The Kr+-laser as excitation light source could successfully be replaced by a filtered short-arc Xe-lamp. Its emission wavelength band (375 nm - 440 nm) leads to an efficiency of 58% for PPIX excitation compared to the laser. Two hundred sixty mW of output power at the distal end of a slightly modified cystoscope could be obtained. This is sufficient for recording fluorescence images with a target-integrating color CCD camera. Red fluorescence and blue remitted light are displayed simultaneously. Standard white light observation is possible with the same instrumentation. Pharmacokinetic measurements were performed on 18 patients after different routes of 5-ALA application (oral, inhalation and intravesical instillation). PPIX fluorescence measurements were made on the skin and on the blood plasma. The pharmacokinetics of 5-ALA could be determined from blood plasma. Endoscopic fluorescence spectroscopy showed a high fluorescence contrast between tumor and normal tissue, with a mean value of 10.7. Forthcoming clinical multicenter studies require an objective measure of the fluorescence intensity. Monte Carlo computer simulations showed that artifacts due to observation geometry and varying absorption can largely be reduced by ratioing fluorescence (red channel of the camera) to remission (blue channel). Real-time image ratioing provides false-color images with reliable fluorescence information.
NASA Technical Reports Server (NTRS)
2003-01-01
MGS MOC Release No. MOC2-344, 28 April 2003
This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image mosaic was constructed from data acquired by the MOC red wide angle camera. The large, circular feature in the upper left is Aram Chaos, an ancient impact crater filled with layered sedimentary rock that was later disrupted and eroded to form a blocky, 'chaotic' appearance. To the southeast of Aram Chaos, in the lower right of this picture, is Iani Chaos. The light-toned patches amid the large blocks of Iani Chaos are known from higher-resolution MOC images to be layered, sedimentary rock outcrops. The picture center is near 0.5°N, 20°W. Sunlight illuminates the scene from the left/upper left.
Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery
NASA Astrophysics Data System (ADS)
Kwoh, L. K.; Huang, X.; Tan, W. J.
2012-07-01
XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands - 0.52~0.60 µm for Green, 0.63~0.69 µm for Red and 0.76~0.89 µm for NIR - at 12 m resolution. In the design of the IRIS camera, the three bands were acquired by three lines of CCDs (NIR, Red and Green). These CCDs were physically separated in the focal plane and their first pixels were not absolutely aligned. The micro-satellite platform was also not stable enough to allow co-registration of the 3 bands with a simple linear transformation. In the camera model developed, this platform instability was compensated for with 3rd to 4th order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, and the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs red and green vs red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixel for the NIR vs red and green vs red CCDs, respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
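A minimal sketch of the attitude-compensation idea: fitting a low-order polynomial to apparent roll error as a function of along-track image line, then evaluating it when geolocating each line. The synthetic residuals and the choice of a 3rd-order fit are illustrative assumptions, not XSAT calibration data.

```python
# Minimal sketch: model a slowly varying attitude error with a low-order polynomial.
import numpy as np

rng = np.random.default_rng(1)
lines = np.arange(0.0, 6000.0, 50.0)                     # along-track image lines
# Synthetic roll residuals (degrees) from tie-point measurements (placeholder data).
true_roll = 2e-12 * lines**3 - 1.5e-8 * lines**2 + 3e-5 * lines
roll_measured = true_roll + rng.normal(0, 1e-3, lines.size)

coeffs = np.polyfit(lines, roll_measured, deg=3)         # 3rd-order polynomial fit
roll_model = np.polyval(coeffs, lines)
print("rms fit error (deg):", np.sqrt(np.mean((roll_model - roll_measured) ** 2)))
```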
Opto-fluidics based microscopy and flow cytometry on a cell phone for blood analysis.
Zhu, Hongying; Ozcan, Aydogan
2015-01-01
Blood analysis is one of the most important clinical tests for medical diagnosis. Flow cytometry and optical microscopy are widely used techniques to perform blood analysis and therefore cost-effective translation of these technologies to resource limited settings is critical for various global health as well as telemedicine applications. In this chapter, we review our recent progress on the integration of imaging flow cytometry and fluorescent microscopy on a cell phone using compact, light-weight and cost-effective opto-fluidic attachments integrated onto the camera module of a smartphone. In our cell-phone based opto-fluidic imaging cytometry design, fluorescently labeled cells are delivered into the imaging area using a disposable micro-fluidic chip that is positioned above the existing camera unit of the cell phone. Battery powered light-emitting diodes (LEDs) are butt-coupled to the sides of this micro-fluidic chip without any lenses, which effectively acts as a multimode slab waveguide, where the excitation light is guided to excite the fluorescent targets within the micro-fluidic chip. Since the excitation light propagates perpendicular to the detection path, an inexpensive plastic absorption filter is able to reject most of the scattered light and create a decent dark-field background for fluorescent imaging. With this excitation geometry, the cell-phone camera can record fluorescent movies of the particles/cells as they are flowing through the microchannel. The digital frames of these fluorescent movies are then rapidly processed to quantify the count and the density of the labeled particles/cells within the solution under test. With a similar opto-fluidic design, we have recently demonstrated imaging and automated counting of stationary blood cells (e.g., labeled white blood cells or unlabeled red blood cells) loaded within a disposable cell counting chamber. We tested the performance of this cell-phone based imaging cytometry and blood analysis platform by measuring the density of red and white blood cells as well as hemoglobin concentration in human blood samples, which showed a good match to our measurement results obtained using a commercially available hematology analyzer. Such a cell-phone enabled opto-fluidics microscopy, flow cytometry, and blood analysis platform could be especially useful for various telemedicine applications in remote and resource-limited settings.
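A minimal sketch, assuming OpenCV and a hypothetical saved frame, of how labelled cells in one fluorescence image might be counted by thresholding and connected-component labelling; the threshold and minimum-area values are illustrative and not the authors' processing pipeline.

```python
# Minimal sketch: count bright fluorescent blobs in a single dark-field frame.
import cv2

frame = cv2.imread("fluorescence_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
assert frame is not None, "could not read input frame"
blurred = cv2.GaussianBlur(frame, (5, 5), 0)
_, binary = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY)

n_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
min_area = 15                                     # reject single-pixel noise
cells = [i for i in range(1, n_labels) if stats[i, cv2.CC_STAT_AREA] >= min_area]
print(f"counted {len(cells)} labelled cells in this frame")
```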
GETTING TO THE HEART OF A GALAXY
NASA Technical Reports Server (NTRS)
2002-01-01
This collage of images in visible and infrared light reveals how the barred spiral galaxy NGC 1365 is feeding material into its central region, igniting massive star birth and probably causing its bulge of stars to grow. The material also is fueling a black hole in the galaxy's core. A galaxy's bulge is a central, football-shaped structure composed of stars, gas, and dust. The black-and-white image in the center, taken by a ground-based telescope, displays the entire galaxy. But the telescope's resolution is not powerful enough to reveal the flurry of activity in the galaxy's hub. The blue box in the galaxy's central region outlines the area observed by the NASA Hubble Space Telescope's visible-light camera, the Wide Field and Planetary Camera 2 (WFPC2). The red box pinpoints a narrower view taken by the Hubble telescope's infrared camera, the Near Infrared Camera and Multi-Object Spectrometer (NICMOS). A barred spiral is characterized by a lane of stars, gas, and dust slashing across a galaxy's central region. It has a small bulge that is dominated by a disk of material. The spiral arms begin at both ends of the bar. The bar is funneling material into the hub, which triggers star formation and feeds the bulge. The visible-light picture at upper left is a close-up view of the galaxy's hub. The bright yellow orb is the nucleus. The dark material surrounding the orb is gas and dust that is being funneled into the central region by the bar. The blue regions pinpoint young star clusters. In the infrared image at lower right, the Hubble telescope penetrates the dust seen in the WFPC2 picture to reveal more clusters of young stars. The bright blue dots represent young star clusters; the brightest of the red dots are young star clusters enshrouded in dust and visible only in the infrared image. The fainter red dots are older star clusters. The WFPC2 image is a composite of three filters: near-ultraviolet (3327 Angstroms), visible (5552 Angstroms), and near-infrared (8269 Angstroms). The NICMOS image, taken at a wavelength of 16,000 Angstroms, was combined with the visible and near-infrared wavelengths taken by WFPC2. The WFPC2 image was taken in January 1996; the NICMOS data were taken in April 1998. Credits for the ground-based image: Allan Sandage (The Observatories of the Carnegie Institution of Washington) and John Bedke (Computer Sciences Corporation and the Space Telescope Science Institute) Credits for the WFPC2 image: NASA and John Trauger (Jet Propulsion Laboratory) Credits for the NICMOS image: NASA, ESA, and C. Marcella Carollo (Columbia University)
New Views of a Familiar Beauty
NASA Technical Reports Server (NTRS)
2005-01-01
[Figures 1-5 removed for brevity, see original site]
This image composite compares the well-known visible-light picture of the glowing Trifid Nebula (left panel) with infrared views from NASA's Spitzer Space Telescope (remaining three panels). The Trifid Nebula is a giant star-forming cloud of gas and dust located 5,400 light-years away in the constellation Sagittarius. The false-color Spitzer images reveal a different side of the Trifid Nebula. Where dark lanes of dust are visible trisecting the nebula in the visible-light picture, bright regions of star-forming activity are seen in the Spitzer pictures. All together, Spitzer uncovered 30 massive embryonic stars and 120 smaller newborn stars throughout the Trifid Nebula, in both its dark lanes and luminous clouds. These stars are visible in all the Spitzer images, mainly as yellow or red spots. Embryonic stars are developing stars about to burst into existence. Ten of the 30 massive embryos discovered by Spitzer were found in four dark cores, or stellar 'incubators,' where stars are born. Astronomers using data from the Institute of Radioastronomy millimeter telescope in Spain had previously identified these cores but thought they were not quite ripe for stars. Spitzer's highly sensitive infrared eyes were able to penetrate all four cores to reveal rapidly growing embryos. Astronomers can actually count the individual embryos tucked inside the cores by looking closely at the Spitzer image taken by its infrared array camera (figure 4). This instrument has the highest spatial resolution of Spitzer's imaging cameras. The Spitzer image from the multiband imaging photometer (figure 5), on the other hand, specializes in detecting cooler materials. Its view highlights the relatively cool core material falling onto the Trifid's growing embryos. The middle panel is a combination of Spitzer data from both of these instruments. The embryos are thought to have been triggered by a massive 'type O' star, which can be seen as a white spot at the center of the nebula in all four images. Type O stars are the most massive stars, ending their brief lives in explosive supernovas. The small newborn stars probably arose at the same time as the O star, and from the same original cloud of gas and dust. The Spitzer infrared array camera image is a three-color composite of invisible light, showing emissions from wavelengths of 3.6 microns (blue), 4.5 microns (green), 5.8 and 8.0 microns (red). The Spitzer multiband imaging photometer image (figure 3) shows 24-micron emissions. The Spitzer mosaic image combines data from these pictures, showing light of 4.5 microns (blue), 8.0 microns (green) and 24 microns (red). The visible-light image (figure 2) is from the National Optical Astronomy Observatory, Tucson, Ariz.
LIFTING THE VEIL OF DUST TO REVEAL THE SECRETS OF SPIRAL GALAXIES
NASA Technical Reports Server (NTRS)
2002-01-01
Astronomers have combined information from the NASA Hubble Space Telescope's visible- and infrared-light cameras to show the hearts of four spiral galaxies peppered with ancient populations of stars. The top row of pictures, taken by a ground-based telescope, represents complete views of each galaxy. The blue boxes outline the regions observed by the Hubble telescope. The bottom row represents composite pictures from Hubble's visible- and infrared-light cameras, the Wide Field and Planetary Camera 2 (WFPC2) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS). Astronomers combined views from both cameras to obtain the true ages of the stars surrounding each galaxy's bulge. The Hubble telescope's sharper resolution allows astronomers to study the intricate structure of a galaxy's core. The galaxies are ordered by the size of their bulges. NGC 5838, an 'S0' galaxy, is dominated by a large bulge and has no visible spiral arms; NGC 7537, an 'Sbc' galaxy, has a small bulge and loosely wound spiral arms. Astronomers think that the structure of NGC 7537 is very similar to our Milky Way. The galaxy images are composites made from WFPC2 images taken with blue (4445 Angstroms) and red (8269 Angstroms) filters, and NICMOS images taken in the infrared (16,000 Angstroms). They were taken in June, July, and August of 1997. Credits for the ground-based images: Allan Sandage (The Observatories of the Carnegie Institution of Washington) and John Bedke (Computer Sciences Corporation and the Space Telescope Science Institute) Credits for WFPC2 and NICMOS composites: NASA, ESA, and Reynier Peletier (University of Nottingham, United Kingdom)
Spitzer Makes 'Invisible' Visible
NASA Technical Reports Server (NTRS)
2004-01-01
Hidden behind a shroud of dust in the constellation Cygnus is a stellar nursery called DR21, which is giving birth to some of the most massive stars in our galaxy. Visible light images reveal no trace of this interstellar cauldron because of heavy dust obscuration. In fact, visible light is attenuated in DR21 by a factor of more than 10,000,000,000,000,000,000,000,000,000,000,000,000,000 (ten thousand trillion heptillion). New images from NASA's Spitzer Space Telescope allow us to peek behind the cosmic veil and pinpoint one of the most massive natal stars yet seen in our Milky Way galaxy. The never-before-seen star is 100,000 times as bright as the Sun. Also revealed for the first time is a powerful outflow of hot gas emanating from this star and bursting through a giant molecular cloud. The colorful image is a large-scale composite mosaic assembled from data collected at a variety of different wavelengths. Views at visible wavelengths appear blue, near-infrared light is depicted as green, and mid-infrared data from the InfraRed Array Camera (IRAC) aboard NASA's Spitzer Space Telescope is portrayed as red. The result is a contrast between structures seen in visible light (blue) and those observed in the infrared (yellow and red). A quick glance shows that most of the action in this image is revealed to the unique eyes of Spitzer. The image covers an area about two times that of a full moon.
Vacuum-Compatible Wideband White Light and Laser Combiner Source System
NASA Technical Reports Server (NTRS)
Azizi, Alineza; Ryan, Daniel J.; Tang, Hong; Demers, Richard T.; Kadogawa, Hiroshi; An, Xin; Sun, George Y.
2010-01-01
For the Space Interferometry Mission (SIM) Spectrum Calibration Development Unit (SCDU) testbed, wideband white light is used to simulate starlight. The white light source mount requires extremely stable pointing accuracy (<3.2 microradians). To meet this and other needs, the laser light from a single-mode fiber was combined, through a beam splitter window with a special coating for broadband wavelengths, with light from a multimode fiber. Both lights were coupled into a photonic crystal fiber (PCF). In many optical systems, simulating a point star with a broadband spectrum and microradian stability for white light interferometry is a challenge. In this case, the cameras use the white light interference to balance two optical paths and to maintain close tracking. In order to coarsely align the optical paths, a laser light is sent into the system to allow tracking of fringes, because a narrow-band laser has a much larger range of interference. The design requirements forced the innovators to use a new type of optical fiber and to take great care in aligning the input sources. The testbed required better than 1% throughput, or enough output power on the lowest spectrum to be detectable by the CCD camera (6 nW at the camera). The system needed to be vacuum-compatible and to have the capability for combining a visible laser light at any time for calibration purposes. The red laser is a commercially produced 635-nm, 5-mW laser diode, and the white light source is a commercially produced tungsten halogen lamp that gives a broad spectrum of about 525 to 800 nm full width at half maximum (FWHM), with about 1.4 mW of power at 630 nm. A custom-made beam splitter window with a special coating for broadband wavelengths is used with the white light input via a 50-µm multimode fiber. The large mode area PCF is an LMA-8 made by Crystal Fibre (core diameter of 8.5 µm, mode field diameter of 6 µm, and numerical aperture at 625 nm of 0.083). Any science interferometer that needs a tracking laser fringe to assist in alignment can use this system.
Why do veins appear blue? A new look at an old question
NASA Astrophysics Data System (ADS)
Kienle, Alwin; Hibst, Raimund; Steiner, Rudolf; Lilge, Lothar; Vitkin, I. Alex; Wilson, Brian C.; Patterson, Michael S.
1996-03-01
We investigate why vessels that contain blood, which has a red or dark red color, may look bluish in human tissue. A CCD camera was used to make images of diffusely reflected light at different wavelengths. Measurements of reflectance due to model blood vessels in scattering media and of human skin containing a prominent vein are presented. Monte Carlo simulations were used to calculate the spatially resolved diffuse reflectance for both situations. We show that the color of blood vessels is determined by (i) the scattering and absorption characteristics of skin at different wavelengths, (ii) the oxygenation state of blood, which affects its absorption properties, (iii) the diameter and the depth of the vessels, and (iv) the visual perception process.
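The paper's analysis rests on Monte Carlo simulations; the toy Beer-Lambert sketch below only illustrates the qualitative point that red light, which penetrates deeper into skin than blue, is attenuated more strongly by a buried vessel, so the vessel looks relatively bluish against the surrounding skin. All coefficients and depths are rough illustrative assumptions.

```python
# Toy single-backscatter model: reflectance over a vessel relative to plain skin.
import numpy as np

mu_eff = {"blue_450nm": 6.0, "red_650nm": 1.5}      # skin effective attenuation, 1/mm (assumed)
mu_blood = {"blue_450nm": 30.0, "red_650nm": 15.0}  # blood absorption, 1/mm (assumed)
vessel_depth, vessel_diameter = 0.5, 0.8            # mm (assumed)

for band in ("blue_450nm", "red_650nm"):
    # Fraction of back-scattered light that ever reaches the vessel depth (down and back up).
    reach_vessel = np.exp(-2 * mu_eff[band] * vessel_depth)
    # Light that reaches the vessel is additionally absorbed by the blood column.
    reflect_over_vessel = (1 - reach_vessel) + reach_vessel * np.exp(-mu_blood[band] * vessel_diameter)
    print(f"{band}: reflectance over vessel / over plain skin = {reflect_over_vessel:.3f}")
# Red reflectance drops much more than blue over the vessel, so the vein looks bluish.
```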
ERIC Educational Resources Information Center
Jeppsson, Fredrik; Frejd, Johanna; Lundmark, Frida
2017-01-01
This study focuses on investigating how students make use of their bodily experiences in combination with infrared (IR) cameras, as a way to make meaning in learning about heat, temperature, and friction. A class of 20 primary students (age 7-8 years), divided into three groups, took part in three IR camera laboratory experiments. The qualitative…
NASA Astrophysics Data System (ADS)
Chen, Shih-Hao; Chow, Chi-Wai
2015-01-01
A multiple-input and multiple-output (MIMO) scheme can extend the transmission capacity of light-emitting-diode (LED) based visible light communication (VLC) systems. A MIMO VLC system that uses the mobile-phone camera as the optical receiver (Rx) to receive the MIMO signal from an n×n Red-Green-Blue (RGB) LED array is desirable. The key step in decoding this signal is to detect the signal direction. If the LED transmitter (Tx) is rotated, the Rx may not be aware of the rotation and transmission errors can occur. In this work, we propose and demonstrate a novel hierarchical transmission scheme which can reduce the computational complexity of rotation detection in an LED array VLC system. We use an n×n RGB LED array as the MIMO Tx. A novel two-dimensional Hadamard coding scheme is proposed. By using the different LED color layers to indicate the rotation, a low-complexity rotation detection method can be used to improve the quality of the received signal. The detection correction rate is above 95% at indoor usage distances. Experimental results confirm the feasibility of the proposed scheme.
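A minimal sketch of constructing 2D Hadamard-derived on/off patterns for an n×n LED array via outer products of Hadamard rows. It shows the general coding idea only and makes no claim to reproduce the authors' hierarchical scheme; the array size is an assumption.

```python
# Minimal sketch: 2D Hadamard-derived on/off patterns for an n-by-n LED array.
import numpy as np
from scipy.linalg import hadamard

n = 8                                        # LED array is n x n (assumed size)
H = hadamard(n)                              # entries are +1 / -1, rows orthogonal

def led_pattern(row_idx, col_idx):
    """2D pattern from two Hadamard rows, mapped to 0 (off) / 1 (on)."""
    pattern = np.outer(H[row_idx], H[col_idx])
    return (pattern + 1) // 2                # -1 -> 0, +1 -> 1

p = led_pattern(2, 5)
print(p)
# Comparing a received pattern with rotated copies of the transmitted one is one
# simple way to check orientation; here we just show the comparison itself.
print("agreement with its own 90-degree rotation:", np.mean(p == np.rot90(p)))
```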
Instrumentation in Developing Chlorophyll Fluorescence Biosensing: A Review
Fernandez-Jaramillo, Arturo A.; Duarte-Galvan, Carlos; Contreras-Medina, Luis M.; Torres-Pacheco, Irineo; de J. Romero-Troncoso, Rene; Guevara-Gonzalez, Ramon G.; Millan-Almaraz, Jesus R.
2012-01-01
Chlorophyll fluorescence can be defined as the red and far-red light emitted by photosynthetic tissue when it is excited by a light source. This is an important phenomenon that permits investigators to obtain valuable information about the state of health of a photosynthetic sample. This article reviews the current state-of-the-art knowledge regarding the design of new chlorophyll fluorescence sensing systems, providing appropriate information about processes, instrumentation and electronic devices. These types of systems and applications can be created to determine both comfort conditions and current problems within a given subject. The procedure to measure chlorophyll fluorescence is commonly split into two main parts; the first involves chlorophyll excitation, for which there are passive or active methods. The second part of the procedure is to closely measure the chlorophyll fluorescence response with specialized instrumentation systems. Such systems utilize several methods, each with different characteristics regarding cost, resolution, ease of processing or portability. These methods for the most part include cameras, photodiodes and satellite images. PMID:23112686
Omega Centauri Looks Radiant in Infrared
NASA Technical Reports Server (NTRS)
2008-01-01
[Figure removed for brevity, see original site: Poster Version]
A cluster brimming with millions of stars glistens like an iridescent opal in this image from NASA's Spitzer Space Telescope. Called Omega Centauri, the sparkling orb of stars is like a miniature galaxy. It is the biggest and brightest of the 150 or so similar objects, called globular clusters, that orbit around the outside of our Milky Way galaxy. Stargazers at southern latitudes can spot the stellar gem with the naked eye in the constellation Centaurus. Globular clusters are some of the oldest objects in our universe. Their stars are over 12 billion years old, and, in most cases, formed all at once when the universe was just a toddler. Omega Centauri is unusual in that its stars are of different ages and possess varying levels of metals, or elements heavier than boron. Astronomers say this points to a different origin for Omega Centauri than other globular clusters: they think it might be the core of a dwarf galaxy that was ripped apart and absorbed by our Milky Way long ago. In this new view of Omega Centauri, Spitzer's infrared observations have been combined with visible-light data from the National Science Foundation's Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory in Chile. Visible-light data with a wavelength of .55 microns is colored blue, 3.6-micron infrared light captured by Spitzer's infrared array camera is colored green and 24-micron infrared light taken by Spitzer's multiband imaging photometer is colored red. Where green and red overlap, the color yellow appears. Thus, the yellow and red dots are stars revealed by Spitzer. These stars, called red giants, are more evolved, larger and dustier. The stars that appear blue were spotted in both visible and 3.6-micron-, or near-, infrared light. They are less evolved, like our own sun. Some of the red spots in the picture are distant galaxies beyond our own. Spitzer found very little dust around any but the most luminous, coolest red giants, implying that the dimmer red giants do not form significant amounts of dust. The space between the stars in Omega Centauri was also found to lack dust, which means the dust is rapidly destroyed or leaves the cluster.
Non-contact measurement of pulse wave velocity using RGB cameras
NASA Astrophysics Data System (ADS)
Nakano, Kazuya; Aoki, Yuta; Satoh, Ryota; Hoshi, Akira; Suzuki, Hiroyuki; Nishidate, Izumi
2016-03-01
Non-contact measurement of pulse wave velocity (PWV) using red, green, and blue (RGB) digital color images is proposed. Generally, PWV is used as an index of arteriosclerosis. In our method, changes in blood volume are calculated from changes in the color information and are estimated by combining multiple regression analysis (MRA) with a Monte Carlo simulation (MCS) model of light transport in human skin. Pulse waves were measured at two sites on human skin using RGB cameras, and the PWV was calculated from the difference in pulse transit time and the distance between the two measurement points. The measured forehead-finger PWV (ffPWV) was on the order of m/s and became faster as the values of vital signs rose. These results demonstrate the feasibility of this method.
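A minimal sketch of the final PWV computation: estimate the pulse transit time between two skin sites by cross-correlation and divide the path length by it. The sampling rate, distance and synthetic waveforms are illustrative assumptions.

```python
# Minimal sketch: pulse transit time by cross-correlation, then PWV = distance / time.
import numpy as np

fs = 100.0                                   # camera frame rate (assumed), frames/s
distance_m = 1.0                             # forehead-to-finger path length (assumed)
t = np.arange(0, 10, 1 / fs)

forehead = np.sin(2 * np.pi * 1.2 * t)                    # ~72 bpm synthetic pulse wave
finger = np.sin(2 * np.pi * 1.2 * (t - 0.12))             # same wave, 120 ms later

lags = np.arange(-len(t) + 1, len(t))
xcorr = np.correlate(finger - finger.mean(), forehead - forehead.mean(), mode="full")
transit_time = lags[np.argmax(xcorr)] / fs                 # seconds
print(f"estimated PWV: {distance_m / transit_time:.2f} m/s")
```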
Impact Site: Cassini's Final Image
2017-09-15
This monochrome view is the last image taken by the imaging cameras on NASA's Cassini spacecraft. It looks toward the planet's night side, lit by reflected light from the rings, and shows the location at which the spacecraft would enter the planet's atmosphere hours later. A natural color view, created using images taken with red, green and blue spectral filters, is also provided (Figure 1). The imaging cameras obtained this view at approximately the same time that Cassini's visual and infrared mapping spectrometer made its own observations of the impact area in the thermal infrared. This location -- the site of Cassini's atmospheric entry -- was at this time on the night side of the planet, but would rotate into daylight by the time Cassini made its final dive into Saturn's upper atmosphere, ending its remarkable 13-year exploration of Saturn. The view was acquired on Sept. 14, 2017 at 19:59 UTC (spacecraft event time). The view was taken in visible light using the Cassini spacecraft wide-angle camera at a distance of 394,000 miles (634,000 kilometers) from Saturn. Image scale is about 11 miles (17 kilometers). The original image has a size of 512x512 pixels. A movie is available at https://photojournal.jpl.nasa.gov/catalog/PIA21895
Estimation of red-light running frequency using high-resolution traffic and signal data.
Chen, Peng; Yu, Guizhen; Wu, Xinkai; Ren, Yilong; Li, Yueguang
2017-05-01
Red-light-running (RLR) emerges as a major cause that may lead to intersection-related crashes and endanger intersection safety. To reduce RLR violations, it is critical to identify the influential factors associated with RLR and estimate RLR frequency. Without resorting to video camera recordings, this study investigates this important issue by utilizing high-resolution traffic and signal event data collected from loop detectors at five intersections on Trunk Highway 55, Minneapolis, MN. First, a simple method is proposed to identify RLR by fully utilizing the information obtained from stop bar detectors, downstream entrance detectors and advance detectors. Using 12 months of event data, a total of 6550 RLR cases were identified. Defining RLR frequency as the conditional probability of RLR under a certain traffic or signal condition (veh/1000veh), the relationships between RLR frequency and some influential factors including arrival time at the advance detector, approaching speed, headway, gap to the preceding vehicle in the adjacent lane, cycle length, geometric characteristics and even snowy weather were empirically investigated. Statistical analysis shows good agreement with traffic engineering practice; e.g., RLR is most likely to occur on weekdays during peak periods under large traffic demands and longer signal cycles, and a total of 95.24% of RLR events occurred within the first 1.5 s after the onset of the red phase. The findings confirmed that vehicles tend to run the red light when they are close to the intersection during the phase transition, and that vehicles following a leading vehicle with short headways are also likely to run the red light. Last, a simplified nonlinear regression model is proposed to estimate RLR frequency based on the data from the advance detector. The study is expected to help better understand RLR occurrence and further contribute to future improvement of intersection safety. Copyright © 2017 Elsevier Ltd. All rights reserved.
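A highly simplified sketch of flagging an RLR event from high-resolution detector logs, in the spirit of the identification method the abstract mentions: a stop-bar actuation during red that is confirmed shortly afterwards at the downstream entrance detector. The record format, time window and sample values are invented for illustration and are not the paper's rules.

```python
# Minimal sketch: flag a red-light-running event from detector actuation times.
from dataclasses import dataclass

@dataclass
class Actuation:
    detector: str      # "stop_bar" or "downstream"
    time_s: float      # seconds since the onset of the red phase

def is_rlr(events, max_travel_time_s=3.0):
    """Return True if a stop-bar hit during red is confirmed downstream soon after."""
    stop_hits = [e.time_s for e in events if e.detector == "stop_bar" and e.time_s >= 0]
    down_hits = [e.time_s for e in events if e.detector == "downstream"]
    return any(0 < d - s <= max_travel_time_s for s in stop_hits for d in down_hits)

sample = [Actuation("stop_bar", 0.8), Actuation("downstream", 1.9)]
print(is_rlr(sample))   # True: crossed the stop bar 0.8 s into red, confirmed downstream
```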
Sunset Sequence in Mars Gale Crater Animation
2015-05-08
NASA's Curiosity Mars rover recorded this sequence of views of the sun setting at the close of the mission's 956th Martian day, or sol (April 15, 2015), from the rover's location in Gale Crater. The four images shown in sequence here were taken over a span of 6 minutes, 51 seconds. This was the first sunset observed in color by Curiosity. The images come from the left-eye camera of the rover's Mast Camera (Mastcam). The color has been calibrated and white-balanced to remove camera artifacts. Mastcam sees color very similarly to what human eyes see, although it is actually a little less sensitive to blue than people are. Dust in the Martian atmosphere has fine particles that permit blue light to penetrate the atmosphere more efficiently than longer-wavelength colors. That causes the blue colors in the mixed light coming from the sun to stay closer to the sun's part of the sky, compared to the wider scattering of yellow and red colors. The effect is most pronounced near sunset, when light from the sun passes through a longer path in the atmosphere than it does at mid-day. Malin Space Science Systems, San Diego, built and operates the rover's Mastcam. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, manages the Mars Science Laboratory Project for NASA's Science Mission Directorate, Washington. JPL designed and built the project's Curiosity rover. http://photojournal.jpl.nasa.gov/catalog/PIA19401
Wright, Timothy J; Vitale, Thomas; Boot, Walter R; Charness, Neil
2015-12-01
Recent empirical evidence has suggested that the flashes associated with red light running cameras (RLRCs) distract younger drivers, pulling attention away from the roadway and delaying processing of safety-relevant events. Considering the perceptual and attentional declines that occur with age, older drivers may be especially susceptible to the distracting effects of RLRC flashes, particularly in situations in which the flash is more salient (a bright flash at night compared with the day). The current study examined how age and situational factors potentially influence attention capture by RLRC flashes using covert (cuing effects) and overt (eye movement) indices of capture. We manipulated the salience of the flash by varying its luminance and contrast with respect to the background of the driving scene (either day or night scenes). Results of 2 experiments suggest that simulated RLRC flashes capture observers' attention, but, surprisingly, no age differences in capture were observed. However, an analysis examining early and late eye movements revealed that older adults may have been strategically delaying their eye movements in order to avoid capture. Additionally, older adults took longer to disengage attention following capture, suggesting at least 1 age-related disadvantage in capture situations. Findings have theoretical implications for understanding age differences in attention capture, especially with respect to capture in real-world scenes, and inform future work that should examine how the distracting effects of RLRC flashes influence driver behavior. (c) 2015 APA, all rights reserved.
2009-04-30
This image from NASA's Spitzer Space Telescope shows the spiral galaxy NGC 2841, located about 46 million light-years from Earth in the constellation Ursa Major. The galaxy is helping astronomers solve one of the oldest puzzles in astronomy: Why do galaxies look so smooth, with stars sprinkled evenly throughout? An international team of astronomers has discovered that rivers of young stars flow from their hot, dense stellar nurseries, dispersing out to form large, smooth distributions. This image is a composite of three different wavelengths from Spitzer's infrared array camera. The shortest wavelengths are displayed in blue, and mostly show the older stars in NGC 2841, as well as foreground stars in our own Milky Way galaxy. The cooler areas are highlighted in red, and show the dusty, gaseous regions of the galaxy. Blue shows infrared light of 3.6 microns, green represents 4.5-micron light and red, 8.0-micron light. The contribution from starlight measured at 3.6 microns has been subtracted from the 8.0-micron data to enhance the visibility of the dust features. http://photojournal.jpl.nasa.gov/catalog/PIA12001
NASA Technical Reports Server (NTRS)
2009-01-01
This figure charts 30 hours of observations taken by NASA's Spitzer Space Telescope of a strongly irradiated exoplanet (a planet orbiting a star beyond our own). Spitzer measured changes in the planet's heat, or infrared light. The lower graph shows precise measurements of infrared light with a wavelength of 8 microns coming from the HD 80606 stellar system. The system consists of a sun-like star and a planetary companion on an extremely eccentric, comet-like orbit. The geometry of the planet-star encounter is shown in the upper part of the figure. As the planet swung through its closest approach to the star, the Spitzer observations indicated that it experienced very rapid heating (as shown by the red curve). Just before close approach, the planet was eclipsed by the star as seen from Earth, allowing astronomers to determine the amount of energy coming from the planet in comparison to the amount coming from the star. The observations were made in Nov. of 2007, using Spitzer's infrared array camera. They represent a significant first for astronomers, opening the door to studying changes in atmospheric conditions of planets far beyond our own solar system.
Ohsaki, Yoshinobu; Sasaki, Takaaki; Endo, Satoshi; Kitada, Masahiro; Okumura, Shunsuke; Hirai, Noriko; Kazebayashi, Yoshihiro; Toyoshima, Eri; Yamamoto, Yasushi; Takeyama, Kaneyoshi; Nakajima, Susumu; Sakata, Isao
2017-04-26
We observed red autofluorescence emanating from bronchial cancer lesions using a sensitive color-fluorescence endoscopy system and investigated the origin of this red autofluorescence. The wavelengths of the red autofluorescence emanating from lesions were measured in eight patients using a spectrum analyzer and compared based on pathologic findings. Red autofluorescence at 617.3, 617.4, 619.0, and 617.1 nm was emitted by normal bronchus, inflamed tissue, tissue exhibiting mild dysplasia, and malignant lesions, respectively. Protoporphyrin, uroporphyrin, and coproporphyrin, the major porphyrin derivatives in human blood, were purchased to determine which porphyrin derivative is the source of the red fluorescence when acquired de novo. We synthesized photoprotoporphyrin, Zn-protoporphyrin and Zn-photoprotoporphyrin from protoporphyrin. Coproporphyrin and uroporphyrin emitted only weak fluorescence. Fluorescence was emitted by our synthesized Zn-photoprotoporphyrin at 625.5 nm and by photoprotoporphyrin at 664.0 nm. From these results, we conclude that Zn-photoprotoporphyrin was the source of the red autofluorescence observed in bronchial lesions. Zn-protoporphyrin is converted to Zn-photoprotoporphyrin by irradiation with excitation light. Our results suggest that red autofluorescence emanating from Zn-photoprotoporphyrin in human tissues could interfere with photodynamic diagnosis using porphyrin derivatives such as Photofrin® and Lazerphyrin® with a sensitive endoscopy system, because color cameras cannot differentiate Zn-photoprotoporphyrin red fluorescence from that of other porphyrin derivatives.
Perfect Lighting for Facial Photography in Aesthetic Surgery: Ring Light.
Dölen, Utku Can; Çınar, Selçuk
2016-04-01
Photography is indispensable for plastic surgery. On-camera flashes can result in bleached-out detail and colour. This is why most plastic surgery clinics prefer studio lighting similar to that used by professional photographers. In this article, we want to share a simple alternative to studio lighting that does not need extra space: the ring light. We took five different photographs of the same person with five different camera and lighting settings: smartphone and ring light; point-and-shoot camera and on-camera flash; point-and-shoot camera and studio lighting; digital single-lens reflex (DSLR) camera and studio lighting; and DSLR and ring light. Those photographs were then assessed objectively with an online survey of five questions answered by three distinct populations: plastic surgeons (n: 28), professional portrait photographers (n: 24) and patients (n: 22) who had undergone facial aesthetic procedures. Compared to the on-camera flash, studio lighting better showed the wrinkles of the subject. The ring light facilitated the perception of the wrinkles by providing homogenous soft light in a circular shape rather than bursting flashes. The combination of a DSLR camera and ring light gave the oldest-looking subject according to 64 % of responders. The DSLR camera and the studio lighting demonstrated the youngest-looking subject according to 70 % of the responders. The majority of the responders (78 %) chose the combination of DSLR camera and ring light as the one that exhibited the wrinkles the most. We suggest using a ring light to obtain well-lit photographs without loss of detail, with any type of camera. However, smartphones must be avoided if standard pictures are desired. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.
Martian Soil Ready for Robotic Laboratory Analysis
NASA Technical Reports Server (NTRS)
2008-01-01
NASA's Phoenix Mars Lander scooped up this Martian soil on the mission's 11th Martian day, or sol, after landing (June 5, 2008) as the first soil sample for delivery to the laboratory on the lander deck. The material includes a light-toned clod, possibly from the crusted surface of the ground, similar in appearance to clods observed near a foot of the lander. This approximately true-color view of the contents of the scoop on the Robotic Arm comes from combining separate images taken by the Robotic Arm Camera on Sol 11, using illumination by red, green and blue light-emitting diodes on the camera. The scoop loaded with this sample was poised over an open sample-delivery door of the Thermal and Evolved-Gas Analyzer at the end of Sol 11, ready to be dumped into the instrument on the next sol. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
HUBBLE UNVEILS A GALAXY IN LIVING COLOR
NASA Technical Reports Server (NTRS)
2002-01-01
In this view of the center of the magnificent barred spiral galaxy NGC 1512, NASA Hubble Space Telescope's broad spectral vision reveals the galaxy at all wavelengths from ultraviolet to infrared. The colors (which indicate differences in light intensity) map where newly born star clusters exist in both 'dusty' and 'clean' regions of the galaxy. This color-composite image was created from seven images taken with three different Hubble cameras: the Faint Object Camera (FOC), the Wide Field and Planetary Camera 2 (WFPC2), and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS). NGC 1512 is a barred spiral galaxy in the southern constellation of Horologium. Located 30 million light-years away, relatively 'nearby' as galaxies go, it is bright enough to be seen with amateur telescopes. The galaxy spans 70,000 light-years, nearly as much as our own Milky Way galaxy. The galaxy's core is unique for its stunning 2,400 light-year-wide circle of infant star clusters, called a 'circumnuclear' starburst ring. Starbursts are episodes of vigorous formation of new stars and are found in a variety of galaxy environments. Taking advantage of Hubble's sharp vision, as well as its unique wavelength coverage, a team of Israeli and American astronomers performed one of the broadest and most detailed studies ever of such star-forming regions. The results, which will be published in the June issue of the Astronomical Journal, show that in NGC 1512 newly born star clusters exist in both dusty and clean environments. The clean clusters are readily seen in ultraviolet and visible light, appearing as bright, blue clumps in the image. However, the dusty clusters are revealed only by the glow of the gas clouds in which they are hidden, as detected in red and infrared wavelengths by the Hubble cameras. This glow can be seen as red light permeating the dark, dusty lanes in the ring. 'The dust obscuration of clusters appears to be an on-off phenomenon,' says Dan Maoz, who headed the collaboration. 'The clusters are either completely hidden, enshrouded in their birth clouds, or almost completely exposed.' The scientists believe that stellar winds and powerful radiation from the bright, newly born stars have cleared away the original natal dust cloud in a fast and efficient 'cleansing' process. Aaron Barth, a co-investigator on the team, adds: 'It is remarkable how similar the properties of this starburst are to those of other nearby starbursts that have been studied in detail with Hubble.' This similarity gives the astronomers the hope that, by understanding the processes occurring in nearby galaxies, they can better interpret observations of very distant and faint starburst galaxies. Such distant galaxies formed the first generations of stars, when the universe was a fraction of its current age. Circumnuclear star-forming rings are common in the universe. Such rings within barred spiral galaxies may in fact comprise the most numerous class of nearby starburst regions. Astronomers generally believe that the giant bar funnels the gas to the inner ring, where stars are formed within numerous star clusters. Studies like this one emphasize the need to observe at many different wavelengths to get the full picture of the processes taking place.
Spitzer Makes Invisible Visible
2004-04-13
Hidden behind a shroud of dust in the constellation Cygnus is a stellar nursery called DR21, which is giving birth to some of the most massive stars in our galaxy. Visible light images reveal no trace of this interstellar cauldron because of heavy dust obscuration. In fact, visible light is attenuated in DR21 by a factor of more than 10,000,000,000,000,000,000,000,000,000,000,000,000,000 (ten thousand trillion heptillion). New images from NASA's Spitzer Space Telescope allow us to peek behind the cosmic veil and pinpoint one of the most massive natal stars yet seen in our Milky Way galaxy. The never-before-seen star is 100,000 times as bright as the Sun. Also revealed for the first time is a powerful outflow of hot gas emanating from this star and bursting through a giant molecular cloud. The colorful image is a large-scale composite mosaic assembled from data collected at a variety of different wavelengths. Views at visible wavelengths appear blue, near-infrared light is depicted as green, and mid-infrared data from the InfraRed Array Camera (IRAC) aboard NASA's Spitzer Space Telescope is portrayed as red. The result is a contrast between structures seen in visible light (blue) and those observed in the infrared (yellow and red). A quick glance shows that most of the action in this image is revealed to the unique eyes of Spitzer. The image covers an area about two times that of a full moon. http://photojournal.jpl.nasa.gov/catalog/PIA05734
NASA Astrophysics Data System (ADS)
Liu, L.; Huang, Zh.; Qiu, Zh.; Li, B.
2018-01-01
A handheld RGB camera was developed to monitor the in vivo distribution of the porphyrin-based photosensitizer (PS) hematoporphyrin monomethyl ether (HMME) in blood vessels during photodynamic therapy (PDT). The focal length, f-number, International Organization for Standardization (ISO) sensitivity, and shutter speed of the camera were optimized for solution samples with various HMME concentrations. After the parameter optimization, it was found that the red intensity value of the fluorescence image was linearly related to the fluorescence intensity under the investigated conditions. The RGB camera was then used to monitor the in vivo distribution of HMME in blood vessels in a skin-fold window chamber model. The red intensity value of the recorded RGB fluorescence image was found to be linearly correlated to HMME concentrations in the range 0-24 μM. Significant differences in the red to green intensity ratios were observed between the blood vessels and the surrounding tissue.
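A minimal sketch of the kind of red-channel calibration the abstract describes, assuming the fluorescence frames are available as RGB NumPy arrays and the HMME concentrations of the calibration solutions are known; the function names and the synthetic numbers below are illustrative placeholders, not the authors' data or code.

```python
import numpy as np

def mean_red_intensity(rgb_image, roi=None):
    """Average red-channel value, optionally restricted to a (row_slice, col_slice) ROI."""
    red = rgb_image[..., 0].astype(float)
    if roi is not None:
        red = red[roi]
    return red.mean()

def fit_linear_calibration(concentrations_uM, red_means):
    """Least-squares line red = a * concentration + b (valid only in the linear range)."""
    a, b = np.polyfit(concentrations_uM, red_means, 1)
    return a, b

# Synthetic demo (placeholder numbers, not measurements from the paper).
rng = np.random.default_rng(0)
concs = np.array([0.0, 6.0, 12.0, 18.0, 24.0])   # micromolar
images = []
for c in concs:
    im = np.zeros((64, 64, 3))
    im[..., 0] = np.clip(rng.normal(10 + 8 * c, 2, (64, 64)), 0, 255)  # red tracks [HMME]
    im[..., 1] = rng.normal(12, 2, (64, 64))                           # green roughly constant
    images.append(im)

reds = [mean_red_intensity(im) for im in images]
slope, offset = fit_linear_calibration(concs, reds)
print(f"red = {slope:.2f} * [HMME] + {offset:.2f}")

# Red-to-green ratio, the quantity used to separate vessels from surrounding tissue.
ratio = images[-1][..., 0].mean() / max(images[-1][..., 1].mean(), 1e-6)
print(f"red/green intensity ratio: {ratio:.2f}")
```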
Snowstorm Along the China-Mongolia-Russia Borders
NASA Technical Reports Server (NTRS)
2004-01-01
Heavy snowfall on March 12, 2004, across north China's Inner Mongolia Autonomous Region, Mongolia and Russia, caused train and highway traffic to stop for several days along the Russia-China border. This pair of images from the Multi-angle Imaging SpectroRadiometer (MISR) highlights the snow and surface properties across the region on March 13. The left-hand image is a multi-spectral false-color view made from the near-infrared, red, and green bands of MISR's vertical-viewing (nadir) camera. The right-hand image is a multi-angle false-color view made from the red band data of the 46-degree aftward camera, the nadir camera, and the 46-degree forward camera. About midway between the frozen expanse of China's Hulun Nur Lake (along the right-hand edge of the images) and Russia's Torey Lakes (above image center) is a dark linear feature that corresponds with the China-Mongolia border. In the upper portion of the images, many small plumes of black smoke rise from coal and wood fires and blow toward the southeast over the frozen lakes and snow-covered grasslands. Along the upper left-hand portion of the images, in Russia's Yablonovyy mountain range and the Onon River Valley, the terrain becomes more hilly and forested. In the nadir image, vegetation appears in shades of red, owing to its high near-infrared reflectivity. In the multi-angle composite, open-canopy forested areas are indicated by green hues. Since this is a multi-angle composite, the green color arises not from the color of the leaves but from the architecture of the surface cover. The green areas appear brighter at the nadir angle than at the oblique angles because more of the snow-covered surface in the gaps between the trees is visible. Color variations in the multi-angle composite also indicate angular reflectance properties for areas covered by snow and ice. The light blue color of the frozen lakes is due to the increased forward scattering of smooth ice, and light orange colors indicate rougher ice or snow, which scatters more light in the backward direction. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire Earth between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 22525. The panels cover an area of about 355 kilometers x 380 kilometers, and utilize data from blocks 50 to 52 within World Reference System-2 path 126. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
NASA Astrophysics Data System (ADS)
Yu, Liping; Pan, Bing
2017-08-01
Full-frame, high-speed 3D shape and deformation measurement using the stereo-digital image correlation (stereo-DIC) technique and a single high-speed color camera is proposed. With the aid of a skillfully designed pseudo stereo-imaging apparatus, color images of a test object surface, composed of blue and red channel images from two different optical paths, are recorded by a high-speed color CMOS camera. The recorded color images can be separated into red and blue channel sub-images using a simple but effective color crosstalk correction method. These separated blue and red channel sub-images are processed by the regular stereo-DIC method to retrieve full-field 3D shape and deformation on the test object surface. Compared with existing two-camera high-speed stereo-DIC or four-mirror-adapter-assisted single-camera high-speed stereo-DIC, the proposed single-camera high-speed stereo-DIC technique offers the prominent advantage of full-frame measurements using a single high-speed camera without sacrificing its spatial resolution. Two real experiments, including shape measurement of a curved surface and vibration measurement of a Chinese double-side drum, demonstrated the effectiveness and accuracy of the proposed technique.
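A minimal sketch of the colour crosstalk correction step, under the assumption that the leakage between the red and blue channels can be modelled as a constant 2x2 mixing matrix estimated beforehand from single-path calibration images; the matrix values and names below are placeholders, not the paper's actual method.

```python
import numpy as np

def correct_crosstalk(red_obs, blue_obs, mixing):
    """Recover the two optical-path images from the observed red/blue channels.

    mixing[i, j] is the fraction of path j's light recorded in channel i,
    estimated beforehand by imaging each optical path separately.
    """
    unmix = np.linalg.inv(mixing)
    stacked = np.stack([red_obs.ravel(), blue_obs.ravel()])   # 2 x Npix
    separated = unmix @ stacked
    red_true = separated[0].reshape(red_obs.shape)
    blue_true = separated[1].reshape(blue_obs.shape)
    return red_true, blue_true

# Placeholder mixing matrix: each channel picks up 8% of the other path's light.
mixing = np.array([[0.92, 0.08],
                   [0.08, 0.92]])

rng = np.random.default_rng(1)
path_a = rng.random((4, 4))   # image seen through the "red" optical path
path_b = rng.random((4, 4))   # image seen through the "blue" optical path
red_obs = mixing[0, 0] * path_a + mixing[0, 1] * path_b
blue_obs = mixing[1, 0] * path_a + mixing[1, 1] * path_b

red_est, blue_est = correct_crosstalk(red_obs, blue_obs, mixing)
print(np.allclose(red_est, path_a), np.allclose(blue_est, path_b))  # True True
```

The separated channel images would then be passed to an ordinary stereo-DIC pipeline, exactly as two-camera images would be.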
NASA Technical Reports Server (NTRS)
2007-01-01
A promontory nicknamed 'Cape Verde' can be seen jutting out from the walls of Victoria Crater in this false-color picture taken by the panoramic camera on NASA's Mars Exploration Rover Opportunity. The rover took this picture on martian day, or sol, 1329 (Oct. 20, 2007), more than a month after it began descending the crater walls -- and just 9 sols shy of its second Martian birthday on sol 1338 (Oct. 29, 2007). Opportunity landed on the Red Planet on Jan. 25, 2004. That's nearly four years ago on Earth, but only two on Mars because Mars takes longer to travel around the sun than Earth. One Martian year equals 687 Earth days. This view was taken using three panoramic-camera filters, admitting light with wavelengths centered at 750 nanometers (near infrared), 530 nanometers (green) and 430 nanometers (violet).
2001-10-01
Auroras are caused when high-energy electrons pour down from the Earth's magnetosphere and collide with atoms. Red aurora, as captured here by a still digital camera aboard the International Space Station (ISS), occurs from 200 km to as high as 500 km altitude and is caused by the emission of 6300 Angstrom wavelength light from oxygen atoms. The light is emitted when the atoms return to their original unexcited state. The white spot in the image is from a light on inside the ISS, reflected off the inside of the window. The pale blue arch on the left side of the frame is sunlight reflecting off the atmospheric limb of the Earth. At times of peaks in solar activity, there are more geomagnetic storms, which increases the auroral activity viewed on Earth and by astronauts from orbit.
HUBBLE'S INFRARED GALAXY GALLERY
NASA Technical Reports Server (NTRS)
2002-01-01
Astronomers have used the NASA Hubble Space Telescope to produce an infrared 'photo essay' of spiral galaxies. By penetrating the dust clouds swirling around the centers of these galaxies, the telescope's infrared vision is offering fresh views of star birth. These six images, taken with the Near Infrared Camera and Multi-Object Spectrometer, showcase different views of spiral galaxies, from a face-on image of an entire galaxy to a close-up of a core. The top row shows spirals at diverse angles, from face-on, (left); to slightly tilted, (center); to edge-on, (right). The bottom row shows close-ups of the hubs of three galaxies. In these images, red corresponds to glowing hydrogen, the raw material for star birth. The red knots outlining the curving spiral arms in NGC 5653 and NGC 3593, for example, pinpoint rich star-forming regions where the surrounding hydrogen gas is heated by intense ultraviolet radiation from young, massive stars. In visible light, many of these regions can be hidden from view by the clouds of gas and dust in which they were born. The glowing hydrogen found inside the cores of these galaxies, as in NGC 6946, may be due to star birth; radiation from active galactic nuclei (AGN), which are powered by massive black holes; or a combination of both. White is light from middle-age stars. Clusters of stars appear as white dots, as in NGC 2903. The galaxy cores are mostly white because of their dense concentration of stars. The dark material seen in these images is dust. These galaxies are part of a Hubble census of about 100 spiral galaxies. Astronomers at Space Telescope Science Institute took these images to fill gaps in the scheduling of a campaign using the NICMOS-3 camera. The data were non-proprietary, and were made available to the entire astronomical community. Filters: Three filters were used: red, blue, and green. Red represents emission at the Paschen Alpha line (light from glowing hydrogen) at a wavelength of 1.87 microns. Blue shows the galaxies in near-infrared light, measured between 1.4 and 1.8 microns (H-band emission). Green is a mixture of the two. Distance of galaxies from Earth: NGC 5653 - 161 million light-years; NGC 3593 - 28 million light-years; NGC 891 - 24 million light-years; NGC 4826 - 19 million light-years; NGC 2903 - 25 million light-years; and NGC 6946 - 20 million light-years. Credits: Torsten Boeker, Space Telescope Science Institute, and NASA
Near-infrared fluorescence imaging with a mobile phone (Conference Presentation)
NASA Astrophysics Data System (ADS)
Ghassemi, Pejhman; Wang, Bohan; Wang, Jianting; Wang, Quanzeng; Chen, Yu; Pfefer, T. Joshua
2017-03-01
Mobile phone cameras employ sensors with near-infrared (NIR) sensitivity, yet this capability has not been exploited for biomedical purposes. Removing the IR-blocking filter from a phone-based camera opens the door to a wide range of techniques and applications for inexpensive, point-of-care biophotonic imaging and sensing. This study provides proof of principle for one of these modalities - phone-based NIR fluorescence imaging. An imaging system was assembled using a 780 nm light source along with excitation and emission filters with 800 nm and 825 nm cut-off wavelengths, respectively. Indocyanine green (ICG) was used as an NIR fluorescence contrast agent in an ex vivo rodent model, a resolution test target and a 3D-printed, tissue-simulating vascular phantom. Raw and processed images for red, green and blue pixel channels were analyzed for quantitative evaluation of fundamental performance characteristics including spectral sensitivity, detection linearity and spatial resolution. Mobile phone results were compared with a scientific CCD. The spatial resolution of the CCD system was consistently superior to that of the phone, and the green phone camera pixels showed better resolution than the other color channels. The CCD exhibited similar sensitivity to the processed red and blue pixel channels, yet a greater degree of detection linearity. Raw phone pixel data showed lower sensitivity but greater linearity than processed data. Overall, both qualitative and quantitative results provided strong evidence of the potential of phone-based NIR imaging, which may lead to a wide range of applications from cancer detection to glucose sensing.
NASA Technical Reports Server (NTRS)
2006-01-01
[Figure removed: Infrared Andromeda Galaxy (M31) poster; panels showing stars and dust.] This animation shows the Andromeda galaxy, first as seen in visible light by the National Optical Astronomy Observatory, then as seen in infrared by NASA's Spitzer Space Telescope. The visible-light image highlights the galaxy's population of about one trillion stars. The stars are so crammed into its core that this region blazes with bright starlight. In contrast, the false-colored Spitzer view reveals red waves of dust against a more tranquil sea of blue stars. The dust lanes can be seen twirling all the way into the galaxy's center. This dust is warmed by young stars and shines at infrared wavelengths, which are represented in red. The blue color signifies shorter-wavelength infrared light primarily from older stars. The Andromeda galaxy, also known affectionately by astronomers as Messier 31, is located 2.5 million light-years away in the constellation Andromeda. It is the closest major galaxy to the Milky Way, making it the ideal specimen for carefully examining the nature of galaxies. On a clear, dark night, the galaxy can be spotted with the naked eye as a fuzzy blob. Andromeda's entire disk spans about 260,000 light-years, which means that a light beam would take 260,000 years to travel from one end of the galaxy to the other. By comparison, the Milky Way is about 100,000 light-years across. When viewed from Earth, Andromeda occupies a portion of the sky equivalent to seven full moons. Because this galaxy is so large, the infrared images had to be stitched together out of about 3,000 separate Spitzer exposures. The light detected by Spitzer's infrared array camera at 3.6 and 4.5 microns is sensitive mostly to starlight and is shown in blue and green, respectively. The 8-micron light shows warm dust and is shown in red. The contribution from starlight has been subtracted from the 8-micron image to better highlight the dust structures. Note: The size of the Full-Res TIFF for the still image is 14772 samples x 4953 lines.
NASA Astrophysics Data System (ADS)
Schroeder, Walter; Schulze, Wolfram; Wetter, Thomas; Chen, Chi-Hsien
2008-08-01
Three-dimensional (3D) body surface reconstruction is an important field in health care. A popular method for this purpose is laser scanning. However, using Photometric Stereo (PS) to record lumbar lordosis and the surface contour of the back poses a viable alternative due to its lower costs and higher flexibility compared to laser techniques and other methods of three-dimensional body surface reconstruction. In this work, we extended the traditional PS method and proposed a new method for obtaining surface and volume data of a moving object. The principle of traditional Photometric Stereo uses at least three images of a static object taken under different light sources to obtain 3D information about the object. Instead of using normal light, the light sources in the proposed method consist of the three colors of the RGB color model: red, green and blue. A series of pictures taken with a video camera can then be separated into the different color channels. Each set of three images can then be used to calculate the surface normals as in traditional PS. This method removes the requirement, common to almost all other body surface reconstruction methods, that the imaged object be kept still. By placing two cameras on opposite sides of a moving object and lighting the object with the colored light, time-varying surface (4D) data can easily be calculated. The obtained information can be used in many medical fields such as rehabilitation, diabetes screening or orthopedics.
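The channel-separation idea maps directly onto the classic three-light photometric stereo solve. Below is a minimal sketch assuming a Lambertian surface and known unit light directions, with the R, G and B channels of one video frame standing in for the three conventional PS images; it is illustrative only and not the authors' implementation.

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """Estimate surface normals from >=3 images under known light directions.

    images: list of HxW grayscale arrays (here, the R, G, B channels of one colour frame).
    light_dirs: Nx3 array of unit vectors pointing toward each light source.
    Assumes a Lambertian surface: I = albedo * (N . L).
    """
    h, w = images[0].shape
    I = np.stack([im.reshape(-1) for im in images])      # N x (H*W) intensities
    L = np.asarray(light_dirs, dtype=float)              # N x 3 light directions
    G, *_ = np.linalg.lstsq(L, I, rcond=None)            # 3 x (H*W), G = albedo * N
    albedo = np.linalg.norm(G, axis=0)
    normals = G / np.maximum(albedo, 1e-8)
    return normals.T.reshape(h, w, 3), albedo.reshape(h, w)

# With an RGB video frame `frame` (HxWx3) lit by red, green and blue sources,
# the three channels play the role of the three classic PS exposures:
# normals, albedo = photometric_stereo(
#     [frame[..., 0], frame[..., 1], frame[..., 2]], light_dirs)
```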
NASA Astrophysics Data System (ADS)
Sakota, D.; Sakamoto, R.; Sobajima, H.; Yokoyama, N.; Yokoyama, Y.; Waguri, S.; Ohuchi, K.; Takatani, S.
2008-02-01
Cardiovascular devices such as heart-lung machines generate unphysiological levels of shear stress that damage red blood cells, leading to hemolysis. Diagnostic techniques for this cell damage, however, have not yet been established. In this study, time-resolved optical spectroscopy was applied to quantify red blood cell (RBC) damage caused by an extracorporeal circulation system. Experimentally, fresh porcine blood was subjected to varying degrees of shear stress in a rotary blood pump, followed by measurement of the time-resolved transmission characteristics using picosecond pulses at 651 nm. The optical energy propagated through the blood specimen was detected using a streak camera. The data were analyzed in terms of the mean cell volume (MCV) and mean cell hemoglobin concentration (MCHC), measured separately, versus the energy and propagation time of the light pulses. The results showed that as the circulation time increased, the MCV increased with a decrease in MCHC. It was speculated that the older RBCs, with smaller size and more fragile membranes, had been selectively destroyed by the shear stress. Time-resolved optical spectroscopy is a useful technique for quantifying RBC damage by measuring the energy and propagation time of ultra-short light pulses through blood.
Imaging camera system of OYGBR-phosphor-based white LED lighting
NASA Astrophysics Data System (ADS)
Kobashi, Katsuya; Taguchi, Tsunemasa
2005-03-01
The near-ultraviolet (nUV) white LED approach is analogous to three-color fluorescent lamp technology, which is based on the conversion of nUV radiation to visible light via the photoluminescence process in phosphor materials. The nUV light is not included in the white light generated by nUV-based white LED devices. This technology can thus provide a higher quality of white light than the blue-plus-YAG method. A typical device demonstrates white luminescence with Tc = 3,700 K, Ra > 93, K > 40 lm/W and chromaticity (x, y) = (0.39, 0.39). The orange, yellow, green and blue (OYGB) or orange, yellow, red, green and blue (OYRGB) device shows a luminescence spectrum broader than that of an RGB white LED and a better color rendering index. Such superior luminous characteristics could be useful for several kinds of endoscope applications. We have obtained excellent pictures of the digestive organs in a dog's stomach owing to the strong green component and high Ra.
Wide-field fluorescent microscopy on a cell-phone.
Zhu, Hongying; Yaglidere, Oguzhan; Su, Ting-Wei; Tseng, Derek; Ozcan, Aydogan
2011-01-01
We demonstrate wide-field fluorescent imaging on a cell-phone, using compact and cost-effective optical components that are mechanically attached to the existing camera unit of the cell-phone. Battery powered light-emitting diodes (LEDs) are used to side-pump the sample of interest using butt-coupling. The pump light is guided within the sample cuvette to excite the specimen uniformly. The fluorescent emission from the sample is then imaged with an additional lens that is put in front of the existing lens of the cell-phone camera. Because the excitation occurs through guided waves that propagate perpendicular to the detection path, an inexpensive plastic color filter is sufficient to create the dark-field background needed for fluorescent imaging. The imaging performance of this light-weight platform (~28 grams) is characterized with red and green fluorescent microbeads, achieving an imaging field-of-view of ~81 mm² and a spatial resolution of ~10 μm, which is enhanced through digital processing of the captured cell-phone images using compressive sampling based sparse signal recovery. We demonstrate the performance of this cell-phone fluorescent microscope by imaging labeled white-blood cells separated from whole blood samples as well as water-borne pathogenic protozoan parasites such as Giardia lamblia cysts.
Temperature-Sensitive Coating Sensor Based on Hematite
NASA Technical Reports Server (NTRS)
Bencic, Timothy J.
2011-01-01
A temperature-sensitive coating, based on hematite (iron III oxide), has been developed to measure surface temperature using spectral techniques. The hematite powder is added to a binder that allows the mixture to be painted on the surface of a test specimen. The coating dynamically changes its relative spectral makeup or color with changes in temperature. The color changes from a reddish-brown appearance at room temperature (25 C) to a black-gray appearance at temperatures around 600 C. The color change is reversible and repeatable with temperature cycling from low to high and back to low temperatures. Detection of the spectral changes can be recorded by different sensors, including spectrometers, photodiodes, and cameras. Using a priori information obtained through calibration experiments in known thermal environments, the color change can then be calibrated to yield accurate quantitative temperature information. Temperature information can be obtained at a point, or over an entire surface, depending on the type of equipment used for data acquisition. Because this innovation uses spectrophotometry principles of operation, rather than the current methods, which use photoluminescence principles, white light can be used for illumination rather than high-intensity short wavelength excitation. The generation of high-intensity white light (or potentially filtered long-wavelength light) is much easier, and is used more prevalently for photography and video technologies. In outdoor tests, the Sun can be used for short durations as an illumination source as long as the amplitude remains relatively constant. The reflected light is also much higher in intensity than the emitted light from the inefficient current methods. Having a much brighter surface allows a wider array of detection schemes and devices. Because color change is the principle of operation, high-quality, lower-cost digital cameras can be used for detection, as opposed to the high-cost imagers needed for intensity measurements with the current methods. Alternative methods of detection are possible to increase the measurement sensitivity. For example, a monochrome camera can be used with an appropriate filter and a radiometric measurement of normalized intensity change that is proportional to the change in coating temperature. Using different spectral regions yields different sensitivities and calibration curves for converting intensity change to temperature units. Alternatively, using a color camera, a ratio of the standard red, green, and blue outputs can be used as a self-referenced change. The blue region (less than 500 nm) does not change nearly as much as the red region (greater than 575 nm), so a ratio of color intensities will yield a calibrated temperature image. The new temperature sensor coating is easy to apply, is inexpensive, can contour complex-shaped surfaces, and can be a global surface measurement system based on spectrophotometry. The color change, or relative intensity change, at different colors makes optical detection under white light illumination, and the associated interpretation, much easier than in the detection systems of the current methods.
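A minimal sketch of how a colour-ratio calibration of this kind could be applied with an ordinary colour camera, assuming the red/blue intensity ratio decreases monotonically with temperature; the calibration points below are invented placeholders, not the coating's measured curve.

```python
import numpy as np

# Placeholder calibration: red/blue intensity ratios at known temperatures.
# A real curve would come from imaging the coating at known furnace set points.
cal_temperature_C = np.array([25.0, 150.0, 300.0, 450.0, 600.0])
cal_red_blue_ratio = np.array([2.10, 1.80, 1.40, 1.05, 0.80])   # ratio falls as the coating darkens

def temperature_from_rgb(rgb_image):
    """Map each pixel's red/blue channel ratio to temperature by table interpolation."""
    red = rgb_image[..., 0].astype(float)
    blue = np.maximum(rgb_image[..., 2].astype(float), 1e-6)
    ratio = red / blue
    # np.interp expects increasing x, so interpolate on the reversed calibration arrays.
    return np.interp(ratio, cal_red_blue_ratio[::-1], cal_temperature_C[::-1])
```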
Laser scattering by transcranial rat brain illumination
NASA Astrophysics Data System (ADS)
Sousa, Marcelo V. P.; Prates, Renato; Kato, Ilka T.; Sabino, Caetano P.; Suzuki, Luis C.; Ribeiro, Martha S.; Yoshimura, Elisabeth M.
2012-06-01
Due to the great number of applications of Low-Level Laser Therapy (LLLT) in the Central Nervous System (CNS), the study of light penetration through the skull and distribution in the brain becomes extremely important. The aim is to analyze the possibility of precise illumination of deep regions of the rat brain, to measure the penetration and distribution of red (λ = 660 nm) and near-infrared (NIR) (λ = 808 nm) diode laser light, and to compare the optical properties of brain structures. The head of the animal (Rattus norvegicus) was epilated and divided by a sagittal cut, 2.3 mm away from the midplane. This section of the rat's head was illuminated with red and NIR lasers at points above three anatomical structures: hippocampus, cerebellum and frontal cortex. A high-resolution camera, perpendicularly positioned, was used to obtain images of the brain structures. Profiles of scattered intensities along the laser direction were obtained from the images. There is a peak in the scattered light profile corresponding to the skin layer. The bone layer gives rise to a valley in the profile, indicating a low scattering coefficient, or forward scattering. Another peak in the region related to the brain is an indication of a high scattering coefficient (μs) for this tissue. This work corroborates the use of transcranial LLLT in studies with rats subjected to models of CNS diseases. The outcomes of this study point to the possibility of transcranial LLLT in humans for a large number of diseases.
Dwarf Galaxies Swimming in Tidal Tails
NASA Technical Reports Server (NTRS)
2005-01-01
This false-color infrared image from NASA's Spitzer Space Telescope shows little 'dwarf galaxies' forming in the 'tails' of two larger galaxies that are colliding together. The big galaxies are at the center of the picture, while the dwarfs can be seen as red dots in the red streamers, or tidal tails. The two blue dots above the big galaxies are stars in the foreground. Galaxy mergers are common occurrences in the universe; for example, our own Milky Way galaxy will eventually smash into the nearby Andromeda galaxy. When two galaxies meet, they tend to rip each other apart, leaving a trail, called a tidal tail, of gas and dust in their wake. It is out of this galactic debris that new dwarf galaxies are born. The new Spitzer picture demonstrates that these particular dwarfs are actively forming stars. The red color indicates the presence of dust produced in star-forming regions, including organic molecules called polycyclic aromatic hydrocarbons. These carbon-containing molecules are also found on Earth, in car exhaust and on burnt toast, among other places. Here, the molecules are being heated up by the young stars, and, as a result, shine in infrared light. This image was taken by the infrared array camera on Spitzer. It is a 4-color composite of infrared light, showing emissions from wavelengths of 3.6 microns (blue), 4.5 microns (green), 5.8 microns (orange), and 8.0 microns (red). Starlight has been subtracted from the orange and red channels in order to enhance the dust features.
NASA Astrophysics Data System (ADS)
Seo, Hokuto; Aihara, Satoshi; Namba, Masakazu; Watabe, Toshihisa; Ohtake, Hiroshi; Kubota, Misao; Egami, Norifumi; Hiramatsu, Takahiro; Matsuda, Tokiyoshi; Furuta, Mamoru; Nitta, Hiroshi; Hirao, Takashi
2010-01-01
Our group has been developing a new type of image sensor overlaid with three organic photoconductive films, each individually sensitive to only one of the primary color components (blue (B), green (G), or red (R) light), with the aim of developing a compact, high-resolution color camera without any color separation optical systems. In this paper, we first describe the unique characteristics of organic photoconductive films. The photoconductive properties of the films, in particular their wavelength selectivity, can be tuned simply by the choice of organic materials, and this selectivity is good enough to divide the incident light into the three primary colors. Color separation with vertically stacked organic films was also shown. In addition, the resolution of organic photoconductive films, sufficient for high-definition television (HDTV), was confirmed in a shooting experiment using a camera tube. Secondly, as a step toward our goal, we fabricated a stacked organic image sensor with G- and R-sensitive organic photoconductive films, each of which had a zinc oxide (ZnO) thin film transistor (TFT) readout circuit, and demonstrated image pickup at a TV frame rate. A color image with a resolution corresponding to the pixel number of the ZnO TFT readout circuit was obtained from the stacked image sensor. These results show the potential for the development of high-resolution prism-less color cameras with stacked organic photoconductive films.
Light Echo From Star V838 Monocerotis
NASA Technical Reports Server (NTRS)
2002-01-01
This series of photos, captured by the NASA Hubble Space Telescope's (HST) Advanced Camera for Surveys from May to December 2002, dramatically demonstrates the reverberation of light through space caused by an unusual stellar outburst in January 2002. A burst of light from the bizarre star is spreading into space and reflecting off of surrounding circumstellar dust. As different parts are sequentially illuminated, the appearance of the dust changes. This effect is referred to as a 'light echo'. The red star at the center of the eyeball-like feature is the unusual erupting supergiant called V838 Monocerotis, or V Mon, located about 20,000 light-years away in the winter constellation Monoceros (the Unicorn). During its outburst, the star brightened to more than 600,000 times our Sun's luminosity. The circular feature has now expanded to slightly larger than the angular size of Jupiter on the sky, and will continue to expand for several more years until the light from the back side of the nebula begins to arrive. The light echo will then give the illusion of contracting, until it finally disappears by the end of the decade.
Optimisation approaches for concurrent transmitted light imaging during confocal microscopy.
Collings, David A
2015-01-01
The transmitted light detectors present on most modern confocal microscopes are an under-utilised tool for the live imaging of plant cells. As the light forming the image in this detector is not passed through a pinhole, out-of-focus light is not removed. It is this extended focus that allows the transmitted light image to provide cellular and organismal context for fluorescence optical sections generated confocally. More importantly, the transmitted light detector provides images that have spatial and temporal registration with the fluorescence images, unlike images taken with a separately-mounted camera. Because plant samples often make transmitted light imaging difficult, owing to the presence of pigments and air pockets in leaves, this study documents several approaches to improving transmitted light images, beginning with ensuring that the light paths through the microscope are correctly aligned (Köhler illumination). Pigmented samples can be imaged in real colour using sequential scanning with red, green and blue lasers. The resulting transmitted light images can be optimised and merged in ImageJ to generate colour images that maintain registration with concurrent fluorescence images. For faster imaging of pigmented samples, transmitted light images can be formed with non-absorbed wavelengths. Transmitted light images of Arabidopsis leaves expressing GFP can be improved by concurrent illumination with green and blue light. If the blue light used for YFP excitation is blocked from the transmitted light detector with a cheap coloured glass filter, the non-absorbed green light will form an improved transmitted light image. Changes in sample colour can be quantified by transmitted light imaging. This has been documented in red onion epidermal cells, where changes in vacuolar pH triggered by the weak base methylamine result in measurable colour changes in the vacuolar anthocyanin. Many plant cells contain visible levels of pigment. The transmitted light detector provides a useful tool for documenting and measuring changes in these pigments while maintaining registration with confocal imaging.
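A minimal sketch of merging the three sequentially scanned transmitted-light channels into a colour composite, assuming each scan is available as a grayscale array; the abstract does this in ImageJ, and the Python below is only an illustrative equivalent, not the author's workflow.

```python
import numpy as np

def merge_transmitted_channels(red_scan, green_scan, blue_scan):
    """Stretch each sequential transmitted-light scan to 0-1 and stack into an RGB composite.

    Because all three scans come from the same scan head, the composite stays in
    register with the concurrently acquired fluorescence channels.
    """
    def stretch(img):
        img = img.astype(float)
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

    return np.stack([stretch(red_scan), stretch(green_scan), stretch(blue_scan)], axis=-1)
```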
Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung
2017-07-08
A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN as the input. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods.
Contrast enhancement for in vivo visible reflectance imaging of tissue oxygenation.
Crane, Nicole J; Schultz, Zachary D; Levin, Ira W
2007-08-01
Results are presented illustrating a straightforward algorithm to be used for real-time monitoring of oxygenation levels in blood cells and tissue based on the visible spectrum of hemoglobin. Absorbance images obtained from the visible reflection of white light through separate red and blue bandpass filters recorded by monochrome charge-coupled devices (CCDs) are combined to create enhanced images that suggest a quantitative correlation between the degree of oxygenated and deoxygenated hemoglobin in red blood cells. The filter bandpass regions are chosen specifically to mimic the color response of commercial 3-CCD cameras, representative of detectors with which the operating room laparoscopic tower systems are equipped. Adaptation of this filter approach is demonstrated for laparoscopic donor nephrectomies in which images are analyzed in terms of real-time in vivo monitoring of tissue oxygenation.
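A minimal sketch of the general red/blue absorbance combination the abstract describes, assuming calibrated reflectance images from the two bandpass filters; the log conversion and the band difference used as a contrast measure are generic choices in reflectance oximetry, not necessarily the authors' exact algorithm.

```python
import numpy as np

def oxygenation_contrast(red_reflectance, blue_reflectance, dark=0.0, white=1.0):
    """Combine red- and blue-band reflectance images into an oxygenation contrast map.

    Reflectances are first converted to absorbance A = -log10(R); the difference of
    the two bands then serves as a relative oxy/deoxy-haemoglobin contrast
    (arbitrary units), suitable for real-time false-colour display.
    """
    def absorbance(reflectance):
        norm = np.clip((reflectance - dark) / (white - dark), 1e-6, 1.0)
        return -np.log10(norm)

    return absorbance(red_reflectance) - absorbance(blue_reflectance)
```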
The kinelite project. A new powerful motion analyser for spacelab and space station
NASA Astrophysics Data System (ADS)
Venet, M.; Pinard, H.; McIntyre, J.; Berthoz, A.; Lacquaniti, F.
The goal of the Kinelite Project is to develop a space-qualified motion analysis system to be used in space by the scientific community, mainly to support neuroscience protocols. The measurement principle of the Kinelite is to determine, by means of triangulation, the 3D position of small, lightweight, reflective markers positioned at the different points of interest. The scene is illuminated by infrared flashes and the reflected light is acquired by up to 8 precalibrated and synchronized CCD cameras. The main characteristics of the system are: - Camera field of view: 45°, - Number of cameras: 2 to 8, - Acquisition frequency: 25, 50, 100 or 200 Hz, - CCD format: 256 × 256, - Number of markers: up to 64, - 3D accuracy: 2 mm, - Main dimensions: 45 cm × 45 cm × 30 cm, - Mass: 23 kg, - Power consumption: less than 200 W. The Kinelite will first fly aboard the NASA Spacelab; it will be used, during the NEUROLAB mission (4/98), to support the "Frames of References and Internal Models" protocol (Principal Investigator: Prof. A. Berthoz; Co-Investigators: J. McIntyre, F. Lacquaniti).
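A minimal sketch of standard linear (DLT) triangulation of one reflective marker from two or more pre-calibrated cameras, in the spirit of the marker localization described above; the projection-matrix representation and the toy cameras in the demo are assumptions for illustration, not the Kinelite's actual algorithm.

```python
import numpy as np

def triangulate_point(proj_mats, pixels):
    """Linear (DLT) triangulation of one marker from >=2 calibrated cameras.

    proj_mats: list of 3x4 camera projection matrices from pre-calibration.
    pixels: matching (u, v) image coordinates of the same marker in each camera.
    Returns the marker's 3D position in the common world frame.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    _, _, vt = np.linalg.svd(A)        # null-space vector = homogeneous 3D point
    X = vt[-1]
    return X[:3] / X[3]

# Toy check with two axis-aligned pinhole cameras (focal length 1, 0.5 m baseline).
P1 = np.array([[1., 0., 0., 0.], [0., 1., 0., 0.], [0., 0., 1., 0.]])
P2 = np.array([[1., 0., 0., -0.5], [0., 1., 0., 0.], [0., 0., 1., 0.]])
X_true = np.array([0.1, 0.2, 2.0, 1.0])
pixels = []
for P in (P1, P2):
    x = P @ X_true
    pixels.append((x[0] / x[2], x[1] / x[2]))
print(triangulate_point([P1, P2], pixels))   # approximately [0.1, 0.2, 2.0]
```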
Film dosimetry using a smart device camera: a feasibility study for point dose measurements
NASA Astrophysics Data System (ADS)
Aland, Trent; Jhala, Ekta; Kairn, Tanya; Trapp, Jamie
2017-10-01
In this work, a methodology for using a smartphone camera, in conjunction with a light-tight box operating in reflective transmission mode, is investigated as a proof of concept for use as a film dosimetry system. An imaging system was designed to allow the camera of a smartphone to be used as a pseudo densitometer. Ten pieces of Gafchromic EBT3 film were irradiated to doses up to 16.89 Gy and used to evaluate the effects of reproducibility and orientation, as well as the ability to create an accurate dose response curve for the smartphone based dosimetry system, using all three colour channels. Results were compared to a flatbed scanner system. Overall uncertainty was found to be best for the red channel with an uncertainty of 2.4% identified for film irradiated to 2.5 Gy and digitised using the smartphone system. This proof of concept exercise showed that although uncertainties still exceed a flatbed scanner system, the smartphone system may be useful for providing point dose measurements in situations where conventional flatbed scanners (or other dosimetry systems) are unavailable or unaffordable.
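A minimal sketch of a red-channel dose calibration of the kind described, assuming net optical density computed from mean pixel values and a low-order polynomial dose-response fit, which is a common convention in radiochromic film dosimetry; the calibration numbers below are placeholders, not the paper's measurements.

```python
import numpy as np

def net_optical_density(pv_exposed, pv_unexposed):
    """Net OD of the red channel from mean pixel values of exposed vs unexposed film."""
    return np.log10(pv_unexposed / np.maximum(pv_exposed, 1e-6))

def fit_dose_response(net_ods, doses_Gy, degree=2):
    """Polynomial dose = f(netOD); the returned coefficients feed np.polyval."""
    return np.polyfit(net_ods, doses_Gy, degree)

# Placeholder calibration points (illustrative only, not measured data).
doses = np.array([0.0, 0.5, 1.0, 2.5, 5.0, 10.0, 16.89])
pv_exposed = np.array([41000, 36500, 33800, 29000, 24500, 20000, 17000], dtype=float)
net_od = net_optical_density(pv_exposed, pv_unexposed=41000.0)
coeffs = fit_dose_response(net_od, doses)

# Estimate the dose of a new film piece from its mean red-channel pixel value.
print(np.polyval(coeffs, net_optical_density(30000.0, 41000.0)))  # dose in Gy
```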
A new device for acquiring ground truth on the absorption of light by turbid waters
NASA Technical Reports Server (NTRS)
Klemas, V. (Principal Investigator); Srna, R.; Treasure, W.
1974-01-01
The author has identified the following significant results. A new device, called a Spectral Attenuation Board, has been designed and tested, which enables ERTS-1 sea truth collection teams to monitor the attenuation depths of three colors continuously as the board is being towed behind a boat. The device consists of a 1.2 x 1.2 meter flat board held below the water surface at a fixed angle to the surface. A camera mounted above the water takes photographs of the board. The resulting film image is analyzed by a micro-densitometer trace along the descending portion of the board. This yields information on the rate of attenuation of light penetrating the water column and the Secchi depth. Red and green stripes were painted on the white board to approximate band 4 and band 5 of the ERTS MSS, so that the rate of absorption by the water column of light in these regions of the visible spectrum could be concurrently measured. It was found that information from a red, green, and white stripe may serve to fingerprint the composition of the water mass. A number of these devices, when automated, could also be distributed over a large region to provide a cheap method of obtaining valuable satellite ground truth data at preset time intervals.
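A minimal sketch of turning a densitometer trace along the descending board into a diffuse attenuation coefficient via a Beer-Lambert-style exponential fit; the depths and intensities below are illustrative values, not measurements from the report.

```python
import numpy as np

def attenuation_coefficient(depths_m, intensities):
    """Fit I(z) = I0 * exp(-k z) by linear regression on log intensity; returns k (1/m)."""
    slope, _ = np.polyfit(depths_m, np.log(np.maximum(intensities, 1e-9)), 1)
    return -slope

# Illustrative trace along the red stripe of the board (not real data).
depths = np.linspace(0.0, 1.0, 11)            # metres below the surface
trace = 200.0 * np.exp(-2.3 * depths)         # film-density-derived brightness
print(f"k_red = {attenuation_coefficient(depths, trace):.2f} per metre")
```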
Data-nonintrusive photonics-based credit card verifier with a low false rejection rate.
Sumriddetchkajorn, Sarun; Intaravanne, Yuttana
2010-02-10
We propose and experimentally demonstrate a noninvasive credit card verifier with a low false rejection rate (FRR). Our key idea is based on the use of three broadband light sources in our data-nonintrusive photonics-based credit card verifier structure, where spectral components of the embossed hologram images are registered as red, green, and blue. In this case, nine distinguishable variables are generated for a feed-forward neural network (FFNN). In addition, we investigate the center of mass of the image histogram projected onto the x axis (I(color)), making our system more tolerant of intensity fluctuations of the light source. We also reduce the unwanted signals on each hologram image by simply dividing the hologram image into three zones and then calculating their corresponding I(color) values for the red, green, and blue bands. With our proposed concepts, we implement our field test prototype, in which three broadband white-light light-emitting diodes (LEDs), a two-dimensional digital color camera, and a four-layer FFNN are used. Based on 249 genuine credit cards and 258 counterfeit credit cards, we find that the average difference in I(color) values between genuine and counterfeit credit cards is improved by factors of 1.5 to 13.7. In this case, we can effectively verify credit cards with a very low FRR of 0.79%.
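A minimal sketch of the I(color) feature, the centre of mass of the intensity histogram, computed for each of three zones and three colour bands to give nine inputs for the neural network; the horizontal zoning used here is an assumption for illustration, and the classifier itself is omitted.

```python
import numpy as np

def histogram_center_of_mass(channel, bins=256):
    """I(color): intensity-weighted mean of the channel's histogram (its centre of mass)."""
    hist, edges = np.histogram(channel, bins=bins, range=(0, 256))
    centers = (edges[:-1] + edges[1:]) / 2.0
    return float((hist * centers).sum() / max(hist.sum(), 1))

def nine_features(hologram_rgb):
    """Nine I(color) values: three horizontal zones x three colour bands, fed to the FFNN."""
    h = hologram_rgb.shape[0]
    zones = [hologram_rgb[:h // 3],
             hologram_rgb[h // 3:2 * h // 3],
             hologram_rgb[2 * h // 3:]]
    return [histogram_center_of_mass(zone[..., band]) for zone in zones for band in range(3)]
```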
NASA Technical Reports Server (NTRS)
2008-01-01
A jet of gas firing out of a very young star can be seen ramming into a wall of material in this infrared image from NASA's Spitzer Space Telescope. The young star, called HH 211-mm, is cloaked in dust and can't be seen. But streaming away from the star are bipolar jets, color-coded blue in this view. The pink blob at the end of the jet to the lower left shows where the jet is hitting a wall of material. The jet is hitting the wall so hard that shock waves are being generated, which causes ice to vaporize off dust grains. The shock waves are also heating material up, producing energetic ultraviolet radiation. The ultraviolet radiation then breaks the water vapor molecules apart. The red color at the end of the lower jet represents shock-heated iron, sulfur and dust, while the blue color in both jets denotes shock-heated hydrogen molecules. HH 211-mm is part of a cluster of about 300 stars, called IC 348, located 1,000 light-years away in the constellation Perseus. This image is a composite of infrared data from Spitzer's infrared array camera and its multiband imaging photometer. Light with wavelengths of 3.6 and 4.5 microns is blue; 8-micron light is green; and 24-micron light is red.
3-D Flow Visualization with a Light-field Camera
NASA Astrophysics Data System (ADS)
Thurow, B.
2012-12-01
Light-field cameras have received attention recently due to their ability to acquire photographs that can be computationally refocused after they have been acquired. In this work, we describe the development of a light-field camera system for 3D visualization of turbulent flows. The camera developed in our lab, also known as a plenoptic camera, uses an array of microlenses mounted next to an image sensor to resolve both the position and angle of light rays incident upon the camera. For flow visualization, the flow field is seeded with small particles that follow the fluid's motion and are imaged using the camera and a pulsed light source. The tomographic MART algorithm is then applied to the light-field data in order to reconstruct a 3D volume of the instantaneous particle field. 3D, 3C velocity vectors are then determined from a pair of 3D particle fields using conventional cross-correlation algorithms. As an illustration of the concept, 3D/3C velocity measurements of a turbulent boundary layer produced on the wall of a conventional wind tunnel are presented. Future experiments are planned to use the camera to study the influence of wall permeability on the 3-D structure of the turbulent boundary layer.
[Figure: Schematic illustrating the concept of a plenoptic camera where each pixel represents both the position and angle of light rays entering the camera. This information can be used to computationally refocus an image after it has been acquired.]
[Figure: Instantaneous 3D velocity field of a turbulent boundary layer determined using light-field data captured by a plenoptic camera.]
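A minimal sketch of the cross-correlation step that converts a pair of reconstructed 3D particle volumes into a single displacement vector per interrogation volume, using an FFT-based correlation peak; the MART reconstruction itself is omitted, and this is an assumed generic implementation rather than the authors' code.

```python
import numpy as np

def displacement_3d(volume_a, volume_b):
    """Peak of the FFT-based cross-correlation between two interrogation volumes.

    Returns the integer-voxel shift (dz, dy, dx) that best maps volume_a onto volume_b.
    """
    a = volume_a - volume_a.mean()
    b = volume_b - volume_b.mean()
    corr = np.fft.ifftn(np.fft.fftn(a).conj() * np.fft.fftn(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peaks in the upper half of each axis to negative shifts.
    return tuple(int(p - s) if p > s // 2 else int(p) for p, s in zip(peak, corr.shape))

# Synthetic check: a particle field shifted by (2, -1, 3) voxels between exposures.
rng = np.random.default_rng(2)
vol_a = rng.random((32, 32, 32))
vol_b = np.roll(vol_a, shift=(2, -1, 3), axis=(0, 1, 2))
print(displacement_3d(vol_a, vol_b))   # (2, -1, 3)
```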
Planetary Building Blocks Found in Surprising Place
NASA Technical Reports Server (NTRS)
2005-01-01
[figure removed for brevity, see original site] Figure 1. This graph of data from NASA's Spitzer Space Telescope shows that an extraordinarily low-mass brown dwarf, or 'failed star,' is circled by a disc of planet-building dust. The brown dwarf, called OTS 44, is only 15 times the mass of Jupiter, making it the smallest known brown dwarf to host a planet-forming disc. Spitzer was able to see this unusual disc by measuring its infrared brightness. Whereas a brown dwarf without a disc (red dashed line) radiates infrared light at shorter wavelengths, a brown dwarf with a disc (orange line) gives off excess infrared light at longer wavelengths. This surplus light comes from the disc itself and is represented here as a yellow dotted line. Actual data points from observations of OTS 44 are indicated with orange dots. These data were acquired using Spitzer's infrared array camera.
Colorimetric analysis of pigmented skin lesions: a pilot study with the Visi-Chroma VC-100 device.
Vereecken, P; Mommaerts, M; Duez, C; Petein, M; Laporte, M; Hubinon, J-L; Heenen, M
2006-01-01
Definition of the colour of pigmented skin lesions (PSLs) with the naked eye remains subjective and may be influenced by lighting. This problem underlines the usefulness of instrumental assessments such as epiluminescence microscopy and colorimetric devices. We describe here a new method of colour analysis of PSLs with the Visi-Chroma VC-100 device, which illuminates the surface of the skin with white light-emitting diodes (LEDs) and analyses the reflected light by a red-green-blue (RGB) charge-coupled device (CCD) colour camera. Twenty-one PSLs to be excised for cosmetic or medical reasons were analysed by this device with clinicopathological correlation. This method is feasible and might be useful to assess the colour of PSLs and allow comparisons for changes over time. Further studies are needed to determine the usefulness of this device in clinical practice.
Colorful Saturn, Getting Closer
2004-06-03
As Cassini coasts into the final month of its nearly seven-year trek, the serene majesty of its destination looms ahead. The spacecraft's cameras are functioning beautifully and continue to return stunning views from Cassini's position, 1.2 billion kilometers (750 million miles) from Earth and now 15.7 million kilometers (9.8 million miles) from Saturn. In this narrow angle camera image from May 21, 2004, the ringed planet displays subtle, multi-hued atmospheric bands, colored by yet undetermined compounds. Cassini mission scientists hope to determine the exact composition of this material. This image also offers a preview of the detailed survey Cassini will conduct on the planet's dazzling rings. Slight differences in color denote both differences in ring particle composition and light scattering properties. Images taken through blue, green and red filters were combined to create this natural color view. The image scale is 132 kilometers (82 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA06060
Chen, Brian R; Poon, Emily; Alam, Murad
2018-01-01
Lighting is an important component of consistent, high-quality dermatologic photography. There are different types of lighting solutions available. To evaluate currently available lighting equipment and methods suitable for procedural dermatology. Overhead lighting, built-in camera flashes, external flash units, studio strobes, and light-emitting diode (LED) light panels were evaluated with regard to their utility for dermatologic surgeons. A set of ideal lighting characteristics was used to examine the capabilities and limitations of each type of lighting solution. Recommendations regarding lighting solutions and optimal usage configurations were made in terms of the context of the clinical environment and the purpose of the image. Overhead lighting may be a convenient option for general documentation. An on-camera lighting solution using a built-in camera flash or a camera-mounted external flash unit provides portability and consistent lighting with minimal training. An off-camera lighting solution with studio strobes, external flash units, or LED light panels provides versatility and even lighting with minimal shadows and glare. The selection of an optimal lighting solution is contingent on practical considerations and the purpose of the image.
SEARCH FOR RED DWARF STARS IN GLOBULAR CLUSTER NGC 6397
NASA Technical Reports Server (NTRS)
2002-01-01
Left: A NASA Hubble Space Telescope image of a small region (1.4 light-years across) in the globular star cluster NGC 6397. Simulated stars (diamonds) have been added to this view of the same region of the cluster to illustrate what astronomers would have expected to see if faint red dwarf stars were abundant in the Milky Way Galaxy. The field would then contain 500 stars, according to theoretical calculations. Right: The unmodified HST image shows far fewer stars than would be expected according to popular theories of star formation; HST resolves about 200 stars. The stellar density is so low that HST can literally see right through the cluster and resolve far more distant background galaxies. From this observation, scientists have identified the surprising cutoff point below which nature apparently doesn't make many stars smaller than 1/5 the mass of our Sun. These HST findings provide new insights into star formation in our Galaxy. Technical detail: The globular cluster NGC 6397, one of the nearest and densest agglomerations of stars, is located 7,200 light-years away in the southern constellation Ara. This visible-light picture was taken on March 3, 1994 with the Wide Field Planetary Camera 2, as part of the HST parallel observing program. Credit: F. Paresce, ST ScI, ESA and NASA
History of Hubble Space Telescope (HST)
2002-12-01
This series of photos, captured by the NASA Hubble Space Telescope's (HST) Advanced Camera for Surveys from May to December 2002, dramatically demonstrates the reverberation of light through space caused by an unusual stellar outburst in January 2002. A burst of light from the bizarre star is spreading into space and reflecting off of surrounding circumstellar dust. As different parts are sequentially illuminated, the appearance of the dust changes. This effect is referred to as a "light echo". The red star at the center of the eyeball-like feature is the unusual erupting supergiant called V838 Monocerotis, or V838 Mon, located about 20,000 light-years away in the winter constellation Monoceros (the Unicorn). During its outburst, the star brightened to more than 600,000 times our Sun's luminosity. The circular feature has now expanded to slightly larger than the angular size of Jupiter on the sky, and will continue to expand for several more years until the light from the back side of the nebula begins to arrive. The light echo will then give the illusion of contracting, until it finally disappears by the end of the decade.
Rotating Jupiter With Great Red Spot, January 2017
2017-06-30
This video shows Jupiter as revealed by a powerful telescope and a mid-infrared filter sensitive to the giant planet's tropospheric temperatures and cloud thickness. It combines observations made on Jan. 14, 2017, using the Subaru Telescope in Hawaii. The filter used admits infrared light centered on a wavelength of 8.8 microns. The video includes interpolated frames for smoother apparent motion. The instrument used to take this image is the Cooled Mid-Infrared Camera and Spectrometer (COMICS) of the National Astronomical Observatory of Japan's Subaru Telescope on the Maunakea volcano. Animations are available at https://photojournal.jpl.nasa.gov/catalog/PIA21715
Lensfree microscopy on a cellphone
Tseng, Derek; Mudanyali, Onur; Oztoprak, Cetin; Isikman, Serhan O.; Sencan, Ikbal; Yaglidere, Oguzhan; Ozcan, Aydogan
2010-01-01
We demonstrate lensfree digital microscopy on a cellphone. This compact and light-weight holographic microscope installed on a cellphone does not utilize any lenses, lasers or other bulky optical components and it may offer a cost-effective tool for telemedicine applications to address various global health challenges. Weighing ~38 grams (<1.4 ounces), this lensfree imaging platform can be mechanically attached to the camera unit of a cellphone where the samples are loaded from the side, and are vertically illuminated by a simple light-emitting diode (LED). This incoherent LED light is then scattered from each micro-object to coherently interfere with the background light, creating the lensfree hologram of each object on the detector array of the cellphone. These holographic signatures captured by the cellphone permit reconstruction of microscopic images of the objects through rapid digital processing. We report the performance of this lensfree cellphone microscope by imaging various sized micro-particles, as well as red blood cells, white blood cells, platelets and a waterborne parasite (Giardia lamblia). PMID:20445943
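For context on the "rapid digital processing" step, numerical refocusing of an in-line hologram is commonly done by back-propagating the recorded intensity pattern with the angular-spectrum method. The sketch below is a generic illustration of that principle, not the authors' reconstruction pipeline; the pixel pitch, wavelength, and sample-to-sensor distance are assumed values.

```python
import numpy as np

def angular_spectrum_backpropagate(hologram, wavelength, dx, z):
    """Numerically refocus an in-line hologram by propagating the recorded field
    back toward the object plane with the angular-spectrum transfer function."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=dx)                # spatial frequencies (cycles/m)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))   # clamp evanescent terms
    H = np.exp(-1j * kz * z)                     # negative sign: back-propagation
    field = np.fft.ifft2(np.fft.fft2(hologram.astype(complex)) * H)
    return np.abs(field)                         # amplitude image at the object plane

# Assumed parameters: ~1.1-um sensor pixels, ~500-nm LED, 1-mm object-to-sensor gap.
holo = np.random.rand(512, 512)                  # placeholder hologram, not real data
recon = angular_spectrum_backpropagate(holo, wavelength=0.5e-6, dx=1.1e-6, z=1.0e-3)
```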
History of Hubble Space Telescope (HST)
1995-01-01
These eerie, dark, pillar-like structures are actually columns of cool interstellar hydrogen gas and dust that are also incubators for new stars. The pillars protrude from the interior wall of a dark molecular cloud like stalagmites from the floor of a cavern. They are part of the Eagle Nebula (also called M16), a nearby star-forming region 7,000 light-years away, in the constellation Serpens. The ultraviolet light from hot, massive, newborn stars is responsible for illuminating the convoluted surfaces of the columns and the ghostly streamers of gas boiling away from their surfaces, producing the dramatic visual effects that highlight the three-dimensional nature of the clouds. This image was taken on April 1, 1995 with the Hubble Space Telescope Wide Field Planetary Camera 2. The color image is constructed from three separate images taken in the light of emission from different types of atoms. Red shows emissions from singly-ionized sulfur atoms, green shows emissions from hydrogen, and blue shows light emitted by doubly-ionized oxygen atoms.
Full-color OLED on silicon microdisplay
NASA Astrophysics Data System (ADS)
Ghosh, Amalkumar P.
2002-02-01
eMagin has developed numerous enhancements to organic light-emitting diode (OLED) technology, including a unique up-emitting structure for OLED-on-silicon microdisplay devices. Recently, eMagin has fabricated full-color SVGA+ resolution OLED microdisplays on silicon with over 1.5 million color elements. The display is based on white light emission from the OLED followed by LCD-type red, green and blue color filters. The color filters are patterned directly on the OLED devices following suitable thin-film encapsulation, and the drive circuits are built directly on single-crystal silicon. The resultant color OLED technology, with its high efficiency, high brightness, and low power consumption, is ideally suited for near-to-eye applications such as wearable PCs, wireless Internet applications, mobile phones, portable DVD viewers, digital cameras and other emerging applications.
Optical design of portable nonmydriatic fundus camera
NASA Astrophysics Data System (ADS)
Chen, Weilin; Chang, Jun; Lv, Fengxian; He, Yifan; Liu, Xin; Wang, Dajiang
2016-03-01
The fundus camera is widely used in the screening and diagnosis of retinal disease; it is a simple and widely deployed piece of medical equipment. Early fundus cameras dilated the pupil with a mydriatic to increase the amount of incoming light, which leaves patients with vertigo and blurred vision. Nonmydriatic designs are therefore the trend in fundus cameras. A desktop fundus camera is not easy to carry and is only suitable for use in the hospital, whereas a portable nonmydriatic retinal camera is convenient for patient self-examination or for medical staff visiting a patient at home. This paper presents a portable nonmydriatic fundus camera with a field of view (FOV) of 40°. Two kinds of light source are used: 590 nm light for imaging and 808 nm light for observing the fundus at high resolving power. Ring lights and a hollow mirror are employed to suppress stray light from the center of the cornea. The focus of the camera is adjusted by repositioning the CCD along the optical axis. The diopter range is between -20 m-1 and +20 m-1.
DuOCam: A Two-Channel Camera for Simultaneous Photometric Observations of Stellar Clusters
NASA Astrophysics Data System (ADS)
Maier, Erin R.; Witt, Emily; Depoy, Darren L.; Schmidt, Luke M.
2017-01-01
We have designed the Dual Observation Camera (DuOCam), which uses commercial, off-the-shelf optics to perform simultaneous photometric observations of astronomical objects at red and blue wavelengths. Collected light enters DuOCam’s optical assembly, where it is collimated by a negative doublet lens. It is then separated by a 45 degree blue dichroic filter (transmission bandpass: 530 - 800 nm, reflection bandpass: 400 - 475 nm). Finally, the separated light is focused by two identical positive doublet lenses onto two independent charge-coupled devices (CCDs), the SBIG ST-8300M and the SBIG STF-8300M. This optical assembly converts the observing telescope to an f/11 system, which balances maximum field of view with optimum focus. DuOCam was commissioned on the McDonald Observatory 0.9m, f/13.5 telescope from July 21st - 24th, 2016. Observations of three globular and three open stellar clusters were carried out. The resulting data were used to construct R vs. B-R color magnitude diagrams for a selection of the observed clusters. The diagrams display the characteristic evolutionary track for a stellar cluster, including the main sequence and main sequence turn-off.
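To show the kind of reduction behind an "R vs. B-R color magnitude diagram", the sketch below converts background-subtracted fluxes in the two channels into instrumental magnitudes (m = -2.5 log10 flux) and plots one against the colour index. The flux arrays are synthetic placeholders, not DuOCam measurements, and zero-point calibration is omitted.

```python
import numpy as np
import matplotlib.pyplot as plt

def instrumental_mag(flux):
    """Instrumental magnitude from background-subtracted counts: m = -2.5*log10(flux)."""
    return -2.5 * np.log10(flux)

# Synthetic per-star fluxes for the blue-channel and red-channel CCDs (placeholders):
flux_b = np.random.uniform(1e3, 1e5, 500)
flux_r = np.random.uniform(1e3, 1e5, 500)
B, R = instrumental_mag(flux_b), instrumental_mag(flux_r)

plt.scatter(B - R, R, s=4)
plt.gca().invert_yaxis()           # brighter stars (smaller magnitudes) plotted upward
plt.xlabel("B - R")
plt.ylabel("R")
plt.title("Colour-magnitude diagram (illustrative)")
plt.show()
```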
NASA Astrophysics Data System (ADS)
Nishidate, Izumi; Kanie, Takuya; Mustari, Afrina; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu; Kokubo, Yasuaki
2018-02-01
We investigated a rapid imaging method to monitor the spatial distribution of total hemoglobin concentration (CHbT), tissue oxygen saturation (StO2), and the scattering power b in the expression μs' = a·λ^(-b) as the scattering parameters in the cerebral cortex using a digital red-green-blue camera. In the method, Monte Carlo simulation (MCS) of light transport in brain tissue is used to specify a relation among the RGB values, the concentration of oxygenated hemoglobin (CHbO), that of deoxygenated hemoglobin (CHbR), and the scattering power b. In the present study, we performed sequential recordings of RGB images of the in vivo exposed brain of rats while changing the fraction of inspired oxygen (FiO2), using a surgical microscope camera system. The time courses of CHbO, CHbR, CHbT, and StO2 showed the well-known physiological responses of the cerebral cortex. On the other hand, a fast decrease in the scattering power b was observed immediately after respiratory arrest, similar to the negative deflection of the extracellular DC potential known as anoxic depolarization. The DC shift is thought to coincide with a rise in extracellular potassium and can evoke cell deformation generated by water movement between the intracellular and extracellular compartments, and hence changes in light scattering by tissue. Therefore, the decrease in the scattering power b after respiratory arrest is indicative of changes in light scattering by tissue. The results of this study indicate the potential of the method to evaluate pathophysiological conditions and loss of tissue viability in brain tissue.
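The scattering model quoted above, μs'(λ) = a·λ^(-b), is a simple power law, so the scattering power b can be estimated from a handful of μs' samples by a straight-line fit in log-log space. The sketch below is only illustrative: the wavelengths and μs' values are assumed, and this is not the authors' Monte Carlo-based RGB inversion.

```python
import numpy as np

# Power-law model of the reduced scattering coefficient:
#   mus'(lambda) = a * lambda**(-b), where b is the "scattering power".
lam = np.array([460.0, 540.0, 620.0])    # nm; assumed RGB band centers (illustrative)
mus_p = np.array([2.1, 1.7, 1.4])        # mm^-1; made-up sample values, not measured data

# Linearise with logs: log(mus') = log(a) - b*log(lambda), then fit a line.
slope, intercept = np.polyfit(np.log(lam), np.log(mus_p), 1)
b = -slope
a = np.exp(intercept)
print(f"a = {a:.3g}, scattering power b = {b:.3f}")
```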
Bright Soil Near 'McCool' (3-D)
NASA Technical Reports Server (NTRS)
2006-01-01
While driving eastward toward the northwestern flank of 'McCool Hill,' the wheels of NASA's Mars Exploration Rover Spirit churned up the largest amount of bright soil discovered so far in the mission. This image from Spirit's panoramic camera (Pancam), taken on the rover's 788th Martian day, or sol, of exploration (March 22, 2006), shows the strikingly bright tone and large extent of the materials uncovered. Several days earlier, Spirit's wheels unearthed a small patch of light-toned material informally named 'Tyrone.' In images from Spirit's panoramic camera, 'Tyrone' strongly resembled both 'Arad' and 'Paso Robles,' two patches of light-toned soils discovered earlier in the mission. Spirit found 'Paso Robles' in 2005 while climbing 'Cumberland Ridge' on the western slope of 'Husband Hill.' In early January 2006, the rover discovered 'Arad' on the basin floor just south of 'Husband Hill.' Spirit's instruments confirmed that those soils had a salty chemistry dominated by iron-bearing sulfates. Spirit's Pancam and miniature thermal emission spectrometer examined this most recent discovery, and researchers will compare its properties with the properties of those other deposits. These discoveries indicate that salty, light-toned soil deposits might be widely distributed on the flanks and valley floors of the 'Columbia Hills' region in Gusev Crater on Mars. The salts, which are easily mobilized and concentrated in liquid solution, may record the past presence of water. So far, these enigmatic materials have generated more questions than answers, however, and as Spirit continues to drive across this region in search of a safe winter haven, the team continues to formulate and test hypotheses to explain the rover's most fascinating recent discovery. This stereo view combines images from the two blue (430-nanometer) filters in the Pancam's left and right 'eyes.' The image should be viewed using red-and-blue stereo glasses, with the red over your left eye.
Høye, Gudrun; Fridman, Andrei
2013-05-06
Current high-resolution push-broom hyperspectral cameras introduce keystone errors to the captured data. Efforts to correct these errors in hardware severely limit the optical design, in particular with respect to light throughput and spatial resolution, while at the same time the residual keystone often remains large. The mixel camera solves this problem by combining a hardware component--an array of light mixing chambers--with a mathematical method that restores the hyperspectral data to its keystone-free form, based on the data that was recorded onto the sensor with large keystone. A Virtual Camera software, that was developed specifically for this purpose, was used to compare the performance of the mixel camera to traditional cameras that correct keystone in hardware. The mixel camera can collect at least four times more light than most current high-resolution hyperspectral cameras, and simulations have shown that the mixel camera will be photon-noise limited--even in bright light--with a significantly improved signal-to-noise ratio compared to traditional cameras. A prototype has been built and is being tested.
A detailed comparison of single-camera light-field PIV and tomographic PIV
NASA Astrophysics Data System (ADS)
Shi, Shengxian; Ding, Junfei; Atkinson, Callum; Soria, Julio; New, T. H.
2018-03-01
This paper presents a comprehensive comparison between single-camera light-field particle image velocimetry (LF-PIV) and multi-camera tomographic particle image velocimetry (Tomo-PIV). Simulation studies were first performed using synthetic light-field and tomographic particle images, which extensively examine the differences between the two techniques by varying key parameters such as the pixel to microlens ratio (PMR), the light-field camera to Tomo-camera pixel ratio (LTPR), particle seeding density and the number of tomographic cameras. Simulation results indicate that single-camera LF-PIV can achieve accuracy consistent with that of multi-camera Tomo-PIV, but requires an overall greater number of pixels. Experimental studies were then conducted by simultaneously measuring a low-speed jet flow with single-camera LF-PIV and four-camera Tomo-PIV systems. The experiments confirm that, given a sufficiently high pixel resolution, a single-camera LF-PIV system can indeed deliver volumetric velocity field measurements for an equivalent field of view with a spatial resolution commensurate with that of a multi-camera Tomo-PIV system, enabling accurate 3D measurements in applications where optical access is limited.
HUBBLE'S PLANETARY NEBULA GALLERY
NASA Technical Reports Server (NTRS)
2002-01-01
[Top left] - IC 3568 lies in the constellation Camelopardalis at a distance of about 9,000 light-years, and has a diameter of about 0.4 light-years (or about 800 times the diameter of our solar system). It is an example of a round planetary nebula. Note the bright inner shell and fainter, smooth, circular outer envelope. Credits: Howard Bond (Space Telescope Science Institute), Robin Ciardullo (Pennsylvania State University) and NASA [Top center] - NGC 6826's eye-like appearance is marred by two sets of blood-red 'fliers' that lie horizontally across the image. The surrounding faint green 'white' of the eye is believed to be gas that made up almost half of the star's mass for most of its life. The hot remnant star (in the center of the green oval) drives a fast wind into older material, forming a hot interior bubble which pushes the older gas ahead of it to form a bright rim. (The star is one of the brightest stars in any planetary.) NGC 6826 is 2,200 light- years away in the constellation Cygnus. The Hubble telescope observation was taken Jan. 27, 1996 with the Wide Field and Planetary Camera 2. Credits: Bruce Balick (University of Washington), Jason Alexander (University of Washington), Arsen Hajian (U.S. Naval Observatory), Yervant Terzian (Cornell University), Mario Perinotto (University of Florence, Italy), Patrizio Patriarchi (Arcetri Observatory, Italy) and NASA [Top right ] - NGC 3918 is in the constellation Centaurus and is about 3,000 light-years from us. Its diameter is about 0.3 light-year. It shows a roughly spherical outer envelope but an elongated inner balloon inflated by a fast wind from the hot central star, which is starting to break out of the spherical envelope at the top and bottom of the image. Credits: Howard Bond (Space Telescope Science Institute), Robin Ciardullo (Pennsylvania State University) and NASA [Bottom left] - Hubble 5 is a striking example of a 'butterfly' or bipolar (two-lobed) nebula. The heat generated by fast winds causes each of the lobes to expand, much like a pair of balloons with internal heaters. This observation was taken Sept. 9, 1997 by the Hubble telescope's Wide Field and Planetary Camera 2. Hubble 5 is 2,200 light-years away in the constellation Sagittarius. Credits: Bruce Balick (University of Washington), Vincent Icke (Leiden University, The Netherlands), Garrelt Mellema (Stockholm University), and NASA [Bottom center ] - Like NGC 6826, NGC 7009 has a bright central star at the center of a dark cavity bounded by a football-shaped rim of dense, blue and red gas. The cavity and its rim are trapped inside smoothly-distributed greenish material in the shape of a barrel and comprised of the star's former outer layers. At larger distances, and lying along the long axis of the nebula, a pair of red 'ansae', or 'handles' appears. Each ansa is joined to the tips of the cavity by a long greenish jet of material. The handles are clouds of low-density gas. NGC 7009 is 1,400 light-years away in the constellation Aquarius. The Hubble telescope observation was taken April 28, 1996 by the Wide Field and Planetary Camera 2. Credits: Bruce Balick (University of Washington), Jason Alexander (University of Washington), Arsen Hajian (U.S. Naval Observatory), Yervant Terzian (Cornell University), Mario Perinotto (University of Florence, Italy), Patrizio Patriarchi (Arcetri Observatory, Italy), NASA [Bottom right ] - NGC 5307 also lies in Centaurus but is about 10,000 light-years away and has a diameter of approximately 0.6 light-year. 
It is an example of a planetary nebula with a pinwheel or spiral structure; each blob of gas ejected from the central star has a counterpart on the opposite side of the star. Credits: Howard Bond (Space Telescope Science Institute), Robin Ciardullo (Pennsylvania State University) and NASA
Energy-efficient lighting system for television
Cawthorne, Duane C.
1987-07-21
A light control system for a television camera comprises an artificial light control system that cooperates with an iris control system. The artificial light control system adjusts the power to lamps illuminating the camera viewing area so that only the artificial illumination necessary to produce an adequate video signal is supplied when the camera iris is substantially open.
UAV-based NDVI calculation over grassland: An alternative approach
NASA Astrophysics Data System (ADS)
Mejia-Aguilar, Abraham; Tomelleri, Enrico; Asam, Sarah; Zebisch, Marc
2016-04-01
The Normalised Difference Vegetation Index (NDVI) is one of the most widely used indicators for monitoring and assessing vegetation in remote sensing. The index relies on the reflectance difference between near infrared (NIR) and red light and is thus able to track variations in structural, phenological, and biophysical parameters for seasonal and long-term monitoring. Conventionally, NDVI is inferred from space-borne spectroradiometers, such as MODIS, with a moderate ground resolution of, at best, 250 m. In recent years, a new generation of miniaturized radiometers and integrated hyperspectral sensors with high resolution became available. Such small and light instruments are particularly suitable for mounting on unmanned aerial vehicles (UAVs) used for monitoring services, reaching ground sampling resolutions on the order of centimetres. Nevertheless, such miniaturized radiometers and hyperspectral sensors are still very expensive and require high upfront capital costs. Therefore, we propose a cheaper alternative method to calculate NDVI using a camera constellation consisting of two conventional consumer-grade cameras: (i) a Ricoh GR modified camera that acquires the NIR spectrum by removing the internal infrared filter; a mounted optical filter additionally blocks all wavelengths below 700 nm. (ii) A Ricoh GR in RGB configuration using two optical filters to block wavelengths below 600 nm as well as NIR and ultraviolet (UV) light. To assess the merit of the proposed method, we carry out two comparisons: first, reflectance maps generated by the consumer-grade camera constellation are compared to reflectance maps produced with a hyperspectral camera (Rikola). All imaging data and reflectance maps are processed using the PIX4D software. In the second test, the NDVI at specific points of interest (POI) generated by the consumer-grade camera constellation is compared to NDVI values obtained from ground spectral measurements using a portable spectroradiometer (Spectravista SVC HR-1024i). All data were collected on a dry alpine mountain grassland site in the Matsch valley, Italy, during the vegetation period of 2015. Data acquisition for the first comparison followed a pre-programmed flight plan in which the hyperspectral camera and the alternative dual-camera constellation were mounted separately on an octocopter UAV during two consecutive flight campaigns. Ground spectral measurements were collected at the same site and on the same dates (three in total) as the flight campaigns. The proposed technique achieves promising results and thus constitutes a cheap and simple way of collecting spatially explicit information on vegetated areas, even in challenging terrain.
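Once the NIR and red reflectance maps are co-registered, the NDVI itself is a simple per-pixel ratio, NDVI = (NIR - Red) / (NIR + Red). A minimal sketch follows; the array names and random values are illustrative only and do not come from the study.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red) from two co-registered reflectance maps."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)   # eps avoids division by zero over dark pixels

# Illustrative reflectance maps standing in for the two-camera output:
nir_band = np.random.rand(100, 100)
red_band = np.random.rand(100, 100)
ndvi_map = ndvi(nir_band, red_band)
print(ndvi_map.min(), ndvi_map.max())
```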
NASA Astrophysics Data System (ADS)
Ravkin, Ilya; Temov, Vladimir
1998-04-01
The detection and genetic analysis of fetal cells in maternal blood will permit noninvasive prenatal screening for genetic defects. Applied Imaging has developed and is currently evaluating a system for semiautomatic detection of fetal nucleated red blood cells on slides and acquisition of their DNA probe FISH images. The specimens are blood smears from pregnant women (9 - 16 weeks gestation) enriched for nucleated red blood cells (NRBC). The cells are identified by using labeled monoclonal antibodies directed to different types of hemoglobin chains (gamma, epsilon); the nuclei are stained with DAPI. The Applied Imaging system has been implemented with both Olympus BX and Nikon Eclipse series microscopes which were equipped with transmission and fluorescence optics. The system includes the following motorized components: stage, focus, transmission, and fluorescence filter wheels. A video camera with light integration (COHU 4910) permits low light imaging. The software capabilities include scanning, relocation, autofocusing, feature extraction, facilities for operator review, and data analysis. Detection of fetal NRBCs is achieved by employing a combination of brightfield and fluorescence images of nuclear and cytoplasmic markers. The brightfield and fluorescence images are all obtained with a single multi-bandpass dichroic mirror. A Z-stack of DNA probe FISH images is acquired by moving focus and switching excitation filters. This stack is combined to produce an enhanced image for presentation and spot counting.
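The abstract says the Z-stack of FISH images "is combined to produce an enhanced image" without giving the exact operator. A common choice for this kind of spot-counting task is a maximum-intensity projection, sketched below under that assumption; the stack dimensions and data are hypothetical.

```python
import numpy as np

def combine_z_stack(stack):
    """Collapse a FISH z-stack of shape (n_planes, H, W) into one enhanced image
    by maximum-intensity projection, so every probe spot keeps the brightness of
    the plane in which it was sharpest."""
    return stack.max(axis=0)

# Hypothetical 10-plane stack of 512x512 16-bit frames:
stack = np.random.randint(0, 4096, size=(10, 512, 512), dtype=np.uint16)
enhanced = combine_z_stack(stack)
print(enhanced.shape, enhanced.dtype)
```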
NICMOS PEERS INTO HEART OF DYING STAR
NASA Technical Reports Server (NTRS)
2002-01-01
The Egg Nebula, also known as CRL 2688, is shown on the left as it appears in visible light with the Hubble Space Telescope's Wide Field and Planetary Camera 2 (WFPC2) and on the right as it appears in infrared light with Hubble's Near Infrared Camera and Multi-Object Spectrometer (NICMOS). Since infrared light is invisible to humans, the NICMOS image has been assigned colors to distinguish different wavelengths: blue corresponds to starlight reflected by dust particles, and red corresponds to heat radiation emitted by hot molecular hydrogen. Objects like the Egg Nebula are helping astronomers understand how stars like our Sun expel carbon and nitrogen -- elements crucial for life -- into space. Studies on the Egg Nebula show that these dying stars eject matter at high speeds along a preferred axis and may even have multiple jet-like outflows. The signature of the collision between this fast-moving material and the slower outflowing shells is the glow of hydrogen molecules captured in the NICMOS image. The distance between the tip of each jet is approximately 200 times the diameter of our solar system (out to Pluto's orbit). Credits: Rodger Thompson, Marcia Rieke, Glenn Schneider, Dean Hines (University of Arizona); Raghvendra Sahai (Jet Propulsion Laboratory); NICMOS Instrument Definition Team; and NASA Image files in GIF and JPEG format and captions may be accessed on the Internet via anonymous ftp from ftp.stsci.edu in /pubinfo.
NASA Technical Reports Server (NTRS)
Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)
1991-01-01
A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinate for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.
Composite video and graphics display for camera viewing systems in robotics and teleoperation
NASA Technical Reports Server (NTRS)
Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)
1993-01-01
A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinate for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.
M33: A Close Neighbor Reveals its True Size and Splendor (3-color composite)
NASA Technical Reports Server (NTRS)
2009-01-01
One of our closest galactic neighbors shows its awesome beauty in this new image from NASA's Spitzer Space Telescope. M33, also known as the Triangulum Galaxy, is a member of what's known as our Local Group of galaxies. Along with our own Milky Way, this group travels together in the universe, as they are gravitationally bound. In fact, M33 is one of the few galaxies that is moving toward the Milky Way despite the fact that space itself is expanding, causing most galaxies in the universe to grow farther and farther apart. When viewed with Spitzer's infrared eyes, this elegant spiral galaxy sparkles with color and detail. Stars appear as glistening blue gems (several of which are actually foreground stars in our own galaxy), while dust rich in organic molecules glows green. The diffuse orange-red glowing areas indicate star-forming regions, while small red flecks outside the spiral disk of M33 are most likely distant background galaxies. But not only is this new image beautiful, it also shows M33 to be surprisingly large, bigger than its visible-light appearance would suggest. With its ability to detect cold, dark dust, Spitzer can see emission from cooler material well beyond the visible range of M33's disk. Exactly how this cold material moved outward from the galaxy is still a mystery, but winds from giant stars or supernovas may be responsible. M33 is located about 2.9 million light-years away in the constellation Triangulum. This is a three-color composite image showing infrared observations from two of Spitzer's instruments. Blue represents combined 3.6- and 4.5-micron light and green shows light of 8 microns, both captured by Spitzer's infrared array camera. Red is 24-micron light detected by Spitzer's multiband imaging photometer.
Jang, Jun-Chul; Choi, Mi-Jin; Yang, Yong-Soo; Lee, Hyung-Been; Yu, Young-Moon; Kim, Jong-Myoung
2016-06-01
To study the absorption characteristics of rhodopsin, a dim-light photoreceptor, in chub mackerel (Scomber japonicus) and the effect of light wavelength on the photoresponse, the rod opsin gene was cloned into an expression vector, pMT4. Recombinant opsin was transiently expressed in COS-1 cells and reconstituted with 11-cis-retinal. Cells containing the regenerated rhodopsin were solubilized and subjected to UV/Vis spectroscopic analysis in the dark and upon illumination. Difference spectra from the lysates indicated an absorption maximum of mackerel rhodopsin around 500 nm. Four types of light-emitting diode (LED) modules with different wavelengths (red, peak 627 nm; cyan, 505 nm; blue, 442 nm; white, 447 + 560 nm) were constructed to examine their effects on the photoresponse of chub mackerel. Behavioral responses of the mackerel, including speed and movement frequencies after acclimation in the dark and upon LED illumination, were analyzed using an underwater acoustic camera. Compared with an average speed of 22.25 ± 1.57 cm/s in the dark, the speed of mackerel movement increased to 22.97 ± 0.29, 24.66 ± 1.06, 26.28 ± 2.28, and 25.19 ± 1.91 cm/s upon exposure to red, blue, cyan, and white LEDs, respectively. This corresponds to relative speeds of 103.48 ± 1.58, 109.37 ± 5.29, 118.48 ± 10.82, and 109.43 ± 3.92 % under red, blue, cyan, and white LED illumination, respectively, compared with the dark condition (set at 100 %). A similar wavelength-dependent response was observed in a frequency analysis. These results indicate that an LED emitting a peak wavelength close to the absorption maximum of rhodopsin is more effective at eliciting a response to light.
Wolf, Lindsey L; Chowdhury, Ritam; Tweed, Jefferson; Vinson, Lori; Losina, Elena; Haider, Adil H; Qureshi, Faisal G
2017-08-01
To examine geographic variation in motor vehicle crash (MVC)-related pediatric mortality and identify state-level predictors of mortality. Using the 2010-2014 Fatality Analysis Reporting System, we identified passengers <15 years of age involved in fatal MVCs, defined as crashes on US public roads with ≥1 death (adult or pediatric) within 30 days. We assessed passenger, driver, vehicle, crash, and state policy characteristics as factors potentially associated with MVC-related pediatric mortality. Our outcomes were the age-adjusted, MVC-related mortality rate per 100 000 children and the percentage of children who died among those in fatal MVCs. The unit of analysis was the US state. We used multivariable linear regression to identify state characteristics associated with higher levels of each outcome. Of 18 116 children in fatal MVCs, 15.9% died. The age-adjusted, MVC-related mortality rate per 100 000 children varied from 0.25 in Massachusetts to 3.23 in Mississippi (mean national rate of 0.94). Predictors of a greater age-adjusted, MVC-related mortality rate per 100 000 children included a greater percentage of children who were unrestrained or inappropriately restrained (P < .001) and a greater percentage of crashes on rural roads (P = .016). Additionally, greater percentages of children died in states without red light camera legislation (P < .001). For a 10% absolute improvement in appropriate child restraint use nationally, our risk-adjusted model predicted >1100 pediatric deaths averted over 5 years. MVC-related pediatric mortality varied by state and was associated with restraint nonuse or misuse, rural roads, vehicle type, and red light camera policy. Revising state regulations and improving enforcement around these factors may prevent substantial pediatric mortality. Copyright © 2017 Elsevier Inc. All rights reserved.
Red and Green Fluorescence from Oral Biofilms.
Volgenant, Catherine M C; Hoogenkamp, Michel A; Krom, Bastiaan P; Janus, Marleen M; Ten Cate, Jacob M; de Soet, Johannes J; Crielaard, Wim; van der Veen, Monique H
2016-01-01
Red and green autofluorescence have been observed from dental plaque after excitation by blue light. It has been suggested that this red fluorescence is related to caries and the cariogenic potential of dental plaque. Recently, it was suggested that red fluorescence may be related to gingivitis. Little is known about green fluorescence from biofilms. Therefore, we assessed the dynamics of red and green fluorescence in real-time during biofilm formation. In addition, the fluorescence patterns of biofilm formed from saliva of eight different donors are described under simulated gingivitis and caries conditions. Biofilm formation was analysed for 12 hours under flow conditions in a microfluidic BioFlux flow system with high performance microscopy using a camera to allow live cell imaging. For fluorescence images dedicated excitation and emission filters were used. Both green and red fluorescence were linearly related with the total biomass of the biofilms. All biofilms displayed to some extent green and red fluorescence, with higher red and green fluorescence intensities from biofilms grown in the presence of serum (gingivitis simulation) as compared to the sucrose grown biofilms (cariogenic simulation). Remarkably, cocci with long chain lengths, presumably streptococci, were observed in the biofilms. Green and red fluorescence were not found homogeneously distributed within the biofilms: highly fluorescent spots (both green and red) were visible throughout the biomass. An increase in red fluorescence from the in vitro biofilms appeared to be related to the clinical inflammatory response of the respective saliva donors, which was previously assessed during an in vivo period of performing no-oral hygiene. The BioFlux model proved to be a reliable model to assess biofilm fluorescence. With this model, a prediction can be made whether a patient will be prone to the development of gingivitis or caries.
NASA Astrophysics Data System (ADS)
Koenig, Karsten; Schneckenburger, Herbert; Hemmer, Joerg; Tromberg, Bruce J.; Steiner, Rudolf W.
1994-05-01
Certain bacteria are able to synthesize metal-free fluorescent porphyrins and can therefore be detected by sensitive autofluorescence measurements in the red spectral region. The porphyrin-producing bacterium Propionibacterium acnes, which is involved in the pathogenesis of acne vulgaris, was localized in human skin. Spectrally resolved fluorescence images of the bacterial distribution in the face were obtained by a slow-scan CCD camera combined with a tunable liquid crystal filter. The structured autofluorescence of dental caries and dental plaque in the red is caused by oral bacteria such as Bacteroides or Actinomyces odontolyticus. 'Caries images' were created by time-gated imaging in the ns-region after ultrashort laser excitation. Time-gated measurements allow the suppression of backscattered light and non-porphyrin autofluorescence. Biopsies of oral squamous cell carcinoma exhibited red autofluorescence in necrotic regions and high concentrations of the porphyrin-producing bacterium Pseudomonas aeruginosa. These studies suggest that the temporal and spectral characteristics of bacterial autofluorescence can be used in the diagnosis and treatment of a variety of diseases.
Radiation properties of two types of luminous textile devices containing plastic optical fibers
NASA Astrophysics Data System (ADS)
Selm, Bärbel; Rothmaier, Markus
2007-05-01
Luminous textiles have the potential to satisfy a need for thin and flexible light diffusers for treatment of intraoral cancerous tissue. Plastic optical fibers (POF) with diameters of 250 microns and smaller are used to make the textiles luminous. Usually, light is supplied to the optical fiber at both ends. On the textile surface, light emission occurs in the woven structure via damaged straight POFs, whereas the embroidered structure radiates the light out of macroscopically bent POFs. We compared the optical properties of these two types of textile diffusers, using a red laser as the light source for the embroidery and a light-emitting diode (LED) for the woven structure, and found efficiencies for the luminous areas of the two samples of 19 % (woven) and 32 % (embroidery), respectively. It was shown that the efficiency can be greatly improved using an aluminium backing. Additional scattering layers lower the fluence rate by around 30 %. To analyse the homogeneity, we took a photo of the illuminated surface using a 3CCD camera and found, for both textiles, a slightly skewed distribution of the dark and bright pixels. The interquartile range of the brightness distribution of the embroidery is more than double that of the woven structure.
Red Light Represses the Photophysiology of the Scleractinian Coral Stylophora pistillata
Wijgerde, Tim; van Melis, Anne; Silva, Catarina I. F.; Leal, Miguel C.; Vogels, Luc; Mutter, Claudia; Osinga, Ronald
2014-01-01
Light spectrum plays a key role in the biology of symbiotic corals, with blue light resulting in higher coral growth, zooxanthellae density, chlorophyll a content and photosynthesis rates as compared to red light. However, it is still unclear whether these physiological processes are blue-enhanced or red-repressed. This study investigated the individual and combined effects of blue and red light on the health, zooxanthellae density, photophysiology and colouration of the scleractinian coral Stylophora pistillata over 6 weeks. Coral fragments were exposed to blue, red, and combined 50/50% blue red light, at two irradiance levels (128 and 256 μmol m−2 s−1). Light spectrum affected the health/survival, zooxanthellae density, and NDVI (a proxy for chlorophyll a content) of S. pistillata. Blue light resulted in highest survival rates, whereas red light resulted in low survival at 256 μmol m−2 s−1. Blue light also resulted in higher zooxanthellae densities compared to red light at 256 μmol m−2 s−1, and a higher NDVI compared to red and combined blue red light. Overall, our results suggest that red light negatively affects the health, survival, symbiont density and NDVI of S. pistillata, with a dominance of red over blue light for NDVI. PMID:24658108
STARING INTO THE WINDS OF DESTRUCTION: HST/NICMOS IMAGES OF THE PLANETARY NEBULA NGC 7027
NASA Technical Reports Server (NTRS)
2002-01-01
The Hubble Space Telescope's Near Infrared Camera and Multi-Object Spectrometer (NICMOS) has captured a glimpse of a brief stage in the burnout of NGC 7027, a medium-mass star like our sun. The infrared image (on the left) shows a young planetary nebula in a state of rapid transition. This image alone reveals important new information. When astronomers combine this photo with an earlier image taken in visible light, they have a more complete picture of the final stages of star life. NGC 7027 is going through spectacular death throes as it evolves into what astronomers call a 'planetary nebula.' The term planetary nebula came about not because of any real association with planets, but because in early telescopes these objects resembled the disks of planets. A star can become a planetary nebula after it depletes its nuclear fuel - hydrogen and helium - and begins puffing away layers of material. The material settles into a wind of gas and dust blowing away from the dying star. This NICMOS image captures the young planetary nebula in the middle of a very short evolutionary phase, lasting perhaps less than 1,000 years. During this phase, intense ultraviolet radiation from the central star lights up a region of gas surrounding it. (This gas is glowing brightly because it has been made very hot by the star's intense ultraviolet radiation.) Encircling this hot gas is a cloud of dust and cool molecular hydrogen gas that can only be seen by an infrared camera. The molecular gas is being destroyed by ultraviolet light from the central star. THE INFRARED VIEW -- The composite color image of NGC 7027 (on the left) is among the first data of a planetary nebula taken with NICMOS. This picture is actually composed of three separate images taken at different wavelengths. The red color represents cool molecular hydrogen gas, the most abundant gas in the universe. The image reveals the central star, which is difficult to see in images taken with visible light. Surrounding it is an elongated region of gas and dust cast off by the star. This gas (appearing as white) has a temperature of several tens of thousands of degrees Fahrenheit. The object has two 'cones' of cool molecular hydrogen gas (the red material) glowing in the infrared. The gas has been energized by ultraviolet light from the star - a process known as fluorescence. Most of the material shed by the star remains outside of the bright regions. It is invisible in this image because the layers of material in and near the bright regions are still shielding it from the central star's intense radiation. NGC 7027 is one of the smallest objects of its kind to be imaged by the Hubble telescope. However, the region seen here is approximately 14,000 times the average distance between Earth and the sun. THE INFRARED AND VISIBLE LIGHT VIEW -- This visible and infrared light picture of NGC 7027 (on the right) provides a more complete view of how this planetary nebula is being shaped, revealing steps in its evolution. This image is composed of three exposures, one from the Wide Field and Planetary Camera 2 (WFPC2) and two from NICMOS. The blue represents the WFPC2 image; the green and red, NICMOS exposures. The white is emission from the hot gas surrounding the central star; the red and pink represent emission from cool molecular hydrogen gas. In effect, the colors represent the three layers in the material ejected by the dying star. 
Each layer depicts a change in temperature, beginning with a hot, bright central region, continuing with a thin boundary zone where molecular hydrogen gas is glowing and being destroyed, and ending with a cool, blue outer region of molecular gas and dust. NICMOS has allowed astronomers to clearly see the transition layer from hot, glowing atomic gas to cold molecular gas. The origin of the newly seen filamentary structures is not yet understood. The transition region is clearly seen as the pink- and red-colored cool molecular hydrogen gas. An understanding of the atomic and chemical processes taking place in this transition region is of importance to other areas of astronomy as well, including star formation regions. WFPC2 is best used to study the hot, glowing gas, which is the bright, oval-shaped region surrounding the central star. With WFPC2 we also see material beyond this core with light from the central star that is reflecting off dust in the cold gas surrounding the nebula. Combining exposures from the two cameras allows astronomers to clearly see the way the nebula is being shaped by winds and radiation. This information will help astronomers understand the complexities of stellar evolution. NGC 7027 is located about 3,000 light-years from the sun in the direction of the constellation Cygnus the Swan. Credits: William B. Latter (SIRTF Science Center/Caltech) and NASA. Other team investigators are: J. L. Hora (Smithsonian Astrophysical Observatory), J. H. Bieging (Steward Observatory), D. M. Kelly (University of Wyoming), A. Dayal (JPL/Caltech), A.G.G.M. Tielens (University of Groningen), and S. Trammell (University of North Carolina at Charlotte).
NASA Astrophysics Data System (ADS)
Oh, Mirae; Lee, Hoonsoo; Cho, Hyunjeong; Moon, Sang-Ho; Kim, Eun-Kyung; Kim, Moon S.
2016-05-01
Current meat inspection in slaughter plants, for food safety and quality attributes including potential fecal contamination, is conducted through visual examination by human inspectors. A handheld fluorescence-based imaging device (HFID) was developed as an assistive tool for human inspectors by highlighting contaminated food and food contact surfaces on a display monitor. It can be used under ambient lighting conditions in food processing plants. Critical components of the imaging device include four 405-nm 10-W LEDs for fluorescence excitation, a charge-coupled device (CCD) camera, an optical filter (670 nm used for this study), and a Wi-Fi transmitter for broadcasting real-time video/images to monitoring devices such as smartphones and tablets. This study aimed to investigate the effectiveness of the HFID in enhancing visual detection of fecal contamination on red meat, fat, and bone surfaces of beef under varying ambient luminous intensities (0, 10, 30, 50 and 70 foot-candles). Overall, diluted feces on fat, red meat and bone areas of beef surfaces were detectable in the 670-nm single-band fluorescence images when using the HFID under 0 to 50 foot-candle ambient lighting.
Can light-field photography ease focusing on the scalp and oral cavity?
Taheri, Arash; Feldman, Steven R
2013-08-01
Capturing a well-focused image using an autofocus camera can be difficult in the oral cavity and on a hairy scalp. Light-field digital cameras capture data regarding the color, intensity, and direction of rays of light. With information on the direction of the rays, computer software can be used to focus on different subjects in the field after the image data have been captured. A light-field camera was used to capture images of the scalp and oral cavity. The related computer software was used to focus on the scalp or on different parts of the oral cavity. The final pictures were compared with pictures taken with conventional compact digital cameras. The camera worked well for the oral cavity. It also captured pictures of the scalp easily; however, we had to click repeatedly between the hairs at different points to select the scalp for focusing. A major drawback of the system was the resolution of the resulting pictures, which was lower than that of conventional digital cameras. Light-field digital cameras are fast and easy to use. They can capture more information over the full depth of field compared with conventional cameras. However, the resolution of the pictures is relatively low. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Characterization of Vegetation using the UC Davis Remote Sensing Testbed
NASA Astrophysics Data System (ADS)
Falk, M.; Hart, Q. J.; Bowen, K. S.; Ustin, S. L.
2006-12-01
Remote sensing provides information about the dynamics of the terrestrial biosphere with continuous spatial and temporal coverage on many different scales. We present the design and construction of a suite of instrument modules and network infrastructure with size, weight and power constraints suitable for small-scale vehicles, anticipating vigorous growth in unmanned aerial vehicles (UAVs) and other mobile platforms. Our approach provides rapid deployment and low-cost acquisition of aerial imagery for applications requiring high spatial resolution and frequent revisits. The testbed supports a wide range of applications, encourages remote sensing solutions in new disciplines and demonstrates the complete range of engineering knowledge required for the successful deployment of remote sensing instruments. The initial testbed is deployed on a Sig Kadet Senior remote-controlled plane. It includes an onboard computer with wireless radio, GPS, an inertial measurement unit, a 3-axis electronic compass and digital cameras. The onboard camera is either an RGB digital camera or a modified digital camera with red and NIR channels. Cameras were calibrated using selective light sources, an integrating sphere and a spectrometer, allowing for the computation of vegetation indices such as the NDVI. Field tests to date have investigated technical challenges in wireless communication bandwidth limits, automated image geolocation, and user interfaces, as well as image applications such as environmental landscape mapping focusing on Sudden Oak Death and invasive species detection, studies on the impact of bird colonies on tree canopies, and precision agriculture.
NASA Astrophysics Data System (ADS)
Bechis, K.; Pitruzzello, A.
2014-09-01
This presentation describes our ongoing research into using a ground-based light field camera to obtain passive, single-aperture 3D imagery of LEO objects. Light field cameras are an emerging and rapidly evolving technology for passive 3D imaging with a single optical sensor. The cameras use an array of lenslets placed in front of the camera focal plane, which provides angle of arrival information for light rays originating from across the target, allowing range to target and 3D image to be obtained from a single image using monocular optics. The technology, which has been commercially available for less than four years, has the potential to replace dual-sensor systems such as stereo cameras, dual radar-optical systems, and optical-LIDAR fused systems, thus reducing size, weight, cost, and complexity. We have developed a prototype system for passive ranging and 3D imaging using a commercial light field camera and custom light field image processing algorithms. Our light field camera system has been demonstrated for ground-target surveillance and threat detection applications, and this paper presents results of our research thus far into applying this technology to the 3D imaging of LEO objects. The prototype 3D imaging camera system developed by Northrop Grumman uses a Raytrix R5 C2GigE light field camera connected to a Windows computer with an nVidia graphics processing unit (GPU). The system has a frame rate of 30 Hz, and a software control interface allows for automated camera triggering and light field image acquisition to disk. Custom image processing software then performs the following steps: (1) image refocusing, (2) change detection, (3) range finding, and (4) 3D reconstruction. In Step (1), a series of 2D images are generated from each light field image; the 2D images can be refocused at up to 100 different depths. Currently, steps (1) through (3) are automated, while step (4) requires some user interaction. A key requirement for light field camera operation is that the target must be within the near-field (Fraunhofer distance) of the collecting optics. For example, in visible light the near-field of a 1-m telescope extends out to about 3,500 km, while the near-field of the AEOS telescope extends out over 46,000 km. For our initial proof of concept, we have integrated our light field camera with a 14-inch Meade LX600 advanced coma-free telescope, to image various surrogate ground targets at up to tens of kilometers range. Our experiments with the 14-inch telescope have assessed factors and requirements that are traceable and scalable to a larger-aperture system that would have the near-field distance needed to obtain 3D images of LEO objects. The next step would be to integrate a light field camera with a 1-m or larger telescope and evaluate its 3D imaging capability against LEO objects. 3D imaging of LEO space objects with light field camera technology can potentially provide a valuable new tool for space situational awareness, especially for those situations where laser or radar illumination of the target objects is not feasible.
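Step (1) of the processing chain, generating 2D images refocused at many depths from one light-field exposure, is usually implemented with a shift-and-add scheme: each sub-aperture view is translated by an amount proportional to its lenslet offset and the chosen focal parameter, then all views are averaged. The sketch below illustrates that idea on synthetic data; it is not the prototype's software, and the view grid, shift scale, and parameter names are assumptions.

```python
import numpy as np
from scipy.ndimage import shift

def refocus(views, alpha, px_per_view=1.0):
    """Shift-and-add synthetic refocusing of a light field.
    `views` maps lenslet offsets (u, v) to sub-aperture images; shifting each view
    by px_per_view*(1 - 1/alpha) pixels per unit offset and averaging brings a
    different depth plane into focus (alpha = 1 keeps the original focal plane)."""
    first = next(iter(views.values()))
    acc = np.zeros(first.shape, dtype=float)
    d = px_per_view * (1.0 - 1.0 / alpha)
    for (u, v), img in views.items():
        acc += shift(img.astype(float), (v * d, u * d), order=1, mode='nearest')
    return acc / len(views)

# Synthetic 3x3 grid of 64x64 sub-aperture views (placeholder data):
views = {(u, v): np.random.rand(64, 64) for u in (-1, 0, 1) for v in (-1, 0, 1)}
refocus_stack = [refocus(views, a) for a in (0.8, 1.0, 1.25)]   # small refocus stack
print(len(refocus_stack), refocus_stack[0].shape)
```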
Light field rendering with omni-directional camera
NASA Astrophysics Data System (ADS)
Todoroki, Hiroshi; Saito, Hideo
2003-06-01
This paper presents an approach to capturing the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by building a light field with an omni-directional camera, which can capture a wide field of view. The omni-directional camera used in this technique is a special camera with a hyperbolic mirror mounted above the lens, so that it can capture the luminosity of the environment over 360 degrees of azimuth in a single image. We apply the light field method, a technique of image-based rendering (IBR), to generate the arbitrary viewpoint images. The light field is a kind of database that records the luminosity information in the object space. We employ the omni-directional camera for constructing the light field, so that we can collect images of many view directions in the light field. Thus our method allows the user to explore a wide scene, achieving a realistic representation of the virtual environment. To demonstrate the proposed method, we capture an image sequence of our lab's interior environment with an omni-directional camera, and successfully generate arbitrary viewpoint images for a virtual tour of the environment.
MISR Global Images See the Light of Day
NASA Technical Reports Server (NTRS)
2002-01-01
As of July 31, 2002, global multi-angle, multi-spectral radiance products are available from the MISR instrument aboard the Terra satellite. Measuring the radiative properties of different types of surfaces, clouds and atmospheric particulates is an important step toward understanding the Earth's climate system. These images are among the first planet-wide summary views to be publicly released from the Multi-angle Imaging SpectroRadiometer experiment. Data for these images were collected during the month of March 2002, and each pixel represents monthly-averaged daylight radiances from an area measuring 1/2 degree in latitude by 1/2 degree in longitude. The top panel is from MISR's nadir (vertical-viewing) camera and combines data from the red, green and blue spectral bands to create a natural color image. The central view combines near-infrared, red, and green spectral data to create a false-color rendition that enhances highly vegetated terrain. It takes 9 days for MISR to view the entire globe, and only areas within 8 degrees of latitude of the north and south poles are not observed due to the Terra orbit inclination. Because a single pole-to-pole swath of MISR data is just 400 kilometers wide, multiple swaths must be mosaiced to create these global views. Discontinuities appear in some cloud patterns as a consequence of changes in cloud cover from one day to another. The lower panel is a composite in which red, green, and blue radiances from MISR's 70-degree forward-viewing camera are displayed in the northern hemisphere, and radiances from the 70-degree backward-viewing camera are displayed in the southern hemisphere. At the March equinox (spring in the northern hemisphere, autumn in the southern hemisphere), the Sun is near the equator. Therefore, both oblique angles are observing the Earth in 'forward scattering', particularly at high latitudes. Forward scattering occurs when you (or MISR) observe an object with the Sun at a point in the sky that is in front of you. Relative to the nadir view, this geometry accentuates the appearance of polar clouds, and can even reveal clouds that are invisible in the nadir direction. In relatively clear ocean areas, the oblique-angle composite is generally brighter than its nadir counterpart due to enhanced reflection of light by atmospheric particulates. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buie, Marc W.; Young, Eliot F.; Young, Leslie A.
We present new light-curve measurements of Pluto and Charon taken with the Advanced Camera for Surveys High-resolution Camera on the Hubble Space Telescope. The observations were collected from 2002 June to 2003 June at 12 distinct sub-Earth longitudes over a range of solar phase angle 0.36°-1.74°, a larger range than previously measured. The new measurements of Pluto show that the light-curve amplitude has decreased since the mutual event season in the late 1980s. We also show that the average brightness has increased in the F555W (Johnson V equivalent) passband while the brightness has decreased in the F435W (Johnson B equivalent) passband. These data thus indicate a substantial reddening of the reflected light from Pluto. We find a weighted mean (B - V) = 0.9540 ± 0.0010 that is considerably higher than the long-standing value of (B - V) = 0.868 ± 0.003 most recently measured in 1992-1993. This change in color cannot be explained by the evolving viewing geometry and provides the strongest evidence to date for temporal changes on the surface of Pluto that are expected to be linked to volatile transport processes. We also report on the discovery of a new rotational modulation of Pluto's hemispherical color that ranges from 0.92 to 0.98 with the least red color at the longitude of maximum light and most red at minimum light. The phase coefficient of Pluto is nearly the same as measured in 1992-1993 with a value of β_B = 0.0392 ± 0.0064 and β_V = 0.0355 ± 0.0045 mag deg⁻¹ for the F435W and F555W data, respectively. The Pluto phase curve is still very close to linear but a small but significant nonlinearity is seen in the data. In contrast, the light curve of Charon is essentially the same as in 1992/1993, albeit with much less noise. We confirm that Charon's Pluto-facing hemisphere is 8% brighter than the hemisphere facing away from Pluto. The color of Charon is independent of longitude and has a mean weighted value of (B - V) = 0.7315 ± 0.0013. The phase curve for Charon is now shown to be strongly nonlinear and wavelength dependent. We present results for both Pluto and Charon that better constrain the single-particle scattering parameters from the Hapke scattering theory.
Time-of-flight depth image enhancement using variable integration time
NASA Astrophysics Data System (ADS)
Kim, Sun Kwon; Choi, Ouk; Kang, Byongmin; Kim, James Dokyoon; Kim, Chang-Yeong
2013-03-01
Time-of-Flight (ToF) cameras are used for a variety of applications because they deliver depth information at a high frame rate. These cameras, however, suffer from challenging problems such as noise and motion artifacts. To increase the signal-to-noise ratio (SNR), the camera should calculate a distance based on a large amount of infra-red light, which needs to be integrated over a long time. On the other hand, the integration time should be short enough to suppress motion artifacts. We propose a ToF depth imaging method that combines the advantages of short and long integration times by exploiting an image fusion scheme originally proposed for color imaging. To calibrate depth differences due to the change of integration times, a depth transfer function is estimated by analyzing the joint histogram of depths in the two images of different integration times. The depth images are then transformed into wavelet domains and fused into a depth image with suppressed noise and low motion artifacts. To evaluate the proposed method, we captured a moving bar of a metronome with different integration times. The experiment shows the proposed method could effectively remove the motion artifacts while preserving high SNR, comparable to that of depth images acquired during a long integration time.
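The depth transfer function described above maps depths measured with the short integration time onto the depth scale of the long-integration image. A minimal sketch of one way to estimate such a mapping from the joint histogram (a simplified stand-in for the authors' method; the bin count and the per-bin argmax rule are assumptions):

    import numpy as np

    def estimate_depth_transfer(depth_short, depth_long, n_bins=256):
        """Estimate a per-bin lookup table mapping short-integration depths to
        long-integration depths, using the mode of the joint histogram."""
        d_min = min(depth_short.min(), depth_long.min())
        d_max = max(depth_short.max(), depth_long.max())
        hist, xedges, yedges = np.histogram2d(
            depth_short.ravel(), depth_long.ravel(),
            bins=n_bins, range=[[d_min, d_max], [d_min, d_max]])
        centres = 0.5 * (yedges[:-1] + yedges[1:])
        lut = centres[np.argmax(hist, axis=1)]   # most likely long depth per short-depth bin
        idx = np.clip(np.digitize(depth_short, xedges) - 1, 0, n_bins - 1)
        return lut[idx]                          # short-integration image on the long-depth scale

After this calibration step the two depth images can be decomposed (for example with a wavelet transform) and fused coefficient by coefficient, as the abstract describes.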
Uele River, Cleared Pasture Lands, Zaire, Africa
1992-05-16
STS049-91-079 (7 - 16 May 1992) --- This 70mm frame, photographed from the Earth-orbiting Space Shuttle Endeavour, features a dendritic drainage pattern in Zaire. Cleared pasture land shows light green in this color photograph, in contrast to the dark, closed-canopy forest of Zaire. Remnant woodland along minor streams indicates the intricate drainage network of this hilly region. Scattered vegetation-free spots show the deep red, tropical soil of the region. The sediment-laden stream is the Uele River just west of the village of Niangara. A crew member used a 70mm handheld Hasselblad camera with a 250mm lens to record the image.
Far-red light is needed for efficient photochemistry and photosynthesis.
Zhen, Shuyang; van Iersel, Marc W
2017-02-01
The efficiency of monochromatic light to drive photosynthesis drops rapidly at wavelengths longer than 685 nm. The photosynthetic efficiency of these longer wavelengths can be improved by adding shorter wavelength light, a phenomenon known as the Emerson enhancement effect. The reverse effect, the enhancement of photosynthesis under shorter wavelength light by longer wavelengths, however, has not been well studied and is often thought to be insignificant. We quantified the effect of adding far-red light (peak at 735 nm) to red/blue or warm-white light on the photosynthetic efficiency of lettuce (Lactuca sativa). Adding far-red light immediately increased quantum yield of photosystem II (ΦPSII) of lettuce by an average of 6.5 and 3.6% under red/blue and warm-white light, respectively. Similar or greater increases in ΦPSII were observed after 20 min of exposure to far-red light. This longer-term effect of far-red light on ΦPSII was accompanied by a reduction in non-photochemical quenching of fluorescence (NPQ), indicating that far-red light reduced the dissipation of absorbed light as heat. The increase in ΦPSII and complementary decrease in NPQ is presumably due to preferential excitation of photosystem I (PSI) by far-red light, which leads to faster re-oxidization of the plastoquinone pool. This facilitates reopening of PSII reaction centers, enabling them to use absorbed photons more efficiently. The increase in ΦPSII by far-red light was associated with an increase in net photosynthesis (Pn). The stimulatory effect of far-red light increased asymptotically with increasing amounts of far-red. Overall, our results show that far-red light can increase the photosynthetic efficiency of shorter wavelength light that over-excites PSII. Copyright © 2016 Elsevier GmbH. All rights reserved.
Spectrally balanced chromatic landing approach lighting system
NASA Technical Reports Server (NTRS)
Chase, W. D. (Inventor)
1981-01-01
Red warning lights delineate the runway approach with additional blue lights juxtaposed with the red lights such that the red lights are chromatically balanced. The red/blue point light sources result in the phenomenon that the red lights appear in front of the blue lights with about one and one-half times the diameter of the blue. To a pilot observing these lights along a glide path, those red lights directly below appear to be nearer than the blue lights. For those lights farther away seen in perspective at oblique angles, the red lights appear to be in a position closer to the pilot and hence appear to be above the corresponding blue lights. This produces a very pronounced three dimensional effect referred to as chromostereopsis which provides valuable visual cues to enable the pilot to perceive his actual position above the ground and the actual distance to the runway.
Relating transverse ray error and light fields in plenoptic camera images
NASA Astrophysics Data System (ADS)
Schwiegerling, Jim; Tyo, J. Scott
2013-09-01
Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. The camera image is focused onto the lenslet array. The lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The resultant image is an array of circular exit pupil images, each corresponding to the overlying lenslet. The position of the lenslet encodes the spatial information of the scene, whereas the sensor pixels encode the angular information for light incident on the lenslet. The 4D light field is therefore described by the 2D spatial information and 2D angular information captured by the plenoptic camera. In aberration theory, the transverse ray error relates the pupil coordinates of a given ray to its deviation from the ideal image point in the image plane and is consequently a 4D function as well. We demonstrate a technique for modifying the traditional transverse ray error equations to recover the 4D light field of a general scene. In the case of a well corrected optical system, this light field is easily related to the depth of various objects in the scene. Finally, the effects of sampling with both the lenslet array and the camera sensor on the 4D light field data are analyzed to illustrate the limitations of such systems.
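Because each lenslet image samples the exit pupil, the raw sensor image can be re-indexed into the 4D light field L(s, t, u, v) described above. A minimal sketch assuming an idealised, axis-aligned square lenslet grid (real cameras first require calibration of lenslet centres, rotation and vignetting; the pitch value below is illustrative):

    import numpy as np

    def raw_to_lightfield(raw, lenslet_px):
        """Reshape a raw plenoptic image into L[s, t, u, v]: (s, t) index the
        lenslet (spatial coordinate), (u, v) the pixel under it (pupil/angular
        coordinate)."""
        H, W = raw.shape
        S, T = H // lenslet_px, W // lenslet_px
        raw = raw[:S * lenslet_px, :T * lenslet_px]       # drop partial lenslets
        return raw.reshape(S, lenslet_px, T, lenslet_px).transpose(0, 2, 1, 3)

    lf = raw_to_lightfield(np.random.rand(400, 600), lenslet_px=8)
    print(lf.shape)  # (50, 75, 8, 8)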
NASA Technical Reports Server (NTRS)
2005-01-01
[Figures removed for brevity, see original site: Figure 1, Stellar Snowflake Cluster combined image; Figure 2, infrared array camera; Figure 3, multiband imaging photometer] Newborn stars, hidden behind thick dust, are revealed in this image of a section of the Christmas Tree cluster from NASA's Spitzer Space Telescope, created in joint effort between Spitzer's infrared array camera and multiband imaging photometer instruments. The newly revealed infant stars appear as pink and red specks toward the center of the combined image (fig. 1). The stars appear to have formed in regularly spaced intervals along linear structures in a configuration that resembles the spokes of a wheel or the pattern of a snowflake. Hence, astronomers have nicknamed this the 'Snowflake' cluster. Star-forming clouds like this one are dynamic and evolving structures. Since the stars trace the straight line pattern of spokes of a wheel, scientists believe that these are newborn stars, or 'protostars.' At a mere 100,000 years old, these infant structures have yet to 'crawl' away from their location of birth. Over time, the natural drifting motions of each star will break this order, and the snowflake design will be no more. While most of the visible-light stars that give the Christmas Tree cluster its name and triangular shape do not shine brightly in Spitzer's infrared eyes, all of the stars forming from this dusty cloud are considered part of the cluster. Like a dusty cosmic finger pointing up to the newborn clusters, Spitzer also illuminates the optically dark and dense Cone nebula, the tip of which can be seen towards the bottom left corner of each image. This combined image shows the presence of organic molecules mixed with dust as wisps of green, which have been illuminated by nearby star formation. The larger yellowish dots neighboring the baby red stars in the Snowflake Cluster are massive stellar infants forming from the same cloud. The blue dots sprinkled across the image represent older Milky Way stars at various distances along this line of sight. This image is a five-channel, false-color composite, showing emission from wavelengths of 3.6 and 4.5 microns (blue), 5.8 microns (cyan), 8 microns (green), and 24 microns (red). The top right (fig. 2) image from the infrared array camera shows that the nebula is still actively forming stars. The wisps of red (represented as green in the combined image) are organic molecules mixed with dust, which has been illuminated by nearby star formation. The infrared array camera picture is a four-channel, false-color composite, showing emission from wavelengths of 3.6 microns (blue), 4.5 microns (green), 5.8 microns (orange) and 8.0 microns (red). The bottom right image (fig. 3) from the multiband imaging photometer shows the colder dust of the nebula and unwraps the youngest stellar babies from their dusty covering. This is a false-color image showing emission at 24 microns (red).
1986-01-17
Range: 9.1 million kilometers (5.7 million miles) P-29478C These two pictures of Uranus, one in true color and the other in false color, were shot by Voyager 2's narrow-angle camera. The picture at left has been processed to show Uranus as the human eye would see it from the vantage point of the spacecraft. The image is a composite of shots taken through blue, green, and orange filters. The darker shadings on the upper right of the disk correspond to day-night boundaries on the planet. Beyond this boundary lies the hidden northern hemisphere of Uranus, which currently remains in total darkness as the planet rotates. The blue-green color results from the absorption of red light by methane gas in Uranus' deep, cold, and remarkably clear atmosphere. The picture at right uses false color and extreme contrast to bring out subtle details in the polar region of Uranus. Images obtained through ultraviolet, violet, and orange filters were respectively converted to the same blue, green, and red colors used to produce the picture at left. The very slight contrasts visible in true color are greatly exaggerated here. In this false-color picture, Uranus reveals a dark polar hood surrounded by a series of progressively lighter concentric bands. One possible explanation is that a brownish haze or smog, concentrated around the pole, is arranged into bands by zonal motions of the upper atmosphere. Several artifacts of the optics and processing are visible. The occasional donut shapes are shadows cast by dust in the camera optics; the processing needed to bring out faint features also brings out camera blemishes. In addition, the bright pink strip at the lower edge of the planet's limb is an artifact of the image enhancement. In fact, the limb is dark and uniform in color around the planet.
What's Old is New in the Large Magellanic Cloud
NASA Technical Reports Server (NTRS)
2006-01-01
[Figure removed for brevity, see original site: poster version, Large Magellanic Cloud] This vibrant image from NASA's Spitzer Space Telescope shows the Large Magellanic Cloud, a satellite galaxy to our own Milky Way galaxy. The infrared image, a mosaic of 300,000 individual tiles, offers astronomers a unique chance to study the lifecycle of stars and dust in a single galaxy. Nearly one million objects are revealed for the first time in this Spitzer view, which represents about a 1,000-fold improvement in sensitivity over previous space-based missions. Most of the new objects are dusty stars of various ages populating the Large Magellanic Cloud; the rest are thought to be background galaxies. The blue color in the picture, seen most prominently in the central bar, represents starlight from older stars. The chaotic, bright regions outside this bar are filled with hot, massive stars buried in thick blankets of dust. The red color around these bright regions is from dust heated by stars, while the red dots scattered throughout the picture are either dusty, old stars or more distant galaxies. The greenish clouds contain cooler interstellar gas and molecular-sized dust grains illuminated by ambient starlight. Astronomers say this image allows them to quantify the process by which space dust -- the same stuff that makes up planets and even people -- is recycled in a galaxy. The picture shows dust at its three main cosmic hangouts: around the young stars, where it is being consumed (red-tinted, bright clouds); scattered about in the space between stars (greenish clouds); and in expelled shells of material from old stars (randomly-spaced red dots). The Large Magellanic Cloud, located 160,000 light-years from Earth, is one of a handful of dwarf galaxies that orbit our own Milky Way. It is approximately one-third as wide as the Milky Way, and, if it could be seen in its entirety, would cover the same amount of sky as a grid of about 480 full moons. About one-third of the entire galaxy can be seen in the Spitzer image. This picture is a composite of infrared light captured by Spitzer. Light with wavelengths of 3.6 (blue) and 8 (green) microns was captured by the telescope's infrared array camera; 24-micron light (red) was detected by the multiband imaging photometer.
What's Old is New in the Large Magellanic Cloud
2006-09-01
This vibrant image from NASA's Spitzer Space Telescope shows the Large Magellanic Cloud, a satellite galaxy to our own Milky Way galaxy. The infrared image, a mosaic of 300,000 individual tiles, offers astronomers a unique chance to study the lifecycle of stars and dust in a single galaxy. Nearly one million objects are revealed for the first time in this Spitzer view, which represents about a 1,000-fold improvement in sensitivity over previous space-based missions. Most of the new objects are dusty stars of various ages populating the Large Magellanic Cloud; the rest are thought to be background galaxies. The blue color in the picture, seen most prominently in the central bar, represents starlight from older stars. The chaotic, bright regions outside this bar are filled with hot, massive stars buried in thick blankets of dust. The red color around these bright regions is from dust heated by stars, while the red dots scattered throughout the picture are either dusty, old stars or more distant galaxies. The greenish clouds contain cooler interstellar gas and molecular-sized dust grains illuminated by ambient starlight. Astronomers say this image allows them to quantify the process by which space dust -- the same stuff that makes up planets and even people -- is recycled in a galaxy. The picture shows dust at its three main cosmic hangouts: around the young stars, where it is being consumed (red-tinted, bright clouds); scattered about in the space between stars (greenish clouds); and in expelled shells of material from old stars (randomly-spaced red dots). The Large Magellanic Cloud, located 160,000 light-years from Earth, is one of a handful of dwarf galaxies that orbit our own Milky Way. It is approximately one-third as wide as the Milky Way, and, if it could be seen in its entirety, would cover the same amount of sky as a grid of about 480 full moons. About one-third of the entire galaxy can be seen in the Spitzer image. This picture is a composite of infrared light captured by Spitzer. Light with wavelengths of 3.6 (blue) and 8 (green) microns was captured by the telescope's infrared array camera; 24-micron light (red) was detected by the multiband imaging photometer. http://photojournal.jpl.nasa.gov/catalog/PIA07137
2017-12-08
Hubble’s Spirograph
In this classic Hubble image from 2000, the planetary nebula IC 418 glows like a multifaceted jewel with enigmatic patterns. IC 418 lies about 2,000 light-years from Earth in the direction of the constellation Lepus. A planetary nebula represents the final stage in the evolution of a star similar to our sun. The star at the center of IC 418 was a red giant a few thousand years ago, but then ejected its outer layers into space to form the nebula, which has now expanded to a diameter of about 0.1 light-year. The stellar remnant at the center is the hot core of the red giant, from which ultraviolet radiation floods out into the surrounding gas, causing it to fluoresce. Over the next several thousand years, the nebula will gradually disperse into space, and then the star will cool and fade away for billions of years as a white dwarf. Our own sun is expected to undergo a similar fate, but fortunately, this will not occur until some 5 billion years from now. The Hubble image of IC 418 is shown with colors added to represent the different camera filters used that isolate light from various chemical elements. Red shows emission from ionized nitrogen (the coolest gas in the nebula, located furthest from the hot nucleus), green shows emission from hydrogen and blue traces the emission from ionized oxygen (the hottest gas, closest to the central star). The remarkable textures seen in the nebula are newly revealed by the Hubble Space Telescope, and their origin is still uncertain. Read more: go.nasa.gov/2roofKS Credit: NASA and The Hubble Heritage Team (STScI/AURA); Acknowledgment: Dr. Raghvendra Sahai (JPL) and Dr. Arsen R. Hajian (USNO)
Dental calculus detection using the VistaCam.
Shakibaie, Fardad; Walsh, Laurence J
2016-12-01
The VistaCam® intra-oral camera system (Dürr Dental, Bietigheim-Bissingen, Germany) is a fluorescence system using light emitting diodes that produce a 405-nm violet light. This wavelength has potential application for detection of dental calculus based on red emissions from porphyrin molecules. This study assessed the digital scores obtained for both supragingival and subgingival calculus on 60 extracted teeth and compared these with lesions of dental caries. It has also examined the effect of saliva and blood on the fluorescence readings for dental calculus. VistaCam fluorescence scores for both supragingival (1.7-3.3) and subgingival calculus (1.3-2.4) were higher than those for sound root surfaces (0.9-1.1) and dental caries (0.9-2.2) (p < .05). The readings for calculus samples were not affected by the presence of saliva or blood. These results suggest that the use of violet light fluorescence could be a possible adjunct to clinical examination for deposits of dental calculus.
NASA Astrophysics Data System (ADS)
Raghavan, Ajay; Saha, Bhaskar
2013-03-01
Photo enforcement devices for traffic rules such as red lights, tolls, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, and so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classify these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes, or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced-reference image and video quality solutions to detect and classify such faults.
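To make the ordering argument concrete, here is a minimal sketch of a two-stage no-reference check that tests exposure before sharpness, so that an under- or over-exposed frame is not misreported as a focus drift. The thresholds and the use of the variance-of-Laplacian sharpness measure are illustrative assumptions, not the paper's algorithm:

    import cv2
    import numpy as np

    def check_frame_quality(gray, dark_frac=0.4, bright_frac=0.4, blur_thresh=50.0):
        """Return a list of suspected faults for one grayscale frame."""
        faults = []
        n = gray.size
        if np.count_nonzero(gray < 30) / n > dark_frac:
            faults.append("underexposed")
        elif np.count_nonzero(gray > 225) / n > bright_frac:
            faults.append("overexposed")
        else:
            # variance of the Laplacian: a common no-reference sharpness measure
            sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
            if sharpness < blur_thresh:
                faults.append("possible focus drift or obstruction")
        return faults or ["ok"]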
Trained neurons-based motion detection in optical camera communications
NASA Astrophysics Data System (ADS)
Teli, Shivani; Cahyadi, Willy Anugrah; Chung, Yeon Ho
2018-04-01
A concept of trained neurons-based motion detection (TNMD) in optical camera communications (OCC) is proposed. The proposed TNMD is based on neurons present in a neural network that perform repetitive analysis in order to provide efficient and reliable motion detection in OCC. This efficient motion detection can be considered another functionality of OCC in addition to two traditional functionalities of illumination and communication. To verify the proposed TNMD, the experiments were conducted in an indoor static downlink OCC, where a mobile phone front camera is employed as the receiver and an 8 × 8 red, green, and blue (RGB) light-emitting diode array as the transmitter. The motion is detected by observing the user's finger movement in the form of centroid through the OCC link via a camera. Unlike conventional trained neurons approaches, the proposed TNMD is trained not with motion itself but with centroid data samples, thus providing more accurate detection and far less complex detection algorithm. The experiment results demonstrate that the TNMD can detect all considered motions accurately with acceptable bit error rate (BER) performances at a transmission distance of up to 175 cm. In addition, while the TNMD is performed, a maximum data rate of 3.759 kbps over the OCC link is obtained. The OCC with the proposed TNMD combined can be considered an efficient indoor OCC system that provides illumination, communication, and motion detection in a convenient smart home environment.
A GRAND VIEW OF THE BIRTH OF 'HEFTY' STARS - 30 DORADUS NEBULA DETAILS
NASA Technical Reports Server (NTRS)
2002-01-01
These are two views of a highly active region of star birth located northeast of the central cluster, R136, in 30 Doradus. The orientation and scale are identical for both views. The top panel is a composite of images in two colors taken with the Hubble Space Telescope's visible-light camera, the Wide Field and Planetary Camera 2 (WFPC2). The bottom panel is a composite of pictures taken through three infrared filters with Hubble's Near Infrared Camera and Multi-Object Spectrometer (NICMOS). In both cases the colors of the displays were chosen to correlate with the nebula's and stars' true colors. Seven very young objects are identified with numbered arrows in the infrared image. Number 1 is a newborn, compact cluster dominated by a triple system of 'hefty' stars. It has formed within the head of a massive dust pillar pointing toward R136. The energetic outflows from R136 have shaped the pillar and triggered the collapse of clouds within its summit to form the new stars. The radiation and outflows from these new stars have in turn blown off the top of the pillar, so they can be seen in the visible-light as well as the infrared image. Numbers 2 and 3 also pinpoint newborn stars or stellar systems inside an adjacent, bright-rimmed pillar, likewise oriented toward R136. These objects are still immersed within their natal dust and can be seen only as very faint, red points in the visible-light image. They are, however, among the brightest objects in the infrared image, since dust does not block infrared light as much as visible light. Thus, numbers 2 and 3 and number 1 correspond respectively to two successive stages in the birth of massive stars. Number 4 is a very red star that has just formed within one of several very compact dust clouds nearby. Number 5 is another very young triple-star system with a surrounding cluster of fainter stars. They also can be seen in the visible-light picture. Most remarkable are the glowing patches numbered 6 and 7, which astronomers have interpreted as 'impact points' produced by twin jets of material slamming into surrounding dust clouds. These 'impact points' are perfectly aligned on opposite sides of number 5 (the triple-star system), and each is separated from the star system by about 5 light-years. The jets probably originate from a circumstellar disk around one of the young stars in number 5. They may be rotating counterclockwise, thus producing moving, luminous patches on the surrounding dust, like a searchlight creating spots on clouds. These infrared patches produced by jets from a massive, young star are a new astronomical phenomenon. Credits for NICMOS image: NASA/Nolan Walborn (Space Telescope Science Institute, Baltimore, Md.) and Rodolfo Barba' (La Plata Observatory, La Plata, Argentina) Credits for WFPC2 image: NASA/John Trauger (Jet Propulsion Laboratory, Pasadena, Calif.) and James Westphal (California Institute of Technology, Pasadena, Calif.)
NASA Technical Reports Server (NTRS)
2003-01-01
Dark smoke from oil fires extends for about 60 kilometers south of Iraq's capital city of Baghdad in these images acquired by the Multi-angle Imaging SpectroRadiometer (MISR) on April 2, 2003. The thick, almost black smoke is apparent near image center and contains chemical and particulate components hazardous to human health and the environment. The top panel is from MISR's vertical-viewing (nadir) camera. Vegetated areas appear red here because this display is constructed using near-infrared, red and blue band data, displayed as red, green and blue, respectively, to produce a false-color image. The bottom panel is a combination of two camera views of the same area and is a 3-D stereo anaglyph in which red band nadir camera data are displayed as red, and red band data from the 60-degree backward-viewing camera are displayed as green and blue. Both panels are oriented with north to the left in order to facilitate stereo viewing. Viewing the 3-D anaglyph with red/blue glasses (with the red filter placed over the left eye and the blue filter over the right) makes it possible to see the rising smoke against the surface terrain. This technique helps to distinguish features in the atmosphere from those on the surface. In addition to the smoke, several high, thin cirrus clouds (barely visible in the nadir view) are readily observed using the stereo image. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 17489. The panels cover an area of about 187 kilometers x 123 kilometers, and use data from blocks 63 to 65 within World Reference System-2 path 168. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Sturm, Sabine; Engelken, Johannes; Gruber, Ansgar; Vugrinec, Sascha; Kroth, Peter G; Adamska, Iwona; Lavaud, Johann
2013-07-30
Light, the driving force of photosynthesis, can be harmful when present in excess; therefore, any light harvesting system requires photoprotection. Members of the extended light-harvesting complex (LHC) protein superfamily are involved in light harvesting as well as in photoprotection and are found in the red and green plant lineages, with a complex distribution pattern of subfamilies in the different algal lineages. Here, we demonstrate that the recently discovered "red lineage chlorophyll a/b-binding-like proteins" (RedCAPs) form a monophyletic family within this protein superfamily. The occurrence of RedCAPs was found to be restricted to the red algal lineage, including red algae (with primary plastids) as well as cryptophytes, haptophytes and heterokontophytes (with secondary plastids of red algal origin). Expression of a full-length RedCAP:GFP fusion construct in the diatom Phaeodactylum tricornutum confirmed the predicted plastid localisation of RedCAPs. Furthermore, we observed that, similarly to the fucoxanthin chlorophyll a/c-binding light-harvesting antenna proteins, RedCAP transcripts in diatoms were also regulated in a diurnal way at standard light conditions and strongly repressed at high light intensities. The absence of RedCAPs from the green lineage implies that RedCAPs evolved in the red lineage after separation from the green lineage. During the evolution of secondary plastids, RedCAP genes therefore must have been transferred from the nucleus of the endocytobiotic alga to the nucleus of the host cell, a process that involved complementation with pre-sequences allowing import of the gene product into the secondary plastid bound by four membranes. Based on light-dependent transcription and on localisation data, we propose that RedCAPs might participate in the light (intensity and quality)-dependent structural or functional reorganisation of the light-harvesting antennae of the photosystems upon dark to light shifts as regularly experienced by diatoms in nature. Remarkably, in plastids of the red lineage as well as in green lineage plastids, the phycobilisome based cyanobacterial light harvesting system has been replaced by light harvesting systems that are based on members of the extended LHC protein superfamily, either for one of the photosystems (PS I of red algae) or for both (diatoms). In their proposed function, the RedCAP protein family may thus have played a role in the evolutionary structural remodelling of light-harvesting antennae in the red lineage.
Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor.
Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung
2018-03-23
Recent developments in intelligence surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras have utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of background cause detection to be a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works.
Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor
Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung
2018-01-01
Recent developments in intelligence surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras have utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of background cause detection to be a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works. PMID:29570690
NASA Astrophysics Data System (ADS)
Barkhouser, Robert H.; Arns, James; Gunn, James E.
2014-08-01
The Prime Focus Spectrograph (PFS) is a major instrument under development for the 8.2 m Subaru telescope on Mauna Kea. Four identical, fixed spectrograph modules are located in a room above one Nasmyth focus. A 55 m fiber optic cable feeds light into the spectrographs from a robotic fiber positioner mounted at the telescope prime focus, behind the wide field corrector developed for Hyper Suprime-Cam. The positioner contains 2400 fibers and covers a 1.3 degree hexagonal field of view. Each spectrograph module will be capable of simultaneously acquiring 600 spectra. The spectrograph optical design consists of a Schmidt collimator, two dichroic beamsplitters to separate the light into three channels, and for each channel a volume phase holographic (VPH) grating and a dual-corrector, modified Schmidt reimaging camera. This design provides a 275 mm collimated beam diameter, wide simultaneous wavelength coverage from 380 nm to 1.26 µm, and good imaging performance at the fast f/1.1 focal ratio required from the cameras to avoid oversampling the fibers. The three channels are designated as the blue, red, and near-infrared (NIR), and cover the bandpasses 380-650 nm (blue), 630-970 nm (red), and 0.94-1.26 µm (NIR). A mosaic of two Hamamatsu 2k×4k, 15 µm pixel CCDs records the spectra in the blue and red channels, while the NIR channel employs a 4k×4k, substrate-removed HAWAII-4RG array from Teledyne, with 15 µm pixels and a 1.7 µm wavelength cutoff. VPH gratings have become the dispersing element of choice for moderate-resolution astronomical spectrographs due to their potential for very high diffraction efficiency, low scattered light, and the more compact instrument designs offered by transmissive dispersers. High quality VPH gratings are now routinely being produced in the sizes required for instruments on large telescopes. These factors made VPH gratings an obvious choice for PFS. In order to reduce risk to the project, as well as fully exploit the performance potential of this technology, a set of three prototype VPH gratings (one each of the blue, red, and NIR designs) was ordered and has been recently delivered. The goal for these prototype units, but not a requirement, was to meet the specifications for the final gratings in order to serve as spares and also as early demonstration and integration articles. In this paper we present the design and specifications for the PFS gratings, the plan and setups used for testing both the prototype and final gratings, and results from recent optical testing of the prototype grating set.
Gene profiling of the red light signalling pathways in roots.
Molas, Maria Lia; Kiss, John Z; Correll, Melanie J
2006-01-01
Red light, acting through the phytochromes, controls numerous aspects of plant development. Many of the signal transduction elements downstream of the phytochromes have been identified in the aerial portions of the plant; however, very few elements in red-light signalling have been identified specifically for roots. Gene profiling studies using microarrays and quantitative Real-Time PCR were performed to characterize gene expression changes in roots of Arabidopsis seedlings exposed to 1 h of red light. Several factors acting downstream of phytochromes in red-light signalling in roots were identified. Some of the genes found to be differentially expressed in this study have already been characterized in the red-light-signalling pathway for whole plants. For example, PHYTOCHROME KINASE 1 (PKS1), LONG HYPOCOTYL 5 (HY5), EARLY FLOWERING 4 (ELF4), and GIGANTEA (GI) were all significantly up-regulated in roots of seedlings exposed to 1 h of red light. The up-regulation of SUPPRESSOR OF PHYTOCHROME A RESPONSES 1 (SPA1) and CONSTITUTIVE PHOTOMORPHOGENIC 1-like (COP1-like) genes suggests that the PHYA-mediated pathway was attenuated by red light. In addition, genes involved in lateral root and root hair formation, root plastid development, phenylpropanoid metabolism, and hormone signalling were also regulated by exposure to red light. Interestingly, members of the RPT2/NPH3 (ROOT PHOTOTROPIC 2/NON PHOTOTROPIC HYPOCOTYL 3) family, which have been shown to mediate blue-light-induced phototropism, were also differentially regulated in roots in red light. Therefore, these results suggest that red and blue light pathways interact in roots of seedlings and that many elements involved in red-light-signalling found in the aerial portions of the plant are differentially expressed in roots within 1 h of red light exposure.
NASA Astrophysics Data System (ADS)
Mehrübeoğlu, Mehrübe; McLauchlan, Lifford
2006-02-01
The goal of this project was to detect the intensity of traffic on a road at different times of the day during daytime. Although the work presented utilized images from a section of a highway, the results of this project are intended for making decisions on the type of intervention necessary on any given road at different times for traffic control, such as installation of traffic signals, duration of red, green and yellow lights at intersections, and assignment of traffic control officers near school zones or other relevant locations. In this project, directional patterns are used to detect and count the number of cars in traffic images over a fixed area of the road to determine local traffic intensity. Directional patterns are chosen because they are simple and common to almost all moving vehicles. Perspective vision effects specific to each camera orientation have to be considered, as they affect the size and direction of patterns to be recognized. In this work, a simple and fast algorithm has been developed based on horizontal directional pattern matching and perspective vision adjustment. The results of the algorithm under various conditions are presented and compared in this paper. Using the developed algorithm, the traffic intensity can accurately be determined on clear days with average sized cars. The accuracy is reduced on rainy days when the camera lens contains raindrops, when there are very long vehicles, such as trucks or tankers, in the view, and when there is very low light around dusk or dawn.
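As a rough illustration of counting vehicles from horizontal directional patterns in a fixed road region, here is a simplified sketch, not the authors' algorithm: strong horizontal edges are detected, grouped into connected regions, and a per-row perspective factor scales the minimum region size so that distant, smaller vehicles still count. All thresholds and the scaling scheme are assumptions.

    import numpy as np
    from scipy import ndimage

    def count_vehicles(gray_roi, row_scale, edge_thresh=40, min_area_px=150):
        """Approximate vehicle count in a road region of interest.
        row_scale[r] > 1 near the top of the image compensates for perspective."""
        # horizontal directional pattern: strong intensity change between adjacent rows
        edges = np.abs(np.diff(gray_roi.astype(np.float64), axis=0)) > edge_thresh
        labels, _ = ndimage.label(edges)
        count = 0
        for sl in ndimage.find_objects(labels):
            area = np.count_nonzero(labels[sl])
            if area * row_scale[sl[0].start] >= min_area_px:
                count += 1
        return count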
Compact whole-body fluorescent imaging of nude mice bearing EGFP expressing tumor
NASA Astrophysics Data System (ADS)
Chen, Yanping; Xiong, Tao; Chu, Jun; Yu, Li; Zeng, Shaoqun; Luo, Qingming
2005-01-01
Tumor research is a major focus of current medicine. It is important for tumor research to detect tumors borne in animal models easily, quickly, repeatedly and noninvasively, and many researchers have paid increasing attention to such detection. Some contrast agents, such as green fluorescent protein (GFP) and Discosoma red fluorescent protein (DsRed), have been applied to enhance image quality. Three main kinds of imaging scheme have been adopted to visualize fluorescent protein-expressing tumors in vivo. These schemes are based on a fluorescence stereo microscope, a cooled charge-coupled device (CCD) or camera as the imaging device, and a laser or mercury lamp as the excitation light source. Fluorescence stereo microscopes, lasers and cooled CCDs are expensive for many institutes. The authors set up an inexpensive compact whole-body fluorescent imaging tool, which consisted of a Kodak digital camera (model DC290), fluorescence filters (B and G2; HB Optical, Shenyang, Liaoning, P.R. China) and a mercury 50-W lamp power supply (U-LH50HG; Olympus Optical, Japan) as the excitation light source. The EGFP was excited directly by the mercury lamp with a D455/70 nm band-pass filter, and fluorescence was recorded by the digital camera with a 520 nm long-pass filter. With this easily operated tool, the authors imaged fluorescent tumors growing in live mice in real time. The imaging system is external and noninvasive. Over half a year, our experiments suggested the imaging scheme was feasible. Whole-body fluorescence optical imaging of fluorescent protein-expressing tumors in nude mice is an ideal tool for antitumor, antimetastatic, and antiangiogenesis drug screening.
2016-09-05
Saturn's rings appear to bend as they pass behind the planet's darkened limb due to refraction by Saturn's upper atmosphere. The effect is the same as that seen in an earlier Cassini view (see PIA20491), except this view looks toward the unlit face of the rings, while the earlier image viewed the rings' sunlit side. The difference in illumination brings out some noticeable differences. The A ring is much darker here, on the rings' unlit face, since its larger particles primarily reflect light back toward the sun (and away from Cassini's cameras in this view). The narrow F ring (at bottom), which was faint in the earlier image, appears brighter than all of the other rings here, thanks to the microscopic dust that is prevalent within that ring. Small dust tends to scatter light forward (meaning close to its original direction of travel), making it appear bright when backlit. (A similar effect has plagued many a driver with a dusty windshield when driving toward the sun.) This view looks toward the unilluminated side of the rings from about 19 degrees below the ring plane. The image was taken in red light with the Cassini spacecraft narrow-angle camera on July 24, 2016. The view was acquired at a distance of approximately 527,000 miles (848,000 kilometers) from Saturn and at a sun-Saturn-spacecraft, or phase, angle of 169 degrees. Image scale is 3 miles (5 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20497
Large, high resolution integrating TV sensor for astronomical applications
NASA Technical Reports Server (NTRS)
Spitzer, L. J.
1977-01-01
A magnetically focused SEC tube developed for photometric applications is described. Efforts to design a 70 mm version of the tube which meets the ST f/24 camera requirements of the space telescope are discussed. The photometric accuracy of the 70 mm tube is expected to equal that of the previously developed 35 mm tube. The tube meets the criterion of 50 percent response at 20 cycles/mm in the central region of the format, and, with the removal of the remaining magnetic parts, this spatial frequency is expected over almost all of the format. Since the ST f/24 camera requires sensitivity in the red as well as the ultraviolet and visible spectra, attempts were made to develop tubes with this ability. It was found that it may be necessary to choose between red and u.v. sensitivity and trade off red sensitivity for low background. Results of environmental tests indicate no substantive problems in utilizing it in a flight camera system that will meet the space shuttle launch requirements.
Life at the Intersection of Colliding Galaxies
2004-09-07
This false-color image from NASA's Spitzer Space Telescope reveals hidden populations of newborn stars at the heart of the colliding "Antennae" galaxies. These two galaxies, known individually as NGC 4038 and 4039, are located around 68 million light-years away and have been merging together for about the last 800 million years. The latest Spitzer observations provide a snapshot of the tremendous burst of star formation triggered in the process of this collision, particularly at the site where the two galaxies overlap. The image was taken by Spitzer's infrared array camera and is a combination of infrared light ranging from 3.6 microns (shown in blue) to 8.0 microns (shown in red). The dust emission (red) is by far the strongest feature in this image. Starlight was systematically subtracted from the longer wavelength data (red) to enhance dust features. The two nuclei, or centers, of the merging galaxies show up as white areas, one above the other. The brightest clouds of forming stars lie in the overlap region between and left of the nuclei. Throughout the sky, astronomers have identified many of these so-called "interacting" galaxies, whose spiral discs have been stretched and distorted by their mutual gravity as they pass close to one another. The distances involved are so large that the interactions evolve on timescales comparable to geologic changes on Earth. Observations of such galaxies, combined with computer models of these collisions, show that the galaxies often become forever bound to one another, eventually merging into a single, spheroidal-shaped galaxy. Wavelengths of 3.6 microns are represented in blue, 4.5 microns in green and 5.8-8.0 microns in red. This image was taken on Dec. 24, 2003. http://photojournal.jpl.nasa.gov/catalog/PIA06853
Condenser for illuminating a ringfield camera with synchrotron emission light
Sweatt, W.C.
1996-04-30
The present invention relates generally to the field of condensers for collecting light from a synchrotron radiation source and directing the light into a ringfield of a lithography camera. The present invention discloses a condenser comprising collecting, processing, and imaging optics. The collecting optics are comprised of concave and convex spherical mirrors that collect the light beams. The processing optics, which receive the light beams, are comprised of flat mirrors that converge and direct the light beams into a real entrance pupil of the camera in a symmetrical pattern. In the real entrance pupil are located flat mirrors, common to the beams emitted from the preceding mirrors, for generating substantially parallel light beams and for directing the beams toward the ringfield of a camera. Finally, the imaging optics are comprised of a spherical mirror, also common to the beams emitted from the preceding mirrors, images the real entrance pupil through the resistive mask and into the virtual entrance pupil of the camera. Thus, the condenser is comprised of a plurality of beams with four mirrors corresponding to a single beam plus two common mirrors. 9 figs.
Condenser for illuminating a ringfield camera with synchrotron emission light
Sweatt, William C.
1996-01-01
The present invention relates generally to the field of condensers for collecting light from a synchrotron radiation source and directing the light into a ringfield of a lithography camera. The present invention discloses a condenser comprising collecting, processing, and imaging optics. The collecting optics are comprised of concave and convex spherical mirrors that collect the light beams. The processing optics, which receive the light beams, are comprised of flat mirrors that converge and direct the light beams into a real entrance pupil of the camera in a symmetrical pattern. In the real entrance pupil are located flat mirrors, common to the beams emitted from the preceding mirrors, for generating substantially parallel light beams and for directing the beams toward the ringfield of a camera. Finally, the imaging optics are comprised of a spherical mirror, also common to the beams emitted from the preceding mirrors, images the real entrance pupil through the resistive mask and into the virtual entrance pupil of the camera. Thus, the condenser is comprised of a plurality of beams with four mirrors corresponding to a single beam plus two common mirrors.
Li, Tian-Jiao; Li, Sai; Yuan, Yuan; Liu, Yu-Dong; Xu, Chuan-Long; Shuai, Yong; Tan, He-Ping
2017-04-03
Plenoptic cameras are used for capturing flames in studies of high-temperature phenomena. However, simulations of plenoptic camera models can be used prior to the experiment to improve experimental efficiency and reduce cost. In this work, microlens arrays, which are based on the established light field camera model, are optimized into a hexagonal structure with three types of microlenses. With this improved plenoptic camera model, light field imaging of static objects and flames is simulated using the calibrated parameters of the Raytrix camera (R29). The optimized models improve the image resolution, imaging screen utilization, and shooting range of depth of field.
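A hexagonally packed microlens array like the one described above can be laid out by staggering alternate rows by half a pitch and cycling lens types across the grid. A small geometric sketch follows; the pitch, grid size and type-assignment rule are illustrative, not the calibrated Raytrix R29 layout:

    import numpy as np

    def hex_lenslet_centres(n_rows, n_cols, pitch):
        """Centre coordinates and a type index (0, 1, 2) for a hexagonal
        microlens array with three interleaved lens types."""
        centres, types = [], []
        row_height = pitch * np.sqrt(3) / 2            # vertical spacing of hex rows
        for r in range(n_rows):
            x_offset = pitch / 2 if r % 2 else 0.0     # stagger alternate rows
            for c in range(n_cols):
                centres.append((c * pitch + x_offset, r * row_height))
                types.append((r + c) % 3)              # cycle through the three focal types
        return np.array(centres), np.array(types)

    centres, types = hex_lenslet_centres(4, 5, pitch=0.1)
    print(centres.shape, np.bincount(types))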
Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System
NASA Astrophysics Data System (ADS)
Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki
In this paper, we present an automatic vision-based traffic sign recognition system, which can detect and classify traffic signs at long distance under different lighting conditions. To realize this purpose, the traffic sign recognition is developed in an originally proposed dual-focal active camera system. In this system, a telephoto camera is equipped as an assistant of a wide angle camera. The telephoto camera can capture a high accuracy image of an object of interest in the view field of the wide angle camera. The image from the telephoto camera provides enough information for recognition when the resolution of the traffic sign is too low in the image from the wide angle camera. In the proposed system, traffic sign detection and classification are processed separately for the different images from the wide angle camera and the telephoto camera. Besides, in order to detect traffic signs against complex backgrounds under different lighting conditions, we propose a type of color transformation which is invariant to lighting changes. This color transformation is conducted to highlight the pattern of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the image from the wide angle camera. After detection, the system actively captures a high accuracy image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on the information from the wide angle camera. Moreover, in classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high accuracy image from the telephoto camera. Finally, based on the proposed system, a set of experiments in the domain of traffic sign recognition is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs at low resolution under different lighting conditions.
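One common way to obtain a colour representation that is robust to lighting changes is to work in normalised chromaticity, which cancels a uniform scaling of brightness. The sketch below uses that generic transform to emphasise red sign pixels; it is a stand-in for the paper's own colour transformation, which is not reproduced here, and the red-minus-green rule is an assumption:

    import numpy as np

    def red_sign_map(rgb, eps=1e-6):
        """Emphasise red traffic-sign pixels using normalised chromaticity.
        rgb is an H x W x 3 array; the result is high where red dominates,
        largely independent of overall brightness."""
        rgb = rgb.astype(np.float64)
        total = rgb.sum(axis=2) + eps
        r = rgb[..., 0] / total
        g = rgb[..., 1] / total
        return np.clip(r - g, 0.0, 1.0)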
The research of adaptive-exposure on spot-detecting camera in ATP system
NASA Astrophysics Data System (ADS)
Qian, Feng; Jia, Jian-jun; Zhang, Liang; Wang, Jian-Yu
2013-08-01
A high-precision acquisition, tracking, and pointing (ATP) system is one of the key techniques of laser communication. The spot-detecting camera is used to detect the direction of the beacon in the laser communication link, so that it can provide the position information of the communication terminal to the ATP system. The positioning accuracy of the camera directly determines the capability of the laser communication system, so the spot-detecting camera in a satellite-to-earth laser communication ATP system needs high precision in target detection. The positioning accuracy of the camera should be better than ±1 μrad. Spot-detecting cameras usually adopt a centroid algorithm to obtain the position of the light spot on the detector. When the intensity of the beacon is moderate, the results of the centroid algorithm are precise. But the intensity of the beacon changes greatly during communication because of distance, atmospheric scintillation, weather, etc. The output signal of the detector will be insufficient when the camera underexposes the beacon because of low light intensity. On the other hand, the output signal of the detector will be saturated when the camera overexposes the beacon because of high light intensity. The accuracy of the centroid algorithm becomes worse if the spot-detecting camera underexposes or overexposes, and then the positioning accuracy of the camera is reduced noticeably. In order to improve the accuracy, space-based cameras should regulate exposure time in real time according to the light intensity. The adaptive-exposure algorithm for a spot-detecting camera based on a complementary metal-oxide-semiconductor (CMOS) detector is analyzed. According to the analytic results, a CMOS camera in a space-based laser communication system is described, which utilizes the adaptive-exposure algorithm to adjust exposure time. Test results from the imaging experiment system verify the design. Experimental results prove that this design can prevent the loss of positioning accuracy caused by changes in light intensity, so the camera can maintain stable and high positioning accuracy during communication.
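As a rough illustration of the two computations the abstract describes, the sketch below shows an intensity-weighted centroid of the beacon spot and a simple proportional exposure update; the threshold handling, the 0.3/0.8 target band and the halving/doubling steps are assumed values, not the paper's algorithm.

```python
import numpy as np

def spot_centroid(frame, threshold=None):
    """Intensity-weighted centroid of the beacon spot on a detector frame.

    frame: 2D array of background-subtracted pixel intensities.
    threshold: optional level below which pixels are ignored (noise rejection).
    Returns (x, y) in pixel coordinates.
    """
    img = frame.astype(np.float64)
    if threshold is not None:
        img = np.where(img > threshold, img, 0.0)
    total = img.sum()
    if total <= 0:
        raise ValueError("no signal above threshold")
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total

def adjust_exposure(exposure_us, peak, full_scale, low=0.3, high=0.8):
    """Keep the spot peak inside a target band of the detector's dynamic
    range by shortening or lengthening exposure (band and steps are assumed)."""
    if peak >= high * full_scale:   # overexposed, signal near saturation
        return exposure_us * 0.5
    if peak <= low * full_scale:    # underexposed, signal too weak
        return exposure_us * 2.0
    return exposure_us
```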
NASA Technical Reports Server (NTRS)
Goins, G. D.; Yorio, N. C.; Sanwo, M. M.; Brown, C. S.
1996-01-01
To determine the influence of narrow-spectrum red light-emitting diodes (LED's) on plant growth and seed production, wheat (Triticum aestivum L. cv. Superdwarf) and Arabidopsis (Arabidopsis thaliana (L.) Heynh, race Columbia) plants were grown under red LED's (peak emission 660 nm) and compared to plants grown under daylight fluorescent (white) light and red LED's supplemented with either 1 percent or 10 percent blue fluorescent (BF) light. Wheat growth under red LED's alone appeared normal, whereas Arabidopsis under red LED's alone developed curled leaf margins and a spiraling growth pattern. Both wheat and Arabidopsis under red LED's alone or red LED's + 1 percent BF light had significantly lower seed yield than plants grown under white light. However, the addition of 10 percent BF light to red LED's partially alleviated the adverse effect of red LED's on yield. Irrespective of the light treatment, viable seeds were produced by wheat (75-92 percent germination rate) and Arabidopsis (85-100 percent germination rate). These results indicate that wheat, and to a lesser extent Arabidopsis, can be successfully grown under red LED's alone, but supplemental blue light is required with red LED's to sufficiently match the growth characteristics and seed yield associated with plants grown under white light.
Hyperspectral imaging for detection of black tip damage in wheat kernels
NASA Astrophysics Data System (ADS)
Delwiche, Stephen R.; Yang, I.-Chang; Kim, Moon S.
2009-05-01
A feasibility study was conducted on the use of hyperspectral imaging to differentiate sound wheat kernels from those with the fungal condition called black point or black tip. Individual kernels of hard red spring wheat were loaded in indented slots on a blackened machined aluminum plate. Damage conditions, determined by official (USDA) inspection, were either sound (no damage) or damaged by the black tip condition alone. Hyperspectral imaging was separately performed under modes of reflectance from white light illumination and fluorescence from UV light (~380 nm) illumination. By cursory inspection of wavelength images, one fluorescence wavelength (531 nm) was selected for image processing and classification analysis. Results indicated that with this one wavelength alone, classification accuracy can be as high as 95% when kernels are oriented with their dorsal side toward the camera. It is suggested that improvement in classification can be made through the inclusion of multiple wavelength images.
Wang, Yajun; Laughner, Jacob I.; Efimov, Igor R.; Zhang, Song
2013-01-01
This paper presents a two-frequency binary phase-shifting technique to measure the three-dimensional (3D) absolute shape of beating rabbit hearts. Due to the low contrast of the cardiac surface, the projector and the camera must remain focused, which poses challenges for existing binary methods, whose measurement accuracy is then low. To overcome this challenge, this paper proposes to utilize the optimal pulse width modulation (OPWM) technique to generate high-frequency fringe patterns, and the error-diffusion dithering technique to produce low-frequency fringe patterns. Furthermore, this paper shows that fringe patterns produced with blue light provide the best quality measurements compared to fringe patterns generated with red or green light, and that the minimum data acquisition speed for high quality measurements is around 800 Hz for a rabbit heart beating at 180 beats per minute. PMID:23482151
Matsumura, Kenta; Rolfe, Peter; Lee, Jihyoung; Yamakoshi, Takehiro
2014-01-01
Recent progress in information and communication technologies has made it possible to measure heart rate (HR) and normalized pulse volume (NPV), which are important physiological indices, using only a smartphone. This has been achieved with reflection mode photoplethysmography (PPG), by using a smartphone's embedded flash as a light source and the camera as a light sensor. Despite its widespread use, the method of PPG is susceptible to motion artifacts, as physical displacements influence photon propagation phenomena and, thereby, the effective optical path length. Further, it is known that the wavelength of light used for PPG influences the photon penetration depth, and we therefore hypothesized that influences of motion artifact could be wavelength-dependent. To test this hypothesis, we made measurements in 12 healthy volunteers of HR and NPV derived from reflection mode plethysmograms recorded simultaneously at three different spectral regions (red, green and blue) at the same physical location with a smartphone. We then assessed the accuracy of the HR and NPV measurements under the influence of motion artifacts. The analyses revealed that the accuracy of HR was acceptably high with all three wavelengths (all rs > 0.996, fixed biases: −0.12 to 0.10 beats per minute, proportional biases: r = −0.29 to 0.03), but that of NPV was the best with green light (r = 0.791, fixed biases: −0.01 arbitrary units, proportional bias: r = 0.11). Moreover, the signal-to-noise ratio obtained with green and blue light PPG was higher than that of red light PPG. These findings suggest that green is the most suitable color for measuring HR and NPV from the reflection mode photoplethysmogram under motion artifact conditions. We conclude that the use of green light PPG could be of particular benefit in ambulatory monitoring where motion artifacts are a significant issue. PMID:24618594
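A minimal sketch of how heart rate might be extracted from the mean green-channel intensity of successive smartphone frames is shown below (peak-interval estimation with an assumed moving-average detrend; the authors' HR/NPV derivation and artifact analysis are more involved).

```python
import numpy as np
from scipy.signal import find_peaks

def heart_rate_from_ppg(green_means, fs):
    """Estimate heart rate (beats/min) from a reflection-mode PPG trace.

    green_means: 1D array, mean green-channel intensity of each video frame.
    fs: camera frame rate in Hz.
    """
    x = np.asarray(green_means, dtype=float)
    # Remove the slow baseline (1 s moving average) to keep the pulsatile part.
    baseline = np.convolve(x, np.ones(int(fs)) / int(fs), mode="same")
    pulse = baseline - x          # inflow of blood lowers reflected intensity
    # Enforce a plausible minimum beat spacing (~0.33 s, i.e. <= 180 bpm).
    peaks, _ = find_peaks(pulse, distance=max(1, int(0.33 * fs)))
    if len(peaks) < 2:
        raise ValueError("not enough beats detected")
    return 60.0 / np.diff(peaks).mean() * fs
```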
Image system for three dimensional, 360°, time sequence surface mapping of moving objects
Lu, Shin-Yee
1998-01-01
A three-dimensional motion camera system comprises a light projector placed between two synchronous video cameras all focused on an object-of-interest. The light projector shines a sharp pattern of vertical lines (Ronchi ruling) on the object-of-interest that appear to be bent differently to each camera by virtue of the surface shape of the object-of-interest and the relative geometry of the cameras, light projector and object-of-interest. Each video frame is captured in a computer memory and analyzed. Since the relative geometry is known and the system pre-calibrated, the unknown three-dimensional shape of the object-of-interest can be solved for by matching the intersections of the projected light lines with orthogonal epipolar lines corresponding to horizontal rows in the video camera frames. A surface reconstruction is made and displayed on a monitor screen. For 360° all-around coverage of the object-of-interest, two additional sets of light projectors and corresponding cameras are distributed about 120° apart from one another.
Image system for three dimensional, 360°, time sequence surface mapping of moving objects
Lu, S.Y.
1998-12-22
A three-dimensional motion camera system comprises a light projector placed between two synchronous video cameras all focused on an object-of-interest. The light projector shines a sharp pattern of vertical lines (Ronchi ruling) on the object-of-interest that appear to be bent differently to each camera by virtue of the surface shape of the object-of-interest and the relative geometry of the cameras, light projector and object-of-interest. Each video frame is captured in a computer memory and analyzed. Since the relative geometry is known and the system pre-calibrated, the unknown three-dimensional shape of the object-of-interest can be solved for by matching the intersections of the projected light lines with orthogonal epipolar lines corresponding to horizontal rows in the video camera frames. A surface reconstruction is made and displayed on a monitor screen. For 360° all-around coverage of the object-of-interest, two additional sets of light projectors and corresponding cameras are distributed about 120° apart from one another. 20 figs.
Tavladoraki, Paraskevi; Kloppstech, Klaus; Argyroudi-Akoyunoglou, Joan
1989-01-01
The mRNA coding for light-harvesting complex of PSII (LHC-II) apoprotein is present in etiolated bean (Phaseolus vulgaris L.) leaves; its level is low in 5-day-old leaves, increases about 3 to 4 times in 9- to 13-day-old leaves, and decreases thereafter. A red light pulse induces an increase in LHC-II mRNA level, which is reversed by far red light, in all ages of the etiolated tissue tested. The phytochrome-controlled initial increase of LHC-II mRNA level is higher in 9- and 13-day-old than in 5- and 17-day-old bean leaves. The amount of LHC-II mRNA, accumulated in the dark after a red light pulse, oscillates rhythmically with a period of about 24 hours. This rhythm is also observed in continuous white light and in the dark following exposure to continuous white light, and persists for at least 70 hours. A second red light pulse, applied 36 hours after initiation of the rhythm, induces a phase-shift, which is prevented by far red light immediately following the second red light pulse. A persistent, but gradually reduced, far red reversibility of the red light-induced increase in LHC-II mRNA level is observed. In contrast, far red reversibility of the red light-induced clock setting is only observed when far red follows immediately the red light. It is concluded that (a) the light-induced LHC-II mRNA accumulation follows an endogenous, circadian rhythm, for the appearance of which a red light pulse is sufficient, (b) the circadian oscillator is under phytochrome control, and (c) a stable Pfr form, which exists for several hours, is responsible for sustaining LHC-II gene transcription. PMID:16666825
2006-09-01
This vibrant image from NASA's Spitzer Space Telescope shows the Large Magellanic Cloud, a satellite galaxy to our own Milky Way galaxy. The infrared image, a mosaic of more than 100,000 individual tiles, offers astronomers a unique chance to study the lifecycle of stars and dust in a single galaxy. Nearly one million objects are revealed for the first time in this Spitzer view, which represents about a 1,000-fold improvement in sensitivity over previous space-based missions. Most of the new objects are dusty stars of various ages populating the Large Magellanic Cloud; the rest are thought to be background galaxies. The blue color in the picture, seen most prominently in the central bar, represents starlight from older stars. The chaotic, bright regions outside this bar are filled with hot, massive stars buried in thick blankets of dust. The red clouds contain cooler interstellar gas and molecular-sized dust grains illuminated by ambient starlight. The Large Magellanic Cloud, located 160,000 light-years from Earth, is one of a handful of dwarf galaxies that orbit our own Milky Way. It is approximately one-third as wide as the Milky Way, and, if it could be seen in its entirety, would cover the same amount of sky as a grid of about 480 full moons. About one-third of the whole galaxy can be seen in the Spitzer image. This picture is a composite of infrared light captured by Spitzer's infrared array camera. Light with wavelengths of 8 and 5.8 microns is red and orange; 4.5-micron light is green; and 3.6-micron light is blue. http://photojournal.jpl.nasa.gov/catalog/PIA07136
M33: A Close Neighbor Reveals its True Size and Splendor
NASA Technical Reports Server (NTRS)
2009-01-01
One of our closest galactic neighbors shows its awesome beauty in this new image from NASA's Spitzer Space Telescope. M33, also known as the Triangulum Galaxy, is a member of what's known as our Local Group of galaxies. Along with our own Milky Way, this group travels together in the universe, as they are gravitationally bound. In fact, M33 is one of the few galaxies that is moving toward the Milky Way despite the fact that space itself is expanding, causing most galaxies in the universe to grow farther and farther apart. When viewed with Spitzer's infrared eyes, this elegant spiral galaxy sparkles with color and detail. Stars appear as glistening blue gems (many of which are actually foreground stars in our own galaxy), while dust in the spiral disk of the galaxy glows pink and red. But not only is this new image beautiful, it also shows M33 to be surprisingly large, bigger than its visible-light appearance would suggest. With its ability to detect cold, dark dust, Spitzer can see emission from cooler material well beyond the visible range of M33's disk. Exactly how this cold material moved outward from the galaxy is still a mystery, but winds from giant stars or supernovas may be responsible. M33 is located about 2.9 million light-years away in the constellation Triangulum. This composite image was taken by Spitzer's infrared array camera. The color blue indicates infrared light of 3.6 microns, green shows 4.5-micron light, and red 8.0 microns.
Comparison of red autofluorescing plaque and disclosed plaque-a cross-sectional study.
Volgenant, Catherine M C; Fernandez Y Mostajo, Mercedes; Rosema, Nanning A M; van der Weijden, Fridus A; Ten Cate, Jacob M; van der Veen, Monique H
2016-12-01
The aim of this cross-sectional study was to assess the correlation between dental plaque scores determined by the measurement of red autofluorescence or by visualization with a two-tone solution. Clinical photographs were used for this study. Overnight plaque from the anterior teeth of 48 participants was assessed for red fluorescence on photographs (taken with a QLF-camera) using a modified Quigley & Hein (mQH) index. A two-tone disclosing solution was applied. Total disclosed plaque was clinically assessed using the mQH index. In addition, total and blue disclosed plaque was scored on clinical photographs using the mQH index. A strong correlation was observed between the total disclosed plaque scored on photographs and the clinical scores (r = 0.70 at site level; r = 0.88 at subject level). The correlation between red fluorescent plaque and total plaque, as assessed on the photographs, was moderate to strong and significant (r = 0.50 at the site level; r = 0.70 at the subject level), with the total plaque scores consistently higher than the red fluorescent plaque scores. The correlation between red fluorescent plaque and blue disclosed plaque was weak to moderate and significant (r = 0.30 at the site level; r = 0.50 at the subject level). Plaque, as scored on white-light photographs, corresponds well with clinically assessed plaque. A weak to moderate correlation between red fluorescing plaque and total disclosed plaque or blue disclosed plaque was found. What at present is considered to be matured dental plaque, which appears blue following the application of a two-tone disclosing solution, is not in agreement with red fluorescent dental plaque assessment.
Kleptoparasitic behavior and species richness at Mt. Graham red squirrel middens
Andrew J. Edelman; John L. Koprowski; Jennifer L. Edelman
2005-01-01
We used remote photography to assess the frequency of inter- and intra-specific kleptoparasitism and species richness at Mt. Graham red squirrel (Tamiasciurus hudsonicus grahamensis) middens. Remote cameras and conifer cones were placed at occupied and unoccupied middens, and random sites. Species richness of small mammals was higher at red squirrel...
Avian nestling predation by endangered Mount Graham red squirrel
Claire A. Zugmeyer; John L. Koprowski
2007-01-01
Studies using artificial nests or remote cameras have documented avian predation by red squirrels (Tamiasciurus hudsonicus). Although several direct observations of avian predation events are known in the northern range of the red squirrel distribution, no accounts have been reported in the southern portion. We observed predation upon a hermit thrush...
Inhomogeneity in optical properties of rat brain: a study for LLLT dosimetry
NASA Astrophysics Data System (ADS)
Sousa, Marcelo V. P.; Prates, Renato; Kato, Ilka T.; Sabino, Caetano P.; Yoshimura, Tania M.; Suzuki, Luis C.; Magalhães, Ana C.; Yoshimura, Elisabeth M.; Ribeiro, Martha S.
2013-03-01
Over the last few years, low-level light therapy (LLLT) has shown remarkable suitability for a wide range of applications for central nervous system (CNS) related diseases. In this therapeutic modality, light dosimetry is extremely critical, so the study of light propagation through the CNS organs is of great importance. To better understand how light intensity is delivered to the most relevant neural sites, we evaluated optical transmission through slices of rat brain point by point. We experimented with red (λ = 660 nm) and near-infrared (λ = 808 nm) diode laser light, analyzing the light penetration and distribution in the whole brain. A fresh Wistar rat (Rattus norvegicus) brain was cut in sagittal slices and illuminated with a broad light beam. A high-resolution digital camera was employed to acquire data of transmitted light. Spatial profiles of the light transmitted through the sample were obtained from the images. Peaks and valleys in the profiles show sites where light was less or more attenuated. The peak intensities provide information about total attenuation, and the peak widths are correlated to the scattering coefficient of that individual portion of the sample. The outcomes of this study provide valuable information for LLLT dose-dependent studies involving the CNS and highlight the importance of LLLT dosimetry in CNS organs for a large range of applications in animal and human diseases.
Pinhole Cameras: For Science, Art, and Fun!
ERIC Educational Resources Information Center
Button, Clare
2007-01-01
A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for a short amount of time by means of a hand-operated shutter. The pinhole allows only a very narrow beam of light to enter, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and…
NASA Astrophysics Data System (ADS)
Piermattei, Livia; Bozzi, Carlo Alberto; Mancini, Adriano; Tassetti, Anna Nora; Karel, Wilfried; Pfeifer, Norbert
2017-04-01
Unmanned aerial vehicles (UAVs) in combination with consumer grade cameras have become standard tools for photogrammetric applications and surveying. The recent generation of multispectral, cost-efficient and lightweight cameras has fostered a breakthrough in the practical application of UAVs for precision agriculture. For this application, multispectral cameras typically use Green, Red, Red-Edge (RE) and Near Infrared (NIR) wavebands to capture both visible and invisible images of crops and vegetation. These bands are very effective for deriving characteristics like soil productivity, plant health and overall growth. However, the quality of results is affected by the sensor architecture, the spatial and spectral resolutions, the pattern of image collection, and the processing of the multispectral images. In particular, collecting data with multiple sensors requires an accurate spatial co-registration of the various UAV image datasets. Multispectral processed data in precision agriculture are mainly presented as orthorectified mosaics used to export information maps and vegetation indices. This work aims to investigate the acquisition parameters and processing approaches of this new type of image data in order to generate orthoimages using different sensors and UAV platforms. Within our experimental area we placed a grid of artificial targets, whose position was determined with differential global positioning system (dGPS) measurements. Targets were used as ground control points to georeference the images and as checkpoints to verify the accuracy of the georeferenced mosaics. The primary aim is to present a method for the spatial co-registration of visible, Red-Edge, and NIR image sets. To demonstrate the applicability and accuracy of our methodology, multi-sensor datasets were collected over the same area and approximately at the same time using the fixed-wing UAV senseFly "eBee". The images were acquired with the camera Canon S110 RGB, the multispectral cameras Canon S110 NIR and S110 RE and with the multi-camera system Parrot Sequoia, which is composed of single-band cameras (Green, Red, Red Edge, NIR and RGB). Imagery from each sensor was georeferenced and mosaicked with the commercial software Agisoft PhotoScan Pro and different approaches for image orientation were compared. To assess the overall spatial accuracy of each dataset the root mean square error was computed between check point coordinates measured with dGPS and coordinates retrieved from georeferenced image mosaics. Additionally, image datasets from different UAV platforms (i.e. DJI Phantom 4Pro, DJI Phantom 3 professional, and DJI Inspire 1 Pro) were acquired over the same area and the spatial accuracy of the orthoimages was evaluated.
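The accuracy metric described here, the root mean square error between dGPS checkpoint coordinates and the coordinates read off the georeferenced mosaic, can be computed as in the short sketch below (function and variable names are illustrative).

```python
import numpy as np

def checkpoint_rmse(dgps_xy, mosaic_xy):
    """Planimetric RMSE between checkpoint coordinates measured with dGPS and
    the same points read off the georeferenced orthomosaic.

    Both inputs: arrays of shape (N, 2) in the same map projection (metres).
    """
    dgps = np.asarray(dgps_xy, dtype=float)
    mosaic = np.asarray(mosaic_xy, dtype=float)
    residuals = np.linalg.norm(mosaic - dgps, axis=1)   # per-point 2D error
    return float(np.sqrt(np.mean(residuals ** 2)))
```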
Tardu, Mehmet; Dikbas, Ugur Meric; Baris, Ibrahim; Kavakli, Ibrahim Halil
2016-11-01
Light is one of the main environmental cues that affects the physiology and behavior of many organisms. The effect of light on genome-wide transcriptional regulation has been well-studied in green algae and plants, but not in red algae. Cyanidioschyzon merolae is used as a model red algae, and is suitable for studies on transcriptomics because of its compact genome with a relatively small number of genes. In addition, complete genome sequences of the nucleus, mitochondrion, and chloroplast of this organism have been determined. Together, these attributes make C. merolae an ideal model organism to study the response to light stimuli at the transcriptional and the systems biology levels. Previous studies have shown that light significantly affects cell signaling in this organism, but there are no reports on its blue light- and red light-mediated transcriptional responses. We investigated the direct effects of blue and red light at the transcriptional level using RNA-seq. Blue and red lights were found to regulate 35 % of the total genes in C. merolae. Blue light affected the transcription of genes involved in protein synthesis while red light specifically regulated the transcription of genes involved in photosynthesis and DNA repair. Blue or red light regulated genes involved in carbon metabolism and pigment biosynthesis. Overall, our data showed that red and blue light regulate the majority of the cellular, cell division, and repair processes in C. merolae.
Phototropin 1 and dim-blue light modulate the red light de-etiolation response.
Wang, Yihai; M Folta, Kevin
2014-01-01
Light signals regulate seedling morphological changes during de-etiolation through the coordinated actions of multiple light-sensing pathways. Previously we have shown that red-light-induced hypocotyl growth inhibition can be reversed by addition of dim blue light through the action of phototropin 1 (phot1). Here we further examine the fluence-rate relationships of this blue light effect in short-term (hours) and long-term (days) hypocotyl growth assays. The red stem-growth inhibition and blue promotion is a low-fluence rate response, and blue light delays or attenuates both the red light and far-red light responses. These de-etiolation responses include blue light reversal of red or far-red induced apical hook opening. This response also requires phot1. Cryptochromes (cry1 and cry2) are activated by higher blue light fluence-rates and override phot1's influence on hypocotyl growth promotion. Exogenous application of auxin transport inhibitor naphthylphthalamic acid abolished the blue light stem growth promotion in both hypocotyl growth and hook opening. Results from the genetic tests of this blue light effect in auxin transporter mutants, as well as phytochrome kinase substrate mutants indicated that aux1 may play a role in blue light reversal of red light response. Together, the phot1-mediated adjustment of phytochrome-regulated photomorphogenic events is most robust in dim blue light conditions and is likely modulated by auxin transport through its transporters.
NASA Technical Reports Server (NTRS)
Yorio, N. C.; Goins, G. D.; Kagie, H. R.; Wheeler, R. M.; Sager, J. C.
2001-01-01
Radish (Raphanus sativus L. cv. Cherriette), lettuce (Lactuca sativa L. cv. Waldmann's Green), and spinach (Spinacia oleracea L. cv. Nordic IV) plants were grown under 660-nm red light-emitting diodes (LEDs) and were compared at equal photosynthetic photon flux (PPF) with either plants grown under cool-white fluorescent lamps (CWF) or red LEDs supplemented with 10% (30 µmol m⁻² s⁻¹) blue light (400-500 nm) from blue fluorescent (BF) lamps. At 21 days after planting (DAP), leaf photosynthetic rates and stomatal conductance were greater for plants grown under CWF light than for those grown under red LEDs, with or without supplemental blue light. At harvest (21 DAP), total dry-weight accumulation was significantly lower for all species tested when grown under red LEDs alone than when grown under CWF light or red LEDs + 10% BF light. Moreover, total dry weight for radish and spinach was significantly lower under red LEDs + 10% BF than under CWF light, suggesting that addition of blue light to the red LEDs was still insufficient for achieving maximal growth for these crops.
Near infra-red astronomy with adaptive optics and laser guide stars at the Keck Observatory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Max, C.E.; Gavel, D.T.; Olivier, S.S.
1995-08-03
A laser guide star adaptive optics system is being built for the W. M. Keck Observatory's 10-meter Keck II telescope. Two new near infra-red instruments will be used with this system: a high-resolution camera (NIRC 2) and an echelle spectrometer (NIRSPEC). The authors describe the expected capabilities of these instruments for high-resolution astronomy, using adaptive optics with either a natural star or a sodium-layer laser guide star as a reference. They compare the expected performance of these planned Keck adaptive optics instruments with that predicted for the NICMOS near infra-red camera, which is scheduled to be installed on the Hubble Space Telescope in 1997.
NASA Technical Reports Server (NTRS)
Goins, G. D.; Yorio, N. C.; Sanwo, M. M.; Brown, C. S.; Sager, J. C. (Principal Investigator)
1997-01-01
Red light-emitting diodes (LEDs) are a potential light source for growing plants in spaceflight systems because of their safety, small mass and volume, wavelength specificity, and longevity. Despite these attractive features, red LEDs must satisfy requirements for plant photosynthesis and photomorphogenesis for successful growth and seed yield. To determine the influence of gallium aluminium arsenide (GaAlAs) red LEDs on wheat photomorphogenesis, photosynthesis, and seed yield, wheat (Triticum aestivum L., cv. 'USU-Super Dwarf') plants were grown under red LEDs and compared to plants grown under daylight fluorescent (white) lamps and red LEDs supplemented with either 1% or 10% blue light from blue fluorescent (BF) lamps. Compared to white light-grown plants, wheat grown under red LEDs alone demonstrated less main culm development during vegetative growth through preanthesis, while showing a longer flag leaf at 40 DAP and greater main culm length at final harvest (70 DAP). As supplemental BF light was increased with red LEDs, shoot dry matter and net leaf photosynthesis rate increased. At final harvest, wheat grown under red LEDs alone displayed fewer subtillers and a lower seed yield compared to plants grown under white light. Wheat grown under red LEDs+10% BF light had comparable shoot dry matter accumulation and seed yield relative to wheat grown under white light. These results indicate that wheat can complete its life cycle under red LEDs alone, but larger plants and greater amounts of seed are produced in the presence of red LEDs supplemented with a quantity of blue light.
Kotabová, Eva; Jarešová, Jana; Kaňa, Radek; Sobotka, Roman; Bína, David; Prášil, Ondřej
2014-06-01
Chromera velia is an alveolate alga associated with scleractinian corals. Here we present detailed work on chromatic adaptation in C. velia cultured under either blue or red light. Growth of C. velia under red light induced the accumulation of a light harvesting antenna complex exhibiting unusual spectroscopic properties with red-shifted absorption and atypical 710nm fluorescence emission at room temperature. Due to these characteristic features the complex was designated "Red-shifted Chromera light harvesting complex" (Red-CLH complex). Its detailed biochemical survey is described in the accompanying paper (Bina et al. 2013, this issue). Here, we show that the accumulation of Red-CLH complex under red light represents a slow acclimation process (days) that is reversible with much faster kinetics (hours) under blue light. This chromatic adaptation allows C. velia to maintain all important parameters of photosynthesis constant under both light colors. We further demonstrated that the C. velia Red-CLH complex is assembled from a 17kDa antenna protein and is functionally connected to photosystem II as it shows variability of chlorophyll fluorescence. Red-CLH also serves as an additional locus for non-photochemical quenching. Although overall rates of oxygen evolution and carbon fixation were similar for both blue and red light conditions, the presence of Red-CLH in C. velia cells increases the light harvesting potential of photosystem II, which manifested as a doubled oxygen evolution rate at illumination above 695nm. This data demonstrates a remarkable long-term remodeling of C. velia light-harvesting system according to light quality and suggests physiological significance of 'red' antenna complexes. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
2004-01-01
This image mosaic illustrates how scientists use the color calibration targets (upper left) located on both Mars Exploration Rovers to fine-tune the rovers' sense of color. In the center, spectra, or light signatures, acquired in the laboratory of the colored chips on the targets are shown as lines. Actual data from Mars Exploration Rover Spirit's panoramic camera is mapped on top of these lines as dots. The plot demonstrates that the observed colors of Mars match the colors of the chips, and thus approximate the red planet's true colors. This finding is further corroborated by the picture taken on Mars of the calibration target, which shows the colored chips as they would appear on Earth.
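A simplified way to illustrate this kind of calibration is to fit a linear color-correction matrix that maps the camera's measurements of the target chips onto their known reference colors; the least-squares sketch below is an assumed illustration, not the actual Pancam calibration pipeline, which works from measured chip spectra and filter responses.

```python
import numpy as np

def fit_color_correction(camera_rgb, reference_rgb):
    """Least-squares 3x3 matrix M so that camera_rgb @ M approximates the
    reference colors of the calibration-target chips.

    camera_rgb, reference_rgb: arrays of shape (N_chips, 3).
    """
    M, *_ = np.linalg.lstsq(np.asarray(camera_rgb, dtype=float),
                            np.asarray(reference_rgb, dtype=float),
                            rcond=None)
    return M

# Applying M to a whole image (H, W, 3) then approximates 'true color':
# corrected = (image.reshape(-1, 3) @ M).reshape(image.shape)
```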
VizieR Online Data Catalog: NGC 6822 Cepheids JHKs light curves (Feast+, 2012)
NASA Astrophysics Data System (ADS)
Feast, M. W.; Whitelock, P. A.; Menzies, J. W.; Matsunaga, N.
2013-03-01
Our survey of NGC 6822 is confined to the optical bar which is aligned nearly north-south. We used the Japanese-South African IRSF (InfraRed Survey Facility) telescope equipped with the SIRIUS camera, which permits simultaneous imaging in the J, H and Ks bands. We defined three overlapping fields, with field 1 centred at RA=19:44:56 and DE=-14:48:06 (2000.0). Fields 2 and 3 are centred 6.7 arcmin north and south, respectively, from field 1. The three fields, approximately 7.8 arcmin2, were observed in JHKs at 19, 18 and 16 epochs, respectively, over a period of 3.5yr. (2 data files).
Phase and amplitude wave front sensing and reconstruction with a modified plenoptic camera
NASA Astrophysics Data System (ADS)
Wu, Chensheng; Ko, Jonathan; Nelson, William; Davis, Christopher C.
2014-10-01
A plenoptic camera is a camera that can retrieve the direction and intensity distribution of light rays collected by the camera, and it allows for multiple reconstruction functions such as refocusing at different depths and 3D microscopy. Its principle is to add a micro-lens array to a traditional high-resolution camera to form a semi-camera array that preserves redundant intensity distributions of the light field and facilitates back-tracing of rays through geometric knowledge of its optical components. Though designed to process incoherent images, we found that the plenoptic camera shows high potential in solving coherent illumination cases, such as sensing both the amplitude and phase information of a distorted laser beam. Based on our earlier introduction of a prototype modified plenoptic camera, we have developed the complete algorithm to reconstruct the wavefront of the incident light field. In this paper the algorithm and experimental results will be demonstrated, and an improved version of this modified plenoptic camera will be discussed. As a result, our modified plenoptic camera can serve as an advanced wavefront sensor compared with traditional Shack-Hartmann sensors in handling complicated cases such as coherent illumination in strong turbulence, where interference and discontinuity of wavefronts are common. Especially in wave propagation through atmospheric turbulence, this camera should provide a much more precise description of the light field, which would guide systems in adaptive optics to make intelligent analysis and corrections.
Young Stars Emerge from Orion Head
2007-05-17
This image from NASA's Spitzer Space Telescope shows infant stars "hatching" in the head of the hunter constellation, Orion. Astronomers suspect that shockwaves from a supernova explosion in Orion's head, nearly three million years ago, may have initiated this newfound birth. The region featured in this Spitzer image is called Barnard 30. It is located approximately 1,300 light-years away and sits on the right side of Orion's "head," just north of the massive star Lambda Orionis. Wisps of green in the cloud are organic molecules called polycyclic aromatic hydrocarbons. These molecules are formed anytime carbon-based materials are burned incompletely. On Earth, they can be found in the sooty exhaust from automobile and airplane engines. They also coat the grills where charcoal-broiled meats are cooked. Tints of orange-red in the cloud are dust particles warmed by the newly forming stars. The reddish-pink dots at the top of the cloud are very young stars embedded in a cocoon of cosmic gas and dust. Blue spots throughout the image are background Milky Way along this line of sight. This composite includes data from Spitzer's infrared array camera instrument, and multiband imaging photometer instrument. Light at 4.5 microns is shown as blue, 8.0 microns is green, and 24 microns is red. http://photojournal.jpl.nasa.gov/catalog/PIA09411
Young Stars Emerge from Orion's Head
NASA Technical Reports Server (NTRS)
2007-01-01
This image from NASA's Spitzer Space Telescope shows infant stars 'hatching' in the head of the hunter constellation, Orion. Astronomers suspect that shockwaves from a supernova explosion in Orion's head, nearly three million years ago, may have initiated this newfound birth. The region featured in this Spitzer image is called Barnard 30. It is located approximately 1,300 light-years away and sits on the right side of Orion's 'head,' just north of the massive star Lambda Orionis. Wisps of green in the cloud are organic molecules called polycyclic aromatic hydrocarbons. These molecules are formed anytime carbon-based materials are burned incompletely. On Earth, they can be found in the sooty exhaust from automobile and airplane engines. They also coat the grills where charcoal-broiled meats are cooked. Tints of orange-red in the cloud are dust particles warmed by the newly forming stars. The reddish-pink dots at the top of the cloud are very young stars embedded in a cocoon of cosmic gas and dust. Blue spots throughout the image are background Milky Way along this line of sight. This composite includes data from Spitzer's infrared array camera instrument and multiband imaging photometer instrument. Light at 4.5 microns is shown as blue, 8.0 microns is green, and 24 microns is red.
Jensupakarn, Auearree; Kanitpong, Kunnawee
2018-04-01
In Thailand, red light running is considered one of the most dangerous behaviors at intersections. Red light running (RLR) behavior is the failure to obey the traffic control signal. However, motorcycle riders and car drivers who run through red lights could be influenced by human factors or by the road environment at the intersection. RLR could be an advertent or inadvertent behavior influenced by many factors. Little research has been done to evaluate the contributing factors influencing red-light violation behavior. This study aims to determine the factors influencing red light running behavior, including human characteristics, the physical condition of the intersection, traffic signal operation, and traffic condition. A total of 92 intersections were observed in Chiang Mai, Nakhon Ratchasima, and Chonburi, the major provinces in each region of Thailand. In addition, the socio-economic characteristics of red light runners were obtained from a self-reported questionnaire survey. Binary Logistic Regression and Multiple Linear Regression models were used to determine the characteristics of red light runners and the factors influencing rates of red light running, respectively. The results from this study can help to understand the characteristics of red light runners and the factors affecting them to run red lights. For motorcycle riders and car drivers, age, gender, occupation, driving license, helmet/seatbelt use, and the probability of being penalized when running the red light significantly affect RLR behavior. In addition, the results indicated that vehicle travelling direction, time of day, existence of a turning lane, number of lanes, lane width, intersection sight distance, type of traffic signal pole, type of traffic signal operation, length of yellow time interval, approaching speed, distance from intersection warning sign to stop line, and pavement roughness significantly affect RLR rates. Copyright © 2018 Elsevier Ltd. All rights reserved.
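A hedged sketch of the kind of binary logistic regression model described for RLR behavior is shown below; the CSV file, column names and encoding choices are hypothetical stand-ins for the study's observational variables.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical observation table; column names mirror predictors named above.
df = pd.read_csv("rlr_observations.csv")            # assumed file
y = df["ran_red_light"]                              # 1 = violation, 0 = stopped
X = pd.get_dummies(
    df[["age", "gender", "occupation", "has_licence",
        "helmet_or_seatbelt", "perceived_penalty_risk"]],
    drop_first=True,                                 # one-hot encode categoricals
)

model = LogisticRegression(max_iter=1000).fit(X, y)
# Sign and size of each coefficient indicate how a factor shifts the odds
# of running the red light.
print(dict(zip(X.columns, model.coef_[0])))
```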
Tanaka, Seiya; Ario, Nobuyuki; Nakagawa, Andressa Camila Seiko; Tomita, Yuki; Murayama, Naoki; Taniguchi, Takatoshi; Hamaoka, Norimitsu; Iwaya-Inoue, Mari; Ishibashi, Yushi
2017-06-03
Soybean pods are located at the nodes, where they are in the shade, whereas cowpea pods are located outside of the leaves and are exposed to sunlight. To compare the effects of light quality on pod growth in soybean and cowpea, we measured the length of pods treated with white, blue, red, or far-red light. In both species, pods elongated faster during the dark period than during the light period in all light treatments except the red light treatment in cowpea. Red light significantly suppressed pod elongation in soybean during both the dark and light periods. On the other hand, the elongation of cowpea pods treated with red light was markedly promoted during the light period. These results suggested that the difference in pod set sites between soybean and cowpea might account for the difference in their red light responses for pod growth.
Mastcam Special Filters Help Locate Variations Ahead
2017-11-01
This pair of images from the Mast Camera (Mastcam) on NASA's Curiosity rover illustrates how special filters are used to scout terrain ahead for variations in the local bedrock. The upper panorama is in the Mastcam's usual full color, for comparison. The lower panorama of the same scene, in false color, combines three exposures taken through different "science filters," each selecting for a narrow band of wavelengths. Filters and image processing steps were selected to make stronger signatures of hematite, an iron-oxide mineral, evident as purple. Hematite is of interest in this area of Mars -- partway up "Vera Rubin Ridge" on lower Mount Sharp -- as holding clues about ancient environmental conditions under which that mineral originated. In this pair of panoramas, the strongest indications of hematite appear related to areas where the bedrock is broken up. With information from this Mastcam reconnaissance, the rover team selected destinations in the scene for close-up investigations to gain understanding about the apparent patchiness in hematite spectral features. The Mastcam's left-eye camera took the component images of both panoramas on Sept. 12, 2017, during the 1,814th Martian day, or sol, of Curiosity's work on Mars. The view spans from south-southeast on the left to south-southwest on the right. The foreground across the bottom of the scene is about 50 feet (about 15 meters) wide. Figure 1 includes scale bars of 1 meter (3.3 feet) in the middle distance and 5 meters (16 feet) at upper right. Curiosity's Mastcam combines two cameras: the right eye with a telephoto lens and the left eye with a wider-angle lens. Each camera has a filter wheel that can be rotated in front of the lens for a choice of eight different filters. One filter for each camera is clear to all visible light, for regular full-color photos, and another is specifically for viewing the Sun. Some of the other filters were selected to admit wavelengths of light that are useful for identifying iron minerals. Each of the filters used for the lower panorama shown here admits light from a narrow band of wavelengths, extending to only about 5 to 10 nanometers longer or shorter than the filter's central wavelength. The three observations combined into this product used filters centered at three near-infrared wavelengths: 751 nanometers, 867 nanometers and 1,012 nanometers. Hematite distinctively absorbs some frequencies of infrared light more than others. Usual color photographs from digital cameras -- such as the upper panorama here from Mastcam -- combine information from red, green and blue filtering. The filters are in a microscopic grid in a "Bayer" filter array situated directly over the detector behind the lens, with wider bands of wavelengths. The colors of the upper panorama, as with most featured images from Mastcam, have been tuned with a color adjustment similar to white balancing for approximating how the rocks and sand would appear under daytime lighting conditions on Earth. https://photojournal.jpl.nasa.gov/catalog/PIA22065
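The false-color product described here can be illustrated, in simplified form, by assigning the three narrow-band filter images to RGB channels with an independent contrast stretch per band; the percentile stretch and band-to-channel mapping below are assumptions, not the Mastcam team's processing chain.

```python
import numpy as np

def false_color(b751, b867, b1012):
    """Combine three narrow-band exposures into a false-color RGB composite,
    stretching each band independently so subtle spectral contrasts (such as
    hematite-related absorption) become visible.

    Inputs: 2D arrays of equal shape, one per filter.
    """
    def stretch(band):
        lo, hi = np.percentile(band, (2, 98))        # clip extreme pixels
        return np.clip((band - lo) / (hi - lo + 1e-9), 0.0, 1.0)

    # Band-to-channel assignment is an illustrative choice, not the team's.
    return np.dstack([stretch(b1012), stretch(b867), stretch(b751)])
```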
3D surface pressure measurement with single light-field camera and pressure-sensitive paint
NASA Astrophysics Data System (ADS)
Shi, Shengxian; Xu, Shengming; Zhao, Zhou; Niu, Xiaofu; Quinn, Mark Kenneth
2018-05-01
A novel technique that simultaneously measures three-dimensional model geometry as well as surface pressure distribution with a single camera is demonstrated in this study. The technique takes advantage of light-field photography, which can capture three-dimensional information with a single light-field camera, and combines it with the intensity-based pressure-sensitive paint method. The proposed single camera light-field three-dimensional pressure measurement technique (LF-3DPSP) utilises a similar hardware setup to the traditional two-dimensional pressure measurement technique, with the exception that the wind-on, wind-off and model geometry images are captured via an in-house-constructed light-field camera. The proposed LF-3DPSP technique was validated with a Mach 5 flared cone model test. Results show that the technique is capable of measuring three-dimensional geometry with high accuracy for relatively large curvature models, and the pressure results compare well with Schlieren tests, analytical calculations, and numerical simulations.
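For the pressure part of the measurement, intensity-based PSP is commonly reduced with a Stern-Volmer style ratio of wind-off to wind-on images; the sketch below shows only that step, with placeholder calibration coefficients, and omits the light-field re-projection onto the reconstructed 3D surface that the LF-3DPSP technique adds.

```python
import numpy as np

def psp_pressure(wind_off, wind_on, p_ref, a=0.2, b=0.8):
    """Intensity-ratio PSP reduction with a Stern-Volmer style calibration:
    I_ref / I = A + B * (P / P_ref).

    wind_off: reference (wind-off) intensity image.
    wind_on:  run (wind-on) intensity image, same illumination and view.
    a, b:     paint calibration coefficients (placeholders; obtained from a
              calibration chamber in practice).
    """
    ratio = wind_off.astype(float) / np.clip(wind_on.astype(float), 1e-6, None)
    return p_ref * (ratio - a) / b
```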
Enhancing swimming pool safety by the use of range-imaging cameras
NASA Astrophysics Data System (ADS)
Geerardyn, D.; Boulanger, S.; Kuijk, M.
2015-05-01
Drowning is the cause of death of 372,000 people each year worldwide, according to the November 2014 report of the World Health Organization [1]. Currently, most swimming pools only use lifeguards to detect drowning people. In some modern swimming pools, camera-based detection systems are nowadays being integrated. However, these systems have to be mounted underwater, mostly as a replacement of the underwater lighting. In contrast, we are interested in range-imaging cameras mounted on the ceiling of the swimming pool, allowing us to distinguish swimmers at the surface from drowning people underwater, while keeping the large field-of-view and minimizing occlusions. However, we have to take into account that the water surface of a swimming pool is not flat but mostly rippled, and that the water is transparent for visible light but less transparent for infrared or ultraviolet light. We investigated the use of different types of 3D cameras to detect objects underwater at different depths and with different amplitudes of surface perturbations. Specifically, we performed measurements with a commercial Time-of-Flight camera, a commercial structured-light depth camera and our own Time-of-Flight system. Our own system uses pulsed Time-of-Flight and emits light of 785 nm. The measured distances between the camera and the object are influenced by the perturbations on the water surface. Due to the timing of our Time-of-Flight camera, our system is theoretically able to minimize the influence of the reflections of a partially-reflecting surface. Combining a post-acquisition filter that compensates for the perturbations with a light source of shorter wavelength to enlarge the depth range can improve on the current commercial cameras. As a result, we can conclude that low-cost range imagers can increase swimming pool safety by inserting a post-processing filter and using another light source.
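The distance measurement underlying a pulsed Time-of-Flight camera is just the round-trip travel time of the light pulse; a minimal sketch is below, with an optional refractive-index correction for the underwater part of the path (a simplification that ignores the rippled surface and partial surface reflections discussed above).

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds, refractive_index=1.0):
    """Target distance from the measured round-trip time of a light pulse.
    Pass refractive_index ~ 1.33 for the underwater part of the path
    (ignores the rippled surface and partial surface reflections)."""
    return (C / refractive_index) * round_trip_seconds / 2.0

# Example: a 20 ns round trip in air corresponds to roughly 3 m.
# print(tof_distance(20e-9))
```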
Skupsch, C; Chaves, H; Brücker, C
2011-08-01
The Cranz-Schardin camera utilizes a Q-switched Nd:YAG laser and four single CCD cameras. The laser provides light pulses with energy in the range of 25 mJ and a pulse duration of about 5 ns. The laser light is converted to incoherent light by Rhodamine-B fluorescence dye in a cuvette; the laser beam's coherence is intentionally broken in order to avoid speckle. Four light fibers collect the fluorescence light and are used for illumination. Different light fiber lengths enable a delay of illumination between consecutive images. The chosen interframe time is 25 ns, corresponding to 40 × 10^6 frames per second. As an example, the camera is applied to observe the bow shock in front of a water jet propagating in air at supersonic speed. The initial phase of the formation of the jet structure is recorded.
[A Method for Selecting Self-Adaptive Chromaticity of the Projected Markers].
Zhao, Shou-bo; Zhang, Fu-min; Qu, Xing-hua; Zheng, Shi-wei; Chen, Zhe
2015-04-01
The authors designed a self-adaptive projection system composed of a color camera, a projector, and a PC. In detail, a digital micro-mirror device (DMD) acting as a spatial light modulator for the projector was introduced into the optical path to modulate the illuminant spectrum based on red, green and blue light emitting diodes (LED). However, the color visibility of the active markers is affected by the screen, which has an unknown reflective spectrum. Here, the active markers are a projected spot array, and the chromaticity of the markers is sometimes submerged when the screen has a similar spectrum. In order to enhance the color visibility of the active markers relative to the screen, a method for selecting self-adaptive chromaticity of the projected markers in 3D scanning metrology is described. A color camera with only 3 channels limits the accuracy of device characterization. To achieve interconversion between a device-independent color space and the device-dependent color space, a high-dimensional linear model of the reflective spectrum was built. Prior training samples provide additional constraints to yield a high-dimensional linear model with more than three degrees of freedom. Meanwhile, the spectral power distribution of the ambient light was estimated. Subsequently, the markers' chromaticity in CIE color space was selected by maximizing the Euclidean distance. The RGB setting values were then easily estimated via the inverse transform. Finally, we implemented a typical experiment to show the performance of the proposed approach. A 24-patch Munsell Color Checker was used as the projection screen. The color difference in chromaticity coordinates between the active marker and the color patch was used to evaluate the color visibility of the active markers relative to the screen. A comparison between the self-adaptive projection system and a traditional diode-laser light projector was listed and discussed to highlight the advantage of the proposed method.
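The selection step, choosing the marker chromaticity farthest from the estimated screen color, can be sketched as a maximization of Euclidean distance in CIELAB; the function below is an illustrative assumption of that step only, leaving out the spectral-model estimation of the screen color.

```python
import numpy as np

def pick_marker_color(screen_lab, candidate_labs):
    """Choose the projected-marker color farthest (Euclidean distance in
    CIELAB, i.e. a simple delta-E) from the estimated screen color, so the
    markers stay visible against a screen of unknown reflectance.

    screen_lab: (3,) CIELAB color of the screen under ambient + projector light.
    candidate_labs: (N, 3) CIELAB colors realizable by the RGB-LED projector.
    """
    candidates = np.asarray(candidate_labs, dtype=float)
    distances = np.linalg.norm(candidates - np.asarray(screen_lab, dtype=float),
                               axis=1)
    best = int(np.argmax(distances))
    return best, float(distances[best])
```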
SOFIA Science Instruments: Commissioning, Upgrades and Future Opportunities
NASA Technical Reports Server (NTRS)
Smith, Erin C.
2014-01-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is the world's largest airborne observatory, featuring a 2.5 meter telescope housed in the aft section of a Boeing 747SP aircraft. SOFIA's current instrument suite includes: FORCAST (Faint Object InfraRed CAmera for the SOFIA Telescope), a 5-40 µm dual band imager/grism spectrometer developed at Cornell University; HIPO (High-speed Imaging Photometer for Occultations), a 0.3-1.1 micron imager built by Lowell Observatory; FLITECAM (First Light Infrared Test Experiment CAMera), a 1-5 micron wide-field imager/grism spectrometer developed at UCLA; FIFI-LS (Far-Infrared Field-Imaging Line Spectrometer), a 42-210 micron IFU grating spectrograph completed by University Stuttgart; and EXES (Echelon-Cross-Echelle Spectrograph), a 5-28 micron high-resolution spectrometer being completed by UC Davis and NASA Ames. A second generation instrument, HAWC+ (High-resolution Airborne Wideband Camera), is a 50-240 micron imager being upgraded at JPL to add polarimetry and new detectors developed at GSFC. SOFIA will continually update its instrument suite with new instrumentation, technology demonstration experiments and upgrades to the existing instrument suite. This paper details instrument capabilities and status as well as plans for future instrumentation, including the call for proposals for 3rd generation SOFIA science instruments.
Performance of Color Camera Machine Vision in Automated Furniture Rough Mill Systems
D. Earl Kline; Agus Widoyoko; Janice K. Wiedenbeck; Philip A. Araman
1998-01-01
The objective of this study was to evaluate the performance of color camera machine vision for lumber processing in a furniture rough mill. The study used 134 red oak boards to compare the performance of automated gang-rip-first rough mill yield based on a prototype color camera lumber inspection system developed at Virginia Tech with both estimated optimum rough mill...
A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer.
Shen, Bailey Y; Mukai, Shizuo
2017-01-01
Purpose. Nonmydriatic fundus cameras allow retinal photography without pharmacologic dilation of the pupil. However, currently available nonmydriatic fundus cameras are bulky, not portable, and expensive. Taking advantage of recent advances in mobile technology, we sought to create a nonmydriatic fundus camera that was affordable and could be carried in a white coat pocket. Methods. We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results. The prototype camera measured 133mm × 91mm × 45mm and weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20. The camera was able to obtain good-quality fundus images without pharmacologic dilation of the pupils. Conclusion. A fully functional, inexpensive, handheld, nonmydriatic fundus camera can be easily assembled from a relatively small number of components. With modest improvements, such a camera could be useful for a variety of healthcare professionals, particularly those who work in settings where a traditional table-mounted nonmydriatic fundus camera would be inconvenient.
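As a very rough sketch of how such a Raspberry Pi camera might be driven (align under infrared so the pupil stays dilated, then flash white light for the still), the snippet below uses the standard picamera and RPi.GPIO libraries; the GPIO pins, resolution, timing and file name are assumptions, since the abstract does not publish the prototype's software.

```python
import time
from picamera import PiCamera      # NoIR camera board is infrared-sensitive
import RPi.GPIO as GPIO

IR_LED_PIN, WHITE_LED_PIN = 17, 27  # assumed GPIO wiring for the dual LED

GPIO.setmode(GPIO.BCM)
GPIO.setup([IR_LED_PIN, WHITE_LED_PIN], GPIO.OUT, initial=GPIO.LOW)

camera = PiCamera(resolution=(1640, 1232))          # assumed still resolution
try:
    # Align on the retina under infrared light, which keeps the pupil dilated.
    GPIO.output(IR_LED_PIN, GPIO.HIGH)
    camera.start_preview()
    time.sleep(5)                                   # operator centers the view
    camera.stop_preview()
    GPIO.output(IR_LED_PIN, GPIO.LOW)

    # Brief white-light illumination for the actual color fundus photograph.
    GPIO.output(WHITE_LED_PIN, GPIO.HIGH)
    camera.capture("fundus.jpg")
    GPIO.output(WHITE_LED_PIN, GPIO.LOW)
finally:
    camera.close()
    GPIO.cleanup()
```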
A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer
Shen, Bailey Y.
2017-01-01
Purpose. Nonmydriatic fundus cameras allow retinal photography without pharmacologic dilation of the pupil. However, currently available nonmydriatic fundus cameras are bulky, not portable, and expensive. Taking advantage of recent advances in mobile technology, we sought to create a nonmydriatic fundus camera that was affordable and could be carried in a white coat pocket. Methods. We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results. The prototype camera measured 133mm × 91mm × 45mm and weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20. The camera was able to obtain good-quality fundus images without pharmacologic dilation of the pupils. Conclusion. A fully functional, inexpensive, handheld, nonmydriatic fundus camera can be easily assembled from a relatively small number of components. With modest improvements, such a camera could be useful for a variety of healthcare professionals, particularly those who work in settings where a traditional table-mounted nonmydriatic fundus camera would be inconvenient. PMID:28396802
Light Requirement for Shoot Regeneration in Horseradish Hairy Roots 1
Saitou, Tsutomu; Kamada, Hiroshi; Harada, Hiroshi
1992-01-01
Hairy roots of horseradish (Armoracia rusticana) were induced by inoculation with Agrobacterium rhizogenes harboring Ri plasmid and cultured on phytohormone-free Murashige and Skoog medium after eliminating the bacteria. Hairy roots grew vigorously and sometimes formed yellowish calli under dark conditions. On the other hand, growth of hairy roots stopped after several weeks of culture with light, then shoots were regenerated. Frequency of shoot formation from hairy roots increased as the culture period in light lengthened and the light intensity increased. The shoot regeneration was induced by treatment with white or red light, but not with far-red light. Shoot regeneration by red light was inhibited by following treatment with far-red light. Red and far-red light reversibly affected shoot regeneration. Excised roots of nontransformed plants grew quite slowly on phytohormone-free Murashige and Skoog medium and occasionally formed shoots under white light conditions. PMID:16669041
Effect of Light on Anthocyanin Levels in Submerged, Harvested Cranberry Fruit
Singh, Bal Ram
2004-01-01
Anthocyanins are a group of plant antioxidants known for their therapeutic use. The effects of natural light, red light, and far-red light on individual as well as total anthocyanin content in cranberry fruit (Vaccinium macrocarpon Ait) were examined in an experimental setting designed to mimic water-harvesting conditions. The reversed-phase high-performance liquid chromatography (HPLC) method was used to separate and analyze the anthocyanins. In contrast to the case of the control sample that was kept in the dark, natural light increased the total anthocyanin level by 75.3% and 87.2% after 24 and 48 hours of water immersion, respectively. Red light and far-red light increased the total anthocyanin level by 41.5% and 34.7%, respectively. The amount of each individual anthocyanin increased differently under natural light, red light, and far-red light, suggesting that expressions of enzymes that catalyze the anthocyanin biosynthesis are regulated differently by environments. PMID:15577187
Phototropin 1 and dim-blue light modulate the red light de-etiolation response
Wang, Yihai; M Folta, Kevin
2014-01-01
Light signals regulate seedling morphological changes during de-etiolation through the coordinated actions of multiple light-sensing pathways. Previously we have shown that red-light-induced hypocotyl growth inhibition can be reversed by addition of dim blue light through the action of phototropin 1 (phot1). Here we further examine the fluence-rate relationships of this blue light effect in short-term (hours) and long-term (days) hypocotyl growth assays. The red stem-growth inhibition and blue promotion is a low-fluence rate response, and blue light delays or attenuates both the red light and far-red light responses. These de-etiolation responses include blue light reversal of red or far-red induced apical hook opening. This response also requires phot1. Cryptochromes (cry1 and cry2) are activated by higher blue light fluence-rates and override phot1's influence on hypocotyl growth promotion. Exogenous application of auxin transport inhibitor naphthylphthalamic acid abolished the blue light stem growth promotion in both hypocotyl growth and hook opening. Results from the genetic tests of this blue light effect in auxin transporter mutants, as well as phytochrome kinase substrate mutants indicated that aux1 may play a role in blue light reversal of red light response. Together, the phot1-mediated adjustment of phytochrome-regulated photomorphogenic events is most robust in dim blue light conditions and is likely modulated by auxin transport through its transporters. PMID:25482790
Digital micromirror device camera with per-pixel coded exposure for high dynamic range imaging.
Feng, Wei; Zhang, Fumin; Wang, Weijing; Xing, Wei; Qu, Xinghua
2017-05-01
In this paper, we overcome the limited dynamic range of the conventional digital camera, and propose a method of realizing high dynamic range imaging (HDRI) from a novel programmable imaging system called a digital micromirror device (DMD) camera. The unique feature of the proposed new method is that the spatial and temporal information of incident light in our DMD camera can be flexibly modulated, and it enables the camera pixels always to have reasonable exposure intensity by DMD pixel-level modulation. More importantly, it allows different light intensity control algorithms used in our programmable imaging system to achieve HDRI. We implement the optical system prototype, analyze the theory of per-pixel coded exposure for HDRI, and put forward an adaptive light intensity control algorithm to effectively modulate the different light intensity to recover high dynamic range images. Via experiments, we demonstrate the effectiveness of our method and implement the HDRI on different objects.
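The core radiometric idea behind per-pixel coded exposure is that each pixel's effective integration is scaled by a DMD-controlled weight, so scene radiance can be recovered by dividing the measurement by that weight, and pixels that still saturate can be re-exposed with a smaller weight. The NumPy sketch below illustrates this under simplifying assumptions; it is not the adaptive light-intensity control algorithm from the paper.

```python
# Minimal sketch of the per-pixel coded-exposure idea behind DMD-based HDRI:
# each sensor pixel is measured with its own exposure weight (set by the DMD),
# and scene radiance is recovered by dividing out that weight.  Generic
# illustration only, not the paper's adaptive control algorithm.
import numpy as np

def recover_radiance(measurement, exposure_weight, saturation=0.98):
    """measurement: normalized pixel values in [0, 1];
    exposure_weight: per-pixel relative exposure in (0, 1] set by the DMD."""
    radiance = measurement / np.maximum(exposure_weight, 1e-6)
    # Mark pixels that are still saturated; they need a lower weight next frame.
    still_saturated = measurement >= saturation
    return radiance, still_saturated

def update_weights(exposure_weight, still_saturated, step=0.5):
    """Simple feedback rule: halve the exposure of saturated pixels."""
    new_w = exposure_weight.copy()
    new_w[still_saturated] *= step
    return np.clip(new_w, 1e-3, 1.0)

# Example: a bright spot saturates at full exposure and is re-measured at lower weight.
meas = np.array([[0.20, 1.00], [0.45, 0.99]])
w = np.ones_like(meas)
rad, sat = recover_radiance(meas, w)
w_next = update_weights(w, sat)
print(rad, w_next, sep="\n")
```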
Rapid-Response Low Infrared Emission Broadband Ultrathin Plasmonic Light Absorber
Tagliabue, Giulia; Eghlidi, Hadi; Poulikakos, Dimos
2014-01-01
Plasmonic nanostructures can significantly advance broadband visible-light absorption, with absorber thicknesses in the sub-wavelength regime, much thinner than conventional broadband coatings. Such absorbers have inherently very small heat capacity, hence a very rapid response time, and high light power-to-temperature sensitivity. Additionally, their surface emissivity can be spectrally tuned to suppress infrared thermal radiation. These capabilities make plasmonic absorbers promising candidates for fast light-to-heat applications, such as radiation sensors. Here we investigate the light-to-heat conversion properties of a metal-insulator-metal broadband plasmonic absorber, fabricated as a free-standing membrane. Using a fast IR camera, we show that the transient response of the absorber has a characteristic time below 13 ms, nearly one order of magnitude lower than a similar membrane coated with a commercial black spray. Concurrently, despite the small thickness, due to the large absorption capability, the achieved absorbed light power-to-temperature sensitivity is maintained at the level of a standard black spray. Finally, we show that while black spray has emissivity similar to a black body, the plasmonic absorber features a very low infra-red emissivity of almost 0.16, demonstrating its capability as selective coating for applications with operating temperatures up to 400°C, above which the nano-structure starts to deform. PMID:25418040
DOT National Transportation Integrated Search
2004-12-01
The issue of red light running (RLR) has long been a problem throughout the United States. : There is considerable debate within the general public and public agencies regarding the use of : photographic enforcement to deter red light violations. Man...
Hubble Tracks Clouds on Uranus
NASA Technical Reports Server (NTRS)
1997-01-01
Taking its first peek at Uranus, NASA Hubble Space Telescope's Near Infrared Camera and Multi-Object Spectrometer (NICMOS) has detected six distinct clouds in images taken July 28,1997.
The image on the right, taken 90 minutes after the left-hand image, shows the planet's rotation. Each image is a composite of three near-infrared images. They are called false-color images because the human eye cannot detect infrared light. Therefore, colors corresponding to visible light were assigned to the images. (The wavelengths for the 'blue,' 'green,' and 'red' exposures are 1.1, 1.6, and 1.9 micrometers, respectively.) At visible and near-infrared light, sunlight is reflected from hazes and clouds in the atmosphere of Uranus. However, at near-infrared light, absorption by gases in the Uranian atmosphere limits the view to different altitudes, causing intense contrasts and colors. In these images, the blue exposure probes the deepest atmospheric levels. A blue color indicates clear atmospheric conditions, prevalent at mid-latitudes near the center of the disk. The green exposure is sensitive to absorption by methane gas, indicating a clear atmosphere; but in hazy atmospheric regions, the green color is seen because sunlight is reflected back before it is absorbed. The green color around the south pole (marked by '+') shows a strong local haze. The red exposure reveals absorption by hydrogen, the most abundant gas in the atmosphere of Uranus. Most sunlight shows patches of haze high in the atmosphere. A red color near the limb (edge) of the disk indicates the presence of a high-altitude haze. The purple color to the right of the equator also suggests haze high in the atmosphere with a clear atmosphere below. The five clouds visible near the right limb rotated counterclockwise during the time between both images. They reach high into the atmosphere, as indicated by their red color. Features of such high contrast have never been seen before on Uranus. The clouds are almost as large as continents on Earth, such as Europe. Another cloud (which barely can be seen) rotated along the path shown by the black arrow. It is located at lower altitudes, as indicated by its green color. The rings of Uranus are extremely faint in visible light but quite prominent in the near infrared. The brightest ring, the epsilon ring, has a variable width around its circumference. Its widest and thus brightest part is at the top in this image. Two fainter, inner rings are visible next to the epsilon ring. Eight of the 10 small Uranian satellites, discovered by Voyager 2, can be seen in both images. Their sizes range from about 25 miles (40 kilometers) for Bianca to 100 miles (150 kilometers) for Puck. The smallest of these satellites have not been detected since the departure of Voyager 2 from Uranus in 1986. These eight satellites revolve around Uranus in less than a day. The inner ones are faster than the outer ones. Their motion in the 90 minutes between both images is marked in the right panel. The area outside the rings was slightly enhanced in brightness to improve the visibility of these faint satellites. The Wide Field/Planetary Camera 2 was developed by the Jet Propulsion Laboratory and managed by the Goddard Space Flight Center for NASA's Office of Space Science. This image and other images and data received from the Hubble Space Telescope are posted on the World Wide Web on the Space Telescope Science Institute home page at URL http://oposite.stsci.edu/pubinfo/
Spatial and Angular Resolution Enhancement of Light Fields Using Convolutional Neural Networks
NASA Astrophysics Data System (ADS)
Gul, M. Shahzeb Khan; Gunturk, Bahadir K.
2018-05-01
Light field imaging extends the traditional photography by capturing both spatial and angular distribution of light, which enables new capabilities, including post-capture refocusing, post-capture aperture control, and depth estimation from a single shot. Micro-lens array (MLA) based light field cameras offer a cost-effective approach to capture light field. A major drawback of MLA based light field cameras is low spatial resolution, which is due to the fact that a single image sensor is shared to capture both spatial and angular information. In this paper, we present a learning based light field enhancement approach. Both spatial and angular resolution of captured light field is enhanced using convolutional neural networks. The proposed method is tested with real light field data captured with a Lytro light field camera, clearly demonstrating spatial and angular resolution improvement.
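As a rough illustration of the learning-based enhancement idea (not the network described in the paper), the sketch below defines a tiny convolutional model that takes four neighbouring sub-aperture views of a light field and synthesizes an intermediate view; the architecture, channel counts, and input sizes are assumptions for illustration.

```python
# Toy illustration of learning-based angular super-resolution for a light field:
# a small CNN takes four neighbouring sub-aperture views (stacked as channels)
# and synthesizes the view in between.  Shapes and layer sizes are assumptions.
import torch
import torch.nn as nn

class AngularSRNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, corner_views):            # (N, 4, H, W) grayscale views
        return self.net(corner_views)           # (N, 1, H, W) synthesized view

model = AngularSRNet()
dummy_views = torch.rand(1, 4, 64, 64)          # four 64x64 sub-aperture views
synthesized = model(dummy_views)
print(synthesized.shape)                        # torch.Size([1, 1, 64, 64])
```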
The Use of Light-Emitting Diodes (LEDs) as Green and Red/Far-Red Light Sources in Plant Physiology.
ERIC Educational Resources Information Center
Jackson, David L.; And Others
1985-01-01
The use of green, red, and far-red light-emitting diodes (LEDs) as light sources for plant physiological studies is outlined and evaluated. Indicates that LED lamps have the advantage over conventional light sources in that they are lightweight, low-cost, portable, easily constructed, and do not require color filters. (Author/DH)
V838 Monocerotis revisited: Space phenomenon imitates art
NASA Astrophysics Data System (ADS)
2004-03-01
"Starry Night", Vincent van Gogh's famous painting, is renowned for its bold whorls of light sweeping across a raging night sky. Although this image of the heavens came only from the artist's restless imagination, a new picture from the NASA/ESA Hubble Space Telescope bears remarkable similarities to the van Gogh work, complete with never-before-seen spirals of dust swirling across trillions of kilometres of interstellar space. This image, obtained with the Advanced Camera for Surveys on 8 February 2004, is Hubble's latest view of an expanding halo of light around a distant star, named V838 Monocerotis (V838 Mon). The illumination of interstellar dust comes from the red supergiant star at the middle of the image, which gave off a flashbulb-like pulse of light two years ago. V838 Mon is located about 20,000 light-years away from Earth in the direction of the constellation Monoceros, placing the star at the outer edge of our Milky Way galaxy. Called a 'light echo', the expanding illumination of a dusty cloud around the star has been revealing remarkable structures ever since the star suddenly brightened for several weeks in early 2002. Though Hubble has followed the light echo in several snapshots, this new image shows swirls or eddies in the dusty cloud for the first time. These eddies are probably caused by turbulence in the dust and gas around the star as they slowly expand away. The dust and gas were likely ejected from the star in a previous explosion, similar to the 2002 event, which occurred some tens of thousands of years ago.
The surrounding dust remained invisible and unsuspected until suddenly illuminated by the brilliant explosion of the central star two years ago. The Hubble Space Telescope has imaged V838 Mon and its light echo several times since the star's outburst in January 2002, in order to follow the constantly changing appearance of the dust as the pulse of illumination continues to expand away from the star at the speed of light. During the outburst event, the normally faint star suddenly brightened, becoming 600 000 times more luminous than our Sun. It was thus one of the most luminous stars in the entire Milky Way, until it faded away again in April 2002. The star has some similarities to a class of objects called 'novae', which suddenly increase in brightness due to thermonuclear explosions at their surfaces; however, the detailed behaviour of V838 Mon, in particular its extremely red colour, has been completely different from any previously known nova. Nature's own piece of performance art, this structure will continue to change its appearance in coming years as the light from the stellar outburst continues to propagate outward and bounce off more distant black clouds of dust. Astronomers expect the echoes to remain visible for at least the rest of the current decade. The colour image is composed of three different exposures through a blue filter (5250 seconds), a green filter (1050 seconds) and a near-infrared filter (300 seconds). Notes for editors: Animations of the discovery and general Hubble Space Telescope background footage are available from: http://www.spacetelescope.org/bin/videos.pl?&string=heic0405 Image credit: NASA, the Hubble Heritage Team (AURA/STScI) and ESA The Hubble Space Telescope is a project of international cooperation between ESA and NASA.
External Mask Based Depth and Light Field Camera
2013-12-08
High spatial resolution depth and light fields are a rich source of information about the plenoptic function.
NASA Astrophysics Data System (ADS)
Szentgyorgyi, Andrew; Baldwin, Daniel; Barnes, Stuart; Bean, Jacob; Ben-Ami, Sagi; Brennan, Patricia; Budynkiewicz, Jamie; Chun, Moo-Young; Conroy, Charlie; Crane, Jeffrey D.; Epps, Harland; Evans, Ian; Evans, Janet; Foster, Jeff; Frebel, Anna; Gauron, Thomas; Guzmán, Dani; Hare, Tyson; Jang, Bi-Ho; Jang, Jeong-Gyun; Jordan, Andres; Kim, Jihun; Kim, Kang-Miin; Mendes de Oliveira, Claudia Mendes; Lopez-Morales, Mercedes; McCracken, Kenneth; McMuldroch, Stuart; Miller, Joseph; Mueller, Mark; Oh, Jae Sok; Onyuksel, Cem; Ordway, Mark; Park, Byeong-Gon; Park, Chan; Park, Sung-Joon; Paxson, Charles; Phillips, David; Plummer, David; Podgorski, William; Seifahrt, Andreas; Stark, Daniel; Steiner, Joao; Uomoto, Alan; Walsworth, Ronald; Yu, Young-Sam
2016-08-01
The GMT-Consortium Large Earth Finder (G-CLEF) will be a cross-dispersed, optical band echelle spectrograph to be delivered as the first light scientific instrument for the Giant Magellan Telescope (GMT) in 2022. G-CLEF is vacuum enclosed and fiber-fed to enable precision radial velocity (PRV) measurements, especially for the detection and characterization of low-mass exoplanets orbiting solar-type stars. The passband of G-CLEF is broad, extending from 3500 Å to 9500 Å. This passband provides good sensitivity at blue wavelengths for stellar abundance studies and deep red response for observations of high-redshift phenomena. The design of G-CLEF incorporates several novel technical innovations. We give an overview of the innovative features of the current design. G-CLEF will be the first PRV spectrograph to have a composite optical bench so as to exploit that material's extremely low coefficient of thermal expansion, high in-plane thermal conductivity and high stiffness-to-mass ratio. The spectrograph camera subsystem is divided into a red and a blue channel, split by a dichroic, so there are two independent refractive spectrograph cameras. The control system software is being developed in a model-driven software context that has been adopted globally by the GMT. G-CLEF has been conceived and designed within a strict systems engineering framework. As a part of this process, we have developed an analytical toolset to assess the predicted performance of G-CLEF as it has evolved through design phases.
Golan, A; Tepper, M; Soudry, E; Horwitz, B A; Gepstein, S
1996-01-01
Cytokinin replaces light in several aspects of the photomorphogenesis of dicot seedlings. Arabidopsis thaliana seedlings grown under red light have been shown to become disoriented, losing the negative hypocotyl gravitropism that has been observed in seedlings grown in darkness or white light. We report here that cytokinin at micromolar concentrations restores gravitropism to seedlings grown under red light. Cytokinin cancels the effect of red light on the gravity-sensing system and at the same time replaces light in the inhibition of hypocotyl elongation. Furthermore, application of the ethylene precursor 1-aminocyclopropane-1-carboxylic acid acts similarly to cytokinin. Cytokinin cannot restore gravitropism under red light to an ethylene-insensitive mutant that is defective at the EIN2 locus. Stimulation of ethylene production, therefore, can explain the action of cytokinin in restoring negative gravitropism to the hypocotyls of Arabidopsis seedlings grown under continuous red light. PMID:8938401
Interactions between red light, abscisic acid, and calcium in gravitropism
NASA Technical Reports Server (NTRS)
Leopold, A. C.; LaFavre, A. K.
1989-01-01
The effect of red light on orthogravitropism of Merit corn (Zea mays L.) roots has been attributed to its effects on the transduction phase of gravitropism (AC Leopold, SH Wettlaufer [1988] Plant Physiol 87:803-805). In an effort to characterize the orthogravitropic transduction system, comparative experiments have been carried out on the effects of red light, calcium, and abscisic acid (ABA). The red light effect can be completely satisfied with added ABA (100 micromolar) or with osmotic shock, which is presumed to increase endogenous ABA. The decay of the red light effect is closely paralleled by the decay of the ABA effect. ABA and exogenous calcium show strong additive effects when applied to either Merit or a line of corn which does not require red light for orthogravitropism. Measurements of the ABA content show marked increases in endogenous ABA in the growing region of the roots after red light. The interpretation is offered that red light or ABA may serve to increase the cytoplasmic concentrations of calcium, and that this may be an integral part of orthogravitropic transduction.
Dworak, Volker; Selbeck, Joern; Dammer, Karl-Heinz; Hoffmann, Matthias; Zarezadeh, Ali Akbar; Bobda, Christophe
2013-01-24
The application of (smart) cameras for process control, mapping, and advanced imaging in agriculture has become an element of precision farming that facilitates the conservation of fertilizer, pesticides, and machine time. This technique additionally reduces the amount of energy required in terms of fuel. Although research activities have increased in this field, high camera prices reflect low adaptation to applications in all fields of agriculture. Smart, low-cost cameras adapted for agricultural applications can overcome this drawback. The normalized difference vegetation index (NDVI) for each image pixel is an applicable algorithm to discriminate plant information from the soil background enabled by a large difference in the reflectance between the near infrared (NIR) and the red channel optical frequency band. Two aligned charge coupled device (CCD) chips for the red and NIR channel are typically used, but they are expensive because of the precise optical alignment required. Therefore, much attention has been given to the development of alternative camera designs. In this study, the advantage of a smart one-chip camera design with NDVI image performance is demonstrated in terms of low cost and simplified design. The required assembly and pixel modifications are described, and new algorithms for establishing an enhanced NDVI image quality for data processing are discussed.
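The NDVI referred to above is the standard per-pixel index NDVI = (NIR − RED) / (NIR + RED), computed from co-registered near-infrared and red channel images. A minimal NumPy sketch follows; the threshold used to mask plant pixels is an illustrative assumption, not a value from the study.

```python
# Per-pixel NDVI as used to separate plant material from the soil background:
# NDVI = (NIR - RED) / (NIR + RED).  Minimal sketch; threshold is illustrative.
import numpy as np

def ndvi(nir, red, eps=1e-6):
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Example: bright NIR / dark red pixels (vegetation) give NDVI near +1,
# soil-like pixels give values near 0.
nir = np.array([[200, 60], [180, 55]], dtype=np.uint8)
red = np.array([[ 40, 50], [ 50, 48]], dtype=np.uint8)
index = ndvi(nir, red)
plant_mask = index > 0.4     # hypothetical threshold for "plant" pixels
print(index.round(2))
print(plant_mask)
```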
Tozzi, Sabrina; Lercari, Bartolomeo; Angelini, Luciana G
2005-01-01
Isatis tinctoria L. and Isatis indigotica Fort. are biennial herbaceous plants belonging to the family of Cruciferae that are used as a source of natural indigo and show several morphological and genetic differences. Production of indigo (indigotin) precursors, indican (indoxyl beta-D glucoside) and isatan B (indoxyl ketogluconate), together with seed germination ability were compared in Isatis tinctoria and Isatis indigotica grown under six different light conditions (darkness, white, red, far red, blue, yellow light) at 25 degrees C. Light quality influenced both germination and production of indigo precursors in the two Isatis species. Different responsiveness to far red and blue light was observed. Indeed, a detrimental effect on germination by blue and far red light was found in I. tinctoria only. Different amounts of isatan B were produced under red and far red light in the two Isatis species. In I. tinctoria, the level of main indigo precursor isatan B was maximal under red light and minimal under far red light. Whereas in I. indigotica far red light promoted a large accumulation of isatan B. The photon fluence rate dependency for white and yellow light responses showed that the accumulation of indigo precursors was differently influenced in the two Isatis species. In particular, both white and yellow light enhanced above 40 micromol m(-2) s(-1) the production of isatan B in I. indigotica while only white light showed a photon fluence dependency in I. tinctoria. These results suggest a different role played by the labile and stable phytochrome species (phyA and phyB) in the isatan B production in I. tinctoria and I. indigotica. I. indigotica, whose germination percentage was not influenced by light quality, demonstrated higher germination capability compared with I. tinctoria. In fact, I. tinctoria showed high frequency of germination in darkness and under light sources that establish high phytochrome photoequilibrium (red, white and yellow light). Germination in I. tinctoria was negatively affected by far red and blue light. I. indigotica seeds appear to be indifferent to canopy-like light (far red). Our results provide further insights on the distinct behaviour of I. tinctoria and I. indigotica that belong to two different genetic clusters and different original environments.
[Research of spectrum characteristics for light conversion agricultural films].
Zhang, Song-pei; Li, Jian-yu; Chen, Juan; Xiao, Yang; Sun, Yu-e
2004-10-01
The solar spectrum and the function spectrum of chrysanthemum and tomato were determined in this paper. The study of the relationship between plant growth and the solar spectrum showed that plants use ultraviolet light of 280-380 nm, yellow-green light of 500-600 nm, and near-infrared light above 720 nm with low efficiency, whereas blue-purple light of 430-480 nm and red light of 630-690 nm are beneficial for enhancing photosynthesis and promoting plant growth. According to plant photosynthesis and the characteristics of the solar spectrum, the authors developed a CaS:Cu+, Cl- blue light film and a red light film doped with CaS:Eu2+, Mn2+, Cl- that converts green light into red light, and discussed the spectral characteristics of red-blue double-peak agricultural films and of rare earth organic complexes that can convert ultraviolet light into red light. Research on light-conversion agents in farm films is approaching new breakthroughs, and anti-Stokes conversion technology for red films that convert near-infrared light deserves attention.
Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid
2016-06-13
Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photodetector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.
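The dynamic-range figures quoted above follow the usual decibel definition DR = 20·log10(brightest measurable level / dimmest measurable level). The snippet below back-computes illustrative intensity ratios consistent with the quoted 51.3 dB and 82.06 dB values; the absolute levels are made up for the example.

```python
# Standard dynamic-range arithmetic: DR_dB = 20 * log10(max_level / min_level).
# The level ratios below are chosen only to reproduce the quoted figures.
import math

def dynamic_range_db(max_level, min_level):
    return 20.0 * math.log10(max_level / min_level)

print(dynamic_range_db(368, 1))       # ~51.3 dB, the rated CMOS-only range
print(dynamic_range_db(12_670, 1))    # ~82.1 dB, the demonstrated CAOS-CMOS range
```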
Maytin, Edward V; Kaw, Urvashi; Ilyas, Muneeb; Mack, Judith A; Hu, Bo
2018-06-01
Photodynamic therapy (PDT) is a non-scarring alternative for treating basal cell carcinoma (BCC) in patients with Basal Cell Nevus Syndrome (BCNS), also known as Gorlin syndrome. In Europe, red light (635 nm) is the predominant source for PDT, whereas in the United States blue light (400 nm) is more widely available. The objective of this study was to conduct a head-to-head comparison of blue light and red light PDT in the same BCNS patients. In a pilot study of three patients with 141 BCC lesions, 5-aminolevulinate (20% solution) was applied to all tumors. After 4 h, half of the tumors were illuminated with blue light and the remainder with red light. To ensure safety while treating this many tumors simultaneously, light doses were escalated gradually. Six treatments were administered in three biweekly sessions over 4 months, with a final evaluation at 6 months. Tumor status was documented with high-resolution photographs. Persistent lesions were biopsied at 6 months. Clearance rates after blue light (98%) were slightly better than after red light (93%), with blue light shown to be statistically non-inferior to red light. Eight suspicious lesions were biopsied, 5 after red light (5/5 were BCC) and 3 after blue light (1 was BCC). Blue light PDT was reportedly less painful. Blue light and red light PDT appear to be equally safe and perhaps equally effective for treating BCC tumors in BCNS patients. Further studies to evaluate long-term clearance after blue light PDT are needed. Copyright © 2018 Elsevier B.V. All rights reserved.
Direct measurement of the transition from edge to core power coupling in a light-ion helicon source
NASA Astrophysics Data System (ADS)
Piotrowicz, P. A.; Caneses, J. F.; Showers, M. A.; Green, D. L.; Goulding, R. H.; Caughman, J. B. O.; Biewer, T. M.; Rapp, J.; Ruzic, D. N.
2018-05-01
We present time-resolved measurements of an edge-to-core power transition in a light-ion (deuterium) helicon discharge in the form of infra-red camera imaging of a thin stainless steel target plate on the Proto-Material Exposure eXperiment device. The time-resolved images measure the two-dimensional distribution of power deposition in the helicon discharge. The discharge displays a mode transition characterized by a significant increase in the on-axis electron density and core power coupling, suppression of edge power coupling, and the formation of a fast-wave radial eigenmode. Although the self-consistent mechanism that drives this transition is not yet understood, the edge-to-core power transition displays characteristics that are consistent with the discharge entering a slow-wave anti-resonant regime. RF magnetic field measurements made across the plasma column, together with the power deposition results, provide direct evidence to support the suppression of the slow-wave in favor of core plasma production by the fast-wave in a light-ion helicon source.
A Red-Light Running Prevention System Based on Artificial Neural Network and Vehicle Trajectory Data
Li, Pengfei; Li, Yan; Guo, Xiucheng
2014-01-01
The high frequency of red-light running and complex driving behaviors at the yellow onset at intersections cannot be explained solely by the dilemma zone and vehicle kinematics. In this paper, the authors present a red-light running prevention system based on artificial neural networks (ANNs), which approximate the complex driver behaviors during yellow and all-red clearance and serve as the basis of an innovative red-light running prevention system. The artificial neural network and vehicle trajectory data are applied to identify potential red-light runners. The ANN training time was acceptable, and its prediction accuracy was over 80%. Lastly, a prototype red-light running prevention system with the trained ANN model is described. This new system can be retrofitted directly into existing traffic signal systems. PMID:25435870
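A toy sketch of the underlying classification idea follows: a small ANN trained on trajectory features at yellow onset (speed and distance to the stop line) flags vehicles likely to enter on red. The features, the synthetic labeling rule, and the network size are invented for illustration; this is not the model or data from the paper.

```python
# Illustrative sketch of classifying potential red-light runners from trajectory
# features at yellow onset.  Synthetic data and a toy labeling rule only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 400
speed = rng.uniform(5, 25, n)              # m/s at yellow onset
distance = rng.uniform(5, 120, n)          # m to the stop line
time_to_line = distance / speed

# Toy labeling rule (not from the paper): vehicles that would reach the line
# shortly after a ~4 s yellow/all-red window are treated as potential runners.
runner = ((time_to_line > 4.0) & (time_to_line < 7.0)).astype(int)

X = np.column_stack([speed, distance])
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, runner)
print(clf.score(X, runner))                # training accuracy on the toy data
print(clf.predict([[20.0, 110.0]]))        # classify a new vehicle (20 m/s, 110 m out)
```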
NASA Astrophysics Data System (ADS)
Oldenbürger, S.; Brandt, C.; Brochard, F.; Lemoine, N.; Bonhomme, G.
2010-06-01
Fast visible imaging is used on a cylindrical magnetized argon plasma produced by thermionic discharge in the Mirabelle device. To link the information collected with the camera to a physical quantity, fast camera movies of plasma structures are compared to Langmuir probe measurements. High correlation is found between light fluctuations and plasma density fluctuations. Contributions from neutral argon and ionized argon to the overall light intensity are separated by using interference filters and a light intensifier. Light emitting transitions are shown to involve a metastable neutral argon state that can be excited by thermal plasma electrons, thus explaining the good correlation between light and density fluctuations. The propagation velocity of plasma structures is calculated by adapting velocimetry methods to the fast camera movies. The resulting estimates of instantaneous propagation velocity are in agreement with former experiments. The computation of mean velocities is discussed.
Chromatic aberration correction: an enhancement to the calibration of low-cost digital dermoscopes.
Wighton, Paul; Lee, Tim K; Lui, Harvey; McLean, David; Atkins, M Stella
2011-08-01
We present a method for calibrating low-cost digital dermoscopes that corrects for color and inconsistent lighting and also corrects for chromatic aberration. Chromatic aberration is a form of radial distortion that often occurs in inexpensive digital dermoscopes and creates red and blue halo-like effects on edges. Being radial in nature, distortions due to chromatic aberration are not constant across the image, but rather vary in both magnitude and direction. As a result, distortions are not only visually distracting but could also mislead automated characterization techniques. Two low-cost dermoscopes, based on different consumer-grade cameras, were tested. Color is corrected by imaging a reference and applying singular value decomposition to determine the transformation required to ensure accurate color reproduction. Lighting is corrected by imaging a uniform surface and creating lighting correction maps. Chromatic aberration is corrected using a second-order radial distortion model. Our results for color and lighting calibration are consistent with previously published results, while distortions due to chromatic aberration can be reduced by 42-47% in the two systems considered. The disadvantages of inexpensive dermoscopy can be quickly substantially mitigated with a suitable calibration procedure. © 2011 John Wiley & Sons A/S.
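Chromatic aberration is modelled above as a channel-dependent radial distortion corrected with a second-order model. The sketch below applies such a correction independently to the red and blue channels using nearest-neighbour resampling; the distortion coefficients are hypothetical, whereas in the study they would be estimated from the calibration reference.

```python
# Sketch of a second-order radial correction applied per colour channel
# (chromatic aberration appears as channel-dependent radial distortion).
# The coefficients k are hypothetical placeholders.
import numpy as np

def undistort_channel(channel, k, cx=None, cy=None):
    """Resample one channel with an inverse second-order radial model:
    r_src = r_dst * (1 + k * r_dst**2), radii normalized to the half-diagonal."""
    h, w = channel.shape
    cy = (h - 1) / 2 if cy is None else cy
    cx = (w - 1) / 2 if cx is None else cx
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)
    dx, dy = xx - cx, yy - cy
    norm = np.hypot(cx, cy)
    r2 = (dx**2 + dy**2) / norm**2
    scale = 1.0 + k * r2
    xs = np.clip(cx + dx * scale, 0, w - 1).round().astype(int)
    ys = np.clip(cy + dy * scale, 0, h - 1).round().astype(int)
    return channel[ys, xs]

rgb = np.random.rand(100, 100, 3)              # stand-in for a dermoscopic image
corrected = rgb.copy()
corrected[..., 0] = undistort_channel(rgb[..., 0], k=+0.02)   # red channel
corrected[..., 2] = undistort_channel(rgb[..., 2], k=-0.02)   # blue channel
print(corrected.shape)
```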
Carvalho, Sofia D; Folta, Kevin M
2014-01-01
Different light wavelengths have specific effects on plant growth and development. Narrow-bandwidth light-emitting diode (LED) lighting may be used to directionally manipulate size, color and metabolites in high-value fruits and vegetables. In this report, Red Russian kale (Brassica napus) seedlings were grown under specific light conditions and analyzed for photomorphogenic responses, pigment accumulation and nutraceutical content. The results showed that this genotype responds predictably to darkness, blue and red light, with suppression of hypocotyl elongation, development of pigments and changes in specific metabolites. However, these seedlings were relatively hypersensitive to far-red light, leading to uncharacteristically short hypocotyls and high pigment accumulation, even after growth under very low fluence rates (<1 μmol m−2 s−1). General antioxidant levels and aliphatic glucosinolates are elevated by far-red light treatments. Sequential treatments of darkness, blue light, red light and far-red light were applied throughout sprout development to alter final product quality. These results indicate that sequential treatment with narrow-bandwidth light may be used to affect key economically important traits in high-value crops. PMID:26504531
Multi-pulse shadowgraphic RGB illumination and detection for flow tracking
NASA Astrophysics Data System (ADS)
Menser, Jan; Schneider, Florian; Dreier, Thomas; Kaiser, Sebastian A.
2018-06-01
This work demonstrates the application of a multi-color LED and a consumer color camera for visualizing phase boundaries in two-phase flows, in particular for particle tracking velocimetry. The LED emits a sequence of short light pulses, red, green, then blue (RGB), and through its color-filter array, the camera captures all three pulses on a single RGB frame. In a backlit configuration, liquid droplets appear as shadows in each color channel. Color reversal and color cross-talk correction yield a series of three frozen-flow images that can be used for further analysis, e.g., determining the droplet velocity by particle tracking. Three example flows are presented, solid particles suspended in water, the penetrating front of a gasoline direct-injection spray, and the liquid break-up region of an "air-assisted" nozzle. Because of the shadowgraphic arrangement, long path lengths through scattering media lower image contrast, while visualization of phase boundaries with high resolution is a strength of this method. Apart from a pulse-and-delay generator, the overall system cost is very low.
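The velocimetry step described above reduces to locating the same droplet shadow in the red, green, and blue channels (one per LED pulse) and dividing the centroid displacement between channels by the pulse separation. A synthetic NumPy example is sketched below; the pulse timing and pixel pitch are assumed values, not parameters from the paper.

```python
# Sketch of the basic velocimetry step for an RGB multi-pulse shadowgraph:
# the droplet shadow sits at a different position in each colour channel, so
# velocity = centroid displacement between channels / pulse separation.
import numpy as np

def shadow_centroid(channel):
    """Centroid of the dark (shadow) pixels in one colour channel."""
    shadow = channel < 0.5 * channel.max()      # crude shadow segmentation
    ys, xs = np.nonzero(shadow)
    return np.array([xs.mean(), ys.mean()])     # (x, y) in pixels

# Synthetic frame: a dark disc shifted by 5 px between successive pulses.
frame = np.ones((64, 64, 3))
yy, xx = np.mgrid[0:64, 0:64]
for i, x0 in enumerate((20, 25, 30)):           # R, G, B pulse positions
    frame[(xx - x0)**2 + (yy - 32)**2 < 16, i] = 0.0

dt = 20e-6                                      # assumed 20 us between pulses
pixel_pitch = 50e-6                             # assumed 50 um per pixel in object space
c_r, c_g = shadow_centroid(frame[..., 0]), shadow_centroid(frame[..., 1])
velocity = (c_g - c_r) * pixel_pitch / dt       # m/s, x and y components
print(velocity)
```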
Prediction of Viking lander camera image quality
NASA Technical Reports Server (NTRS)
Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.
1976-01-01
Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.
Optical stereo video signal processor
NASA Technical Reports Server (NTRS)
Craig, G. D. (Inventor)
1985-01-01
An optical video signal processor is described that produces a two-dimensional cross-correlation in real time of images received by a stereo camera system. The optical image of each camera is projected onto a respective liquid crystal light valve. The images on the liquid crystal valves modulate light produced by an extended light source. This modulated light output becomes the two-dimensional cross-correlation when focused onto a video detector and is a function of the range of a target with respect to the stereo camera. Alternate embodiments utilize the two-dimensional cross-correlation to determine target movement and target identification.
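A digital analogue of this optical correlator is straightforward to sketch: the disparity between the left and right images is found at the peak of their two-dimensional cross-correlation, and range then follows from the usual stereo relation range = focal length × baseline / disparity. The example below uses SciPy with assumed camera parameters and synthetic images.

```python
# Digital counterpart of the optical correlator: find stereo disparity at the
# peak of the 2-D cross-correlation, then convert disparity to range.
# Camera parameters and images are synthetic/assumed.
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(1)
left = rng.random((64, 64))
true_shift = 7                                   # pixels of horizontal disparity
right = np.roll(left, -true_shift, axis=1)

corr = correlate2d(left - left.mean(), right - right.mean(), mode='same')
peak_y, peak_x = np.unravel_index(np.argmax(corr), corr.shape)
disparity = peak_x - (corr.shape[1] // 2)        # offset from the zero-lag position
print("estimated disparity (px):", disparity)

focal_px, baseline_m = 800.0, 0.12               # assumed camera parameters
print("estimated range (m):", focal_px * baseline_m / abs(disparity))
```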
ERIC Educational Resources Information Center
Fisher, Diane K.; Novati, Alexander
2009-01-01
On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…
Spitzer Reveals Stellar 'Family Tree'
NASA Technical Reports Server (NTRS)
2008-01-01
Generations of stars can be seen in this new infrared portrait from NASA's Spitzer Space Telescope. In this wispy star-forming region, called W5, the oldest stars can be seen as blue dots in the centers of the two hollow cavities (other blue dots are background and foreground stars not associated with the region). Younger stars line the rims of the cavities, and some can be seen as pink dots at the tips of the elephant-trunk-like pillars. The white knotty areas are where the youngest stars are forming. Red shows heated dust that pervades the region's cavities, while green highlights dense clouds. W5 spans an area of sky equivalent to four full moons and is about 6,500 light-years away in the constellation Cassiopeia. The Spitzer picture was taken over a period of 24 hours. Like other massive star-forming regions, such as Orion and Carina, W5 contains large cavities that were carved out by radiation and winds from the region's most massive stars. According to the theory of triggered star-formation, the carving out of these cavities pushes gas together, causing it to ignite into successive generations of new stars. This image contains some of the best evidence yet for the triggered star-formation theory. Scientists analyzing the photo have been able to show that the ages of the stars become progressively and systematically younger with distance from the center of the cavities. This is a three-color composite showing infrared observations from two Spitzer instruments. Blue represents 3.6-micron light and green shows light of 8 microns, both captured by Spitzer's infrared array camera. Red is 24-micron light detected by Spitzer's multiband imaging photometer.
NASA Technical Reports Server (NTRS)
2004-01-01
In the quest to better understand the birth of stars and the formation of new worlds, astronomers have used NASA's Spitzer Space Telescope to examine the massive stars contained in a cloudy region called Sharpless 140. This cloud is a fascinating microcosm of a star-forming region since it exhibits, within a relatively small area, all of the classic manifestations of stellar birth. Sharpless 140 lies almost 3000 light-years from Earth in the constellation Cepheus. At its heart is a cluster of three deeply embedded young stars, which are each several thousand times brighter than the Sun. Though they are strikingly visible in this image from Spitzer's infrared array camera, they are completely obscured in visible light, buried within the core of the surrounding dust cloud. The extreme youth of at least one of these stars is indicated by the presence of a stream of gas moving at high velocities. Such outflows are signatures of the processes surrounding a star that is still gobbling up material as part of its formation. The bright red bowl, or arc, seen in this image traces the outer surface of the dense dust cloud encasing the young stars. This arc is made up primarily of organic compounds called polycyclic aromatic hydrocarbons, which glow on the surface of the cloud. Ultraviolet light from a nearby bright star outside of the image is 'eating away' at these molecules. Eventually, this light will destroy the dust envelope and the masked young stars will emerge. This false-color image was taken on Oct. 11, 2003 and is composed of photographs obtained at four wavelengths: 3.6 microns (blue), 4.5 microns (green), 5.8 microns (orange) and 8 microns (red).
Spitzer Spies Spectacular Sombrero
2005-05-04
NASA's Spitzer Space Telescope set its infrared eyes on one of the most famous objects in the sky, Messier 104, also called the Sombrero galaxy. In this striking infrared picture, Spitzer sees an exciting new view of a galaxy that in visible light has been likened to a "sombrero," but here looks more like a "bulls-eye." Recent observations using Spitzer's infrared array camera uncovered the bright, smooth ring of dust circling the galaxy, seen in red. In visible light, because this galaxy is seen nearly edge-on, only the near rim of dust can be clearly seen in silhouette. Spitzer's full view shows the disk is warped, which is often the result of a gravitational encounter with another galaxy, and clumpy areas spotted in the far edges of the ring indicate young star-forming regions. Spitzer's infrared view of the starlight from this galaxy, seen in blue, can pierce through obscuring murky dust that dominates in visible light. As a result, the full extent of the bulge of stars and an otherwise hidden disk of stars within the dust ring are easily seen. The Sombrero galaxy is located some 28 million light years away. Viewed from Earth, it is just six degrees south of its equatorial plane. Spitzer detected infrared emission not only from the ring, but from the center of the galaxy too, where there is a huge black hole, believed to be a billion times more massive than our Sun. This picture is composed of four images taken at 3.6 (blue), 4.5 (green), 5.8 (orange), and 8.0 (red) microns. The contribution from starlight (measured at 3.6 microns) has been subtracted from the 5.8 and 8-micron images to enhance the visibility of the dust features. http://photojournal.jpl.nasa.gov/catalog/PIA07899
2004-05-11
In the quest to better understand the birth of stars and the formation of new worlds, astronomers have used NASA's Spitzer Space Telescope to examine the massive stars contained in a cloudy region called Sharpless 140. This cloud is a fascinating microcosm of a star-forming region since it exhibits, within a relatively small area, all of the classic manifestations of stellar birth. Sharpless 140 lies almost 3000 light-years from Earth in the constellation Cepheus. At its heart is a cluster of three deeply embedded young stars, which are each several thousand times brighter than the Sun. Though they are strikingly visible in this image from Spitzer's infrared array camera, they are completely obscured in visible light, buried within the core of the surrounding dust cloud. The extreme youth of at least one of these stars is indicated by the presence of a stream of gas moving at high velocities. Such outflows are signatures of the processes surrounding a star that is still gobbling up material as part of its formation. The bright red bowl, or arc, seen in this image traces the outer surface of the dense dust cloud encasing the young stars. This arc is made up primarily of organic compounds called polycyclic aromatic hydrocarbons, which glow on the surface of the cloud. Ultraviolet light from a nearby bright star outside of the image is "eating away" at these molecules. Eventually, this light will destroy the dust envelope and the masked young stars will emerge. This false-color image was taken on Oct. 11, 2003 and is composed of photographs obtained at four wavelengths: 3.6 microns (blue), 4.5 microns (green), 5.8 microns (orange) and 8 microns (red). http://photojournal.jpl.nasa.gov/catalog/PIA05878
Photodynamic therapy of acne vulgaris.
NASA Astrophysics Data System (ADS)
Ershova, Ekaterina Y.; Karimova, Lubov N.; Kharnas, Sergey S.; Kuzmin, Sergey G.; Loschenov, Victor B.
2003-06-01
Photodynamic therapy (PDT) with topical 5-aminolevulinic acid (ALA) was tested for the treatment of acne vulgaris. Patients with acne were treated with ALA plus red light. A ten percent aqueous solution of ALA was applied with 1.5-2 h occlusion and then 18-45 J/cm2 of 630 nm light was given. Bacterial endogenous porphyrin fluorescence was also used for acne therapy. Treatment control and diagnostics were performed using fluorescence spectra and fluorescence imaging. The following light sources and diagnostic systems were used: semiconductor laser (λ=630 nm, Pmax=1W), (LPhT-630-01-BIOSPEC); LED system for PDT and diagnostics with fluorescent imager (λ=635 nm, P=2W, p=50 mW/cm2), (UFPh-630-01-BIOSPEC); high sensitivity CCD video camera with narrow-band wavelength filter (central wavelength 630 nm); laser electronic spectrum analyzer for fluorescent diagnostics and photodynamic therapy monitoring (LESA-01-BIOSPEC). Protoporphyrin IX (PP IX) and endogenous porphyrin concentrations were measured by fluorescence at 700 nm and 650 nm, respectively. It was shown that topical ALA is converted into PP IX in hair follicles, sebaceous glands and acne scars. The amount of resulting PP IX is sufficient for effective PDT. There was a good clinical response and considerable clearance of acne lesions. ALA-PDT also had a good cosmetic effect in the treatment of acne scars. PDT with ALA and red light assists in opening corked pores, destroying Propionibacterium acnes and decreasing sebum secretion. PDT treatment was associated with several adverse effects: oedema and/or erythema for 3-5 days after PDT, epidermal exfoliation from the 5th to 10th day, and slight pigmentation for 1 month after PDT. ALA-PDT is effective for acne and can be used despite several side effects.
Dataset of red light induced pupil constriction superimposed on post-illumination pupil response.
Lei, Shaobo; Goltz, Herbert C; Sklar, Jaime C; Wong, Agnes M F
2016-09-01
We collected and analyzed pupil diameter data from 7 visually normal participants to compare the maximum pupil constriction (MPC) induced by "Red Only" vs. "Blue+Red" visual stimulation conditions. The "Red Only" condition consisted of red light (640±10 nm) stimuli of variable intensity and duration presented to dark-adapted eyes with pupils at resting state. This condition stimulates the cone-driven activity of the intrinsically photosensitive retinal ganglion cells (ipRGCs). The "Blue+Red" condition consisted of the same red light stimulus presented during the ongoing blue (470±17 nm) light-induced post-illumination pupil response (PIPR), representing the cone-driven ipRGC activity superimposed on the melanopsin-driven intrinsic activity of the ipRGCs ("The Absence of Attenuating Effect of Red light Exposure on Pre-existing Melanopsin-Driven Post-illumination Pupil Response", Lei et al. (2016) [1]). MPC induced by the "Red Only" condition was compared with MPC induced by the "Blue+Red" condition using multiple paired-sample t-tests with Bonferroni correction.
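The statistical comparison described above, paired-sample t-tests with a Bonferroni correction across the tested stimulus conditions, can be sketched as follows; the numbers below are synthetic, not the published dataset.

```python
# Sketch of paired-sample t-tests with Bonferroni correction, comparing MPC
# under "Red Only" vs. "Blue+Red" across several stimulus conditions.
# Data are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_participants, n_conditions = 7, 4
mpc_red_only = rng.normal(0.30, 0.05, size=(n_conditions, n_participants))
mpc_blue_red = mpc_red_only + rng.normal(0.02, 0.03, size=(n_conditions, n_participants))

alpha = 0.05
for i in range(n_conditions):
    t, p = stats.ttest_rel(mpc_red_only[i], mpc_blue_red[i])
    p_bonf = min(p * n_conditions, 1.0)          # Bonferroni-adjusted p-value
    print(f"condition {i}: t = {t:+.2f}, adjusted p = {p_bonf:.3f}, "
          f"significant = {p_bonf < alpha}")
```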
Kurt W. Gottschalk
1987-01-01
Northern red oak, black oak, black cherry, and red maple seedlings were grown under light treatments ranging from 8 to 94% of full sunlight for 2 years. Growth was least at the lowest light level and total dry weight increased with increasing light. Total dry-weight rankings (largest to smallest) at all light levels were black cherry, northern red oak, black oak, and...
Choi, Myoung-Soon; Yun, Sook Jung; Beom, Hee Ju; Park, Hyoung Ryun; Lee, Jee-Bum
2011-07-01
Propionibacterium acnes naturally produces endogenous porphyrins that are composed of coproporphyrin III (CPIII) and protoporphyrin IX (PpIX). Red light alone and photodynamic therapy (PDT) improve acne vulgaris clinically, but there remains a paucity of quantitative data that directly examine the bactericidal effects that result from PDT on P. acnes itself in vitro. The purpose of this study was to measure the difference of bactericidal effects of 5-aminolevulinic acid (ALA)-PDT with red and blue light on P. acnes. P. acnes were cultured under anaerobic conditions and divided into two groups (ALA-treated group and control group), and were then illuminated with blue (415 nm) and red (635 nm) lights using a light-emitting diode (LED). The cultured P. acnes were killed with both blue and red LED light illumination. The efficacy increased with larger doses of light and a greater number of consecutive illuminations. We demonstrated that red light phototherapy was less effective for the eradication of P. acnes than blue light phototherapy without the addition of ALA. However, pretreatment with ALA could enhance markedly the efficacy of red light phototherapy. © 2010 Japanese Dermatological Association.
Kim, Yeo Jin; Kim, Hyoung-June; Kim, Hye Lim; Kim, Hyo Jeong; Kim, Hyun Soo; Lee, Tae Ryong; Shin, Dong Wook; Seo, Young Rok
2017-02-01
The phototherapeutic effects of visible red light on skin have been extensively investigated, but the underlying biological mechanisms remain poorly understood. We aimed to elucidate the protective mechanism of visible red light in terms of DNA repair of UV-induced oxidative damage in normal human dermal fibroblasts. The protective effect of visible red light on UV-induced DNA damage was identified by several assays in both two-dimensional and three-dimensional cell culture systems. With regard to the protective mechanism of visible red light, our data showed alterations in base excision repair mediated by growth arrest and DNA damage inducible, alpha (GADD45A). We also observed an enhancement of the physical activity of GADD45A and apurinic/apyrimidinic endonuclease 1 (APE1) by visible red light. Moreover, UV-induced DNA damages were diminished by visible red light in an APE1-dependent manner. On the basis of the decrease in GADD45A-APE1 interaction in the activating transcription factor-2 (ATF2)-knockdown system, we suggest a role for ATF2 modulation in GADD45A-mediated DNA repair upon visible red light exposure. Thus, the enhancement of GADD45A-mediated base excision repair modulated by ATF2 might be a potential protective mechanism of visible red light. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
A Simple Spectrophotometer Using Common Materials and a Digital Camera
ERIC Educational Resources Information Center
Widiatmoko, Eko; Widayani; Budiman, Maman; Abdullah, Mikrajuddin; Khairurrijal
2011-01-01
A simple spectrophotometer was designed using cardboard, a DVD, a pocket digital camera, a tripod and a computer. The DVD was used as a diffraction grating and the camera as a light sensor. The spectrophotometer was calibrated using a reference light prior to use. The spectrophotometer was capable of measuring optical wavelengths with a…
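The calibration against a reference light mentioned above is commonly done by fitting a low-order polynomial that maps pixel column in the camera image to wavelength using known emission lines. A minimal sketch follows; the reference lines and pixel positions are made-up illustrations, not values from the paper.

```python
# Wavelength calibration sketch: map pixel column -> wavelength from known reference lines.
import numpy as np

ref_pixels      = np.array([215.0, 318.0, 680.0])       # pixel columns of known lines (illustrative)
ref_wavelengths = np.array([404.7, 435.8, 546.1])       # e.g. mercury lamp lines, in nm

coeffs = np.polyfit(ref_pixels, ref_wavelengths, deg=1) # wavelength ~ a*pixel + b
pixel_to_nm = np.poly1d(coeffs)

print(pixel_to_nm(500))                                 # wavelength estimate for an arbitrary pixel column
```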
Creating and Using a Camera Obscura
ERIC Educational Resources Information Center
Quinnell, Justin
2012-01-01
The camera obscura (Latin for "darkened room") is the earliest optical device and goes back over 2500 years. The small pinhole or lens at the front of the room allows light to enter and this is then "projected" onto a screen inside the room. This differs from a camera, which projects its image onto light-sensitive material.…
An assessment of the utility of a non-metric digital camera for measuring standing trees
Neil Clark; Randolph H. Wynne; Daniel L. Schmoldt; Matthew F. Winn
2000-01-01
Images acquired with a commercially available digital camera were used to make measurements on 20 red oak (Quercus spp.) stems. The ranges of diameter at breast height (DBH) and height to a 10 cm upper-stem diameter were 16-66 cm and 12-20 m, respectively. Camera stations located 3, 6, 9, 12, and 15 m from the stem were studied to determine the best distance to be...
Deng, Mingdan; Qian, Hongmei; Chen, Lili; Sun, Bo; Chang, Jiaqi; Miao, Huiying; Cai, Congxi; Wang, Qiaomei
2017-05-01
The effects of pre-harvest red light irradiation on the main health-promoting phytochemicals as well as the antioxidant activity of Chinese kale sprouts during postharvest storage were investigated. Six-day-old sprouts were treated with red light for 24 h before harvest and sampled for further analysis of nutritional quality on the first, second and third day after harvest. The results indicated that red light exposure notably postponed the degradation of aliphatic, indole, and total glucosinolates during postharvest storage. The vitamin C level was remarkably higher in red light treated sprouts on the first and second day after harvest when compared with the control. In addition, red light treatment also enhanced the accumulation of total phenolics and maintained a higher level of antioxidant activity than the control. All of the above results suggested that pre-harvest red light treatment might provide a new strategy to maintain the nutritive value of Chinese kale sprouts during postharvest storage. Copyright © 2016 Elsevier Ltd. All rights reserved.
Molecular mechanisms and ecological function of far-red light signalling.
Sheerin, David J; Hiltbrunner, Andreas
2017-11-01
Land plants possess the ability to sense and respond to far-red light (700-760 nm), which serves as an important environmental cue. Due to the nature of far-red light, it is not absorbed by chlorophyll and thus is enriched in canopy shade and will also penetrate deeper into soil than other visible wavelengths. Far-red light responses include regulation of seed germination, suppression of hypocotyl growth, induction of flowering and accumulation of anthocyanins, which depend on one member of the phytochrome photoreceptor family, phytochrome A (phyA). Here, we review the current understanding of the underlying molecular mechanisms of how plants sense far-red light through phyA and the physiological responses to this light quality. Light-activated phytochromes act on two primary pathways within the nucleus; suppression of the E3 ubiquitin ligase complex CUL4/DDB1 COP1/SPA and inactivation of the PHYTOCHROME INTERACTING FACTOR (PIF) family of bHLH transcription factors. These pathways integrate with other signal transduction pathways, including phytohormones, for tissue and developmental stage specific responses. Unlike other phytochromes that mediate red-light responses, phyA is transported from the cytoplasm to the nucleus in far-red light by the shuttle proteins FAR-RED ELONGATED HYPOCOTYL 1 (FHY1) and FHY1-LIKE (FHL). However, additional mechanisms must exist that shift the action of phyA to far-red light; current hypotheses are discussed. © 2017 John Wiley & Sons Ltd.
Green light in photomorphogenic development
NASA Astrophysics Data System (ADS)
Maruhnich, Stefanie Anne
Light quality, quantity, and duration provide essential environmental cues that shape plant growth and development. Over the last century, researchers have worked to discover how plants sense, integrate, and respond to red, blue, and far-red light. Green light is often considered a “benign” wavelength with little to no effect on plant development. However, sparse experiments in the literature demonstrate that green effects are often counterintuitive to normal light responses and oppose red- and blue-light-induced responses. Green light effects on plant growth and development are described here through the use of custom, tunable light-emitting diode (LED) chambers. These light sources allow for specific light qualities and quantities to be administered. The effects of green wavebands were assessed when red and blue photomorphogenic systems were active to answer the question: Are the effects of an inhibitor (green light) more evident in the presence of inducers (red and blue light)? In seedlings, supplemental green light increased hypocotyl elongation, opposite to the classical inhibition of hypocotyl elongation associated with growth in light and induced by red and blue wavebands. Results indicate that added green light induced a reversion of light-grown phenotypes. In mature plants, supplemental green light induced phenotypes typical of the shade-avoidance syndrome, including elongated petioles, smaller leaf areas, and leaf hyponasty. These responses are typical of lower-light conditions or far-red enriched environments. Contrary to far-red-light-induced shade-avoidance, data indicate that green light delays flowering. In Arabidopsis and strawberry plants, anthocyanin levels also decreased when green light was added to red and blue light treatments, which is again opposite to normal light-induced phenotypes. Photoreceptor mutants were tested and indicate green light effects in early development are cryptochrome-dependent. However, green-light-induced shade-avoidance responses were cryptochrome-independent. A candidate gene approach was used to identify other elements required for green light sensing and/or response. Defects in some green light responses were observed for mutants in CCD8/Max4, a putative carotenoid cleavage enzyme with high sequence similarity to a critical enzyme in animal vision. These data support a role for green light in plant development which opposes normal light-induced responses and indicate the existence of at least two green light sensing systems.
Ho Mien, Ivan; Chua, Eric Chern-Pin; Lau, Pauline; Tan, Luuan-Chin; Lee, Ivan Tian-Guang; Yeo, Sing-Chen; Tan, Sara Shuhui; Gooley, Joshua J
2014-01-01
Exposure to light is a major determinant of sleep timing and hormonal rhythms. The role of retinal cones in regulating circadian physiology remains unclear, however, as most studies have used light exposures that also activate the photopigment melanopsin. Here, we tested the hypothesis that exposure to alternating red light and darkness can enhance circadian resetting responses in humans by repeatedly activating cone photoreceptors. In a between-subjects study, healthy volunteers (n = 24, 21-28 yr) lived individually in a laboratory for 6 consecutive days. Circadian rhythms of melatonin, cortisol, body temperature, and heart rate were assessed before and after exposure to 6 h of continuous red light (631 nm, 13 log photons cm(-2) s(-1)), intermittent red light (1 min on/off), or bright white light (2,500 lux) near the onset of nocturnal melatonin secretion (n = 8 in each group). Melatonin suppression and pupillary constriction were also assessed during light exposure. We found that circadian resetting responses were similar for exposure to continuous versus intermittent red light (P = 0.69), with an average phase delay shift of almost an hour. Surprisingly, 2 subjects who were exposed to red light exhibited circadian responses similar in magnitude to those who were exposed to bright white light. Red light also elicited prolonged pupillary constriction, but did not suppress melatonin levels. These findings suggest that, for red light stimuli outside the range of sensitivity for melanopsin, cone photoreceptors can mediate circadian phase resetting of physiologic rhythms in some individuals. Our results also show that sensitivity thresholds differ across non-visual light responses, suggesting that cones may contribute differentially to circadian resetting, melatonin suppression, and the pupillary light reflex during exposure to continuous light.
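As a rough check on the stimulus intensity quoted above (13 log photons cm(-2) s(-1) at 631 nm), the photon flux can be converted to irradiance using the photon energy hc/λ. The snippet below is purely illustrative arithmetic, not part of the study.

```python
# Back-of-the-envelope conversion of photon flux to irradiance for the 631 nm stimulus.
h = 6.626e-34          # Planck constant, J*s
c = 2.998e8            # speed of light, m/s
wavelength = 631e-9    # m

photon_energy = h * c / wavelength                 # ~3.15e-19 J per photon
flux_cm2 = 10 ** 13                                # photons cm^-2 s^-1 (13 log units)
irradiance_w_cm2 = flux_cm2 * photon_energy        # ~3.1e-6 W/cm^2
print(f"{irradiance_w_cm2 * 1e6:.1f} uW/cm^2")     # a few microwatts per square centimetre
```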
NASA Astrophysics Data System (ADS)
Sajjadi, Seyed; Buelna, Xavier; Eloranta, Jussi
2018-01-01
Application of inexpensive light emitting diodes as backlight sources for time-resolved shadowgraph imaging is demonstrated. The two light sources tested are able to produce light pulse sequences in the nanosecond and microsecond time regimes. After determining their time response characteristics, the diodes were applied to study the gas bubble formation around laser-heated copper nanoparticles in superfluid helium at 1.7 K and to determine the local cavitation bubble dynamics around fast moving metal micro-particles in the liquid. A convolutional neural network algorithm for analyzing the shadowgraph images by a computer is presented and the method is validated against the results from manual image analysis. The second application employed the red-green-blue light emitting diode source that produces light pulse sequences of the individual colors such that three separate shadowgraph frames can be recorded onto the color pixels of a charge-coupled device camera. Such an image sequence can be used to determine the moving object geometry, local velocity, and acceleration/deceleration. These data can be used to calculate, for example, the instantaneous Reynolds number for the liquid flow around the particle. Although specifically demonstrated for superfluid helium, the technique can be used to study the dynamic response of any medium that exhibits spatial variations in the index of refraction.
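The abstract notes that the three colour-encoded frames yield position, velocity, acceleration and hence an instantaneous Reynolds number; the sketch below illustrates that bookkeeping with finite differences. All numeric values (pulse spacing, particle size, fluid properties) are placeholders, not values from the paper.

```python
# Kinematics from three positions recorded in the R, G and B frames of one colour image,
# plus an instantaneous Reynolds number Re = rho * v * d / mu. Values are placeholders.
import numpy as np

dt = 5e-6                                            # s, spacing between R, G, B pulses (assumed)
positions = np.array([12.0e-6, 40.0e-6, 75.0e-6])    # m, particle centre in the three colour frames

v1 = (positions[1] - positions[0]) / dt              # velocity between frames 1 and 2
v2 = (positions[2] - positions[1]) / dt              # velocity between frames 2 and 3
accel = (v2 - v1) / dt                               # acceleration (or deceleration)

d   = 20e-6                                          # m, particle diameter (assumed)
rho = 145.0                                          # kg/m^3, liquid density (placeholder)
mu  = 1.3e-6                                         # Pa*s, dynamic viscosity (placeholder)
reynolds = rho * 0.5 * (v1 + v2) * d / mu
print(f"v ~ {0.5*(v1+v2):.2f} m/s, a ~ {accel:.2e} m/s^2, Re ~ {reynolds:.0f}")
```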
Organic-on-silicon complementary metal-oxide-semiconductor colour image sensors.
Lim, Seon-Jeong; Leem, Dong-Seok; Park, Kyung-Bae; Kim, Kyu-Sik; Sul, Sangchul; Na, Kyoungwon; Lee, Gae Hwang; Heo, Chul-Joon; Lee, Kwang-Hee; Bulliard, Xavier; Satoh, Ryu-Ichi; Yagi, Tadao; Ro, Takkyun; Im, Dongmo; Jung, Jungkyu; Lee, Myungwon; Lee, Tae-Yon; Han, Moon Gyu; Jin, Yong Wan; Lee, Sangyoon
2015-01-12
Complementary metal-oxide-semiconductor (CMOS) colour image sensors are representative examples of light-detection devices. To achieve extremely high resolutions, the pixel sizes of the CMOS image sensors must be reduced to less than a micron, which in turn significantly limits the number of photons that can be captured by each pixel using silicon (Si)-based technology (i.e., this reduction in pixel size results in a loss of sensitivity). Here, we demonstrate a novel and efficient method of increasing the sensitivity and resolution of the CMOS image sensors by superposing an organic photodiode (OPD) onto a CMOS circuit with Si photodiodes, which consequently doubles the light-input surface area of each pixel. To realise this concept, we developed organic semiconductor materials with absorption properties selective to green light and successfully fabricated highly efficient green-light-sensitive OPDs without colour filters. We found that such a top light-receiving OPD, which is selective to specific green wavelengths, demonstrates great potential when combined with a newly designed Si-based CMOS circuit containing only blue and red colour filters. To demonstrate the effectiveness of this state-of-the-art hybrid colour image sensor, we acquired a real full-colour image using a camera that contained the organic-on-Si hybrid CMOS colour image sensor.
Automatic diagnostic system for measuring ocular refractive errors
NASA Astrophysics Data System (ADS)
Ventura, Liliane; Chiaradia, Caio; de Sousa, Sidney J. F.; de Castro, Jarbas C.
1996-05-01
Ocular refractive errors (myopia, hyperopia and astigmatism) are automatically and objectively determined by projecting a light target onto the retina using an infra-red (850 nm) diode laser. The light vergence which emerges from the eye (light scattered from the retina) is evaluated in order to determine the corresponding ametropia. The system basically consists of projecting a target (ring) onto the retina and analyzing the scattered light with a CCD camera. The light scattered by the eye is divided into six portions (3 meridians) by using a mask and a set of six prisms. The distance between the two images provided by each of the meridians leads to the refractive error of the referred meridian. Hence, it is possible to determine the refractive error at three different meridians, which gives the exact solution for the eye's refractive error (spherical and cylindrical components and the axis of the astigmatism). The computational basis used for the image analysis is a heuristic search, which provides satisfactory calculation times for our purposes. The peculiar shape of the target, a ring, provides a wider range of measurement and also saves parts of the retina from unnecessary laser irradiation. Measurements were done in artificial and in vivo eyes (using cycloplegics) and the results were in good agreement with the retinoscopic measurements.
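The final step described above, turning powers measured along three meridians into sphere, cylinder and axis, can be illustrated with the standard sinusoidal power-profile model P(theta) = S + C*sin^2(theta - axis). The sketch below solves that model exactly from three meridians; it is a textbook formulation, not necessarily the authors' heuristic-search implementation.

```python
# Recover sphere, cylinder and axis from powers measured along three meridians.
import numpy as np

def sphero_cylinder(meridians_deg, powers_d):
    """meridians_deg: three meridian angles (degrees); powers_d: measured powers (dioptres)."""
    th = np.radians(meridians_deg)
    # P(theta) = a0 + a1*cos(2*theta) + a2*sin(2*theta)
    A = np.column_stack([np.ones_like(th), np.cos(2 * th), np.sin(2 * th)])
    a0, a1, a2 = np.linalg.solve(A, powers_d)
    half_c = np.hypot(a1, a2)
    cyl = 2.0 * half_c
    sph = a0 - half_c
    axis = np.degrees(0.5 * np.arctan2(-a2, -a1)) % 180.0
    return sph, cyl, axis

# Example: a -2.00 D sphere with a +1.00 D cylinder at 30 degrees, sampled at 0/60/120 degrees.
true_s, true_c, true_ax = -2.0, 1.0, 30.0
angles = np.array([0.0, 60.0, 120.0])
powers = true_s + true_c * np.sin(np.radians(angles - true_ax)) ** 2
print(sphero_cylinder(angles, powers))   # ~(-2.0, 1.0, 30.0)
```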
2004-09-07
Lonely Mimas swings around Saturn, seeming to gaze down at the planet's splendid rings. The outermost, narrow F ring is visible here and exhibits some clumpy structure near the bottom of the frame. The shadow of Saturn's southern hemisphere stretches almost entirely across the rings. Mimas is 398 kilometers (247 miles) wide. The image was taken with the Cassini spacecraft narrow angle camera on August 15, 2004, at a distance of 8.8 million kilometers (5.5 million miles) from Saturn, through a filter sensitive to visible red light. The image scale is 53 kilometers (33 miles) per pixel. Contrast was slightly enhanced to aid visibility. http://photojournal.jpl.nasa.gov/catalog/PIA06471
Three-dimensional particle tracking via tunable color-encoded multiplexing.
Duocastella, Martí; Theriault, Christian; Arnold, Craig B
2016-03-01
We present a novel 3D tracking approach capable of locating single particles with nanometric precision over wide axial ranges. Our method uses a fast acousto-optic liquid lens implemented in a bright field microscope to multiplex light based on color into different and selectable focal planes. By separating the red, green, and blue channels from an image captured with a color camera, information from up to three focal planes can be retrieved. Multiplane information from the particle diffraction rings enables precisely locating and tracking individual objects up to an axial range about 5 times larger than conventional single-plane approaches. We apply our method to the 3D visualization of the well-known coffee-stain phenomenon in evaporating water droplets.
Study of smartphone suitability for mapping of skin chromophores
NASA Astrophysics Data System (ADS)
Kuzmina, Ilona; Lacis, Matiss; Spigulis, Janis; Berzina, Anna; Valeine, Lauma
2015-09-01
RGB (red-green-blue) technique for mapping skin chromophores by smartphones is proposed and studied. Three smartphones of different manufacturers were tested on skin phantoms and in vivo on benign skin lesions using a specially designed light source for illumination. Hemoglobin and melanin indices obtained by these smartphones showed differences in both tests. In vitro tests showed an increment of hemoglobin and melanin indices with the concentration of chromophores in phantoms. In vivo tests indicated higher hemoglobin index in hemangiomas than in nevi and healthy skin, and nevi showed higher melanin index compared to the healthy skin. Smartphones that allow switching off the automatic camera settings provided useful data, while those with "embedded" automatic settings appear to be useless for distant skin chromophore mapping.
Large Plant Growth Chambers: Flying Soon on a Space Station near You!
NASA Technical Reports Server (NTRS)
Massa, Gioia D.; Morrow, Robert C.; Levine, Howard G.
2014-01-01
The International Space Station (ISS) now has platforms for conducting research on horticultural plant species, and those capabilities continue to grow. The Veggie vegetable production system will be deployed to the ISS in Spring of 2014 to act as an applied research platform with goals of studying food production in space, providing the crew with a source of fresh food, allowing behavioral health and plant microbiology experimentation, and being a source of recreation and enjoyment for the crew. Veggie was conceived, designed, and constructed by Orbital Technologies Corporation (ORBITEC, Madison, WI). Veggie is the largest plant growth chamber that NASA has flown to date, and is capable of growing a wide array of horticultural crops. It was designed for low energy usage, low launch mass and stowage volume, and minimal crew time requirements. The Veggie flight hardware consists of a light cap containing red (630 nanometers), blue, (455 nanometers) and green (530 nanometers) light emitting diodes. Interfacing with the light cap is an extendable bellows baseplate secured to the light cap via magnetic closures and stabilized with extensible flexible arms. The baseplate contains vents allowing air from the ISS cabin to be pulled through the plant growth area by a fan in the light cap. The baseplate holds a Veggie root mat reservoir that will supply water to plant pillows attached via elastic cords. Plant pillows are packages of growth media and seeds that will be sent to ISS dry and installed and hydrated on orbit. Pillows can be constructed in various sizes for different plant types. Watering will be via passive wicking from the root mat to the pillows. Science procedures will include photography or videography, plant thinning, pollination, harvesting, microbial sampling, water sampling, etcetera. Veggie is one of the ISS flight options currently available for research investigations on plants. The Plant Habitat (PH) is being designed and constructed through a NASA-ORBITEC collaboration, and is scheduled to fly on ISS around 2016. This large plant chamber will control light quality, level, and timing, temperature, CO2, relative humidity, and irrigation, while scrubbing ethylene. Additional monitoring capabilities include leaf temperature sensing and root zone moisture and oxygen sensing. The PH light cap will have red (630 nanometers), blue (450 nanometers), green (525 nanometers), far red (730 nanometers) and broad spectrum white light emitting diodes. There will be several internal cameras to monitor and record plant growth and operations.
Zhang, Yunting; Jiang, Leiyu; Li, Yali; Chen, Qing; Ye, Yuntian; Zhang, Yong; Luo, Ya; Sun, Bo; Wang, Xiaorong; Tang, Haoru
2018-04-03
Light conditions can cause quantitative and qualitative changes in anthocyanin. However, little is known about the underlying mechanism of light quality-regulated anthocyanin accumulation in fruits. In this study, light-emitting diodes (LEDs) were applied to explore the effect of red and blue light on strawberry coloration. The results showed that the contents of total anthocyanins (TA), pelargonidin 3-glucoside (Pg3G) and pelargonidin 3-malonylglucoside (Pg3MG) significantly increased after blue and red light treatment. Pg3G was the major anthocyanin component in strawberry fruits, accounting for more than 80% of TA, whereas Pg3MG accounted for a smaller proportion. Comparative transcriptome analysis was conducted using libraries from the treated strawberries. A total of 1402, 5034, and 3764 differentially-expressed genes (DEGs) were identified in three pairwise comparisons (red light versus white light, RL-VS-WL; blue light versus white light, BL-VS-WL; blue light versus red light, BL-VS-RL), respectively. Photoreceptors and light transduction components remained dynamic to up-regulate the expression of regulatory factors and structural genes related to anthocyanin biosynthesis under red and white light, whereas most genes had low expression levels that were not consistent with the highest total anthocyanin content under blue light. Therefore, the results indicated that light was an essential environmental factor for anthocyanin biosynthesis before the anthocyanin concentration reached saturation in strawberry fruits, and blue light could quickly stimulate the accumulation of anthocyanin in the fruit. In addition, red light might contribute to the synthesis of proanthocyanidins by inducing LAR and ANR.
X-ray detectors at the Linac Coherent Light Source.
Blaj, Gabriel; Caragiulo, Pietro; Carini, Gabriella; Carron, Sebastian; Dragone, Angelo; Freytag, Dietrich; Haller, Gunther; Hart, Philip; Hasi, Jasmine; Herbst, Ryan; Herrmann, Sven; Kenney, Chris; Markovic, Bojan; Nishimura, Kurtis; Osier, Shawn; Pines, Jack; Reese, Benjamin; Segal, Julie; Tomada, Astrid; Weaver, Matt
2015-05-01
Free-electron lasers (FELs) present new challenges for camera development compared with conventional light sources. At SLAC a variety of technologies are being used to match the demands of the Linac Coherent Light Source (LCLS) and to support a wide range of scientific applications. In this paper an overview of X-ray detector design requirements at FELs is presented and the various cameras in use at SLAC are described for the benefit of users planning experiments or analysts looking at data. Features and operation of the CSPAD camera, which is currently deployed at LCLS, are discussed, and the ePix family, a new generation of cameras under development at SLAC, is introduced.
Space telescope low scattered light camera - A model
NASA Technical Reports Server (NTRS)
Breckinridge, J. B.; Kuper, T. G.; Shack, R. V.
1982-01-01
A design approach for a camera to be used with the space telescope is given. Camera optics relay the system pupil onto an annular Gaussian ring apodizing mask to control scattered light. One- and two-dimensional models of ripple on the primary mirror were calculated. Scattered light calculations using ripple amplitudes between wavelength/20 and wavelength/200, with spatial correlations of the ripple across the primary mirror between 0.2 and 2.0 centimeters, indicate that the detection of an object a billion times fainter than a bright source in the field is possible. Detection of a Jovian type planet in orbit about alpha Centauri with a camera on the space telescope may be possible.
Computer-generated hologram calculation for real scenes using a commercial portable plenoptic camera
NASA Astrophysics Data System (ADS)
Endo, Yutaka; Wakunami, Koki; Shimobaba, Tomoyoshi; Kakue, Takashi; Arai, Daisuke; Ichihashi, Yasuyuki; Yamamoto, Kenji; Ito, Tomoyoshi
2015-12-01
This paper shows the process used to calculate a computer-generated hologram (CGH) for real scenes under natural light using a commercial portable plenoptic camera. In the CGH calculation, a light field captured with the commercial plenoptic camera is converted into a complex amplitude distribution. Then the converted complex amplitude is propagated to a CGH plane. We tested both numerical and optical reconstructions of the CGH and showed that the CGH calculation from captured data with the commercial plenoptic camera was successful.
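Propagating the converted complex amplitude to the CGH plane can be illustrated with a standard FFT-based angular-spectrum propagator. The sketch below is a generic illustration under assumed grid size, pixel pitch, wavelength and distance; the paper's actual propagation method may differ.

```python
# Angular-spectrum propagation of a complex amplitude to the hologram plane (illustrative).
import numpy as np

def angular_spectrum_propagate(u0, wavelength, pitch, z):
    """Propagate complex field u0 (N x N array, sample pitch in metres) by distance z."""
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    fx, fy = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * fx) ** 2 - (wavelength * fy) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * z) * (arg > 0)       # evanescent components dropped
    return np.fft.ifft2(np.fft.fft2(u0) * transfer)

u0 = np.ones((256, 256), dtype=complex)              # stand-in for the converted light-field data
u_cgh = angular_spectrum_propagate(u0, 532e-9, 8e-6, 0.05)
hologram = np.angle(u_cgh)                           # e.g. a phase-only CGH pattern
```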
NASA Technical Reports Server (NTRS)
1996-01-01
PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.
Lercari, B; Bertram, L
2004-02-01
The interactions of phytochrome A (phyA), phytochrome B1 (phyB1) and phytochrome B2 (phyB2) in light-dependent shoot regeneration from the hypocotyl of tomato were analysed using all eight possible homozygous allelic combinations of the null mutants. The donor plants were pre-grown either in the dark or under red or far-red light for 8 days after sowing; thereafter hypocotyl segments (apical, middle and basal portions) were transferred onto hormone-free medium for culture under different light qualities. Etiolated apical segments cultured in vitro under white light showed a very high frequency of regeneration for all of the genotypes tested except the phyB1phyB2, phyAphyB1 and phyAphyB1phyB2 mutants. Evidence is provided of a specific interference of phyB2 with phyA-mediated high-irradiance responses (HIR) to far-red and blue light in etiolated explants. Pre-treatment of donor plants by growth under red light enhanced the competence of phyB1phyB2, phyAphyB1 and phyAphyB1phyB2 mutants for shoot regeneration, whereas pre-irradiation with far-red light enhanced the frequency of regeneration only in the phyAphyB1 mutant. Multiple phytochromes are involved in red light- and far-red light-dependent acquisition of competence for shoot regeneration. The position of the segments along the hypocotyl influenced the role of the various phytochromes and the interactions between them. The culture of competent hypocotyl segments under red, far-red or blue light reduced the frequency of explants forming shoots compared to those cultured under white light, with different genotypes having different response patterns.
'No Organics' Zone Circles Pinwheel
NASA Technical Reports Server (NTRS)
2008-01-01
The Pinwheel galaxy, otherwise known as Messier 101, sports bright reddish edges in this new infrared image from NASA's Spitzer Space Telescope. Research from Spitzer has revealed that this outer red zone lacks organic molecules present in the rest of the galaxy. The red and blue spots outside of the spiral galaxy are either foreground stars or more distant galaxies. The organics, called polycyclic aromatic hydrocarbons, are dusty, carbon-containing molecules that help in the formation of stars. On Earth, they are found anywhere combustion reactions take place, such as barbeque pits and exhaust pipes. Scientists also believe this space dust has the potential to be converted into the stuff of life. Spitzer found that the polycyclic aromatic hydrocarbons decrease in concentration toward the outer portion of the Pinwheel galaxy, then quickly drop off and are no longer detected at its very outer rim. According to astronomers, there's a threshold at the rim where the organic material is being destroyed by harsh radiation from stars. Radiation is more damaging at the far reaches of a galaxy because the stars there have less heavy metals, and metals dampen the radiation. The findings help researchers understand how stars can form in these harsh environments, where polycyclic aromatic hydrocarbons are lacking. Under normal circumstances, the polycyclic aromatic hydrocarbons help cool down star-forming clouds, allowing them to collapse into stars. In regions like the rim of the Pinwheel as well as the very early universe stars form without the organic dust. Astronomers don't know precisely how this works, so the rim of the Pinwheel provides them with a laboratory for examining the process relatively close up. In this image, infrared light with a wavelength of 3.6 microns is colored blue; 8-micron light is green; and 24-micron light is red. All three of Spitzer's instruments were used in the study: the infrared array camera, the multiband imaging photometer and the infrared spectrograph.
Image quality prediction - An aid to the Viking lander imaging investigation on Mars
NASA Technical Reports Server (NTRS)
Huck, F. O.; Wall, S. D.
1976-01-01
Image quality criteria and image quality predictions are formulated for the multispectral panoramic cameras carried by the Viking Mars landers. Image quality predictions are based on expected camera performance, Mars surface radiance, and lighting and viewing geometry (fields of view, Mars lander shadows, solar day-night alternation), and are needed for diagnosing camera performance, arriving at a preflight imaging strategy, and revising that strategy should the need arise. Landing considerations, camera control instructions, camera control logic, aspects of the imaging process (spectral response, spatial response, sensitivity), and likely problems are discussed. Major concerns include: degradation of camera response by isotope radiation, uncertainties in lighting and viewing geometry and in landing site local topography, contamination of the camera window by dust abrasion, and initial errors in assigning camera dynamic ranges (gains and offsets).
The design of red-blue 3D video fusion system based on DM642
NASA Astrophysics Data System (ADS)
Fu, Rongguo; Luo, Hao; Lv, Jin; Feng, Shu; Wei, Yifang; Zhang, Hao
2016-10-01
To address the uncertainty in traditional 3D video capture, including camera focal lengths and the distance and angle parameters between the two cameras, a red-blue 3D video fusion system with parallel optical axes is designed on the DM642 hardware processing platform. To counter the brightness reduction typical of traditional 3D video, a brightness enhancement algorithm based on human visual characteristics and a luminance component processing method based on the YCbCr color space are proposed. The BIOS real-time operating system is used to improve real-time performance. The DM642-based video processing circuit enhances image brightness, converts the video signals from YCbCr to RGB, extracts the R component from one camera and the G and B components from the other synchronously, and finally outputs the fused 3D images. Real-time adjustments such as translation and scaling of the two color components are realized through serial communication between the VC software and BIOS. By adding the red and blue components in this way, the system reduces the loss of chrominance components and keeps the picture color saturation at more than 95% of the original. An optimized enhancement algorithm reduces the amount of data processed during fusion, shortening the fusion time and improving the viewing experience. Experimental results show that the system can capture images at close range, output red-blue 3D video, and provide a pleasant experience for viewers wearing red-blue glasses.
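The channel-swapping fusion step described above can be sketched in a few lines. The example below uses OpenCV on still frames purely for illustration (file names are placeholders); the actual system performs the equivalent operations in a YCbCr pipeline on the DM642 DSP.

```python
# Sketch: take the R channel from the left camera and the G, B channels from the right camera
# to form a red-blue (anaglyph-style) 3D frame, with an optional luminance boost.
import cv2
import numpy as np

left  = cv2.imread("left.png")    # BGR images; file names are placeholders
right = cv2.imread("right.png")

fused = np.empty_like(left)
fused[:, :, 2] = left[:, :, 2]    # R component from the left camera
fused[:, :, 1] = right[:, :, 1]   # G component from the right camera
fused[:, :, 0] = right[:, :, 0]   # B component from the right camera

# Brightness boost on the luminance channel, loosely mirroring the YCbCr-based enhancement.
ycrcb = cv2.cvtColor(fused, cv2.COLOR_BGR2YCrCb)
ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
cv2.imwrite("fused_3d.png", cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR))
```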
Odden, Morten; Linnell, John D. C.; Odden, John
2017-01-01
Sarcoptic mange is a widely distributed disease that affects numerous mammalian species. We used camera traps to investigate the apparent prevalence and spatiotemporal dynamics of sarcoptic mange in a red fox population in southeastern Norway. We monitored red foxes for five years using 305 camera traps distributed across an 18000 km2 area. A total of 6581 fox events were examined to visually identify mange compatible lesions. We investigated factors associated with the occurrence of mange by using logistic models within a Bayesian framework, whereas the spatiotemporal dynamics of the disease were analysed with space-time scan statistics. The apparent prevalence of the disease fluctuated over the study period with a mean of 3.15% and credible interval [1.25, 6.37], and our best logistic model explaining the presence of red foxes with mange-compatible lesions included time since the beginning of the study and the interaction between distance to settlement and season as explanatory variables. The scan analyses detected several potential clusters of the disease that varied in persistence and size, and the locations in the cluster with the highest probability were closer to human settlements than the other survey locations. Our results indicate that red foxes in an advanced stage of the disease are most likely found closer to human settlements during periods of low wild prey availability (winter). We discuss different potential causes. Furthermore, the disease appears to follow a pattern of small localized outbreaks rather than sporadic isolated events. PMID:28423011
Botswana: Ntwetwe and Sua Pans
Atmospheric Science Data Center
2013-04-15
... of red band imagery in which the 45-degree aft camera data are displayed in blue, 45-degree forward as green, and vertical as red. ... coat the surface and turn it bright ("sua" means salt). The mining town of Sowa is located where the Sua Spit (a finger of grassland ...
Hubble Camera Resumes Science Operation With Picture Of 'Butterfly' In Space.
NASA Technical Reports Server (NTRS)
2002-01-01
The Hubble Space Telescope's Wide Field and Planetary Camera 2 (WFPC2) is back at work, capturing this black-and-white image of the 'butterfly wing'-shaped nebula, NGC 2346. The nebula is about 2,000 light-years away from Earth in the direction of the constellation Monoceros. It represents the spectacular 'last gasp' of a binary star system at the nebula's center. The image was taken on March 6, as part of the recommissioning of the Hubble Space Telescope's previously installed scientific instruments following the successful servicing of the HST by NASA astronauts in February. WFPC2 was installed in HST during the servicing mission in 1993. At the center of the nebula lies a pair of stars that are so close together that they orbit around each other every 16 days. This is so close that, even with Hubble, the pair of stars cannot be resolved into its two components. One component of this binary is the hot core of a star that has ejected most of its outer layers, producing the surrounding nebula. Astronomers believe that this star, when it evolved and expanded to become a red giant, actually swallowed its companion star in an act of stellar cannibalism. The resulting interaction led to a spiraling together of the two stars, culminating in ejection of the outer layers of the red giant. Most of the outer layers were ejected into a dense disk, which can still be seen in the Hubble image, surrounding the central star. Later the hot star developed a fast stellar wind. This wind, blowing out into the surrounding disk, has inflated the large, wispy hourglass-shaped wings perpendicular to the disk. These wings produce the butterfly appearance when seen in projection. The total diameter of the nebula is about one-third of a light-year, or 2 trillion miles. Our own Sun will eject a nebula about 5 billion years from now. However, the Sun is not a double star, so its nebula may well be more spherical in shape. The image was taken through a filter that shows the light of glowing nitrogen atoms. Scientists are still testing and calibrating the newly installed instruments on Hubble, the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Space Telescope Imaging Spectrograph (STIS). These instruments will be ready to make observations in a few weeks. Credit: Massimo Stiavelli (STScI) and NASA; other team member: Inge Heyer (STScI). Image files in GIF and JPEG format and captions may be accessed on the Internet via anonymous ftp from oposite.stsci.edu in /pubinfo.
Kanegae, Takeshi; Kimura, Izumi
2015-08-01
In the fern Adiantum capillus-veneris, the phototropic response of the protonemal cells is induced by blue light and partially inhibited by subsequent irradiation with far-red light. This observation strongly suggests the existence of a phytochrome that mediates this blue/far-red reversible response; however, the phytochrome responsible for this response has not been identified. PHY3/NEO1, one of the three phytochrome genes identified in Adiantum, encodes a chimeric photoreceptor composed of both a phytochrome and a phototropin domain. It was demonstrated that phy3 mediates the red light-dependent phototropic response of Adiantum, and that phy3 potentially functions as a phototropin. These findings suggest that phy3 is the phytochrome that mediates the blue/far-red response in Adiantum protonemata. In the present study, we expressed Adiantum phy3 in a phot1 phot2 phototropin-deficient Arabidopsis line, and investigated the ability of phy3 to induce phototropic responses under various light conditions. Blue light irradiation clearly induced a phototropic response in the phy3-expressing transgenic seedlings, and this effect was fully inhibited by simultaneous irradiation with far-red light. In addition, experiments using amino acid-substituted phy3 indicated that FMN-cysteinyl adduct formation in the light, oxygen, voltage (LOV) domain was not necessary for the induction of blue light-dependent phototropism by phy3. We thus demonstrate that phy3 is the phytochrome that mediates the blue/far-red reversible phototropic response in Adiantum. Furthermore, our results imply that phy3 can function as a phototropin, but that it acts principally as a phytochrome that mediates both the red/far-red and blue/far-red light responses. © 2015 The Authors The Plant Journal © 2015 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Jin Ah; Kim, Na Na; Choi, Young Jae
We investigated the effect of light spectra on retinal damage and stress in goldfish using green (530 nm) and red (620 nm) light emitting diodes (LEDs) at three intensities each (0.5, 1.0, and 1.5 W/m2). We measured the change in the levels of plasma cortisol and H2O2 and the expression and levels of caspase-3. The apoptotic response to the green and red LED spectra was assessed using the terminal transferase dUTP nick end labeling (TUNEL) assay. Stress indicators (cortisol and H2O2) and apoptosis-related genes (caspase-3) decreased in green light, but increased in red light with higher light intensities over time. The TUNEL assay revealed that more apoptotic cells were detected in the outer nuclear layers after exposure to the red LED, increasing over time and with light intensity, than after the other spectra. These results indicate that green light efficiently reduces retinal damage and stress, whereas red light induces it. Therefore, red light-induced retina damage may induce apoptosis in goldfish retina. -- Highlights: •Green light efficiently reduces retinal damage and stress. •Green spectra reduce caspase production and apoptosis. •Red light-induced retina damage may induce apoptosis in goldfish retina. •The retina of goldfish recognizes green spectra as a stable environment.
NASA Technical Reports Server (NTRS)
2007-01-01
[Figures 1-4: three-panel version, visible light, infrared (IRAC), and combined views removed for brevity; see original site.] Two rambunctious young stars are destroying their natal dust cloud with powerful jets of radiation, in an infrared image from NASA's Spitzer Space Telescope. The stars are located approximately 600 light-years away in a cosmic cloud called BHR 71. In visible light (left panel), BHR 71 is just a large black structure. The burst of yellow light toward the bottom of the cloud is the only indication that stars might be forming inside. In infrared light (center panel), the baby stars are shown as the bright yellow smudges toward the center. Both of these yellow spots have wisps of green shooting out of them. The green wisps reveal the beginning of a jet. Like a rainbow, the jet begins as green, then transitions to orange, and red toward the end. The combined visible-light and infrared composite (right panel) shows that a young star's powerful jet is responsible for the rupture at the bottom of the dense cloud in the visible-light image. Astronomers know this because the burst of light in the visible-light image overlaps exactly with a jet spouting out of the left star, in the infrared image. The jets' changing colors reveal a cooling effect, and may suggest that the young stars are spouting out radiation in regular bursts. The green tints at the beginning of the jet reveal really hot hydrogen gas, the orange shows warm gas, and the reddish wisps at the end represent the coolest gas. The fact that gas toward the beginning of the jet is hotter than gas near the middle suggests that the stars must give off regular bursts of energy -- and the material closest to the star is being heated by shockwaves from a recent stellar outburst. Meanwhile, the tints of orange reveal gas that is currently being heated by shockwaves from a previous stellar outburst. By the time these shockwaves reach the end of the jet, they have slowed down so significantly that the gas is only heated a little, and looks red. The combination of views also brings out some striking details that evaded visible-light detection. For example, the yellow dots scattered throughout the image are actually young stars forming inside BHR 71. Spitzer also uncovered another young star with jets, located to the right of the powerful jet seen in the visible-light image. Spitzer can see details that visible-light telescopes don't, because its infrared instruments are sensitive to 'heat.' The infrared image is made up of data from Spitzer's infrared array camera. Blue shows infrared light at 3.6 microns, green is light at 4.5 microns, and red is light at 8.0 microns.
Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung
2017-05-08
Because intelligent surveillance systems have recently undergone rapid growth, research on accurately detecting humans in videos captured at a long distance is growing in importance. The existing research using visible light cameras has mainly focused on methods of human detection for daytime hours when there is outside light, but human detection during nighttime hours when there is no outside light is difficult. Thus, methods that employ additional near-infrared (NIR) illuminators and NIR cameras or thermal cameras have been used. However, in the case of NIR illuminators, there are limitations in terms of the illumination angle and distance. There are also difficulties because the illuminator power must be adaptively adjusted depending on whether the object is close or far away. In the case of thermal cameras, their cost is still high, which makes it difficult to install and use them in a variety of places. Because of this, research has been conducted on nighttime human detection using visible light cameras, but this has focused on objects at a short distance in an indoor environment or the use of video-based methods to capture multiple images and process them, which causes problems related to the increase in the processing time. To resolve these problems, this paper presents a method that uses a single image captured at night on a visible light camera to detect humans in a variety of environments based on a convolutional neural network. Experimental results using a self-constructed Dongguk night-time human detection database (DNHD-DB1) and two open databases (the Korea Advanced Institute of Science and Technology (KAIST) and Computer Vision Center (CVC) databases) show high-accuracy human detection in a variety of environments and excellent performance compared to existing methods.
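The paper's exact network architecture is not described in this abstract; the sketch below is only a minimal convolutional classifier for human/background patches, included to illustrate the kind of model involved (all layer sizes are assumptions).

```python
# Minimal CNN sketch for human vs. background patch classification (illustrative only).
import torch
import torch.nn as nn

class TinyHumanDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 2)      # human vs. background

    def forward(self, x):                       # x: (N, 3, H, W) night-time image patches
        return self.classifier(self.features(x).flatten(1))

logits = TinyHumanDetector()(torch.randn(4, 3, 64, 32))
print(logits.shape)                             # torch.Size([4, 2])
```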
Common fluorescent proteins for single-molecule localization microscopy
NASA Astrophysics Data System (ADS)
Klementieva, Natalia V.; Bozhanova, Nina G.; Mishina, Natalie M.; Zagaynova, Elena V.; Lukyanov, Konstantin A.; Mishin, Alexander S.
2015-07-01
Super-resolution techniques for breaking the diffraction barrier are now widespread. Single-molecule localization microscopy methods such as PALM, STORM and GSDIM produce super-resolved images of cell ultrastructure by precisely localizing individual fluorescent molecules via their temporal isolation. However, these methods presuppose the use of fluorescent dyes and proteins with special characteristics (photoactivation/photoconversion). At the same time, fluorophores must retain high photostability during long-term acquisition. Here, we first showed the potential of common red fluorescent proteins for single-molecule localization microscopy based on spontaneous intrinsic blinking. We also assessed the effect of different imaging media on photobleaching of these fluorescent proteins. Monomeric orange and red fluorescent proteins were examined for stochastic switching from a dark state to a bright fluorescent state. We studied fusions with cytoskeletal proteins in NIH/3T3 and HeLa cells. Imaging was performed on a Nikon N-STORM system equipped with an EMCCD camera. To define the optimal imaging conditions, we tested several types of cell culture media and buffers. As a result, high-resolution images of cytoskeletal structure were obtained. Notably, low-intensity light was sufficient to initiate the switching of the tested red fluorescent proteins, reducing phototoxicity and allowing long-term live-cell imaging.
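Localization microscopy of the kind described above reduces, per frame, to finding the sub-pixel centre of each temporally isolated bright spot. The sketch below uses a simple intensity-weighted centroid on a synthetic spot; production pipelines (including the N-STORM software) typically fit a 2D Gaussian instead.

```python
# Sub-pixel localization of an isolated blinking spot by intensity-weighted centroid (sketch).
import numpy as np

def localize_spot(roi):
    """Return (row, col) centre of mass of a small background-subtracted camera ROI."""
    roi = roi - np.median(roi)                  # crude background subtraction
    roi = np.clip(roi, 0, None)
    total = roi.sum()
    rows, cols = np.indices(roi.shape)
    return (rows * roi).sum() / total, (cols * roi).sum() / total

# Synthetic 7x7 spot centred near (3.4, 2.8), for illustration only.
y, x = np.mgrid[0:7, 0:7]
spot = 100 * np.exp(-((y - 3.4) ** 2 + (x - 2.8) ** 2) / 2.0) + 10
print(localize_spot(spot))                      # close to (3.4, 2.8)
```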
Stomatal Responses to Light and Drought Stress in Variegated Leaves of Hedera helix
Aphalo, Pedro J.; Sánchez, Rodolfo A.
1986-01-01
Direct and indirect mechanisms underlying the light response of stomata were studied in variegated leaves of the juvenile phase of Hedera helix L. Dose response curves of leaf conductance were measured with blue and red light in leaves kept in normal or in an inverted position. In the green portions of the leaves, the sensitivity to blue light was nearly 100 times higher than that to red light. No response to red light was observed in the white portions of the leaves up to 90 micromoles per square meter per second. Red light indirectly affected leaf conductance while blue light had a direct effect. Leaf conductance was found to be more sensitive to drought stress and showed a more persistent aftereffect in the white portions of the leaves. A differential effect of drought stress on the responses to blue and red light was also observed. PMID:16664900
Kreslavski, Vladimir D; Lyubimov, Valery Yu; Shirshikova, Galina N; Shmarev, Alexander N; Kosobryukhov, Anatoly A; Schmitt, Franz-Josef; Friedrich, Thomas; Allakhverdiev, Suleyman I
2013-05-05
Ten-day-old lettuce seedlings (Lactuca sativa L., cultivar Berlin) were preilluminated with low-intensity red light (λmax=660 nm, 10 min, 5 μmol quanta m(-2) s(-1)) and far-red light (λmax=730 nm, 10 min, 5 μmol quanta m(-2) s(-1)) to study the effect of pre-treatment on photosynthesis, photochemical activity of photosystem II (PSII), the contents of photosynthetic and UV-A-absorbing pigments (UAPs) and H2O2, as well as total and ascorbate peroxidase activities in cotyledonary leaves of seedlings exposed to UV-A. UV radiation reduced the photosynthetic rate (Pn), the activity of PSII, and the contents of Chl a and b, carotenoids and UAPs in the leaves, but increased the content of H2O2 and the total peroxidase activity. Preillumination with red light removed these effects of UV. In turn, illumination with red light followed by far-red light removed the effect of the red light. Illumination with red light alone increased the content of UAPs, as well as peroxidase activity. It is suggested that the higher resistance of the lettuce photosynthetic apparatus to UV-A radiation after pre-illumination with red light is associated with involvement of the active form of phytochrome B, which increases peroxidase activities and UAP contents and preserves the contents of photosynthetic pigments. Copyright © 2013 Elsevier B.V. All rights reserved.
Navigating surgical fluorescence cameras using near-infrared optical tracking.
van Oosterom, Matthias; den Houting, David; van de Velde, Cornelis; van Leeuwen, Fijs
2018-05-01
Fluorescence guidance facilitates real-time intraoperative visualization of the tissue of interest. However, due to attenuation, the application of fluorescence guidance is restricted to superficial lesions. To overcome this shortcoming, we have previously applied three-dimensional surgical navigation to position the fluorescence camera in reach of the superficial fluorescent signal. Unfortunately, in open surgery, the near-infrared (NIR) optical tracking system (OTS) used for navigation also induced interference during NIR fluorescence imaging. In an attempt to support future implementation of navigated fluorescence cameras, different aspects of this interference were characterized and solutions were sought. Two commercial fluorescence cameras for open surgery were studied in (surgical) phantom and human tissue setups using two different NIR OTSs and one OTS-simulating light-emitting diode setup. Following the outcome of these measurements, OTS settings were optimized. Measurements indicated that the OTS interference was caused by: (1) spectral overlap between the OTS light and camera, (2) OTS light intensity, (3) OTS duty cycle, (4) OTS frequency, (5) fluorescence camera frequency, and (6) fluorescence camera sensitivity. By optimizing points 2 to 4, navigation of fluorescence cameras during open surgery could be facilitated. Optimization of the OTS and camera compatibility can be used to support navigated fluorescence guidance concepts. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
The Next Generation Transit Survey (NGTS)
NASA Astrophysics Data System (ADS)
Wheatley, Peter J.; West, Richard G.; Goad, Michael R.; Jenkins, James S.; Pollacco, Don L.; Queloz, Didier; Rauer, Heike; Udry, Stéphane; Watson, Christopher A.; Chazelas, Bruno; Eigmüller, Philipp; Lambert, Gregory; Genolet, Ludovic; McCormac, James; Walker, Simon; Armstrong, David J.; Bayliss, Daniel; Bento, Joao; Bouchy, François; Burleigh, Matthew R.; Cabrera, Juan; Casewell, Sarah L.; Chaushev, Alexander; Chote, Paul; Csizmadia, Szilárd; Erikson, Anders; Faedi, Francesca; Foxell, Emma; Gänsicke, Boris T.; Gillen, Edward; Grange, Andrew; Günther, Maximilian N.; Hodgkin, Simon T.; Jackman, James; Jordán, Andrés; Louden, Tom; Metrailler, Lionel; Moyano, Maximiliano; Nielsen, Louise D.; Osborn, Hugh P.; Poppenhaeger, Katja; Raddi, Roberto; Raynard, Liam; Smith, Alexis M. S.; Soto, Maritza; Titz-Weider, Ruth
2018-04-01
We describe the Next Generation Transit Survey (NGTS), which is a ground-based project searching for transiting exoplanets orbiting bright stars. NGTS builds on the legacy of previous surveys, most notably WASP, and is designed to achieve higher photometric precision and hence find smaller planets than have previously been detected from the ground. It also operates in red light, maximizing sensitivity to late K and early M dwarf stars. The survey specifications call for photometric precision of 0.1 per cent in red light over an instantaneous field of view of 100 deg2, enabling the detection of Neptune-sized exoplanets around Sun-like stars and super-Earths around M dwarfs. The survey is carried out with a purpose-built facility at Cerro Paranal, Chile, which is the premier site of the European Southern Observatory (ESO). An array of twelve 20 cm f/2.8 telescopes fitted with back-illuminated deep-depletion CCD cameras is used to survey fields intensively at intermediate Galactic latitudes. The instrument is also ideally suited to ground-based photometric follow-up of exoplanet candidates from space telescopes such as TESS, Gaia and PLATO. We present observations that combine precise autoguiding and the superb observing conditions at Paranal to provide routine photometric precision of 0.1 per cent in 1 h for stars with I-band magnitudes brighter than 13. We describe the instrument and data analysis methods as well as the status of the survey, which achieved first light in 2015 and began full-survey operations in 2016. NGTS data will be made publicly available through the ESO archive.
Active Lifting During Martian Dust Storm
2017-03-09
This false-color scene from the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity documents movement of dust as a regional dust storm approached the rover's location on Feb. 24, 2017, during the 4,653rd Martian day, or sol, of the rover's work on Mars. Key to detecting the movement is that Pancam color images are combinations of different images taken a short time apart through different color filters. Note that along the horizon, the left portion of the image has a bluish band (with label and arrow in Figure 1). The component image admitting blue light was taken about 150 seconds after the component image admitting red light. A layer of dust-carrying wind hadn't reached this location by the earlier exposure, but had by the later one. This Sol 4653 Opportunity view is toward the north from the rover's location on the western rim of Endeavour Crater in the Meridiani Planum region of Mars. http://photojournal.jpl.nasa.gov/catalog/PIA21485
The optics of microscope image formation.
Wolf, David E
2013-01-01
Although geometric optics gives a good understanding of how the microscope works, it fails in one critical area: explaining the origin of microscope resolution. To accomplish this, one must consider the microscope from the viewpoint of physical optics. This chapter describes the theory of the microscope, relating resolution to the highest spatial frequency that a microscope can collect. The chapter illustrates how Huygens' principle, or construction, can be used to explain the propagation of a plane wave. It is shown that this limit increases with increasing numerical aperture (NA). As a corollary, resolution increases with decreasing wavelength, so resolution is higher for blue light than for red light. Resolution is dependent on contrast: the higher the contrast, the higher the resolution. This last point relates to issues of signal-to-noise and dynamic range. The use of video and new digital cameras has necessitated redefining classical limits such as Rayleigh's criterion. Copyright © 2007 Elsevier Inc. All rights reserved.
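A compact way to state the wavelength and NA dependence described above is the Rayleigh criterion; the form below is the standard textbook expression, and the example numbers are illustrative rather than taken from the chapter.

    % Rayleigh criterion: minimum resolvable separation d_min for wavelength \lambda
    % and numerical aperture NA
    d_{\min} = \frac{0.61\,\lambda}{\mathrm{NA}}
    % Illustrative example with NA = 1.4:
    %   blue light, \lambda = 450\,\mathrm{nm}: d_{\min} \approx 196\,\mathrm{nm}
    %   red light,  \lambda = 650\,\mathrm{nm}: d_{\min} \approx 283\,\mathrm{nm}
    % Shorter (blue) wavelengths therefore resolve finer detail than longer (red) ones.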
NASA Technical Reports Server (NTRS)
2006-01-01
This portion of an image acquired by the Mars Reconnaissance Orbiter's High Resolution Imaging Science Experiment camera shows the Spirit rover's winter campaign site. Spirit was parked on a slope tilted 11 degrees to the north to maximize sunlight during the southern winter season. 'Tyrone' is an area where the rover's wheels disturbed light-toned soils. Remote sensing and in-situ analyses found the light-toned soil at Tyrone to be sulfate rich and hydrated. The original picture is catalogued as PSP_001513_1655_red and was taken on Sept. 29, 2006. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The High Resolution Imaging Science Experiment is operated by the University of Arizona, Tucson, and the instrument was built by Ball Aerospace and Technology Corp., Boulder, Colo.
Optical design of the SuMIRe/PFS spectrograph
NASA Astrophysics Data System (ADS)
Pascal, Sandrine; Vives, Sébastien; Barkhouser, Robert; Gunn, James E.
2014-07-01
The SuMIRe Prime Focus Spectrograph (PFS), developed for the 8-m class SUBARU telescope, will consist of four identical spectrographs, each receiving 600 fibers from a 2394 fiber robotic positioner at the telescope prime focus. Each spectrograph includes three spectral channels to cover the wavelength range [0.38-1.26] um with a resolving power ranging between 2000 and 4000. A medium resolution mode is also implemented to reach a resolving power of 5000 at 0.8 um. Each spectrograph is made of 4 optical units: the entrance unit, which produces three corrected collimated beams, and three camera units (one per spectral channel: "blue", "red", and "NIR"). The beam is split by using two large dichroics; and in each arm, the light is dispersed by large VPH gratings (about 280x280mm). The proposed optical design was optimized to achieve the requested image quality while simplifying the manufacturing of the whole optical system. The camera design consists of an innovative Schmidt camera observing a large field-of-view (10 degrees) with a very fast beam (F/1.09). To achieve such a performance, the classical spherical mirror is replaced by a catadioptric mirror (i.e., a meniscus lens with a reflective surface on the rear side of the glass, like a Mangin mirror). This article focuses on the optical architecture of the PFS spectrograph and the performance achieved. We will first describe the global optical design of the spectrograph. Then, we will focus on the Mangin-Schmidt camera design. The analysis of the optical performance and the results obtained are presented in the last section.
Lei, Shaobo; Goltz, Herbert C; Sklar, Jaime C; Wong, Agnes M F
2016-07-01
It has been proposed that after activation by blue light, activated melanopsin is converted back to its resting state by long wavelength red light exposure, a putative mechanism of melanopsin chromophore recovery in vivo. We tested this hypothesis by investigating whether red light attenuates the ongoing post-illumination pupil response (PIPR) induced by melanopsin-activating blue light. Pupillary light responses were tested using "Blue+Red" double flashes and "Blue Only" single flash stimuli in 10 visually normal subjects. For "Blue+Red" conditions, PIPR was induced with an intense blue flash, followed by experimental red light exposure of variable intensity and duration (Experiment 1) immediately or 9s after the offset of the blue flash (Experiment 2). For "Blue Only" conditions, only the PIPR-inducing blue stimuli were presented (reference condition). PIPR was defined as the mean pupil size from 10 to 30s (Experiment 1) and from 25 to 60s (Experiment 2) after the offset of blue light stimuli. The results showed that PIPR from "Blue+Red" conditions did not differ significantly from those of "Blue Only" conditions (p=0.55) in Experiment 1. The two stimulation conditions also did not differ in Experiment 2 (p=0.38). We therefore conclude that red light exposure does not alter the time course of PIPR induced by blue light. This finding does not support the hypothesis that long wavelength red light reverses activated melanopsin; rather it lends support to the hypothesis that the wavelengths of stimuli driving both the forward and backward reactions of melanopsin may be similar. Copyright © 2016. Published by Elsevier Ltd.
Kim, Michele M; Zhu, Timothy C
2013-02-02
During HPPH-mediated pleural photodynamic therapy (PDT), it is critical to determine the anatomic geometry of the pleural surface quickly, as there may be movement during treatment resulting in changes to the cavity. We have developed a laser scanning device for this purpose, which has the potential to obtain the surface geometry in real time. A red diode laser with a holographic template to create a pattern and a camera with auto-focusing capability are used to scan the cavity. In conjunction with calibration against a known surface, we can use triangulation to reconstruct the surface. Using a chest phantom, we are able to obtain a 360 degree scan of the interior in under 1 minute. The chest phantom scan was compared to an existing CT scan to determine its accuracy. The laser-camera separation can be determined through the calibration with 2 mm accuracy. The device is best suited for environments on the scale of a chest cavity (between 10 cm and 40 cm). This technique has the potential to produce cavity geometry in real time during treatment, which would enable PDT treatment dosage to be determined with greater accuracy. Work is ongoing to build a miniaturized device with increased accuracy that moves the light source and camera via a fiber-optic bundle commonly used for endoscopy.
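A minimal sketch of the kind of laser-camera triangulation such a scanner relies on is given below; the pinhole-camera geometry, the known laser-camera baseline, and every number are illustrative assumptions, not the device's calibration.

    # Laser-camera triangulation sketch: recover range to a surface point from the
    # pixel position of the projected laser spot. Assumes a pinhole camera with
    # focal length f (pixels) and a laser offset by a known baseline b (cm), aimed
    # parallel to the camera axis. Illustrative values only.
    def range_from_pixel(u_px, f_px=800.0, baseline_cm=5.0, cx_px=320.0):
        """Depth z (cm) from the horizontal pixel coordinate of the laser spot.

        Similar triangles: (u - cx) / f = baseline / z  =>  z = f * baseline / (u - cx)
        """
        disparity = u_px - cx_px
        return f_px * baseline_cm / disparity

    # Example: spots imaged at several pixel columns map to chest-cavity-scale ranges
    for u in (420.0, 480.0, 560.0):
        print(f"spot at u={u:5.1f} px -> z = {range_from_pixel(u):5.1f} cm")

Scanning the projected pattern across the cavity and applying this relation point by point is what yields the reconstructed surface.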
Fluorescent image tracking velocimeter
Shaffer, Franklin D.
1994-01-01
A multiple-exposure fluorescent image tracking velocimeter (FITV) detects and measures the motion (trajectory, direction and velocity) of small particles close to light scattering surfaces. The small particles may follow the motion of a carrier medium such as a liquid, gas or multi-phase mixture, allowing the motion of the carrier medium to be observed, measured and recorded. The main components of the FITV include: (1) fluorescent particles; (2) a pulsed fluorescent excitation laser source; (3) an imaging camera; and (4) an image analyzer. FITV uses fluorescing particles excited by visible laser light to enhance particle image detectability near light scattering surfaces. The excitation laser light is filtered out before reaching the imaging camera allowing the fluoresced wavelengths emitted by the particles to be detected and recorded by the camera. FITV employs multiple exposures of a single camera image by pulsing the excitation laser light for producing a series of images of each particle along its trajectory. The time-lapsed image may be used to determine trajectory and velocity and the exposures may be coded to derive directional information.
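As a rough illustration of how a multiple-exposure record yields velocity, the sketch below takes a sequence of particle-centroid positions (one per laser pulse) together with a known pulse interval and pixel calibration and returns displacement-based velocity estimates; all positions and calibration values are made-up examples, not FITV data.

    # Sketch: velocity from a multiple-exposure particle track.
    # positions_px holds the centroid of one particle in each exposure (x, y), px.
    import numpy as np

    positions_px = np.array([[102.0, 240.0],
                             [110.5, 243.0],
                             [119.2, 246.1]])   # three exposures of one particle
    pulse_interval_s = 0.002                    # time between laser pulses
    microns_per_px = 12.0                       # magnification calibration

    steps_px = np.diff(positions_px, axis=0)            # per-exposure displacement
    speeds = np.linalg.norm(steps_px, axis=1) * microns_per_px / pulse_interval_s
    mean_step = steps_px.mean(axis=0)
    direction = mean_step / np.linalg.norm(mean_step)   # unit vector along the track

    print("speed per step (um/s):", speeds)
    print("mean direction (unit vector):", direction)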
Coggins, Lewis G; Bacheler, Nathan M; Gwinn, Daniel C
2014-01-01
Occupancy models using incidence data collected repeatedly at sites across the range of a population are increasingly employed to infer patterns and processes influencing population distribution and dynamics. While such work is common in terrestrial systems, fewer examples exist in marine applications. This disparity likely exists because the replicate samples required by these models to account for imperfect detection are often impractical to obtain when surveying aquatic organisms, particularly fishes. We employ simultaneous sampling using fish traps and novel underwater camera observations to generate the requisite replicate samples for occupancy models of red snapper, a reef fish species. Since the replicate samples are collected simultaneously by multiple sampling devices, many typical problems encountered when obtaining replicate observations are avoided. Our results suggest that augmenting traditional fish trap sampling with camera observations not only doubled the probability of detecting red snapper in reef habitats off the Southeast coast of the United States, but supplied the necessary observations to infer factors influencing population distribution and abundance while accounting for imperfect detection. We found that detection probabilities tended to be higher for camera traps than traditional fish traps. Furthermore, camera trap detections were influenced by the current direction and turbidity of the water, indicating that collecting data on these variables is important for future monitoring. These models indicate that the distribution and abundance of this species is more heavily influenced by latitude and depth than by micro-scale reef characteristics lending credence to previous characterizations of red snapper as a reef habitat generalist. This study demonstrates the utility of simultaneous sampling devices, including camera traps, in aquatic environments to inform occupancy models and account for imperfect detection when describing factors influencing fish population distribution and dynamics.
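For readers unfamiliar with the occupancy-model machinery referenced here, the sketch below fits the simplest single-season model (occupancy probability psi, detection probability p) to simulated replicate detection histories by maximum likelihood; the simulated data and parameter values are illustrative and unrelated to the red snapper survey.

    # Single-season occupancy model sketch: estimate occupancy (psi) and detection
    # probability (p) from replicate detection/non-detection data. Simulated data.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit  # inverse-logit keeps parameters in (0, 1)

    rng = np.random.default_rng(1)
    n_sites, n_reps = 200, 2                     # e.g. trap + camera = 2 replicates
    psi_true, p_true = 0.6, 0.4
    occupied = rng.random(n_sites) < psi_true
    y = (rng.random((n_sites, n_reps)) < p_true) & occupied[:, None]   # detections

    def neg_log_lik(theta, y):
        psi, p = expit(theta)                    # unconstrained -> probabilities
        d = y.sum(axis=1)                        # detections per site
        k = y.shape[1]
        lik = psi * p**d * (1 - p)**(k - d) + (1 - psi) * (d == 0)
        return -np.log(lik).sum()

    fit = minimize(neg_log_lik, x0=np.zeros(2), args=(y,), method="Nelder-Mead")
    psi_hat, p_hat = expit(fit.x)
    print(f"psi_hat={psi_hat:.2f}, p_hat={p_hat:.2f}")

The key point mirrored from the abstract is that a site with no detections contributes both "occupied but missed" and "truly unoccupied" terms, which is why replicate observations (here, trap plus camera) are needed to separate detection from occupancy.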
Beel, Benedikt; Prager, Katja; Spexard, Meike; Sasso, Severin; Weiss, Daniel; Müller, Nico; Heinnickel, Mark; Dewez, David; Ikoma, Danielle; Grossman, Arthur R.; Kottke, Tilman; Mittag, Maria
2012-01-01
Cryptochromes are flavoproteins that act as sensory blue light receptors in insects, plants, fungi, and bacteria. We have investigated a cryptochrome from the green alga Chlamydomonas reinhardtii with sequence homology to animal cryptochromes and (6-4) photolyases. In response to blue and red light exposure, this animal-like cryptochrome (aCRY) alters the light-dependent expression of various genes encoding proteins involved in chlorophyll and carotenoid biosynthesis, light-harvesting complexes, nitrogen metabolism, cell cycle control, and the circadian clock. Additionally, exposure to yellow but not far-red light leads to comparable increases in the expression of specific genes; this expression is significantly reduced in an acry insertional mutant. These in vivo effects are congruent with in vitro data showing that blue, yellow, and red light, but not far-red light, are absorbed by the neutral radical state of flavin in aCRY. The aCRY neutral radical is formed following blue light absorption of the oxidized flavin. Red illumination leads to conversion to the fully reduced state. Our data suggest that aCRY is a functionally important blue and red light–activated flavoprotein. The broad spectral response implies that the neutral radical state functions as a dark form in aCRY and expands the paradigm of flavoproteins and cryptochromes as blue light sensors to include other light qualities. PMID:22773746
USDA-ARS?s Scientific Manuscript database
This research developed a multispectral algorithm derived from hyperspectral line-scan fluorescence imaging under violet/blue LED excitation for detection of fecal contamination on Golden Delicious apples. Using a hyperspectral line-scan imaging system consisting of an EMCCD camera, spectrograph, an...
Baby Picture of our Solar System
NASA Technical Reports Server (NTRS)
2007-01-01
A rare, infrared view of a developing star and its flaring jets taken by NASA's Spitzer Space Telescope shows us what our own solar system might have looked like billions of years ago. In visible light, this star and its surrounding regions are completely hidden in darkness. Stars form out of spinning clouds, or envelopes, of gas and dust. As the envelopes flatten and collapse, jets of gas stream outward and a swirling disk of planet-forming material takes shape around the forming star. Eventually, the envelope and jets disappear, leaving a newborn star with a suite of planets. This process takes millions of years. The Spitzer image shows a developing sun-like star, called L1157, that is only thousands of years old (for comparison, our solar system is around 4.5 billion years old). Why is the young system only visible in infrared light? The answer has to do with the fact that stars are born in the darkest and dustiest corners of space, where little visible light can escape. But the heat, or infrared light, of an object can be detected through the dust. In Spitzer's infrared view of L1157, the star itself is hidden but its envelope is visible in silhouette as a thick black bar. While Spitzer can peer through this region's dust, it cannot penetrate the envelope itself. Hence, the envelope appears black. The thickest part of the envelope can be seen as the black line crossing the giant jets. This L1157 portrait provides the first clear look at a stellar envelope that has begun to flatten. The color white shows the hottest parts of the jets, with temperatures around 100 degrees Celsius (212 degrees Fahrenheit). Most of the material in the jets, seen in orange, is roughly zero degrees on the Celsius and Fahrenheit scales. The reddish haze all around the picture is dust. The white dots are other stars, mostly in the background. L1157 is located 800 light-years away in the constellation Cepheus. This image was taken by Spitzer's infrared array camera. Infrared light of 8 microns is colored red; 4.5-micron infrared light is green; and 3.6-micron infrared light is blue. The visible-light picture is from the Palomar Observatory-Space Telescope Science Institute Digitized Sky Survey. Blue visible light is blue; red visible light is green, and near-infrared light is red. The artist's animation begins by showing a dark and dusty corner of space where little visible light can escape. The animation then transitions to the infrared view taken by NASA's Spitzer Space Telescope, revealing the embryonic star and its dramatic jets.
Light at Night and Measures of Alertness and Performance: Implications for Shift Workers.
Figueiro, Mariana G; Sahin, Levent; Wood, Brittany; Plitnick, Barbara
2016-01-01
Rotating-shift workers, particularly those working at night, are likely to experience sleepiness, decreased productivity, and impaired safety while on the job. Light at night has been shown to have acute alerting effects, reduce sleepiness, and improve performance. However, light at night can also suppress melatonin and induce circadian disruption, both of which have been linked to increased health risks. Previous studies have shown that long-wavelength (red) light exposure increases objective and subjective measures of alertness at night, without suppressing nocturnal melatonin. This study investigated whether exposure to red light at night would not only increase measures of alertness but also improve performance. It was hypothesized that exposure to both red (630 nm) and white (2,568 K) lights would improve performance but that only white light would significantly affect melatonin levels. Seventeen individuals participated in a 3-week, within-subjects, nighttime laboratory study. Compared to remaining in dim light, participants had significantly faster reaction times in the GO/NOGO test after exposure to both red light and white light. Compared to dim light exposure, power in the alpha and alpha-theta regions was significantly decreased after exposure to red light. Melatonin levels were significantly suppressed by white light only. Results show that not only can red light improve measures of alertness, but it can also improve certain types of performance at night without affecting melatonin levels. These findings could have significant practical applications for nurses; red light could help nurses working rotating shifts maintain nighttime alertness, without suppressing melatonin or changing their circadian phase. © The Author(s) 2015.
First look at rock & soil properties
NASA Technical Reports Server (NTRS)
1997-01-01
The earliest survey of spectral properties of the rocks and soils surrounding Pathfinder was acquired as a narrow strip covering the region just beyond where the rover made its egress from the lander. The wavelength filters used, all in the binocular camera's right eye, cover mainly visible wavelengths. These data reveal at least five kinds of rocks and soil in the immediate vicinity of the lander. All of the spectra are ratioed to the mean spectrum of bright red drift to highlight the differences. Different occurrences of drift (pink spectra) are closely similar. Most of the rocks (black spectra) have a dark gray color, and are both darker and less red than the drift, suggesting less weathering. Typical soils (green spectra) are intermediate in properties between the rocks and drift. Both these data and subsequent higher resolution images show that the typical soil consists of a mixture of drift and small dark gray particles resembling the rock. However, two other kinds of materials are significantly different from the rocks and drift. Pinkish or whitish pebbles and crusts on some of the rocks (blue spectra) are brighter in blue light and darker in near-infrared light than is the drift, and they lack the spectral characteristics closely associated with iron minerals. Dark red soils in the lee of several rocks are about as red as the drift, but consistently darker. The curvature in the spectrum at visible wavelengths suggests either more ferric iron minerals than in the drift or a larger particle size.
Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator. JPL is an operating division of the California Institute of Technology (Caltech).
VizieR Online Data Catalog: JHK lightcurves of red giants in the SMC (Takayama+, 2015)
NASA Astrophysics Data System (ADS)
Takayama, M.; Wood, P. R.; Ita, Y.
2017-11-01
These are JHK light curves of 7 oxygen-rich stars and 14 carbon stars that show the variability of prominent long secondary periods (LSPs). These stars are cross-identified with OGLE LSP variables in the Small Magellanic Cloud (Soszynski et al. 2011, J/AcA/61/217). A long-term multiband near-IR photometric survey for variable stars in the Large and Small Magellanic Clouds has been carried out at the South African Astronomical Observatory at Sutherland (Ita et al., in preparation). The SIRIUS camera attached to the IRSF 1.4 m telescope was used for this survey, and more than 10 yr of observations in the near-IR J (1.25 μm), H (1.63 μm) and KS (2.14 μm) bands were obtained. In this work, we select the SMC stars from the SIRIUS database. We obtained the V- and I-band time series of SMC red giants from the OGLE project (Soszynski et al. 2011, J/AcA/61/217). (2 data files).
Simultaneous three wavelength imaging with a scanning laser ophthalmoscope.
Reinholz, F; Ashman, R A; Eikelboom, R H
1999-11-01
Various imaging properties of scanning laser ophthalmoscopes (SLOs), such as contrast and depth discrimination, are superior to those of the traditional photographic fundus camera. However, most SLOs are monochromatic, whereas photographic systems produce colour images, which inherently contain information over a broad wavelength range. An SLO system has been modified to allow simultaneous three-channel imaging. Laser light sources in the visible and infrared spectrum were concurrently launched into the system. Using different wavelength triads, digital fundus images were acquired at high frame rates. Favourable wavelength combinations were established, and high-contrast, true (red, green, blue) or false (red, green, infrared) colour images of the retina were recorded. The monochromatic frames which form the colour image exhibit improved distinctness of different retinal structures such as the nerve fibre layer, the blood vessels, and the choroid. A multi-channel SLO combines the advantageous imaging properties of a tunable, monochrome SLO with the benefits and convenience of colour ophthalmoscopy. The options to modify parameters such as wavelength, intensity, gain, beam profile, and aperture size independently for every channel give the system a high degree of versatility. Copyright 1999 Wiley-Liss, Inc.
Misimi, E; Mathiassen, J R; Erikson, U
2007-01-01
A computer vision method was used to evaluate the color of Atlantic salmon (Salmo salar) fillets. Computer vision-based sorting of fillets according to their color was studied on 2 separate groups of salmon fillets. The images of fillets were captured using a high-resolution digital camera. Images of salmon fillets were then segmented into regions of interest and analyzed in the red, green, and blue (RGB) and CIE Lightness, redness, and yellowness (Lab) color spaces, and classified according to the Roche color card industrial standard. Comparisons between visual evaluations of fillet color made by a panel of human inspectors according to the Roche SalmoFan lineal standard and the color scores generated by the computer vision algorithm showed no significant differences between the methods. Overall, computer vision can be used as a powerful tool to sort fillets by color in a fast and nondestructive manner. The low cost of implementing computer vision solutions creates the potential to replace manual labor in fish processing plants with automation.
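A minimal sketch of the RGB-to-CIELab step described above is shown below, using scikit-image; the placeholder image, the region mask, and the colour-score bins are illustrative assumptions, since the study's actual calibration is not given here.

    # Sketch: convert a fillet image region from RGB to CIELab and summarize its
    # colour, as a stand-in for the colour-scoring step described in the abstract.
    # The image and score bins are illustrative, not the study's calibration.
    import numpy as np
    from skimage import color

    rgb = np.random.default_rng(2).random((100, 100, 3))   # placeholder fillet ROI, RGB in [0, 1]
    lab = color.rgb2lab(rgb)                                # L*, a* (redness), b* (yellowness)

    L_mean, a_mean, b_mean = lab.reshape(-1, 3).mean(axis=0)
    print(f"mean L*={L_mean:.1f}, a*={a_mean:.1f}, b*={b_mean:.1f}")

    # Toy mapping from mean redness to a colour-card-like score (hypothetical bins)
    score = np.digitize(a_mean, bins=[5, 10, 15, 20]) + 21
    print("illustrative colour score:", score)

The a* channel is what makes Lab convenient here: redness is isolated in one coordinate, so sorting by colour reduces to thresholding a single statistic per region.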
Extreme ultra-violet movie camera for imaging microsecond time scale magnetic reconnection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, Kil-Byoung; Bellan, Paul M.
2013-12-15
An ultra-fast extreme ultra-violet (EUV) movie camera has been developed for imaging magnetic reconnection in the Caltech spheromak/astrophysical jet experiment. The camera consists of a broadband Mo:Si multilayer mirror, a fast-decaying YAG:Ce scintillator, a visible light block, and a high-speed visible light CCD camera. The camera can capture EUV images as fast as 3.3 × 10^6 frames per second with 0.5 cm spatial resolution. The spectral range is from 20 eV to 60 eV. EUV images reveal strong, transient, highly localized bursts of EUV radiation when magnetic reconnection occurs.
Light regulation of the growth response in corn root gravitropism
NASA Technical Reports Server (NTRS)
Kelly, M. O.; Leopold, A. C.
1992-01-01
Roots of Merit variety corn (Zea mays L.) require red light for orthogravitropic curvature. Experiments were undertaken to identify the step in the pathway from gravity perception to asymmetric growth on which light may act. Red light was effective in inducing gravitropism whether it was supplied concomitant with or as long as 30 minutes after the gravity stimulus (GS). The presentation time was the same whether the GS was supplied in red light or in darkness. Red light given before the GS slightly enhanced the rate of curvature but had little effect on the lag time or on the final curvature. This enhancement was expanded by a delay between the red light pulse and the GS. These results indicate that gravity perception and at least the initial transduction steps proceed in the dark. Light may regulate the final growth (motor) phase of gravitropism. The time required for full expression of the light enhancement of curvature is consistent with its involvement in some light-stimulated biosynthetic event.
Optical correlator method and apparatus for particle image velocimetry processing
NASA Technical Reports Server (NTRS)
Farrell, Patrick V. (Inventor)
1991-01-01
Young's fringes are produced from a double exposure image of particles in a flowing fluid by passing laser light through the film and projecting the light onto a screen. A video camera receives the image from the screen and controls a spatial light modulator. The spatial light modulator has a two-dimensional array of cells whose transmissivity is controlled in relation to the brightness of the corresponding pixel of the video camera image of the screen. A collimated beam of laser light is passed through the spatial light modulator to produce a diffraction pattern which is focused onto another video camera, with the output of the camera being digitized and provided to a microcomputer. The diffraction pattern formed when the laser light is passed through the spatial light modulator and is focused to a point corresponds to the two-dimensional Fourier transform of the Young's fringe pattern projected onto the screen. The data obtained fro... This invention was made with U.S. Government support awarded by the Department of the Army (DOD) and NASA, grant number(s): DOD #DAAL03-86-K0174 and NASA #NAG3-718. The U.S. Government has certain rights in this invention.
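The central relation used here, that the far-field diffraction pattern of the fringe image is its two-dimensional Fourier transform and that the transform's peak location encodes the particle displacement, can be mimicked digitally; the sketch below builds a synthetic Young's-fringe image and locates the transform peaks with NumPy, with all parameters purely illustrative.

    # Sketch: digital analogue of the optical correlator. A Young's fringe pattern
    # (cosine fringes whose spatial frequency tracks the double-exposure
    # displacement) is Fourier transformed; the off-centre peak position then
    # corresponds to that displacement. Synthetic, illustrative values only.
    import numpy as np

    n = 256
    dx, dy = 8, 3                               # displacement in pixels (unknown in practice)
    yy, xx = np.mgrid[0:n, 0:n]
    fringes = 1 + np.cos(2 * np.pi * (dx * xx + dy * yy) / n)   # idealized fringe image

    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(fringes)))
    spectrum[n // 2, n // 2] = 0                # suppress the DC term
    peak = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    recovered = (peak[1] - n // 2, peak[0] - n // 2)
    print("recovered displacement (px):", recovered)   # one of +/-(dx, dy)

The optical correlator performs the same transform at the speed of light with a lens; the digital version simply makes the peak-finding step explicit.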
New light field camera based on physical based rendering tracing
NASA Astrophysics Data System (ADS)
Chung, Ming-Han; Chang, Shan-Ching; Lee, Chih-Kung
2014-03-01
Even though light field technology was first invented more than 50 years ago, it did not gain popularity because of the limitations imposed by the computation technology of the time. With the rapid advancement of computer technology over the last decade, that limitation has been lifted and light field technology has quickly returned to the spotlight of the research stage. In this paper, PBRT (Physical Based Rendering Tracing) was introduced to overcome the limitation of using a traditional optical simulation approach to study light field camera technology. More specifically, the traditional optical simulation approach can only present light energy distribution and typically lacks the capability to present pictures of realistic scenes. By using PBRT, which was developed to create virtual scenes, 4D light field information was obtained to conduct initial data analysis and calculation. This PBRT approach was also used to explore the potential of light field data calculation in creating realistic photos. Furthermore, we integrated optical experimental measurement results with PBRT in order to place the real measurement results into the virtually created scenes. In other words, our approach provides a way to link the virtual scene with real measurement results. Several images developed with the above-mentioned approaches were analyzed and discussed to verify the pros and cons of the newly developed PBRT-based light field camera technology. It is shown that this newly developed light field camera approach can circumvent the loss of spatial resolution associated with adopting a micro-lens array in front of the image sensors. The operational constraints, performance metrics, computation resources needed, etc., associated with this newly developed light field camera technique are presented in detail.
NASA Technical Reports Server (NTRS)
Barnes, Heidi L. (Inventor); Smith, Harvey S. (Inventor)
1998-01-01
A system for imaging a flame and the background scene is discussed. The flame imaging system consists of two charge-coupled-device (CCD) cameras. One camera uses an 800 nm long-pass filter, which during overcast conditions blocks sufficient background light that the hydrogen flame is brighter than the background light; the second CCD camera uses an 1100 nm long-pass filter, which blocks the solar background in full-sunshine conditions such that the hydrogen flame is brighter than the solar background. Two electronic viewfinders convert the signals from the cameras into visible images. The operator can select the appropriately filtered camera depending on the current light conditions. In addition, a narrow band-pass-filtered InGaAs sensor at 1360 nm triggers an audible alarm and a flashing LED if the sensor detects a flame, providing additional flame detection so the operator does not overlook a small flame.
Song, Jin Ah; Kim, Na Na; Choi, Young Jae; Choi, Cheol Young
2016-07-22
We investigated the effect of light spectra on retinal damage and stress in goldfish using green (530 nm) and red (620 nm) light-emitting diodes (LEDs) at three intensities each (0.5, 1.0, and 1.5 W/m^2). We measured the change in the levels of plasma cortisol and H2O2 and the expression and levels of caspase-3. The apoptotic response to the green and red LED spectra was assessed using the terminal transferase dUTP nick end labeling (TUNEL) assay. Stress indicators (cortisol and H2O2) and the apoptosis-related gene (caspase-3) decreased under green light, but increased under red light with higher light intensities over time. The TUNEL assay revealed that more apoptotic cells were detected in the outer nuclear layers after exposure to the red LED over time and with increasing light intensity than after the other spectrum. These results indicate that green light efficiently reduces retinal damage and stress, whereas red light induces them. Therefore, red light-induced retinal damage may induce apoptosis in the goldfish retina. Copyright © 2016 Elsevier Inc. All rights reserved.
Light inhibits spore germination through phytochrome in Aspergillus nidulans.
Röhrig, Julian; Kastner, Christian; Fischer, Reinhard
2013-05-01
Aspergillus nidulans responds to light in several respects. The balance between sexual and asexual development, as well as the amount of secondary metabolites produced, is controlled by light. Here, we show that germination is largely delayed by blue (450 nm), red (700 nm), and far-red light (740 nm). The largest effect was observed with far-red light. Whereas 60% of the conidia produced a germ tube after 20 h in the dark, less than 5% of the conidia germinated under far-red light conditions. Because swelling of conidia was not affected, light appears to act at the stage of germ-tube formation. In the absence of nutrients, far-red light even inhibited swelling of conidia, whereas in the dark, conidia did swell and germinated after prolonged incubation. The blue-light signaling components, LreA (WC-1) and LreB (WC-2), and also the cryptochrome/photolyase CryA were not required for germination inhibition. However, in the phytochrome mutant ∆fphA, the germination delay was relieved, but germination was delayed in the dark in comparison to the wild type. This suggests a novel function of phytochrome as a far-red light sensor and as an activator of polarized growth in the dark.
PHOTO ILLUSTRATION OF COMET P/SHOEMAKER-LEVY 9 and PLANET JUPITER
NASA Technical Reports Server (NTRS)
2002-01-01
This is a composite photo, assembled from separate images of Jupiter and comet P/Shoemaker-Levy 9, as imaged by the Wide Field and Planetary Camera-2 (WFPC-2), aboard NASA's Hubble Space Telescope (HST). Jupiter was imaged on May 18, 1994, when the giant planet was at a distance of 420 million miles (670 million km) from Earth. This 'true-color' picture was assembled from separate HST exposures in red, blue, and green light. Jupiter's rotation between exposures creates the blue and red fringe on either side of the disk. HST can resolve details in Jupiter's magnificent cloud belts and zones as small as 200 miles (320 km) across (wide field mode). This detailed view is only surpassed by images from spacecraft that have traveled to Jupiter. The dark spot on the disk of Jupiter is the shadow of the inner moon Io. This volcanic moon appears as an orange and yellow disk just to the upper right of the shadow. Though Io is approximately the size of Earth's Moon (but 2,000 times farther away), HST can resolve surface details. When the comet was observed on May 17, its train of 21 icy fragments stretched across 710 thousand miles (1.1 million km) of space, or 3 times the distance between Earth and the Moon. This required six WFPC exposures along the comet train to include all the nuclei. The image was taken in red light. The apparent angular size of Jupiter relative to the comet, and its angular separation from the comet when the images were taken, have been modified for illustration purposes. Credit: H.A. Weaver, T.E. Smith (Space Telescope Science Institute) and J.T. Trauger, R.W. Evans (Jet Propulsion Laboratory), and NASA
1994-07-07
This is a composite photo, assembled from separate images of Jupiter and Comet P/Shoemaker-Levy 9 as imaged by the Wide Field & Planetary Camera-2 (WFPC-2), aboard NASA's Hubble Space Telescope (HST). Jupiter was imaged on May 18, 1994, when the giant planet was at a distance of 420 million miles (670 million km) from Earth. This 'true-color' picture was assembled from separate HST exposures in red, blue, and green light. Jupiter's rotation between exposures creates the blue and red fringe on either side of the disk. HST can resolve details in Jupiter's magnificent cloud belts and zones as small as 200 miles (320 km) across (wide field mode). This detailed view is only surpassed by images from spacecraft that have traveled to Jupiter. The dark spot on the disk of Jupiter is the shadow of the inner moon Io. This volcanic moon appears as an orange and yellow disk just to the upper right of the shadow. Though Io is approximately the size of Earth's Moon (but 2,000 times farther away), HST can resolve surface details. When the comet was observed on May 17, its train of 21 icy fragments stretched across 710 thousand miles (1.1 million km) of space, or 3 times the distance between Earth and the Moon. This required six WFPC exposures along the comet train to include all the nuclei. The image was taken in red light. The apparent angular size of Jupiter relative to the comet, and its angular separation from the comet when the images were taken, have been modified for illustration purposes. CREDIT: H.A. Weaver, T.E. Smith (Space Telescope Science Institute (STScI)) and J.T. Trauger, R.W. Evans (Jet Propulsion Laboratory (JPL)) and NASA. (HST ref: STSci-PR94-26a)
Kamarudin, Nur Diyana; Ooi, Chia Yee; Kawanabe, Tadaaki; Odaguchi, Hiroshi; Kobayashi, Fuminori
2017-01-01
In tongue diagnosis, the colour information of the tongue body carries valuable information regarding the state of disease and its correlation with the internal organs. Qualitatively, practitioners may have difficulty in their judgement due to unstable lighting conditions and the naked eye's limited ability to capture the exact colour distribution on the tongue, especially for a tongue with multicolour substance. To overcome this ambiguity, this paper presents a two-stage tongue multicolour classification based on a support vector machine (SVM) whose support vectors are reduced by our proposed k-means clustering identifiers and a red colour range for precise tongue colour diagnosis. In the first stage, k-means clustering is used to cluster a tongue image into four clusters: image background (black), deep red region, red/light red region, and transitional region. In the second-stage classification, red/light red tongue images are further classified into red tongue or light red tongue based on the red colour range derived in our work. Overall, the true-rate classification accuracy of the proposed two-stage classification in diagnosing red, light red, and deep red tongue colours is 94%. The number of support vectors in the SVM is reduced by 41.2%, and the execution time for one image is recorded as 48 seconds.
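A schematic of the two-stage pipeline described above (k-means to isolate candidate colour regions, then an SVM on colour features) can be written against scikit-learn as below; the synthetic colour data, the toy training rule, and the feature choice are placeholders rather than the paper's actual values or red-range threshold.

    # Two-stage colour classification sketch: (1) k-means clusters pixels of a
    # tongue image into background / deep red / red-light-red / transitional,
    # (2) an SVM separates red from light red using colour features.
    # All data here are synthetic placeholders.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    rng = np.random.default_rng(3)

    # Stage 1: cluster pixel colours (rows = pixels, columns = R, G, B in [0, 1])
    pixels = rng.random((5000, 3))
    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(pixels)
    labels = kmeans.labels_                      # four clusters, as in the paper

    # Stage 2: SVM on per-region colour features, here mean R and a simple
    # "redness" value (R minus the average of G and B), trained on toy examples.
    X_train = rng.random((200, 2))
    y_train = (X_train[:, 1] > 0.5).astype(int)  # 1 = red, 0 = light red (toy rule)
    svm = SVC(kernel="rbf").fit(X_train, y_train)

    mean_rgb = pixels[labels == 0].mean(axis=0)
    region_feature = np.array([[mean_rgb[0], mean_rgb[0] - (mean_rgb[1] + mean_rgb[2]) / 2]])
    label = svm.predict(region_feature)[0]
    print("predicted class for cluster 0:", "red" if label else "light red")

The clustering stage is what keeps the second-stage SVM small: only pixels from the ambiguous red/light red cluster need to be passed on, which is the mechanism behind the reduced support vector count reported above.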
3D digital image correlation using a single 3CCD colour camera and dichroic filter
NASA Astrophysics Data System (ADS)
Zhong, F. Q.; Shao, X. X.; Quan, C.
2018-04-01
In recent years, three-dimensional digital image correlation methods using a single colour camera have been reported. In this study, we propose a simplified system by employing a dichroic filter (DF) to replace the beam splitter and colour filters. The DF can be used to combine two views from different perspectives reflected by two planar mirrors and eliminate their interference. A 3CCD colour camera is then used to capture two different views simultaneously via its blue and red channels. Moreover, the measurement accuracy of the proposed method is higher since the effect of refraction is reduced. Experiments are carried out to verify the effectiveness of the proposed method. It is shown that the interference between the blue and red views is insignificant. In addition, the measurement accuracy of the proposed method is validated on the rigid body displacement. The experimental results demonstrate that the measurement accuracy of the proposed method is higher compared with the reported methods using a single colour camera. Finally, the proposed method is employed to measure the in- and out-of-plane displacements of a loaded plastic board. The re-projection errors of the proposed method are smaller than those of the reported methods using a single colour camera.
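On the software side, the core idea above (one 3CCD colour frame carrying two spatially separated views in its blue and red channels) reduces to splitting channels before running stereo correlation; the sketch below shows just that separation step on a placeholder image, since the DIC correlation engine itself is out of scope here.

    # Sketch: extract the two mirror views encoded in the red and blue channels of
    # a single 3CCD colour frame, as a preprocessing step before stereo DIC.
    # The frame here is a random placeholder image.
    import numpy as np

    frame = np.random.default_rng(4).integers(0, 256, size=(480, 640, 3), dtype=np.uint8)

    view_red  = frame[:, :, 0].astype(float)   # perspective 1 (red-filtered mirror path)
    view_blue = frame[:, :, 2].astype(float)   # perspective 2 (blue-filtered mirror path)

    # Residual cross-talk between channels could be checked, e.g. via correlation
    crosstalk = np.corrcoef(view_red.ravel(), view_blue.ravel())[0, 1]
    print(f"red/blue channel correlation (placeholder data): {crosstalk:.3f}")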
The Little Red Spot: Closest View Yet
NASA Technical Reports Server (NTRS)
2007-01-01
This is a mosaic of three New Horizons images of Jupiter's Little Red Spot, taken with the spacecraft's Long Range Reconnaissance Imager (LORRI) camera at 17:41 Universal Time on February 26 from a range of 3.5 million kilometers (2.1 million miles). The image scale is 17 kilometers (11 miles) per pixel, and the area covered measures 33,000 kilometers (20,000 miles) from top to bottom, two and one-half times the diameter of Earth. The Little Red Spot, a smaller cousin of the famous Great Red Spot, formed in the past decade from the merger of three smaller Jovian storms, and is now the second-largest storm on Jupiter. About a year ago its color, formerly white, changed to a reddish shade similar to the Great Red Spot, perhaps because it is now powerful enough to dredge up reddish material from deeper inside Jupiter. These are the most detailed images ever taken of the Little Red Spot since its formation, and will be combined with even sharper images taken by New Horizons 10 hours later to map circulation patterns around and within the storm. LORRI took the images as the Sun was about to set on the Little Red Spot. The LORRI camera was designed to look at Pluto, where sunlight is much fainter than it is at Jupiter, so the images would have been overexposed if LORRI had looked at the storm when it was illuminated by the noonday Sun. The dim evening illumination helped the LORRI camera obtain well-exposed images. The New Horizons team used predictions made by amateur astronomers in 2006, based on their observations of the motion of the Little Red Spot with backyard telescopes, to help them accurately point LORRI at the storm. These are among a handful of Jupiter system images already returned by New Horizons during its close approach to Jupiter. Most of the data being gathered by the spacecraft are stored onboard and will be downlinked to Earth during March and April 2007.
40 CFR 82.110 - Form of label bearing warning statement.
Code of Federal Regulations, 2012 CFR
2012-07-01
... contrast are: black letters on a dark blue or dark green background, dark red letters on a light red background, light red letters on a reflective silver background, and white letters on a light gray or tan...
40 CFR 82.110 - Form of label bearing warning statement.
Code of Federal Regulations, 2013 CFR
2013-07-01
... contrast are: black letters on a dark blue or dark green background, dark red letters on a light red background, light red letters on a reflective silver background, and white letters on a light gray or tan...
The Sensor Irony: How Reliance on Sensor Technology is Limiting Our View of the Battlefield
2010-05-10
... thermal) camera, as well as a laser illuminator/range finder. Similar to the MQ-1, the MQ-9 Reaper is primarily a strike asset for emerging targets ... Wescam 14TS. Both systems have an electro-optical (daylight) TV camera, an infra-red (thermal) camera, as well as a laser illuminator/range finder ...
NASA Technical Reports Server (NTRS)
2007-01-01
This is a montage of New Horizons images of Jupiter and its volcanic moon Io, taken during the spacecraft's Jupiter flyby in early 2007. The Jupiter image is an infrared color composite taken by the spacecraft's near-infrared imaging spectrometer, the Linear Etalon Imaging Spectral Array (LEISA) at 1:40 UT on Feb. 28, 2007. The infrared wavelengths used (red: 1.59 um, green: 1.94 um, blue: 1.85 um) highlight variations in the altitude of the Jovian cloud tops, with blue denoting high-altitude clouds and hazes, and red indicating deeper clouds. The prominent bluish-white oval is the Great Red Spot. The observation was made at a solar phase angle of 75 degrees but has been projected onto a crescent to remove distortion caused by Jupiter's rotation during the scan. The Io image, taken at 00:25 UT on March 1st 2007, is an approximately true-color composite taken by the panchromatic Long-Range Reconnaissance Imager (LORRI), with color information provided by the 0.5 um ('blue') and 0.9 um ('methane') channels of the Multispectral Visible Imaging Camera (MVIC). The image shows a major eruption in progress on Io's night side, at the northern volcano Tvashtar. Incandescent lava glows red beneath a 330-kilometer high volcanic plume, whose uppermost portions are illuminated by sunlight. The plume appears blue due to scattering of light by small particles in the plume. This montage appears on the cover of the Oct. 12, 2007, issue of Science magazine.
Study of light signal receptor of Stephanopyxis palmeriana during sexual reproduction
NASA Astrophysics Data System (ADS)
Hu, Ren; Lin, Junmin; Lin, Qiuqi; Han, Boping
2005-09-01
We collected samples of the centric diatom Stephanopyxis palmeriana in coastal waters of Xiamen and tested for characteristic red light/far-red light (R/FR) phytochrome reactions to identify its photoreceptor in the course of sexual reproduction. The results showed that pre-illumination with 2-3 h of red light before darkness could induce sexualization of S. palmeriana, while follow-up illumination with far-red light could reverse the effect of the red light, which is a signature reaction of phytochrome. Southern dot blotting was carried out to identify the type of phytochrome that induces the sexualization. The result showed high homology of the S. palmeriana DNA fragment with phyB, but not with phyA. This means the photoreceptor in the process of sexual reproduction of S. palmeriana is phytochrome B (phyB).
NICMOS PEELS AWAY LAYERS OF DUST TO SHOW INNER REGION OF DUSTY NEBULA
NASA Technical Reports Server (NTRS)
2002-01-01
The revived Near Infrared Camera and Multi-Object Spectrometer (NICMOS) aboard NASA's Hubble Space Telescope has penetrated layers of dust in a star-forming cloud to uncover a dense, craggy edifice of dust and gas. This region is called the Cone Nebula (NGC 2264), so named because, in ground-based images, it has a conical shape. NICMOS enables the Hubble telescope to see in near-infrared wavelengths of light, so that it can penetrate the dust that obscures the nebula's inner regions. But the Cone is so dense that even the near-infrared 'eyes' of NICMOS can't penetrate all the way through it. The image shows the upper 0.5 light-years of the nebula. The entire nebula is 7 light-years long. The Cone resides in a turbulent star-forming region, located 2,500 light-years away in the constellation Monoceros. Radiation from hot, young stars [located beyond the top of the image] has slowly eroded the nebula over millions of years. Ultraviolet light heats the edges of the dark cloud, releasing gas into the relatively empty region of surrounding space. NICMOS has peeled away the outer layers of dust to reveal even denser dust. The denser regions give the nebula a more three-dimensional structure than can be seen in the visible-light picture at left, taken by the Advanced Camera for Surveys aboard the Hubble telescope. In peering through the dusty facade to the nebula's inner regions, NICMOS has unmasked several stars [yellow dots at upper right]. Astronomers don't know whether these stars are behind the dusty nebula or embedded in it. The four bright stars lined up on the left are in front of the nebula. The human eye cannot see infrared light, so colors have been assigned to correspond with near-infrared wavelengths. The blue light represents shorter near-infrared wavelengths and the red light corresponds to longer wavelengths. The NICMOS color composite image was made by combining photographs taken in J-band, H-band, and Paschen-alpha filters. The NICMOS images were taken on May 11, 2002. Credits for NICMOS image: NASA, the NICMOS Group (STScI, ESA), and the NICMOS Science Team (University of Arizona) Credits for ACS image: NASA, H. Ford (JHU), G. Illingworth (UCSC/LO), M. Clampin (STScI), G. Hartig (STScI), the ACS Science Team, and ESA
Red light and the sleep quality and endurance performance of Chinese female basketball players.
Zhao, Jiexiu; Tian, Ye; Nie, Jinlei; Xu, Jincheng; Liu, Dongsen
2012-01-01
Good sleep is an important recovery method for the prevention and treatment of overtraining in sport practice. Whether sleep is regulated by melatonin after red-light irradiation in athletes is unknown. To determine the effect of red light on the sleep quality and endurance performance of Chinese female basketball players. Cohort study. Athletic training facility of the Chinese People's Liberation Army and research laboratory of the China Institute of Sport Science. Twenty athletes of the Chinese People's Liberation Army team (age = 18.60 ± 3.60 years) took part in the study. Participants were divided into red-light treatment (n = 10) and placebo (n = 10) groups. The red-light treatment participants received 30 minutes of irradiation from a red-light therapy instrument every night for 14 days. The placebo group did not receive light illumination. The Pittsburgh Sleep Quality Index (PSQI) questionnaire was completed, serum melatonin was assessed, and a 12-minute run was performed at preintervention (baseline) and postintervention (14 days). The 14-day whole-body irradiation with red-light treatment improved the sleep, serum melatonin levels, and endurance performance of the elite female basketball players (P < .05). We found a correlation between changes in global Pittsburgh Sleep Quality Index and serum melatonin levels (r = -0.695, P = .006). Our study confirmed the effectiveness of body irradiation with red light in improving the quality of sleep of elite female basketball players and offered a nonpharmacologic and noninvasive therapy to prevent sleep disorders after training.
Comets Kick up Dust in Helix Nebula
NASA Technical Reports Server (NTRS)
2007-01-01
This infrared image from NASA's Spitzer Space Telescope shows the Helix nebula, a cosmic starlet often photographed by amateur astronomers for its vivid colors and eerie resemblance to a giant eye. The nebula, located about 700 light-years away in the constellation Aquarius, belongs to a class of objects called planetary nebulae. Discovered in the 18th century, these colorful beauties were named for their resemblance to gas-giant planets like Jupiter. Planetary nebulae are the remains of stars that once looked a lot like our sun. When sun-like stars die, they puff out their outer gaseous layers. These layers are heated by the hot core of the dead star, called a white dwarf, and shine with infrared and visible colors. Our own sun will blossom into a planetary nebula when it dies in about five billion years. In Spitzer's infrared view of the Helix nebula, the eye looks more like that of a green monster. Infrared light from the outer gaseous layers is represented in blues and greens. The white dwarf is visible as a tiny white dot in the center of the picture. The red color in the middle of the eye denotes the final layers of gas blown out when the star died. The brighter red circle in the very center is the glow of a dusty disk circling the white dwarf (the disk itself is too small to be resolved). This dust, discovered by Spitzer's infrared heat-seeking vision, was most likely kicked up by comets that survived the death of their star. Before the star died, its comets and possibly planets would have orbited the star in an orderly fashion. But when the star blew off its outer layers, the icy bodies and outer planets would have been tossed about and into each other, resulting in an ongoing cosmic dust storm. Any inner planets in the system would have burned up or been swallowed as their dying star expanded. So far, the Helix nebula is one of only a few dead-star systems in which evidence for comet survivors has been found. This image is made up of data from Spitzer's infrared array camera and multiband imaging photometer. Blue shows infrared light of 3.6 to 4.5 microns; green shows infrared light of 5.8 to 8 microns; and red shows infrared light of 24 microns.
NASA Technical Reports Server (NTRS)
2006-01-01
This false-color composite image shows the Cartwheel galaxy as seen by the Galaxy Evolution Explorer's far ultraviolet detector (blue); the Hubble Space Telescope's wide field and planetary camera 2 in B-band visible light (green); the Spitzer Space Telescope's infrared array camera at 8 microns (red); and the Chandra X-ray Observatory's advanced CCD imaging spectrometer-S array instrument (purple). Approximately 100 million years ago, a smaller galaxy plunged through the heart of the Cartwheel galaxy, creating ripples of brief star formation. In this image, the first ripple appears as an ultraviolet-bright blue outer ring. The blue outer ring is so powerful in the Galaxy Evolution Explorer observations that it indicates the Cartwheel is one of the most powerful UV-emitting galaxies in the nearby universe. The blue color reveals to astronomers that associations of stars 5 to 20 times as massive as our sun are forming in this region. The clumps of pink along the outer blue ring are regions where both X-rays and ultraviolet radiation are superimposed in the image. These X-ray point sources are very likely collections of binary star systems containing a black hole (called massive X-ray binary systems). The X-ray sources seem to cluster around optical/ultraviolet-bright supermassive star clusters. The yellow-orange inner ring and nucleus at the center of the galaxy result from the combination of visible and infrared light, which is stronger towards the center. This region of the galaxy represents the second ripple, or ring wave, created in the collision, but has much less star formation activity than the first (outer) ring wave. The wisps of red spread throughout the interior of the galaxy are organic molecules that have been illuminated by nearby low-level star formation. Meanwhile, the tints of green are less massive, older visible-light stars. Although astronomers have not identified exactly which galaxy collided with the Cartwheel, two of three candidate galaxies can be seen in this image to the bottom left of the ring, one as a neon blob and the other as a green spiral. Previously, scientists believed the ring marked the outermost edge of the galaxy, but the latest GALEX observations detect a faint disk, not visible in this image, that extends to twice the diameter of the ring.
Development of a single-photon-counting camera with use of a triple-stacked micro-channel plate.
Yasuda, Naruomi; Suzuki, Hitoshi; Katafuchi, Tetsuro
2016-01-01
At the quantum-mechanical level, all substances (not merely electromagnetic waves such as light and X-rays) exhibit wave–particle duality. Whereas students of radiation science can easily understand the wave nature of electromagnetic waves, the particle (photon) nature may elude them. Therefore, to assist students in understanding the wave–particle duality of electromagnetic waves, we have developed a photon-counting camera that captures single photons in two-dimensional images. As an image intensifier, this camera has a triple-stacked micro-channel plate (MCP) with an amplification factor of 10^6. The ultra-low light of a single photon entering the camera is first converted to an electron through the photoelectric effect on the photocathode. The electron is intensified by the triple-stacked MCP and then converted to a visible light distribution, which is measured by a high-sensitivity complementary metal oxide semiconductor image sensor. Because it detects individual photons, the photon-counting camera is expected to provide students with a complete understanding of the particle nature of electromagnetic waves. Moreover, it measures ultra-weak light that cannot be detected by ordinary low-sensitivity cameras. Therefore, it is suitable for experimental research on scintillator luminescence, biophoton detection, and similar topics.
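As a rough illustration of the photon-counting principle described in the abstract above (single photons accumulated frame by frame into a two-dimensional image), the following sketch simulates Poisson photon arrivals on a small sensor; the frame count, sensor size, and mean photon rate are made-up values, not parameters from the paper.

```python
# Hypothetical sketch (not from the paper): accumulating single-photon events
# into a 2D image, as a photon-counting camera does frame by frame.
import numpy as np

rng = np.random.default_rng(0)
height, width = 64, 64
n_frames = 5000
mean_photons_per_frame = 0.5            # ultra-low light: <1 photon per frame on average

accumulated = np.zeros((height, width), dtype=np.int64)
for _ in range(n_frames):
    n = rng.poisson(mean_photons_per_frame)     # photons arriving in this frame
    ys = rng.integers(0, height, size=n)        # random impact positions
    xs = rng.integers(0, width, size=n)
    np.add.at(accumulated, (ys, xs), 1)         # count each detected photon

print("total photons counted:", accumulated.sum())
```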
Plenoptic Image Motion Deblurring.
Chandramouli, Paramanand; Jin, Meiguang; Perrone, Daniele; Favaro, Paolo
2018-04-01
We propose a method to remove motion blur in a single light field captured with a moving plenoptic camera. Since motion is unknown, we resort to a blind deconvolution formulation, where one aims to identify both the blur point spread function and the latent sharp image. Even in the absence of motion, light field images captured by a plenoptic camera are affected by a non-trivial combination of both aliasing and defocus, which depends on the 3D geometry of the scene. Therefore, motion deblurring algorithms designed for standard cameras are not directly applicable. Moreover, many state of the art blind deconvolution algorithms are based on iterative schemes, where blurry images are synthesized through the imaging model. However, current imaging models for plenoptic images are impractical due to their high dimensionality. We observe that plenoptic cameras introduce periodic patterns that can be exploited to obtain highly parallelizable numerical schemes to synthesize images. These schemes allow extremely efficient GPU implementations that enable the use of iterative methods. We can then cast blind deconvolution of a blurry light field image as a regularized energy minimization to recover a sharp high-resolution scene texture and the camera motion. Furthermore, the proposed formulation can handle non-uniform motion blur due to camera shake as demonstrated on both synthetic and real light field data.
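The following sketch is a generic illustration, not the authors' algorithm, of the kind of regularized energy that blind deconvolution formulations minimize: a data term coupling the latent sharp image x and the blur kernel k to the observed blurry image y, plus a gradient smoothness prior on x; the plenoptic-specific imaging model and GPU synthesis schemes described in the abstract are omitted.

```python
# Generic blind-deconvolution energy (illustrative, not the paper's model).
import numpy as np
from scipy.signal import fftconvolve

def blind_deconv_energy(x, k, y, lam=1e-2):
    """E(x, k) = ||x * k - y||^2 + lam * ||grad x||^2 for 2D arrays."""
    data_term = np.sum((fftconvolve(x, k, mode="same") - y) ** 2)
    gx = np.diff(x, axis=1)            # horizontal finite differences
    gy = np.diff(x, axis=0)            # vertical finite differences
    prior_term = np.sum(gx ** 2) + np.sum(gy ** 2)
    return data_term + lam * prior_term

# In an iterative scheme, x and k are updated alternately to reduce E,
# with k typically constrained to be non-negative and to sum to one.
```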
The Endockscope Using Next Generation Smartphones: "A Global Opportunity".
Tse, Christina; Patel, Roshan M; Yoon, Renai; Okhunov, Zhamshid; Landman, Jaime; Clayman, Ralph V
2018-06-02
The Endockscope combines a smartphone, a battery-powered flashlight and a fiberoptic cystoscope allowing for mobile videocystoscopy. We compared conventional videocystoscopy to the Endockscope paired with next generation smartphones in an ex-vivo porcine bladder model to evaluate its image quality. The Endockscope consists of a three-dimensional (3D) printed attachment that connects a smartphone to a flexible fiberoptic cystoscope plus a 1000 lumen light-emitting diode (LED) cordless light source. Video recordings of porcine cystoscopy with a fiberoptic flexible cystoscope (Storz) were captured for each mobile device (iPhone 6, iPhone 6S, iPhone 7, Samsung S8, and Google Pixel) and for the high-definition H3-Z versatile camera (HD) set-up with both the LED light source and the xenon light (XL) source. Eleven faculty urologists, blinded to the modality used, evaluated each video for image quality/resolution, brightness, color quality, sharpness, overall quality, and acceptability for diagnostic use. When comparing the Endockscope coupled to a Galaxy S8, iPhone 7, and iPhone 6S with the LED portable light source to the HD camera with XL, there were no statistically significant differences in any metric. 82% and 55% of evaluators considered the iPhone 7 + LED light source and iPhone 6S + LED light, respectively, appropriate for diagnostic purposes as compared to 100% who considered the HD camera with XL appropriate. The iPhone 6 and Google Pixel coupled with the LED source were both inferior to the HD camera with XL in all metrics. The Endockscope system with an LED light source when coupled with either an iPhone 7 or Samsung S8 (total cost: $750) is comparable to conventional videocystoscopy with a standard camera and XL light source (total cost: $45,000).
The system analysis of light field information collection based on the light field imaging
NASA Astrophysics Data System (ADS)
Wang, Ye; Li, Wenhua; Hao, Chenyang
2016-10-01
Augmented reality (AR) technology has become a focus of study, and the AR effect of light field imaging makes research on light field cameras attractive. Since the emergence of the light field camera, most light field information acquisition systems (LFIAS) have adopted a micro-array structure, mainly micro lens array (MLA) and micro pinhole array (MPA) systems. This paper reviews the LFIAS structures commonly used in light field cameras in recent years and analyzes them based on geometrical optics. It also presents a novel LFIAS, a plane grating system that we call the "micro aperture array (MAA)", and analyzes it using information optics. The paper shows that the multiple images produced by the plane grating system differ only slightly, and that the plane grating system can collect and record both the amplitude and phase information of the light field.
Daytime light exposure: effects on biomarkers, measures of alertness, and performance.
Sahin, Levent; Wood, Brittany M; Plitnick, Barbara; Figueiro, Mariana G
2014-11-01
Light can elicit an alerting response in humans, independent of acute melatonin suppression. Recent studies have shown that red light significantly increases daytime and nighttime alertness. The main goal of the present study was to further investigate the effects of daytime light exposure on performance, biomarkers and measures of alertness. It was hypothesized that, compared to remaining in dim light, daytime exposure to narrowband long-wavelength (red) light or polychromatic (2568 K) light would induce greater alertness and shorter response times. Thirteen subjects experienced three lighting conditions: dim light (<5 lux), red light (λmax = 631 nm, 213 lux, 1.1 W/m²), and white light (2568 K, 361 lux, 1.1 W/m²). The presentation order of the lighting conditions was counterbalanced across the participants and each participant saw a different lighting condition each week. Our results demonstrate, for the first time, that red light can increase short-term performance as shown by the significantly (p < 0.05) reduced response times and higher throughput in performance tests during the daytime. There was a significant decrease (p < 0.05) in alpha power and alpha-theta power after exposure to the white light, but this alerting effect did not translate to better performance. Alpha power was significantly reduced after red light exposure in the middle of the afternoon. There was no significant effect of light on cortisol and alpha amylase. The present results suggest that red light can be used to increase daytime performance. Copyright © 2014 Elsevier B.V. All rights reserved.
Video Capture of Plastic Surgery Procedures Using the GoPro HERO 3+.
Graves, Steven Nicholas; Shenaq, Deana Saleh; Langerman, Alexander J; Song, David H
2015-02-01
Significant improvements can be made in recording surgical procedures, particularly in capturing high-quality video recordings from the surgeons' point of view. This study examined the utility of the GoPro HERO 3+ Black Edition camera for high-definition, point-of-view recordings of plastic and reconstructive surgery. The GoPro HERO 3+ Black Edition camera was head-mounted on the surgeon and oriented to the surgeon's perspective using the GoPro App. The camera was used to record 4 cases: 2 fat graft procedures and 2 breast reconstructions. During cases 1-3, an assistant remotely controlled the GoPro via the GoPro App. For case 4 the GoPro was linked to a WiFi remote, and controlled by the surgeon. Camera settings for case 1 were as follows: 1080p video resolution; 48 fps; Protune mode on; wide field of view; 16:9 aspect ratio. The lighting contrast due to the overhead lights resulted in limited washout of the video image. Camera settings were adjusted for cases 2-4 to a narrow field of view, which enabled the camera's automatic white balance to better compensate for bright lights focused on the surgical field. Cases 2-4 captured video sufficient for teaching or presentation purposes. The GoPro HERO 3+ Black Edition camera enables high-quality, cost-effective video recording of plastic and reconstructive surgery procedures. When set to a narrow field of view and automatic white balance, the camera is able to sufficiently compensate for the contrasting light environment of the operating room and capture high-resolution, detailed video.
The effect of 630-nm light stimulation on the sEMG signal of forearm muscle
NASA Astrophysics Data System (ADS)
Yang, Dan D.; Hou, W. Sheng; Wu, Xiao Y.; Zheng, Xiao L.; Zheng, Jun; Jiang, Ying T.
2010-11-01
This study aimed to explore whether red light irradiation can affect the electrophysiological performance of the flexor digitorum superficialis (FDS) and fatigue recovery. Four healthy volunteers were randomly divided into two groups. In the designed force-tracking tasks, all subjects performed isometric force production with the four fingertips (excluding the thumb) at a load of 30% of the maximum voluntary contraction (MVC) force until exhaustion. Subsequently, for the red light group, red light irradiation (640 nm wavelength, 0.23 J/cm², 20 min) was applied to the right forearm; for the control group, the subjects relaxed without red light irradiation. Then subjects were required to perform the fatigue trial again, and the sEMG signal was collected simultaneously from the FDS during finger force production. The average rectified value (ARV) and median frequency (MF) of the sEMG were calculated. Compared to the control group, red light irradiation produced smoother ARV values in the 30% to 40% range, and MF values that were noticeably larger and smoother. These electrophysiological markers indicated that recovery from muscle fatigue may be positively affected by red light irradiation, suggesting that sEMG could become a powerful tool for exploring the effect of red light irradiation on local muscle fatigue.
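For readers unfamiliar with the two sEMG indices used above, the sketch below shows the textbook definitions of the average rectified value (ARV) and median frequency (MF); it is our illustration on a synthetic signal, not the authors' analysis code.

```python
# Illustrative ARV and MF computation on a synthetic signal (not the study's code).
import numpy as np

def arv(emg):
    """Average rectified value: mean of the absolute signal amplitude."""
    return np.mean(np.abs(emg))

def median_frequency(emg, fs):
    """Frequency that splits the power spectrum into two halves of equal power."""
    spectrum = np.abs(np.fft.rfft(emg)) ** 2
    freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
    cumulative = np.cumsum(spectrum)
    idx = np.searchsorted(cumulative, cumulative[-1] / 2.0)
    return freqs[idx]

# Example with a synthetic 1-second signal sampled at 1 kHz:
fs = 1000
t = np.arange(fs) / fs
emg = np.sin(2 * np.pi * 80 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
print(arv(emg), median_frequency(emg, fs))
```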
46 CFR 113.25-10 - Emergency red-flashing lights.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 4 2010-10-01 2010-10-01 false Emergency red-flashing lights. 113.25-10 Section 113.25... lights. (a) In a space described in § 113.25-9(a), where the general emergency alarm signal cannot be heard over the background noise, there must be a red-flashing light or rotating beacon, in addition to...
46 CFR 113.25-10 - Emergency red-flashing lights.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 46 Shipping 4 2013-10-01 2013-10-01 false Emergency red-flashing lights. 113.25-10 Section 113.25... lights. (a) In a space described in § 113.25-9(a), where the general emergency alarm signal cannot be heard over the background noise, there must be a red-flashing light or rotating beacon, in addition to...
46 CFR 113.25-10 - Emergency red-flashing lights.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 4 2011-10-01 2011-10-01 false Emergency red-flashing lights. 113.25-10 Section 113.25... lights. (a) In a space described in § 113.25-9(a), where the general emergency alarm signal cannot be heard over the background noise, there must be a red-flashing light or rotating beacon, in addition to...
46 CFR 113.25-10 - Emergency red-flashing lights.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 4 2012-10-01 2012-10-01 false Emergency red-flashing lights. 113.25-10 Section 113.25... lights. (a) In a space described in § 113.25-9(a), where the general emergency alarm signal cannot be heard over the background noise, there must be a red-flashing light or rotating beacon, in addition to...
46 CFR 113.25-10 - Emergency red-flashing lights.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 4 2014-10-01 2014-10-01 false Emergency red-flashing lights. 113.25-10 Section 113.25... lights. (a) In a space described in § 113.25-9(a), where the general emergency alarm signal cannot be heard over the background noise, there must be a red-flashing light or rotating beacon, in addition to...
Baxter, M; Joseph, N; Osborne, V R; Bédécarrats, G Y
2014-05-01
Photoperiod is essential in manipulating sexual maturity and reproductive performance in avian species. Light can be perceived by photoreceptors in the retina of the eye, pineal gland, and hypothalamus. However, the relative sensitivity and specificity of each organ to wavelength, and consequently the physiological effects, may differ. The purpose of this experiment was to test the impacts of light wavelengths on reproduction, growth, and stress in laying hens maintained in cages and to determine whether the retina of the eye is necessary. Individual cages in 3 optically isolated sections of a single room were equipped with LED strips providing either pure green, pure red or white light (red, green, and blue) set to 10 lx (at hen level). The involvement of the retina in mediating the effects of light wavelength was assessed by using a naturally blind line (Smoky Joe) of chickens. Red and white lights resulted in higher estradiol concentrations after photostimulation, indicating stronger ovarian activation, which translated into a significantly lower age at first egg when compared with the green light. Similarly, hens maintained under red and white lights had a longer and higher peak production and higher cumulative egg number than hens under green light. No significant difference in BW gain was observed until sexual maturation. However, from 23 wk of age onward, birds exposed to green light showed higher body growth, which may be the result of their lower egg production. Although corticosterone levels were higher at 20 wk of age in hens under red light, concentrations were below levels that can be considered indicative of stress. Because no significant differences were observed between blind and sighted birds maintained under red and white light, the retina of the eye did not participate in the activation of reproduction. In summary, red light was required to stimulate the reproductive axis whereas green light was ineffective, and the effects of stimulatory wavelengths do not appear to require a functional retina of the eye.
Red light-induced suppression of gravitropism in moss protonemata
NASA Astrophysics Data System (ADS)
Kern, V. D.; Sack, F. D.
1999-01-01
Moss protonemata are among the few cell types known that both sense and respond to gravity and light. Apical cells of Ceratodon protonemata grow by oriented tip growth which is negatively gravitropic in the dark or positively phototropic in unilateral red light. Phototropism is phytochrome-mediated. To determine whether any gravitropism persists during irradiation, cultures were turned at various angles with respect to gravity and illuminated so that the light and gravity vectors acted either in the same or in different directions. Red light for 24 h (≥140 nmol m-2 s-1) caused the protonemata to be oriented directly towards the light. Similarly, protonemata grew directly towards the light regardless of light position with respect to gravity, indicating that all growth is oriented strictly by phototropism, not gravitropism. At light intensities ≤100 nmol m-2 s-1, no phototropism occurs and the mean protonemal tip angle remains above the horizontal, which is the criterion for negative gravitropism. But those protonemata are not as uniformly upright as they would be in the dark, indicating that low intensity red light permits gravitropism but also modulates the response. Protonemata of the aphototropic mutant ptr1, which lacks a functional Pfr chromophore, exhibit gravitropism regardless of red light intensity. This indicates that red light acts via Pfr to modulate gravitropism at low intensities and to suppress gravitropism at intensities ≥140 nmol m-2 s-1.
NASA Astrophysics Data System (ADS)
Olweny, Ephrem O.; Tan, Yung K.; Faddegon, Stephen; Jackson, Neil; Wehner, Eleanor F.; Best, Sara L.; Park, Samuel K.; Thapa, Abhas; Cadeddu, Jeffrey A.; Zuzak, Karel J.
2012-03-01
Digital light processing hyperspectral imaging (DLP® HSI) was adapted for use during laparoscopic surgery by coupling a conventional laparoscopic light guide with a DLP-based Agile Light source (OL 490, Optronic Laboratories, Orlando, FL), incorporating a 0° laparoscope, and a customized digital CCD camera (DVC, Austin, TX). The system was used to characterize renal ischemia in a porcine model.
Beam measurements using visible synchrotron light at NSLS2 storage ring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Weixing, E-mail: chengwx@bnl.gov; Bacha, Bel; Singh, Om
2016-07-27
Visible Synchrotron Light Monitor (SLM) diagnostic beamline has been designed and constructed at the NSLS2 storage ring to characterize the electron beam profile at various machine conditions. Due to the excellent alignment, the SLM beamline was able to see the first visible light when the beam circulated the ring for the first turn. The beamline has been commissioned over the past year. Besides a normal CCD camera to monitor the beam profile, a streak camera and a gated camera are used to measure the longitudinal and transverse profiles to understand the beam dynamics. Measurement results from these cameras will be presented in this paper. A time correlated single photon counting system (TCSPC) has also been set up to measure the single bunch purity.
Bell, Margaret Carol; Galatioto, Fabio; Giuffrè, Tullio; Tesoriere, Giovanni
2012-05-01
Building on previous research, a conceptual framework based on potential conflicts analysis has provided a quantitative evaluation of 'proneness' to red-light running behaviour at urban signalised intersections of different geometric, flow and driver characteristics. The results provided evidence that commonly used violation rates could cause inappropriate evaluation of the extent of the red-light running phenomenon. Initially, an in-depth investigation of the functional form of the mathematical relationship between the potential and actual red-light runners was carried out. The application of the conceptual framework was tested on a signalised intersection in order to quantify the proneness to red-light running. For the particular junction studied, proneness for daytime was found to be 0.17 north and 0.16 south for the opposing main road approaches and 0.42 east and 0.59 west for the secondary approaches. Further investigations were carried out using a traffic microsimulation model to explore those geometric features and traffic volumes (arrival patterns at the stop-line) that significantly affect red-light running. In this way the prediction capability of the proposed potential conflict model was improved. A degree of consistency in the measured and simulated red-light running was observed, and the conceptual framework was tested through a sensitivity analysis applied to different stop-line positions and traffic volume variations. The microsimulation, although at an early stage of development, has shown promise in its ability to model unintentional red-light running behaviour and, following further work applying it to other junctions, potentially provides a tool for evaluating the effect of alternative junction designs on proneness. In brief, this paper proposes and applies a novel approach to modelling red-light running using microsimulation and demonstrates consistency between the observed and theoretical results. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yamamoto, Seiichi; Suzuki, Mayumi; Kato, Katsuhiko; Watabe, Tadashi; Ikeda, Hayato; Kanai, Yasukazu; Ogata, Yoshimune; Hatazawa, Jun
2016-09-01
Although iodine 131 (I-131) is used for radionuclide therapy, high resolution images are difficult to obtain with conventional gamma cameras because of the high energy of I-131 gamma photons (364 keV). Cerenkov-light imaging is a possible method for beta emitting radionuclides, and I-131 (606 keV maximum beta energy) is a candidate to obtain high resolution images. We developed a high energy gamma camera system for the I-131 radionuclide and combined it with a Cerenkov-light imaging system to form a gamma-photon/Cerenkov-light hybrid imaging system to compare the simultaneously measured images of these two modalities. The high energy gamma imaging detector used 0.85-mm×0.85-mm×10-mm thick GAGG scintillator pixels arranged in a 44×44 matrix with a 0.1-mm thick reflector and optically coupled to a Hamamatsu 2 in. square position sensitive photomultiplier tube (PSPMT: H12700 MOD). The gamma imaging detector was encased in a 2 cm thick tungsten shield, and a pinhole collimator was mounted on its top to form a gamma camera system. The Cerenkov-light imaging system was made of a high sensitivity cooled CCD camera. The Cerenkov-light imaging system was combined with the gamma camera using optical mirrors to image the same area of the subject. With this configuration, we simultaneously imaged the gamma photons and the Cerenkov light from I-131 in the subjects. The spatial resolution and sensitivity of the gamma camera system for I-131 were respectively 3 mm FWHM and 10 cps/MBq for the high sensitivity collimator at 10 cm from the collimator surface. The spatial resolution of the Cerenkov-light imaging system was 0.64 mm FWHM at 10 cm from the system surface. Thyroid phantom and rat images were successfully obtained with the developed gamma-photon/Cerenkov-light hybrid imaging system, allowing direct comparison of these two modalities. Our developed gamma-photon/Cerenkov-light hybrid imaging system will be useful to evaluate the advantages and disadvantages of these two modalities.
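As an aside for readers, a resolution figure such as "3 mm FWHM" is typically read off a measured point-source profile; the sketch below is our own illustration on a synthetic Gaussian profile, not the authors' analysis code.

```python
# FWHM of a single-peaked profile by linear interpolation at half maximum (illustrative).
import numpy as np

def fwhm(positions_mm, counts):
    half = counts.max() / 2.0
    above = np.where(counts >= half)[0]
    left, right = above[0], above[-1]
    x_left = np.interp(half, [counts[left - 1], counts[left]],
                       [positions_mm[left - 1], positions_mm[left]])
    x_right = np.interp(half, [counts[right + 1], counts[right]],
                        [positions_mm[right + 1], positions_mm[right]])
    return x_right - x_left

x = np.linspace(-10.0, 10.0, 201)              # position [mm]
profile = np.exp(-x**2 / (2 * 1.27**2))        # synthetic Gaussian with ~3 mm FWHM
print(round(fwhm(x, profile), 2))              # ~3.0
```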
2014-12-15
Tethys appears to be peeking out from behind Rhea, watching the watcher. Scientists believe that Tethys' surprisingly high albedo is due to the water ice jets emerging from its neighbor, Enceladus. The fresh water ice becomes the E ring and can eventually arrive at Tethys, giving it a fresh surface layer of clean ice. Lit terrain seen here is on the anti-Saturn side of Rhea. North on Rhea is up. The image was taken in red light with the Cassini spacecraft narrow-angle camera on April 20, 2012. The view was obtained at a distance of approximately 1.1 million miles (1.8 million kilometers) from Rhea and at a Sun-Rhea-spacecraft, or phase, angle of 59 degrees. Image scale is 7 miles (11 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18293
NASA Technical Reports Server (NTRS)
2005-01-01
On its 449th martian day, or sol (April 29, 2005), NASA's Mars rover Opportunity woke up approximately an hour after sunset and took this picture of the fading twilight as the stars began to come out. Set against the fading red glow of the sky, the pale dot near the center of the picture is not a star, but a planet -- Earth. Earth appears elongated because it moved slightly during the 15-second exposures. The faintly blue light from the Earth combines with the reddish sky glow to give the pale white appearance. The images were taken with Opportunity's panoramic camera, using 440-nanometer, 530-nanometer, and 750-nanometer color filters. In processing on the ground, the images were shifted slightly to compensate for Earth's motion between one image and the next.
Green-light supplementation for enhanced lettuce growth under red- and blue-light-emitting diodes
NASA Technical Reports Server (NTRS)
Kim, Hyeon-Hye; Goins, Gregory D.; Wheeler, Raymond M.; Sager, John C.
2004-01-01
Plants will be an important component of future long-term space missions. Lighting systems for growing plants will need to be lightweight, reliable, and durable, and light-emitting diodes (LEDs) have these characteristics. Previous studies demonstrated that the combination of red and blue light was an effective light source for several crops. Yet the appearance of plants under red and blue lighting is purplish gray, making visual assessment of any problems difficult. The addition of green light would make the plant leaves appear green and normal, similar to a natural setting under white light, and may also offer a psychological benefit to the crew. Green supplemental lighting could also offer benefits, since green light can better penetrate the plant canopy and potentially increase plant growth by increasing photosynthesis from the leaves in the lower canopy. In this study, four light sources were tested: 1) red and blue LEDs (RB), 2) red and blue LEDs with green fluorescent lamps (RGB), 3) green fluorescent lamps (GF), and 4) cool-white fluorescent lamps (CWF), which provided 0%, 24%, 86%, and 51% of the total PPF in the green region of the spectrum, respectively. The addition of 24% green light (500 to 600 nm) to red and blue LEDs (RGB treatment) enhanced plant growth. The RGB treatment plants produced more biomass than the plants grown under the cool-white fluorescent lamps (CWF treatment), a commonly tested light source used as a broad-spectrum control.
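The "0%, 24%, 86%, and 51% of the total PPF in the green region" figures are band fractions of each lamp's spectral photon flux; the sketch below illustrates the calculation on a hypothetical flat spectrum sampled at 1 nm steps (only the 400-700 nm PAR range and the 500-600 nm green band follow the abstract, the spectrum itself is made up).

```python
# Fraction of photosynthetic photon flux in the 500-600 nm band (illustrative).
import numpy as np

wavelengths = np.arange(400, 701)                    # 1-nm steps over the PAR range
spd = np.ones_like(wavelengths, dtype=float)         # hypothetical flat photon flux

def green_fraction(wavelengths_nm, photon_flux):
    band = (wavelengths_nm >= 500) & (wavelengths_nm <= 600)
    return photon_flux[band].sum() / photon_flux.sum()

print(round(green_fraction(wavelengths, spd), 2))    # ~0.34 for a flat spectrum
```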
SU-E-T-161: SOBP Beam Analysis Using Light Output of Scintillation Plate Acquired by CCD Camera.
Cho, S; Lee, S; Shin, J; Min, B; Chung, K; Shin, D; Lim, Y; Park, S
2012-06-01
To analyze Bragg-peak beams in an SOBP (spread-out Bragg-peak) beam using a CCD (charge-coupled device) camera - scintillation screen system. We separated each Bragg-peak beam using the light output of a high sensitivity scintillation material acquired by the CCD camera and compared the result with Bragg-peak beams calculated by Monte Carlo simulation. In this study, the CCD camera - scintillation screen system was constructed with a high sensitivity scintillation plate (Gd2O2S:Tb), a right-angled prismatic PMMA phantom, and a Marlin F-201B IEEE-1394 CCD camera. The SOBP beam irradiated by the double scattering mode of a PROTEUS 235 proton therapy machine at NCC has an 8 cm width and a 13 g/cm² range. The gain, dose rate and current of this beam are 50, 2 Gy/min and 70 nA, respectively. Also, we simulated the light output of the scintillation plate for the SOBP beam using the Geant4 toolkit. We evaluated the light output of the high sensitivity scintillation plate according to integration time (0.1 - 1.0 sec). The images from the CCD camera at the shortest integration time (0.1 sec) were acquired automatically and randomly. Bragg-peak beams in the SOBP beam were analyzed from the acquired images. Then, the SOBP beam used in this study was calculated by the Geant4 toolkit and Bragg-peak beams in the SOBP beam were obtained by the ROOT program. The SOBP beam consists of 13 Bragg-peak beams. The results of the experiment were compared with those of the simulation. We analyzed Bragg-peak beams in the SOBP beam using the light output of the scintillation plate acquired by the CCD camera and compared them with the Geant4 simulation. We plan to study SOBP beam analysis using a more effective image acquisition technique. © 2012 American Association of Physicists in Medicine.
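As a toy illustration of why the measured SOBP can be decomposed into its 13 constituent Bragg peaks, the sketch below superposes weighted placeholder peak shapes; the ranges, weights, and peak model are made-up values, not the PROTEUS 235 beam data or the Geant4 model used in the study.

```python
# Toy SOBP as a weighted sum of placeholder Bragg-peak depth profiles (illustrative).
import numpy as np

depth = np.linspace(0.0, 14.0, 600)                 # depth in water [g/cm^2]
ranges = np.linspace(5.5, 13.0, 13)                 # hypothetical ranges of 13 peaks
weights = np.linspace(0.4, 1.0, 13)                 # deeper peaks weighted more heavily

def pristine_peak(depth, peak_range, width=0.35):
    """Crude placeholder peak shape, not a physical Bragg-curve model."""
    entrance = 0.05 * (depth < peak_range)          # small entrance plateau
    peak = np.exp(-((depth - peak_range) ** 2) / (2.0 * width ** 2))
    return entrance + peak

sobp = sum(w * pristine_peak(depth, r) for w, r in zip(weights, ranges))
print(f"plateau level near mid-SOBP: {sobp[depth > 9.0][0]:.2f}")
```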
Xu, Yanfeng; Han, Yunlin; Jiang, Binbin; Huang, Lan; Zhu, Hua; Xu, Yuhuan; Yang, Weiling; Qin, Chuan
2016-01-01
The biological effects of different wavelengths of light emitting diode (LED) light tend to vary from each other. Research into the use of photobiomodulation for treatment of skin wounds and the underlying mechanisms has been largely lacking. We explored the histopathological basis of the therapeutic effect of photobiomodulation and the relation between duration of exposure and photobiomodulation effect of different wavelengths of LED in a Japanese big-ear white rabbit skin-wound model. A skin wound model was established in 16 rabbits (three wounds per rabbit: one served as control, the other two wounds were irradiated by red and blue LED lights, respectively). Rabbits were then divided into 2 equal groups based on the duration of exposure to LED lights (15 and 30 min/exposure). The number of wounds that showed healing and the percentage of healed wound area were recorded. Histopathological examination and skin expression levels of fibroblast growth factor (FGF), epidermal growth factor (EGF), endothelial marker (CD31), proliferating cell nuclear antigen (Ki67) and macrophagocyte (CD68) infiltration, and the proliferation of skin collagen fibers were assessed. On days 16 and 17 of irradiation, the healing rates in the red (15 min and 30 min) and blue (15 min and 30 min) groups were 50%, 37.5%, 25% and 37.5%, respectively, while the healing rate in the control group was 12.5%. The percentage healed area in the red light groups was significantly higher than that in the other groups. Collagen fiber and skin thickness were significantly increased in both red light groups; expression of EGF, FGF, CD31 and Ki67 in the red light groups was significantly higher than that in the other groups; the expression of FGF in the red (30 min) group was not significantly different from that in the blue light and control groups. The effect of blue light on wound healing was poorer than that of red light. Red light appeared to hasten wound healing by promoting fibrous tissue, epidermal and endothelial cell proliferation. An increase in the exposure time to 30 min did not confer any additional benefit in either the red or blue light group. This study provides a theoretical basis for the potential therapeutic application of LED light in clinical settings. PMID:27347879
Intramolecular co-action of two independent photosensory modules in the fern phytochrome 3.
Kanegae, Takeshi
2015-01-01
Fern phytochrome3/neochrome1 (phy3/neo1) is a chimeric photoreceptor composed of a phytochrome-chromophore binding domain and an almost full-length phototropin. phy3 thus contains two different light-sensing modules; a red/far-red light receptor phytochrome and a blue light receptor phototropin. phy3 induces both red light- and blue light-dependent phototropism in phototropin-deficient Arabidopsis thaliana (phot1 phot2) seedlings. The red-light response is dependent on the phytochrome module of phy3, and the blue-light response is dependent on the phototropin module. We recently showed that both the phototropin-sensing module and the phytochrome-sensing module mediate the blue light-dependent phototropic response. Particularly under low-light conditions, these two light-sensing modules cooperate to induce the blue light-dependent phototropic response. This intramolecular co-action of two independent light-sensing modules in phy3 enhances light sensitivity, and perhaps allowed ferns to adapt to the low-light canopy conditions present in angiosperm forests.
Stray light lessons learned from the Mars reconnaissance orbiter's optical navigation camera
NASA Astrophysics Data System (ADS)
Lowman, Andrew E.; Stauder, John L.
2004-10-01
The Optical Navigation Camera (ONC) is a technical demonstration slated to fly on NASA's Mars Reconnaissance Orbiter in 2005. Conventional navigation methods have reduced accuracy in the days immediately preceding Mars orbit insertion. The resulting uncertainty in spacecraft location limits rover landing sites to relatively safe areas, away from interesting features that may harbor clues to past life on the planet. The ONC will provide accurate navigation on approach for future missions by measuring the locations of the satellites of Mars relative to background stars. Because Mars will be a bright extended object just outside the camera's field of view, stray light control at small angles is essential. The ONC optomechanical design was analyzed by stray light experts and appropriate baffles were implemented. However, stray light testing revealed significantly higher levels of light than expected at the most critical angles. The primary error source proved to be the interface between ground glass surfaces (and the paint that had been applied to them) and the polished surfaces of the lenses. This paper will describe troubleshooting and correction of the problem, as well as other lessons learned that affected stray light performance.
NASA Astrophysics Data System (ADS)
Javh, Jaka; Slavič, Janko; Boltežar, Miha
2018-02-01
Instantaneous full-field displacement fields can be measured using cameras. In fact, using high-speed cameras full-field spectral information up to a couple of kHz can be measured. The trouble is that high-speed cameras capable of measuring high-resolution fields-of-view at high frame rates prove to be very expensive (from tens to hundreds of thousands of euro per camera). This paper introduces a measurement set-up capable of measuring high-frequency vibrations using slow cameras such as DSLR, mirrorless and others. The high-frequency displacements are measured by harmonically blinking the lights at specified frequencies. This harmonic blinking of the lights modulates the intensity changes of the filmed scene, and the camera's image acquisition performs the integration over time, thereby producing full-field Fourier coefficients of the filmed structure's displacements.
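A minimal numerical sketch of the principle described above (our illustration, not the authors' pipeline): when the lights blink harmonically at the excitation frequency, a long exposure integrates the modulated scene intensity, and the integral encodes the quadratures of the vibration's Fourier coefficient at that frequency. The frequency, amplitude, and phase below are arbitrary test values.

```python
# Lock-in-style demodulation performed by the camera exposure itself (illustrative).
import numpy as np

f = 200.0                      # vibration / blinking frequency [Hz]
A, phi = 0.3, 0.7              # vibration amplitude and phase (arbitrary units)
T = 1.0                        # exposure time spanning an integer number of periods
t = np.linspace(0.0, T, 200000, endpoint=False)

intensity = 1.0 + A * np.sin(2 * np.pi * f * t + phi)    # scene brightness follows the motion
blink_cos = 0.5 * (1 + np.cos(2 * np.pi * f * t))         # lights blinking in "cosine" phase
blink_sin = 0.5 * (1 + np.sin(2 * np.pi * f * t))         # lights blinking in "sine" phase

dc = np.mean(intensity) * 0.5                              # baseline from the average blink level
c1 = np.mean(intensity * blink_cos) - dc                   # ~(A / 4) * sin(phi)
c2 = np.mean(intensity * blink_sin) - dc                   # ~(A / 4) * cos(phi)

print(4 * c1, A * np.sin(phi))   # these two should match
print(4 * c2, A * np.cos(phi))
```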
Tang, Qing-Qing; Fang, Zhi-Guo; Ji, Wen-Wen; Xia, Hui-Long
2014-11-01
Effect of light quality, including red light, blue light, white light, and red and blue mixed light at ratios of 8:1, 8:2 and 8:3, on the growth characteristics and biochemical composition of Chlorella pyrenoidosa was investigated using light emitting diodes (LED). Results showed that Chlorella pyrenoidosa grew best under blue light, with an optical density, specific growth rate and biomass of about 2.4, 0.10 d-1 and 0.64 g L-1, respectively, while under the other light qualities the optical density was between 1.0 and 1.7, the specific growth rate between 0.07-0.10 d-1 and the biomass between 0.27 and 0.38 g L-1 after 30 days of cultivation. Under blue light, the optical density, specific growth rate and biomass of Chlorella pyrenoidosa were approximately 2.05 times, 1.33 times and 2.06 times higher than under red light, respectively. Moreover, red and blue mixed light was conducive to the synthesis of chlorophyll a and β-carotene of Chlorella pyrenoidosa, and blue light could promote the synthesis of chlorophyll b. Chlorophyll a and carotenoid content of Chlorella pyrenoidosa was 13.5 mg g-1 and 5.8 mg g-1 respectively under red and blue mixed light at a ratio of 8:1, while it was 8.4 mg g-1 and 3.6 mg g-1 respectively under blue light. Red and blue mixed light was more conducive to protein and total lipid content per dry cell of Chlorella pyrenoidosa. Protein and total lipid content was 489.3 mg g-1 and 311.2 mg g-1 under red and blue mixed light at a ratio of 8:3, while it was 400.9 mg g-1 and 231.9 mg g-1 respectively under blue light.
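Figures such as "0.10 d-1" are standard specific growth rates, mu = (ln X2 - ln X1)/(t2 - t1); the sketch below illustrates the formula with a hypothetical starting biomass (only the final biomass and culture length are taken from the abstract).

```python
# Specific growth rate from two biomass measurements (illustrative values).
import math

def specific_growth_rate(x1, x2, t1_days, t2_days):
    return (math.log(x2) - math.log(x1)) / (t2_days - t1_days)

# e.g. biomass rising from a hypothetical 0.032 g/L to 0.64 g/L over 30 days:
print(round(specific_growth_rate(0.032, 0.64, 0, 30), 2))   # ~0.10 d-1
```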
Phototoxic effects of lysosome-associated genetically encoded photosensitizer KillerRed
NASA Astrophysics Data System (ADS)
Serebrovskaya, Ekaterina O.; Ryumina, Alina P.; Boulina, Maria E.; Shirmanova, Marina V.; Zagaynova, Elena V.; Bogdanova, Ekaterina A.; Lukyanov, Sergey A.; Lukyanov, Konstantin A.
2014-07-01
KillerRed is a unique phototoxic red fluorescent protein that can be used to induce local oxidative stress by green-orange light illumination. Here we studied phototoxicity of KillerRed targeted to cytoplasmic surface of lysosomes via fusion with Rab7, a small GTPase that is known to be attached to membranes of late endosomes and lysosomes. It was found that lysosome-associated KillerRed ensures efficient light-induced cell death similar to previously reported mitochondria- and plasma membrane-localized KillerRed. Inhibitory analysis demonstrated that lysosomal cathepsins play an important role in the manifestation of KillerRed-Rab7 phototoxicity. Time-lapse monitoring of cell morphology, membrane integrity, and nuclei shape allowed us to conclude that KillerRed-Rab7-mediated cell death occurs via necrosis at high light intensity or via apoptosis at lower light intensity. Potentially, KillerRed-Rab7 can be used as an optogenetic tool to direct target cell populations to either apoptosis or necrosis.
Yang, Ying; Weathers, Pamela
2015-01-01
Ettlia oleoabundans, a freshwater unicellular green microalga, was grown under different light qualities ± carbon dioxide-enriched air to determine the combined effects on growth and lipid production of this oleaginous species. Keeping total light intensity constant, when a portion of the cool white was replaced by red, volumetric lipid yield increased 2.8-fold mainly due to the greater yield of oleic acid, a desirable biodiesel precursor. Only 30 min of red light treatment was sufficient to increase lipid yield and quality to the same level as cultures provided red light for >14 days, indicating the potential role of red light in stimulating lipid production of this species. Carbon dioxide enrichment via air sparging enhanced exponential growth, carbon conversion efficiency, and nutrient consumption. Together, these results showed that light quality plays an important role in microalgal lipid production. Adjustment in light quality and gas delivery efficiency with carbon dioxide enrichment improved lipid yield and quality in this and possibly other oleaginous algal species.
47 CFR 17.23 - Aviation Red Obstruction Lighting [Reserved
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 1 2012-10-01 2012-10-01 false Aviation Red Obstruction Lighting [Reserved] 17.23 Section 17.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL CONSTRUCTION, MARKING..., May 20, 1999, as amended at 69 FR 18803, Apr. 9, 2004] Aviation Red Obstruction Lighting [Reserved] ...
47 CFR 17.23 - Aviation Red Obstruction Lighting [Reserved
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 1 2014-10-01 2014-10-01 false Aviation Red Obstruction Lighting [Reserved] 17.23 Section 17.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL CONSTRUCTION, MARKING... Commission. Aviation Red Obstruction Lighting [Reserved] Effective Date Note: At 79 FR 56986, Sept. 24, 2014...
47 CFR 17.23 - Aviation Red Obstruction Lighting [Reserved
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Aviation Red Obstruction Lighting [Reserved] 17.23 Section 17.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL CONSTRUCTION, MARKING..., May 20, 1999, as amended at 69 FR 18803, Apr. 9, 2004] Aviation Red Obstruction Lighting [Reserved] ...
47 CFR 17.23 - Aviation Red Obstruction Lighting [Reserved
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 1 2013-10-01 2013-10-01 false Aviation Red Obstruction Lighting [Reserved] 17.23 Section 17.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL CONSTRUCTION, MARKING..., May 20, 1999, as amended at 69 FR 18803, Apr. 9, 2004] Aviation Red Obstruction Lighting [Reserved] ...
47 CFR 17.23 - Aviation Red Obstruction Lighting [Reserved
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Aviation Red Obstruction Lighting [Reserved] 17.23 Section 17.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL CONSTRUCTION, MARKING..., May 20, 1999, as amended at 69 FR 18803, Apr. 9, 2004] Aviation Red Obstruction Lighting [Reserved] ...
Red light running : a policy review
DOT National Transportation Integrated Search
2003-03-01
There are more than 100,000 red light running crashes per year in the U.S., resulting in some 90,000 people injured and 1,000 people killed. More than half of red light running-related fatalities are pedestrians and occupants in other vehicles who ar...
Ueno, Yoshifumi; Aikawa, Shimpei; Kondo, Akihiko; Akimoto, Seiji
2015-08-01
Photosynthetic organisms change the quantity and/or quality of their pigment-protein complexes and the interactions among these complexes in response to light conditions. In the present study, we analyzed light adaptation of the unicellular red alga Cyanidioschyzon merolae, whose pigment composition is similar to that of cyanobacteria because its phycobilisomes (PBS) lack phycoerythrin. C. merolae were grown under different light qualities, and their responses were measured by steady-state absorption, steady-state fluorescence, and picosecond time-resolved fluorescence spectroscopies. Cells were cultivated under four monochromatic light-emitting diodes (blue, green, yellow, and red), and changes in pigment composition and energy transfer were observed. Cells grown under blue and green light increased their relative phycocyanin levels compared with cells cultured under white light. Energy-transfer processes to photosystem I (PSI) were sensitive to yellow and red light. The contribution of direct energy transfer from PBS to PSI increased only under yellow light, while red light induced a reduction in energy transfer from photosystem II to PSI and an increase in energy transfer from light-harvesting chlorophyll protein complex I to PSI. Differences in pigment composition, growth, and energy transfer under different light qualities are discussed.
Effect of Light Quality on Stomatal Opening in Leaves of Xanthium strumarium L.
Sharkey, T D; Raschke, K
1981-11-01
Flux response curves were determined at 16 wavelengths of light for the conductance for water vapor of the lower epidermis of detached leaves of Xanthium strumarium L. An action spectrum of stomatal opening resulted in which blue light (wavelengths between 430 and 460 nanometers) was nearly ten times more effective than red light (wavelengths between 630 and 680 nanometers) in producing a conductance of 15 centimoles per square meter per second. Stomata responded only slightly to green light. An action spectrum of stomatal responses to red light corresponded to that of CO2 assimilation; the inhibitors of photosynthetic electron transport, cyanazine (2-chloro-4[1-cyano-1-methylethylamino]-6-ethylamino-s-triazine) and 3-(3,4-dichlorophenyl)-1,1-dimethylurea, eliminated the response to red light. This indicates that light absorption by chlorophyll is the cause of stomatal sensitivity to red light. Determination of flux response curves on leaves in the normal position (upper epidermis facing the light) or in the inverted position (lower epidermis facing the light) led to the conclusion that the photoreceptors for blue as well as for red light are located on or near the surfaces of the leaves; presumably they are in the guard cells themselves.
Kiyota, Seiichiro; Xie, Xianzhi; Takano, Makoto
2012-02-01
Phytochromes are red/far-red photoreceptors encoded by a small gene family in higher plants. Differences in phenotype among mutants suggest distinct functions among phytochrome subfamilies. We attempted to find distinct functions among phytochromes by oligo-microarray analysis of single, double, and triple mutants in rice. In most cases, gene expression was redundantly regulated by phytochromes A and B after irradiation by a red light pulse in etiolated rice shoots. However, we found that several genes were specifically regulated by phytochromes A and C. Most of them were expressed immediately after the red light pulse in a transient manner. They are stress-related genes that may be involved in resistance to light stress when etiolated seedlings are exposed to light. These genes were not expressed in green leaves after the red light pulse, suggesting that they have a function specific to etiolated seedlings. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
Huang, Junhui; Xue, Qi; Wang, Zhao; Gao, Jianmin
2016-09-03
While color-coding methods have improved the measuring efficiency of a structured light three-dimensional (3D) measurement system, they decreased the measuring accuracy significantly due to lateral chromatic aberration (LCA). In this study, the LCA in a structured light measurement system is analyzed, and a method is proposed to compensate the error caused by the LCA. Firstly, based on the projective transformation, a 3D error map of LCA is constructed in the projector images by using a flat board and comparing the image coordinates of red, green and blue circles with the coordinates of white circles at preselected sample points within the measurement volume. The 3D map consists of the errors, which are the equivalent errors caused by LCA of the camera and projector. Then in measurements, error values of LCA are calculated and compensated to correct the projector image coordinates through the 3D error map and a tri-linear interpolation method. Eventually, 3D coordinates with higher accuracy are re-calculated according to the compensated image coordinates. The effectiveness of the proposed method is verified in the following experiments.
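A generic sketch of the compensation step described above (our assumptions, not the authors' implementation): the LCA error at a measured projector coordinate is looked up by tri-linear interpolation in a precomputed 3D error map and subtracted. The grid sizes and the zero-valued map below are placeholders; in practice the map would be filled from the flat-board measurements.

```python
# Tri-linear lookup in a 3D LCA error map and coordinate compensation (illustrative).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

u_grid = np.linspace(0, 1024, 9)             # projector image u samples [px]
v_grid = np.linspace(0, 768, 7)              # projector image v samples [px]
z_grid = np.linspace(400, 800, 5)            # measurement depths [mm]
err_u = np.zeros((9, 7, 5))                  # placeholder LCA error in u [px]
err_v = np.zeros((9, 7, 5))                  # placeholder LCA error in v [px]

interp_u = RegularGridInterpolator((u_grid, v_grid, z_grid), err_u)   # linear = tri-linear in 3D
interp_v = RegularGridInterpolator((u_grid, v_grid, z_grid), err_v)

def compensate(u, v, z):
    """Interpolate the error map at (u, v, z) and correct the coordinate."""
    p = [[u, v, z]]
    return u - interp_u(p)[0], v - interp_v(p)[0]

print(compensate(512.3, 300.7, 615.0))       # -> (512.3, 300.7) for a zero map
```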
van Grunsven, Roy H. A.; Ramakers, Jip J. C.; Ferguson, Kim B.; Raap, Thomas; Donners, Maurice; Veenendaal, Elmar M.; Visser, Marcel E.
2017-01-01
Artificial light at night has shown a remarkable increase over the past decades. Effects are reported for many species groups, and include changes in presence, behaviour, physiology and life-history traits. Among these, bats are strongly affected, and how bat species react to light is likely to vary with light colour. Different spectra may therefore be applied to reduce negative impacts. We used a unique set-up of eight field sites to study the response of bats to three different experimental light spectra in an otherwise dark and undisturbed natural habitat. We measured activity of three bat species groups around transects with light posts emitting white, green and red light with an intensity commonly used to illuminate countryside roads. The results reveal a strong and spectrum-dependent response for the slow-flying Myotis and Plecotus and more agile Pipistrellus species, but not for Nyctalus and Eptesicus species. Plecotus and Myotis species avoided white and green light, but were equally abundant in red light and darkness. The agile, opportunistically feeding Pipistrellus species were significantly more abundant around white and green light, most likely because of accumulation of insects, but equally abundant in red illuminated transects compared to dark control. Forest-dwelling Myotis and Plecotus species and more synanthropic Pipistrellus species are thus least disturbed by red light. Hence, in order to limit the negative impact of light at night on bats, white and green light should be avoided in or close to natural habitat, but red lights may be used if illumination is needed. PMID:28566484
NASA Technical Reports Server (NTRS)
Schuerger, A. C.; Brown, C. S.; Stryjewski, E. C.
1997-01-01
Pepper plants (Capsicum annuum L. cv., Hungarian Wax) were grown under metal halide (MH) lamps or light-emitting diode (LED) arrays with different spectra to determine the effects of light quality on plant anatomy of leaves and stems. One LED (660) array supplied 90% red light at 660 nm (25 nm band-width at half-peak height) and 1% far-red light between 700-800 nm. A second LED (660/735) array supplied 83% red light at 660 nm and 17% far-red light at 735 nm (25 nm band-width at half-peak height). A third LED (660/blue) array supplied 98% red light at 660 nm, 1% blue light between 350-550 nm, and 1% far-red light between 700-800 nm. Control plants were grown under broad spectrum metal halide lamps. Plants were grown at a mean photon flux (300-800 nm) of 330 micromol m-2 s-1 under a 12 h day-night photoperiod. Significant anatomical changes in stem and leaf morphologies were observed in plants grown under the LED arrays compared to plants grown under the broad-spectrum MH lamp. Cross-sectional areas of pepper stems, thickness of secondary xylem, numbers of intraxylary phloem bundles in the periphery of stem pith tissues, leaf thickness, numbers of chloroplasts per palisade mesophyll cell, and thickness of palisade and spongy mesophyll tissues were greatest in peppers grown under MH lamps, intermediate in plants grown under the 660/blue LED array, and lowest in peppers grown under the 660 or 660/735 LED arrays. Most anatomical features of pepper stems and leaves were similar among plants grown under 660 or 660/735 LED arrays. The effects of spectral quality on anatomical changes in stem and leaf tissues of peppers generally correlate to the amount of blue light present in the primary light source.
Root-shoot interaction in the greening of wheat seedlings grown under red light
NASA Technical Reports Server (NTRS)
Tripathy, B. C.; Brown, C. S.
1995-01-01
Wheat seedlings grown with roots exposed to constant red light (300-500 micromoles m-2 s-1) did not accumulate chlorophyll in the leaves. In contrast, seedlings grown with their roots shielded from light accumulated chlorophylls. Chlorophyll biosynthesis could be induced in red-light-grown chlorophyll-deficient yellow plants either by reducing the red-light intensity at the root surface to 100 micromoles m-2 s-1 or by supplementing with 6% blue light. The inhibition of chlorophyll biosynthesis was due to impairment of the Mg-chelatase enzyme acting at the origin of the Mg-tetrapyrrole pathway. The root-perceived photomorphogenic inhibition of shoot greening demonstrates root-shoot interaction in the greening process.
Car driver behavior at flashing light railroad grade crossings.
Tenkink, E; Van der Horst, R
1990-06-01
The behavior of car drivers at two Dutch railroad grade crossings with automatic flashing warning lights was analyzed. Car drivers were videotaped while approaching either the red flashing lights or the white flashing "safe"-signal. Approach speeds, positions, and time intervals were semiautomatically measured from videos of more than 900 drivers: 660 while confronted with the red lights and 272 while passing the white light. Of the latter group, head movements during the approach to the crossing were also registered. Red light compliance was relatively good, as no driver was observed to cross later than 6 seconds after the onset of the red lights, despite train-arrival times of well over 60 seconds. The level of red light compliance was further quantified in terms of both the deceleration and time-to-stopping-line as accepted by drivers. From a comparison with earlier research on red light compliance at signalized road intersections it appeared that red light compliance was better at railroad crossings than at road crossings. It is concluded that faulty red light compliance is not a major cause for car-train accidents and that emphasis should be placed on the ability of the present device to attract attention and to signal unambiguously. The high degree of compliance also causes unexpected driver actions, such as emergency braking and hesitations. A yellow phase may reduce these problems. Some drivers tended to proceed immediately after a train had cleared the road instead of waiting for the end of the red signal (typically some 3 to 5 seconds after the train had passed). This tendency might reveal a major cause of dramatic errors when a second train is approaching. Immediate extinction of the red signal is suggested, or even better, a separate signal to announce the arrival of the second train. Behavior during the white signal phase also showed indications of uncertainty. In some 10% of cases drivers tended to decelerate more strongly than necessary and to make extra head movements. It is recommended that the present white flashing signal be reconsidered.
The GCT camera for the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Lapington, J. S.; Abchiche, A.; Allan, D.; Amans, J.-P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Bose, R.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Buckley, J.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Daniel, M. K.; De Franco, A.; De Frondat, F.; Dournaux, J.-L.; Dumas, D.; Ernenwein, J.-P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J.-M.; Jankowsky, D.; Jegouzo, I.; Jogler, T.; Kawashima, T.; Kraus, M.; Laporte, P.; Leach, S.; Lefaucheur, J.; Markoff, S.; Melse, T.; Minaya, I. A.; Mohrmann, L.; Molyneux, P.; Moore, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayede, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Varner, G.; Vink, J.; Watson, J. J.; White, R.; Yamane, N.; Zech, A.; Zink, A.; Zorn, J.; CTA Consortium
2017-12-01
The Gamma Cherenkov Telescope (GCT) is one of the designs proposed for the Small Sized Telescope (SST) section of the Cherenkov Telescope Array (CTA). The GCT uses dual-mirror optics, resulting in a compact telescope with good image quality and a large field of view with a smaller, more economical, camera than is achievable with conventional single mirror solutions. The photon counting GCT camera is designed to record the flashes of atmospheric Cherenkov light from gamma and cosmic ray initiated cascades, which last only a few tens of nanoseconds. The GCT optics require that the camera detectors follow a convex surface with a radius of curvature of 1 m and a diameter of 35 cm, which is approximated by tiling the focal plane with 32 modules. The first camera prototype is equipped with multi-anode photomultipliers, each comprising an 8×8 array of 6×6 mm2 pixels to provide the required angular scale, adding up to 2048 pixels in total. Detector signals are shaped, amplified and digitised by electronics based on custom ASICs that provide digitisation at 1 GSample/s. The camera is self-triggering, retaining images where the focal plane light distribution matches predefined spatial and temporal criteria. The electronics are housed in the liquid-cooled, sealed camera enclosure. LED flashers at the corners of the focal plane provide a calibration source via reflection from the secondary mirror. The first GCT camera prototype underwent preliminary laboratory tests last year. In November 2015, the camera was installed on a prototype GCT telescope (SST-GATE) in Paris and was used to successfully record the first Cherenkov light of any CTA prototype, and the first Cherenkov light seen with such a dual-mirror optical system. A second full-camera prototype based on Silicon Photomultipliers is under construction. Up to 35 GCTs are envisaged for CTA.
Comparing light sensitivity, linearity and step response of electronic cameras for ophthalmology.
Kopp, O; Markert, S; Tornow, R P
2002-01-01
To develop and test a procedure to measure and compare the light sensitivity, linearity and step response of electronic cameras. The pixel value (PV) of digitized images was measured as a function of light intensity (I). The sensitivity was calculated from the slope of the PV(I) function; the linearity was estimated from the correlation coefficient of this function. To measure the step response, a short sequence of images was acquired. During acquisition, a light source was switched on and off using a fast shutter. The resulting PV was calculated for each video field of the sequence. A CCD camera optimized for the near-infrared (IR) spectrum showed the highest sensitivity for both visible and IR light. There were only small differences in linearity. The step response depends on the integration and readout procedure.
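As a rough illustration of the sensitivity and linearity estimates described above, the sketch below fits a line to pixel-value readings taken at several known light intensities; the intensity and PV numbers are invented for the example, not data from the study.

```python
# Minimal sketch, assuming synthetic readings: sensitivity from the slope of
# PV(I) and linearity from the correlation coefficient of the same data.
import numpy as np

intensity = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])             # relative light intensity I (hypothetical)
pixel_value = np.array([2.1, 52.3, 101.8, 149.9, 201.2, 250.5])  # mean PV per frame (hypothetical)

slope, offset = np.polyfit(intensity, pixel_value, 1)  # sensitivity = slope of PV(I)
r = np.corrcoef(intensity, pixel_value)[0, 1]          # linearity = correlation coefficient

print(f"sensitivity ~ {slope:.1f} PV per unit intensity, linearity r = {r:.4f}")
```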
Multi-channel automotive night vision system
NASA Astrophysics Data System (ADS)
Lu, Gang; Wang, Li-jun; Zhang, Yi
2013-09-01
A four-channel automotive night vision system is designed and developed. It consists of four active near-infrared cameras and a multi-channel image processing and display unit; the cameras are placed at the front, left, right, and rear of the automobile. The system uses a near-infrared laser light source with a collimated beam and a thermoelectric cooler (TEC); the source can be synchronized with camera focusing and has automatic light intensity adjustment, which helps ensure image quality. The composition of the system is described in detail; on this basis, beam collimation, the LD driving and LD temperature control of the near-infrared laser light source, and the four-channel image processing and display are discussed. The system can be used in driver assistance, blind spot information (BLIS), parking assist, and alarm systems by day and night.
Improving accuracy of Plenoptic PIV using two light field cameras
NASA Astrophysics Data System (ADS)
Thurow, Brian; Fahringer, Timothy
2017-11-01
Plenoptic particle image velocimetry (PIV) has recently emerged as a viable technique for acquiring three-dimensional, three-component velocity field data using a single plenoptic, or light field, camera. The simplified experimental arrangement is advantageous in situations where optical access is limited and/or it is not possible to set-up the four or more cameras typically required in a tomographic PIV experiment. A significant disadvantage of a single camera plenoptic PIV experiment, however, is that the accuracy of the velocity measurement along the optical axis of the camera is significantly worse than in the two lateral directions. In this work, we explore the accuracy of plenoptic PIV when two plenoptic cameras are arranged in a stereo imaging configuration. It is found that the addition of a 2nd camera improves the accuracy in all three directions and nearly eliminates any differences between them. This improvement is illustrated using both synthetic and real experiments conducted on a vortex ring using both one and two plenoptic cameras.
Applications of digital image acquisition in anthropometry
NASA Technical Reports Server (NTRS)
Woolford, B.; Lewis, J. L.
1981-01-01
A description is given of a video kinesimeter, a device for the automatic real-time collection of kinematic and dynamic data. Based on the detection of a single bright spot by three TV cameras, the system provides automatic real-time recording of three-dimensional position and force data. It comprises three cameras, two incandescent lights, a voltage comparator circuit, a central control unit, and a mass storage device. The control unit determines the signal threshold for each camera before testing, sequences the lights, synchronizes and analyzes the scan voltages from the three cameras, digitizes force from a dynamometer, and codes the data for transmission to a floppy disk for recording. Two of the three cameras face each other along the 'X' axis; the third camera, which faces the center of the line between the first two, defines the 'Y' axis. An image from the 'Y' camera and either 'X' camera is necessary for determining the three-dimensional coordinates of the point.
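The abstract describes recovering the 3-D spot position by combining the 'Y' camera image with an image from either 'X' camera. A minimal sketch of that idea, assuming a simple scaled (orthographic) projection and a single hypothetical calibration factor rather than the system's actual optics, is shown below.

```python
# Minimal sketch, assuming scaled orthographic projection (not the actual
# kinesimeter calibration): combine one X-facing and the Y-facing camera view
# to recover a 3-D point.
def point_from_two_views(x_cam_px, y_cam_px, mm_per_px=0.5):
    """x_cam_px: (u, v) pixel coordinates of the bright spot in an X-facing camera;
    y_cam_px: the same spot in the Y-facing camera; mm_per_px is a hypothetical
    calibration factor converting pixels to millimetres."""
    y = x_cam_px[0] * mm_per_px      # horizontal axis of the X-facing view -> world Y
    z1 = x_cam_px[1] * mm_per_px     # vertical axis of the X-facing view   -> world Z
    x = y_cam_px[0] * mm_per_px      # horizontal axis of the Y-facing view -> world X
    z2 = y_cam_px[1] * mm_per_px     # vertical axis of the Y-facing view   -> world Z
    return x, y, (z1 + z2) / 2.0     # average the two Z estimates

print(point_from_two_views((120, 80), (200, 78)))   # -> (100.0, 60.0, 39.5)
```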
The design and performance of high resolution échelle spectrographs in astronomy
NASA Astrophysics Data System (ADS)
Barnes, Stuart
The design and performance of several high resolution spectrographs for use in astronomy will be described. After a basic outline of the required theory, the design and performance of HERCULES will be presented. HERCULES is an R2 spectrograph fibre-fed from the MJUO 1-m telescope. The échelle grating has 31.6 grooves/mm and it uses a BK7 prism with a 50° apex angle in double-pass for cross-dispersion. A folded Schmidt camera is used for imaging. With a detector having an area 50 x 50 mm, and pixels less than 25 µm, HERCULES is capable of resolving powers of 40,000 to 80,000 and wavelength coverage from 380 to 880 nm. The total throughput (from the fibre entrance to the CCD) is expected to be nearly 20% (in 1" seeing). Measured efficiencies are only slightly less than this. HERCULES is also shown to be capable of excellent radial velocity precision with no apparent difference between long-term and short-term stability. Several significant upgrade options are also described. As part of the evolution of the design of a high resolution spectrograph for SALT, several instruments were developed for 10-metre class telescopes. Early designs, based in part on the successful HERCULES design, did not meet the requirements of a number of potential users, due in particular to the limited ability to interleave object and sky orders. This resulted in the design of SALT HRS R2, which uses a mosaic of two 308 x 413 mm R2 échelle gratings with 87 grooves/mm. Cross-dispersion is achieved with a pair of large 40° apex angle BK7 prisms used in double-pass. The échelle grating accepts a 365-mm collimated beam. The camera is a catadioptric system having a 1.2-m primary mirror and three lenses made of BK7, each around 850 mm in diameter. Complete unvignetted (except by the CCD obstruction) wavelength coverage from 370 nm to 890 nm is possible on a mosaic of three 2k by 4k CCDs with 15 µm pixels. A maximum resolving power of R ≈ 80,000 is possible. For immunity to atmospheric pressure and temperature changes the entire spectrograph is designed to be housed inside either a helium atmosphere or a light vacuum. The spectrograph chamber is nearly seven metres long. An alternative to the R2 SALT HRS is also described. This instrument is an R4 dual beam spectrograph based on a white pupil layout. The design is based on suggestions by B. Delabre and follows closely this author's SOAR HRS instrument. SALT HRS R4 uses volume-phase holographic gratings for cross-dispersion and an 836 x 204 mm échelle grating with 41.6 grooves/mm. The grating will be replicated from two smaller gratings onto a single Zerodur blank. The spectrograph is split into blue and red arms by a dichroic located near the white pupil relay intermediate focus. Wavelengths from 370 nm to 890 nm are covered by two fixed format blue and red dedicated dioptric cameras. The detectors will be a single 2k by 4k CCD with 15 µm pixels for the blue camera and a 4k by 4k CCD with 15 µm pixels for the red. The size of the cameras is reduced significantly by white pupil demagnification from an initial 200-mm diameter collimated beam incident on the échelle grating to around 100 mm (in undispersed light) on the VPH gratings. The final SALT HRS R4 instrument is also designed to be immersed in a vacuum vessel which is considerably smaller than that proposed for the R2 spectrograph. SALT HRS R4 is currently being developed in detail and will be presented for a critical design review in 2005 April.
ERIC Educational Resources Information Center
Catelli, Francisco; Giovannini, Odilon; Bolzan, Vicente Dall Agnol
2011-01-01
The interference fringes produced by a diffraction grating illuminated with radiation from a TV remote control and a red laser beam are, simultaneously, captured by a digital camera. Based on an image with two interference patterns, an estimate of the infrared radiation wavelength emitted by a TV remote control is made. (Contains 4 figures.)
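For context, with the grating and camera geometry held fixed, fringe spacing is (to first order) proportional to wavelength, so the unknown infrared wavelength can be estimated from the ratio of fringe spacings measured in the shared photograph. The numbers below are hypothetical, not values from the article.

```python
# Minimal sketch, assuming small diffraction angles and hypothetical fringe
# spacings measured from the combined photograph.
red_wavelength_nm = 650.0        # known reference laser wavelength (assumption)
red_fringe_spacing_px = 118.0    # measured spacing of the red pattern (hypothetical)
ir_fringe_spacing_px = 172.0     # measured spacing of the remote's pattern (hypothetical)

ir_wavelength_nm = red_wavelength_nm * ir_fringe_spacing_px / red_fringe_spacing_px
print(f"estimated IR wavelength ~ {ir_wavelength_nm:.0f} nm")  # typical remotes emit near 940 nm
```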
NASA Astrophysics Data System (ADS)
Nishidate, Izumi; Hoshi, Akira; Aoki, Yuta; Nakano, Kazuya; Niizeki, Kyuichi; Aizu, Yoshihisa
2016-03-01
A non-contact imaging method with a digital RGB camera is proposed to evaluate the plethysmogram and spontaneous low-frequency oscillation. In vivo experiments with human skin during mental stress induced by the Stroop color-word test demonstrated the feasibility of the method for evaluating the activities of the autonomic nervous system.
Visual based laser speckle pattern recognition method for structural health monitoring
NASA Astrophysics Data System (ADS)
Park, Kyeongtaek; Torbol, Marco
2017-04-01
This study performed the system identification of a target structure by analyzing the laser speckle pattern recorded by a camera. The laser speckle pattern is generated by the diffuse reflection of the laser beam on a rough surface of the target structure. The camera, equipped with a red filter, records the scattered speckle particles of the laser light in real time, and the raw speckle image pixel data are fed to the graphics processing unit (GPU) in the system. The algorithm for laser speckle contrast analysis (LASCA) computes the laser speckle contrast images and the laser speckle flow images. The k-means clustering algorithm is used to classify the pixels in each frame, and the clusters' centroids, which function as virtual sensors, track the displacement between different frames in the time domain. The fast Fourier transform (FFT) and frequency domain decomposition (FDD) compute the modal properties of the structure: natural frequencies and damping ratios. This study takes advantage of the large-scale computational capability of the GPU. The algorithm is written in Compute Unified Device Architecture (CUDA C), which allows the processing of speckle images in real time.
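The core LASCA quantity is the local speckle contrast, the ratio of standard deviation to mean intensity over small windows. The sketch below shows that generic contrast step only, in Python rather than the CUDA pipeline described above, with a random array standing in for a camera frame.

```python
# Minimal sketch of the laser speckle contrast (LASCA) step: K = sigma / mean
# over a sliding window of the raw speckle image.
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(image, win=7):
    """image: 2-D array of raw speckle intensities; win: window size in pixels."""
    image = image.astype(float)
    mean = uniform_filter(image, win)
    mean_sq = uniform_filter(image * image, win)
    var = np.maximum(mean_sq - mean * mean, 0.0)   # local variance
    return np.sqrt(var) / np.maximum(mean, 1e-12)  # local contrast K

frame = np.random.rand(256, 256)    # stand-in for a raw speckle frame
K = speckle_contrast(frame)
print(K.shape, float(K.mean()))
```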
Effects of supplementary lighting by natural light for growth of Brassica chinensis
NASA Astrophysics Data System (ADS)
Yeh, Shih-Chuan; Lee, Hui-Ping; Kao, Shih-Tse; Lu, Ju-Lin
2016-04-01
This paper presents a model of a cultivation chamber with supplementary natural colour light. We investigate the effects of supplementary natural red light and natural blue light on the growth of Brassica chinensis under natural white light illumination. After 4 weeks of supplementary colour light treatment, the experimental results showed that fresh leaf weight was not affected by supplementary natural blue light. However, Brassica chinensis cultivated in chambers with supplementary natural red light showed a significant increase in fresh leaf weight under both white light illumination models. The combination of natural white light with supplementary natural red light illumination therefore benefits growth in cultivation and saves energy.
Fischer, Andreas; Kupsch, Christian; Gürtler, Johannes; Czarske, Jürgen
2015-09-21
Non-intrusive fast 3d measurements of volumetric velocity fields are necessary for understanding complex flows. Using high-speed cameras and spectroscopic measurement principles, where the Doppler frequency of scattered light is evaluated within the illuminated plane, each pixel allows one measurement and, thus, planar measurements with high data rates are possible. While scanning is one standard technique to add the third dimension, the volumetric data is not acquired simultaneously. In order to overcome this drawback, a high-speed light field camera is proposed for obtaining volumetric data with each single frame. The high-speed light field camera approach is applied to a Doppler global velocimeter with sinusoidal laser frequency modulation. As a result, a frequency multiplexing technique is required in addition to the plenoptic refocusing for eliminating the crosstalk between the measurement planes. However, the plenoptic refocusing is still necessary in order to achieve a large refocusing range for a high numerical aperture that minimizes the measurement uncertainty. Finally, two spatially separated measurement planes with 25×25 pixels each are simultaneously acquired with a measurement rate of 0.5 kHz with a single high-speed camera.
Far Red and White Light-promoted Utilization of Calcium by Seedlings of Phaseolus vulgaris L.
Helms, K; David, D J
1973-01-01
The cotyledons and embryo axes of seeds of Phaseolus vulgaris L. cv. Pinto contained 16% of the total calcium in the seed. The remaining 84% was in the testas. There was no evidence that calcium in testas was used in seedling growth or that calcium was leached from seedlings during growth. An external supply of calcium decreased the incidence of hypocotyl collapse (a severe symptom of calcium deficiency), increased the calcium content of all organs, and increased the dry weight of all organs except cotyledons. Light treatments decreased the incidence of hypocotyl collapse and increased the calcium content and dry weight of all organs except cotyledons and hypocotyls. White light was more effective than far red light for decreasing incidence of hypocotyl collapse. Usually the effects of white light and far red light on the calcium content and dry weight of organs were similar, and usually those of white light were quantitatively greater than those of far red light. It is suggested that the light-promoted effects were associated with photomorphogenesis and that differences in data obtained with white light and far red light could be associated with photosynthesis.
Robust Behavior Recognition in Intelligent Surveillance Environments.
Batchuluun, Ganbayar; Kim, Yeong Gon; Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung
2016-06-30
Intelligent surveillance systems have been studied by many researchers. These systems should operate in both daytime and nighttime, but objects are invisible in images captured by a visible light camera during the night. Therefore, near-infrared (NIR) cameras and thermal cameras (based on medium-wavelength infrared (MWIR) and long-wavelength infrared (LWIR) light) have been considered as alternatives for nighttime use. Because the system must operate during both daytime and nighttime, and because NIR cameras are limited by the need for an additional NIR illuminator (which must illuminate a wide area over a great distance) at night, a dual system of visible light and thermal cameras is used in our research, and we propose a new behavior recognition method for intelligent surveillance environments. Twelve datasets were compiled by collecting data in various environments, and they were used to obtain experimental results. The recognition accuracy of our method was found to be 97.6%, confirming its ability to outperform previous methods.
Feasibility and accuracy assessment of light field (plenoptic) PIV flow-measurement technique
NASA Astrophysics Data System (ADS)
Shekhar, Chandra; Ogawa, Syo; Kawaguchi, Tatsuya
A light field camera can enable measurement of all the three velocity components of a flow field inside a three-dimensional volume when implemented in a PIV measurement. Due to the usage of only one camera, the measurement procedure gets greatly simplified, as well as measurement of the flows with limited visual access also becomes possible. Due to these advantages, light field cameras and their usage in PIV measurements are actively studied. The overall procedure of obtaining an instantaneous flow field consists of imaging a seeded flow at two closely separated time instants, reconstructing the two volumetric distributions of the particles using algorithms such as MART, followed by obtaining the flow velocity through cross-correlations. In this study, we examined effects of various configuration parameters of a light field camera on the in-plane and the depth resolutions, obtained near-optimal parameters in a given case, and then used it to simulate a PIV measurement scenario in order to assess the reconstruction accuracy.
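For reference, the final cross-correlation step mentioned above can be illustrated independently of the volumetric (e.g. MART) reconstruction: the displacement of an interrogation window between the two exposures is the location of the correlation peak. The sketch below is a generic 2-D, FFT-based version with synthetic data, not the study's implementation.

```python
# Minimal sketch of FFT-based cross-correlation for one PIV interrogation window.
import numpy as np

def window_displacement(win_a, win_b):
    """Return the (row, col) shift of win_b relative to win_a."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map circular peak positions to signed displacements
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

a = np.zeros((32, 32)); a[10:14, 12:16] = 1.0   # synthetic particle blob
b = np.roll(a, (3, -2), axis=(0, 1))            # same blob shifted by (3, -2)
print(window_displacement(a, b))                # -> (3, -2)
```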
SVBRDF-Invariant Shape and Reflectance Estimation from a Light-Field Camera.
Wang, Ting-Chun; Chandraker, Manmohan; Efros, Alexei A; Ramamoorthi, Ravi
2018-03-01
Light-field cameras have recently emerged as a powerful tool for one-shot passive 3D shape capture. However, obtaining the shape of glossy objects like metals or plastics remains challenging, since standard Lambertian cues like photo-consistency cannot be easily applied. In this paper, we derive a spatially-varying (SV)BRDF-invariant theory for recovering 3D shape and reflectance from light-field cameras. Our key theoretical insight is a novel analysis of diffuse plus single-lobe SVBRDFs under a light-field setup. We show that, although direct shape recovery is not possible, an equation relating depths and normals can still be derived. Using this equation, we then propose using a polynomial (quadratic) shape prior to resolve the shape ambiguity. Once shape is estimated, we also recover the reflectance. We present extensive synthetic data on the entire MERL BRDF dataset, as well as a number of real examples to validate the theory, where we simultaneously recover shape and BRDFs from a single image taken with a Lytro Illum camera.
Photooxidation of Amplex Red to resorufin: implications of exposing the Amplex Red assay to light
Zhao, Baozhong; Summers, Fiona A.; Mason, Ronald P.
2012-01-01
The Amplex Red assay, a fluorescent assay for the detection of H2O2, relies on the reaction of H2O2 and colorless, nonfluorescent Amplex Red with a 1:1 stoichiometry to form colored, fluorescent resorufin, catalyzed by horseradish peroxidase (HRP). We have found that resorufin is artifactually formed when Amplex Red is exposed to light. In the absence of H2O2 and HRP, the absorption and fluorescence spectra of Amplex Red changed during exposure to ambient room light or instrumental excitation light, clearly indicating that the fluorescent product resorufin had formed. This photochemistry was initiated by trace amounts of resorufin that are present in Amplex Red stock solutions. ESR spin-trapping studies demonstrated that superoxide radical was an intermediate in this process. Oxygen consumption measurements further confirmed that superoxide and H2O2 were artifactually produced by the photooxidation of Amplex Red. The artifactual formation of resorufin was also significantly increased by the presence of superoxide dismutase or HRP. This photooxidation process will result in a less sensitive assay for H2O2 under ambient light exposure and potentially invalid measurements under high energy exposure such as UVA irradiation. In general, precautions should be taken to minimize exposure to light during measurement of oxidative stress with Amplex Red. PMID:22765927
Photooxidation of Amplex Red to resorufin: implications of exposing the Amplex Red assay to light.
Zhao, Baozhong; Summers, Fiona A; Mason, Ronald P
2012-09-01
The Amplex Red assay, a fluorescent assay for the detection of H(2)O(2), relies on the reaction of H(2)O(2) and colorless, nonfluorescent Amplex Red with a 1:1 stoichiometry to form colored, fluorescent resorufin, catalyzed by horseradish peroxidase (HRP). We have found that resorufin is artifactually formed when Amplex Red is exposed to light. In the absence of H(2)O(2) and HRP, the absorption and fluorescence spectra of Amplex Red changed during exposure to ambient room light or instrumental excitation light, clearly indicating that the fluorescent product resorufin had formed. This photochemistry was initiated by trace amounts of resorufin that are present in Amplex Red stock solutions. ESR spin-trapping studies demonstrated that superoxide radical was an intermediate in this process. Oxygen consumption measurements further confirmed that superoxide and H(2)O(2) were artifactually produced by the photooxidation of Amplex Red. The artifactual formation of resorufin was also significantly increased by the presence of superoxide dismutase or HRP. This photooxidation process will result in a less sensitive assay for H(2)O(2) under ambient light exposure and potentially invalid measurements under high energy exposure such as UVA irradiation. In general, precautions should be taken to minimize exposure to light during measurement of oxidative stress with Amplex Red. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Do, Trong Hop; Yoo, Myungsik
2018-01-01
This paper proposes a vehicle positioning system using LED street lights and two rolling shutter CMOS sensor cameras. In this system, identification codes for the LED street lights are transmitted to camera-equipped vehicles through a visible light communication (VLC) channel. Given that the camera parameters are known, the positions of the vehicles are determined based on the geometric relationship between the coordinates of the LEDs in the images and their real world coordinates, which are obtained through the LED identification codes. The main contributions of the paper are twofold. First, the collinear arrangement of the LED street lights makes traditional camera-based positioning algorithms fail to determine the position of the vehicles. In this paper, an algorithm is proposed to fuse data received from the two cameras attached to the vehicles in order to solve the collinearity problem of the LEDs. Second, the rolling shutter mechanism of the CMOS sensors combined with the movement of the vehicles creates image artifacts that may severely degrade the positioning accuracy. This paper also proposes a method to compensate for the rolling shutter artifact, and a high positioning accuracy can be achieved even when the vehicle is moving at high speeds. The performance of the proposed positioning system corresponding to different system parameters is examined by conducting Matlab simulations. Small-scale experiments are also conducted to study the performance of the proposed algorithm in real applications.
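The paper's contribution is a two-camera fusion that copes with collinear street lights and rolling shutter artifacts; that algorithm is not reproduced here. As generic background only, once the VLC identification codes give the world coordinates of at least four non-collinear imaged LEDs and the intrinsics are known, a standard perspective-n-point solve recovers the camera (vehicle) pose. All coordinates below are hypothetical.

```python
# Minimal generic PnP sketch (not the paper's dual-camera fusion algorithm).
import numpy as np
import cv2

led_world = np.array([[0.0, 0.0, 6.0],    # LED positions in metres, obtained from
                      [8.0, 0.0, 6.0],    # the VLC identification codes (hypothetical)
                      [16.0, 0.0, 6.0],
                      [8.0, 3.0, 6.0]])
led_pixels = np.array([[312.0, 120.0],    # detected image coordinates (hypothetical)
                       [540.0, 131.0],
                       [742.0, 140.0],
                       [548.0, 95.0]])
K = np.array([[1200.0, 0.0, 640.0],       # assumed camera intrinsics
              [0.0, 1200.0, 360.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(led_world, led_pixels, K, None)
R, _ = cv2.Rodrigues(rvec)
vehicle_position = -R.T @ tvec            # camera position in world coordinates
print(ok, vehicle_position.ravel())
```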
Light Spectrum Related Responses of 1-g and Clino-Rotated Cress
NASA Astrophysics Data System (ADS)
Rakleviciene, D.; Svegzdiene, D.; Losinska, R.
2008-06-01
Growth and positioning of cress on a 50-rpm horizontal clinostat in response to blue (450 nm), red (660 nm) and far-red (735 nm) light spectral components and their combinations (red & far-red, or blue & red & far-red) were estimated and compared with cress grown in the usual vertical position with and without illumination. No gravity-related alterations were found in the elongation of dark-grown hypocotyls, though leaves responded slightly to clino-rotation. Light of 450, 660 and 735 nm applied at a comparatively low photon flux density (5, 13, and 0.8-1 μmol m-2s-1, respectively) had a stronger inhibiting effect on the elongation of hypocotyls on the clinostat than at 1 g. Growth of 1-g petioles responded to the light spectrum, which was not the case with clino-rotated ones. However, radial expansion of cells in palisade and spongy mesophyll tissues of clino-rotated laminas was promoted under combined blue & red & far-red illumination (50 μmol·m-2s-1). Gravity-dependent alteration of the positioning of leaf petioles and laminas was suppressed by light. The obtained data confirm the interactions between responses of cress seedlings induced by changed gravity and by spectral components of light.
NASA Technical Reports Server (NTRS)
Ivanov, Anton B.
2003-01-01
The Mars Orbiter Camera (MOC) has been operating on board the Mars Global Surveyor (MGS) spacecraft since 1998. It consists of three cameras - Red and Blue Wide Angle cameras (FOV=140 deg.) and a Narrow Angle camera (FOV=0.44 deg.). The Wide Angle camera allows surface resolution down to 230 m/pixel and the Narrow Angle camera down to 1.5 m/pixel. This work is a continuation of the project, which we have reported previously. Since then we have refined and improved our stereo correlation algorithm and have processed many more stereo pairs. We will discuss results of our stereo pair analysis located in the Mars Exploration Rover (MER) landing sites and address the feasibility of recovering topography from stereo pairs (especially in the polar regions) taken during the MGS 'Relay-16' mode.
33 CFR 117.1007 - Elizabeth River-Eastern Branch.
Code of Federal Regulations, 2013 CFR
2013-07-01
... closing the draw, the channel traffic lights will change from flashing green to flashing red, the horn... down to vessels, the channel traffic lights will continue to flash red. (6) When the rail traffic has... opening to vessel traffic. During the opening swing movement, the channel traffic lights will flash red...
33 CFR 117.1007 - Elizabeth River-Eastern Branch.
Code of Federal Regulations, 2014 CFR
2014-07-01
... closing the draw, the channel traffic lights will change from flashing green to flashing red, the horn... down to vessels, the channel traffic lights will continue to flash red. (6) When the rail traffic has... opening to vessel traffic. During the opening swing movement, the channel traffic lights will flash red...
33 CFR 117.1007 - Elizabeth River-Eastern Branch.
Code of Federal Regulations, 2011 CFR
2011-07-01
... closing the draw, the channel traffic lights will change from flashing green to flashing red, the horn... down to vessels, the channel traffic lights will continue to flash red. (6) When the rail traffic has... opening to vessel traffic. During the opening swing movement, the channel traffic lights will flash red...
33 CFR 117.1007 - Elizabeth River-Eastern Branch.
Code of Federal Regulations, 2012 CFR
2012-07-01
... closing the draw, the channel traffic lights will change from flashing green to flashing red, the horn... down to vessels, the channel traffic lights will continue to flash red. (6) When the rail traffic has... opening to vessel traffic. During the opening swing movement, the channel traffic lights will flash red...
DOT National Transportation Integrated Search
2013-11-01
Red light running has become a serious safety issue at signalized intersections throughout the : United States. One objective of this study was to identify the characteristics of red-light-running (RLR) : crashes and the drivers involved in those cra...
Dhakal, Radhika; Park, Euiho; Lee, Se-Weon; Baek, Kwang-Hyun
2015-01-01
Specific wavelengths of light can exert various physiological changes in plants, including effects on responses to disease incidence. To determine whether specific light wavelengths had effects on the rotting disease caused by Pseudomonas putida 229, soybean sprouts were germinated under a narrow range of wavelengths from light-emitting diodes (LEDs), including red (650–660 nm), far-red (720–730 nm) and blue (440–450 nm), or a broad range of wavelengths from daylight fluorescent bulbs. The controls were soybean sprouts germinated in darkness. After germination under the different conditions for 5 days, the soybean sprouts were inoculated with P. putida 229 and disease incidence was observed for 5 days. The sprouts exposed to red light showed increased resistance against P. putida 229 relative to those grown under the other conditions. Soybean sprouts germinated under red light accumulated high levels of salicylic acid (SA) accompanied by up-regulation of the biosynthetic gene ICS and the pathogenesis-related (PR) gene PR-1, indicating that the resistance was induced by the action of SA via de novo synthesis of SA in the soybean sprouts under red light irradiation. Taken together, these data suggest that only a narrow range of red light can induce disease resistance in soybean sprouts, regulated by the SA-dependent pathway via the de novo synthesis of SA and up-regulation of PR genes. PMID:25679808
Evidence for yellow light suppression of lettuce growth
NASA Technical Reports Server (NTRS)
Dougher, T. A.; Bugbee, B.
2001-01-01
Researchers studying plant growth under different lamp types often attribute differences in growth to a blue light response. Lettuce plants were grown in six blue light treatments comprising five blue light fractions (0, 2, 6% from high-pressure sodium [HPS] lamps and 6, 12, 26% from metal halide [MH] lamps). Lettuce chlorophyll concentration, dry mass, leaf area and specific leaf area under the HPS and MH 6% blue were significantly different, suggesting wavelengths other than blue and red affected plant growth. Results were reproducible in two replicate studies at each of two photosynthetic photon fluxes, 200 and 500 µmol m-2 s-1. We graphed the data against absolute blue light, phytochrome photoequilibrium, phototropic blue, UV, red:far red, blue:red, blue:far red and 'yellow' light fraction. Only the 'yellow' wavelength range (580-600 nm) explained the differences between the two lamp types.
NASA Technical Reports Server (NTRS)
Franke, John M.; Rhodes, David B.; Jones, Stephen B.; Dismond, Harriet R.
1992-01-01
A technique for synchronizing a pulse light source to charge coupled device cameras is presented. The technique permits the use of pulse light sources for continuous as well as stop action flow visualization. The technique has eliminated the need to provide separate lighting systems at facilities requiring continuous and stop action viewing or photography.
NASA Technical Reports Server (NTRS)
Behringer, F. J.; Lomax, T. L.
1999-01-01
The lz-2 mutation in tomato (Lycopersicon esculentum) causes conditional reversal of shoot gravitropism by light. This response is mediated by phytochrome. To further elucidate the mechanism by which phytochrome regulates the lz-2 phenotype, phytochrome-deficient lz-2 plants were generated. Introduction of au alleles, which severely block chromophore biosynthesis, eliminated the reversal of hypocotyl gravitropism in continuous red and far-red light. The fri1 and tri1 alleles were introduced to specifically deplete phytochromes A and B1, respectively. In dark-grown seedlings, phytochrome A was necessary for the response to high-irradiance far-red light and a complete response to low-fluence red light, and also mediated the effects of blue light in a far-red-reversible manner. Loss of phytochrome B1 alone did not significantly affect the behaviour of lz-2 plants under any light treatment tested. However, dark-grown lz-2 plants lacking both phytochromes A and B1 exhibited reduced responses to continuous red light and were less responsive to low-fluence red light and high-fluence blue light than plants deficient for phytochrome A alone. In high-light, full-spectrum greenhouse conditions, lz-2 plants grew downward regardless of the phytochrome deficiency. These results indicate that phytochromes A and B1 play significant roles in mediating the lz-2 phenotype and that at least one additional phytochrome is involved in reversing shoot gravitropism in this mutant.
Calcium in the regulation of gravitropism by light
NASA Technical Reports Server (NTRS)
Perdue, D. O.; LaFavre, A. K.; Leopold, A. C.
1988-01-01
The red light requirement for positive gravitropism in roots of corn (Zea mays cv "Merit") provides an entry for examining the participation of calcium in gravitropism. Applications of calcium chelators inhibit the light response. Calcium channel blockers (verapamil, lanthanum) can also inhibit the light response, and a calcium ionophore, A23187, can substitute for light. One can substitute for red light by treatments which have elsewhere been shown to trigger Ca2+ influx into the cytosol, e.g. heat or cold shock. Agents which are known to be agonists of the phosphatidylinositol second messenger system (serotonin, 2,4-dichlorophenoxyacetic acid, deoxycholate) can each partially substitute for the red light, and Li+ can inhibit the light effect. These experiments suggest that the induction of positive gravitropism by red light involves a rise in cytoplasmic Ca2+ concentration, and that a contribution to this end may be made by the phosphatidylinositol second messenger system.
Electronic cameras for low-light microscopy.
Rasnik, Ivan; French, Todd; Jacobson, Ken; Berland, Keith
2013-01-01
This chapter introduces electronic cameras, discusses the various parameters considered for evaluating their performance, and describes some of the key features of different camera formats. The chapter also explains the basic functioning of electronic cameras and how their properties can be exploited to optimize image quality under low-light conditions. Although there are many types of cameras available for microscopy, the most reliable type is the charge-coupled device (CCD) camera, which remains preferred for high-performance systems. If time resolution and frame rate are of no concern, slow-scan CCDs certainly offer the best available performance, both in terms of the signal-to-noise ratio and their spatial resolution. Slow-scan cameras are thus the first choice for experiments using fixed specimens, such as measurements using immunofluorescence and fluorescence in situ hybridization. However, if video-rate imaging is required, one need not evaluate slow-scan CCD cameras. A very basic video CCD may suffice if samples are heavily labeled or are not perturbed by high-intensity illumination. When video-rate imaging is required for very dim specimens, the electron-multiplying CCD camera is probably the most appropriate at this technological stage. Intensified CCDs provide a unique tool for applications in which high-speed gating is required. Variable-integration-time video cameras are very attractive options if one needs to acquire images at video rate as well as with longer integration times for less bright samples. This flexibility can facilitate many diverse applications with highly varied light levels. Copyright © 2007 Elsevier Inc. All rights reserved.
Arthropod prey of nestling red-cockaded woodpeckers in the upper coastal plain of South Carolina
James L. Hanula; Kathleen E. Franzreb
1995-01-01
Four nest cavities of the Red-cockaded Woodpecker (Picoides borealis) were monitored with automatic cameras to determine the prey selected to feed nestlings. Twelve adults were photographed making nearly 3000 nest visits. Prey in 28 arthropod taxa were recognizable in 65% of the photographic slides. Wood roaches in the genus (Parcoblutta...
Availability and abundance of prey for the red-cockaded woodpecker
James L. Hanula; Scott Horn
2004-01-01
Over a 10-year period we investigated red-cockaded woodpecker (Picoides borealis) prey use, sources of prey, prey distribution within trees and stands, and how forest management decisions affect prey abundance in South Carolina, Alabama, Georgia, and Florida. Cameras were operated at 31 nest cavities to record nest visits with prey in 4 locations...
Kerry R. Foresman; Dean Pearson
1999-01-01
We investigated winter activity patterns of American Martens, Martes americana, Snowshoe Hares, Lepus americanus, and Red Squirrels, Tamiasciurus hudsonicus, in westcentral Montana between November 1994 and March 1995 using dual-sensor remote cameras. One hundred percent of Snowshoe Hare (n = 25) observations occurred at night while Martens (n = 85) exhibited...
Diet of nestling red-cockaded woodpeckers at three locations
James L. Hanula; Donald Lipscomb; Kathleen E. Franzreb; Susan C. Loeb
2000-01-01
We conducted a 2-yr study of the nestling diet of red-cockaded woodpeckers (Picoides borealis) at three locations to determine how it varied among sites. We photographed 5939 nest visits by adult woodpeckers delivering food items for nestlings. In 1994, we located cameras near three nest cavities on the Lower Coastal Plain of South Carolina and near...
Eukaryotic algal phytochromes span the visible spectrum
Rockwell, Nathan C.; Duanmu, Deqiang; Martin, Shelley S.; Bachy, Charles; Price, Dana C.; Bhattacharya, Debashish; Worden, Alexandra Z.; Lagarias, J. Clark
2014-01-01
Plant phytochromes are photoswitchable red/far-red photoreceptors that allow competition with neighboring plants for photosynthetically active red light. In aquatic environments, red and far-red light are rapidly attenuated with depth; therefore, photosynthetic species must use shorter wavelengths of light. Nevertheless, phytochrome-related proteins are found in recently sequenced genomes of many eukaryotic algae from aquatic environments. We examined the photosensory properties of seven phytochromes from diverse algae: four prasinophyte (green algal) species, the heterokont (brown algal) Ectocarpus siliculosus, and two glaucophyte species. We demonstrate that algal phytochromes are not limited to red and far-red responses. Instead, different algal phytochromes can sense orange, green, and even blue light. Characterization of these previously undescribed photosensors using CD spectroscopy supports a structurally heterogeneous chromophore in the far-red–absorbing photostate. Our study thus demonstrates that extensive spectral tuning of phytochromes has evolved in phylogenetically distinct lineages of aquatic photosynthetic eukaryotes. PMID:24567382
NASA Astrophysics Data System (ADS)
Ma, Chen; Cheng, Dewen; Xu, Chen; Wang, Yongtian
2014-11-01
The fundus camera is a complex optical system for retinal photography, involving illumination and imaging of the retina. Stray light is one of the most significant problems of fundus cameras because the retina is so minimally reflective that back reflections from the cornea and any other optical surface are likely to be significantly greater than the light reflected from the retina. To provide maximum illumination to the retina while eliminating back reflections, a novel design for the illumination system of a portable fundus camera is proposed. Internal illumination, in which the eyepiece is shared by both the illumination system and the imaging system but the condenser and the objective are separated by a beam splitter, is adopted for its high efficiency. To eliminate the strong stray light caused by the corneal center and make full use of the light energy, the annular stop of conventional illumination systems is replaced by a fiber-coupled, ring-shaped light source that forms an annular beam. Parameters including the size and divergence angle of the light source are specially designed. To weaken the stray light, a polarized light source is used, and an analyzer plate is placed after the beam splitter in the imaging system. Simulation results show that the illumination uniformity at the fundus exceeds 90%, and the stray light is within 1%. Finally, a proof-of-concept prototype is developed and retinal photos of an ophthalmophantom are captured. The experimental results show that ghost images and stray light have been greatly reduced to a level at which professional diagnosis will not be interfered with.
Characterization of a thinned back illuminated MIMOSA V sensor as a visible light camera
NASA Astrophysics Data System (ADS)
Bulgheroni, Antonio; Bianda, Michele; Caccia, Massimo; Cappellini, Chiara; Mozzanica, Aldo; Ramelli, Renzo; Risigo, Fabio
2006-09-01
This paper reports the measurements that have been performed both in the Silicon Detector Laboratory at the University of Insubria (Como, Italy) and at the Instituto Ricerche SOlari Locarno (IRSOL) to characterize a CMOS pixel particle detector as a visible light camera. The CMOS sensor has been studied in terms of Quantum Efficiency in the visible spectrum, image blooming and reset inefficiency in saturation condition. The main goal of these measurements is to prove that this kind of particle detector can also be used as an ultra fast, 100% fill factor visible light camera in solar physics experiments.
Optical registration of spaceborne low light remote sensing camera
NASA Astrophysics Data System (ADS)
Li, Chong-yang; Hao, Yan-hui; Xu, Peng-mei; Wang, Dong-jie; Ma, Li-na; Zhao, Ying-long
2018-02-01
To meet the high-precision requirement for optical registration of a spaceborne low-light remote sensing camera, dual-channel optical registration of the CCD and EMCCD is achieved with a high-magnification optical registration system. A system-integration optical registration and registration-accuracy scheme for a spaceborne low-light remote sensing camera with short focal depth and wide field of view is proposed in this paper. It also includes an analysis of the parallel misalignment of the CCD and of the registration accuracy. Actual registration results show that the imaging is clear and that the MTF and registration accuracy meet requirements, providing an important guarantee for obtaining high-quality image data in orbit.
Foulds, Wallace S; Barathi, Veluchamy A; Luu, Chi D
2013-12-09
To determine whether progressive ametropia can be induced in chicks and reversed by manipulation of the chromaticity of ambient light. One-day-old chicks were raised in red light (90% red, 10% yellow-green) or in blue light (85% blue, 15% green) with a 12 hour on/off cycle for 14 to 42 days. Refraction was determined by streak retinoscopy, and by automated infrared photoretinoscopy and ocular biometry by A-scan ultrasonography. Red light induced progressive myopia (mean refraction ± SD at 28 days, -2.83 ± 0.25 diopters [D]). Progressive hyperopia was induced by blue light (mean refraction at 28 days, +4.55 ± 0.21 D). The difference in refraction between the groups was highly significant at P < 0.001. Induced myopia or hyperopia was axial as confirmed by ultrasound biometry. Myopia induced by 21 days of red light (-2.21 ± 0.21 D) was reversed to hyperopia (+2.50 ± 0.29 D) by subsequent 21 days of blue light. Hyperopia induced by 21 days of blue light (+4.21 ± 0.19 D) was reversed to myopia (-1.23 ± 0.12 D) by 21 days of red light. Rearing chicks in red light caused progressive myopia, while rearing in blue light caused progressive hyperopia. Light-induced myopia or hyperopia in chicks can be reversed to hyperopia or myopia, respectively, by an alteration in the chromaticity of ambient light. Manipulation of chromaticity may be applicable to the management of human childhood myopia.
Accurate and cost-effective MTF measurement system for lens modules of digital cameras
NASA Astrophysics Data System (ADS)
Chang, Gao-Wei; Liao, Chia-Cheng; Yeh, Zong-Mu
2007-01-01
For many years, the widening use of digital imaging products, e.g., digital cameras, has attracted much attention in the consumer electronics market. It is important to measure and enhance the imaging performance of digital cameras compared to that of conventional cameras (with photographic film). For example, the effect of diffraction arising from the miniaturization of the optical modules tends to decrease the image resolution. As a figure of merit, the modulation transfer function (MTF) has been broadly employed to estimate image quality. Therefore, the objective of this paper is to design and implement an accurate and cost-effective MTF measurement system for the digital camera. Once the MTF of the sensor array is known, that of the optical module can then be obtained. In this approach, a spatial light modulator (SLM) is employed to modulate the spatial frequency of light emitted from the light source. The modulated light passing through the camera under test is consecutively detected by the sensors. The corresponding images formed by the camera are acquired by a computer and then processed by an algorithm for computing the MTF. Finally, an investigation of the measurement accuracy of various methods, such as the bar-target and spread-function methods, indicates that our approach gives quite satisfactory results.
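As general background on the spread-function route mentioned above, the MTF can be obtained as the normalised magnitude of the Fourier transform of a measured line spread function. The sketch below uses a synthetic Gaussian profile as a stand-in for measured data; the sample pitch value is an assumption.

```python
# Minimal sketch: MTF as the magnitude of the Fourier transform of a
# normalised line spread function (LSF).
import numpy as np

def mtf_from_lsf(lsf, sample_pitch_mm):
    """lsf: 1-D line spread function; sample_pitch_mm: spacing between samples."""
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf / lsf.sum()                                  # normalise so MTF(0) = 1
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(lsf.size, d=sample_pitch_mm)   # spatial frequency in cycles/mm
    return freqs, mtf

x = np.linspace(-1.0, 1.0, 129)
lsf = np.exp(-0.5 * (x / 0.05) ** 2)      # synthetic Gaussian LSF (stand-in for a measurement)
freqs, mtf = mtf_from_lsf(lsf, sample_pitch_mm=0.005)
print(freqs[:3], mtf[:3])
```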
Hubble Space Telescope Image of Omega Nebula
NASA Technical Reports Server (NTRS)
2002-01-01
This stunning image, taken by the newly installed Advanced Camera for Surveys (ACS) aboard the Hubble Space Telescope (HST), is an image of the center of the Omega Nebula. It is a hotbed of newly born stars wrapped in colorful blankets of glowing gas and cradled in an enormous cold, dark hydrogen cloud. The region of nebula shown in this photograph is about 3,500 times wider than our solar system. The nebula, also called M17 and the Swan Nebula, resides 5,500 light-years away in the constellation Sagittarius. The Swan Nebula is illuminated by ultraviolet radiation from young, massive stars, located just beyond the upper-right corner of the image. The powerful radiation from these stars evaporates and erodes the dense cloud of cold gas within which the stars formed. The blistered walls of the hollow cloud shine primarily in the blue, green, and red light emitted by excited atoms of hydrogen, nitrogen, oxygen, and sulfur. Particularly striking is the rose-like feature, seen to the right of center, which glows in the red light emitted by hydrogen and sulfur. As the infant stars evaporate the surrounding cloud, they expose dense pockets of gas that may contain developing stars. One isolated pocket is seen at the center of the brightest region of the nebula. Other dense pockets of gas have formed the remarkable feature jutting inward from the left edge of the image. The color image is constructed from four separate images taken in these filters: blue, near infrared, hydrogen alpha, and doubly ionized oxygen. Credit: NASA, H. Ford (JHU), G. Illingworth (UCSC/LO), M. Clampin (STScI), G. Hartig (STScI), the ACS Science Team, and ESA.
NASA Astrophysics Data System (ADS)
Morrill, Waldirene B. B.; Barnabé, Janice M. C.; da Silva, Tatiana P. N.; Pandorfi, Héliton; Gouveia-Neto, Artur S.; Souza, Wellington S.
2014-03-01
Growth performance, behavior, and development of broilers reared under red, green, and blue monochromatic and/or multicolor LED-based illuminants are investigated. The lighting treatments were applied on a 24 h lighting basis for six weeks. Monochromatic red (630 nm), green (520 nm), and blue (460 nm), simultaneous blue-green, and white-light housing illumination was employed. Body weight, food consumption, and behavior were monitored and compared among light treatments. The behavioral data showed that broilers reared under green lighting presented the lowest respiratory rate (87 mov. min-1) while those under red lighting presented the highest (96 mov. min-1). Results also showed that broilers under blue and/or green monochromatic illumination exhibited up to 6% and 8.9% increases in final body weight when compared to those under red or white light, respectively. The highest feed intake and lowest body weight gain were observed in broilers reared under blue and red illumination, respectively.
ERIC Educational Resources Information Center
Brochu, Michel
1983-01-01
In August, 1981, National Aeronautics and Space Administration launched Dynamics Explorer 1 into polar orbit equipped with three cameras built to view the Northern Lights. The cameras can photograph aurora borealis' faint light without being blinded by the earth's bright dayside. Photographs taken by the satellite are provided. (JN)
Effect of meat appearance on consumer preferences for pork chops in Greece and Cyprus.
Fortomaris, P; Arsenos, G; Georgiadis, M; Banos, G; Stamataris, C; Zygoyiannis, D
2006-04-01
The effect of meat appearance on consumers' preferences for pork chops was assessed using images manipulated for appearance characteristics. Data were collected from 412 consumers in Greece and Cyprus. Consumers were asked for their preference for pork chops from a book of computer-modified images and then completed a questionnaire of socio-demographic information, including eating and purchasing behaviour. Consumers under the age of 35 years showed preferences for dark red, lean pork, while consumers aged 35 years and older preferred either dark or light red pork. Gender appeared to be an important selection factor as men showed an increased preference for dark red pork while women preferred the light red. Consumers who stated that they like pork for its taste (91%) preferred either dark or light red pork chops while those who like pork for reasons other than taste preferred dark red, lean pork. Urban consumers preferred light red, fatty pork chops while the rural consumers preferred the dark red pork chops.
Spoelstra, Kamiel; van Grunsven, Roy H A; Ramakers, Jip J C; Ferguson, Kim B; Raap, Thomas; Donners, Maurice; Veenendaal, Elmar M; Visser, Marcel E
2017-05-31
Artificial light at night has shown a remarkable increase over the past decades. Effects are reported for many species groups, and include changes in presence, behaviour, physiology and life-history traits. Among these, bats are strongly affected, and how bat species react to light is likely to vary with light colour. Different spectra may therefore be applied to reduce negative impacts. We used a unique set-up of eight field sites to study the response of bats to three different experimental light spectra in an otherwise dark and undisturbed natural habitat. We measured activity of three bat species groups around transects with light posts emitting white, green and red light with an intensity commonly used to illuminate countryside roads. The results reveal a strong and spectrum-dependent response for the slow-flying Myotis and Plecotus and more agile Pipistrellus species, but not for Nyctalus and Eptesicus species. Plecotus and Myotis species avoided white and green light, but were equally abundant in red light and darkness. The agile, opportunistically feeding Pipistrellus species were significantly more abundant around white and green light, most likely because of accumulation of insects, but equally abundant in red illuminated transects compared to dark control. Forest-dwelling Myotis and Plecotus species and more synanthropic Pipistrellus species are thus least disturbed by red light. Hence, in order to limit the negative impact of light at night on bats, white and green light should be avoided in or close to natural habitat, but red lights may be used if illumination is needed. © 2017 The Author(s).
Phase response of the Arabidopsis thaliana circadian clock to light pulses of different wavelengths.
Ohara, Takayuki; Fukuda, Hirokazu; Tokuda, Isao T
2015-04-01
Light is known as one of the most powerful environmental time cues for the circadian system. The quality of light is characterized by its intensity and wavelength. We examined how the phase response of Arabidopsis thaliana depends on the wavelength of the stimulus light and the type of light perturbation. Using transgenic A. thaliana expressing a luciferase gene, we monitored the rhythm of the bioluminescence signal. We stimulated the plants under constant red light using 3 light perturbation treatments: (1) increasing the red light intensity, (2) turning on a blue light while turning off the red light, and (3) turning on a blue light while keeping the red light on. To examine the phase response properties, we generated a phase transition curve (PTC), which plots the phase after the perturbation as a function of the phase before the perturbation. To evaluate the effect of the 3 light perturbation treatments, we simulated PTCs using a mathematical model of the plant circadian clock and fitted the simulated PTCs to the experimentally measured PTCs. Among the 3 treatments, perturbation (3) provided the strongest stimulus. The results indicate that the color of the stimulus light and the type of pulse administration affect the phase response in a complex manner. Moreover, the results suggest the involvement of interaction between red and blue light signaling pathways in resetting of the plant circadian clock. © 2015 The Author(s).
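Since the phase transition curve (PTC) described above is simply the phase after the perturbation plotted against the phase before it, its construction is easy to illustrate. The sketch below is a minimal illustration assuming a peak-based phase estimate and a nominal 24-h period; it is not the authors' model-fitting pipeline, and all names and data are hypothetical.

```python
import numpy as np

def phase_from_trace(t, signal, period=24.0):
    """Estimate circadian phase (hours, modulo the period) from the timing
    of the most recent interior local maximum of a bioluminescence trace."""
    interior = (signal[1:-1] > signal[:-2]) & (signal[1:-1] >= signal[2:])
    peaks = np.where(interior)[0] + 1
    return t[peaks[-1]] % period

def phase_transition_curve(t, pre_traces, post_traces, period=24.0):
    """Pair each plant's pre-perturbation phase with its post-perturbation
    phase; plotting new phase against old phase gives the PTC."""
    old = np.array([phase_from_trace(t, s, period) for s in pre_traces])
    new = np.array([phase_from_trace(t, s, period) for s in post_traces])
    return old, new

# Hypothetical example: one sinusoidal trace and a copy phase-delayed by 3 h
t = np.linspace(0, 72, 721)
pre = [np.cos(2 * np.pi * t / 24.0)]
post = [np.cos(2 * np.pi * (t - 3.0) / 24.0)]
print(phase_transition_curve(t, pre, post))
```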
Utilization of Android-based Smartphone to Support Handmade Spectrophotometer: A Preliminary Study
NASA Astrophysics Data System (ADS)
Ujiningtyas, R.; Apriliani, E.; Yohana, I.; Afrillianti, L.; Hikmah, N.; Kurniawan, C.
2018-04-01
A visible spectrophotometer is a powerful instrument in chemistry: chemical species can be identified by their characteristic color, and their amounts can then be determined with the spectrophotometer. However, the availability of visible spectrophotometers is still limited, particularly in education, which restricts students' hands-on experience with the instrumentation. On the other hand, communication technology gives students an opportunity to exploit their smartphones' features, chiefly the camera. The objective of this research is to develop an application that uses the camera as the detector for a handmade visible spectrophotometer. The software was written for Android, and we named it Spectrophone®. The spectrophotometer consists of an acrylic body, a sample compartment, and a light source (a USB LED lamp powered by a 6600 mAh battery). Before reaching the sample, the light is filtered through colored mica plastic. The Spectrophone® app uses the camera to detect color via its RGB composition; a colored solution shows a different RGB composition depending on its concentration and its specific absorbance wavelength. One color channel (R, G, or B) is then converted to an absorbance as -Log(Cs/Co), where Cs and Co are the channel values of the sample and the blank, respectively. A calibration curve of methylene blue was measured. Using the red (R) channel, the regression was not as linear (R2 = 0.78) as the result obtained with a UV-Vis spectrophotometer (Spectroquant Pharo 300, R2 = 0.8053). This result shows that the Spectrophone® still needs to be evaluated and corrected. One problem we identified is that the sampling spot used to read the RGB composition is too wide, which affects the color reading. We will address this problem and subsequently apply the Spectrophone® on a wider scale.
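The conversion described above, A = -Log(Cs/Co) applied to one RGB channel, is simple to reproduce. The sketch below is a minimal illustration of that calculation; the channel choice, helper names, and numeric values are hypothetical and not taken from the Spectrophone® app.

```python
import math

def absorbance(sample_rgb, blank_rgb, channel=0):
    """A = -log10(Cs / Co), where Cs and Co are the chosen channel values
    of the sample and the blank (0 = R, 1 = G, 2 = B)."""
    cs, co = sample_rgb[channel], blank_rgb[channel]
    if cs <= 0 or co <= 0:
        raise ValueError("channel values must be positive")
    return -math.log10(cs / co)

# Hypothetical red-channel readings for a methylene blue dilution series
blank = (220.0, 215.0, 210.0)
samples = [(200.0, 180.0, 205.0), (170.0, 150.0, 200.0), (140.0, 120.0, 195.0)]
print([round(absorbance(s, blank, channel=0), 3) for s in samples])
```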
Joint estimation of high resolution images and depth maps from light field cameras
NASA Astrophysics Data System (ADS)
Ohashi, Kazuki; Takahashi, Keita; Fujii, Toshiaki
2014-03-01
Light field cameras are attracting much attention as tools for acquiring 3D information of a scene through a single camera. The main drawback of typical lenslet-based light field cameras is their limited resolution. This limitation comes from the structure in which a microlens array is inserted between the sensor and the main lens. The microlens array projects the 4D light field onto a single 2D image sensor at the sacrifice of resolution; the angular resolution and the positional resolution trade off against each other under the fixed resolution of the image sensor. This fundamental trade-off remains after the raw light field image is converted to a set of sub-aperture images. The purpose of our study is to estimate a higher-resolution image from low-resolution sub-aperture images using a framework of super-resolution reconstruction. In this reconstruction, the sub-aperture images should be registered as accurately as possible; this registration is equivalent to depth estimation. Therefore, we propose a method in which super-resolution and depth refinement are performed alternately. Most of the processing in our method is implemented with image processing operations. We present several experimental results using a Lytro camera, in which we increased the resolution of a sub-aperture image by three times both horizontally and vertically. Our method produces clearer images than the original sub-aperture images and than the case without depth refinement.
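The alternating scheme summarized above — register the sub-aperture views via a disparity estimate, fuse them into a higher-resolution image, then use that image to refine the disparity — can be illustrated with a toy version. The sketch below assumes a single scalar disparity shared by all views, simple shift-and-average fusion, and a brute-force disparity search; these simplifications, and all function names, are assumptions for illustration, not the paper's reconstruction model.

```python
import numpy as np
from scipy.ndimage import shift, zoom

def fuse(views, offsets, disparity, scale=3):
    """Upsample each sub-aperture view, undo its disparity-scaled shift,
    and average the aligned views into one higher-resolution estimate."""
    aligned = []
    for v, (oy, ox) in zip(views, offsets):
        up = zoom(v, scale, order=1)
        aligned.append(shift(up, (-disparity * oy * scale, -disparity * ox * scale), order=1))
    return np.mean(aligned, axis=0)

def refine_disparity(hr, views, offsets, candidates, scale=3):
    """Re-estimate the disparity: pick the candidate whose shifted, downsampled
    rendering of the current high-resolution estimate best matches the views."""
    best, best_err = candidates[0], np.inf
    for d in candidates:
        err = 0.0
        for v, (oy, ox) in zip(views, offsets):
            pred = zoom(shift(hr, (d * oy * scale, d * ox * scale), order=1),
                        1.0 / scale, order=1)
            h, w = min(pred.shape[0], v.shape[0]), min(pred.shape[1], v.shape[1])
            err += np.mean((pred[:h, :w] - v[:h, :w]) ** 2)
        if err < best_err:
            best, best_err = d, err
    return best

def super_resolve(views, offsets, n_iter=3, scale=3):
    """Alternate between fusion (super-resolution) and disparity refinement."""
    d = 1.0  # initial disparity guess, in pixels per unit angular offset
    hr = fuse(views, offsets, d, scale)
    for _ in range(n_iter):
        d = refine_disparity(hr, views, offsets, np.linspace(0.5, 1.5, 11), scale)
        hr = fuse(views, offsets, d, scale)
    return hr, d
```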
Effect of Light Quality on Stomatal Opening in Leaves of Xanthium strumarium L.
Sharkey, Thomas D.; Raschke, Klaus
1981-01-01
Flux response curves were determined at 16 wavelengths of light for the conductance for water vapor of the lower epidermis of detached leaves of Xanthium strumarium L. An action spectrum of stomatal opening resulted in which blue light (wavelengths between 430 and 460 nanometers) was nearly ten times more effective than red light (wavelengths between 630 and 680 nanometers) in producing a conductance of 15 centimoles per square meter per second. Stomata responded only slightly to green light. An action spectrum of stomatal responses to red light corresponded to that of CO2 assimilation; the inhibitors of photosynthetic electron transport, cyanazine (2-chloro-4[1-cyano-1-methylethylamino]-6-ethylamino-s-triazine) and 3-(3,4-dichlorophenyl)-1,1-dimethylurea, eliminated the response to red light. This indicates that light absorption by chlorophyll is the cause of stomatal sensitivity to red light. Determination of flux response curves on leaves in the normal position (upper epidermis facing the light) or in the inverted position (lower epidermis facing the light) led to the conclusion that the photoreceptors for blue as well as for red light are located on or near the surfaces of the leaves; presumably they are in the guard cells themselves. PMID:16662069
Development of a camera casing suited for cryogenic and vacuum applications
NASA Astrophysics Data System (ADS)
Delaquis, S. C.; Gornea, R.; Janos, S.; Lüthi, M.; von Rohr, Ch Rudolf; Schenk, M.; Vuilleumier, J.-L.
2013-12-01
We report on the design, construction, and operation of a PID temperature-controlled, vacuum-tight camera casing. The camera casing contains a commercial digital camera and a lighting system. The design of the camera casing and its components is discussed in detail. Pictures taken by this cryo-camera while immersed in argon vapour and liquid nitrogen are presented. The cryo-camera can provide a live view inside cryogenic set-ups and makes it possible to record video.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinones, M.A.; Lu, Zhenmin; Zeiger, E.
1996-03-05
Fluorescence spectroscopy was used to characterize blue light responses from chloroplasts of adaxial guard cells from Pima cotton (Gossypium barbadense) and coleoptile tips from corn (Zea mays). The chloroplast response to blue light was quantified by measurements of the blue light-induced enhancement of a red light-stimulated quenching of chlorophyll a fluorescence. In adaxial (upper) guard cells, low fluence rates of blue light applied under saturating fluence rates of red light enhanced the red light-stimulated fluorescence quenching by up to 50%. In contrast, added blue light did not alter the red light-stimulated quenching from abaxial (lower) guard cells. This response pattern paralleled the blue light sensitivity of stomatal opening in the two leaf surfaces. An action spectrum for the blue light-induced enhancement of the red light-stimulated quenching showed a major peak at 450 nm and two minor peaks at 420 and 470 nm. This spectrum matched closely an action spectrum for blue light-stimulated stomatal opening. Coleoptile chloroplasts also showed an enhancement by blue light of red light-stimulated quenching. The action spectrum of this response, showing a major peak at 450 nm, a minor peak at 470 nm, and a shoulder at 430 nm, closely matched an action spectrum for blue light-stimulated coleoptile phototropism. Both action spectra match the absorption spectrum of zeaxanthin, a chloroplastic carotenoid recently implicated in blue light photoreception of both guard cells and coleoptiles. The remarkable similarity between the action spectra for the blue light responses of guard cell and coleoptile chloroplasts and the spectra for blue light-stimulated stomatal opening and phototropism, coupled to the recently reported evidence on a role of zeaxanthin in blue light photoreception, indicates that the guard cell and coleoptile chloroplasts specialize in sensory transduction.
Spectral Imaging of Portolan Charts
NASA Astrophysics Data System (ADS)
France, Fenella G.; Wilson, Meghan A.; Ghez, Anita
2018-05-01
Spectral imaging of portolan charts, early nautical charts, provided extensive new information about their construction and creation. The origins of the portolan chart style have been a continual source of perplexity to generations of cartographic historians. The spectral imaging system used incorporates a 50-megapixel monochrome camera with light-emitting diode (LED) illumination panels covering the range from 365 nm to 1050 nm to capture visible and non-visible information. Little is known about how portolan charts evolved and what influenced their creation. These early nautical charts began as working navigational tools of medieval mariners, initially made in the 1300s in Italy, Portugal and Spain; however, the origin and development of the portolan chart remained shrouded in mystery. Questions about these early navigational charts included whether the colorants were commensurate with the time period and geographical location and, if different, whether that gave insight into trade routes or possible later additions to the charts. For example, spectral data showed that the red pigment on both the 1320 portolan chart and the 1565 Galapagos Islands chart matched vermillion, an opaque red pigment used since antiquity. The construction of these charts was also of great interest. Spectral imaging with a range of illumination modes revealed the presence of a "hidden circle" often referred to in relation to their construction. This paper presents an in-depth analysis of how spectral imaging of the portolans revealed similarities and differences, uncovered new hidden information, and shed new light on their construction and composition.
Safety evaluation of intersections with dynamic use of exit-lanes for left-turn using field data.
Zhao, Jing; Liu, Yue
2017-05-01
As a newly proposed unconventional intersection design, the exit-lanes for left-turn (EFL) intersection has been found to be effective in increasing intersection capacity with a high level of application flexibility, especially under heavy left-turn traffic conditions. However, the operational safety of the EFL is of most concern to the authorities prior to its implementation. This paper evaluates the safety of EFL intersections by studying the behavior of left-turn maneuvers using field data collected at 7 locations in China. A total of 22,830 left-turning vehicles were captured, of which 9793 turned left using the mixed-usage area. Four potential safety problems, including red-light violations, head-on collision risks, trapped vehicles, and rear-end crash risks, were discussed. Statistical analyses were carried out to compare the safety risk between the EFL intersection and the conventional one. Results indicate that the safety problems of EFL intersections mainly lie in a higher percentage of red-light violations at the pre-signal (1.83% higher), wrong-way violations during peak hours (a violation rate of up to 11.07%), and lower travel speeds in the mixed-usage area (18.75% lower). Such risks can be counteracted, however, by providing more guiding information, installing cameras to detect and penalize violation maneuvers, and adjusting design parameter values for layout design and signal timing, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
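As a rough illustration of the kind of comparison reported above (e.g., the 1.83% difference in red-light violation rates), a two-proportion z-test can be used to compare violation rates between an EFL intersection and a conventional one. The counts below are hypothetical, and the choice of test is an assumption; the paper does not report its analysis in this form.

```python
import math
from scipy.stats import norm

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two violation proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * norm.sf(abs(z))

# Hypothetical violation counts out of observed left-turning vehicles
z, p = two_proportion_z_test(350, 9793, 180, 13037)
print(f"z = {z:.2f}, p = {p:.4g}")
```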
Orbital docking system centerline color television camera system test
NASA Technical Reports Server (NTRS)
Mongan, Philip T.
1993-01-01
A series of tests was run to verify that the design of the centerline color television camera (CTVC) system is adequate optically for the STS-71 Space Shuttle Orbiter docking mission with the Mir space station. In each test, a mockup of the Mir consisting of hatch, docking mechanism, and docking target was positioned above the Johnson Space Center's full fuselage trainer, which simulated the Orbiter with a mockup of the external airlock and docking adapter. Test subjects viewed the docking target through the CTVC under 30 different lighting conditions and evaluated target resolution, field of view, light levels, light placement, and methods of target alignment. Test results indicate that the proposed design will provide adequate visibility through the centerline camera for a successful docking, even with a reasonable number of light failures. It is recommended that the flight deck crew have individual switching capability for docking lights to provide maximum shadow management and that centerline lights be retained to deal with light failures and user preferences. Procedures for light management should be developed and target alignment aids should be selected during simulated docking runs.
Ghoneim, Ehab M
2014-01-01
To evaluate the use of red-free light for the measurement of intraocular pressure (IOP) using a Goldmann applanation tonometer without fluorescein. This cross-sectional study was carried out on 500 eyes of 250 patients attending the Ophthalmology Outpatient Clinic at Suez Canal University Hospital. The IOP was measured using a Goldmann applanation tonometer mounted on a Haag-Streit slit-lamp. The measurements were performed first using red-free light without fluorescein; they were then repeated with cobalt blue light and topical fluorescein on the same eyes. The mean IOP was 15.23 ± 3.3 (SD) mm Hg using red-free light without fluorescein, whereas it was 15.78 ± 3.7 (SD) mm Hg when measured using cobalt blue light after the application of fluorescein to the conjunctival sac. This difference was not statistically significant. Measurement of IOP with a Goldmann applanation tonometer using red-free light and without fluorescein is simple, saves time, and gives an accurate IOP measurement relative to the traditional technique with cobalt blue light and topical fluorescein.
Aliasing Detection and Reduction Scheme on Angularly Undersampled Light Fields.
Xiao, Zhaolin; Wang, Qing; Zhou, Guoqing; Yu, Jingyi
2017-05-01
When using a plenoptic camera for digital refocusing, angular undersampling can cause severe (angular) aliasing artifacts. Previous approaches have focused on avoiding aliasing by pre-processing the acquired light field via prefiltering, demosaicing, reparameterization, and so on. In this paper, we present a different solution that first detects and then removes angular aliasing at the light field refocusing stage. Different from previous frequency-domain aliasing analyses, we carry out a spatial-domain analysis to reveal whether angular aliasing will occur and to uncover where in the image it will occur. The spatial analysis also facilitates easy separation of aliasing from non-aliasing regions and removal of the angular aliasing. Experiments on both synthetic scenes and real light field data sets (camera array and Lytro camera) demonstrate that our approach has a number of advantages over the classical prefiltering and depth-dependent light field rendering techniques.
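A common spatial-domain intuition for where angular aliasing appears in shift-and-add refocusing — not necessarily the criterion used by the authors — is that ghosting occurs wherever the shift between adjacent angular samples, i.e. the pixel's disparity relative to the refocus plane, exceeds roughly one pixel. A minimal sketch of such a detection mask, with hypothetical names and data:

```python
import numpy as np

def angular_aliasing_mask(disparity, refocus_disparity, view_step=1.0, threshold=1.0):
    """Flag pixels whose per-adjacent-view shift |d - d0| * view_step exceeds
    ~1 pixel, where plain shift-and-add refocusing is expected to ghost."""
    return np.abs(disparity - refocus_disparity) * view_step > threshold

# Hypothetical disparity map (pixels between adjacent views), refocused at d0 = 0
d_map = np.linspace(-2.0, 2.0, 256).reshape(16, 16)
mask = angular_aliasing_mask(d_map, refocus_disparity=0.0)
print(f"{mask.mean():.0%} of pixels flagged as aliasing-prone")
```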
DOT National Transportation Integrated Search
1994-11-01
The objective of this study was to evaluate the effectiveness of using strobe lights in the red lens of traffic signals and, if appropriate, to recommend guidelines for their use. Strobe lights are used as a supplement to the red lens to draw the att...
Monostori, István; Heilmann, Márk; Kocsy, Gábor; Rakszegi, Marianna; Ahres, Mohamed; Altenbach, Susan B.; Szalai, Gabriella; Pál, Magda; Toldi, Dávid; Simon-Sarkadi, Livia; Harnos, Noémi; Galiba, Gábor; Darko, Éva
2018-01-01
The use of light-emitting diode (LED) technology for plant cultivation under controlled environmental conditions can result in significant reductions in energy consumption. However, there is still a lack of detailed information on the lighting conditions required for optimal growth of different plant species and the effects of light intensity and spectral composition on plant metabolism and nutritional quality. In the present study, wheat plants were grown under six regimens designed to compare the effects of LED and conventional fluorescent lights on growth and development, leaf photosynthesis, thiol and amino acid metabolism, as well as grain yield and flour quality of wheat. Benefits of LED light sources over fluorescent lighting were manifested in both yield and quality of wheat. Elevated light intensities made possible with LEDs increased photosynthetic activity, the number of tillers, biomass and yield. At lower light intensities, blue, green and far-red light operated antagonistically during the stem elongation period. High photosynthetic activity was achieved when at least 50% red light was applied during cultivation. A high proportion of blue light prolonged the juvenile phase, while the shortest flowering time was achieved when the blue to red ratio was around one. Blue and far-red light affected the glutathione- and proline-dependent redox environment in leaves. LEDs, especially in the Blue, Pink and Red Low Light (RedLL) regimens, improved flour quality by modifying starch and protein content, dough strength and extensibility, as demonstrated by the ratios of high to low molecular weight glutenins, ratios of glutenins to gliadins and gluten spread values. These results clearly show that LEDs are efficient for experimental wheat cultivation and make it possible to optimize growth conditions and to manipulate metabolism, yield and quality through modification of light quality and quantity. PMID:29780400