These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Latest Results of 3D Topographic Mapping Using Lunar Reconnaissance Orbiter Narrow-Angle Camera Data  

NASA Astrophysics Data System (ADS)

This abstract presents the latest research results and quantitative analysis of topographic mapping using Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NAC) conducted at the Ohio State University.

Li, R.; Wang, W.; He, S.; Yan, L.; Meng, X.; Crawford, J.; Robinson, M. S.; Tran, T.; Archinal, B. A.; Lroc Team

2011-03-01

2

Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera  

NASA Astrophysics Data System (ADS)

One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitken Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET® from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in the examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature Geoscience, in press. [2] Lawrence et al. (2011) LPSC XLII, Abst 2228. [3] Garry et al. (2011) LPSC XLII, Abst 2605.

Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

2011-12-01
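
The record above describes coregistering NAC DTMs to LOLA altimetry to improve absolute accuracy. As a hedged illustration of the idea only (not the ISIS/SOCET SET workflow used by the LROC team), the sketch below estimates a constant vertical bias between a gridded DTM and a set of LOLA spot elevations by sampling the DTM at each spot and taking the median residual; the grid geometry, variable names, and the constant-offset model are assumptions made for illustration.

```python
import numpy as np

def sample_dtm(dtm, x0, y0, dx, dy, xs, ys):
    """Bilinearly sample a gridded DTM (row 0 at y0, col 0 at x0) at map coordinates (xs, ys)."""
    cols = (np.asarray(xs) - x0) / dx
    rows = (np.asarray(ys) - y0) / dy
    r0, c0 = np.floor(rows).astype(int), np.floor(cols).astype(int)
    fr, fc = rows - r0, cols - c0
    z00, z01 = dtm[r0, c0], dtm[r0, c0 + 1]
    z10, z11 = dtm[r0 + 1, c0], dtm[r0 + 1, c0 + 1]
    return (z00 * (1 - fr) * (1 - fc) + z01 * (1 - fr) * fc
            + z10 * fr * (1 - fc) + z11 * fr * fc)

def vertical_bias_to_lola(dtm, x0, y0, dx, dy, lola_x, lola_y, lola_z):
    """Median vertical offset (DTM minus LOLA) -- a toy stand-in for full coregistration."""
    dtm_z = sample_dtm(dtm, x0, y0, dx, dy, lola_x, lola_y)
    return np.median(dtm_z - lola_z)

# Tiny synthetic check: a tilted 100 m x 100 m DTM at 2 m/post carrying a 5 m bias.
xx, yy = np.meshgrid(np.arange(0, 100, 2.0), np.arange(0, 100, 2.0))
dtm = 1700.0 + 0.01 * xx + 0.02 * yy + 5.0            # biased DTM
lx, ly = np.array([10.0, 40.0, 70.0]), np.array([20.0, 50.0, 80.0])
lz = 1700.0 + 0.01 * lx + 0.02 * ly                   # "LOLA" spots on the true surface
print(vertical_bias_to_lola(dtm, 0.0, 0.0, 2.0, 2.0, lx, ly, lz))  # ~5.0
```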

3

Narrow Angle movie  

NASA Technical Reports Server (NTRS)

This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, they show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers). It has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

1999-01-01

4

The narrow angle camera of the MPCS suite for the MarcoPolo ESA Mission: requirements and optical design solutions  

NASA Astrophysics Data System (ADS)

Possible optical designs of a Narrow Angle Camera (NAC) suitable as the high-resolution channel of the MarcoPolo Camera System for the MarcoPolo ESA mission are presented. The MarcoPolo mission objective is the rendezvous with a Near Earth Asteroid in order to fully characterize the body, land on the surface, and return to Earth a sample of the asteroid soil. Science goals for the NAC are global mapping of the object, detailed investigations of the surface at high spatial resolution (order of millimeters), and deep examination of possible landing sites from a close distance. The instrument has a 3"/pixel scale factor, corresponding to 80 mm/px at 5 km from the surface, over a 1.75° × 1.75° FoV; imaging in 5 to 8 different spectral bands (panchromatic and broadband), in the range between 400 and 900 nm, is foreseen. Since the target is an extended low-contrast object, to avoid image contrast degradation only off-axis unobstructed optical layouts have been considered. Solutions with two mirrors plus a refractive corrector, or all-reflective three-mirror ones, have been studied; both allow good aberration balancing over the whole field of view: the diffraction ensquared energy inside one pixel of the detector is of the order of 70%. To cope with the hazardous radiation environment in which the spacecraft will be immersed during the mission, all the glasses selected for the design are of the rad-hard type.

da Deppo, Vania; Cremonese, Gabriele; Naletto, Giampiero

2010-07-01
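
The pixel scale quoted in the record above (3 arcsec/pixel, roughly 80 mm/pixel at 5 km) can be checked with small-angle arithmetic; the short sketch below does this and also derives the approximate detector format implied by the 1.75° field of view. The numbers are taken from the abstract; treating the quoted ~80 mm/px as a rounded value is my assumption.

```python
import math

PIXEL_SCALE_ARCSEC = 3.0   # instantaneous field of view per pixel (from the abstract)
FOV_DEG = 1.75             # full field of view, square (from the abstract)
RANGE_M = 5000.0           # distance to the asteroid surface (from the abstract)

pixel_scale_rad = math.radians(PIXEL_SCALE_ARCSEC / 3600.0)
gsd_mm = RANGE_M * pixel_scale_rad * 1000.0            # small-angle approximation
pixels_across = FOV_DEG * 3600.0 / PIXEL_SCALE_ARCSEC

print(f"ground sampling at 5 km: {gsd_mm:.0f} mm/pixel")          # ~73 mm, consistent with the quoted ~80 mm/px
print(f"pixels across the 1.75 deg field: {pixels_across:.0f}")   # ~2100, i.e. roughly a 2k x 2k detector
```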

5

High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images  

USGS Publications Warehouse

We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER) and to assess the accuracy of our results by a variety of tests. The mapping techniques developed also support geoscientific studies and can be used with all present and planned Mars-orbiting scanner cameras. Photogrammetric analysis of MOC stereopairs yields DEMs with 3-pixel (typically 10 m) horizontal resolution, vertical precision consistent with ≈0.22 pixel matching errors (typically a few meters), and slope errors of 1-3°. These DEMs are controlled to the Mars Orbiter Laser Altimeter (MOLA) global data set and consistent with it at the limits of resolution. Photoclinometry yields DEMs with single-pixel (typically ≈3 m) horizontal resolution and submeter vertical precision. Where the surface albedo is uniform, the dominant error is 10-20% relative uncertainty in the amplitude of topography and slopes after "calibrating" photoclinometry against a stereo DEM to account for the influence of atmospheric haze. We mapped portions of seven candidate MER sites and the Mars Pathfinder site. Safety of the final four sites (Elysium, Gusev, Isidis, and Meridiani) was assessed by mission engineers by simulating landings on our DEMs of "hazard units" mapped in the sites, with results weighted by the probability of landing on those units; summary slope statistics show that most hazard units are smooth, with only small areas of etched terrain in Gusev crater posing a slope hazard.

Kirk, R.L.; Howington-Kraus, E.; Redding, B.; Galuszka, D.; Hare, T.M.; Archinal, B.A.; Soderblom, L.A.; Barrett, J.M.

2003-01-01
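
The abstract above relates DEM vertical precision to a ~0.22-pixel image matching error. A common first-order photogrammetric relation is EP ≈ σ_match · GSD / (p/h), where p/h is the stereo parallax-to-height ratio; the sketch below evaluates it for representative MOC-NA numbers. The parallax-to-height ratio used here is an assumed illustrative value, not one quoted in the abstract.

```python
def expected_vertical_precision(matching_error_px, gsd_m, parallax_to_height):
    """First-order expected vertical precision of a stereo DEM (meters)."""
    return matching_error_px * gsd_m / parallax_to_height

# Representative values: 0.22 px matching error, ~3 m MOC-NA pixel scale,
# and an assumed parallax-to-height ratio of 0.3 for the stereo geometry.
ep = expected_vertical_precision(0.22, 3.0, 0.3)
print(f"expected vertical precision: {ep:.1f} m")   # ~2.2 m, i.e. 'a few meters'
```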

6

3-D Reconstruction from Narrow-Angle Radiographs  

NASA Astrophysics Data System (ADS)

During in-service inspections, experts are faced with the delicate task of establishing a complete diagnostic of defects from radiographs. This paper presents a 3-D reconstruction processing technique in conditions representative of pipe inspections: the incidence angle is very narrow. The reconstruction process relies on the estimation of the attenuation, also called inversion because it restores the attenuation from both data and prior information. The attenuation volume allows the characterization of the defects.

Fournier, L.; Châtellier, L.; Charbonnier, B.; Chassignole, B.

2005-04-01

7

Viscogonioplasty in narrow angle glaucoma: a randomized controlled trial  

PubMed Central

Purpose: To determine the effect of viscogonioplasty and cataract extraction on intraocular pressure in patients with narrow angle glaucoma. Methods: This was a double-masked randomized controlled trial involving 50 eyes (25 cases and 25 controls) from 38 consecutive patients. All patients underwent phacoemulsification with or without viscogonioplasty. The main outcome measures were intraocular pressure post-treatment and number of glaucoma medications post-treatment. Results: Cases had a greater reduction in intraocular pressure than controls, with a mean intraocular pressure (standard deviation) at 12 months of 13.7 (±2.89) mmHg compared with 16.2 (±3.55) mmHg in controls (P = 0.009). Cases had a greater reduction in mean number of antiglaucoma medications than controls at 12-month review, with 13 of 25 eyes (52%) of cases controlled without any antiglaucoma therapy versus 9 of 25 (36%) of the controls (P = 0.005). Conclusions: Viscogonioplasty combined with cataract extraction has a significantly greater effect than cataract extraction alone on lowering intraocular pressure in patients with poorly controlled narrow angle glaucoma and should therefore be considered as a treatment option for patients with this condition. PMID:21191443

Varma, Deepali; Adams, Wendy; Bunce, Catey; Phelan, Peter; Fraser, Scott

2010-01-01

8

WIDE-ANGLE, NARROW-ANGLE, AND IMAGING BASELINES OF OPTICAL LONG-BASELINE INTERFEROMETERS  

SciTech Connect

For optical interferometers, the baseline is typically defined as the vector joining two perfectly identical telescopes. However, when the telescopes are naturally different or when the requirements on the baseline vector challenge the telescope perfection, the baseline definition depends on how the interferometer is used. This is where the notions of wide-angle, narrow-angle, and imaging baselines come into play. This article explores this variety of baselines, with the purpose of presenting a coherent set of definitions, describing how they relate to each other, and suggesting baseline metrology requirements. Ultimately, this work aims at supporting upcoming long-baseline optical interferometers with narrow-angle astrometry and phase-referenced imaging capabilities at the microarcsecond level.

Woillez, J. [W. M. Keck Observatory, 65-1120 Mamalahoa Highway, Kamuela, HI 96743 (United States)]; Lacour, S. [Observatoire de Paris, Place Jules Janssen, F-92195 Meudon (France)], E-mail: jwoillez@keck.hawaii.edu, E-mail: sylvestre.lacour@obspm.fr

2013-02-10

9

3-D Reconstruction from Narrow-Angle Radiographs  

NASA Astrophysics Data System (ADS)

So as to detect and characterize potential defects in pipes, inspections are carried out with the help of non-destructive examination (NDE) techniques including x- or γ-radiography. Should a defect be detected, one may be asked to prove that the component can still withstand the mechanical loads. In such cases of expertise, the use of a 3-D reconstruction processing technique can be very useful. One characteristic of such applications is that, in general, the number and angular range of projections are very limited and the data are very noisy, so classical tomography algorithms cannot solve the problem. In this work, we study two reconstruction methods that take the specificities of radiographic inspection into account in two different ways: a reconstruction technique based on an a priori model (Markov-Potts), and a binary technique, called "BLMR", that constrains the solution to be either 0 or 1. This paper focuses on first results obtained on simulated data and on real data corresponding to a mock-up with several electro-dynamically manufactured cylindrical defects.

Fournier, L.; Châtellier, L.; Peureux, P.; Mohammad-Djafari, A.; Idier, J.

2008-02-01

10

Improved iris localization by using wide and narrow field of view cameras for iris recognition  

NASA Astrophysics Data System (ADS)

Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.

Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

2013-10-01
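
The abstract above estimates the user-to-camera distance Z from the iris size in the WFOV image and anthropometric iris size, and then selects a geometric transformation matrix according to that Z. A minimal pinhole-camera sketch of both steps follows; the focal length, the ~11.7 mm anthropometric iris diameter, the Z bins, and the placeholder matrices are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

IRIS_DIAMETER_MM = 11.7          # assumed anthropometric mean iris diameter
FOCAL_LENGTH_PX = 1200.0         # assumed WFOV camera focal length in pixels

def estimate_z_mm(iris_diameter_px):
    """Pinhole-camera range estimate: Z = f * D_real / d_image."""
    return FOCAL_LENGTH_PX * IRIS_DIAMETER_MM / iris_diameter_px

# Assumed per-distance transformations mapping WFOV pixel coords to NFOV pixel coords.
Z_BINS_MM = np.array([250.0, 300.0, 350.0])
TRANSFORMS = [np.eye(3), np.eye(3) * 1.05, np.eye(3) * 1.10]   # placeholders only

def wfov_to_nfov(xy_wfov, iris_diameter_px):
    """Map a WFOV point into the NFOV image using the matrix for the estimated Z."""
    z = estimate_z_mm(iris_diameter_px)
    h = TRANSFORMS[int(np.argmin(np.abs(Z_BINS_MM - z)))]
    p = h @ np.array([xy_wfov[0], xy_wfov[1], 1.0])
    return p[:2] / p[2], z

center_nfov, z_est = wfov_to_nfov((640.0, 360.0), iris_diameter_px=47.0)
print(f"estimated Z: {z_est:.0f} mm, NFOV search center: {center_nfov}")
```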

11

13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4'=1'-0' (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

12

10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

13

Data analysis of narrow-angle field dependent tests in the MAM testbed interferometer  

NASA Astrophysics Data System (ADS)

The Microarcsecond Metrology Testbed (MAM) developed at the Jet Propulsion Laboratory is a single-baseline interferometer coupled with a precision pseudostar. It is designed to test the ability of the SIM science interferometer to perform microarcsecond stellar astrometry over both narrow-angle (1 degree) and wide-angle (7.5 degree) fields. The MAM testbed features an optical interferometer with a white light source, all major optical components of a stellar interferometer, and heterodyne metrology sensors. This paper describes the performance metric used to evaluate our narrow-angle field-dependent data and presents the results of the analysis. The narrow-angle three-star observation scenario implemented in the MAM testbed consists of one target (T) star and two accompanying reference stars (R1, R2), which are 1 degree apart horizontally from the target star. Observations of the target (science) star and the reference stars are interlaced (R1, T, R2, T, repeat) in order to remove temporal and spatial drifts between consecutive measurements of the target star. The total observation time for the target star is twice that of the two reference companions. Cyclic averaging was implemented in our observations in addition to interlacing. The least squares algorithm tested in our field-independent measurements is applied to solve for the delays (or paths) of the target star and the two reference stars. A super chop variance was adopted as our performance metric. This super chop variance removes the drifts of the target star path from its path differences with respect to the two reference stars. Recent data are presented which demonstrate agreement between the metrology and starlight paths to better than 150 pm in the three-star narrow-angle field of view. The research described was performed at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

Shen, T. J.; Goullioud, R.; Catanzarite, J.; Shao, M.; Yu, J.; Machuzak, R.

2002-12-01

14

Accuracy and repeatability of joint angles measured using a single camera markerless motion capture system.  

PubMed

Markerless motion capture systems have been developed in an effort to evaluate human movement in a natural setting. However, the accuracy and reliability of these systems remain understudied. Therefore, the goals of this study were to quantify the accuracy and repeatability of joint angles using a single-camera markerless motion capture system and to compare the markerless system performance with that of a marker-based system. A jig was placed in multiple static postures with marker trajectories collected using a ten-camera motion analysis system. Depth and color image data were simultaneously collected from a single Microsoft Kinect camera, which was subsequently used to calculate virtual marker trajectories. A digital inclinometer provided a measure of ground-truth for sagittal and frontal plane joint angles. Joint angles were calculated with marker data from both motion capture systems using successive body-fixed rotations. The sagittal and frontal plane joint angles calculated from the marker-based and markerless system agreed with inclinometer measurements by <0.5°. The systems agreed with each other by <0.5° for sagittal and frontal plane joint angles and <2° for transverse plane rotation. Both systems showed a coefficient of reliability <0.5° for all angles. These results illustrate the feasibility of a single-camera markerless motion capture system to accurately measure lower extremity kinematics and provide a first step in using this technology to discern clinically relevant differences in the joint kinematics of patient populations. PMID:24315287

Schmitz, Anne; Ye, Mao; Shapiro, Robert; Yang, Ruigang; Noehren, Brian

2014-01-22
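
The abstract above computes sagittal- and frontal-plane joint angles from marker trajectories using successive body-fixed rotations. As a much-simplified, hedged stand-in, the sketch below computes a sagittal-plane knee flexion angle as the angle between thigh and shank vectors projected onto an assumed sagittal (X-Z) plane; the marker names, lab axes, and projection shortcut are assumptions for illustration and not the paper's protocol.

```python
import numpy as np

def sagittal_angle_deg(proximal, joint, distal, sagittal_axes=(0, 2)):
    """Angle (deg) at `joint` between proximal and distal segments,
    projected onto an assumed sagittal plane (default: lab X-Z)."""
    i, k = sagittal_axes
    thigh = np.array([proximal[i] - joint[i], proximal[k] - joint[k]])
    shank = np.array([distal[i] - joint[i], distal[k] - joint[k]])
    cosang = thigh @ shank / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical hip, knee, and ankle marker positions (meters) in lab coordinates.
hip   = np.array([0.00, 0.10, 0.90])
knee  = np.array([0.05, 0.10, 0.50])
ankle = np.array([0.00, 0.10, 0.10])
flexion = 180.0 - sagittal_angle_deg(hip, knee, ankle)   # 0 deg = fully extended
print(f"knee flexion: {flexion:.1f} deg")
```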

15

Phase referencing and narrow-angle astrometry in current and future interferometers  

NASA Astrophysics Data System (ADS)

Atmospheric turbulence is a serious problem for ground-based interferometers. It places tight limits on both sensitivity and measurement precision. Phase referencing is a method to overcome these limitations via the use of a bright reference star. The Palomar Testbed Interferometer was designed to use phase referencing and so can provide a pair of phase-stabilized starlight beams to a second (science) beam combiner. We have used this capability for several interesting studies, including very narrow angle astrometry. For close (1-arcsecond) pairs of stars we are able to achieve a differential astrometric precision in the range 20--30 micro-arcseconds.

Lane, Benjamin F.; Muterspaugh, Matthew W.

2004-10-01

16

Narrow-angle laser scanning microscope system for linewidth measurements on wafers  

NASA Astrophysics Data System (ADS)

The integrated-circuit industry in its push to finer and finer line geometries approaching submicrometer dimensions has created a need for ever more accurate and precise feature-size measurements to establish tighter control of fabrication processes. In conjunction with the NBS Semiconductor Linewidth Metrology Program, a unique narrow-angle laser measurement system was developed. The report describes the theory, optical design, and operation of the system and includes computer software useful for characterizing the pertinent optical parameters and images for patterned thin layers. For thick layers, the physics is more complex and only elements of the theory are included. For more detail the reader is referred to several related reports listed in the references.

Nyyssonen, Diane

1989-04-01

17

Narrow Angle Wide Spectral Range Radiometer Design FEANICS/REEFS Radiometer Design Report  

NASA Technical Reports Server (NTRS)

A critical measurement for the Radiative Enhancement Effects on Flame Spread (REEFS) microgravity combustion experiment is the net radiative flux emitted from the gases and from the solid fuel bed. These quantities are measured using a set of narrow angle, wide spectral range radiometers. The radiometers are required to have an angular field of view of 1.2 degrees and measure over the spectral range of 0.6 to 30 microns, which presents a challenging design effort. This report details the design of this radiometer system including field of view, radiometer response, radiometric calculations, temperature effects, error sources, baffling and amplifiers. This report presents some radiometer specific data but does not present any REEFS experiment data.

Camperchioli, William

2005-01-01

18

Large-angle pinhole gamma camera with depth-of-interaction detector for contamination monitoring  

NASA Astrophysics Data System (ADS)

The gamma camera system was designed for monitoring medical areas such as a radiopharmaceutical preparation lab or a patient waiting room (after source injection) in a division of nuclear medicine. However, gamma cameras equipped with a large-angle pinhole collimator and a thick monolithic crystal suffer from degradation of the spatial resolution in the periphery region due to parallax error from obliquely incident photons. To improve the uniformity of the spatial resolution across the field of view (FOV), we proposed a three-layer crystal detector with a maximum-likelihood position-estimation (MLPE) method, which can measure depth-of-interaction (DOI) information. The aim of this study was to develop and evaluate the performance of the new detector experimentally. The proposed detector employed three layers of monolithic CsI(Tl) crystals, each of which is 50.0×50.0×2.0 mm³, and a large-angle pinhole collimator with an acceptance angle of 120°. The bottom surface of the third layer was directly coupled to an 8×8 channel position-sensitive photomultiplier tube (PSPMT, Hamamatsu H8500C). The PSPMT was read out using a resistive charge divider, which multiplexes 64 anodes into 8(X)+8(Y) channels. A Gaussian-based MLPE method has been implemented using experimentally measured detector response functions (DRFs). A Tc-99m point source was imaged at different positions with and without DOI measurements. Experimental results showed that the spatial resolution degraded gradually as the source moved from the center to the periphery of the FOV without DOI information, but the DOI detector showed a marked improvement in spatial resolution, especially off-center, by correcting the parallax error. The new detector with DOI capability proved to characterize the gamma event position reliably, with high and uniform spatial resolution, so that the large-angle pinhole gamma camera could be a useful tool in contamination monitoring.

Baek, Cheol-Ha; Kim, Hyun-Il; Hwang, Ji Yeon; Jung An, Su; Kim, Kwang Hyun; Kwak, Sung-Woo; Chung, Yong Hyun

2011-08-01
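
The abstract above describes a Gaussian-based maximum-likelihood position estimation (MLPE) using measured detector response functions (DRFs). The sketch below shows the generic form of such an estimator: for each candidate position, the per-channel measured signals are scored against Gaussian DRFs (mean and sigma per channel), and the highest-likelihood candidate wins. The DRF arrays, candidate grid, and channel count are illustrative assumptions, not the measured DRFs of the paper.

```python
import numpy as np

def mlpe(charges, drf_mean, drf_sigma):
    """Gaussian maximum-likelihood position estimation.

    charges   : (n_channels,) measured signals for one event
    drf_mean  : (n_positions, n_channels) mean channel response per candidate position
    drf_sigma : (n_positions, n_channels) response standard deviation per candidate position
    returns   : index of the most likely candidate position
    """
    resid = (charges[None, :] - drf_mean) / drf_sigma
    loglik = -0.5 * np.sum(resid**2 + 2.0 * np.log(drf_sigma), axis=1)
    return int(np.argmax(loglik))

# Toy example: 3 candidate positions, 4 readout channels.
drf_mean = np.array([[10., 5., 2., 1.],
                     [ 4., 9., 4., 2.],
                     [ 1., 3., 8., 6.]])
drf_sigma = np.full_like(drf_mean, 1.5)
event = np.array([3.8, 9.2, 4.1, 2.2])        # most consistent with position 1
print(mlpe(event, drf_mean, drf_sigma))        # -> 1
```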

19

Optical design of the wide angle camera for the Rosetta mission.  

PubMed

The final optical design of the Wide Angle Camera for the Rosetta mission to the P/Wirtanen comet is described. This camera is an F/5.6 telescope with a rather large 12 degrees x 12 degrees field of view. To satisfy the scientific requirements for spatial resolution, contrast capability, and spectral coverage, a two-mirror, off-axis, and unobstructed optical design, believed to be novel, has been adopted. This configuration has been simulated with a ray-tracing code, showing that theoretically more than 80% of the collimated beam energy falls within a single pixel (20" x 20") over the whole camera field of view and that the possible contrast ratio is smaller than 1/1000. Moreover, this novel optical design is rather simple from a mechanical point of view and is compact and relatively easy to align. All these characteristics make this type of camera rather flexible and also suitable for other space missions with similar performance requirements. PMID:11900025

Naletto, Giampiero; Da Deppo, Vania; Pelizzo, Maria Guglielmina; Ragazzoni, Roberto; Marchetti, Enrico

2002-03-01

20

SELF-CALIBRATION OF CENTRAL CAMERAS BY MINIMIZING ANGULAR ERROR  

E-print Network

... The proposed method can hence be applied to a large range of cameras, from narrow-angle to fish-eye lenses. ... of camera lenses is a significant problem in the analysis of digital images (Hartley and Kang, 2005) ... is usable for many narrow-angle lenses but it is not sufficient for omnidirectional cameras which may have ...

Brandt, Sami

21

Calibration of the Lunar Reconnaissance Orbiter Camera  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with

M. Tschimmel; M. S. Robinson; D. C. Humm; B. W. Denevi; S. J. Lawrence; S. Brylow; M. Ravine; T. Ghaemi

2008-01-01

22

Mesosphere light scattering depolarization during the Perseids activity epoch by wide-angle polarization camera measurements  

NASA Astrophysics Data System (ADS)

The paper describes the study of the scattered radiation field in the mesosphere based on wide-angle polarization camera (WAPC) measurements of the twilight sky background and a single-scattering separation procedure. Mid-August observations in 2012 and 2013 show a decrease of the single-scattering polarization value, probably related to Perseid meteor dust moderation in the upper mesosphere. The effect correlates with the activity of a tiny fraction of the Perseid shower. Polarization and temperature analysis allows estimating the altitude of the dust layer and the polarization character of dust scattering.

Ugolnikov, Oleg S.; Maslov, Igor A.

2014-03-01

23

Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that

M. S. Robinson; S. M. Brylow; M. Tschimmel; D. Humm; S. J. Lawrence; P. C. Thomas; B. W. Denevi; E. Bowman-Cisneros; J. Zerr; M. A. Ravine; M. A. Caplinger; F. T. Ghaemi; J. A. Schaffner; M. C. Malin; P. Mahanti; A. Bartels; J. Anderson; T. N. Tran; E. M. Eliason; A. S. McEwen; E. Turtle; B. L. Jolliff; H. Hiesinger

2010-01-01

24

A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets  

NASA Technical Reports Server (NTRS)

The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. Third, early results and a technique that can duplicate those results throughout the mission will give the analysts important experience in the proper use and calibration of SIM.

Shaklan, Stuart; Pan, Xiaopei

2004-01-01

25

Small-angle approximation to the transfer of narrow laser beams in anisotropic scattering media  

NASA Technical Reports Server (NTRS)

The broadening of, and the detected signal power from, a laser beam traversing an anisotropic scattering medium were examined using the small-angle approximation to the radiative transfer equation, in which photons suffering large-angle deflections are neglected. To obtain tractable answers, simple Gaussian and non-Gaussian functions for the scattering phase functions are assumed. Two other approximate approaches employed in the field to further simplify the small-angle approximation solutions are described, and the results obtained by one of them are compared with those obtained using the small-angle approximation. An exact method for obtaining the contribution of each higher-order scattering to the radiance field is examined, but no results are presented.

Box, M. A.; Deepak, A.

1981-01-01

26

Asteroidal background for the Wide Angle Camera of the Rosetta Mission  

NASA Astrophysics Data System (ADS)

This paper aims to evaluate the background of known asteroids present in the field of the Wide Angle Camera (WAC) of the Rosetta Mission during the flybys of (4979) Otawara and (140) Siwa. The field of view (FoV) is very wide, namely 12×12 square degrees, and the limiting magnitude in the R filter for a 60-second exposure is around 10; therefore the probability of asteroids being in the image is not negligible. In order to determine the objects present in the FoV, a program has been developed to derive the rectangular ecliptic coordinates of all numbered asteroids (27,654, derived from the MPC database). The same coordinates for the spacecraft were kindly provided by ESOC. The calculations cover 7 to 13 July 2006 for the Otawara flyby, and 18 to 26 July 2008 for the Siwa flyby. We have identified 11 objects down to 13th magnitude that will be present in the WAC FoV during the Otawara flyby (2 down to 11th magnitude) and 12 objects during the Siwa flyby (1 down to 11th magnitude). In the near future we plan to extend our search to all known asteroids (even non-numbered ones) in the MPC database. This work has been partly supported by the Italian Space Agency (ASI).

Barbieri, C.; Bernardi, F.; Bertini, I.

2001-11-01
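
The computation sketched in the abstract above amounts to asking, for each asteroid, whether its direction as seen from the spacecraft falls inside the WAC's 12° × 12° square field of view (and is bright enough). A hedged geometric sketch of that test follows; the boresight/axis construction and the sample vectors are assumptions for illustration, and no ephemeris computation is included.

```python
import numpy as np

HALF_FOV_DEG = 6.0   # half of the 12 deg x 12 deg WAC field

def in_square_fov(target_vec, boresight, up_hint=np.array([0.0, 0.0, 1.0])):
    """True if the unit direction to the target lies in a square FoV around the boresight."""
    b = boresight / np.linalg.norm(boresight)
    t = target_vec / np.linalg.norm(target_vec)
    x = np.cross(up_hint, b); x /= np.linalg.norm(x)   # camera 'horizontal' axis
    y = np.cross(b, x)                                 # camera 'vertical' axis
    along = t @ b
    if along <= 0.0:
        return False
    # angular offsets along the two detector axes
    ax = np.degrees(np.arctan2(t @ x, along))
    ay = np.degrees(np.arctan2(t @ y, along))
    return abs(ax) <= HALF_FOV_DEG and abs(ay) <= HALF_FOV_DEG

boresight = np.array([1.0, 0.0, 0.0])
asteroid_dir = np.array([1.0, 0.08, -0.05])    # ~4.6 and ~2.9 deg off-axis
print(in_square_fov(asteroid_dir, boresight))  # True
```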

27

Numerical simulations of the bending of narrow-angle-tail radio jets by ram pressure or pressure gradients  

NASA Technical Reports Server (NTRS)

Three-dimensional numerical hydrodynamic simulations are used to study the bending of radio jets. The simulations are compared with observations of jets in narrow-angle-tail radio sources. Two mechanisms for the observed bending are considered: direct bending of quasi-continuous jets by ram pressure from intergalactic gas and bending by pressure gradients in the interstellar gas of the host galaxy, the pressure gradients themselves being the result of ram pressure by intergalactic gas. It is shown that the pressure gradients are much less effective in bending jets, implying that the jets have roughly 30 times lower momentum fluxes if they are bent by this mechanism. Ram-pressure bending produces jets with 'kidney-shaped' cross sections; when observed from the side, these jets appear to have diffuse extensions on the downstream side. On the other hand, pressure-gradient bending causes the jets to be densest near their upstream side.

Soker, Noam; Sarazin, Craig L.; O'Dea, Christopher P.

1988-01-01

28

Erratum: The Wide Angle Camera of the ROSETTA Mission [Mem.SAIt 74, 434-435 (2003)]  

NASA Astrophysics Data System (ADS)

The authors acknowledge that the paper fails to convey the correct information about the respective contributions and roles of the partners of the OSIRIS consortium. In particular, the hardware contributions of the Max-Planck Institut für Sonnensystemforschung, MPS, (Katlenburg Lindau, Germany, formerly MPAe), of the Instituto de Astrofisica de Andalucia (Granada, Spain), of the Department of Astronomy and Space Physics of Uppsala University (DASP), of ESA Research and Scientific Support Department (ESA/RSSD) to the Wide Angle Camera has not been mentioned or incorrectly expounded. The overall responsibility (PI ship) of MPS (MPAe) for OSIRIS and hence for the Wide Angle Camera is not correctly mentioned either. The correct information is given in the paper by Keller et al. (2006, Space Science Review, in press). The authors take this opportunity to acknowledge that the activity of the Italian team has been partly supported by the Italian Space Agency ASI through a contract to CISAS.

Barbieri, C.; Fornasier, S.; Verani, S.; Bertini, I.; Lazzarin, M.; Rampazzi, F.; Cremonese, G.; Ragazzoni, R.; Marzari, F.; Angrilli, F.; Bianchini, G. A.; Debei, S.; Dececco, M.; Guizzo, G.; Parzianello, G.; Ramous, P.; Saggin, B.; Zaccariotto, M.; da Deppo, V.; Naletto, G.; Nicolosi, G.; Pelizzo, M. G.; Tondello, G.; Brunello, P.; Peron, F.

29

Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle.  

SciTech Connect

Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes × 70° and 5 holes × 35°) with 103-µm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center. Spray-visualization movies, acquired using a high-speed camera, show that impingement of liquid fuel on the piston surface can result when the in-cylinder temperature and density at the time of injection are sufficiently low. Seven single- and two-parameter sweeps around a 4.82-bar gross indicated mean effective pressure load point were performed to map the sensitivity of the combustion and emissions to variations in injection timing, injection pressure, equivalence ratio, simulated exhaust-gas recirculation, intake temperature, intake boost pressure, and load. High-speed movies of natural luminosity were acquired by viewing through a window in the cylinder wall and through a window in the piston to provide quasi-3D information about the combustion process. These movies revealed that advanced combustion phasing resulted in intense pool fires within the piston bowl, after the end of significant heat release. These pool fires are a result of fuel-films created when the injected fuel impinged on the piston surface. The emissions results showed a strong correlation with pool-fire activity. Smoke and NOx emissions rose steadily as pool-fire intensity increased, whereas HC and CO showed a dramatic increase with near-zero pool-fire activity.

Gehrke, Christopher R. (Caterpillar Inc.); Radovanovic, Michael S. (Caterpillar Inc.); Milam, David M. (Caterpillar Inc.); Martin, Glen C.; Mueller, Charles J.

2008-04-01

30

Observations of Comet 9P\\/Tempel 1 around the Deep Impact event by the OSIRIS cameras onboard Rosetta  

Microsoft Academic Search

The OSIRIS cameras on the Rosetta spacecraft observed Comet 9P\\/Tempel 1 from 5 days before to 10 days after it was hit by the Deep Impact projectile. The Narrow Angle Camera (NAC) monitored the cometary dust in 5 different filters. The Wide Angle Camera (WAC) observed through

Horst Uwe Keller; Michael Küppers; Sonia Fornasier; Pedro J. Gutiérrez; Stubbe F. Hviid; Laurent Jorda; Jörg Knollenberg; Stephen C. Lowry; Miriam Rengel; Ivano Bertini; Rainer Kramm; Ekkehard Kührt; Luisa-Maria Lara; Holger Sierks; Cesare Barbieri; Philippe Lamy; Hans Rickman; Rafael Rodrigo; Michael F. A'Hearn; Björn J. R. Davidsson; Marco Fulle; Fritz Gliem; Olivier Groussin; José J. Lopez Moreno; Francesco Marzari; Angel Sanz; Camino Bajo de Huétor; Chung Li; G. Galilei

2006-01-01

31

Enantiopure Narrow Bite-Angle P-OP Ligands: Synthesis and Catalytic Performance in Asymmetric Hydroformylations and Hydrogenations.  

PubMed

Herein is reported the preparation of a set of narrow bite-angle P-OP ligands the backbone of which contains a stereogenic carbon atom. The synthesis was based on a Corey-Bakshi-Shibata (CBS)-catalyzed asymmetric reduction of phosphomides. The structure of the resulting 1,1-P-OP ligands, which was selectively tuned through adequate combination of the configuration of the stereogenic carbon atom, its substituent, and the phosphite fragment, proved crucial for providing a rigid environment around the metal center, as evidenced by X-ray crystallography. These new ligands enabled very good catalytic properties in the Rh-mediated enantioselective hydrogenation and hydroformylation of challenging and model substrates (up to 99% ee). Whereas for asymmetric hydrogenation the optimal P-OP ligand depended on the substrate, for hydroformylation, a single ligand was the highest-performing one for almost all studied substrates: it contains an R-configured stereogenic carbon atom between the two phosphorus ligating groups, and an S-configured 3,3'-diphenyl-substituted biaryl unit. PMID:25335770

Fernández-Pérez, Héctor; Benet-Buchholz, Jordi; Vidal-Ferran, Anton

2014-11-17

32

Wide-angle and ultrathin camera module using a curved hexagonal microlens array and all spherical surfaces.  

PubMed

In this paper, we propose a wide-angle and thin camera module integrating the principles of an insect's compound eye and the human eye, mimicking them with a curved hexagonal microlens array and a hemispherical lens, respectively. Compared to typical mobile phone cameras with more than four lenses and a limited full field of view (FFOV), the proposed system uses only two lenses to achieve a wide FFOV. Furthermore, the thickness of our proposed system is only 2.7 mm. It has an f-number of 2.07, an image diameter of 4.032 mm, and a diagonal FFOV of 136°. The results showed good image quality with a modulation transfer function above 0.3 at a Nyquist frequency of 166 cycles/mm. PMID:25322408

Liang, Wei-Lun; Su, Guo-Dung J

2014-10-10

33

Miniature Wide-Angle Lens for Small-Pixel Electronic Camera  

NASA Technical Reports Server (NTRS)

A proposed wide-angle lens is shown that would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor that has small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The lens example shown has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also shown.

Mouroulis, Pantazis; Blazejewski, Edward

2009-01-01

34

Lunar Reconnaissance Orbiter Camera (LROC) instrument overview  

USGS Publications Warehouse

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

2010-01-01

35

Mars Observer camera  

NASA Technical Reports Server (NTRS)

The Mars Observer camera (MOC) is a three-component system (one narrow-angle and two wide-angle cameras) designed to take high spatial resolution pictures of the surface of Mars and to obtain lower spatial resolution, synoptic coverage of the planet's surface and atmosphere. The cameras are based on the 'push broom' technique; that is, they do not take 'frames' but rather build pictures, one line at a time, as the spacecraft moves around the planet in its orbit. MOC is primarily a telescope for taking extremely high resolution pictures of selected locations on Mars. Using the narrow-angle camera, areas ranging from 2.8 km x 2.8 km to 2.8 km x 25.2 km (depending on available internal digital buffer memory) can be photographed at about 1.4 m/pixel. Additionally, lower-resolution pictures (to a lowest resolution of about 11 m/pixel) can be acquired by pixel averaging; these images can be much longer, ranging up to 2.8 x 500 km at 11 m/pixel. High-resolution data will be used to study sediments and sedimentary processes, polar processes and deposits, volcanism, and other geologic/geomorphic processes.

Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Ravine, M. A.; Soulanille, T. A.

1992-01-01
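
The image sizes quoted in the abstract above follow directly from the swath width, pixel scale, and a fixed on-board buffer; the short sketch below reproduces that arithmetic. Treating the buffer as a simple pixel-count limit (and ignoring compression) is an assumption made only for illustration.

```python
SWATH_KM = 2.8

def image_pixels(length_km, m_per_px):
    """Approximate pixel count of a push-broom image of the given along-track length."""
    width_px = SWATH_KM * 1000.0 / m_per_px
    lines = length_km * 1000.0 / m_per_px
    return width_px * lines

full_res = image_pixels(25.2, 1.4)      # 2.8 km x 25.2 km at 1.4 m/px
averaged = image_pixels(500.0, 11.0)    # 2.8 km x 500 km at 11 m/px (pixel-averaged)
print(f"full-resolution image: {full_res/1e6:.0f} Mpx")   # ~36 Mpx
print(f"pixel-averaged image:  {averaged/1e6:.0f} Mpx")   # ~12 Mpx
```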

36

Fast camera calibration technology applied to helicopter blades pyramid angle measurement  

Microsoft Academic Search

This paper presents a new method for a helicopter blade pyramid-angle measurement system based on CMOS technology; the overall design scheme of the system is given and the imaging error is analyzed. Based on a similarity criterion, calculation methods for some critical component parameters of the helicopter dynamic balance test platform model are introduced. Provides a concise and rapid

Chengtao Cai; Mai Jiang; Fan Zhang; Huayun Lin

2010-01-01

37

Angles.  

National Technical Information Service (NTIS)

Shows the Brownstone Kids teaching others how to dance. Illustrates angles as they turn and sing 'Get the Angles.' The group goes to a community center for playing pool in 'Calling Shoots.' Discusses angles showing the 90-degree, 45-degree, and 180-degree...

1994-01-01

38

Angles  

NSDL National Science Digital Library

This interactive Flash applet enables students, using estimation and measurement skills, to investigate angles. Teachers can use this page for demonstrating how to read a protractor, and the protractor can be hidden to give students practice in estimating angle measures. The size of the angle can be controlled or chosen randomly.

Bunker, Dan

2011-01-01

39

Angles  

NSDL National Science Digital Library

This Java applet enables students to investigate acute, obtuse, and right angles. The student decides to work with one or two transversals and a pair of parallel lines. Angle measure is given for one angle. The student answers a short series of questions about the size of other angles, identifying relationships such as vertical and adjacent angles and alternate interior and alternate exterior angles. In addition to automatically checking the student's answers, the applet can keep score of correct answers. From the activity page, What, How, and Why buttons open pages that explain the activity's purpose, function, and how the mathematics fits into the curriculum. Supplemental resources include lesson plans and a handout with a grid for showing the relationship between all possible angles that occur when parallel lines are cut by a transversal. Copyright 2005 Eisenhower National Clearinghouse

Foundation, Shodor E.

2004-01-01

40

Angles  

NSDL National Science Digital Library

This lesson is designed to introduce students to different types of angles including acute, obtuse, and right. The lesson also introduces ways to compare angles such as alternate interior, corresponding, and many others. This lesson provides links to discussions and activities related to angles as well as suggested ways to integrate them into the lesson. Finally, the lesson provides links to follow-up lessons designed for use in succession with the current one.

2010-01-01

41

Post-trial anatomical frame alignment procedure for comparison of 3D joint angle measurement from magnetic/inertial measurement units and camera-based systems.  

PubMed

Magnetic and inertial measurement units (MIMUs) have been widely used as an alternative to traditional camera-based motion capture systems for 3D joint kinematics measurement. Since these sensors do not directly measure position, a pre-trial anatomical calibration, either with the assistance of a special protocol/apparatus or with another motion capture system, is required to establish the transformation matrices between the local sensor frame and the anatomical frame (AF) of each body segment on which the sensors are attached. Because the axes of AFs are often used as the rotational axes in the joint angle calculation, any difference in the AF determination will cause discrepancies in the calculated joint angles. Therefore, a direct comparison of joint angles between MIMU systems and camera-based systems is less meaningful because the calculated joint angles contain a systematic error due to the differences in the AF determination. To solve this problem a new post-trial AF alignment procedure is proposed. By correcting the AF misalignments, the joint angle differences caused by the difference in AF determination are eliminated and the remaining discrepancies are mainly from the measurement accuracy of the systems themselves. Lower limb joint angles from 30 walking trials were used to validate the effectiveness of the proposed AF alignment procedure. This technique could serve as a new means for calibrating magnetic/inertial sensor-based motion capture systems and correcting for AF misalignment in scenarios where joint angles are compared directly. PMID:25340557

Li, Qingguo; Zhang, Jun-Tian

2014-12-01
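
The key idea in the abstract above is that a constant anatomical-frame (AF) misalignment between the MIMU-derived and camera-derived frames can be estimated after the trial and removed before joint angles are compared. A minimal numpy sketch of one way to do that follows: estimate the fixed rotation that best maps one system's segment orientations onto the other's (average relative rotation, re-orthogonalized by SVD), then apply it as a correction. This is a generic alignment recipe offered for illustration, not the specific procedure of the paper.

```python
import numpy as np

def estimate_af_misalignment(R_cam_list, R_mimu_list):
    """Best-fit constant rotation R_corr such that R_cam ~= R_mimu @ R_corr,
    estimated from paired orientation samples (3x3 rotation matrices)."""
    M = np.zeros((3, 3))
    for R_cam, R_mimu in zip(R_cam_list, R_mimu_list):
        M += R_mimu.T @ R_cam          # accumulate relative rotations
    U, _, Vt = np.linalg.svd(M)        # project the average back onto SO(3)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

def rot_z(deg):
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Synthetic check: the MIMU AF is misaligned by a fixed 5 deg rotation about Z.
true_corr = rot_z(5.0)
R_cam = [rot_z(a) for a in (0.0, 20.0, 40.0)]
R_mimu = [R @ true_corr.T for R in R_cam]
R_corr = estimate_af_misalignment(R_cam, R_mimu)
print(np.allclose(R_corr, true_corr))   # True
```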

42

Ultra-narrow angle-tunable Fabry-Perot bandpass interference filter for use as tuning element in infrared lasers  

NASA Astrophysics Data System (ADS)

We have developed a bandpass infrared interference filter with sufficiently narrow bandwidth to be potentially suitable for tuning a self-stabilizing external-cavity quantum-cascade laser (ECQCL) in single-mode operation and describe the process parameters for fabrication of such filters with central wavelengths in the 3-12 µm range. The filter has a passband width of 6 nm or 0.14% with peak transmission of 55% and a central wavelength of approximately 4.0 µm. It can be tuned through over 4% by tilting with respect to the incident beam and offers orders of magnitude larger angular dispersion than diffraction gratings. We compare filters with single-cavity and coupled-cavity Fabry-Perot designs.

Kischkat, Jan; Peters, Sven; Semtsiv, Mykhaylo P.; Wegner, Tristan; Elagin, Mikaela; Monastyrskyi, Grygorii; Flores, Yuri; Kurlov, Sergii; Masselink, W. Ted

2014-11-01
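
The angle tuning described above follows the standard first-order relation for an interference-filter passband under tilt, λ(θ) ≈ λ0 · sqrt(1 − (sin θ / n_eff)²), where n_eff is the effective index of the spacer. The sketch below evaluates the relative shift versus tilt for the 4.0 µm filter; the value of n_eff is an assumption for illustration (it is not stated in the abstract), so the tilt needed for the quoted >4% tuning will differ for the real device.

```python
import math

LAMBDA_0_UM = 4.0     # central wavelength at normal incidence (from the abstract)
N_EFF = 1.8           # assumed effective spacer index (illustrative only)

def tuned_wavelength_um(tilt_deg, lam0=LAMBDA_0_UM, n_eff=N_EFF):
    """First-order blue shift of an interference filter passband under tilt."""
    s = math.sin(math.radians(tilt_deg)) / n_eff
    return lam0 * math.sqrt(1.0 - s * s)

for tilt in (0, 10, 20, 30):
    lam = tuned_wavelength_um(tilt)
    print(f"{tilt:2d} deg tilt -> {lam:.3f} um ({100 * (lam / LAMBDA_0_UM - 1):+.2f}%)")
```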

43

6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

44

2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

45

7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

46

Erratum: First Results from the Wide Angle Camera of the ROSETTA Mission [Mem.SAIt Suppl. 6, 28-33 (2005)]  

NASA Astrophysics Data System (ADS)

The authors acknowledge that the paper fails to convey the correct information about the respective contributions and roles of the partners of the OSIRIS consortium. In particular, the hardware contributions of the Max-Planck Institut für Sonnensystemforschung, MPS, (Katlenburg Lindau, Germany, formerly MPAe), of the Instituto de Astrofisica de Andalucia (Granada, Spain), of the Department of Astronomy and Space Physics of Uppsala University (DASP), of ESA Research and Scientific Support Department (ESA/RSSD) to the Wide Angle Camera has not been mentioned or incorrectly expounded. The overall responsibility (PI ship) of MPS (MPAe) for OSIRIS and hence for the Wide Angle Camera is not correctly mentioned either. The correct information is given in the paper by Keller et al. (2006, Space Science Review, in press). The authors take this opportunity to acknowledge that the activity of the Italian team has been partly supported by the Italian Space Agency ASI through a contract to CISAS.

Barbieri, C.; Fornasier, S.; Bertini, I.; Angrilli, F.; Bianchini, G. A.; Debei, S.; de Cecco, M.; Parzianello, G.; Zaccariotto, M.; da Deppo, V.; Naletto, G.

47

Mars Global Surveyor Mars Orbiter Camera Image Gallery  

NSDL National Science Digital Library

This site from Malin Space Science Systems provides access to all of the images acquired by the Mars Orbiter Camera (MOC) during the Mars Global Surveyor mission through March 2005. MOC consists of several cameras: a narrow-angle system that provides grayscale, high-resolution views of the planet's surface (typically 1.5 to 12 meters/pixel), and red and blue wide-angle cameras that provide daily global weather monitoring, context images to determine where the narrow-angle views were actually acquired, and regional coverage to monitor variable surface features such as polar frost and wind streaks. Ancillary data for each image are provided, and instructions regarding gallery usage are also available on the site.

Systems, Malin S.

48

MUSIC - Multifunctional stereo imaging camera system for wide angle and high resolution stereo and color observations on the Mars-94 mission  

NASA Astrophysics Data System (ADS)

Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and a wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, technology aspects, and solutions. Both HRSC and WAOSS are pushbroom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.

Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.

1990-10-01

49

Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.  

PubMed

The purpose of this study was to establish intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers using the GAITRite walkway and a single video camera between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants aged 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited. PMID:25230893

Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

2015-01-01

50

Quasi-null lens optical system for the fabrication of an oblate convex ellipsoidal mirror: application to the Wide Angle Camera of the Rosetta space mission.  

PubMed

The design of a quasi-null lens system for the fabrication of an aspheric oblate convex ellipsoidal mirror is presented. The performance and tolerances of the system have been analyzed. The system has been applied successfully to the fabrication of the primary mirror of the Wide Angle Camera (WAC), the imaging system onboard Rosetta, the European Space Agency cornerstone mission dedicated to the exploration of a comet. The WAC is based on an off-axis two-mirror configuration, in which the primary mirror is an oblate convex ellipsoid with a significant conic constant. PMID:16892112

Pelizzo, Maria-Guglielmina; Da Deppo, Vania; Naletto, Giampiero; Ragazzoni, Roberto; Novi, Andrea

2006-08-20

51

Angle-of-arrival anemometry by means of a large-aperture Schmidt-Cassegrain telescope equipped with a CCD camera.  

PubMed

The frequency spectrum of angle-of-arrival (AOA) fluctuations of optical waves propagating through atmospheric turbulence carries information on the wind speed transverse to the propagation path. We present retrievals of the transverse wind speed, v_b, from AOA spectra measured with a Schmidt-Cassegrain telescope equipped with a CCD camera by estimating the "knee frequency," the intersection of two power laws of the AOA spectrum. The rms difference between 30 s estimates of v_b retrieved from the measured AOA spectra and 30 s averages of the transverse horizontal wind speed measured with an ultrasonic anemometer was 11 cm s-1 for a 1 h period, during which the transverse horizontal wind speed varied between 0 and 80 cm s-1. Potential and limitations of angle-of-arrival anemometry are discussed. PMID:17975575
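
A minimal sketch of the "knee frequency" idea described above: fit separate power laws to the low- and high-frequency parts of an AOA spectrum in log-log space and take the frequency at which the two fitted lines intersect. The split frequency, spectral slopes, and synthetic spectrum below are assumptions; the actual retrieval of v_b from the knee also requires the path geometry given in the paper.

import numpy as np

def knee_frequency(freq: np.ndarray, spectrum: np.ndarray, split: float) -> float:
    """Fit power laws S(f) = c * f^p below and above 'split' (Hz) and
    return the frequency at which the two fitted lines intersect (the 'knee')."""
    logf, logs = np.log10(freq), np.log10(spectrum)
    lo, hi = freq < split, freq >= split
    p1, c1 = np.polyfit(logf[lo], logs[lo], 1)   # slope, intercept (low-frequency branch)
    p2, c2 = np.polyfit(logf[hi], logs[hi], 1)   # slope, intercept (high-frequency branch)
    return 10 ** ((c2 - c1) / (p1 - p2))          # intersection of the two lines

# Synthetic spectrum: flat below 1 Hz, steep power-law decay above it, plus mild noise.
f = np.logspace(-2, 2, 400)
s = np.where(f < 1.0, 1.0, f ** (-8.0 / 3.0)) * (1 + 0.05 * np.random.randn(f.size))
print(knee_frequency(f, s, split=1.0))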

Cheon, Yonghun; Hohreiter, Vincent; Behn, Mario; Muschinski, Andreas

2007-11-01

52

7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

53

3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

54

New stereoscopic video camera and monitor system with central high resolution  

NASA Astrophysics Data System (ADS)

A new stereoscopic video system (the Q stereoscopic video system), which has high resolution in the central area, has been developed using four video cameras and four video displays. The Q stereoscopic camera system is constructed using two cameras with wide-angle lenses, which are combined as the stereoscopic camera system, and two cameras with narrow-angle lenses, which are combined (using half mirrors) with each of the wide-angle cameras to have the same optical center axis. The Q stereoscopic display system is composed of two large video displays that receive images from the wide-angle stereoscopic cameras, and two smaller displays projecting images from the narrow-angle cameras. With this system, human operators are able to see the stereoscopic images of the smaller displays inserted in the images of the larger displays. Completion times for the pick-up task of a remote controlled robot were shorter when using the Q stereoscopic video system rather than a conventional stereoscopic video system.

Matsunaga, Katsuya; Nose, Yasuhiro; Minamoto, Masahiko; Shidoji, Kazunori; Ebuchi, Kazuhisa; Itoh, Daisuke; Inoue, Tomonori; Hayami, Taketo; Matsuki, Yuji; Arikawa, Yuko; Matsubara, Kenjiro

1998-04-01

55

1. VARIABLEANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

56

Miniaturized fundus camera  

NASA Astrophysics Data System (ADS)

We present a miniaturized version of a fundus camera. The camera is designed for use in screening for retinopathy of prematurity (ROP). There, but also in other applications, a small, lightweight, digital camera system can be extremely useful. We present a small wide-angle digital camera system. The handpiece is significantly smaller and lighter than in all other systems. The electronics is truly portable, fitting in a standard boardcase. The camera is designed to be offered at a competitive price. Data from tests on young rabbits' eyes are presented. The development of the camera system is part of a telemedicine project screening for ROP. Telemedicine is a natural application for this camera system, drawing on both of its advantages: the portability as well as the digital image.

Gliss, Christine; Parel, Jean-Marie A.; Flynn, John T.; Pratisto, Hans S.; Niederer, Peter F.

2003-07-01

57

Wide Angle Movie  

NASA Technical Reports Server (NTRS)

This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (about 23 kilometers) per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

1999-01-01

58

Camera Obscura  

NSDL National Science Digital Library

Before photography was invented there was the camera obscura, useful for studying the sun, as an aid to artists, and for general entertainment. What is a camera obscura and how does it work? Camera is Latin for room; obscura is Latin for dark. Page items include "The Magic Mirror of Life," "A French drawing camera with supplies," and "Drawing Camera Obscuras with Lens at the top." Read the first three paragraphs of this article; under the portion Early Observations and Use in Astronomy you will find the answers to the ...

Engelman, Mr.

2008-10-28

59

Narrow-angle cosmic-ray anisotropies  

NASA Technical Reports Server (NTRS)

An alternate interpretation is presented for the diurnal cosmic ray anisotropy measurements made with underground muons in London. From the widely accepted models of cosmic ray diffusion in the Galaxy, a diurnal anisotropy (24 h wave) would be expected. But from a model predicting the occurrence of an excess within some small region of the celestial sphere, it is suggested that the direction of this excess would depend on the orientation (in space and time) of the source relative to galactic magnetic field lines which connect the source with the solar system.

Barrowes, S.

1975-01-01

60

Imaging Narrow Angle The Voyager Spacecraft  

E-print Network

NASA's Deep Space Network. The Voyager spacecraft are on a unique exploratory mission. The two components of the observatory that are, for now and in the foreseeable future, making measurements. The CRS instrument measures the energy spectrum of electrons and cosmic ray nuclei and uses three

Waliser, Duane E.

61

8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

62

Camera Animation  

NSDL National Science Digital Library

A general discussion of the use of cameras in computer animation. This section includes principles of traditional film techniques and suggestions for the use of a camera during an architectural walkthrough. This section includes html pages, images and one video.

2011-01-30

63

MIT Media Lab Camera Culture Coded Computational Imaging  

E-print Network

Slide-deck extract: Coded Computational Imaging: Light Fields and Applications, Ankit Mohan, MIT Media Lab Camera Culture (with Agrawal, Veeraraghavan, Narasimhan and Mohan). Topics include the Bokode, whose imaged pattern depends on the camera angle.

Agrawal, Amit

64

Dynamics of an oscillating bubble in a narrow gap  

E-print Network

The complex dynamics of a single bubble of a few millimeters in size oscillating inside a narrow fluid-filled gap between two parallel plates is studied using high-speed videography. Two synchronized high-speed cameras ...

Azam, Fahad Ibn

65

1. VARIABLEANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

1. VARIABLE-ANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER STRUCTURE LOOKING SOUTH AND ARCHED OPENING FOR ROADWAY. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

66

Faint Object Camera imaging and spectroscopy of NGC 4151  

NASA Technical Reports Server (NTRS)

We describe ultraviolet and optical imaging and spectroscopy within the central few arcseconds of the Seyfert galaxy NGC 4151, obtained with the Faint Object Camera on the Hubble Space Telescope. A narrowband image including (O III) lambda(5007) shows a bright nucleus centered on a complex biconical structure having apparent opening angle approximately 65 deg and axis at a position angle along 65 deg-245 deg; images in bands including Lyman-alpha and C IV lambda(1550) and in the optical continuum near 5500 A, show only the bright nucleus. In an off-nuclear optical long-slit spectrum we find a high and a low radial velocity component within the narrow emission lines. We identify the low-velocity component with the bright, extended, knotty structure within the cones, and the high-velocity component with more confined diffuse emission. Also present are strong continuum emission and broad Balmer emission line components, which we attribute to the extended point spread function arising from the intense nuclear emission. Adopting the geometry pointed out by Pedlar et al. (1993) to explain the observed misalignment of the radio jets and the main optical structure we model an ionizing radiation bicone, originating within a galactic disk, with apex at the active nucleus and axis centered on the extended radio jets. We confirm that through density bounding the gross spatial structure of the emission line region can be reproduced with a wide opening angle that includes the line of sight, consistent with the presence of a simple opaque torus allowing direct view of the nucleus. In particular, our modelling reproduces the observed decrease in position angle with distance from the nucleus, progressing initially from the direction of the extended radio jet, through our optical structure, and on to the extended narrow-line region. We explore the kinematics of the narrow-line low- and high-velocity components on the basis of our spectroscopy and adopted model structure.

Boksenberg, A.; Catchpole, R. M.; Macchetto, F.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.

1995-01-01

67

5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

68

Electronic still camera  

NASA Astrophysics Data System (ADS)

A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

Holland, S. Douglas

1992-09-01

69

Electronic Still Camera  

NASA Technical Reports Server (NTRS)

A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

Holland, S. Douglas (inventor)

1992-01-01

70

Camera Projector  

NSDL National Science Digital Library

In this activity (posted on March 14, 2011), learners follow the steps to construct a camera projector to explore lenses and refraction. First, learners use relatively simple materials to construct the projector. Then, learners discover that lenses project images upside down and backwards. They explore this phenomenon by creating their own slides (must be drawn upside down and backwards to appear normally). Use this activity to also introduce learners to spherical aberration and chromatic aberration.

Center, Oakland D.

2011-01-01

71

Characterization of Narrow Band Filters for Infrared Astronomy: The Brγ and H2 filters  

E-print Network

Characterization of Narrow Band Filters for Infrared Astronomy: The Brγ and H2 filters. L. Vanzi. Keywords: Infrared, Narrow Band Filters, Imaging. Abbreviations: IR -- Infrared; NIR -- Near infrared. Extract: experiments with narrow band filters mounted on the Infrared Camera ARNICA (Lisi et al. 1996; Hunt et al. 1996

Testi, Leonardo

72

The DRAGO gamma camera  

SciTech Connect

In this work, we present the results of the experimental characterization of the DRAGO (DRift detector Array-based Gamma camera for Oncology), a detection system developed for high-spatial resolution gamma-ray imaging. This camera is based on a monolithic array of 77 silicon drift detectors (SDDs), with a total active area of 6.7 cm², coupled to a single 5-mm-thick CsI(Tl) scintillator crystal. The use of an array of SDDs provides a high quantum efficiency for the detection of the scintillation light together with a very low electronics noise. A very compact detection module based on the use of integrated readout circuits was developed. The performances achieved in gamma-ray imaging using this camera are reported here. When imaging a 0.2 mm collimated ⁵⁷Co source (122 keV) over different points of the active area, a spatial resolution ranging from 0.25 to 0.5 mm was measured. The depth-of-interaction capability of the detector, thanks to the use of a Maximum Likelihood reconstruction algorithm, was also investigated by imaging a collimated beam tilted to an angle of 45 deg. with respect to the scintillator surface. Finally, the imager was characterized with in vivo measurements on mice, in a real preclinical environment.

Fiorini, C.; Gola, A.; Peloso, R.; Longoni, A. [Dipartimento di Elettronica e Informazione, Politecnico di Milano, Milano 20133, Italy and INFN, Sezione di Milano, Milano 20133 (Italy); Lechner, P.; Soltau, H. [PNSensor GmbH and PNDetector GmbH, D-80803 Munich (Germany); Strueder, L. [Max Planck Institut Halbleiterlabor, D-81739 Munich (Germany); Ottobrini, L.; Martelli, C.; Lui, R. [Department of Biomedical Sciences and Technologies, University of Milan, Milano 20133, Italy and Centre of Molecular and Cellular Imaging-IMAGO, Milano (Italy); Madaschi, L. [Department of Medicine, Surgery, and Dentistry, University of Milan, Milano 20142 (Italy); Belloli, S. [IBFM-CNR, Istituto Scientifico Ospedale San Raffale, Milano 20132 (Italy)

2010-04-15

73

Narrow Band Infrared Filters with Broad Field of View  

Microsoft Academic Search

Optical interference notch filters shift to shorter wavelengths with increasing angles of incidence. This phenomenon restricts the filter's field of view and limits the practical application of narrow reflection notch filters. The amount of shift is inversely proportional to the effective average index of the composite film. A method of designing narrow notch optical filters with very broad field of
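
The blue shift with incidence angle described above is commonly approximated, for a filter with effective average refractive index $n_{\mathrm{eff}}$, by the standard thin-film relation $\lambda(\theta) = \lambda_0 \sqrt{1 - \sin^2\theta / n_{\mathrm{eff}}^2}$, where $\lambda_0$ is the normal-incidence center wavelength and $\theta$ the angle of incidence. The shift $\lambda_0 - \lambda(\theta)$ therefore decreases as $n_{\mathrm{eff}}$ increases, consistent with the inverse dependence noted in the abstract; this relation is standard thin-film practice rather than something stated in the record.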

Thomas D. Rahmlow; Jeanne E. Lazo-Wasem; Edward J. Gratrix

74

Variant Narrowing and Extreme Termination  

Microsoft Academic Search

For narrowing with a set of rules modulo a set of axioms B almost nothing is known about terminating narrowing strategies, and basic narrowing is known to be incomplete for B = AC. In this work we ask and answer the question: Is there such a thing as an extremely terminating narrowing strategy modulo B? where we call a narrowing

Santiago Escobar; Jose Meseguer; Ralf Sasse

75

Angle Sums  

NSDL National Science Digital Library

With this applet, students can examine the angles in a triangle, quadrilateral, pentagon, hexagon, heptagon or octagon. They can change the shape of the figure by dragging the vertices; the size of each angle is shown and the sum of the interior angles calculated. Students are challenged to find a relationship between the number of sides and the sum of the interior angles.

Illuminations, Nctm

2000-01-01

76

Laboratory calibration and characterization of video cameras  

NASA Astrophysics Data System (ADS)

Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of nonperpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.
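
The laser-reticle and principal-point procedures above are laboratory-specific, but the plate-based part (principal distance and lens distortion from images of a calibration target) is closely mirrored by a standard OpenCV checkerboard calibration; the board geometry and file names below are placeholders, and this is a generic sketch rather than the authors' procedure.

import glob
import cv2
import numpy as np

pattern = (9, 6)                                  # inner corners of an assumed checkerboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)  # plate coordinates (square units)

obj_points, img_points, size = [], [], None
for path in glob.glob("calib_*.png"):             # placeholder image names
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Returns the camera matrix (principal distance and principal point) and distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
print("RMS reprojection error:", rms)
print("camera matrix:\n", K)
print("distortion coefficients:", dist.ravel())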

Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

1990-08-01

77

Lobate Scarp Modeling with Lunar Reconnaissance Orbiter Camera Digital Terrain Models  

NASA Astrophysics Data System (ADS)

Lobate scarps are a type of contractional tectonic landform expressed on the Moon's surface in both highlands and maria. Typically only tens of meters in relief, these linear or curvilinear topographic rises are interpreted to be low-angle thrust fault scarps resulting from global radial contraction. Radial contraction of the Moon can be inferred from shortening across the population of lobate scarps and is estimated at ~100 m. However, the geometry and depth of the underlying faults and mechanical properties of the near-surface lunar crustal materials are not well constrained. The Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NACs) acquire 0.5 to 2.0 m/pixel panchromatic images and digital terrain models (DTMs) with spatial resolutions of 2 m are derived from NAC stereo pairs. Topographic data are being used to constrain models of the lobate scarp thrust faults. DTMs are analyzed for relief and morphology of the Slipher (48.3°N, 160.6°E), Racah X-1 (10°S, 178°E), and Simpelius-1 (73.5°S, 13°E) scarps. Profiles are extracted, detrended, and compared along strike. LROC Wide Angle Camera (WAC) 100 m/pixel image mosaics and topography provide regional contexts. Using elastic dislocation modeling, the fault dip angles, depths, slip, and taper are each varied until the predicted surface displacement best fits the DTM profiles for each lobate scarp. Preliminary best-fit dip angles vary from 30-40°, maximum fault depths extend to several hundred meters, and the amount of slip varies from 10 to 30 meters for the three scarps. The modeled maximum depths suggest that the thrust faults are not deeply rooted.
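
A minimal sketch of the parameter search described above: vary fault dip, depth, and slip and keep the combination whose predicted surface relief best matches a detrended DTM profile. The forward model here is an explicit placeholder stub (a smooth scarp-shaped curve), not the elastic dislocation solution the authors use, and the parameter ranges are assumptions.

import numpy as np
from itertools import product

def forward_scarp(x_km, dip_deg, depth_m, slip_m):
    """Placeholder forward model: predicted vertical relief (m) along the profile.
    In the real analysis this would be an elastic dislocation calculation."""
    width_km = depth_m / np.tan(np.radians(dip_deg)) / 1000.0   # rough surface-expression width
    return slip_m * np.sin(np.radians(dip_deg)) / (1 + np.exp(-x_km / max(width_km, 0.05)))

def best_fit(x_km, observed_m):
    """Grid search over (dip, depth, slip) minimizing RMS misfit to the observed profile."""
    best = (None, np.inf)
    for dip, depth, slip in product(range(20, 51, 2), range(100, 1001, 100), range(5, 41, 5)):
        rms = np.sqrt(np.mean((forward_scarp(x_km, dip, depth, slip) - observed_m) ** 2))
        if rms < best[1]:
            best = ((dip, depth, slip), rms)
    return best

x = np.linspace(-5, 5, 200)                       # km along a detrended DTM profile
obs = forward_scarp(x, 35, 400, 20) + np.random.normal(0, 0.5, x.size)  # synthetic "observed" relief
print(best_fit(x, obs))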

Williams, N. R.; Watters, T. R.; Pritchard, M. E.; Banks, M. E.; Bell, J. F.; Robinson, M. S.; Tran, T.

2011-12-01

78

Readout electronics of physics of accelerating universe camera  

NASA Astrophysics Data System (ADS)

The Physics of Accelerating Universe Camera (PAUCam) is a new camera for dark energy studies that will be installed in the William Herschel telescope. The main characteristic of the camera is the capacity for high precision photometric redshift measurement. The camera is composed of eighteen Hamamatsu Photonics CCDs providing a wide field of view covering a diameter of one degree. Unlike the common five optical filters of other similar surveys, PAUCam has forty optical narrow band filters which will provide higher resolution in photometric redshifts. In this paper a general description of the electronics of the camera and its status is presented.

de Vicente, Juan; Castilla, Javier; Jiménez, Jorge; Cardiel-Sas, L.; Illa, José M.

2014-08-01

79

System Synchronizes Recordings from Separated Video Cameras  

NASA Technical Reports Server (NTRS)

A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
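
The "slightly more than 136 years" figure quoted above is consistent with a time code built on a 32-bit count of seconds; this is an assumption for illustration, not a statement of the Geo-TimeCode format, as the quick check below shows.

SECONDS_PER_YEAR = 365.25 * 24 * 3600          # Julian year in seconds
span_years = 2 ** 32 / SECONDS_PER_YEAR        # seconds representable by a 32-bit counter
print(round(span_years, 1))                    # about 136.1 years before the count repeats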

Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

2009-01-01

80

Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data  

USGS Publications Warehouse

We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

2012-01-01

81

Narrowness and Liberality  

ERIC Educational Resources Information Center

John Agresto, whose task has been to rebuild the war-ravaged infrastructure of a Middle-Eastern university system, is discouraged to see that narrow expertise is the only goal of education there, to the utter exclusion of intellectual breadth. He comments that, although it is not that bad in the U.S., he feels that doctoral programs as currently…

Agresto, John

2003-01-01

82

Broad versus Narrow. Editorial.  

ERIC Educational Resources Information Center

Discusses the different roles played by eclectic versus specialized scholarly journals in education. The narrow focus of specialized journals is useful to scholars with limited time but may exclude authors with controversial viewpoints. Eclectic journals provide a broader picture of educational trends and a forum for scholarly debate, unorthodox…

Buck, George H.

2001-01-01

83

Flight path following guidance for unmanned air vehicles with pan-tilt camera for target observation  

Microsoft Academic Search

An Unmanned Autonomous Vehicle (UAV) is equipped with a nose-mounted camera capable of pan and tilt rotation for the observation of ground targets. The two camera angles are adjusted automatically in order to keep the target in the camera's field of view. While the camera actuators are fast enough to keep up with vehicle motion, the limited range of the
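
A minimal sketch of the pointing geometry implied above: given the target position expressed in the vehicle's body-fixed frame (x forward, y right, z down, an assumed convention), the pan and tilt commands that center the target follow from two arctangents. This is generic geometry, not the control law of the cited paper.

import math

def pan_tilt_to_target(x: float, y: float, z: float) -> tuple:
    """Pan (about the down axis) and tilt (down from the horizontal) in degrees
    that point the camera boresight at a target at (x, y, z) in the body frame:
    x forward, y right, z down (assumed convention)."""
    pan = math.degrees(math.atan2(y, x))
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))
    return pan, tilt

# Target 500 m ahead, 200 m to the right, 300 m below the vehicle.
print(pan_tilt_to_target(500.0, 200.0, 300.0))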

S. Stolle; Rolf Rysdyk

2003-01-01

84

Experience with duplex bearings in narrow angle oscillating applications  

NASA Technical Reports Server (NTRS)

Duplex ball bearings are matched pairs on which the abutting faces of the rings have been accurately ground so that when the rings are clamped together, a controlled amount of interference (preload) exists across the balls. These bearings are vulnerable to radial temperature gradients, blocking in oscillation and increased sensitivity to contamination. These conditions decrease the service life of these bearings. It was decided that an accelerated thermal vacuum life test should be conducted. The test apparatus and results are described and the rationale is presented for reducing a multiyear life test on oil lubricated bearings to less than a year.

Phinney, D. D.; Pollard, C. L.; Hinricks, J. T.

1988-01-01

85

Two-Camera Acquisition and Tracking of a Flying Target  

NASA Technical Reports Server (NTRS)

A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
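
A minimal sketch of the first step described above, locating a luminous target in the wide-field frame by thresholding and centroiding; converting that pixel location to azimuth and elevation would then use the calibration mentioned in the record. The threshold rule and synthetic frame are assumptions.

import numpy as np

def locate_bright_target(frame: np.ndarray, k_sigma: float = 8.0):
    """Return the intensity-weighted centroid (row, col) of pixels brighter than
    background mean + k_sigma * std, or None if nothing exceeds the threshold."""
    mean, std = frame.mean(), frame.std()
    mask = frame > mean + k_sigma * std
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    weights = frame[rows, cols].astype(float)
    return (np.average(rows, weights=weights), np.average(cols, weights=weights))

# Synthetic night-sky frame with one bright spot near (120, 300).
sky = np.random.poisson(5.0, size=(480, 640)).astype(float)
sky[118:123, 298:303] += 500.0
print(locate_bright_target(sky))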

Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

2008-01-01

86

Angles (elementary)  

NSDL National Science Digital Library

This lesson is designed to introduce students to acute, obtuse, and right angles. This lesson provides links to discussions and activities related to angles as well as suggested ways to integrate them into the lesson. Finally, the lesson provides links to follow-up lessons designed for use in succession with the current one.

2010-01-01

87

Angle Hunting  

NSDL National Science Digital Library

In this activity, learners use a hand-made protractor to measure angles they find in playground equipment. Learners will observe that angle measurements do not change with distance, because they are distance invariant, or constant. Note: The "Pocket Protractor" activity should be done ahead as a separate activity (see related resource), but a standard protractor can be used as a substitute.

Exploratorium

2010-01-01

88

Evaluating intensified camera systems  

SciTech Connect

This paper describes image evaluation techniques used to standardize camera system characterizations. The authors' group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity focused diode (PFD) image intensifiers; electro-static image tubes; or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine if a particular system meets experimental requirements.
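
As one concrete example of the noise metric mentioned above (not the authors' analysis tool), temporal noise can be estimated from a pair of nominally identical flat-field frames: differencing cancels fixed-pattern structure, and the standard deviation of the difference divided by sqrt(2) gives the per-frame random noise. The frame values below are synthetic.

import numpy as np

def flat_field_noise_and_snr(frame_a: np.ndarray, frame_b: np.ndarray):
    """Estimate temporal noise and SNR from two flat-field exposures.
    Differencing removes fixed-pattern (spatial) nonuniformity."""
    a = frame_a.astype(np.float64)
    b = frame_b.astype(np.float64)
    signal = 0.5 * (a.mean() + b.mean())
    noise = (a - b).std() / np.sqrt(2.0)
    return noise, signal / noise

# Two synthetic flat fields: mean 1000 counts, random noise of about 30 counts rms.
rng = np.random.default_rng(0)
f1 = 1000 + 30 * rng.standard_normal((512, 512))
f2 = 1000 + 30 * rng.standard_normal((512, 512))
print(flat_field_noise_and_snr(f1, f2))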

S. A. Baker

2000-06-30

89

Vacuum Camera Cooler  

NASA Technical Reports Server (NTRS)

Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

Laugen, Geoffrey A.

2011-01-01

90

Making Ceramic Cameras  

ERIC Educational Resources Information Center

This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

Squibb, Matt

2009-01-01

91

Constrained space camera assembly  

DOEpatents

A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

Heckendorn, Frank M. (Aiken, SC); Anderson, Erin K. (Augusta, GA); Robinson, Casandra W. (Trenton, SC); Haynes, Harriet B. (Aiken, SC)

1999-01-01

92

Angle Measurer  

NSDL National Science Digital Library

This Flash applet gives students practice in creating an angle measuring between zero and 180 degrees. Two buttons control the increase or decrease of the opening between two rays. Points are awarded for accuracy.

2011-01-01

93

Right Angle  

NSDL National Science Digital Library

This article gives teachers background information on right angles. It provides geometric and practical examples, a paper folding construction method, and some history of the usage of the term 'right.'

Goldenberg, Paul

2011-06-09

94

Prediction of Viking lander camera image quality  

NASA Technical Reports Server (NTRS)

Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

1976-01-01

95

cameras are watching you  

E-print Network

Cameras are watching you: new surveillance camera being developed by Ohio ... of software expands the small field of view that traditional pan-tilt-zoom security cameras offer; when the viewspaces of all the security cameras in an area overlap, it can determine the geo...

Davis, James W.

96

Tower Press Camera  

USGS Multimedia Gallery

"4x5" enclosure camera with Rangefinder and Wollensak Raptar lens serial #A18388, 62 mm, 1950s-60s. Manufactured by Busch Optical Manufacturing Company, Chicago, Illinois (also known as Busch Precision Camera Corporation). The company was famous for its versatile press cameras which featured an opti...

2009-07-22

97

Omnifocus video camera  

NASA Astrophysics Data System (ADS)

The omnifocus video camera takes videos in which objects at different distances are all in focus in a single video display. The omnifocus video camera consists of an array of color video cameras combined with a unique distance mapping camera called the Divcam. The color video cameras are all aimed at the same scene, but each is focused at a different distance. The Divcam provides real-time distance information for every pixel in the scene. A pixel selection utility uses the distance information to select individual pixels from the multiple video outputs focused at different distances, in order to generate the final single video display that is everywhere in focus. This paper presents the principle of operation, design considerations, detailed construction, and overall performance of the omnifocus video camera. The major emphasis of the paper is the proof of concept, but the prototype has been developed enough to demonstrate the superiority of this video camera over a conventional video camera. The resolution of the prototype is high, capturing even fine details such as fingerprints in the image. Just as the movie camera was a significant advance over the still camera, the omnifocus video camera represents a significant advance over all-focus cameras for still images.
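
A minimal sketch of the pixel-selection idea described above: given a per-pixel distance map and a stack of frames focused at different distances, each output pixel is taken from the frame whose focus distance is nearest to that pixel's measured distance. The array shapes and focus distances are assumptions, not the prototype's actual configuration.

import numpy as np

def omnifocus_compose(stack: np.ndarray, focus_distances: np.ndarray, depth_map: np.ndarray) -> np.ndarray:
    """stack: (n_cameras, H, W) frames, each focused at focus_distances[i] (meters).
    depth_map: (H, W) per-pixel distance (meters). Returns an everywhere-in-focus image."""
    # For every pixel, index of the frame whose focus distance best matches the depth.
    idx = np.abs(depth_map[None, :, :] - focus_distances[:, None, None]).argmin(axis=0)
    rows, cols = np.indices(depth_map.shape)
    return stack[idx, rows, cols]

# Toy example: three frames focused at 0.5 m, 2 m, and 10 m.
H, W = 4, 6
stack = np.stack([np.full((H, W), v, dtype=float) for v in (10, 20, 30)])
depths = np.tile(np.linspace(0.3, 12.0, W), (H, 1))
print(omnifocus_compose(stack, np.array([0.5, 2.0, 10.0]), depths))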

Iizuka, Keigo

2011-04-01

98

9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

99

Contrail study with ground-based cameras  

NASA Astrophysics Data System (ADS)

Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other, together with a short-lived one, are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 230 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails, suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.
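
A minimal sketch of a camera model of the kind described above, here an idealized equidistant fisheye for the whole-sky-imager type: the radial distance from the principal point is proportional to the zenith angle, and the pixel's bearing gives the azimuth. The focal scale, principal point, and orientation offset below are placeholders of the sort that the Sun/Moon/star calibration would determine.

import numpy as np

def pixel_to_az_el(u, v, u0=960.0, v0=960.0, pix_per_rad=600.0, az_offset_deg=0.0):
    """Map pixel (u, v) of an equidistant fisheye image to (azimuth, elevation) in degrees.
    u0, v0: principal point; pix_per_rad: radial scale; az_offset_deg: camera rotation
    (all placeholder calibration constants)."""
    dx, dy = u - u0, v - v0
    zenith = np.hypot(dx, dy) / pix_per_rad                 # radians, equidistant projection
    azimuth = (np.degrees(np.arctan2(dx, -dy)) + az_offset_deg) % 360.0
    return azimuth, 90.0 - np.degrees(zenith)

# A pixel 300 px right of the image centre maps to roughly 61 deg elevation
# with these placeholder constants.
print(pixel_to_az_el(1260.0, 960.0))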

Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

2013-12-01

100

Contrail study with ground-based cameras  

NASA Astrophysics Data System (ADS)

Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other together with a short-lived one are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 200 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.

Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

2013-08-01

101

The MMT all-sky camera  

NASA Astrophysics Data System (ADS)

The MMT all-sky camera is a low-cost, wide-angle camera system that takes images of the sky every 10 seconds, day and night. It is based on an Adirondack Video Astronomy StellaCam II video camera and utilizes an auto-iris fish-eye lens to allow safe operation under all lighting conditions, even direct sunlight. This combined with the anti-blooming characteristics of the StellaCam's detector allows useful images to be obtained during sunny days as well as brightly moonlit nights. Under dark skies the system can detect stars as faint as 6th magnitude as well as very thin cirrus and low surface brightness zodiacal features such as gegenschein. The total hardware cost of the system was less than $3500 including computer and framegrabber card, a fraction of the cost of comparable systems utilizing traditional CCD cameras.

Pickering, T. E.

2006-06-01

102

Narrow Escape, Part I  

E-print Network

A Brownian particle with diffusion coefficient $D$ is confined to a bounded domain of volume $V$ in $\mathbb{R}^3$ by a reflecting boundary, except for a small absorbing window. The mean time to absorption diverges as the window shrinks, thus rendering the calculation of the mean escape time a singular perturbation problem. We construct an asymptotic approximation for the case of an elliptical window of large semi-axis $a \ll V^{1/3}$ and show that the mean escape time is $E\tau \sim \frac{V}{2\pi D a} K(e)$, where $e$ is the eccentricity of the ellipse and $K(\cdot)$ is the complete elliptic integral of the first kind. In the special case of a circular hole the result reduces to Lord Rayleigh's formula $E\tau \sim \frac{V}{4aD}$, which was derived by heuristic considerations. For the special case of a spherical domain, we obtain the asymptotic expansion $E\tau = \frac{V}{4aD}\left[1 + \frac{a}{R}\log\frac{R}{a} + O\left(\frac{a}{R}\right)\right]$. This problem is important in understanding the flow of ions in and out of narrow valves that control a wide range of biological and technological function.

A. Singer; Z. Schuss; D. Holcman; R. S. Eisenberg

2004-12-15

103

NYC Surveillance Camera Project  

NSDL National Science Digital Library

These two sites focus on the increasing numbers of surveillance cameras in New York City. The first provides a .pdf-formatted map of the more than 2,300 camera locations throughout New York as well as text listings broken down by community. The information was compiled by volunteers from the New York Civil Liberties Union (NYCLU). In addition to information on camera locations, in the news section of the site, users will find links to related Websites, FAQs, and sites related to taxi cameras and traffic cameras. Both of these sites are unabashedly anti-surveillance technology and will be appreciated by New Yorkers concerned with civil liberties issues.

1998-01-01

104

Tower Camera Handbook  

SciTech Connect

The tower camera in Barrow provides hourly images of ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as some indication of present weather. Similarly, during spring time, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved in hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

Moudry, D

2005-01-01

105

Automatic camera tracking for remote manipulators  

SciTech Connect

The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.
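
A minimal sketch of the kinematic idea above: compose 4 x 4 homogeneous transforms to express the point of interest in the camera-positioner frame, then read off the pan and tilt angles needed to aim at it. The frame conventions and example transform below are assumptions, not the SM-229 kinematics.

import numpy as np

def aim_angles(T_world_cam: np.ndarray, p_world: np.ndarray):
    """T_world_cam: 4x4 transform of the camera-positioner base in world coordinates.
    p_world: target point (3,) in world coordinates. Returns (pan, tilt) in degrees,
    using x forward, y left, z up in the positioner frame (assumed convention)."""
    T_cam_world = np.linalg.inv(T_world_cam)
    p = T_cam_world @ np.append(p_world, 1.0)           # homogeneous transform into camera frame
    x, y, z = p[:3]
    pan = np.degrees(np.arctan2(y, x))
    tilt = np.degrees(np.arctan2(z, np.hypot(x, y)))
    return pan, tilt

# Camera base 2 m up, rotated 90 deg about z; manipulator end effector at (1.5, 0.5, 1.0) m.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
T = np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 2.0], [0, 0, 0, 1]])
print(aim_angles(T, np.array([1.5, 0.5, 1.0])))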

Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

1984-04-01

106

Automatic camera tracking for remote manipulators  

SciTech Connect

The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2-deg deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.

Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

1984-07-01

107

Practical Physics: Pinhole Camera and Lens Camera  

NSDL National Science Digital Library

This is a classroom lab for grades 6-12 designed to allow students to compare the action of a pinhole camera and a lens camera. It gives directions for setting up a carbon filament lamp as the light source, plus technical tips provided by a physicist for getting best results. Required materials include a +7D lens and the pinhole camera itself, which can be constructed by students (see below for link). This item is part of a much larger collection of physics/astronomy experiments, sponsored by the UK's Institute of Physics and funded by the Nuffield Curriculum Centre. SEE RELATED ITEMS ON THIS PAGE for a link to detailed instructions for building the pinhole camera box.

Centre, Nuffield C.

2009-05-27

108

Single-Camera Panoramic-Imaging Systems  

NASA Technical Reports Server (NTRS)

Panoramic detection systems (PDSs) are developmental video monitoring and image-data processing systems that, as their name indicates, acquire panoramic views. More specifically, a PDS acquires images from an approximately cylindrical field of view that surrounds an observation platform. The main subsystems and components of a basic PDS are a charge-coupled-device (CCD) video camera and lens, transfer optics, a panoramic imaging optic, a mounting cylinder, and an image-data-processing computer. The panoramic imaging optic is what makes it possible for the single video camera to image the complete cylindrical field of view; in order to image the same scene without the benefit of the panoramic imaging optic, it would be necessary to use multiple conventional video cameras, which have relatively narrow fields of view.

Lindner, Jeffrey L.; Gilbert, John

2007-01-01

109

Automated Camera Calibration  

NASA Technical Reports Server (NTRS)

Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.

Chen, Siqi; Cheng, Yang; Willson, Reg

2006-01-01

110

Diffusion-induced Ramsey narrowing  

E-print Network

A novel form of Ramsey narrowing is identified and characterized. For long-lived coherent atomic states coupled by laser fields, the diffusion of atoms in-and-out of the laser beam induces a spectral narrowing of the atomic resonance lineshape. Illustrative experiments and an intuitive analytical model are presented for this diffusion-induced Ramsey narrowing, which occurs commonly in optically-interrogated systems.

Yanhong Xiao; Irina Novikova; David F. Phillips; Ronald L. Walsworth

2005-07-19

111

Estimating Angles  

NSDL National Science Digital Library

This Flash game for one or two players gives students practice in estimating the size of angles. A circle and a radius pointing in a random direction are given. The student activates a second sweeping radius, which can move in either direction, and tries to stop it at the specified measure. Three difficulty levels control the range of angle measures. Points are awarded based on closeness of the estimate. The Teachers' Notes page includes suggestions for implementation, discussion questions, ideas for extension and support.

2007-06-01

112

REAL-TIME 3D SLAM WITH WIDE-ANGLE VISION  

E-print Network

REAL-TIME 3D SLAM WITH WIDE-ANGLE VISION. Andrew J. Davison, Yolanda González Cid, Nobuyuki Kita. Extract fragments: ... of single-camera SLAM is improved when wide-angle optics provide a field of view greater than the 40 to 50 ... frames per second), fully automatic implementation of 3D SLAM using a hand-waved wide-angle camera.

Davison, Andrew

113

UV Cameras for Volcanic Monitoring  

NASA Astrophysics Data System (ADS)

Levels of SO2 emission provide valuable information on the activity status of volcanic systems and are routinely used in hazard and risk assessment. A recent development in this field is UV camera technology, an effective and easy-to-use method for remote monitoring of volcanic emissions, which provides information across the full field of view and real-time analysis of equipment set-up and performance. This study, carried out on Stromboli, Italy, in July 2010, sought to explore the range of data available from this technique and improve issues relating to instrument calibration, building on the findings of Kantzas et al. (2010) and Kern et al. (2010). A 1 Hz passive and explosive degassing data set was obtained using a dual camera set-up, with filters focused on 310 nm and 330 nm wavelengths, in conjunction with a fixed-point USB2000 spectrometer. The cameras were initially calibrated using cells containing known values of SO2. During recording periods the adoption of a new rapid calibration protocol provided enhanced data quality whilst minimising monitoring down time. Data were analysed using an in-house-built LabVIEW VI routine (Tamburello et al. 2011). The ability to take multi-directional plume cross sections improved the accuracy of obliquely angled plume data, whilst enabling within-program measurement of plume speed. Explosive masses were also measured, with values obtained for both short-duration and prolonged release events. In addition to emitted SO2, the visual aspect of data sets enabled measurement and monitoring of ascent velocities, direction of ejection, plume collimation and changes between explosive types. Furthermore, flexibility within the post-processing set-up permitted concurrent analysis of passive and active degassing behaviours. Time shifting of plume traces to the start times of explosive events allowed the interplay between these two behaviours to be directly studied. This work demonstrates that UV cameras are versatile and a valuable contributor to the systematic study of volcanic degassing processes.

Tamburello, G.; Swanson, E.

2011-12-01

114

Microchannel plate streak camera  

DOEpatents

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

Wang, C.L.

1984-09-28

115

Microchannel plate streak camera  

DOEpatents

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

Wang, Ching L. (Livermore, CA)

1989-01-01

116

Microchannel plate streak camera  

DOEpatents

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

Wang, C.L.

1989-03-21

117

Control Architecture for a UAV-Mounted Pan/Tilt/Roll Camera Gimbal  

Microsoft Academic Search

This paper presents the architecture of a pan/tilt/roll camera control system implemented on Georgia Tech's UAV research helicopter, the GTMax. The controller currently has three operating modes: it can keep the camera at a fixed angle with respect to the helicopter, make the camera point in the direction of the helicopter velocity vector, or track a specific location.

Ole C. Jakobsen; Eric N. Johnson

118

FPA camera standardisation  

NASA Astrophysics Data System (ADS)

The temperature standardisation of an infrared camera is generally done with an internal black body. However, some cameras do not have such correction and some particular effects like Narcissus or other internal contributions disturb the measurements. The determination of the different contributions of the thermosignal given by the camera allows us to propose a procedure in order to obtain an absolute temperature with a precision of one degree.

Horny, N.

2003-04-01

119

Analytical multicollimator camera calibration  

USGS Publications Warehouse

Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.

Tayman, W. P.

1978-01-01

120

Digital Electronic Still Camera  

NASA Technical Reports Server (NTRS)

Digital electronic still camera part of electronic recording, processing, transmitting, and displaying system. Removable hard-disk drive in camera serves as digital electronic equivalent of photographic film. Images viewed, analyzed, or transmitted quickly. Camera takes images of nearly photographic quality and stores them in digital form. Portable, hand-held, battery-powered unit designed for scientific use. Camera used in conjunction with playback unit also serving as transmitting unit if images sent to remote station. Remote station equipped to store, process, and display images. Digital image data encoded with error-correcting code at playback/transmitting unit for error-free transmission to remote station.

Holland, Samuel D.; Yeates, Herbert D.

1993-01-01

121

LSST Camera Optics Design  

SciTech Connect

The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

2012-05-24

122

Angle detector  

NASA Technical Reports Server (NTRS)

An angle detector for determining a transducer's angular disposition to a capacitive pickup element is described. The transducer comprises a pendulum mounted inductive element moving past the capacitive pickup element. The capacitive pickup element divides the inductive element into two parts L sub 1 and L sub 2 which form the arms of one side of an a-c bridge. Two networks R sub 1 and R sub 2 having a plurality of binary weighted resistors and an equal number of digitally controlled switches for removing resistors from the networks form the arms of the other side of the a-c bridge. A binary counter, controlled by a phase detector, balances the bridge by adjusting the resistance of R sub 1 and R sub 2. The binary output of the counter is representative of the angle.
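
To make the balancing principle concrete, the following minimal Python sketch (not taken from the patent; the function name, the 10-bit resolution and the example split are illustrative assumptions) searches for the counter value at which the digitally switched resistor networks R1 and R2 match the inductance ratio L1/L2 set by the pendulum, which is the condition for bridge balance:

    def balance_bridge(l1, l2, bits=10):
        """Return the counter value that best balances an L1/L2 vs R1/R2 a-c bridge."""
        total = (1 << bits) - 1              # full-scale counter value
        best_count, best_error = 0, float("inf")
        for count in range(total + 1):
            r1 = count                        # resistance switched into one network
            r2 = total - count                # complementary resistance in the other
            # Balanced when L1/L2 == R1/R2, i.e. L1*R2 - L2*R1 == 0.
            error = abs(l1 * r2 - l2 * r1)
            if error < best_error:
                best_count, best_error = count, error
        return best_count

    # Example: the pendulum splits the inductive element 30% / 70%.
    print(balance_bridge(0.3, 0.7))           # counter output is proportional to the angle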

Parra, G. T. (inventor)

1978-01-01

123

New developments to improve SO2 cameras  

NASA Astrophysics Data System (ADS)

The SO2 camera is a remote sensing instrument that measures the two-dimensional distribution of SO2 column densities in volcanic plumes using scattered solar radiation as a light source. From these data SO2 fluxes can be derived. The high time resolution, of the order of 1 Hz, allows correlating SO2 flux measurements with other traditional volcanological measurement techniques, e.g., seismology. In recent years the application of SO2 cameras has increased; however, there is still potential to improve the instrumentation. First of all, the influence of aerosols and ash in the volcanic plume can lead to large errors in the calculated SO2 flux if not accounted for. We present two different concepts to deal with the influence of ash and aerosols. The first approach uses a co-axial DOAS system that was added to a two-filter SO2 camera. The camera used Filter A (peak transmission centred around 315 nm) to measure the optical density of SO2 and Filter B (centred around 330 nm) to correct for the influence of ash and aerosol. The DOAS system simultaneously performs spectroscopic measurements in a small area of the camera's field of view and gives additional information to correct for these effects. Comparing the optical densities for the two filters with the SO2 column density from the DOAS allows not only a much more precise calibration, but also conclusions to be drawn about the influence of ash and aerosol scattering. Measurement examples from Popocatépetl, Mexico, in 2011 are shown and interpreted. Another approach combines the SO2 camera measurement principle with the extremely narrow and periodic transmission of a Fabry-Pérot interferometer. The narrow transmission window allows individual SO2 absorption bands (or series of bands) to be selected as a substitute for Filter A. Measurements are therefore more selective to SO2. Instead of using Filter B, as in classical SO2 cameras, the correction for aerosol can be performed by shifting the transmission window of the Fabry-Pérot interferometer towards the SO2 absorption cross-section minima. A correction of ash and aerosol influences with this technique can decrease the deviation from the true column by more than 60%, since the wavelength difference between the two measurement channels is much smaller than in classical SO2 cameras. While the implementation of this approach for a 2D camera encompasses many challenges, it offers the possibility of building a relatively simple and robust scanning instrument for volcanic SO2 distributions. A second problem of the SO2 camera technique is the relatively high price, which prevents its use in many volcano observatories in developing countries. Most SO2 cameras use CCDs that were originally designed for astronomical purposes. The large pixel size and low noise of these detectors compensate for the low intensity of solar radiation in the UV and the low quantum efficiency of the detector in this spectral range. However, the detectors used cost several thousand US dollars. We present results from test measurements using a consumer DSLR camera as the detector of an SO2 camera. Since the camera is not sensitive in the UV, the incoming radiation is first imaged onto a screen that is covered with a suitable fluorescent dye converting the UV radiation to visible light.
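
As a rough numerical companion to the two-filter principle described above (a sketch only, not the authors' code; the array shapes, the synthetic noise and the calibration slope are made-up assumptions), the aerosol-corrected apparent absorbance and its conversion to an SO2 column density could look like this in Python:

    import numpy as np

    def apparent_absorbance(plume, background):
        """tau = -ln(I / I0), computed pixel-wise for one filter."""
        return -np.log(plume / background)

    def so2_column(plume_a, bg_a, plume_b, bg_b, cal_slope):
        """Aerosol-corrected SO2 column density (units set by the calibration slope)."""
        tau = apparent_absorbance(plume_a, bg_a) - apparent_absorbance(plume_b, bg_b)
        return cal_slope * tau

    # cal_slope would come from imaging cells of known SO2 column density and
    # fitting column vs. measured (tau_A - tau_B); the value below is made up.
    rng = np.random.default_rng(0)
    bg = np.full((4, 4), 1000.0)
    plume_a = bg * np.exp(-0.2) + rng.normal(0, 1, (4, 4))     # SO2-sensitive filter
    plume_b = bg * np.exp(-0.05) + rng.normal(0, 1, (4, 4))    # aerosol reference filter
    print(so2_column(plume_a, bg, plume_b, bg, cal_slope=2.5e18))   # molecules/cm^2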

Luebcke, P.; Bobrowski, N.; Hoermann, C.; Kern, C.; Klein, A.; Kuhn, J.; Vogel, L.; Platt, U.

2012-12-01

124

Improved Tracking of Targets by Cameras on a Mars Rover  

NASA Technical Reports Server (NTRS)

A paper describes a method devised to increase the robustness and accuracy of tracking of targets by means of three stereoscopic pairs of video cameras on a Mars-rover-type exploratory robotic vehicle. Two of the camera pairs are mounted on a mast that can be adjusted in pan and tilt; the third camera pair is mounted on the main vehicle body. Elements of the method include a mast calibration, a camera-pointing algorithm, and a purely geometric technique for handing off tracking between different camera pairs at critical distances as the rover approaches a target of interest. The mast calibration is an extension of camera calibration in which the camera images of calibration targets at known positions are collected at various pan and tilt angles. In the camera-pointing algorithm, pan and tilt angles are computed by a closed-form, non-iterative solution of the inverse kinematics of the mast combined with mathematical models of the cameras. The purely geometric camera-handoff technique involves the use of stereoscopic views of a target of interest in conjunction with the mast calibration.
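
For illustration only, here is a heavily simplified Python sketch of the closed-form pointing idea (not the paper's algorithm, which also models the mast kinematic offsets and camera calibration; here the pan and tilt axes are assumed to intersect at the origin of the mast frame):

    import math

    def pan_tilt_to_target(x, y, z):
        """Return (pan, tilt) in radians that aim the boresight at target (x, y, z)."""
        pan = math.atan2(y, x)                      # rotation about the vertical axis
        tilt = math.atan2(z, math.hypot(x, y))      # elevation above the horizontal plane
        return pan, tilt

    # Example: a target 5 m ahead, 2 m to the left, 1 m below the mast head.
    pan, tilt = pan_tilt_to_target(5.0, 2.0, -1.0)
    print(math.degrees(pan), math.degrees(tilt))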

Kim, Won; Ansar, Adnan; Steele, Robert

2007-01-01

125

Calibration of cameras with radially symmetric distortion.  

PubMed

We present algorithms for plane-based calibration of general radially distorted cameras. By this, we understand cameras that have a distortion center and an optical axis such that the projection rays of pixels lying on a circle centered on the distortion center form a right viewing cone centered on the optical axis. The camera is said to have a single viewpoint (SVP) if all such viewing cones have the same apex (the optical center); otherwise, we speak of NSVP cases. This model encompasses the classical radial distortion model [5], fisheyes, and most central or noncentral catadioptric cameras. Calibration consists in the estimation of the distortion center, the opening angles of all viewing cones, and their optical centers. We present two approaches for computing a full calibration from dense correspondences of a single or multiple planes with known Euclidean structure. The first is based on a geometric constraint linking viewing cones and their intersections with the calibration plane (conic sections). The second approach is a homography-based method. Experiments using simulated and a broad variety of real cameras show great stability. Furthermore, we provide a comparison with Hartley-Kang's algorithm [12], which shows similar performance but cannot handle such a broad variety of camera configurations. PMID:19574618

Tardif, Jean-Philippe; Sturm, Peter; Trudeau, Martin; Roy, Sébastien

2009-09-01

126

Constrained space camera assembly  

DOEpatents

A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

1999-05-11

127

Magellan Instant Camera testbed  

E-print Network

The Magellan Instant Camera (MagIC) is an optical CCD camera that was built at MIT and is currently used at Las Campanas Observatory (LCO) in La Serena, Chile. It is designed to be both simple and efficient with minimal ...

McEwen, Heather K. (Heather Kristine), 1982-

2004-01-01

128

Cameras Everywhere  

E-print Network

I propose that every public area be covered by at least one camera, and that the scenes viewed by all cameras be recorded. When a crime is reported, the police can see who left the crime scene and follow their recorded movements from camera to camera; likewise, the recorded movements of the people at a crime scene can be followed backward from the crime, leading up to it.

Hehner, Eric C.R.

129

Cameras in mobile phones  

NASA Astrophysics Data System (ADS)

One of the fastest growing consumer markets today is camera phones. During the past few years total volume has grown quickly, and today millions of mobile phones with cameras are sold. At the same time the resolution and functionality of the cameras have been growing from CIF towards DSC level. From the camera point of view the mobile world is an extremely challenging field. Cameras should have good image quality but in a small size. They also need to be reliable and their construction should be suitable for mass manufacturing. All components of the imaging chain should be well optimized in this environment. Image quality and usability are the most important parameters to the user. The current trend of adding more megapixels to cameras while at the same time using smaller pixels affects both. On the other hand, reliability and miniaturization are key drivers for product development, as is cost. In an optimized solution all parameters are in balance, but the process of finding the right trade-offs is not an easy task. In this paper trade-offs related to optics and their effects on the image quality and usability of cameras are discussed. Key development areas from the mobile phone camera point of view are also listed.

Nummela, Ville; Viinikanoja, Jarkko; Alakarhu, Juha

2006-04-01

130

CCD Luminescence Camera  

NASA Technical Reports Server (NTRS)

New diagnostic tool used to understand performance and failures of microelectronic devices. Microscope integrated to low-noise charge-coupled-device (CCD) camera to produce new instrument for analyzing performance and failures of microelectronics devices that emit infrared light during operation. CCD camera also used to identify very clearly parts that have failed, where luminescence typically found.

Janesick, James R.; Elliott, Tom

1987-01-01

131

Cameras for Astronomy  

Microsoft Academic Search

This chapter describes the cameras most commonly used for astronomical photography. The main component in these cameras is a concave mirror S that forms the image of an object at infinity on its focal plane. The image formed by such a system usually exhibits aberrations that are too large to be acceptable.

Antonio Romano

132

Dry imaging cameras  

PubMed Central

Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes in diverse areas of science such as computing, mechanics, thermal science, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

2011-01-01

133

Everywhere-in-focus image fusion using controllable cameras  

Microsoft Academic Search

Imaging parameters such as focus strongly influence data quality and the performance of content extraction techniques. Narrow depth of field gives clear focus but only over a short range of depths. This paper shows results from an algorithm that uses computer-controlled focus and pan camera movement in order to obtain a scene image that is a composite which is in

W. Brent Seales; Sandeep Dutta

1996-01-01

134

Constrained Optimization for Retinal Curvature Estimation Using an Affine Camera  

E-print Network

Several elements are involved in the imaging process, including an actual fundus camera, a digital camera, and the human cornea, all of which cause significant non-linear distortions in the retinal images. An affine camera model is adopted in part because NIH's retinal imaging protocols specify a narrow 30-degree field-of-view in each eye. In this work, we develop a new

Fan, Guoliang

135

Educational Applications for Digital Cameras.  

ERIC Educational Resources Information Center

Discusses uses of digital cameras in education. Highlights include advantages and disadvantages, digital photography assignments and activities, camera features and operation, applications for digital images, accessory equipment, and comparisons between digital cameras and other digitizers. (AEF)

Cavanaugh, Terence; Cavanaugh, Catherine

1997-01-01

136

Streak camera time calibration procedures  

NASA Technical Reports Server (NTRS)

Time calibration procedures for streak cameras utilizing a modulated laser beam are described. The time calibration determines a writing rate accuracy of 0.15% with a rotating mirror camera and 0.3% with an image converter camera.

Long, J.; Jackson, I.

1978-01-01

137

Operating Manual CCD Camera Models  

E-print Network

Operating Manual for CCD Camera Models ST-7E, ST-8E, ST-9E, ST-10E and ST-1001E (Santa Barbara Instrument Group). The indexed excerpt is table-of-contents material, including sections on the CCD camera and an introduction to CCD cameras.

Natelson, Douglas

138

Helicopter rotors pyramid angle measurement based on CMOS technology  

Microsoft Academic Search

Based on computer image processing, a new method for measuring helicopter rotor pyramid angle with a CMOS camera is given. The paper gives an overall design of the system, compares image results for different camera locations, and analyzes the system's qualitative error. A synchronization system driven by an external synchronization pulse is designed in order to capture accurate images, and the blade height differences are calculated by filtering

Jiang Mai; Cai Cheng-Tao; Zhu Qi-Dan; Shi Zhen

2009-01-01

139

Night Vision Camera  

NASA Technical Reports Server (NTRS)

PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

1996-01-01

140

Comparing Cosmic Cameras  

NSDL National Science Digital Library

Learners will take and then compare images taken by a camera to learn about focal length (and its effect on field of view), resolution, and ultimately how cameras take close-up pictures of faraway objects. Finally, they will apply this knowledge to the images of comet Tempel 1 taken by two different spacecraft with three different cameras, in this case Deep Impact and those expected/obtained from Stardust-NExT. This lesson could easily be adapted for use with other NASA missions.
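
For readers who want a numeric version of the focal-length/field-of-view relationship the activity explores, here is a small illustrative Python sketch (the 36 mm detector width and the focal lengths are made-up example values, not part of the lesson materials):

    import math

    def field_of_view_deg(detector_width_mm, focal_length_mm):
        """Angular field of view across the detector, in degrees."""
        return math.degrees(2.0 * math.atan(detector_width_mm / (2.0 * focal_length_mm)))

    # Longer focal length -> narrower field of view -> more "zoomed in" image.
    for f in (12.0, 105.0, 2100.0):
        print(f"f = {f:6.0f} mm  ->  FOV = {field_of_view_deg(36.0, f):6.2f} deg")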

141

Calibration of action cameras for photogrammetric purposes.  

PubMed

The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide-angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
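
A hedged Python sketch of what such an OpenCV-based chessboard self-calibration typically looks like is shown below; the folder name, board size, and the use of the standard pinhole-plus-distortion model are assumptions rather than details of the authors' software, and a strongly distorting action-camera lens may be better handled by OpenCV's fisheye routines:

    import glob
    import cv2
    import numpy as np

    BOARD = (9, 6)                                   # inner corners per row/column (assumed)
    objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

    obj_points, img_points, image_size = [], [], None
    for path in glob.glob("calib_frames/*.jpg"):     # hypothetical image folder
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, BOARD)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # Estimate the camera matrix and distortion coefficients, then undistort one frame.
    rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
    print("RMS reprojection error:", rms)
    undistorted = cv2.undistort(cv2.imread("calib_frames/frame_000.jpg"), K, dist)
    cv2.imwrite("undistorted.jpg", undistorted)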

Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

2014-01-01

142

Calibration of Action Cameras for Photogrammetric Purposes  

PubMed Central

The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide-angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898

Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

2014-01-01

143

Pinhole Camera Model  

NSDL National Science Digital Library

The Pinhole Camera Model demonstrates the operation of a pinhole camera. Light rays leaving the top and bottom of an object of height h pass through a pinhole and strike a flat screen. These rays travel in straight lines in accord with the principles of geometric optics. Drag the object and observe the image on the camera screen. Simple geometry shows that the image is inverted and that the ratio of image size to object size (the magnification) is the same as the ratio of image distance to object distance. The Pinhole Camera Model was developed using the Easy Java Simulations (EJS) modeling tool. It is distributed as a ready-to-run (compiled) Java archive. Double clicking the jar file will run the program if Java is installed. You can modify this simulation if you have EJS installed by right-clicking within the map and selecting "Open Ejs Model" from the pop-up menu item.
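
A short numeric companion to the geometry stated above (illustrative only; the function name and the example numbers are not part of the simulation):

    def pinhole_image_height(object_height, object_distance, image_distance):
        """Signed image height; the negative sign encodes the inversion."""
        magnification = image_distance / object_distance
        return -magnification * object_height

    # Example: a 2.0 m tall object 10 m in front of the pinhole, screen 0.1 m behind it.
    print(pinhole_image_height(2.0, 10.0, 0.1))   # -0.02 m, i.e. 2 cm tall and inverted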

Christian, Wolfgang

2012-04-20

144

Thermal Camera Pictures  

NSDL National Science Digital Library

This page includes pictures taken with an infrared camera. Though the pictures include very little explanation, they do demonstrate some properties of light, including reflection, as well as conduction of heat.

Falstad, Paul

2004-11-28

145

Metric camera experiment  

NASA Technical Reports Server (NTRS)

A test of the mapping capabilities of high resolution space photography taken at the resolution limit of image motion on large film format is planned. The metric camera system and its planned operation are described.

Reynolds, M.

1981-01-01

146

Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe  

NASA Astrophysics Data System (ADS)

Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has been previously observed that leaf inclination angle might change gradually from more vertical in the upper canopy and in high light habitats to more horizontal in the lower canopy and in low light habitats [1]. Despite its importance, relatively few measurements of actual leaf angle distributions have been reported for different tree species. An even smaller number of studies has dealt with the possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights of the trees. We report leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with the data collected at several separate heights and four times during the period of May-September 2013. The results generally indicate the greatest change in leaf inclination angles in spring, with the changes usually being most pronounced at the top of the canopy. It should also be noted, however, that whereas the temporal variation proved to be rather consistent for different species, the vertical variation differed more between species. The leveled camera method was additionally tested in terms of sensitivity to different users. Ten people were asked to measure the leaf angles for four different species. The results indicate the method is quite robust in providing coinciding distributions irrespective of the user and level of previous experience with the method. However, certain caution must be exercised when measuring long narrow leaves. References [1] G.G. McMillen, and J.H. McClendon, "Leaf angle: an adaptive feature of sun and shade leaves," Botanical Gazette, vol. 140, pp. 437-442, 1979. [2] J. Pisek, O. Sonnentag, A.D. Richardson, and M. Mõttus, "Is the spherical leaf inclination angle distribution a valid assumption for temperate and boreal broadleaf tree species?" Agricultural and Forest Meteorology, vol. 169, pp. 186-194, 2013. [3] Y. Ryu, O. Sonnentag, T. Nilson, R. Vargas, H. Kobayashi, R. Wenk, and D. Baldocchi, "How to quantify tree leaf area index in a heterogenous savanna ecosystem: a multi-instrument and multimodel approach," Agricultural and Forest Meteorology, vol. 150, pp. 63-76, 2010.

Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

2014-05-01

147

Cardboard Box Camera Obscura  

NSDL National Science Digital Library

In this activity, learners construct a device that projects images onto a surface, so they can trace landscapes and other sights. The cardboard box camera captures the images using a mirror, holder, and lens, which work together like a simple camera. Use this activity to introduce learners to concepts related to optics, light, lenses, and mirrors. Note: this activity requires the use of a drill and saw, not included in the cost of materials.

Centers, Oakland D.

2012-01-01

148

The Star Formation Camera  

Microsoft Academic Search

The Star Formation Camera (SFC) is a wide-field (~15'x19', >280 arcmin^2), high-resolution (18x18 mas pixels) UV/optical dichroic camera designed for the Theia 4-m space-borne telescope concept. SFC will deliver diffraction-limited images at lambda > 300 nm in both a blue (190-517 nm) and a red (517-1075 nm) channel simultaneously. Our aim is to conduct a comprehensive and systematic study of the

Paul A. Scowen; Rolf Jansen; Matthew Beasley; Daniela Calzetti; Steven Desch; Alex Fullerton; John Gallagher; Doug Lisman; Steve Macenka; Sangeeta Malhotra; Mark McCaughrean; Shouleh Nikzad; Robert O'Connell; Sally Oey; Deborah Padgett; James Rhoads; Aki Roberge; Oswald Siegmund; Stuart Shaklan; Nathan Smith; Daniel Stern; Jason Tumlinson; Rogier Windhorst; Robert Woodruff

2009-01-01

149

Aircraft Altitude Estimation Using Un-calibrated Onboard Cameras  

NASA Astrophysics Data System (ADS)

In the present study, aircraft altitude estimation using an un-calibrated onboard camera is implemented and studied. A camera model has been implemented to simulate the test data. From the results, it was observed that the rounding of pixel coordinates creates fluctuations around the true vanishing point (VP) angle and height computations. These fluctuations were smoothed using a Kalman filter based state estimator. The effects of camera tilt and focal length on the VP angle and height computations were also studied. It is concluded that the camera should be perpendicular to the runway for the focal length to have no effect on the height computation. It is planned to apply this algorithm to real-time imaging data along with Integrated Enhanced Synthetic Vision (IESVS) on the HANSA aircraft.
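
Since the abstract does not give the filter's details, the following is only a generic scalar Kalman-filter sketch of the smoothing step (the noise variances and height values are made up), illustrating how pixel-rounding fluctuations around a roughly constant height can be damped:

    def kalman_smooth(measurements, process_var=0.01, meas_var=4.0):
        """Smooth a noisy sequence of height estimates with a scalar Kalman filter."""
        estimate, error = measurements[0], 1.0
        smoothed = []
        for z in measurements:
            error += process_var                      # predict: uncertainty grows slightly
            gain = error / (error + meas_var)         # update: blend prediction with measurement
            estimate += gain * (z - estimate)
            error *= (1.0 - gain)
            smoothed.append(estimate)
        return smoothed

    noisy_heights = [102.0, 98.5, 101.2, 99.0, 100.8, 99.6, 100.3]   # metres, illustrative
    print(kalman_smooth(noisy_heights))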

Naidu, V. P. S.; Mukherjee, J.

2012-10-01

150

Fire Surveillance System Using an Omnidirectional Camera for Remote Monitoring  

Microsoft Academic Search

This paper proposes a new video-based fire surveillance and remote monitoring system for real-life application. Most previous video-based fire detection systems using color information and temporal variations of pixels produce frequent false alarms due to the use of many heuristic features. In addition, they need several cameras to overcome the dead-angle problem of a normal CCD camera. Thus, to overcome these

ByoungChul Ko; Hyun-Jae Hwang; In-Gyu Lee; Jae-Yeal Nam

2008-01-01

151

REAL-TIME 3D SLAM WITH WIDE-ANGLE VISION  

Microsoft Academic Search

The performance of single-camera SLAM is improved when wide-angle optics provide a field of view greater than the 40 to 50 degree lenses normally used in computer vision. The issue is one of feature contact: each landmark object mapped remains visible through a larger range of camera motion, meaning that feature density can be reduced and camera movement range can

Andrew J. Davison; Nobuyuki Kita

2004-01-01

152

Image dissector camera system study  

NASA Technical Reports Server (NTRS)

Various aspects of a rendezvous and docking system using an image dissector detector as compared to a GaAs detector were discussed. Investigation into a gimbled scanning system is also covered and the measured video response curves from the image dissector camera are presented. Rendezvous will occur at ranges greater than 100 meters. The maximum range considered was 1000 meters. During docking, the range, range-rate, angle, and angle-rate to each reflector on the satellite must be measured. Docking range will be from 3 to 100 meters. The system consists of a CW laser diode transmitter and an image dissector receiver. The transmitter beam is amplitude modulated with three sine wave tones for ranging. The beam is coaxially combined with the receiver beam. Mechanical deflection of the transmitter beam, + or - 10 degrees in both X and Y, can be accomplished before or after it is combined with the receiver beam. The receiver will have a field-of-view (FOV) of 20 degrees and an instantaneous field-of-view (IFOV) of two milliradians (mrad) and will be electronically scanned in the image dissector. The increase in performance obtained from the GaAs photocathode is not needed to meet the present performance requirements.

Howell, L.

1984-01-01

153

Spectral narrowing via quantum coherence  

SciTech Connect

We have studied the transmission through an optically thick {sup 87}Rb vapor that is illuminated by monochromatic and noise-broadened laser fields in {lambda} configuration. The spectral width of the beat signal between the two fields after transmission through the atomic medium is more than 1000 times narrower than the spectral width of this signal before the medium.

Mikhailov, Eugeniy E.; Rostovtsev, Yuri V.; Zhang Aihua; Welch, George R. [Department of Physics and Institute of Quantum Studies, Texas A and M University, College Station, Texas 77843-4242 (United States); Sautenkov, Vladimir A. [Department of Physics and Institute of Quantum Studies, Texas A and M University, College Station, Texas 77843-4242 (United States); P. N. Lebedev Institute of Physics, 119991 Moscow (Russian Federation); Zubairy, M. Suhail [Department of Physics and Institute of Quantum Studies, Texas A and M University, College Station, Texas 77843-4242 (United States); Department of Electronics, Quaid-i-Azam University, Islamabad (Pakistan); Scully, Marlan O. [Department of Physics and Institute of Quantum Studies, Texas A and M University, College Station, Texas 77843-4242 (United States); Department of Chemistry, Princeton University, Princeton, New Jersey 08544 (United States)

2006-07-15

154

Spectral Narrowing via Quantum Coherence  

E-print Network

We have studied the transmission of an optically thick Rb vapor that is illuminated by monochromatic and noise-broadened laser fields in Lambda configuration. The spectral width of the beat signal between the two fields after transmission through the atomic medium is more than 1000 times narrower than the spectral width of this signal before the medium.

Eugeniy E. Mikhailov; Vladimir A. Sautenkov; Yuri V. Rostovtsev; Aihua Zhang; M. Suhail Zubairy; Marlan O. Scully; George R. Welch

2005-03-08

155

Laboratory geometric calibration of areal digital aerial camera  

NASA Astrophysics Data System (ADS)

A digital aerial camera is a non-metric camera. Geometric calibration, including the determination of the interior orientation elements and distortion parameters, is the basis of high-precision photogrammetry. In this paper, a laboratory geometric calibration system for areal digital aerial cameras is developed. This system uses a collimator and a star tester as the target generator. After measurement of the coordinates of targets on the CCD plane and the corresponding angles of the parallel light, the geometric calibration of a digital aerial camera can be realized according to the geometric calibration model of this paper. Geometric calibration experiments are carried out with this system using two kinds of mainstream digital aerial cameras, the Canon EOS 5D Mark II and the Hasselblad H3D. Experiment results show that this system can satisfy the calibration requirements of aerial photogrammetric applications and prove the correctness and reliability of this calibration method.
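
As a deliberately simplified, one-dimensional illustration of the idea behind such collimator-based calibration (not the paper's actual model, which also handles distortion parameters and both image axes), a collimated target arriving at angle theta images at x = x0 + f*tan(theta), so measured (theta, x) pairs yield the principal-point offset x0 and the focal length f by linear least squares:

    import numpy as np

    def fit_interior_orientation(angles_deg, x_mm):
        """Fit x = x0 + f*tan(theta) to measured target positions (illustrative 1D model)."""
        t = np.tan(np.radians(angles_deg))
        A = np.column_stack([np.ones_like(t), t])       # unknowns: [x0, f]
        (x0, f), *_ = np.linalg.lstsq(A, np.asarray(x_mm), rcond=None)
        return x0, f

    # Synthetic example: f = 50 mm, principal-point offset 0.2 mm, small measurement noise.
    angles = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])
    coords = 0.2 + 50.0 * np.tan(np.radians(angles)) + np.random.normal(0, 0.01, 5)
    print(fit_interior_orientation(angles, coords))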

Yuan, F.; Qi, W. J.; Fang, A. P.

2014-03-01

156

Embedded Smart Camera Performance Analysis  

Microsoft Academic Search

Increasingly powerful integrated circuits are making an entire range of new applications possible. Recent technological advances are enabling a new generation of smart cameras that represent a quantum leap in sophistication. While today's digital cameras capture images, smart cameras capture high-level descriptions of the scene and analyze what they see. A smart camera combines video sensing, high-level video processing and

N. F. Kahar; R. B. Ahmad; Z. Hussin; A. N. C. Rosli

2009-01-01

157

Digital Camera Find out which  

E-print Network

Researchers at the University of Illinois at Urbana-Champaign have recently built a spherical camera that follows the form and function of a human eye by mimicking the retina of a human eye. Although the camera's resolution is only 256 pixels, the researchers... TFOT recently covered a camera based on the human eye as well as a magnet-controlled camera developed

Rogers, John A.

158

3-D Flow Visualization with a Light-field Camera  

NASA Astrophysics Data System (ADS)

Light-field cameras have received attention recently due to their ability to acquire photographs that can be computationally refocused after they have been acquired. In this work, we describe the development of a light-field camera system for 3D visualization of turbulent flows. The camera developed in our lab, also known as a plenoptic camera, uses an array of microlenses mounted next to an image sensor to resolve both the position and angle of light rays incident upon the camera. For flow visualization, the flow field is seeded with small particles that follow the fluid's motion and are imaged using the camera and a pulsed light source. The tomographic MART algorithm is then applied to the light-field data in order to reconstruct a 3D volume of the instantaneous particle field. 3D, 3C velocity vectors are then determined from a pair of 3D particle fields using conventional cross-correlation algorithms. As an illustration of the concept, 3D/3C velocity measurements of a turbulent boundary layer produced on the wall of a conventional wind tunnel are presented. Future experiments are planned to use the camera to study the influence of wall permeability on the 3-D structure of the turbulent boundary layer. (Figure captions: schematic illustrating the concept of a plenoptic camera, where each pixel represents both the position and angle of light rays entering the camera, information that can be used to computationally refocus an image after it has been acquired; instantaneous 3D velocity field of a turbulent boundary layer determined using light-field data captured by a plenoptic camera.)

Thurow, B.

2012-12-01

159

Satellite camera image navigation  

NASA Technical Reports Server (NTRS)

Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66 ) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

1987-01-01

160

Narrow-Band Processing and Fusion Approach for Explosive Hazard Detection in FLGPR  

E-print Network

A narrow-band processing and fusion algorithm is presented for a forward-looking ground-penetrating radar (FLGPR). One challenge for threat detection using FLGPR is ... Keywords: forward-looking explosive hazards detection, ground-penetrating radar, narrow-band processing. Sensing technologies investigated for explosive hazard detection have included ground-penetrating radar (GPR), infrared (IR) cameras, and acoustic technologies [1-3]. Both handheld

Havens, Timothy

161

Divergent-ray projection method for measuring the flapping angle, lag angle, and torsional angle of a bumblebee wing  

NASA Astrophysics Data System (ADS)

A divergent-ray projection (DRP) method was developed for measuring the flapping angle, lag angle, and torsional angle of a bumblebee wing during beating motion. This new method can measure the spatial coordinates of an insect wing by digitizing the images that are projected by two divergent laser rays from different directions. The advantage of the DRP method is its ability to measure those three angles simultaneously using only one high-speed camera. The resolution of the DRP method can be changed easily by adjusting system parameters to meet the needs of different types of objects. The measurement results for these angles of a bumblebee wing prove the effectiveness of the DRP method in studying the flight performance of insects.

Zeng, Lijiang; Matsumoto, Hirokazu; Kawachi, Keiji

1996-11-01

162

Solid state television camera  

NASA Technical Reports Server (NTRS)

The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

1976-01-01

163

[Drugs and closed-angle glaucoma risk].  

PubMed

Closed-angle glaucomas arise among predisposed patients (narrow iridocorneal angle) in response to various stimuli. Most of the attacks are of iatrogenic origin: all the topical and systemic mydriatic drugs can provoke an angle-closure glaucoma attack. Active ingredients that are dangerous with closed-angle glaucoma are substances with anticholinergic activity (peripheral action, central action, with anticholinergic side-effects), active ingredients with sympathomimetic alpha activity (alpha 1, alpha and beta with indirect effects), and active ingredients with parasympathomimetic activity (anticholinesterases). The proprietary medicines, whether or not they are included in the French dictionary Vidal®, are classified according to the administration route and their different indications. The closed-angle glaucoma risk after administration of these drugs is noted under the 'contraindications' and 'precautions' items in the summary of product characteristics enclosed in the marketing authorization. PMID:11965126

Pozzi, D; Giraud, C; Callanquin, M

2002-01-01

164

Cameras in the courtroom  

Microsoft Academic Search

The present experiment examined some of the key psychological issues associated with electronic media coverage (EMC) of courtroom trials. Undergraduate student subjects served as either witnesses or jurors in one of three types of trials: EMC, in which a video camera was present; conventional media coverage (CMC), in which a journalist was present; or a no-media control, in which no media representative or equipment

Eugene Borgida; Kenneth G. DeBono; Lee A. Buckmant

1990-01-01

165

Jack & the Video Camera  

ERIC Educational Resources Information Center

This article narrates how the use of a video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his own world, alone. Functionally non-verbal and with motor skill problems that affected his…

Charlan, Nathan

2010-01-01

166

The LSST Camera Overview  

SciTech Connect

The LSST camera is a wide-field optical (0.35-1 um) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

2007-01-10

167

Profit by Cameras.  

National Technical Information Service (NTIS)

By using cameras, processors can increase immediate-profit goals more quickly and can plan for better long-range benefits. The authors demonstrate how super-8 motion picture analyst projectors define and measure current sea-food processing operations. Act...

W. F. Engesser, R. Conrads, D. Cheung, W. Mercer

1973-01-01

168

The Martian Atmosphere as seen by the OSIRIS camera  

NASA Astrophysics Data System (ADS)

Despite the long time that has passed since the observations, only a few studies based on the data from the wide- (WAC) and narrow- (NAC) angle camera systems of OSIRIS have been published to date. In this paper we will present the results of the observations of the Martian limbs acquired by the OSIRIS [1] instrument on board the ESA mission Rosetta during its swing-by maneuver around February 25th, 2007, on the way to Comet 67P/Churyumov-Gerasimenko, during the onset of the very active dust storm season of Mars year 28 (at Ls ~190). Although OSIRIS only captured the planet during a relatively short time interval of several hours, the obtained global view and the spectral coverage, from the UV (245 nm) over the full visible range to the near IR (1000 nm), allow for a valuable global overview of the state of the Martian atmosphere. The image acquisition started on February 24 around 18:00 UTC from a distance of about 260,000 km and continued until 04:51 UTC on February 25 to a distance of 105,000 km, with the closest approach to the planet at 01:54 UTC on February 25 at a distance of 250 km. All images have been manually co-registered with the help of SPICE data, and vertical profiles have been extracted over the limb in intervals of ~0.5 degrees (see Figures 1 and 2). In this work we will focus on our findings about the vertical structure of the atmosphere over the Martian limbs, report on the observed altitudes and optical densities of dust and (partially detached) clouds, and put the findings in context with data from other satellites in orbit around Mars at the same time (e.g. Mars Express). Based on previous datasets (MGS/TES, MOd/THEMIS, MRO/MCS, see, e.g., [2], [3] and [4]) we can expect to observe the waning of the South polar hood and the development of the Northern one. Some remains of the aphelion cloud belt might still be visible near the equator. Detached layers have been recently observed at this season by MEx/SPICAM [5] and MRO/MCS [6].

Moissl, R.; Pajola, M.; Määttänen, A.; Küppers, M.

2013-09-01

169

Image Sensors Enhance Camera Technologies  

NASA Technical Reports Server (NTRS)

In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

2010-01-01

170

Camera motion estimation using normal flows  

NASA Astrophysics Data System (ADS)

An autonomous system must have the capability of estimating or controlling its own motion parameters. Tens of research works already exist to fulfill this task; however, most of them are based on establishing motion correspondences or estimating full optical flow. These solutions place restrictions on the scene: either enough distinct features must be present, or there must be dense texture. Different from the traditional works, and utilizing no motion correspondences or epipolar geometry, we start from the normal flow data and ensure good use of every piece of it, because normal flows may be only sparsely available. We apply the spherical image model to avoid ambiguity in describing the camera motion. Since each normal flow gives a locus for the location of the camera motion, the intersection of such loci offered by different data points will narrow the possibilities for the camera motion and even pinpoint it. A voting scheme in the ?-? domain is applied to simplify the 3D voting space to a 2D voting space. We tested the algorithms introduced above using both synthetic image data and real image sequences. Experimental results are shown to illustrate the potential of the methods.

Yuan, Ding; Liu, Miao; Zhang, Hong

2013-03-01

171

CCD-camera system for stereoscopic optical observations of the aurora  

NASA Astrophysics Data System (ADS)

A system of three identical CCD cameras was developed enabling stereoscopic auroral observations. An image intensifier allows for real-time imaging of auroral arcs with interference or broad-band filters. The combination of small-angle optics with a CCD chip of 756 by 580 pixels provides spatial resolution of auroral small-scale structures down to 20 m. The cameras are controlled by personal computers with integrated global positioning (GPS) modules enabling time synchronization of the cameras and providing the exact geographical position for the portable cameras. Calibration with a standard light source is the basis for quantitative evaluation of images by image processing techniques. The current technical development is the combination with local operating networks (LON) for monitoring camera parameters like voltage and temperature and remote control of parameters like filter positions, mounting tilt angles and camera gain.

Frey, Harald U.; Lieb, Werner; Bauer, Otto H.; Hoefner, Herwig; Haerendel, Gerhard

1996-11-01

172

Spectral narrowing via quantum coherence  

E-print Network

Spectral narrowing via quantum coherence. (Front-matter excerpt only; the authors gratefully acknowledge support from the Office of Naval Research, the Air Force Research Laboratory (Rome, NY), the Defense Advanced Research Projects Agency, and the Robert A. Welch Foundation, Grant No. A1261.)

Mikhailov, Eugeniy E.; Sautenkov, Vladimir A.; Rostovtsev, Yuri V.; Zhang, Aihua; Zubairy, M. Suhail; Scully, Marlan O.; Welch, George R.

2006-01-01

173

Streak camera receiver definition study  

NASA Technical Reports Server (NTRS)

Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

1990-01-01

174

Gas Metal Arc-Narrow Gap Welding.  

National Technical Information Service (NTIS)

Narrow gap welding techniques have been developed to diminish welding work and to meet the requirements of high fracture toughness in heavy plate structures. Welding in narrow grooves has been found sensitive to weld defects. It requires strict procedural...

J. Koivula, P. Groeger

1984-01-01

175

Operating Manual CCD Camera Models  

E-print Network

Operating Manual for CCD Camera Models ST-7XE, ST-8XE, ST-9XE, ST-10XE, ST-10XME and ST-2000XM. The indexed excerpt is table-of-contents material, including sections on capturing images with the CCD camera and an introduction to CCD cameras.

176

Camera detection, classification, and positioning of vehicles on a multi-lane road  

Microsoft Academic Search

The authors report on the development of a wide-angle camera for the detection, classification, and positioning of vehicles on a multi-lane road, inside the detection zone of an electronic road pricing system. By using a laboratory road model with toy cars and a colour CCD camera with associated image processing software they are able to count correctly the number of

T. K. Lim; M. A. Do

1993-01-01

177

The Dark Energy Camera  

NASA Astrophysics Data System (ADS)

The DES Collaboration has completed construction of the Dark Energy Camera (DECam), a 3 square degree, 570 Megapixel CCD camera which is now mounted at the prime focus of the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory. DECam comprises 74 250-micron-thick fully depleted CCDs: 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. It includes a filter set of u, g, r, i, z, and Y, a hexapod for focus and lateral alignment, as well as thermal management of the cage temperature. DECam will be used to perform the Dark Energy Survey with 30% of the telescope time over a 5 year period. During the remainder of the time, and after the survey, DECam will be available as a community instrument. An overview of the DECam design, construction and initial on-sky performance will be presented.

Flaugher, Brenna; DES Collaboration

2013-01-01

178

Combustion pinhole camera system  

DOEpatents

A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external, variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

Witte, A.B.

1984-02-21

179

Camera Trajectory fromWide Baseline Images  

NASA Astrophysics Data System (ADS)

Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self-localization, and object recognition. There are essential issues for reliable camera trajectory estimation, for instance, the choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes image feature matching very difficult (or impossible) and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on convertor with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, the image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the angle θ of its corresponding ray w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of the image points to the 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions including MSER, Harris Affine, and Hessian Affine in the acquired images. These features are an alternative to the popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAF) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in the standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used because the view point can change a lot between consecutive frames.
Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of the consecutive projections can be made for wide baseline images. This makes the feature detection, description, and matching much more time-consuming than for short baseline images and limits the usage to low frame rate sequences when operating in real time. Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches which is, within a predefined threshold ε, consistent with an epipolar geometry. We use ordered sampling as suggested in to draw 5-tuples from the list of tentative matches ordered ascendin
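The calibration model and the mutual-nearest-neighbour matching step described above can be illustrated with a short sketch. This is not the authors' code: the parameter names a and b, the descriptor arrays, and the function names are assumptions made only for illustration.

    import numpy as np

    def ray_angle(r, a, b):
        # Mičušík two-parameter model: angle theta (rad) between the optical
        # axis and the ray of an image point at radius r (pixels) from the centre.
        return a * r / (1.0 + b * r ** 2)

    def mutual_nearest_matches(desc1, desc2):
        # Tentative matches = mutually closest descriptor pairs (Euclidean
        # distance), as used for the Discrete Cosine Descriptors above.
        d = np.linalg.norm(desc1[:, None, :] - desc2[None, :, :], axis=2)
        nn12 = d.argmin(axis=1)  # best match in image 2 for each region of image 1
        nn21 = d.argmin(axis=0)  # best match in image 1 for each region of image 2
        return [(i, j) for i, j in enumerate(nn12) if nn21[j] == i]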

Havlena, M.; Torii, A.; Pajdla, T.

2008-09-01

180

Mobile Phones Digital Cameras  

E-print Network


Suslick, Kenneth S.

181

Gamma ray camera  

DOEpatents

A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible-light scintillator crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer, and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer, and the n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

Perez-Mendez, Victor (Berkeley, CA)

1997-01-01

182

Gamma ray camera  

DOEpatents

A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible-light scintillator crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer, and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer, and the n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

Perez-Mendez, V.

1997-01-21

183

Orbiter Camera Payload System  

NASA Technical Reports Server (NTRS)

Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent, and exposure times from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns, and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded, is 944 pounds, and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

1980-01-01

184

The Star Formation Camera  

E-print Network

The Star Formation Camera (SFC) is a wide-field (~15' x 19', >280 arcmin^2), high-resolution (18x18 mas pixels) UV/optical dichroic camera designed for the Theia 4-m space-borne telescope concept. SFC will deliver diffraction-limited images at lambda > 300 nm in both a blue (190-517 nm) and a red (517-1075 nm) channel simultaneously. Our aim is to conduct a comprehensive and systematic study of the astrophysical processes and environments relevant for the births and life cycles of stars and their planetary systems, and to investigate and understand the range of environments, feedback mechanisms, and other factors that most affect the outcome of the star and planet formation process. This program addresses the origins and evolution of stars, galaxies, and cosmic structure and has direct relevance for the formation and survival of planetary systems like our Solar System and planets like Earth. We present the design and performance specifications resulting from the implementation study of the camera, conducted ...

Scowen, Paul A; Beasley, Matthew; Calzetti, Daniela; Desch, Steven; Fullerton, Alex; Gallagher, John; Lisman, Doug; Macenka, Steve; Malhotra, Sangeeta; McCaughrean, Mark; Nikzad, Shouleh; O'Connell, Robert; Oey, Sally; Padgett, Deborah; Rhoads, James; Roberge, Aki; Siegmund, Oswald; Shaklan, Stuart; Smith, Nathan; Stern, Daniel; Tumlinson, Jason; Windhorst, Rogier; Woodruff, Robert

2009-01-01

185

Laboratory geometric calibration of non-metric digital camera  

NASA Astrophysics Data System (ADS)

A digital camera is a non-metric camera. Its geometric calibration, including the determination of the interior orientation elements and distortion parameters, is the basis of high-precision photogrammetry. In this paper, a laboratory geometric calibration system for digital cameras is developed. This system uses a collimator and a star tester as the target generator. After high-precision positioning of the targets and of the corresponding angles of the parallel light beams, geometric calibration can be accomplished according to the method of this paper. Experiments were carried out with this system using two kinds of mainstream digital aerial cameras, the Canon EOS 5D Mark II and the Hasselblad H3D, together with a repeatability experiment. The results show that the system satisfies the calibration requirements of photogrammetric applications and verify the correctness and reliability of its measurement precision.
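As an illustration of the distortion parameters mentioned above, a common parameterization is the Brown-Conrady radial/tangential model. The abstract does not state which model is used, so the form and the coefficient names k1, k2, p1, p2 below are assumptions for illustration only.

    import numpy as np

    def distort(x, y, k1, k2, p1, p2):
        # Brown-Conrady model (assumed): map ideal normalized image
        # coordinates (x, y) to the distorted coordinates the camera records.
        r2 = x ** 2 + y ** 2
        radial = 1.0 + k1 * r2 + k2 * r2 ** 2
        x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x ** 2)
        y_d = y * radial + p1 * (r2 + 2.0 * y ** 2) + 2.0 * p2 * x * y
        return x_d, y_d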

Yuan, Feng; Qi, Weijun; Fang, Aiping; Ding, Penghui; YU, Xiujuan

2013-10-01

186

Utilization of consumer level digital cameras in astronomy  

NASA Astrophysics Data System (ADS)

This paper presents a study of the possible utilization of digital single-lens reflex (DSLR) cameras in astronomy. DSLRs have a great advantage over professional equipment in cost efficiency, with comparable usability for selected purposes. The quality of the electro-optical system in a DSLR camera determines the area where it can be used with acceptable precision. First, a set of camera parameters important for astronomical use is introduced. The color filter array (CFA) structure, demosaicing algorithm, image sensor spectral properties, noise, and transfer characteristics are among the most important of these parameters, and they are analyzed further in the paper. Compression of astronomical images using the KLT approach is also described. The potential impact of these parameters on position and photometric measurements is presented, based on analysis and measurements with a wide-angle lens. The prospective utilization of a consumer DSLR camera as a substitute for expensive devices is discussed.

Páta, Petr; Fliegel, Karel; Klíma, Miloš; Blažek, Martin; Řeřábek, Martin

2010-08-01

187

Multispectral Photometry of the Moon and Absolute Calibration of the Clementine UV/Vis Camera  

NASA Astrophysics Data System (ADS)

We present a multispectral photometric study of the Moon between solar phase angles of 0 and 85°. Using Clementine images obtained between 0.4 and 1.0 μm, we produce a comprehensive study of the lunar surface containing the following results: (1) empirical photometric functions for the spectral range and viewing and illumination geometries mentioned, (2) photometric modeling that derives the physical properties of the upper regolith and includes a detailed study of the causes of the lunar opposition surge, (3) an absolute calibration of the Clementine UV/Vis camera. The calibration procedure given on the Clementine calibration web site produces reflectances relative to a halon standard, which furthermore appear significantly higher than those seen in groundbased observations. By comparing Clementine observations with prior groundbased observations of 15 sites on the Moon we have determined a good absolute calibration of the Clementine UV/Vis camera. A correction factor of 0.532 has been determined to convert the web site (www.planetary.brown.edu/clementine/calibration.html) reflectances to absolute values. From the calibrated data, we calculate empirical phase functions useful for performing photometric corrections to observations of the Moon between solar phase angles of 0 and 85° and in the spectral range 0.4 to 1.0 μm. Finally, the calibrated data are used to fit a version of Hapke's photometric model modified to incorporate a new formulation, developed in this paper, of the lunar opposition surge which includes coherent backscatter. Recent studies of the lunar opposition effect have yielded contradictory results as to the mechanism responsible: shadow hiding, coherent backscatter, or both. We find that most of the surge can be explained by shadow hiding with a halfwidth of ~8°. However, for the brightest regions (the highlands at 0.75-1.0 μm) a small additional narrow component (halfwidth of <2°) of total amplitude ~1/6 to 1/4 that of the shadow hiding surge is observed, which may be attributed to coherent backscatter. Interestingly, no evidence for the narrow component is seen in the maria or in the highlands at 0.415 μm. A natural explanation for this is that these regions are too dark to exhibit enough multiple scattering for the effects of coherent backscatter to be seen. Finally, because the Moon is the only celestial body for which we have "ground truth" measurements, our results provide an important test for the robustness of photometric models of remote sensing observations.
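A minimal sketch of how the correction factor quoted above would be applied; only the factor 0.532 comes from the record, and the constant and function names are hypothetical.

    CLEMENTINE_UVVIS_FACTOR = 0.532  # correction factor from this record

    def to_absolute_reflectance(halon_relative):
        # Convert a reflectance from the Clementine calibration web site
        # (relative to a halon standard) to an absolute reflectance.
        return CLEMENTINE_UVVIS_FACTOR * halon_relative

    # Example: a web-site reflectance of 0.20 corresponds to about 0.106 absolute.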

Hillier, John K.; Buratti, Bonnie J.; Hill, Kathryn

1999-10-01

188

The Discovery of a Very Narrow Line Star-forming Object at a Redshift of 5.66  

Microsoft Academic Search

We report on the discovery of a very narrow line star-forming object beyond a redshift of 5. Using the prime-focus camera, Suprime-Cam, on the 8.2 m Subaru Telescope together with a narrow-passband filter centered at λc = 8150 Å with a passband of Δλ = 120 Å, we have obtained a very deep image of the field surrounding the quasar SDSSp J104433.04-012502.2 at a

Yoshiaki Taniguchi; Masaru Ajiki; Takashi Murayama; Tohru Nagao; Sylvain Veilleux; David B. Sanders; Yutaka Komiyama; Yasuhiro Shioya; Shinobu S. Fujita; Yuko Kakazu; Sadanori Okamura; Hiroyasu Ando; Tetsuo Nishimura; Masahiko Hayashi; Ryusuke Ogasawara; Shin-ichi Ichikawa

2003-01-01

189

Tacoma Narrows Bridge: Extreme History  

NSDL National Science Digital Library

Stretching across the southern portion of Puget Sound, the elegant Tacoma Narrows bridge is considered one of the finest suspension bridges in the United States. The current bridge is the second on the site, as it was constructed in 1950 to serve as a replacement to the famous "Galloping Gertie" bridge, which collapsed in a windstorm in the fall of 1940. Currently, the Washington State Department of Transportation is building a bridge to replace the existing structure, and it is anticipated that it will be completed in 2007. This site offers a host of materials on all three structures, including ample information on the construction of the bridges and their aesthetic appeal. Along with these materials, the site also provides a glossary of related terms, Weird Facts, and some information about the dog "Tubby", who perished when "Galloping Gertie" collapsed on that fateful fall day back in 1940.

190

100-ps framing-camera tube.  

PubMed

The optoelectronic framing-camera tube described is capable of recording two-dimensional image frames with high spatial resolution in the <100-ps range. Framing is performed by streaking a two-dimensional electron image across narrow slits. The resulting dissected electron line images from the slits are restored into framed images by a restorer deflector operating synchronously with the dissector deflector. The number of framed images on the tube's viewing screen equals the number of dissecting slits in the tube. Performance has been demonstrated in a prototype tube by recording 135-ps-duration framed images of 2.5-mm patterns at the cathode. The limitation in the framing speed is in the external drivers for the deflectors and not in the tube design characteristics. Faster frame speeds in the <100-ps range can be obtained by use of faster deflection drivers. PMID:18699219

Kalibjian, R

1978-07-01

191

Cryogenic Detectors (Narrow Field Instruments)  

NASA Astrophysics Data System (ADS)

Two cryogenic imaging spectrometer arrays are currently considered as focal plane instruments for XEUS. The narrow field imager 1 (NFI 1) will cover the energy range from 0.05 to 3 keV with an energy resolution of 2 eV, or better, at 500 eV. A second narrow field imager (NFI 2) covers the energy range from 1 to 15 keV with an energy resolution of 2 eV (at 1 keV) and 5 eV (at 7 keV), creating some overlap with part of the NFI 1 energy window. Both narrow field imagers have a 0.5 arcmin field of view. Their imaging capabilities are matched to the XEUS optics of 2 to 5 arcsec, leading to 1 arcsec pixels. The detector arrays will be cooled by a closed cycle system comprising a mechanical cooler with a base temperature of 2.5 K and either a low temperature 3He sorption pump providing the very low temperature stage and/or an Adiabatic Demagnetization Refrigerator (ADR). The ADR cooler is explicitly needed to cool the NFI 2 array. The narrow field imager 1: Currently a 48 x 48 element array of superconducting tunnel junctions (STJ) is envisaged. Its operating temperature is in the range between 30 and 350 mK. Small, single Ta STJs (20-50 μm on a side) have shown 3.5 eV (FWHM) resolution at E = 525 eV, and small arrays have been successfully demonstrated (6 x 6 pixels) or are currently being tested (10 x 12 pixels). Alternatively, a prototype Distributed Read-Out Imaging Device (DROID), consisting of a linear superconducting Ta absorber of 20 x 100 μm², including a 20 x 20 μm STJ for readout at either end, has shown a measured energy resolution of 2.4 eV (FWHM) at E = 500 eV. Simulations involving the diffusion properties as well as loss and tunnel rates have shown that the performance can be further improved by slight modifications of the geometry, and that the size of the DROIDs can be increased to 0.5-1.0 mm without loss in energy resolution. The relatively large areas and good energy resolution compared to single STJs make DROIDs good candidates for the basic elements of the NFI 1 detector array. With a DROID-based array of 48 x 10 elements covering the NFI 1 field of view of 0.5 arcmin, the number of signal wires would already be reduced by a factor 2.4 compared to a 48 x 48 array of single pixels. While the present prototype DROIDs are still covered with a 480 nm thick SiOx insulation layer, this layer could easily be reduced in thickness or omitted. The detection efficiency of such a device with a 500 nm thick Ta absorber would be >80% in the energy range of 100-3000 eV, without any disturbing contributions from other layers as in single STJs. Further developments involve devices made of lower-Tc superconductors for better energy resolution and faster diffusion (e.g. Mo). The narrow field imager 2: The NFI 2 will consist of an array of 32 x 32 detector pixels. Each detector is a microcalorimeter consisting of a superconducting-to-normal phase transition edge thermometer (transition edge sensor, TES) with an operating temperature of 100 mK, and an absorber which allows a detection efficiency of >90% and a filling factor of the focal plane in excess of 90%. Single pixel microcalorimeters with a Ti/Au TES have already shown an energy resolution of 3.9 eV at 5.89 keV in combination with a thermal response time of 100 μs. These results imply that the high-energy requirements for XEUS can be met in terms of energy resolution and response time. It has been demonstrated that bismuth can be applied as an absorber material without degrading the detector performance.
Bi increases the stopping power to in excess of 90% and allows for a high filling factor, since the absorber can be shaped like a mushroom so that the wiring to the detector and the thermal support structure are placed under the hat of the mushroom. In order to realize the NFI 2 detector array, there are two major development areas. Firstly, there is the development of micromachined Si and SiN structures that will provide proper cooling for each of the pixels and the production of small membranes to support the

Hoevers, H.; Verhoeve, P.

192

Star identification independence on the camera parameters  

NASA Astrophysics Data System (ADS)

Star identification is the key problem of satellite attitude determination from a star sensor. Since star angular distance is directly used as the matching feature in traditional star identification, the precision and success rate of the traditional method are highly dependent on the calibration accuracy of the star camera parameters. In this paper, an improved star identification algorithm is presented. The algorithm uses the interior angles of a triangle composed of observed stars as the matching feature. The triangles formed by the navigation (catalogue) stars and by the observed stars are homothetic, and the interior angles of a triangle are independent of the focal length f. Thus the method does not depend on the camera parameters, and no a priori position information is required. A Monte Carlo experiment shows that the probability of star identification failure is less than 6.63%. Generally, the star identification process takes less than 30 ms. In addition, the method works well even with a 50% error in f. Compared with the traditional algorithm, this algorithm has advantages in identification success rate and reliability.
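The interior-angle feature is simple to compute from star centroids; the sketch below is illustrative only (not the authors' code, and the names are hypothetical). Because an error in the focal length f only rescales the projected triangle, the triangle stays similar and its interior angles are unchanged, which is the invariance the abstract relies on.

    import numpy as np

    def interior_angles(p1, p2, p3):
        # Interior angles (rad) of the triangle formed by three star centroids
        # (pixel coordinates), via the law of cosines.
        a = np.linalg.norm(p2 - p3)
        b = np.linalg.norm(p1 - p3)
        c = np.linalg.norm(p1 - p2)
        A = np.arccos((b ** 2 + c ** 2 - a ** 2) / (2 * b * c))
        B = np.arccos((a ** 2 + c ** 2 - b ** 2) / (2 * a * c))
        return A, B, np.pi - A - B

    # Scaling all centroids by the same factor (the effect of a focal-length
    # error) leaves the returned angles unchanged.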

Su, Dezhi; Chen, Dong; Zhou, Mingyu; Wang, Kun

2014-09-01

193

Application of narrow-band television to industrial and commercial communications  

NASA Technical Reports Server (NTRS)

The development of narrow-band television systems for use in space systems is presented. Applications of the technology to future spacecraft requirements are discussed, along with narrow-band television's influence in stimulating development within the industry. The transfer of the technology into industrial and commercial communications is described. Major areas include: (1) medicine; (2) education; (3) remote sensing for traffic control; and (4) weather observation. Applications in data processing, image enhancement, and information retrieval are provided by the combination of the TV camera and the computer.

Embrey, B. C., Jr.; Southworth, G. R.

1974-01-01

194

Refraction of narrow probing light beam in multilayer optical fiber  

NASA Astrophysics Data System (ADS)

For an experimental design and performance evaluation of optical fibers, it is necessary to establish an analytical relation between the refraction angle of a probing light beam and the geometrical parameters of the fiber. Here the problem is solved by application of Snell's law for a narrow light beam, first to a double-layer fiber consisting of a gradient core inside a homogeneous sheath and then to a triple-layer fiber consisting of a gradient core inside a double-layer sheath.

Mirovitskaya, S. D.; Kudryavtsev, D. L.

1984-11-01

195

A real-time single-camera, stereoscopic video device  

NASA Astrophysics Data System (ADS)

This patent application discloses a real-time, single-camera, stereoscopic video imaging device for use with a standard 60 Hz camera and monitor. The device uses a single objective lens to focus disparate views of the object at the focal plane of the lens. Each view is represented by a set of parallel rays emanating from the object at a specific angle. The lens focuses these parallel rays to a single point at the focal plane. These views are then shuttered at the focal plane using a Liquid-Crystal Device (LCD) shutter such that one view at a time is passed to the camera. The camera then transmits alternating video fields (individual TV images) to the monitor, such that alternate fields display stereoscopically related views of the object being imaged. The user views the monitor using off-the-shelf LCD stereoglasses, modified to allow synchronization with the standard field rate of the camera and monitor. The glasses shutter the light alternately to each eye so that the left eye views the left-hand image and the right eye views the right-hand image. The resulting 3-D image is independent of the user's viewing angle or distance from the monitor.

Converse, Blake L.

1994-12-01

196

PAU camera: detectors characterization  

NASA Astrophysics Data System (ADS)

The PAU Camera (PAUCam) [1,2] is a wide field camera that will be mounted at the corrected prime focus of the William Herschel Telescope (Observatorio del Roque de los Muchachos, Canary Islands, Spain) in the coming months. The focal plane of PAUCam is composed of a mosaic of 18 CCD detectors of 2,048 x 4,176 pixels each, with a pixel size of 15 microns, manufactured by Hamamatsu Photonics K.K. This mosaic covers a field of view (FoV) of 60 arcmin, of which 40 arcmin are unvignetted. The behaviour of these 18 devices, plus four spares, and their electronic response must be characterized and optimized for use in PAUCam. This work is being carried out in the laboratories of the ICE/IFAE and the CIEMAT. The electronic optimization of the CCD detectors is being carried out by means of an OG (Output Gate) scan, maximizing the CTE (Charge Transfer Efficiency) while minimizing the read-out noise. The device characterization itself is obtained with different tests: the photon transfer curve (PTC), which yields the electronic gain, the linearity vs. light stimulus, the full-well capacity and the cosmetic defects; and the read-out noise, the dark current, the stability vs. temperature and the light remanence.
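The photon transfer curve measurement mentioned above can be illustrated with a single pair of flat fields. The sketch below uses the standard mean/variance relation for the gain (e-/ADU) and ignores read noise, which the real analysis would account for; the function and argument names are assumptions.

    import numpy as np

    def ptc_gain(flat_a, flat_b, bias=0.0):
        # Photon-transfer estimate of the gain (e-/ADU) from two flats at the
        # same illumination: differencing removes fixed-pattern noise, and the
        # variance of the difference is twice the per-frame shot-noise variance.
        mean_signal = 0.5 * (flat_a.mean() + flat_b.mean()) - bias
        shot_var = np.var(flat_a.astype(float) - flat_b.astype(float)) / 2.0
        return mean_signal / shot_var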

Casas, Ricard; Ballester, Otger; Cardiel-Sas, Laia; Castilla, Javier; Jiménez, Jorge; Maiorino, Marino; Pío, Cristóbal; Sevilla, Ignacio; de Vicente, Juan

2012-07-01

197

The All Sky Camera Network  

NSDL National Science Digital Library

In 2001, the All Sky Camera Network came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit Space Odyssey with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Students involved in the network participate in an authentic, inquiry-based experience by tracking meteor events. This article discusses the past, present, and future of the All Sky Camera Network.

Caldwell, Andy

2005-02-01

198

Reflectance characteristics of the Viking lander camera reference test charts  

NASA Technical Reports Server (NTRS)

Reference test charts provide radiometric, colorimetric, and spatial resolution references for the Viking lander cameras on Mars. Reflectance measurements of these references are described, including the absolute bidirectional reflectance of the radiometric references and the relative spectral reflectance of both radiometric and colorimetric references. Results show that the bidirectional reflectance of the radiometric references is Lambertian to within + or - 7% for incidence angles between 20 deg and 60 deg, and that their spectral reflectance is constant with wavelength to within + or - 5% over the spectral range of the cameras. Estimated accuracy of the measurements is + or - 0.05 in relative spectral reflectance.

Wall, S. D.; Burcher, E. E.; Jabson, D. J.

1975-01-01

199

Digital synchroballistic schlieren camera for high-speed photography of bullets and rocket sleds  

NASA Astrophysics Data System (ADS)

A high-speed digital streak camera designed for simultaneous high-resolution color photography and focusing schlieren imaging is described. The camera uses a computer-controlled galvanometer scanner to achieve synchroballistic imaging through a narrow slit. Full color 20 megapixel images of a rocket sled moving at 480 m/s and of projectiles fired at around 400 m/s were captured, with high-resolution schlieren imaging in the latter cases, using conventional photographic flash illumination. The streak camera can achieve a line rate for streak imaging of up to 2.4 million lines/s.

Buckner, Benjamin D.; L'Esperance, Drew

2013-08-01

200

Distortion corrections for better character recognition of camera-based document images  

NASA Astrophysics Data System (ADS)

The usage of cellular camera phones and digital cameras is rapidly increasing, but camera-based imaging applications are lagging, owing to the lack of practical camera imaging technology. In particular, the acquisition conditions of camera images are very different from those of scanner images. Illumination, viewing distance, and viewing angle vary constantly when pictures are taken indoors and outdoors. These variations make it difficult to extract character areas from images through binarization, and the variation of camera viewing angles distorts the images geometrically. In this paper, these problems are discussed and methods to resolve them are suggested for better image recognition. Solutions such as adaptive binarization, color conversion, correction of lens distortion, and correction of geometrical distortion are discussed, and a sequence of correction processes is suggested for accurate document image recognition. In the experiments, we use various types of document images captured by mobile phone cameras and digital cameras. The results of distortion correction show that our image processing methods are effective in increasing the accuracy of character recognition for camera-based document images.
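As a concrete illustration of two of the corrections discussed (adaptive binarization and correction of geometrical/perspective distortion), the sketch below uses OpenCV. It is not the authors' pipeline; the corner-detection step, the output page size, and the threshold parameters are assumptions.

    import cv2
    import numpy as np

    def binarize_and_rectify(gray, corners, out_size=(1240, 1754)):
        # Adaptive binarization copes with uneven illumination; the homography
        # removes the perspective distortion of a tilted document page.
        # `corners` = four detected page corners (tl, tr, br, bl) in pixels.
        binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                       cv2.THRESH_BINARY, 31, 10)
        w, h = out_size
        dst = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
        H = cv2.getPerspectiveTransform(np.float32(corners), dst)
        return cv2.warpPerspective(binary, H, (w, h))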

Chung, YunKoo; Jang, DaeGeun; Yu, WonPil; Chi, SooYoung; Kim, KyeKyung; Soh, Jung

2004-10-01

201

X-Ray Variability of the Narrow-Line Seyfert 1 Galaxy Markarian 478  

Microsoft Academic Search

We report the results of a timing analysis of X-ray observations of the bright narrow-line Seyfert 1 (NLS1) galaxy Mrk 478, obtained with the European Photon Imaging Camera (EPIC) on board XMM-Newton. The source was observed on four separate occasions between 2001 December and 2003 January, with the longest observation being ≈ 32 ks. X-ray light curves of Mrk 478

Adam R. Villarreal; T. E. Strohmayer

2006-01-01

202

Narrow gap electronegative capacitive discharges  

NASA Astrophysics Data System (ADS)

Narrow gap electronegative (EN) capacitive discharges are widely used in industry and have unique features not found in conventional discharges. In this paper, plasma parameters are determined over a range of decreasing gap length L from values for which an electropositive (EP) edge exists (2-region case) to smaller L-values for which the EN region connects directly to the sheath (1-region case). Parametric studies are performed at applied voltage Vrf=500 V for pressures of 10, 25, 50, and 100 mTorr, and additionally at 50 mTorr for 1000 and 2000 V. Numerical results are given for a parallel plate oxygen discharge using a planar 1D3v (1 spatial dimension, 3 velocity components) particle-in-cell (PIC) code. New interesting phenomena are found for the case in which an EP edge does not exist. This 1-region case has not previously been investigated in detail, either numerically or analytically. In particular, attachment in the sheaths is important, and the central electron density ne0 is depressed below the density nesh at the sheath edge. The sheath oscillations also extend into the EN core, creating an edge region lying within the sheath and not characterized by the standard diffusion in an EN plasma. An analytical model is developed using minimal inputs from the PIC results, and compared to the PIC results for a base case at Vrf=500 V and 50 mTorr, showing good agreement. Selected comparisons are made at the other voltages and pressures. A self-consistent model is also developed and compared to the PIC results, giving reasonable agreement.

Kawamura, E.; Lieberman, M. A.; Lichtenberg, A. J.

2013-10-01

203

Narrow gap electronegative capacitive discharges  

SciTech Connect

Narrow gap electronegative (EN) capacitive discharges are widely used in industry and have unique features not found in conventional discharges. In this paper, plasma parameters are determined over a range of decreasing gap length L from values for which an electropositive (EP) edge exists (2-region case) to smaller L-values for which the EN region connects directly to the sheath (1-region case). Parametric studies are performed at applied voltage Vrf=500 V for pressures of 10, 25, 50, and 100 mTorr, and additionally at 50 mTorr for 1000 and 2000 V. Numerical results are given for a parallel plate oxygen discharge using a planar 1D3v (1 spatial dimension, 3 velocity components) particle-in-cell (PIC) code. New interesting phenomena are found for the case in which an EP edge does not exist. This 1-region case has not previously been investigated in detail, either numerically or analytically. In particular, attachment in the sheaths is important, and the central electron density ne0 is depressed below the density nesh at the sheath edge. The sheath oscillations also extend into the EN core, creating an edge region lying within the sheath and not characterized by the standard diffusion in an EN plasma. An analytical model is developed using minimal inputs from the PIC results, and compared to the PIC results for a base case at Vrf=500 V and 50 mTorr, showing good agreement. Selected comparisons are made at the other voltages and pressures. A self-consistent model is also developed and compared to the PIC results, giving reasonable agreement.

Kawamura, E.; Lieberman, M. A.; Lichtenberg, A. J. [Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720 (United States)]

2013-10-15

204

Universal Streak Camera  

NASA Astrophysics Data System (ADS)

A universal streak camera which works in synchroscan and single-shot modes with 3 plug-ins (synchroscan, fast and slow) has been developed utilizing a microchannel plate-incorporated streak tube. The synchroscan plug-in features low jitter of less than 4 ps and high-speed sine-wave deflection from 80 to 160 MHz, achieving a limiting temporal resolution of 5 ps. The fast plug-in offers temporal resolution of better than 2 ps with triggering jitter of less than ±20 ps. The slow plug-in provides longer time windows from 10 ns to 1 ms/15 mm in 16 ranges. Shutter operation down to 100 ns in duration is available with a maximum repetition rate of 1-10 kHz for the 3 plug-ins. An on-off ratio of more than 1:10^6 has been achieved by using double gate operation at the photocathode and the built-in microchannel plate.

Tsuchiya, Y.; Takeshima, A.; Inuzuka, E.; Suzuki, K.; Koishi, M.; Kinoshita, K.

1985-02-01

205

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate (approximately 0.4 mm resolution) 3-D tracking of charged particles. The incident directions of fast neutrons, E_N > 0.5 MeV, are reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. We present the angular and energy resolution performance of the NIC derived from accelerator tests.

Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

2008-01-01

206

In-flight calibration of the Dawn Framing Camera II: Flat fields and stray light correction  

NASA Astrophysics Data System (ADS)

The NASA Dawn spacecraft acquired thousands of images of asteroid Vesta during its year-long orbital tour, and is now on its way to asteroid Ceres. A method for calibrating images acquired by the onboard Framing Camera was described by Schröder et al. (Schröder et al. [2013]. Icarus 226, 1304). However, their method is only valid for point sources. In this paper we extend the calibration to images of extended sources like Vesta. For this, we devise a first-order correction for in-field stray light, which is known to plague images taken through the narrow band filters, and revise the flat fields that were acquired in an integrating sphere before launch. We used calibrated images of the Vesta surface to construct simple photometric models for all filters, that allow us to study how the spectrum changes with increasing phase angle (phase reddening). In combination with these models, our calibration method can be used to create near-seamless mosaics that are radiometrically accurate to a few percent. Such mosaics are provided in JVesta, the Vesta version of the JMARS geographic information system.
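A minimal sketch of the kind of per-filter photometric model that could be used to quantify the phase reddening mentioned above. The exponential form and the function names are assumptions for illustration; the record does not state the functional form of the authors' models.

    import numpy as np

    def fit_phase_curve(phase_deg, reflectance):
        # Fit ln(R) = ln(A) - beta * alpha (alpha = phase angle in degrees);
        # returns the zero-phase reflectance A and phase coefficient beta.
        slope, intercept = np.polyfit(phase_deg, np.log(reflectance), 1)
        return np.exp(intercept), -slope

    # Phase reddening: if beta is smaller in a red filter than in a blue one,
    # the red/blue ratio grows with phase angle, i.e. the spectrum reddens.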

Schröder, S. E.; Mottola, S.; Matz, K.-D.; Roatsch, T.

2014-05-01

207

Imaging chlorophyll fluorescence with an airborne narrow-band multispectral camera for vegetation stress detection  

Microsoft Academic Search

Progress in assessing the feasibility for imaging fluorescence using the O2-A band with 1 nm full-width half-maximum (FWHM) bands centered at 757.5 and 760.5 nm is reported in this paper. Multispectral airborne data was acquired at 150 m above ground level in the thermal, visible and near infrared regions yielding imagery at 15 cm spatial resolution. Simultaneous field experiments conducted in olive, peach, and

P. J. Zarco-Tejada; J. A. J. Berni; L. Suárez; G. Sepulcre-Cantó; F. Morales; J. R. Miller

2009-01-01

208

Describe Angle Pair Relationships  

NSDL National Science Digital Library

This lesson will explain the types of pairs of angles you will find in Geometry. Note taking time on page 5: Angle Information Now, let's see if you get it: Angle Relationship Quiz (fun) Ok! Now for your assignment, #8 on page 38! Class Zone Geometry Textbook ...

Neubert, Mrs.

2011-09-01

209

What's Your Angle?  

NSDL National Science Digital Library

In this activity, students devise procedures for using a protractor to measure the number of degrees in an angle, and use inductive reasoning to develop "angle sense." Then they describe circumstances and careers that require a working knowledge of angles and their measurements.

2010-01-01

210

Angles All Around  

NSDL National Science Digital Library

Standard: Identify and measure right, obtuse, and acute angles. This is a two day activity. OBJECTIVE: We have learned about five different types of angles: right, acute, obtuse, straight, and reflex. We have also learned how to use a protractor to measure angles. With this lesson, you will practice what ...

Bennett, Mrs.

2011-12-14

211

Critical Heat Flux In Inclined Rectangular Narrow Long Channel  

SciTech Connect

In the TMI-2 accident, the lower part of the reactor pressure vessel was overheated and then rather rapidly cooled down, as was later identified in a vessel investigation project. This pointed to the feasibility of gap cooling. For this reason, several investigations were performed to determine the critical heat flux (CHF) from the standpoint of in-vessel retention. Experiments were conducted to investigate the general boiling phenomena and the triggering mechanism for the CHF in a narrow gap, using a 5 x 105 mm2 crevice-type heater assembly and de-mineralized water. The test parameters include a gap size of 5 mm and surface orientation angles from the downward-facing position (180°) to the vertical position (90°). The orientation angle affects the bubble layer and bubble escape from the narrow gap. The CHF is less than that in a shorter channel, compared with previous experiments having a heated length of 35 mm in the copper test section.

J. L. Rempe; S. W. Noh; Y. H. Kim; K. Y. Suh; F.B.Cheung; S. B. Kim

2005-05-01

212

Multi-PSPMT scintillation camera  

SciTech Connect

Gamma ray imaging is usually accomplished by the use of a relatively large scintillating crystal coupled to either a number of photomultipliers (PMTs) (Anger camera) or to a single large Position Sensitive PMT (PSPMT). Recently the development of new diagnostic techniques, such as scintimammography and radio-guided surgery, has highlighted a number of significant limitations of the Anger camera in such imaging procedures. In this paper a dedicated gamma camera is proposed for clinical applications with the aim of improving image quality by utilizing detectors with an appropriate size and shape for the part of the body under examination. This novel scintillation camera is based upon an array of PSPMTs (Hamamatsu R5900-C8). The basic concept of this camera is identical to that of the Anger camera, with the exception of the substitution of PSPMTs for the PMTs. In this configuration it is possible to use the high resolution of the PSPMTs and still correctly position events lying between PSPMTs. In this work the test configuration is a 2 by 2 array of PSPMTs. Some advantages of this camera are: spatial resolution less than 2 mm FWHM, good linearity, thickness less than 3 cm, light weight, lower cost than an equivalent-area PSPMT, large detection area when coupled to scintillating arrays, small dead boundary zone (< 3 mm), and flexibility in the shape of the camera.

Pani, R.; Pellegrini, R.; Trotta, G.; Scopinaro, F. [Univ. of Rome (Italy). Dept. of Experimental Medicine]; Soluri, A.; Vincentis, G. de [CNR (Italy). Inst. of Biomedical Technologies]; Scafe, R. [ENEA-INN, Rome (Italy)]; Pergola, A. [PSDD, Rome (Italy)]

1999-06-01

213

An Educational PET Camera Model  

ERIC Educational Resources Information Center

Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.

2006-01-01

214

The "All Sky Camera Network"  

ERIC Educational Resources Information Center

In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

Caldwell, Andy

2005-01-01

215

Multi-Cameras Visual Servoing  

Microsoft Academic Search

In this paper, the classical visual servoing techniques have been extended to the use of several cameras observing different parts of an object. The multi-camera visual servoing has been designed as a part of the task function approach. The particular choice of the task function allows us to simplify the design of the control law and the stability analysis. A

Ezio Malis; François Chaumette; Sylvie Boudet

2000-01-01

216

An educational PET camera model  

NASA Astrophysics Data System (ADS)

Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

Johansson, K. E.; Nilsson, Ch; Tegner, P. E.

2006-09-01

217

Mars Exploration Rover engineering cameras  

USGS Publications Warehouse

NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair, each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.

Maki, J.N.; Bell, J.F., III; Herkenhoff, K.E.; Squyres, S.W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, A.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

2003-01-01

218

THE DEATH OF THE CAMERA  

Microsoft Academic Search

In this paper I examine how Edward Branigan, in his new book Projecting a Camera: Language-Games in Film Theory (2006), uses Wittgenstein's later philosophy to describe the multiple, contradictory, literal and metaphorical meanings of fundamental concepts in film theory, such as ‘movement’, ‘point of view’, ‘camera’, ‘frame’ and ‘causality’. Towards the end of the paper I rationally reconstruct Branigan's main arguments

Warren Buckland

2006-01-01

219

On Narrowing, Refutation Proofs and Constraints  

E-print Network

such that E ⊨ sσ = tσ. Narrowing was originally devised as an efficient E-unification procedure ... can be applied to narrow a term s into sσ[rσ]_p, denoted s ⇝ sσ[rσ]_p, if σ is the mgu of s|_p ... of previous steps), one easily shows that for each (irreducible) solution σ every rewrite proof goal(sσ, tσ

Nieuwenhuis, Robert

220

Sensitive IR narrow band optimized microspectrometer  

Microsoft Academic Search

Customization of a standard model confocally masked FT-IR microspectrometer to maximize the signal for a particular narrow band of the spectrum and minimize noise is described. In this case the motivation was to detect minor concentrations of deuterated species in a matrix of tissue. However, the instrumental modifications used for this particular task are applicable to narrow band sensitization in

David L Wetzel

2002-01-01

221

WF/PC Narrow Band Filter Calibration  

Microsoft Academic Search

This is a slightly modified version of the narrow band calibration portion of proposal 1481. This proposal is to obtain narrow band images of a diffuse emission line source which has been well characterized from the ground. The purpose of the test is to directly provide calibration of these filters using a ~0 velocity emission line source, as well as

J. Westphal

1990-01-01

222

IMAX camera (12-IML-1)  

NASA Technical Reports Server (NTRS)

The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single lens, reflex viewing design with a 15 perforation per frame horizontal pull across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.

1992-01-01

223

Advisory Surveillance Cameras Page 1 of 2  

E-print Network

be produced and how will it be secured, who will have access to the tape? 7. At what will the camera ... to ensure the cameras' presence doesn't create a false sense of security. (Advisory: Use of Cameras/Video Surveillance, May 2008)

Liebling, Michael

224

Properties of Creeping Discharge in Narrow Gap between Dielectric Plates in PFAE Oil  

NASA Astrophysics Data System (ADS)

Using lightning impulse voltages of ±1.2/50 μs and ±1.2/1000 μs with ±140 kVpeak in maximum, the behaviors of creeping streamers progressing in a narrow gap between two solid dielectric plates immersed in palm fatty acid ester (PFAE) oil were investigated. The discharge shapes and the streamer lengths were observed using a still camera equipped with a night viewer, and the progression steps of the streamer were observed using a high-speed image converter camera. The effects of the two interfaces between solid dielectrics, the back side electrode (BSE), and the wave tail of the impulse voltages were examined on the growth of positive and negative streamers, the flashover voltage, and the streamer velocity. The streamer length is extended by the presence of the BSE and a long wave tail of the impulse voltage. This results in the reduction of the flashover voltage. It is worth noting that the negative streamer progressing in a narrow gap grows longer than the positive streamer under identical applied voltage. This polarity effect on the streamer length is the inverse of that at an oil/pressboard interface without a narrow gap. Streamers of both polarities also propagate more slowly because of the narrow gap with two solid interfaces. These results on the creeping discharge have been compared with those in commercial mineral oil.

Usui, Takuro; Hanaoka, Ryoichi; Takata, Shinzo; Kanamaru, Yasunori; Koide, Hidenobu; Nakagami, Yoshitake

225

Reflectance Calibration Scheme for Airborne Frame Camera Images  

NASA Astrophysics Data System (ADS)

The image quality of photogrammetric images is influenced by various effects from outside the camera. One effect is the scattered light from the atmosphere, which lowers contrast in the images and creates a colour shift towards the blue. Another is the changing illumination during the day, which results in changing image brightness within an image block. In addition, there is the so-called bidirectional reflectance of the ground (BRDF effects) that gives rise to a view- and sun-angle-dependent brightness gradient in the image itself. To correct for the first two effects, an atmospheric correction with reflectance calibration is chosen. The effects have been corrected successfully for ADS linescan sensor data by using a parametrization of the atmospheric quantities. Following Kaufman et al., the actual atmospheric condition is estimated from the brightness of a dark pixel taken from the image. The BRDF effects are corrected using a semi-empirical modelling of the brightness gradient. Both methods are now extended to frame cameras. Linescan sensors have a viewing geometry that depends only on the cross-track view zenith angle. The difference for frame cameras is to include the extra dimension of the view azimuth in the modelling. Since both the atmospheric correction and the BRDF correction require a model inversion with the help of image data, a different image sampling strategy is necessary which includes the azimuth angle dependence. For the atmospheric correction, a sixth variable is added to the existing five (visibility, view zenith angle, sun zenith angle, ground altitude, and flight altitude), thus multiplying the number of modelling input combinations for the offline inversion. The parametrization has to reflect the view azimuth angle dependence. The BRDF model already contains the view azimuth dependence and is combined with a new sampling strategy.
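To illustrate the azimuth-dependent extension of the BRDF correction described above, the sketch below assumes a Walthall-type semi-empirical form. The record does not give the actual functional form used, so the model, coefficients, and normalization here are hypothetical.

    import numpy as np

    def brightness_gradient(theta_v, phi_rel, a0, a1, a2):
        # Walthall-type semi-empirical gradient (assumed form):
        # theta_v = view zenith angle (rad), phi_rel = view-sun relative azimuth (rad).
        return a0 + a1 * theta_v ** 2 + a2 * theta_v * np.cos(phi_rel)

    def correct_reflectance(r, theta_v, phi_rel, coeffs):
        # Flatten the view/sun-angle dependent gradient by normalizing each
        # pixel to the model value at nadir (theta_v = 0), which equals a0.
        a0, a1, a2 = coeffs
        return r * a0 / brightness_gradient(theta_v, phi_rel, a0, a1, a2)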

Beisl, U.

2012-07-01

226

Saturn's hydrogen aurora: Wide field and planetary camera 2 imaging from the Hubble Space Telescope  

Microsoft Academic Search

Wide Field and Planetary Camera 2/Hubble Space Telescope (WFPC2/HST) images of Saturn's far ultraviolet aurora reveal emissions confined to a narrow band of latitudes near Saturn's north and south poles. The aurorae are most prominent in the morning sector, with patterns that appear fixed in local time. The geographic distribution and vertical extent of the auroral emissions seen in these

John T. Trauger; John T. Clarke; Gilda E. Ballester; Robin W. Evans; Christopher J. Burrows; David Crisp; John S. Gallagher; Richard E. Griffiths; J. Jeff Hester; John G. Hoessel; Jon A. Holtzman; John E. Krist; Jeremy R. Mould; Raghvendra Sahai; Paul A. Scowen; Karl R. Stapelfeldt; Alan M. Watson

1998-01-01

227

Camera evidence: visibility analysis through a multicamera viewpoint  

NASA Astrophysics Data System (ADS)

A major criterion in the design of backhoes (and other heavy machinery) is the ability of the operator to see all critical portions of the vehicle and the surrounding environment. Computer graphics provides a method for analyzing this ability prior to the building of full-scale wooden models. By placing the computer graphic camera at the operator's eyepoint, designers can detect poor placement of supports, blind spots, etc. In this type of analysis, the camera becomes an active, yet somewhat imperfect, participant in our understanding of what an operator of the backhoe 'sees'. In order to simulate a backhoe operator's vision from within a cab, one needs to expand the angle of view of the camera to mimic unfocused, peripheral vision. A traditional wide-angle lens creates extreme distortions that are not present in 'natural' vision, and is therefore hardly an adequate representation. The solution we arrived at uses seven cameras fanned out horizontally in order to capture a relatively undistorted 155 degree angle of view. In addition, another camera displays and numerically analyzes the percentage of the loader bucket visible and blocked. These two views are presented simultaneously in order to address both the 'naturalistic' and quantitative needs of the designers, as well as to point to the incompleteness of any one representation of a scene. In the next phase of this project we will bring this type of analysis into a machine environment more conducive to interactivity: a backhoe simulator with levers to control the vehicle and bucket positions, viewed through a virtual reality environment.

Bajuk, Mark

1992-06-01

228

Wide Angle View of Arsia Mons Volcano  

NASA Technical Reports Server (NTRS)

Arsia Mons (above) is one of the largest volcanoes known. This shield volcano is part of an aligned trio known as the Tharsis Montes--the others are Pavonis Mons and Ascraeus Mons. Arsia Mons is rivaled only by Olympus Mons in terms of its volume. The summit of Arsia Mons is more than 9 kilometers (5.6 miles) higher than the surrounding plains. The crater--or caldera--at the volcano summit is approximately 110 km (68 mi) across. This view of Arsia Mons was taken by the red and blue wide angle cameras of the Mars Global Surveyor Mars Orbiter Camera (MOC) system. Bright water ice clouds (the whitish/bluish wisps) hang above the volcano--a common sight every martian afternoon in this region. Arsia Mons is located at 120° west longitude and 9° south latitude. Illumination is from the left.

1999-01-01

229

Colorimetric Correction for Stereoscopic Camera Arrays  

E-print Network

Adjusting camera parameters, i.e. gain, brightness or shutter speed, may not solve the problem; moreover, the camera response differs between the multiple cameras of a camera array. The problem of transferring the colorimetric

Paris-Sud XI, Université de

230

CSc 165 Lecture Note Slides Camera Control  

E-print Network

Cameras: located at the player's "point of view"; the player's location/direction is changed by manipulating the camera; ... to reduce jerkiness. Examples: Mario Kart; Mario Kart 64 Battle Mode.

Gordon, Scott

231

Engineer reconnaissance with a video camera: feasibility study  

E-print Network

campaign from Field Marshal Montgomery's failed seizure of a "bridge too far" in Arnhem, Holland, to the Soviet Union's unique use of ice bridges to resupply the defenders of Stalingrad. From a military standpoint, bridges either exist or need... OBJECTIVES This study will evaluate the feasibility of the acquisition of remotely sensed engineer reconnaissance data (through the use of a Charge-Coupled Device (CCD) video camera, laser rangefinder and angle measuring capability) and subsequent...

Bergner, Kirk Michael

2012-06-07

232

Viscous flow features on the surface of Mars: Observations from high-resolution Mars Orbiter Camera (MOC)  

E-print Network

A ...-thick surface layer was identified in high-resolution Mars Orbiter Camera (MOC) images of viscous flow features on the surface of Mars. ... Slope angles derived from Mars Orbiter Laser Altimeter (MOLA) data, along with an experimentally

Head III, James William

233

Polarizing aperture stereoscopic cinema camera  

NASA Astrophysics Data System (ADS)

The art of stereoscopic cinematography has been held back because of the lack of a convenient way to reduce the stereo camera lenses' interaxial to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor (the size of the standard 35mm frame) with the means to select left and right image information. Even with the added stereoscopic capability the appearance of existing camera bodies will be unaltered.

Lipton, Lenny

2012-03-01

234

Polarizing aperture stereoscopic cinema camera  

NASA Astrophysics Data System (ADS)

The art of stereoscopic cinematography has been held back because of the lack of a convenient way to reduce the stereo camera lenses' interaxial to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor, the size of the standard 35 mm frame, with the means to select left and right image information. Even with the added stereoscopic capability, the appearance of existing camera bodies will be unaltered.

Lipton, Lenny

2012-07-01

235

X-ray Pinhole Camera Measurements  

SciTech Connect

The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

Nelson, D. S. [NSTec; Berninger, M. J. [NSTec; Flores, P. A. [NSTec; Good, D. E. [NSTec; Henderson, D. J. [NSTec; Hogge, K. W. [NSTec; Huber, S. R. [NSTec; Lutz, S. S. [NSTec; Mitchell, S. E. [NSTec; Howe, R. A. [NSTec; Mitton, C. V. [NSTec; Molina, I. [NSTec; Bozman, D. R. [SNL; Cordova, S. R. [SNL; Mitchell, D. R. [SNL; Oliver, B. V. [SNL; Ormond, E. C. [SNL

2013-07-01

236

Vision Sensors and Cameras  

NASA Astrophysics Data System (ADS)

Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 µm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and µm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

Hoefflinger, Bernd

237

A lexicon for Camera Obscura  

E-print Network

The camera obscura has allowed artists, scientists, and philosophers to view the world as a flat image. Two - dimensional renditions of visual reality seem to be more manageable and easier to grasp than reality itself. A ...

Rosinsky, Robert David

1984-01-01

238

National Park Service Web Cameras  

NSDL National Science Digital Library

The National Park Service (NPS) operates digital cameras at many parks in the lower 48 states, Alaska, and Hawaii to help educate the public on air quality issues. These cameras often show the effects of air pollution, especially visibility impairment. Because the cameras are typically located near air quality monitoring sites, their web pages display other information along with the photo, such as current levels of ozone, particulate matter, or sulfur dioxide, visual range, and weather conditions. The digital photos are usually updated every 15 minutes, while air quality data values are revised hourly. Charts of the last ten days of hourly weather, ozone, particulate matter, or sulfur dioxide data are also available. The cameras are accessible by clicking on an interactive map.

239

THZ EMISSION SPECTROSCOPY OF NARROW BANDGAP SEMICONDUCTORS  

E-print Network

THz Emission Spectroscopy of Narrow Bandgap Semiconductors, a thesis by Ricardo Ascázubi. Only table-of-contents fragments are preserved in this excerpt, covering time-domain spectroscopy, optically excited THz emission processes, and the THz-TDS setup.

Wilke, Ingrid

240

Solid State Television Camera (CID)  

NASA Technical Reports Server (NTRS)

The design, development and test are described of a charge injection device (CID) camera using a 244x248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, geometric distortion, sequential scanning and AGC.

Steele, D. W.; Green, W. T.

1976-01-01

241

Polygon Angle Applet  

NSDL National Science Digital Library

This interactive Java applet supports the investigation of the relationship between the number of vertices of a polygon and its interior angle sum. Learners choose and locate the vertices, the angle measures are displayed, and then the student can drag the measures into a circle to see them summed relative to 360 degrees.

Exner, Nicholas

2000-05-31

242

What's My Angle?  

NSDL National Science Digital Library

This interactive module offers learners the opportunity to check their knowledge of angle measure and estimation, and the use of a protractor. There are ten activities that vary the tasks and the degree of precision. The site is designed for whiteboard demonstration as well, and it includes a tutorial on angle types and protractor use.

2011-01-01

243

SunAngle Calculator  

NSDL National Science Digital Library

SunAngle is an on-line tool that calculates solar angles and related information for a given location, date, and time. It computes the declination of the Sun, sunrise and sunset times, azimuth of the Sun, solar time and more. Complete instructions and definitions of variables are included.
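For readers who want a feel for the kind of computation such a tool performs, here is a rough sketch using standard approximate formulas for solar declination and elevation; SunAngle itself presumably uses more precise algorithms, and the formulas below are textbook approximations, not taken from the tool.

# Approximate solar position (declination and elevation), as a tool like
# SunAngle might compute; these are standard low-accuracy textbook formulas.
import math

def solar_declination_deg(day_of_year):
    # Cooper-style approximation, accurate to roughly +/- 1 degree.
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

def solar_elevation_deg(lat_deg, day_of_year, solar_hour):
    # solar_hour is local solar time in hours (12 = solar noon).
    decl = math.radians(solar_declination_deg(day_of_year))
    lat = math.radians(lat_deg)
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_elev))

# Example: solar noon on day 172 (around June 21) at 40 N latitude.
print(round(solar_elevation_deg(40.0, 172, 12.0), 1))  # roughly 73 degrees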

Gronbeck, Christopher

244

Feature-based automatic configuration of semi-stationary multi-camera components  

NASA Astrophysics Data System (ADS)

Autonomously operating semi-stationary multi-camera components are the core modules of ad-hoc multi-view methods. On the one hand, a situation recognition system needs an overview of the entire scene, as given by a wide-angle camera; on the other hand, a close-up view of interesting agents from, e.g., an active pan-tilt-zoom (PTZ) camera is required to gather further information, e.g. to identify those agents. To configure such a system we set the field of view (FOV) of the overview camera in correspondence to the motor configuration of a PTZ camera. Images are captured from a uniformly moving PTZ camera until the entire field of view of the master camera is covered. Along the way, a lookup table (LUT) of motor coordinates of the PTZ camera and image coordinates in the master camera is generated. To match each pair of images, features (SIFT, SURF, ORB, STAR, FAST, MSER, BRISK, FREAK) are detected, selected by the nearest neighbor distance ratio (NNDR), and matched. A homography is estimated to transform the PTZ image to the master image. With that information, comprehensive LUTs are calculated via barycentric coordinates and stored for every pixel of the master image. In this paper the robustness, accuracy, and runtime are quantitatively evaluated for different features.
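A minimal sketch of one step of the pipeline described above: detecting features, applying the nearest-neighbour distance ratio test, and estimating the homography from a PTZ image to the master image with OpenCV. The feature type, ratio threshold, and RANSAC settings here are illustrative choices, not necessarily those evaluated in the paper.

# Sketch: ORB features + NNDR matching + homography (PTZ image -> master image).
# Parameter values are illustrative, not the paper's evaluated configuration.
import cv2
import numpy as np

def estimate_homography(ptz_img, master_img, ratio=0.75):
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(ptz_img, None)
    kp2, des2 = orb.detectAndCompute(master_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    knn = matcher.knnMatch(des1, des2, k=2)

    # Nearest-neighbour distance ratio (NNDR) selection.
    good = [m for m, n in knn if m.distance < ratio * n.distance]
    if len(good) < 4:
        return None

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H  # maps PTZ pixel coordinates into the master image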

Grosselfinger, Ann-Kristin; Münch, David; Hübner, Wolfgang; Arens, Michael

2013-10-01

245

Vibration detection and calibration method used to remote sensing optical camera  

NASA Astrophysics Data System (ADS)

In order to obtain sharp remote sensing images, image stabilization technology for space cameras and remote sensing image restoration technology are usually used. Vibration detection is the key to realizing these technologies: an image stabilization system needs the displacement vector derived from vibration detection to drive the compensation mechanism, and remote sensing image restoration needs the vibration displacement vector to construct the point spread function (PSF). Vibration detection can be used not only to improve the image quality of panchromatic, infrared, and other optical cameras; it is also the basis of motion compensation for satellite radar equipment. In this paper we construct a vibration measuring method based on a fiber optic gyro (FOG), a device sensitive to angular velocity or angular displacement. A high-precision FOG can be used to measure the jitter angle of the optical axis of a space camera fixed on a satellite platform. From the measured data, the vibration displacement vector of the imaging plane can be calculated. Consequently the vibration data provide a basis for image stabilization of the space camera and restoration of remote sensing images. We simulated the vibration of a space camera by using a piezoelectric ceramic deflection platform, and calibrated the vibration measurement by using a laser beam and a high-speed linear array camera. We compared the feedback output of the deflection platform, the FOG measured data and the calibrated data of the linear array camera, and obtained a calibration accuracy better than 1.5 µrad.
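The conversion from a measured jitter angle to an image-plane displacement is essentially a small-angle projection through the camera's focal length; the sketch below illustrates that step with assumed values (the focal length and the sample angles are not taken from the paper).

# Sketch: converting FOG-measured jitter angles to focal-plane displacement.
# Focal length and sample values are assumptions for illustration only.
import numpy as np

focal_length_m = 1.5                                      # assumed effective focal length
jitter_angle_rad = np.array([0.5e-6, 1.2e-6, -0.8e-6])    # FOG angular samples (rad)

# Small-angle approximation: displacement on the image plane = f * theta.
displacement_um = focal_length_m * jitter_angle_rad * 1e6
print(displacement_um)  # micrometres of image motion, usable to build a PSF or drive compensation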

Li, Qi; Dong, Wende; Xu, Zhihai; Feng, Huajun

2013-09-01

246

The MC and LFC cameras. [metric camera (MC); large format camera (LFC)  

NASA Technical Reports Server (NTRS)

The characteristics of the shuttle-borne Large Format Camera are listed. The LFC focal plane format was 23 by 46 cm, double the usual size, thereby acquiring approximately double the ground area. Forward motion compensation was employed. With the stable platform (shuttle) it was possible to use the slow exposure, high resolution, Kodak aerial films; 3414 and 3412 black and white, SO-242 color, and SO-131 aerochrome infrared. The camera was designed to maintain stability during varying temperature extremes of space.

Norton, Clarice L.; Schroeder, Manfried; Mollberg, Bernard

1986-01-01

247

Special Angle Pairs Discovery Activity  

NSDL National Science Digital Library

This lesson uses a discovery approach to identify the special angles formed when a set of parallel lines is cut by a transversal. During this lesson students identify the angle pair and the relationship between the angles. Students use this relationship and special angle pairs to make conjectures about which angle pairs are considered special angles.

Henry, Barbara

2012-04-16

248

Using Narrow Band Photometry to Detect Young Brown Dwarfs in IC348  

E-print Network

We report the discovery of a population of young brown dwarf candidates in the open star cluster IC348 and the development of a new spectroscopic classification technique using narrow band photometry. Observations were made using FLITECAM, the First Light Camera for SOFIA, at the 3-m Shane Telescope at Lick Observatory. FLITECAM is a new 1-5 micron camera with an 8 arcmin field of view. Custom narrow band filters were developed to detect absorption features of water vapor (at 1.495 microns) and methane (at 1.66 microns) characteristic of brown dwarfs. These filters enable spectral classification of stars and brown dwarfs without spectroscopy. FLITECAM's narrow and broadband photometry was verified by examining the color-color and color-magnitude characteristics of stars whose spectral type and reddening was known from previous surveys. Using our narrow band filter photometry method, it was possible to identify an object measured with a signal-to-noise ratio of 20 or better to within +/-3 spectral class subtypes for late-type stars. With this technique, very deep images of the central region of IC348 (H ~ 20.0) have identified 18 sources as possible L or T dwarf candidates. Out of these 18, we expect that between 3 - 6 of these objects are statistically likely to be background stars, with the remainder being true low-mass members of the cluster. If confirmed as cluster members then these are very low-mass objects (~5 Mjupiter). We also describe how two additional narrow band filters can improve the contrast between M, L, and T dwarfs as well as provide a means to determine the reddening of an individual object.

A. K. Mainzer; Ian S. McLean

2003-06-30

249

VLSI-distributed architectures for smart cameras  

Microsoft Academic Search

Smart cameras use video/image processing algorithms to capture images as objects, not as pixels. This paper describes architectures for smart cameras that take advantage of VLSI to improve the capabilities and performance of smart camera systems. Advances in VLSI technology aid in the development of smart cameras in two ways. First, VLSI allows us to integrate large amounts of processing

Wayne H. Wolf

2001-01-01

250

Smart Camera Networks in Virtual Reality  

Microsoft Academic Search

We present smart camera network research in the context of a unique new synthesis of advanced computer graphics and vision simulation technologies. We design and experiment with simulated camera networks within visually and behaviorally realistic virtual environments. Specifically, we demonstrate a smart camera network comprising static and active simulated video surveillance cameras that provides perceptive coverage of a large virtual

Faisal Qureshi; Demetri Terzopoulos

2007-01-01

251

Virtual Vision and Smart Camera Networks  

Microsoft Academic Search

This paper presents camera network research in the context of a unique synthesis of advanced computer graphics and vision simulation technologies. In particular, we propose the design of and experimentation with simulated camera networks within visually and behaviorally realistic virtual environments. Specifically, we demonstrate a smart camera network comprising static and active simulated video surveillance cameras that provides perceptive

Faisal Qureshi; Demetri Terzopoulos

252

Optimal Camera Network Configurations for Visual Tagging  

Microsoft Academic Search

Proper placement of cameras in a distributed smart camera network is an important design problem. Not only does it determine the coverage of the surveillance, but it also has a direct impact on the appearance of objects in the cameras which dictates the performance of all subsequent computer vision tasks. In this paper, we propose a generic camera placement model

Jian Zhao; Thinh Nguyen

2008-01-01

253

Photometric Calibration of Consumer Video Cameras  

NASA Technical Reports Server (NTRS)

Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral-density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
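As a rough illustration of the end-to-end calibration idea (fit the measured output signal against the known input brightness, then use the inverse to recover brightness from later measurements), here is a hedged sketch using monotone interpolation; it is not the custom software described in the abstract, and the calibration numbers are synthetic.

# Sketch: fit a camera response curve from calibration data and invert it.
# The data here are synthetic; the real method integrates the signal from
# frames of the artificial variable star described in the abstract.
import numpy as np

# Known input brightness (arbitrary units) and the integrated signal measured
# from the corresponding calibration frames (nonlinear, saturating response).
input_brightness = np.array([1, 2, 5, 10, 20, 50, 100, 200], dtype=float)
measured_signal = np.array([3, 6, 14, 26, 44, 75, 95, 104], dtype=float)

def brightness_from_signal(signal):
    # Invert the response by interpolating brightness as a function of signal.
    # Requires the measured response to be monotonically increasing.
    return np.interp(signal, measured_signal, input_brightness)

print(brightness_from_signal(60.0))   # estimated brightness for a measured signal of 60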

Suggs, Robert; Swift, Wesley, Jr.

2007-01-01

254

Video camera use at nuclear power plants  

SciTech Connect

A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations, and determine equipment used and the benefits realized. Basic closed circuit television camera (CCTV) systems are described and video camera operation principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposal cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use --- mainly reduced radiation exposure and increased productivity --- are discussed and quantified. 15 refs., 6 figs.

Estabrook, M.L.; Langan, M.O.; Owen, D.E. (ENCORE Technical Resources, Inc., Middletown, PA (USA))

1990-08-01

255

Surveillance camera scheduling: a virtual vision approach  

Microsoft Academic Search

ABSTRACT We present a surveillance system, comprising wide field-of-view (FOV) passive cameras and pan/tilt/zoom (PTZ) active cameras, which automatically captures and labels high-resolution videos of pedestrians as they move through a designated area. A wide-FOV stationary camera can track multiple pedestrians, while any PTZ active camera can capture high-quality videos of a single pedestrian at a time. We propose a multi-camera

Faisal Z. Qureshi; Demetri Terzopoulos

2006-01-01

256

Surveillance camera scheduling: a virtual vision approach  

Microsoft Academic Search

We present a surveillance system, comprising wide field-of-view (FOV) passive cameras and pan/tilt/zoom (PTZ) active cameras, which automatically captures and labels high-resolution videos of pedestrians as they move through a designated area. A wide-FOV stationary camera can track multiple pedestrians, while any PTZ active camera can capture high-quality videos of a single pedestrian at a time. We propose a multi-camera

Faisal Z. Qureshi; Demetri Terzopoulos

2005-01-01

257

WIDE-FIELD ASTRONOMICAL MULTISCALE CAMERAS  

SciTech Connect

In order to produce sufficiently low aberrations with a large aperture, telescopes have a limited field of view. Because of this narrow field, large areas of the sky at a given time are unobserved. We propose several telescopes based on monocentric reflective, catadioptric, and refractive objectives that may be scaled to wide fields of view and achieve 1.1 arcsecond resolution, which in most locations is the practical seeing limit of the atmosphere. The reflective and Schmidt catadioptric objectives have relatively simple configurations and enable large fields to be captured at the expense of the obscuration of the mirror by secondary optics, a defect that may be managed by image plane design. The refractive telescope design does not have an obscuration but the objective has substantial bulk. The refractive design is a 38 gigapixel camera which consists of a single monocentric objective and 4272 microcameras. Monocentric multiscale telescopes, with their wide fields of view, may observe phenomena that might otherwise be unnoticed, such as supernovae, glint from orbital space debris, and near-earth objects.

Marks, Daniel L.; Brady, David J., E-mail: dbrady@ee.duke.edu [Department of Electrical and Computer Engineering and Fitzpatrick Institute for Photonics, Box 90291, Duke University, Durham, NC 27708 (United States)

2013-05-15

258

11/18/2006 Mohanty 1 Secure Digital CameraSecure Digital Camera  

E-print Network

Presentation slides (11/18/2006) by Saraju P. Mohanty on the Secure Digital Camera (SDC), described as a digital integrated circuit solution for Digital Rights Management (DRM) using invisible watermarking; only fragments of the slide text are preserved in this excerpt.

Mohanty, Saraju P.

259

1/15/2007 Mohanty 1 Secure Digital CameraSecure Digital Camera  

E-print Network

Presentation slides (1/15/2007) by Saraju P. Mohanty on the Secure Digital Camera (SDC), described as a digital integrated circuit solution for Digital Rights Management (DRM) using invisible watermarking; only fragments of the slide text are preserved in this excerpt.

Mohanty, Saraju P.

260

Narrow escape and leakage of Brownian particles  

E-print Network

Questions of flux regulation in biological cells raise renewed interest in the narrow escape problem. The often inadequate expansions of the narrow escape time are due to the not-so-well-known fact that the boundary singularity of Green's function for Poisson's equation with Neumann and mixed Dirichlet-Neumann boundary conditions in three dimensions contains a logarithmic singularity. Using this fact, we find the second term in the expansion of the narrow escape time and in the expansion of the principal eigenvalue of the Laplace equation with mixed Dirichlet-Neumann boundary conditions, with small Dirichlet and large Neumann parts. We also find the leakage flux of Brownian particles that diffuse from a source to an absorbing target on a reflecting boundary of a domain, if a small perforation is made in the reflecting boundary.
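For orientation, the classical leading-order result that such expansions refine is shown below: the mean narrow escape time for a Brownian particle with diffusion coefficient D in a three-dimensional domain of volume |Ω|, escaping through a small circular absorbing window of radius a on an otherwise reflecting boundary. The paper's contribution is the next, logarithmic, term of this expansion, which is not reproduced here.

% Leading-order narrow escape time (classical small-window limit)
\bar{\tau} \approx \frac{|\Omega|}{4 a D}, \qquad a \ll |\Omega|^{1/3}.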

A. Singer; Z. Schuss; D. Holcman

2008-08-17

261

Cameras for semiconductor process control  

NASA Technical Reports Server (NTRS)

The application of X-ray topography to semiconductor process control is described, considering the novel features of the high speed camera and the difficulties associated with this technique. The most significant results on the effects of material defects on device performance are presented, including results obtained using wafers processed entirely within this institute. Defects were identified using the X-ray camera and correlations made with probe data. Also included are temperature dependent effects of material defects. Recent applications and improvements of X-ray topographs of silicon-on-sapphire and gallium arsenide are presented with a description of a real time TV system prototype and of the most recent vacuum chuck design. Discussion is included of our promotion of the use of the camera by various semiconductor manufacturers.

Porter, W. A.; Parker, D. L.

1977-01-01

262

The GISMO-2 Bolometer Camera  

NASA Technical Reports Server (NTRS)

We present the concept for the GISMO-2 bolometer camera, which we are building for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array, the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; Moseley, Samuel H.; Sharp, Elemer H.; Wollack, Edward J.

2012-01-01

263

The Discovery of a Very Narrow-Line Star Forming Object at a Redshift of 5.66  

Microsoft Academic Search

We report on the discovery of a very narrow-line star forming object beyond a redshift of 5. Using the prime-focus camera, Suprime-Cam, on the 8.2 m Subaru telescope together with a narrow-passband filter centered at λ_c = 8150 Å with a passband of Δλ = 120 Å, we have obtained a very deep image of the field surrounding the quasar SDSSp

Y. Taniguchi; M. Ajiki; T. Murayama; T. Nagao; S. Veilleux; D. B. Sanders; Y. Komiyama; Y. Shioya; S. S. Fujita; Y. Kakazu; S. Okamura; H. Ando; T. Nishimura; M. Hayashi; R. Ogasawara; S. Ichikawa

2003-01-01

264

'Magic Angle Precession'  

SciTech Connect

An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.

Binder, Bernd [Quanics.com, Germany, 88679 Salem, P.O. Box 1247 (United States)], E-mail: binder@quanics.com

2008-01-21

265

Development of filter exchangeable 3CCD camera for multispectral imaging acquisition  

NASA Astrophysics Data System (ADS)

There are many methods to acquire multispectral images, but a dynamic band-selective, area-scan multispectral camera has not been developed yet. This research focused on the development of a filter-exchangeable 3CCD camera, modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame and an electronic circuit for parallel image signal processing. In addition, firmware and application software have been developed. Remarkable improvements compared to a conventional 3CCD camera are its redesigned image splitter and filter-exchangeable frame. Computer simulation is required to visualize the ray path inside the prism when redesigning the image splitter. The dimensions of the splitter are then determined by computer simulation with options for BK7 glass and a non-dichroic coating. These properties have been considered to obtain full-wavelength rays on all film planes. The image splitter was verified with two narrow-waveband line lasers. The filter-exchangeable frame is designed to allow swapping bandpass filters without changing the position of the image sensors on the film plane. The developed 3CCD camera was evaluated in an application detecting scab and bruises on Fuji apples. As a result, the filter-exchangeable 3CCD camera can provide meaningful functionality for various multispectral applications that require exchanging bandpass filters.
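The ray-path simulation mentioned above boils down to repeated application of Snell's law at each glass interface; the sketch below shows that single step for a BK7 surface, using a nominal refractive index, and is not the authors' simulation code.

# Sketch: refraction at a single BK7 surface via Snell's law, the basic step in
# tracing a ray path through the (non-dichroic) image-splitter prism.
import math

N_AIR = 1.000
N_BK7 = 1.5168   # nominal refractive index of BK7 near 587 nm

def refract_angle(incidence_deg, n_in=N_AIR, n_out=N_BK7):
    """Return the refraction angle in degrees, or None for total internal reflection."""
    s = n_in / n_out * math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))

print(round(refract_angle(30.0), 2))           # air -> BK7
print(refract_angle(45.0, N_BK7, N_AIR))       # BK7 -> air, beyond the critical angle -> None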

Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

2012-05-01

266

Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquire multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972. LROC allows us to determine the recent impact rate of bolides in the size range of 0.5 to 10 meters, which is currently not well known. Determining the impact rate at these sizes enables engineering remediation measures for future surface operations and interplanetary travel. The WAC has imaged nearly the entire Moon in seven wavelengths. A preliminary global WAC stereo-based topographic model is in preparation [1] and global color processing is underway [2]. As the mission progresses repeat global coverage will be obtained as lighting conditions change providing a robust photometric dataset. The NACs are revealing a wealth of morphologic features at the meter scale providing the engineering and science constraints needed to support future lunar exploration. All of the Apollo landing sites have been imaged, as well as the majority of robotic landing and impact sites. Through the use of off-nadir slews a collection of stereo pairs is being acquired that enable 5-m scale topographic mapping [3-7]. Impact morphologies (terraces, impact melt, rays, etc) are preserved in exquisite detail at all Copernican craters and are enabling new studies of impact mechanics and crater size-frequency distribution measurements [8-12]. Other topical studies including, for example, lunar pyroclastics, domes, and tectonics are underway [e.g., 10-17]. The first PDS data release of LROC data will be in March 2010, and will include all images from the commissioning phase and the first 3 months of the mapping phase.
[1] Scholten et al. (2010) 41st LPSC, #2111; [2] Denevi et al. (2010a) 41st LPSC, #2263; [3] Beyer et al. (2010) 41st LPSC, #2678; [4] Archinal et al. (2010) 41st LPSC, #2609; [5] Mattson et al. (2010) 41st LPSC, #1871; [6] Tran et al. (2010) 41st LPSC, #2515; [7] Oberst et al. (2010) 41st LPSC, #2051; [8] Bray et al. (2010) 41st LPSC, #2371; [9] Denevi et al. (2010b) 41st LPSC, #2582; [10] Hiesinger et al. (2010a) 41st LPSC, #2278; [11] Hiesinger et al. (2010b) 41st LPSC, #2304; [12] van der Bogert et al. (2010) 41st LPSC, #2165;

Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

267

Narrow-band ELF events observed from South Pole Station  

NASA Astrophysics Data System (ADS)

Extremely Low Frequency (ELF) waves are typically in the range of 3 Hz - 3 kHz and can play a role in acceleration and pitch-angle scattering of energetic particles in the radiation belts. Observations of a not uncommon, but not well studied ELF phenomenon are presented with ground-based data from South Pole Station. The narrow-band waves last approximately one or two minutes maintaining bandwidth over the course of the event, begin around 100 Hz, decrease to about 70 Hz, and typically show a higher frequency harmonic. The waves have only been documented at four locations - Heacock, 1974 (Alaska); Sentman and Ehring, 1994 (California); Wang et al, 2005 and Wang et al, 2011 (Taiwan); and Kim et al, 2006 (South Pole). The waves observed at the South Pole are not detected when the Sun drops below a 10 degree elevation angle, which is not true for the other locations. We extend the study of Kim et al, 2006, and explore possible generation mechanisms including sunlit ionosphere and ion cyclotron wave modes, as well as correspondence with energetic particle precipitation.

Heavisides, J.; Weaver, C.; Lessard, M.; Weatherwax, A. T.

2012-12-01

268

Multiple Sensor Camera for Enhanced Video Capturing  

NASA Astrophysics Data System (ADS)

The resolution of cameras has been drastically improved in response to the demand for high-quality digital images. For example, a digital still camera has several megapixels. Although a video camera has a higher frame rate, its resolution is lower than that of a still camera. Thus, high resolution is incompatible with the high frame rate of ordinary cameras on the market. It is difficult to solve this problem with a single sensor, since it comes from the physical limitation of the pixel transfer rate. In this paper, we propose a multi-sensor camera for capturing a resolution- and frame-rate-enhanced video. A common multi-CCD camera, such as a 3CCD color camera, uses identical CCDs for capturing different spectral information. Our approach is to use sensors of different spatio-temporal resolution in a single camera cabinet for capturing higher-resolution and higher-frame-rate information separately. We built a prototype camera which can capture high-resolution (2588×1958 pixels, 3.75 fps) and high frame-rate (500×500, 90 fps) videos. We also propose a calibration method for the camera. As one application of the camera, we demonstrate an enhanced video (2128×1952 pixels, 90 fps) generated from the captured videos to show the utility of the camera.

Nagahara, Hajime; Kanki, Yoshinori; Iwai, Yoshio; Yachida, Masahiko

269

Adverse effects of prohibiting narrow provider networks.  

PubMed

Many insurers participating in the new insurance exchanges are controlling costs by offering plans with narrow provider networks. Proposed regulations would promote network adequacy, but a pro-provider stance may not be inherently pro-consumer or even pro-patient. PMID:25119604

Howard, David H

2014-08-14

270

Congenital narrowing of the spinal canal  

Microsoft Academic Search

Further examples of congenital narrowing of the spinal canal in the lumbar and cervical regions are presented. It is implied that the condition is a nosological entity. Neurogenic intermittent claudication often accompanies the lumbar variety; the diagnosis in the cervical region is, however, radiological. The similarity to some of the features of achondroplasia is stressed. Symptoms are usually relieved by

D O Hancock

1967-01-01

271

From Reduction Machines To Narrowing Machines  

Microsoft Academic Search

Narrowing, the evaluation mechanism of functional logic languages, can be seen as a generalization of reduction, the evaluation mechanism of purely functional languages. The unidirectional pattern matching, which is used for parameter passing in functional languages, is simply replaced by the bidirectional unification known from logic programming languages. We show in this paper how to extend a reduction machine, that has been designed for

Rita Loogen; RWTH Aachen

1991-01-01

272

Policy message A narrow focus on conventional  

E-print Network

Policy message: (1) A narrow focus on conventional sanitation technologies and top-down planning often prevents improvement of sanitation in poor settlements. (2) Simple, affordable, effective tech... The studies featured here were conducted in Lao PDR, Tanzania, and Nepal. Local solutions for sanitation, Urban...

Richner, Heinz

273

Gas and Liquid Transfer in Narrow Pores  

Microsoft Academic Search

A microhydrodynamic approach is formulated to describe the molecular flows in micro- and mesoporous systems, including the region of capillary phenomena, where one needs a single set of equations characterizing the flows of both dense gases and liquids. The set of equations of the transfer of dense fluids in narrow pores is closed by using the simplest molecular model, specifically,

Yu. K. Tovbin

2002-01-01

274

Narrow-Band Applications of Communications Satellites.  

ERIC Educational Resources Information Center

This paper attempts to describe the advantages of "narrow-band" applications of communications satellites for education. It begins by discussing the general controversy surrounding the use of satellites in education, by placing the concern within the larger context of the general debate over the uses of new technologies in education, and by…

Cowlan, Bert; Horowitz, Andrew

275

Perceptual Narrowing During Infancy: A Comparison of  

E-print Network

/ or experience-expectant) initial sensitivities that prepare the infant for learning about aspects of their world species, own race). We then consider possible reasons for the apparent differences in the timing of narrowing (e.g., apparently earlier for own race than for own species). Throughout we consider whether

Maurer, Daphne M.

276

CIT: Mechanical Rotating Mirror Cameras  

Microsoft Academic Search

Mechanical Rotating Mirror (RM) Cameras Nuclear uses are: (1) Developing high explosive (HE) components and initiation systems for nuclear explosive devices; (2) Dynamic materials properties studies; and (3) Provide useful information on the performance of nuclear explosive device components driven by HE. Other uses are: (1) Military applications in shaped charge development; projectile ballistics and impact; (2) Diagnose high-speed phenomena

Davis R. Thomsen; Loretta A. Weiss

2012-01-01

277

High speed multiwire photon camera  

NASA Technical Reports Server (NTRS)

An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

Lacy, Jeffrey L. (Inventor)

1989-01-01

278

High speed multiwire photon camera  

NASA Technical Reports Server (NTRS)

An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

Lacy, Jeffrey L. (Inventor)

1991-01-01

279

Directing Performers for the Cameras.  

ERIC Educational Resources Information Center

An excellent way for an undergraduate, novice director of television and film to pick up background experience in directing performers for cameras is by participating in nonbroadcast-film activities, such as theatre, dance, and variety acts, both as performer and as director. This document describes the varieties of activities, including creative,…

Wilson, George P., Jr.

280

All-sky camera revitalized  

Microsoft Academic Search

Development and implementation of a low-cost all-sky camera (ASC) system is reported. The ASC system provides continuous unmanned recording for about a month before tapes are replaced. Radiometric calibration, geometric correction and projection of the image onto a geographic or geomagnetic coordinate system are performed by a user-friendly software.

Israel Oznovich; Ronald Yee; Andreas Schiffler; Donald J. McEwen; Gerorge J. Sofko

1994-01-01

281

All-sky camera revitalized  

NASA Astrophysics Data System (ADS)

Development and implementation of a low-cost all-sky camera (ASC) system is reported. The ASC system provides continuous unmanned recording for about a month before tapes are replaced. Radiometric calibration, geometric correction and projection of the image onto a geographic or geomagnetic coordinate system are performed by a user-friendly software.

Oznovich, Israel; Yee, Ronald; Schiffler, Andreas; McEwen, Donald J.; Sofko, Gerorge J.

1994-10-01

282

Binarising Camera Images for OCR  

Microsoft Academic Search

In this paper we describe a new binarisation method designed specifically for OCR of low quality camera images: Background Surface Thresholding or BST. This method is robust to lighting variations and produces images with very little noise and consistent stroke width. BST computes a
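The abstract's description is cut off, but the general idea of background-surface thresholding, estimating a smooth background surface and thresholding each pixel relative to it, can be sketched as follows; the blur-based background estimate and the offset used here are generic assumptions, not the BST algorithm's actual procedure.

# Sketch of background-surface-style binarisation: estimate a smooth background
# surface, then threshold each pixel relative to it. This is a generic
# approximation, not the exact BST method from the paper.
import cv2
import numpy as np

def binarise_relative_to_background(gray, blur_ksize=51, offset=0.85):
    # Heavy blur as a crude estimate of the slowly varying background surface.
    background = cv2.GaussianBlur(gray.astype(np.float32), (blur_ksize, blur_ksize), 0)
    # Foreground (ink) pixels are noticeably darker than the local background.
    return (gray.astype(np.float32) < offset * background).astype(np.uint8) * 255

img = cv2.imread("document_photo.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input file
if img is not None:
    binary = binarise_relative_to_background(img)
    cv2.imwrite("document_binary.png", binary)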

Mauritius Seeger; Christopher R. Dance

2001-01-01

283

Measuring Distances Using Digital Cameras  

ERIC Educational Resources Information Center

This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…
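The underlying geometry is the pinhole relation between object size, image size, and focal length; a hedged sketch of the basic calculation follows, with sensor and lens parameters that are illustrative rather than taken from the paper.

# Sketch: estimating object distance from a digital photo using the pinhole
# relation  distance = focal_length * real_size / image_size.
# Sensor and lens parameters below are illustrative assumptions.
focal_length_mm = 50.0        # lens focal length
sensor_width_mm = 23.6        # APS-C sensor width (assumed)
image_width_px = 6000         # image width in pixels (assumed)

def object_distance_m(real_width_m, width_in_pixels):
    pixel_pitch_mm = sensor_width_mm / image_width_px
    image_width_mm = width_in_pixels * pixel_pitch_mm
    return focal_length_mm * (real_width_m * 1000.0) / image_width_mm / 1000.0

# A 2 m wide object spanning 400 pixels of the image:
print(round(object_distance_m(2.0, 400), 1))  # roughly 64 m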

Kendal, Dave

2007-01-01

284

Stratoscope 2 integrating television camera  

NASA Technical Reports Server (NTRS)

The development, construction, test and delivery of an integrating television camera for use as the primary data sensor on Flight 9 of Stratoscope 2 is described. The system block diagrams are presented along with the performance data, and definition of the interface of the telescope with the power, telemetry, and communication system.

1973-01-01

285

NICMOS Narrow-band Infrared Photometry of TW Hya Association Stars  

Microsoft Academic Search

We have obtained 1.64, 1.90 and 2.15 micron narrow-band images of five T Tauri stars in the TW Hya Association (TWA) using the Near-Infrared Camera and Multiobject Spectrometer aboard the Hubble Space Telescope. Most of the T Tauri stars in our study show evidence of absorption by H2O vapor in their atmospheres; in addition, the low-mass brown dwarf candidate, TWA

David A. Weintraub; Didier Saumon; Joel H. Kastner; Thierry Forveille

2000-01-01

286

In-flight calibration of the Dawn Framing Camera  

NASA Astrophysics Data System (ADS)

We present a method for calibrating images acquired by the Dawn Framing Camera (FC) that is based on the results of an in-flight calibration campaign performed during the cruise from Earth to Vesta. We describe this campaign and the data analysis in full. Both the primary camera FC2 and the backup camera FC1 are radiometrically and geometrically calibrated through observations of standard stars, star fields, and Solar System objects. The calibration in each spectral filter is accurate to within a few percent for point sources. Geometric distortion, small by design, is characterized with high accuracy. Dark current, monitored on a regular basis, is very low at flight operational temperatures. Out-of-field stray light was characterized using the Sun as a stray light source. In-field stray light is confirmed in narrow-band filter images of Vesta. Its magnitude and distribution are scene-dependent, and expected to contribute significantly to images of extended objects. Description of a method for in-field stray light correction is deferred to a follow-up paper, as is a discussion of the closely related topic of flat-fielding.
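In general terms, radiometric calibration of this kind maps raw counts to physical units by removing dark current and dividing by exposure time and a per-filter responsivity derived from the standard-star observations; a hedged sketch of that step follows, with made-up coefficients rather than the actual FC calibration constants.

# Generic radiometric calibration step: counts -> calibrated signal, per filter.
# The responsivity and dark values are placeholders, not the FC calibration.
def calibrate(raw_dn, dark_dn, exposure_s, responsivity):
    """Calibrated signal = (raw - dark) / (exposure * responsivity)."""
    return (raw_dn - dark_dn) / (exposure_s * responsivity)

# Example with assumed numbers for one spectral filter:
print(calibrate(raw_dn=3200.0, dark_dn=40.0, exposure_s=0.05, responsivity=1.2e4))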

Schröder, S. E.; Maue, T.; Gutiérrez Marqués, P.; Mottola, S.; Aye, K. M.; Sierks, H.; Keller, H. U.; Nathues, A.

2013-11-01

287

Using Inscribed Angles and Polygons  

NSDL National Science Digital Library

This unit will teach you about inscribed angles, intercepted arcs, their measures, inscribed polygons, and their associated theorems. OK, time for notes! Define Inscribed Angles, using the following website (Only define the inscribed angle from this site): Inscribed Angle Definition Using this new idea, you can use the following activity to figure out the formula for the measure of an inscribed angle: Inscribed Angle Formula Discovery The whole lesson depends upon this definition. Define Intercepted Arc, Inscribed polygons, ...

Neubert, Mrs.

2011-03-10

288

What Is the Angle?  

NSDL National Science Digital Library

This activity will help students understand how the angle of the Sun affects temperatures around the globe. After experimenting with a heat lamp and thermometers at differing angles, students apply what they learned to explain temperature variations on Earth. The printable six-page handout includes a series of inquiry-based questions to get students thinking about what they already know about temperature patterns, detailed experiment directions, and a worksheet that will help students use the experiment results to gain a deeper understanding of seasonal temperature changes and why Antarctica is always so cold. The students will explore all the angles of sunlight with a few thermometers and a heat lamp and understand why there is such a dramatic temperature change between the equator and the South Pole.

289

What's the Angle?  

NSDL National Science Digital Library

This classroom activity helps students understand how the angle of the Sun affects temperatures around the globe. After experimenting with a heat lamp and thermometers at differing angles, students apply what they learned to explain temperature variations on Earth. The printable six-page handout includes a series of inquiry-based questions to get students thinking about what they already know about temperature patterns, detailed experiment directions and a worksheet that helps students use the experiment results to gain a deeper understanding of seasonal temperature changes and why Antarctica is always so cold.

290

The role of contact angle on unstable flow formation during infiltration and drainage in wettable porous media  

NASA Astrophysics Data System (ADS)

The impact of contact angle on 2-D spatial and temporal water-content distribution during infiltration and drainage was experimentally studied. The 0.3-0.5 mm fraction of a quartz dune sand was treated and turned subcritically repellent (contact angle of 33°, 48°, 56°, and 75° for S33, S48, S56, and S75, respectively). The media were packed uniformly in transparent flow chambers and water was supplied to the surface as a point source at different rates (1-20 ml/min). A sequence of gray-value images was taken by CCD camera during infiltration and subsequent drainage; gray values were converted to volumetric water content by water volume balance. Narrow and long plumes with water accumulation behind the downward moving wetting front (tip) and negative water gradient above it (tail) developed in the S56 and S75 media during infiltration at lower water application rates. The plumes became bulbous with spatially uniform water-content distribution as water application rates increased. All plumes in these media propagated downward at a constant rate during infiltration and did not change their shape during drainage. In contrast, regular plume shapes were observed in the S33 and S48 media at all flow rates, and drainage profiles were nonmonotonic with a transition plane at the depth that water reached during infiltration. Given that the studied media have similar pore-size distributions, the conclusion is that imbibition hindered by the nonzero contact angle induced pressure buildup at the wetting front (dynamic water-entry value) that controlled the plume shape and internal water-content distribution during infiltration and drainage.

Wallach, Rony; Margolis, Michal; Graber, Ellen R.

2013-10-01

291

General linear cameras : theory and applications  

E-print Network

I present a General Linear Camera (GLC) model that unifies many previous camera models into a single representation. The GLC model describes all perspective (pinhole), orthographic, and many multiperspective (including ...

Yu, Jingyi, 1978-

2005-01-01

292

21 CFR 892.1110 - Positron camera.  

...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

2014-04-01

293

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2011 CFR

...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

2011-04-01

294

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2012 CFR

...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

2012-04-01

295

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2013 CFR

...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

2013-04-01

296

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2010 CFR

...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

2010-04-01

297

Selecting a digital camera for telemedicine.  

PubMed

The digital camera is an essential component of store-and-forward telemedicine (electronic consultation). There are numerous makes and models of digital cameras on the market, and selecting a suitable consumer-grade camera can be complicated. Evaluation of digital cameras includes investigating the features and analyzing image quality. Important features include the camera settings, ease of use, macro capabilities, method of image transfer, and power recharging. Consideration needs to be given to image quality, especially as it relates to color (skin tones) and detail. It is important to know the level of the photographer and the intended application. The goal is to match the characteristics of the camera with the telemedicine program requirements. In the end, selecting a digital camera is a combination of qualitative (subjective) and quantitative (objective) analysis. For the telemedicine program in Alaska in 2008, the camera evaluation and decision process resulted in a specific selection based on the criteria developed for our environment. PMID:19519277

Patricoski, Chris; Ferguson, A Stewart

2009-06-01

298

Subnanosecond x-ray framing camera  

SciTech Connect

A subnanosecond x-ray framing camera is described. Experiments conducted at the Rutherford Appleton Laboratory in which the camera was used to observe six beam laser implosions of microballoons with an interframe time of 500 ps are also described.

Finn, N.; Hall, T.A.; McGoldrick, E.

1985-04-15

299

Optimising Camera Traps for Monitoring Small Mammals  

PubMed Central

Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera’s field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera’s field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera’s field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

2013-01-01

300

Quadrotor control using dual camera visual feedback  

Microsoft Academic Search

In this paper, a vision-based stabilization and output tracking control method for a four-rotor helicopter has been proposed. A novel two-camera method has been described for estimating the full 6 DOF pose of the helicopter. This two-camera system consists of a pan-tilt ground camera and an onboard camera. The pose estimation algorithm is compared in simulation

Erdinç Altug; James P. Ostrowski; Camillo J. Taylor

2003-01-01

301

Thermal Modeling of Multipass Narrow Gap Pulse Current GMA Welding by Single Seam per Layer Deposition Technique  

Microsoft Academic Search

A thermal model has been developed for the production of multipass, single-seam-per-layer, narrow gap pulse current gas metal arc (GMA) weld joints of thick plates that are free from lack of groove-wall fusion with no angle of attack to the groove wall. The model considers the fusion of the groove wall as well as a part of the earlier deposited weld based

Banshi Prasad Agrawal; P. K. Ghosh

2010-01-01

302

Atmospheric monitoring of Mars by the Mars Orbiter Camera on Mars global surveyor  

Microsoft Academic Search

MOC wide-angle cameras routinely produce daily global maps of Mars in two colors at ~7.5 km/pixel resolution. These images have been used to study several seasonal phenomena linked to atmospheric processes and condensate cycles: dust storms, clouds, and polar recessions. Preliminary results of observations of polar caps and dust storms are presented here.

P. B. James; B. A. Cantor

2002-01-01

303

Atmospheric monitoring of Mars by the Mars Orbiter Camera on Mars global surveyor  

Microsoft Academic Search

MOC wide-angle cameras routinely produce daily global maps of Mars in two colors at ~7.5 km/pixel resolution. These images have been used to study several seasonal phenomena linked to atmospheric processes and condensate cycles: dust storms, clouds, and polar recessions. Preliminary results of observations of polar caps and dust storms are presented here.

P. B. James; B. A. Cantor

2002-01-01

304

Scene recovery from many randomly distributed single pixel cameras R. B. Fisher  

E-print Network

and local communication system (e.g. radio or optical). The `camera' measures the intensity of light arriving within a fixed angle from its optical axis. These devices are feasible with current technology, given the sensors and their positions. If the sensors were regularly placed, then methods inspired by CT/MRI

Fisher, Bob

305

Contact Angle Measurements Using a Simplified Experimental Setup  

ERIC Educational Resources Information Center

A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…

Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

2010-01-01

306

Image quality testing of assembled IR camera modules  

NASA Astrophysics Data System (ADS)

Infrared (IR) camera modules for the LWIR (8-12 μm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors with readout electronics are becoming more and more a mass market product. At the same time, steady improvements in sensor resolution in the higher priced markets raise the requirement for imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods such as minimum resolvable temperature difference (MRTD), which gives only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF) for broadband or with various bandpass filters on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces, and, last but not least, the suitability for fully automated measurements in mass production.

Winters, Daniel; Erichsen, Patrik

2013-10-01
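As a rough illustration of the MTF measurement mentioned in the abstract above (not the authors' test-station software; the pixel pitch and the line-spread-function input are assumptions), one can estimate an MTF curve by Fourier-transforming a sampled line-spread function and normalizing to its zero-frequency value:

import numpy as np

def mtf_from_lsf(lsf, pixel_pitch_mm):
    """Estimate the MTF from a sampled line-spread function (LSF).

    lsf            : 1-D array of intensity samples across a line/edge image
    pixel_pitch_mm : detector pixel pitch in millimetres
    Returns (spatial frequencies in cycles/mm, normalized MTF values).
    """
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf - lsf.min()                  # remove background offset
    lsf = lsf / lsf.sum()                  # normalize the LSF area to 1
    otf = np.fft.rfft(lsf)                 # one-sided optical transfer function
    mtf = np.abs(otf) / np.abs(otf[0])     # modulation transfer = |OTF| / DC value
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
    return freqs, mtf

# Hypothetical example: a Gaussian-blurred line imaged on a 17 um pitch FPA
x = np.arange(-32, 33)
lsf = np.exp(-0.5 * (x / 2.5) ** 2)
f, m = mtf_from_lsf(lsf, pixel_pitch_mm=0.017)
print(f[:5], m[:5])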

307

Narrow bandwidth tunable optical parametric generator  

NASA Astrophysics Data System (ADS)

The output of a periodically poled lithium niobate (PPLN) optical parametric generator (OPG) is filtered using an off-axis Fabry-Perot etalon. The filtered output is then parametrically amplified in the same PPLN crystal, resulting in a tunable narrow-band infrared source. The PPLN OPG is pumped with a 10 ns pulse duration, 1.064 μm single-frequency pump laser, with an output signal and idler determined by the PPLN periodicity. The polarization of the pump laser is rotated so that only a portion of it is phase matched on the first pass through the PPLN crystal. The portion that is phase matched generates a signal that is directed to an off-axis Fabry-Perot etalon, which, in the off-axis configuration, has a narrow bandwidth reflection. The pump beam is transmitted through a quarter wave plate and reflected with a mirror so that when passed back through the PPLN crystal, its polarization is rotated 90 degrees with respect to the input. Hence the portion of the pump not phase matched on the first pass is now phase matched for the second pass. The reflected and filtered signal is co-aligned with the pump, resulting in a narrow bandwidth amplified signal. This system is capable of generating a narrow bandwidth over the tuning range of the PPLN crystal and is only restricted by the etalon reflectivity range. We demonstrate tunability in the 1.4 μm-1.6 μm signal range (3.0 μm-4.4 μm idler range), which is restricted by our etalon reflectivity.

Dolasinski, Brian; Powers, Peter

2013-03-01

308

Cooperative Object Tracking with Multiple PTZ Cameras  

Microsoft Academic Search

Research in visual surveillance systems is shifting from using few stationary, passive cameras to employing large heterogeneous sensor networks. One promising type of sensor in particular is the Pan-Tilt-Zoom (PTZ) camera, which can cover a potentially much larger area than passive cameras, and can obtain much higher resolution imagery through zoom capacity. In this paper, a system that

I. Everts; Nicu Sebe; G. A. Jones

2007-01-01

309

Combustion pinhole-camera system  

DOEpatents

A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

Witte, A.B.

1982-05-19

310

The Advanced Camera for Surveys  

NSDL National Science Digital Library

The Johns Hopkins University describes the Advanced Camera for Surveys (ACS), which was installed in the Hubble Space Telescope in 2002 to "detect light from the ultraviolet to the near infrared." Users can view a photo gallery of the filters, detectors, optical bench, astronomers, and other aspects of ACS optical and mechanical components. While some parts of the website are restricted, scientists can find abstracts and full-text scientific papers, explanations of calibration, the coronagraph and other instruments, and press releases.

311

Unassisted 3D camera calibration  

NASA Astrophysics Data System (ADS)

With the rapid growth of 3D technology, 3D image capture has become a critical part of the 3D feature set on mobile phones. 3D image quality is affected by the scene geometry as well as on-the-device processing. An automatic 3D system usually assumes known camera poses accomplished by factory calibration using a special chart. In real life settings, pose parameters estimated by factory calibration can be negatively impacted by movements of the lens barrel due to shaking, focusing, or camera drop. If any of these factors displaces the optical axes of either or both cameras, vertical disparity might exceed the maximum tolerable margin and the 3D user may experience eye strain or headaches. To make 3D capture more practical, one needs to consider unassisted (on arbitrary scenes) calibration. In this paper, we propose an algorithm that relies on detection and matching of keypoints between left and right images. Frames containing erroneous matches, along with frames with insufficiently rich keypoint constellations, are detected and discarded. Roll, pitch, yaw, and scale differences between left and right frames are then estimated. The algorithm performance is evaluated in terms of the remaining vertical disparity as compared to the maximum tolerable vertical disparity.

Atanassov, Kalin; Ramachandra, Vikas; Nash, James; Goma, Sergio R.

2012-03-01
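A minimal sketch of the keypoint detection-and-matching step described in the abstract above, assuming OpenCV; the ORB detector, match thresholds and median-based vertical-disparity estimate are illustrative choices, not the authors' algorithm:

import cv2
import numpy as np

def vertical_disparity(left_gray, right_gray, max_matches=200):
    """Match keypoints between left/right frames and estimate vertical disparity (pixels)."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_l, des_l = orb.detectAndCompute(left_gray, None)
    kp_r, des_r = orb.detectAndCompute(right_gray, None)
    if des_l is None or des_r is None:
        return None  # keypoint constellation too poor; discard this frame pair

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_l, des_r), key=lambda m: m.distance)[:max_matches]
    if len(matches) < 20:
        return None  # insufficiently rich set of matches; discard this frame pair

    pts_l = np.float32([kp_l[m.queryIdx].pt for m in matches])
    pts_r = np.float32([kp_r[m.trainIdx].pt for m in matches])
    dy = pts_r[:, 1] - pts_l[:, 1]         # per-match vertical offset
    return float(np.median(dy))            # robust vertical-disparity estimate

# Hypothetical file names for a stereo pair captured by the two cameras
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
if left is None or right is None:
    raise SystemExit("provide left.png and right.png")
print(vertical_disparity(left, right))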

312

Dynamic camera calibration of roadside traffic management cameras for vehicle speed estimation  

Microsoft Academic Search

In this paper, we present a new three-stage algorithm to calibrate roadside traffic management cameras and track vehicles to create a traffic speed sensor. The algorithm first estimates the camera position relative to the roadway using the motion and edges of the vehicles. Given the camera position, the algorithm then calibrates the camera by estimating the lane boundaries and the

Todd N. Schoepflin; Daniel J. Dailey

2003-01-01

313

Ultra-narrow laser linewidth measurement  

NASA Astrophysics Data System (ADS)

In this report, we give a deeper investigation of the loss-compensated recirculating delayed self-heterodyne interferometer (LC-RDSHI) for ultra-narrow linewidth measurement, including the theoretical analysis, experimental implementation, further modification of the system and more applications. Recently, fiber lasers with less than 1 kHz linewidth have been commercialized. But even the manufacturers face a challenge in accurately measuring the linewidth of such lasers. There is a need to develop more accurate methods to characterize ultra-narrow laser linewidth and frequency noise. Compared with other currently available linewidth measurement techniques, the loss-compensated recirculating delayed self-heterodyne interferometer (LC-RDSHI) technique is the most promising one. It overcomes the bottleneck of the high resolution requirement on the delayed self-heterodyne interferometer (DSHI) by using a short length of fiber delay line. This method does not need another narrower and more stable laser as the reference, which is a necessary component in heterodyne detection. The laser spectral lineshape can be observed directly instead of through the complicated interpretation required by frequency discriminator techniques. The theoretical analysis of a LC-RDSHI gives us guidance on choosing the optimal parameters of the system and assists us in interpreting the recorded spectral lineshape. A laser linewidth as narrow as 700 Hz has been shown to be measurable using the LC-RDSHI method. Non-linear curve fitting of the Voigt lineshape to separate the Lorentzian and Gaussian components was investigated. The Voigt curve fitting results give a clear view of the laser frequency noise and the nature of the laser linewidth. It is also shown that for an ultra-narrow linewidth laser, simply taking the width 20 dB down from the maximum of the beat spectrum and dividing by 2√99 will overestimate the laser linewidth and coherence length. Besides laser linewidth measurement in the frequency domain, we also implemented time-domain frequency noise measurement using a LC-RDSHI. The long fiber delay obtained by a fiber recirculating loop provides a higher resolution of frequency noise measurement. However, spectral width broadening due to fiber nonlinearity, environmental perturbations and the laser's intrinsic 1/f frequency noise are still potential problems in the LC-RDSHI method. A new method that adds a transmitter switch and a loop switch is proposed to minimize the Kerr effect caused by multiple recirculation.

Chen, Xiaopei
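As a worked example of the "-20 dB" rule of thumb discussed above (which, as the author notes, overestimates the linewidth of ultra-narrow lasers), a Lorentzian laser linewidth follows from the -20 dB full width of the self-heterodyne beat spectrum divided by 2√99:

import math

def lorentzian_linewidth_from_beat(width_20db_hz):
    """Estimate the Lorentzian laser linewidth (FWHM) from the -20 dB full width
    of a delayed self-heterodyne beat spectrum: delta_nu = W20 / (2 * sqrt(99))."""
    return width_20db_hz / (2.0 * math.sqrt(99.0))

# Hypothetical measurement: the beat note is 40 kHz wide at -20 dB from its peak
print(f"{lorentzian_linewidth_from_beat(40e3):.0f} Hz")   # about 2010 Hz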

314

Find Angle Measures in Polygons  

NSDL National Science Digital Library

This lesson will introduce you to polygons and their angle measures. Focus on the Interior angles and exterior angles and their properties. First, let's discuss diagonals. What is a diagonal in a polygon? Play with and take notes on the following web site: Diagonals of a Polygon Now you are ready to learn the Polygon Interior Angles Theorem. It involves finding the measure of all of the angles inside a polygon, no matter how big or little ...

Neubert, Mrs.

2011-02-09
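A small sketch of the Polygon Interior Angles Theorem referenced in this lesson: the interior angles of an n-sided polygon sum to (n - 2) x 180 degrees, and each interior angle of a regular n-gon is that sum divided by n.

def interior_angle_sum(n):
    """Sum of the interior angles of an n-sided polygon, in degrees."""
    if n < 3:
        raise ValueError("a polygon needs at least 3 sides")
    return (n - 2) * 180

def regular_interior_angle(n):
    """Each interior angle of a regular n-gon, in degrees."""
    return interior_angle_sum(n) / n

for sides in (3, 4, 5, 6, 8):
    print(sides, interior_angle_sum(sides), regular_interior_angle(sides))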

315

On the absolute calibration of SO2 cameras  

NASA Astrophysics Data System (ADS)

Sulphur dioxide emission rate measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 300 and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. One important step for correct SO2 emission rate measurements that can be compared with other measurement techniques is a correct calibration. This requires conversion from the measured optical density to the desired SO2 column density (CD). The conversion factor is most commonly determined by inserting quartz cells (cuvettes) with known amounts of SO2 into the light path. Another calibration method uses an additional narrow field-of-view Differential Optical Absorption Spectroscopy system (NFOV-DOAS), which measures the column density simultaneously in a small area of the camera's field-of-view. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different, calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (I-DOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. The respective results are compared with measurements from an I-DOAS to verify the calibration curve over the spatial extent of the image. The results show that calibration cells, while working fine in some cases, can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. Besides these errors of calibration, radiative transfer effects (e.g. light dilution, multiple scattering) can significantly influence the results of both instrument types. The measurements presented in this work were taken at Popocatépetl, Mexico, between 1 March 2011 and 4 March 2011. Average SO2 emission rates between 4.00 and 14.34 kg s-1 were observed.

Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

2013-03-01
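A minimal sketch of the calibration-cell approach described above: fit a conversion factor between the camera's measured optical densities and the known column densities of the cells, then apply it to an image. The cell values and the assumption of a purely linear, offset-free response are illustrative, not the authors' procedure:

import numpy as np

# Known SO2 column densities of the calibration cells (molecules / cm^2), hypothetical values
cell_cd = np.array([0.0, 4.0e17, 9.0e17, 1.8e18])
# Optical densities measured by the camera with each cell in the light path, hypothetical values
cell_tau = np.array([0.0, 0.052, 0.118, 0.231])

# Least-squares slope through the origin: CD = k * tau
k = np.sum(cell_cd * cell_tau) / np.sum(cell_tau ** 2)

# Apply the conversion factor to an optical-density image (random data stands in for a frame)
tau_image = np.random.uniform(0.0, 0.2, size=(256, 256))
cd_image = k * tau_image                      # SO2 column-density image
print(f"calibration factor k = {k:.3e} molec cm^-2 per unit optical density")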

316

Angles in the Sky?  

NASA Astrophysics Data System (ADS)

Tycho Brahe lived and worked in the late 1500s before the telescope was invented. He made highly accurate observations of the positions of planets, stars, and comets using large angle-measuring devices of his own design. You can use his techniques to observe the sky as well. For example, the degree, a common unit of measurement in astronomy, can be measured by holding your fist at arm's length up to the sky. Open your fist and observe the distance across the sky covered by the width of your pinky fingernail. That is, roughly, a degree! After some practice, and knowing that one degree equals four minutes, you can measure elapsed time by measuring the angle of the distance that the Moon appears to have moved and multiplying that number by four. You can also figure distances and sizes of things. These are not precise measurements, but rough estimates that can give you a "close-enough" answer.

Behr, Bradford

2005-09-01
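A tiny worked example of the degree-to-time conversion described above: the sky turns 360 degrees in 24 hours, so one degree of apparent motion corresponds to four minutes of elapsed time.

def elapsed_minutes(angle_degrees):
    """Convert an angle of apparent sky motion into elapsed time in minutes.
    360 degrees per 24 h  ->  15 degrees per hour  ->  1 degree per 4 minutes."""
    return angle_degrees * 4.0

# The Moon appears to have drifted about 2.5 fist-measured degrees westward:
print(elapsed_minutes(2.5), "minutes have passed")   # 10.0 minutes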

317

Mini gamma camera, camera system and method of use  

SciTech Connect

A gamma camera comprising essentially and in order from the front outer or gamma ray impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position sensitive, high resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. There is also described a system wherein the output supplied by the high resolution, position sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.

Majewski, Stanislaw (Grafton, VA); Weisenberger, Andrew G. (Grafton, VA); Wojcik, Randolph F. (Yorktown, VA)

2001-01-01
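A generic sketch of the centre-of-gravity calculation mentioned above, taken here as the intensity-weighted centroid of a 2-D counts image; this is an illustration, not the patented algorithm:

import numpy as np

def center_of_gravity(counts):
    """Intensity-weighted centroid (row, col) of a 2-D counts image."""
    counts = np.asarray(counts, dtype=float)
    total = counts.sum()
    if total == 0:
        raise ValueError("empty image")
    rows, cols = np.indices(counts.shape)
    return (np.sum(rows * counts) / total, np.sum(cols * counts) / total)

# Hypothetical 5x5 counts image with a hot spot near row 1, column 3
img = np.zeros((5, 5))
img[1, 3] = 40
img[2, 3] = 10
print(center_of_gravity(img))   # (1.2, 3.0)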

318

Normal Q-angle in an Adult Nigerian Population  

Microsoft Academic Search

The Q-angle has been studied among the adult Caucasian population with the establishment of reference values. Scientists are beginning to accept the concept of different human races. Physical variability exists between various African ethnic groups and Caucasians as exemplified by differences in anatomic features such as a flat nose compared with a pointed nose, wide rather than narrow faces, and

Bade B. Omololu; Olusegun S. Ogunlade; Vinod K. Gopaldasani

2009-01-01

319

Correction of calculation method for boresight on aerial remote sensing camera  

NASA Astrophysics Data System (ADS)

The boresight of an aerial remote sensing camera (ARSC) needs to be transferred to the reference coordinate system of the satellite after the camera is assembled onto the satellite. Because it is difficult to sight the boresight directly once the camera is fixed to the satellite, the boresight must be transferred to a reference cube before mounting, so that the cube coordinate system can then be related to the reference coordinate system. The boresight of the camera is measured with a theodolite: its orientation is obtained by measuring the angles to the four corners of the CCD (top left, bottom left, top right and bottom right), from which the spatial angle of the boresight can be solved. The limitations of the traditional data-processing methods used after the boresight measurement were analyzed with MATLAB software. The trace of the theodolite's motion is derived as it rotates horizontally about the vertical axis at a fixed vertical angle and rotates vertically about the horizontal axis at a fixed horizontal angle. Based on vector combination, the normalized boresight vector is obtained, and from it the spatial angle of the boresight is calculated. Finally, two applications to actual measurements are presented.

Xing, Hui; Mu, Sheng-bo; Chen, Jia-yi

2012-10-01
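A rough sketch, under assumptions of my own, of the vector-combination step described above: convert the theodolite readings for the four CCD corners into unit direction vectors, average and renormalize them, and read back the boresight's spatial angles. The angle convention below is hypothetical and not taken from the paper:

import numpy as np

def direction_vector(az_deg, el_deg):
    """Unit vector for a theodolite reading: azimuth about the vertical axis,
    elevation above the horizontal plane (illustrative convention)."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([np.cos(el) * np.cos(az), np.cos(el) * np.sin(az), np.sin(el)])

def boresight(corner_angles):
    """corner_angles: four (azimuth, elevation) pairs measured at the CCD corners."""
    vectors = np.array([direction_vector(a, e) for a, e in corner_angles])
    mean = vectors.mean(axis=0)
    b = mean / np.linalg.norm(mean)                 # normalized boresight vector
    az = np.degrees(np.arctan2(b[1], b[0]))         # spatial angles of the boresight
    el = np.degrees(np.arcsin(b[2]))
    return b, az, el

# Hypothetical corner readings in degrees: top-left, bottom-left, top-right, bottom-right
corners = [(-0.40, 0.30), (-0.40, -0.30), (0.40, 0.30), (0.40, -0.30)]
print(boresight(corners))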

320

Ejs Brewster's Angle Model  

NSDL National Science Digital Library

The Ejs Brewster's Angle model displays the electric field of an electromagnetic wave incident on a change of index of refraction. The simulation allows an arbitrarily linearly (in parallel and perpendicular components) polarized wave to encounter the change of index of refraction. The initial electric field, incidence angle, and change of index of refraction can all be changed via sliders. You can modify this simulation if you have Ejs installed by right-clicking within the plot and selecting "Open Ejs Model" from the pop-up menu item. The Ejs Brewster's Angle model was created using the Easy Java Simulations (Ejs) modeling tool. It is distributed as a ready-to-run (compiled) Java archive. Double clicking the ejs_ehu_waves_brewster.jar file will run the program if Java is installed. Ejs is a part of the Open Source Physics Project and is designed to make it easier to access, modify, and generate computer models. Additional Ejs models for wave optics are available. They can be found by searching ComPADRE for Open Source Physics, OSP, or Ejs.

Aguirregabiria, Juan

2008-08-20
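As a quick numerical complement to the simulation, Brewster's angle for light passing from index n1 into index n2 is theta_B = arctan(n2/n1):

import math

def brewster_angle_deg(n1, n2):
    """Brewster's angle (degrees) for light going from index n1 into index n2."""
    return math.degrees(math.atan2(n2, n1))

print(brewster_angle_deg(1.0, 1.5))   # air -> glass, about 56.3 degrees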

321

Small Angle Neutron Scattering  

SciTech Connect

Small Angle Neutron Scattering (SANS) probes structural details at the nanometer scale in a non-destructive way. This article gives an introduction to scientists who have no prior small-angle scattering knowledge, but who seek a technique that allows elucidating structural information in challenging situations that thwart approaches by other methods. SANS is applicable to a wide variety of materials including metals and alloys, ceramics, concrete, glasses, polymers, composites and biological materials. Isotope and magnetic interactions provide unique methods for labeling and contrast variation to highlight specific structural features of interest. In situ studies of a material's responses to temperature, pressure, shear, magnetic and electric fields, etc., are feasible as a result of the high penetrating power of neutrons. SANS provides statistical information on significant structural features averaged over the probed sample volume, and one can use SANS to quantify with high precision the structural details that are observed, for example, in electron microscopy. Neutron scattering is non-destructive; there is no need to cut specimens into thin sections, and neutrons penetrate deeply, providing information on the bulk material, free from surface effects. The basic principles of a SANS experiment are fairly simple, but the measurement, analysis and interpretation of small angle scattering data involves theoretical concepts that are unique to the technique and that are not widely known. This article includes a concise description of the basics, as well as practical know-how that is essential for a successful SANS experiment.

Urban, Volker S [ORNL

2012-01-01

322

On the absolute calibration of SO2 cameras  

NASA Astrophysics Data System (ADS)

Sulphur dioxide emission flux measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 305 nm and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. While this approach is simple and delivers valuable insights into the two-dimensional SO2 distribution, absolute calibration has proven to be difficult. An accurate calibration of the SO2 camera (i.e., conversion from optical density to SO2 column density, CD) is crucial to obtain correct SO2 CDs and flux measurements that are comparable to other measurement techniques and can be used for volcanological applications. The most common approach for calibrating SO2 camera measurements is based on inserting quartz cells (cuvettes) containing known amounts of SO2 into the light path. It has been found, however, that reflections from the windows of the calibration cell can considerably affect the signal measured by the camera. Another possibility for calibration relies on performing simultaneous measurements in a small area of the camera's field-of-view (FOV) by a narrow-field-of-view Differential Optical Absorption Spectroscopy (NFOV-DOAS) system. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (IDOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. The respective results are compared with measurements from an IDOAS to verify the calibration curve over the spatial extent of the image. Our results show that calibration cells can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. Besides these errors of calibration, radiative transfer effects (e.g. light dilution, multiple scattering) can significantly influence the results of both instrument types. These effects can lead to an even more significant overestimation or, depending on the measurement conditions, an underestimation of the true CD. Previous investigations found that possible errors can be more than an order of magnitude. However, the spectral information from the DOAS measurements makes it possible to correct for these radiative transfer effects. The measurements presented in this work were taken at Popocatépetl, Mexico, between 1 March 2011 and 4 March 2011. Average SO2 emission rates between 4.00 kg s-1 and 14.34 kg s-1 were observed.

Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

2012-09-01

323

AWiFS camera for Resourcesat  

NASA Astrophysics Data System (ADS)

Remote sensors have been developed and used extensively the world over on aircraft and space platforms. India has developed and launched many sensors into space to survey natural resources. The AWiFS is one such camera, launched onboard the Resourcesat-1 satellite by ISRO in 2003. It is a medium resolution camera with a 5-day revisit, designed for studies related to forestry, vegetation, soil, snow and disaster warning. The camera provides 56 m (nadir) resolution from 817 km altitude in three visible bands and one SWIR band. This paper deals with the configuration features of the AWiFS camera of Resourcesat-1, its onboard performance and also the highlights of the camera being developed for Resourcesat-2. The AWiFS is realized with two identical cameras, viz. AWiFS-A and AWiFS-B, which cover the large field of view of 48°. Each camera consists of independent collecting optics and associated 6000-element detectors and electronics catering to 4 bands. The visible bands use linear silicon CCDs with 10 μm × 7 μm elements, while the SWIR band uses 13 μm staggered InGaAs linear active pixels. The camera electronics are custom designed for each detector based on detector and system requirements. The camera covers the total dynamic range up to 100% albedo with a single gain setting and 12-bit digitization, of which the 10 MSBs are transmitted. The camera saturation radiance of each band can also be selected through telecommand. The camera provides a very high SNR of about 700 near saturation. The camera components are housed in specially designed Invar structures. The AWiFS camera onboard Resourcesat-1 is providing excellent imagery and the data is routinely used the world over. The AWiFS for Resourcesat-2 is being developed with the overall performance specifications remaining the same. The camera electronics is miniaturized, with reductions in hardware packages, size and weight to one third.

Dave, Himanshu; Dewan, Chirag; Paul, Sandip; Sarkar, S. S.; Pandya, Himanshu; Joshi, S. R.; Mishra, Ashish; Detroja, Manoj

2006-12-01

324

Fraunhofer diffraction to determine the twin angle in single-crystal BaTiO3.  

PubMed

We present a new method for determining the electrically induced twin angle alpha of a (100) bulk single crystal of barium titanate (BaTiO3) using a nondestructive optical technique based on Fraunhofer diffraction. The technique required two steps that were performed simultaneously. First, we analyzed the diffracted light intensity captured with a line camera. Second, we measured the size of the diffracting element by analyzing images of the crystal's surface taken with a CCD camera. The value obtained for the twin angle is 0.67 degrees ± 0.05 degrees, which compares favorably with the theoretical value of 0.63 degrees. PMID:12916610

Melnichuk, Mike; Wood, Lowell T

2003-08-01

325

Search for narrow resonances lighter than ϒ mesons  

Microsoft Academic Search

We report a search for narrow resonances, produced in proton-antiproton collisions at √s = 1.96 TeV, that decay into muon pairs with invariant mass between 6.3 and 9.0 GeV/c². The data, collected with the CDF II detector at the Fermilab Tevatron collider, correspond to an integrated luminosity of 630 pb⁻¹. We use the dimuon invariant mass distribution to set 90% upper credible limits of about 1%

T. Aaltonen; J. Adelman; T. Akimoto; S. Amerio; D. Amidei; A. Anastassov; A. Annovi; J. Antos; G. Apollinari; A. Apresyan; T. Arisawa; A. Aurisano; W. Ashmanskas; A. Attal; F. Azfar; W. Badgett; A. Barbaro-Galtieri; V. E Barnes; B. A. Barnett; P. Barria; V. Bartsch; G. Bauer; P.-H. Beauchemin; F. Bedeschi; D. Beecher; S. Behari; G. Bellettini; J. Bellinger; D. Benjamin; A. Beretvas; J. Beringer; A. Bhatti; M. Binkley; D. Bortoletto; I. Bizjak; R. E. Blair; C. Blocker; B. Blumenfeld; A. Bocci; A. Bodek; V. Boisvert; G. Bolla; J. Boudreau; A. Boveia; B. Brau; A. Bridgeman; L. Brigliadori; C. Bromberg; E. Brubaker; J. Budagov; H. S. Budd; S. Budd; S. Burke; K. Burkett; G. Busetto; P. Bussey; A. Buzatu; K. L. Byrum; S. Cabrera; C. Calancha; M. Cavalli-Sforza; M. Campbell; F. Canelli; A. Canepa; B. Carls; D. Carlsmith; R. Carosi; S. Carrillo; S. Carron; B. Casal; M. Casarsa; A. Castro; P. Catastini; D. Cauz; V. Cavaliere; A. Cerri; L. Cerrito; S. H. Chang; Y. C. Chen; M. Chertok; G. Chiarelli; G. Chlachidze; F. Chlebana; K. Cho; D. Chokheli; J. P. Chou; G. Choudalakis; S. H. Chuang; K. Chung; W. H. Chung; Y. S. Chung; T. Chwalek; C. I. Ciobanu; M. A. Ciocci; A. Clark; D. Clark; G. Compostella; M. E. Convery; J. Conway; M. Cordelli; G. Cortiana; C. A. Cox; D. J. Cox; F. Crescioli; J. Cuevas; R. Culbertson; J. C. Cully; D. Dagenhart; M. Datta; T. Davies; P. de Barbaro; S. De Cecco; A. Deisher; G. De Lorenzo; M. Dell’Orso; C. Deluca; L. Demortier; J. Deng; M. Deninno; P. F. Derwent; A. Di Canto; G. P. di Giovanni; C. Dionisi; B. Di Ruzza; J. R. Dittmann; M. D’Onofrio; S. Donati; P. Dong; J. Donini; T. Dorigo; S. Dube; J. Efron; A. Elagin; R. Erbacher; D. Errede; S. Errede; R. Eusebi; H. C. Fang; S. Farrington; W. T. Fedorko; R. G. Feild; M. Feindt; J. P. Fernandez; C. Ferrazza; R. Field; G. Flanagan; R. Forrest; M. J. Frank; M. Franklin; J. C. Freeman; I. Furic; M. Gallinaro; J. Galyardt; F. Garberson; J. E. Garcia; A. F. Garfinkel; P. Garosi; K. Genser; H. Gerberich; D. Gerdes; A. Gessler; S. Giagu; V. Giakoumopoulou; P. Giannetti; K. Gibson; J. L. Gimmell; C. M. Ginsburg; N. Giokaris; M. Giordani; P. Giromini; M. Giunta; G. Giurgiu; V. Glagolev; D. Glenzinski; M. Gold; N. Goldschmidt; A. Golossanov; G. Gomez; G. Gomez-Ceballos; M. Goncharov; O. González; I. Gorelov; A. T. Goshaw; K. Goulianos; A. Gresele; S. Grinstein; C. Grosso-Pilcher; U. Grundler; Z. Gunay-Unalan; C. Haber; K. Hahn; S. R. Hahn; E. Halkiadakis; B.-Y. Han; J. Y. Han; F. Happacher; K. Hara; D. Hare; M. Hare; S. Harper; R. F. Harr; R. M. Harris; M. Hartz; K. Hatakeyama; C. Hays; M. Heck; A. Heijboer; J. Heinrich; C. Henderson; M. Herndon; J. Heuser; S. Hewamanage; D. Hidas; C. S. Hill; D. Hirschbuehl; A. Hocker; S. Hou; M. Houlden; S.-C. Hsu; B. T. Huffman; R. E. Hughes; U. Husemann; M. Hussein; J. Huston; J. Incandela; G. Introzzi; M. Iori; A. Ivanov; E. James; D. Jang; B. Jayatilaka; E. J. Jeon; M. K. Jha; S. Jindariani; W. Johnson; M. Jones; K. K. Joo; S. Y. Jun; J. E. Jung; T. R. Junk; T. Kamon; D. Kar; P. E. Karchin; Y. Kato; R. Kephart; W. Ketchum; J. Keung; V. Khotilovich; B. Kilminster; D. H. Kim; H. S. Kim; H. W. Kim; J. E. Kim; M. J. Kim; S. B. Kim; S. H. Kim; Y. K. Kim; N. Kimura; L. Kirsch; S. Klimenko; B. Knuteson; B. R. Ko; K. Kondo; D. J. Kong; J. Konigsberg; A. Korytov; A. V. Kotwal; M. Kreps; J. Kroll; D. Krop; N. Krumnack; M. Kruse; V. Krutelyov; T. Kubo; T. Kuhr; N. P. Kulkarni; M. Kurata; S. Kwang; A. T. Laasanen; S. Lami; S. Lammel; M. Lancaster; R. L. Lander; K. Lannon; A. Lath; G. Latino; I. Lazzizzera; T. 
LeCompte; E. Lee; H. S. Lee; S. W. Lee; S. Leone; J. D. Lewis; C.-S. Lin; J. Linacre; M. Lindgren; E. Lipeles; A. Lister; D. O. Litvintsev; C. Liu; T. Liu; N. S. Lockyer; A. Loginov; M. Loreti; L. Lovas; D. Lucchesi; C. Luci; J. Lueck; P. Lujan; P. Lukens; G. Lungu; L. Lyons; J. Lys; R. Lysak; D. MacQueen; R. Madrak; K. Maeshima; K. Makhoul; T. Maki; P. Maksimovic; S. Malde; S. Malik; G. Manca; A. Manousakis-Katsikakis; F. Margaroli; C. Marino; A. Martin; V. Martin; M. Martínez; R. Martínez-Ballarín; T. Maruyama; P. Mastrandrea; T. Masubuchi; M. Mathis; M. E. Mattson; P. Mazzanti; K. S. McFarland; P. McIntyre; R. McNulty; A. Mehta; P. Mehtala; A. Menzione; P. Merkel; C. Mesropian; T. Miao; N. Miladinovic; R. Miller; C. Mills; M. Milnik; A. Mitra; G. Mitselmakher; H. Miyake; N. Moggi; C. S. Moon; R. Moore; M. J. Morello; J. Morlock; J. Mülmenstädt; A. Mukherjee; Th. Muller; R. Mumford; P. Murat; M. Mussini; J. Nachtman; Y. Nagai; A. Nagano; J. Naganoma; K. Nakamura; I. Nakano; A. Napier; V. Necula; J. Nett; C. Neu; M. S. Neubauer; S. Neubauer; J. Nielsen; L. Nodulman; M. Norman; O. Norniella; E. Nurse; L. Oakes; S. H. Oh; Y. D. Oh; I. Oksuzian; T. Okusawa; R. Orava; K. Osterberg; E. Palencia; V. Papadimitriou; A. Papaikonomou; A. A. Paramonov; B. Parks; S. Pashapour; J. Patrick; G. Pauletta; M. Paulini

2009-01-01

326

The Flow of Gases in Narrow Channels  

NASA Technical Reports Server (NTRS)

Measurements were made of the flow of gases through various narrow channels a few microns wide at average pressures from 0.00003 to 40 cm. Hg. The flow rate, defined as the product of pressure and volume rate of flow at unit pressure difference, first decreased linearly with decrease in mean pressure in the channel, in agreement with laminar-flow theory, reached a minimum when the mean path length was approximately equal to the channel width, and then increased to a constant value. The product of flow rate and square root of molecular number was approximately the same function of mean path length for all gases for a given channel.

Rasmussen, R E H

1951-01-01

327

NARROW-K-BAND OBSERVATIONS OF THE GJ 1214 SYSTEM  

SciTech Connect

GJ 1214 is a nearby M dwarf star that hosts a transiting super-Earth-size planet, making this system an excellent target for atmospheric studies. Most studies find that the transmission spectrum of GJ 1214b is flat, which favors either a high mean molecular weight or cloudy/hazy hydrogen (H) rich atmosphere model. Photometry at short wavelengths (<0.7 μm) and in the K band can discriminate the most between these different atmosphere models for GJ 1214b, but current observations do not have sufficiently high precision. We present photometry of seven transits of GJ 1214b through a narrow K-band (2.141 μm) filter with the Wide Field Camera on the 3.8 m United Kingdom Infrared Telescope. Our photometric precision is typically 1.7 × 10⁻³ (for a single transit), comparable with other ground-based observations of GJ 1214b. We measure a planet-star radius ratio of 0.1158 ± 0.0013, which, along with other studies, also supports a flat transmission spectrum for GJ 1214b. Since this does not exclude a scenario where GJ 1214b has an H-rich envelope with heavy elements that are sequestered below a cloud/haze layer, we compare K-band observations with models of H₂ collision-induced absorption in an atmosphere for a range of temperatures. While we find no evidence for deviation from a flat spectrum (slope s = 0.0016 ± 0.0038), an H₂-dominated upper atmosphere (<60 mbar) cannot be excluded. More precise observations at <0.7 μm and in the K band, as well as a uniform analysis of all published data, would be useful for establishing more robust limits on atmosphere models for GJ 1214b.

Colón, Knicole D.; Gaidos, Eric, E-mail: colonk@hawaii.edu [Department of Geology and Geophysics, University of Hawaii at Manoa, Honolulu, HI 96822 (United States)

2013-10-10

328

Using a digital video camera to examine coupled oscillations  

NASA Astrophysics Data System (ADS)

In our previous paper (Debowska E, Jakubowicz S and Mazur Z 1999 Eur. J. Phys. 20 89-95), thanks to the use of an ultrasound distance sensor, experimental verification of the solution of Lagrange equations for longitudinal oscillations of the Wilberforce pendulum was shown. In this paper the sensor and a digital video camera were used to monitor and measure the changes of both the pendulum's coordinates (vertical displacement and angle of rotation) simultaneously. The experiments were performed with the aid of the integrated software package COACH 5. Fourier analysis in Microsoft® Excel 97 was used to find normal modes in each case of the measured oscillations. Comparison of the results with those presented in our previous paper (as given above) leads to the conclusion that a digital video camera is a powerful tool for measuring coupled oscillations of a Wilberforce pendulum. The most important conclusion is that a video camera is able to do something more than merely register interesting physical phenomena - it can be used to perform measurements of physical quantities at an advanced level.

Greczylo, T.; Debowska, E.

2002-07-01
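The Fourier step described above was done in Excel; a minimal equivalent in NumPy (with a hypothetical frame rate and synthetic data) locates the normal-mode frequencies as the strongest peaks in the spectra of the two coordinates:

import numpy as np

def dominant_frequencies(signal, sample_rate_hz, n_peaks=2):
    """Return the n_peaks strongest frequencies (Hz) in a real-valued time series."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                 # drop the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / sample_rate_hz)
    order = np.argsort(spectrum)[::-1]              # strongest bins first
    return freqs[order[:n_peaks]]

# Synthetic Wilberforce-like data with two normal modes at 0.55 Hz and 0.72 Hz
fs = 25.0                                           # frames per second (hypothetical)
t = np.arange(0, 120, 1.0 / fs)
z = np.cos(2 * np.pi * 0.55 * t) + 0.8 * np.cos(2 * np.pi * 0.72 * t)       # displacement
theta = np.cos(2 * np.pi * 0.55 * t) - 0.8 * np.cos(2 * np.pi * 0.72 * t)   # rotation
print(dominant_frequencies(z, fs), dominant_frequencies(theta, fs))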

329

Digital cameras with designs inspired by the arthropod eye.  

PubMed

In arthropods, evolution has created a remarkably sophisticated class of imaging systems, with a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. A challenge in building digital cameras with the hemispherical, compound apposition layouts of arthropod eyes is that essential design requirements cannot be met with existing planar sensor technologies or conventional optics. Here we present materials, mechanics and integration schemes that afford scalable pathways to working, arthropod-inspired cameras with nearly full hemispherical shapes (about 160 degrees). Their surfaces are densely populated by imaging elements (artificial ommatidia), which are comparable in number (180) to those of the eyes of fire ants (Solenopsis fugax) and bark beetles (Hylastes nigrinus). The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors into integrated sheets that can be elastically transformed from the planar geometries in which they are fabricated to hemispherical shapes for integration into apposition cameras. Our imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes). PMID:23636401

Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Xiao, Jianliang; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B; Huang, Yonggang; Rogers, John A

2013-05-01

330

The Advanced Camera for the Hubble Space Telescope  

Microsoft Academic Search

The Advanced Camera for the Hubble Space Telescope has three cameras. The first, the Wide Field Camera, will be a high-throughput, wide field, 4096 X 4096 pixel CCD optical and I-band camera that is half-critically sampled at 500 nm. The second, the High Resolution Camera (HRC), is a 1024 X 1024 pixel CCD camera that is critically sampled at

G. D. Illingworth; Paul D. Feldman; David A. Golimowski; Zlatan Tsvetanov; Christopher J. Burrows; James H. Crocker; Pierre Y. Bely; George F. Hartig; Randy A. Kimble; Michael P. Lesser; Richard L. White; Tom Broadhurst; William B. Sparks; Robert A. Woodruff; Pamela Sullivan; Carolyn A. Krebs; Douglas B. Leviton; William Burmester; Sherri Fike; Rich Johnson; Robert B. Slusher; Paul Volmer

1997-01-01

331

Making Oatmeal Box Pinhole Cameras  

NSDL National Science Digital Library

This web site provides step-by-step directions for constructing a pinhole camera out of an oatmeal box and other common household items. Each step is supplemented with photos to show exactly how to build the apparatus so that it will actually take pictures. Also included are detailed procedures for shooting the photographs and developing them in an amateur darkroom. **NOTE: If performing this activity with children, follow safety procedures for using the photo developing agent. SEE THIS LINK for safety information on Kodak Dektol: http://www2.itap.purdue.edu/msds/docs/9735.pdf

Woodruff, Stewart

2009-05-28

332

A remote camera operation system using a marker attached cap  

NASA Astrophysics Data System (ADS)

In this paper, we propose a convenient system to control a remote camera according to the eye-gazing direction of the operator, which is approximately obtained by calculating the face direction by means of image processing. The operator puts a marker-attached cap on his head, and the system takes an image of the operator from above with only one video camera. Three markers are set up on the cap; three is the minimum number needed to calculate the tilt angle of the head. The more markers are used, the more robust the system becomes to occlusion, and the wider the tolerated moving range of the head. The markers must not lie on a single three-dimensional straight line. To compensate for the markers' color change due to illumination conditions, the threshold for marker extraction is adaptively decided using a k-means clustering method. The system was implemented with MATLAB on a personal computer, and real-time operation was realized. The experimental results confirmed the robustness of the system, and the tilt and pan angles of the head could be calculated with enough accuracy for use.

Kawai, Hironori; Hama, Hiromitsu

2005-12-01
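A rough sketch of the adaptive marker-extraction step described above, using k-means clustering on pixel colours (SciPy here rather than the authors' MATLAB implementation; the cluster count and nominal marker colour are assumptions):

import numpy as np
from scipy.cluster.vq import kmeans2

def marker_mask(rgb_image, marker_color, k=3):
    """Adaptively segment marker pixels: cluster pixel colours with k-means and
    keep the cluster whose centre is closest to the nominal marker colour."""
    h, w, _ = rgb_image.shape
    pixels = rgb_image.reshape(-1, 3).astype(float)
    centres, labels = kmeans2(pixels, k, minit='++')
    target = np.argmin(np.linalg.norm(centres - np.asarray(marker_color, float), axis=1))
    return (labels == target).reshape(h, w)

# Hypothetical frame: dark background with a few bright red marker blobs
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[30:36, 40:46] = (220, 30, 30)
frame[30:36, 80:86] = (210, 25, 35)
frame[60:66, 60:66] = (215, 40, 20)
mask = marker_mask(frame, marker_color=(255, 0, 0))
print(mask.sum(), "marker pixels found")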

333

Light field panorama by a plenoptic camera  

NASA Astrophysics Data System (ADS)

The consumer-grade plenoptic camera Lytro draws a lot of interest from both the academic and industrial worlds. However, its low resolution in both the spatial and angular domains prevents it from being used for fine and detailed light field acquisition. This paper proposes to use a plenoptic camera as an image scanner and perform light field stitching to increase the size of the acquired light field data. We consider a simplified plenoptic camera model comprising a pinhole camera moving behind a thin lens. Based on this model, we describe how to perform light field acquisition and stitching under two different scenarios: by camera translation, or by camera translation and rotation. In both cases, we assume the camera motion to be known. In the case of camera translation, we show how the acquired light fields should be resampled to increase the spatial range and ultimately obtain a wider field of view. In the case of camera translation and rotation, the camera motion is calculated such that the light fields can be directly stitched and extended in the angular domain. Simulation results verify our approach and demonstrate the potential of the motion model for further light field applications such as registration and super-resolution.

Xue, Zhou; Baboulaz, Loic; Prandoni, Paolo; Vetterli, Martin

2013-03-01

334

Narrow field electromagnetic sensor system and method  

DOEpatents

A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments. 12 figs.

McEwan, T.E.

1996-11-19

335

Studies of narrow autoionizing resonances in gadolinium  

SciTech Connect

The autoionization (AI) spectrum of gadolinium between the first and second limits has been investigated by triple-resonance excitation with high-resolution cw lasers. A large number of narrow AI resonances have been observed and assigned total angular momentum J values. The resonances are further divided into members of AI Rydberg series converging to the second limit or other "interloping" levels. Fine structure in the Rydberg series has been identified and interpreted in terms of Jc j coupling. A number of detailed studies have been performed on the interloping resonances: these include lifetime determination by lineshape analysis, isotope shifts, hyperfine structure, and photoionization saturation parameters. The electronic structure of the interloping levels is discussed in terms of these studies. Linewidths generally decrease with increasing total angular momentum, and the J = 7 resonances are extremely narrow, with Lorentzian widths ranging from < 1 MHz up to 157 MHz. The strongest resonances are found to have cross-sections of ≈10⁻¹² cm², and photoionization can be saturated with powers available from cw diode lasers.

Bushaw, Bruce A.; Nortershauser, W.; Blaum, K.; Wendt, Klaus

2003-06-30

336

Narrow field electromagnetic sensor system and method  

DOEpatents

A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments.

McEwan, Thomas E. (Livermore, CA)

1996-01-01

337

Development of narrow gap welding technology for extremely thick steel  

NASA Astrophysics Data System (ADS)

In the field of extremely thick steel, various narrow gap welding methods were developed on the basis of former welding methods and are used in practice. It is important to develop and improve automatic narrow gap welding, J edge preparation by gas cutting, the prevention of welding defects, wires for narrow gap welding and so on in order to expand the scope of application of the method. Narrow gap welding technologies are described, based on new concepts developed by Nippon Steel Corporation.

Imai, K.; Saito, T.; Okumura, M.

338

Measuring Non-spherical Airborne Dust with Space-based MISR Multi-angle Imaging.  

NASA Astrophysics Data System (ADS)

Some of the world's largest dust plumes emanate from Northern Eurasian deserts and are expected to increasingly affect Asian economies. Together with field experiments, satellite observations of dust outbreaks, placed into the context of large-scale dust transport modeling, can help understand the impact of mineral dust aerosols on past and present climate and climate predictions in North and Central Asia. Multi-angle instruments such as the Multi-angle Imaging SpectroRadiometer (MISR) provide independent constraints on aerosol properties based on sensitivity to the shape of the scattering phase function. We present an analysis of the Multi-angle Imaging SpectroRadiometer (MISR) Standard Aerosol Retrieval algorithm, updated with new non-spherical dust models (Version 16 and higher). We compare the MISR products with coincident AERONET surface sun-photometer observations taken during the passage of dust fronts. Our analysis shows that during such events MISR retrieves Angstrom exponents characteristic of large particles, having little spectral variation in extinction over the MISR wavelength range (442, 550, 672 and 866 nm channels), as expected. Also, the retrieved fraction of non-spherical particles is very high. This quantity is not retrieved by satellite instruments having only nadir-viewing cameras. We assess whether MISR aerosol optical thickness (AOT) acquired at about 10:30 AM local time can be used to represent daily mean AOT in dust climate forcing studies, by comparing MISR-retrieved AOT with AERONET daily-mean values. We also compare the effect of particle shape on MISR and MODIS dust retrievals, using co-located MISR, MODIS, and AERONET AOTs and Angstrom exponents. In most cases obtained for this study, MODIS had no retrievals due to sun-glint when MISR's narrower swath observed AERONET sites on islands surrounded by dark water. For the few coincident MISR-MODIS-AERONET dark-water, dusty condition retrievals we obtained, the MISR retrievals were in better agreement with AERONET than those from MODIS. Over bright desert sites, MODIS AOTs at visible wavelengths were systematically higher than those of AERONET and MISR. MISR-derived aerosol type mixtures for these cases included non-spherical dust components with high frequency in retrievals over dark water, and slightly lower frequency over land. The frequency with which non-spherical dust models were selected by the algorithm also decreased in dusty regions affected by pollution. Both MISR and MODIS retrievals have a high fail rate over optically thick dust plumes.

Kalashnikova, O. V.; Diner, D. J.; Abdou, W.; Kahn, R.; Gaitley, B. J.; Gasso, S.

2004-12-01

339

Feasibility study for the application of the large format camera as a payload for the Orbiter program  

NASA Technical Reports Server (NTRS)

The large format camera (LFC) designed as a 30 cm focal length cartographic camera system that employs forward motion compensation in order to achieve the full image resolution provided by its 80 degree field angle lens is described. The feasibility of application of the current LFC design to deployment in the orbiter program as the Orbiter Camera Payload System was assessed and the changes that are necessary to meet such a requirement are discussed. Current design and any proposed design changes were evaluated relative to possible future deployment of the LFC on a free flyer vehicle or in a WB-57F. Preliminary mission interface requirements for the LFC are given.

1978-01-01

340

Angles and Area  

NSDL National Science Digital Library

In this activity (page 10 of PDF), learners approximate the area of the uppermost cross section of an impact crater using a variety of square grids. They conclude which angle of impact results in the greatest area. There are two versions of this activity: Challenge, where students construct a launcher and create their own craters; and Non-Challenge where students analyze pictures of craters. Includes a pre-lesson activity (p54). The Moon Math: Craters! guide follows a 5E approach, applying concepts of geometry, modeling, data analysis to the NASA lunar spacecraft mission, LCROSS.

Nasa

2012-05-08
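A small sketch of the grid-counting idea behind this activity: overlay a square grid on the crater cross-section and approximate its area by counting cells whose centres fall inside the outline (the elliptical outline and grid sizes are hypothetical):

import numpy as np

def grid_area_estimate(inside, cell_size, grid_extent):
    """Approximate the area of a region by counting grid cells whose centres
    satisfy inside(x, y). cell_size and grid_extent are in the same length units."""
    coords = np.arange(-grid_extent, grid_extent, cell_size) + cell_size / 2.0
    xs, ys = np.meshgrid(coords, coords)
    return np.count_nonzero(inside(xs, ys)) * cell_size ** 2

def crater(x, y):
    # Elliptical crater outline from an oblique impact (semi-axes 6 by 4, arbitrary units)
    return (x / 6.0) ** 2 + (y / 4.0) ** 2 <= 1.0

for cell in (2.0, 1.0, 0.5):
    print(cell, grid_area_estimate(crater, cell, grid_extent=8.0))
# Exact ellipse area for comparison: pi * 6 * 4 ~ 75.4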

341

The Viking Mars lander camera  

NASA Technical Reports Server (NTRS)

The 7.3 kg cameras for the 1976 Viking Mars expedition feature an array of 12 silicon photodiodes, including six spectral bands for color and near-infrared imaging with an angular resolution of 0.12 deg and four focus steps for broadband imaging, with an improved angular resolution of 0.04 deg. The field of view in elevation ranges from 40 deg above to 60 deg below the horizon, and in azimuth ranges to 342.5 deg. The cameras are mounted 0.8 m apart to provide a stereo view of the area accessible to a surface sampler for biological and chemical investigations. The scanning rates are synchronized to the lander data transmission rates of 16000 bits per sec to the Viking orbiters as relay stations and 250 bits per sec directly to earth. However, image data can also be stored on a lander tape recorder. About 10 million bits of image data will be transmitted during most days of the 60-day-long mission planned for each lander.
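
A rough arithmetic check of the downlink figures quoted above (a sketch; the rates and daily data volume are taken directly from the abstract):

```python
# Transmission-time check for ~10 million bits of image data per day.
image_bits_per_day = 10e6
relay_rate_bps = 16_000    # via the Viking orbiters acting as relay stations
direct_rate_bps = 250      # direct to Earth

print(f"relay link:  {image_bits_per_day / relay_rate_bps / 60:.0f} minutes per day")
print(f"direct link: {image_bits_per_day / direct_rate_bps / 3600:.1f} hours per day")
```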

Huck, F. O.; Taylor, G. R.; Mccall, H. F.; Patterson, W. R.

1975-01-01

342

The Dark Energy Camera (DECam)  

E-print Network

In this paper we describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). It consists of a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), a modern data acquisition and control system and the associated infrastructure for operation in the prime focus cage. The focal plane includes 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the roughly 2.2 degree diameter field of view and 12 smaller 2K x 2K CCDs for guiding, focus and alignment. The CCDs will be 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). Production of the CCDs and fabrication of the optics, mechanical structure, mechanisms, and control system for DECam are underway; delivery of the instrument to CTIO is scheduled for 2010.
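
As a quick consistency check of the focal-plane figures above (a sketch using only numbers quoted in the abstract), a mosaic of 62 2K x 4K CCDs at 0.27"/pixel does indeed cover roughly 3 square degrees:

```python
# Field-of-view estimate from the quoted CCD count and pixel scale.
n_ccds = 62
pixels_per_ccd = 2048 * 4096
pixel_scale_arcsec = 0.27

total_pixels = n_ccds * pixels_per_ccd
area_deg2 = total_pixels * pixel_scale_arcsec**2 / 3600.0**2
print(f"{total_pixels/1e6:.0f} Mpixel -> ~{area_deg2:.1f} square degrees of sky")
```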

K. Honscheid; D. L. DePoy; for the DES Collaboration

2008-10-20

343

HDTV camera using digital contour  

NASA Astrophysics Data System (ADS)

The authors have developed the HSC-100 solid-state High-Definition TV camera. The camera promises S/N and sensitivity improvements of 6 dB, far superior to conventional HDTV cameras, owing to its imaging-device construction. It also improves picture quality through the use of a digital contour unit. To satisfy HDTV (SMPTE 240M) requirements, a photo-conductive layered semiconductor imaging device (PSID) with 2 million pixels has been developed. An amorphous silicon (a-Si) layer is added to the CCD scanner in this device. The a-Si layer carries out photoelectric conversion, then an interline-transfer CCD reads out the photo-induced electric charges. This configuration provides a pixel aperture ratio of 100%, thereby improving sensitivity compared with existing models. The layer structure also permits a wide dynamic range. A digital contour unit was developed to improve the contour-corrector characteristics. S/N and frequency response are improved by introducing digital signal processing. A 56 dB S/N value is achieved with an 8-bit A/D converter; this S/N is about 10 dB better than that of conventional ultrasonic delay lines. In addition, digital processing improves frequency response and delay-time stability. A more natural contour-correction characteristic has been attained with a contour-correction signal derived from the luminance signal.

Sugiki, Tadashi; Nakao, Akria; Uchida, Tomoyuki

1992-08-01

344

MRI of surgically created pulmonary artery narrowing in the dog  

Microsoft Academic Search

Narrowing of the pulmonary arteries was created surgically in twelve dogs. In six of the dogs the narrowing was central (main pulmonary artery), and in the remaining six the narrowing was located peripherally at the hilar level of the right pulmonary artery beyond the pericardial reflection. MRI and angiography were performed in all dogs. MRI clearly delineated the site of

R. J. Hernandez; A. P. Rocchini; E. L. Bove; T. L. Chenevert; B. Gubin

1989-01-01

345

Measures on Mixing Angles  

E-print Network

We address the problem of the apparently very small magnitude of CP violation in the standard model, measured by the Jarlskog invariant J. In order to make statements about probabilities for certain values of J, we seek to find a natural measure on the space of Kobayashi-Maskawa matrices, the double quotient U(1)^2\\SU(3)/U(1)^2. We review several possible, geometrically motivated choices of the measure, and compute expectation values for powers of J for these measures. We find that different choices of the measure generically make the observed magnitude of CP violation appear finely tuned. Since the quark masses and the mixing angles are determined by the same set of Yukawa couplings, we then do a second calculation in which we take the known quark mass hierarchy into account. We construct the simplest measure on the space of 3 x 3 Hermitian matrices which reproduces this known hierarchy. Calculating expectation values for powers of J in this second approach, we find that values of J close to the observed value are now rather likely, and there does not seem to be any fine tuning. Our results suggest that the choice of Kobayashi-Maskawa angles is closely linked to the observed mass hierarchy. We close by discussing the corresponding case of neutrinos.

Gary W. Gibbons; Steffen Gielen; C. N. Pope; Neil Turok

2008-10-27

346

Measures on mixing angles  

SciTech Connect

We address the problem of the apparently very small magnitude of CP violation in the standard model, measured by the Jarlskog invariant J. In order to make statements about probabilities for certain values of J, we seek to find a natural measure on the space of Kobayashi-Maskawa matrices, the double quotient U(1)^2\SU(3)/U(1)^2. We review several possible, geometrically motivated choices of the measure, and compute expectation values for powers of J for these measures. We find that different choices of the measure generically make the observed magnitude of CP violation appear finely tuned. Since the quark masses and the mixing angles are determined by the same set of Yukawa couplings, we then do a second calculation in which we take the known quark mass hierarchy into account. We construct the simplest measure on the space of 3x3 Hermitian matrices which reproduces this known hierarchy. Calculating expectation values for powers of J in this second approach, we find that values of J close to the observed value are now rather likely, and there does not seem to be any fine-tuning. Our results suggest that the choice of Kobayashi-Maskawa angles is closely linked to the observed mass hierarchy. We close by discussing the corresponding case of neutrinos.
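
For concreteness, the Jarlskog invariant discussed in both records above can be written in the standard parametrization as J = s12 c12 s23 c23 s13 c13^2 sin(delta). A minimal sketch follows, using roughly PDG-like mixing angles and CP phase as illustrative inputs (they are not values taken from the paper):

```python
import math

def jarlskog(theta12, theta13, theta23, delta):
    """Jarlskog invariant in the standard parametrization:
    J = s12*c12 * s23*c23 * s13*c13**2 * sin(delta)."""
    s12, c12 = math.sin(theta12), math.cos(theta12)
    s13, c13 = math.sin(theta13), math.cos(theta13)
    s23, c23 = math.sin(theta23), math.cos(theta23)
    return s12 * c12 * s23 * c23 * s13 * c13**2 * math.sin(delta)

# Roughly PDG-like quark mixing angles (radians) and CP phase, for illustration only.
J = jarlskog(0.227, 0.0037, 0.042, 1.2)
print(f"J ~ {J:.1e}")   # of order 3e-5, the "small" number discussed above
```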

Gibbons, Gary W.; Gielen, Steffen [D.A.M.T.P., Centre for Mathematical Sciences, Cambridge University, Wilberforce Road, Cambridge, CB3 0WA (United Kingdom); Pope, C. N. [D.A.M.T.P., Centre for Mathematical Sciences, Cambridge University, Wilberforce Road, Cambridge, CB3 0WA (United Kingdom); George P. and Cynthia W. Mitchell Institute for Fundamental Physics and Astronomy, Texas A and M University, College Station, Texas 77843-4242 (United States); Turok, Neil [D.A.M.T.P., Centre for Mathematical Sciences, Cambridge University, Wilberforce Road, Cambridge, CB3 0WA (United Kingdom); Perimeter Institute for Theoretical Physics, 31 Caroline St. N., Waterloo, Ontario, Canada N2L 2Y5 (Canada)

2009-01-01

347

Variable angle correlation spectroscopy  

SciTech Connect

In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY) is described and demonstrated with {sup 13}C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from poly-crystalline systems reveal information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiments is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, study molecular dynamics in the intermediate time regime, and to examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.

Lee, Y.K. [Univ. of California, Berkeley, CA (United States)]|[Lawrence Berkeley Lab., CA (United States). Chemical Biodynamics Div.

1994-05-01

348

Television camera video level control system  

NASA Technical Reports Server (NTRS)

A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (inventors)

1985-01-01

349

Optimum Projection Angle for Attaining Maximum Distance in a Soccer Punt Kick  

PubMed Central

To produce the greatest horizontal distance in a punt kick the ball must be projected at an appropriate angle. Here, we investigated the optimum projection angle that maximises the distance attained in a punt kick by a soccer goalkeeper. Two male players performed many maximum-effort kicks using projection angles of between 10° and 90°. The kicks were recorded by a video camera at 100 Hz and a 2 D biomechanical analysis was conducted to obtain measures of the projection velocity, projection angle, projection height, ball spin rate, and foot velocity at impact. The player’s optimum projection angle was calculated by substituting mathematical equations for the relationships between the projection variables into the equations for the aerodynamic flight of a soccer ball. The calculated optimum projection angles were in agreement with the player’s preferred projection angles (40° and 44°). In projectile sports even a small dependence of projection velocity on projection angle is sufficient to produce a substantial shift in the optimum projection angle away from 45°. In the punt kicks studied here, the optimum projection angle was close to 45° because the projection velocity of the ball remained almost constant across all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle and so the optimum projection angle is well below 45°. Key points The optimum projection angle that maximizes the distance of a punt kick by a soccer goalkeeper is about 45°. The optimum projection angle is close to 45° because the projection velocity of the ball is almost the same at all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the optimum projection angle is well below 45° because the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle. PMID:24149315
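
A drag-free sketch of the argument above: the optimum launch angle stays near 45 degrees when the release speed is independent of angle, and drops well below 45 degrees when the speed falls with angle (as in throwing and jumping). Aerodynamics and ball spin are ignored here, and the release height and speed-angle coefficient k are hypothetical values, not measurements from the study.

```python
import numpy as np

g = 9.81   # m/s^2
h = 0.5    # hypothetical release height above the ground (m)

def release_speed(theta_deg, v0=25.0, k=0.0):
    """Projection speed vs. angle; k (m/s per degree) models speed lost at steep angles."""
    return v0 - k * theta_deg

def drag_free_range(theta_deg, v):
    """Range of a point projectile released at height h, neglecting aerodynamics."""
    th = np.radians(theta_deg)
    vx, vy = v * np.cos(th), v * np.sin(th)
    t_flight = (vy + np.sqrt(vy**2 + 2.0 * g * h)) / g
    return vx * t_flight

angles = np.arange(10.0, 90.0, 0.5)
for k in (0.0, 0.15):   # constant speed (punt-kick-like) vs. speed falling with angle
    ranges = [drag_free_range(a, release_speed(a, k=k)) for a in angles]
    print(f"k = {k:.2f} m/s per degree -> optimum angle ~ {angles[int(np.argmax(ranges))]:.1f} deg")
```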

Linthorne, Nicholas P.; Patel, Dipesh S.

2011-01-01

350

Electrostatic camera system functional design study  

NASA Technical Reports Server (NTRS)

A functional design study for an electrostatic camera system for application to planetary missions is presented. The electrostatic camera can produce and store a large number of pictures and provide for transmission of the stored information at arbitrary times after exposure. Preliminary configuration drawings and circuit diagrams for the system are illustrated. The camera system's size, weight, power consumption, and performance are characterized. Tradeoffs between system weight, power, and storage capacity are identified.

Botticelli, R. A.; Cook, F. J.; Moore, R. F.

1972-01-01

351

The Mars Science Laboratory Engineering Cameras  

NASA Astrophysics Data System (ADS)

NASA's Mars Science Laboratory (MSL) Rover is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover cameras described in Maki et al. (J. Geophys. Res. 108(E12): 8071, 2003). Images returned from the engineering cameras will be used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The Navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The Hazard Avoidance Cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a 1024×1024 pixel detector and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer "A" and the other set is connected to rover computer "B". The Navcams and Front Hazcams each provide similar views from either computer. The Rear Hazcams provide different views from the two computers due to the different mounting locations of the "A" and "B" Rear Hazcams. This paper provides a brief description of the engineering camera properties, the locations of the cameras on the vehicle, and camera usage for surface operations.
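
A rough cross-check of the camera geometry quoted above (a sketch; the linear estimate ignores lens distortion, which matters most for the wide-angle Hazcam optics):

```python
import math

pixels = 1024   # detector width quoted above
for name, ifov_mrad, quoted_fov_deg in [("Navcam", 0.82, 45), ("Hazcam", 2.1, 124)]:
    # Total FOV ~ per-pixel angle * number of pixels (small-angle, distortion-free estimate).
    fov_deg = math.degrees(ifov_mrad * 1e-3 * pixels)
    print(f"{name}: ~{fov_deg:.0f} deg across the detector (quoted: {quoted_fov_deg} deg)")
```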

Maki, J.; Thiessen, D.; Pourangi, A.; Kobzeff, P.; Litwin, T.; Scherr, L.; Elliott, S.; Dingizian, A.; Maimone, M.

2012-09-01

352

HiRISE: The People's Camera  

Microsoft Academic Search

The High Resolution Imaging Science Experiment (HiRISE) camera, orbiting Mars since 2006 on the Mars Reconnaissance Orbiter (MRO), has returned more than 17,000 large images with scales as small as 25 cm/pixel. From its beginning, the HiRISE team has followed "The People's Camera" concept, with rapid release of useful images, explanations, and tools, and facilitating public image suggestions. The camera

A. S. McEwen; E. Eliason; V. C. Gulick; Y. Spinoza; R. A. Beyer

2010-01-01

353

Multiplex imaging with multiple-pinhole cameras  

NASA Technical Reports Server (NTRS)

When making photographs in X rays or gamma rays with a multiple-pinhole camera, the individual images of an extended object such as the sun may be allowed to overlap. Then the situation is in many ways analogous to that in a multiplexing device such as a Fourier spectroscope. Some advantages and problems arising with such use of the camera are discussed, and expressions are derived to describe the relative efficacy of three exposure/postprocessing schemes using multiple-pinhole cameras.

Brown, C.

1974-01-01

354

Development of biostereometric experiments. [stereometric camera system  

NASA Technical Reports Server (NTRS)

The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.

Herron, R. E.

1978-01-01

355

A grazing incidence x-ray streak camera for ultrafast, single-shot measurements  

SciTech Connect

An ultrafast x-ray streak camera has been realized using a grazing incidence reflection photocathode. X-rays are incident on a gold photocathode at a grazing angle of 20 degrees and photoemitted electrons are focused by a large-aperture magnetic solenoid lens. The streak camera has high quantum efficiency, 600 fs temporal resolution, and 6 mm imaging length in the spectral direction. Its single-shot capability eliminates temporal smearing due to sweep jitter, and allows recording of the ultrafast dynamics of samples that undergo non-reversible changes.

Feng, Jun; Engelhorn, K.; Cho, B.I.; Lee, H.J.; Greaves, M.; Weber, C.P.; Falcone, R.W.; Padmore, H. A.; Heimann, P.A.

2010-02-18

356

Triangles: Finding Interior Angle Measures  

NSDL National Science Digital Library

In this lesson plan, students will start with a hands-on activity and then experiment with a GeoGebra-based computer model to investigate and discover the Triangle Angle Sum Theorem. Then they will use the Triangle Angle Sum Theorem to write and solve equations and find missing angle measures in a variety of examples.

2012-11-25

357

The multi-angle view of MISR detects oil slicks under sun glitter conditions  

Microsoft Academic Search

We tested the use of the Multi-angle Imaging SpectroRadiometer (MISR) for detecting oil spills in the Lake Maracaibo, Venezuela, that were caused by a series of accidents between December 2002 and March 2003. The MISR sensor, onboard the Terra satellite, utilises nine cameras pointed at fixed angles, ranging from nadir to ±70.5°. Based upon the Bidirectional Reflectance Factor, a contrast

Guillem Chust; Yolanda Sagarminaga

2007-01-01

358

Angle-resolved scattering spectroscopy of explosives using an external cavity quantum cascade laser  

SciTech Connect

Investigation of angle-resolved scattering from solid explosives residues on a car door for non-contact sensing geometries. Illumination with a mid-infrared external cavity quantum cascade laser tuning between 7 and 8 microns was detected both with a sensitive single point detector and a hyperspectral imaging camera. Spectral scattering phenomena were discussed and possibilities for hyperspectral imaging at large scattering angles were outlined.

Suter, Jonathan D.; Bernacki, Bruce E.; Phillips, Mark C.

2012-04-01

359

True-color night vision cameras  

NASA Astrophysics Data System (ADS)

This paper describes True-Color Night Vision cameras that are sensitive to the visible to near-infrared (V-NIR) portion of the spectrum allowing for the "true-color" of scenes and objects to be displayed and recorded under low-light-level conditions. As compared to traditional monochrome (gray or green) night vision imagery, color imagery has increased information content and has proven to enable better situational awareness, faster response time, and more accurate target identification. Urban combat environments, where rapid situational awareness is vital, and marine operations, where there is inherent information in the color of markings and lights, are example applications that can benefit from True-Color Night Vision technology. Two different prototype cameras, employing two different true-color night vision technological approaches, are described and compared in this paper. One camera uses a fast-switching liquid crystal filter in front of a custom Gen-III image intensified camera, and the second camera is based around an EMCCD sensor with a mosaic filter applied directly to the sensor. In addition to visible light, both cameras utilize NIR to (1) increase the signal and (2) enable the viewing of laser aiming devices. The performance of the true-color cameras, along with the performance of standard (monochrome) night vision cameras, are reported and compared under various operating conditions in the lab and the field. In addition to subjective criterion, figures of merit designed specifically for the objective assessment of such cameras are used in this analysis.

Kriesel, Jason; Gat, Nahum

2007-04-01

360

Advanced High-Definition Video Cameras  

NASA Technical Reports Server (NTRS)

A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.
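
A back-of-envelope sketch of the uncompressed data rate implied by the 3,840 × 2,160, 30 frame/s format mentioned above; the 24-bit RGB depth is an assumption, since the abstract does not state a bit depth.

```python
# Raw (uncompressed) data-rate estimate for the format described above.
width, height, fps = 3840, 2160, 30
bits_per_pixel = 24   # assumed RGB depth; not stated in the abstract

pixel_rate = width * height * fps
raw_bit_rate = pixel_rate * bits_per_pixel
print(f"{pixel_rate/1e6:.0f} Mpixel/s, ~{raw_bit_rate/1e9:.1f} Gbit/s uncompressed")
```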

Glenn, William

2007-01-01

361

Tissue characterization by using narrow band imaging  

NASA Astrophysics Data System (ADS)

NBI (Narrow Band Imaging) was first introduced to the market in 2005 as a technique for enhancing the image contrast of capillaries on a mucosal surface (1). It is classified as an Optical-Digital Method for Image-Enhanced Endoscopy (2). To date, the application has spread widely, not only to gastrointestinal fields such as the esophagus, stomach and colon but also to organs such as the bronchus and bladder. The main target tissue of NBI enhancement is capillaries. However, findings of many clinical studies conducted by endoscopy physicians have revealed that NBI observation can enhance other structures in addition to capillaries. There is a close relationship between those enhanced structures and the histological microstructure of a tissue. This report introduces the tissue microstructures enhanced by NBI and discusses the possibility of optimizing the illumination wavelengths for observing living tissues.

Gono, Kazuhiro

2010-02-01

362

Isolating prompt photons with narrow cones  

NASA Astrophysics Data System (ADS)

We discuss the isolation of prompt photons in hadronic collisions by means of narrow isolation cones and the QCD computation of the corresponding cross sections. We reconsider the occurrence of large perturbative terms with logarithmic dependence on the cone size and their impact on the fragmentation scale dependence. We cure the apparent perturbative violation of unitarity for small cone sizes, which had been noticed earlier in next-to-leading-order (NLO) calculations, by resumming the leading logarithmic dependence on the cone size. We discuss possible implications regarding the implementation of some hollow cone variants of the cone criterion, which simulate the experimental difficulty of imposing isolation inside the region filled by the electromagnetic shower that develops in the calorimeter.

Catani, S.; Fontannaz, M.; Guillet, J. Ph.; Pilon, E.

2013-09-01

363

Active Brownian motion in a narrow channel  

E-print Network

We review recent advances in rectification control of artificial microswimmers, also known as Janus particles, diffusing along narrow, periodically corrugated channels. The swimmer self-propulsion mechanism is modeled so as to incorporate a nonzero torque (propulsion chirality). We first summarize the effects of chirality on the autonomous current of microswimmers freely diffusing in channels of different geometries. In particular, left-right and upside-down asymmetric channels are shown to exhibit different transport properties. We then report new results on the dependence of the diffusivity of chiral microswimmers on the channel geometry and their own self-propulsion mechanism. The self-propulsion torque turns out to play a key role as a transport control parameter.
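
A minimal sketch of the kind of model described above: an overdamped chiral active Brownian particle integrated with a simple Euler scheme. The parameters are arbitrary illustrative values, and the corrugated-channel confinement discussed in the abstract is omitted for brevity.

```python
import numpy as np

# Euler integration of a chiral active Brownian particle in free 2D space.
rng = np.random.default_rng(1)
v0, omega, d_rot, dt, steps = 1.0, 0.5, 0.1, 1e-2, 20_000   # speed, chiral torque, rot. diffusion

x = y = phi = 0.0
for _ in range(steps):
    phi += omega * dt + np.sqrt(2.0 * d_rot * dt) * rng.normal()   # orientation: chiral drift + noise
    x += v0 * np.cos(phi) * dt
    y += v0 * np.sin(phi) * dt

print(f"net displacement after {steps * dt:.0f} time units: {np.hypot(x, y):.2f} "
      "(chirality bends the trajectory into noisy circles)")
```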

Xue Ao; Pulak Kumar Ghosh; Yunyun Li; Gerhard Schmid; Peter Hänggi; Fabio Marchesoni

2014-09-17

364

Driven polymer translocation through a narrow pore.  

PubMed Central

Motivated by experiments in which a polynucleotide is driven through a proteinaceous pore by an electric field, we study the diffusive motion of a polymer threaded through a narrow channel with which it may have strong interactions. We show that there is a range of polymer lengths in which the system is approximately translationally invariant, and we develop a coarse-grained description of this regime. From this description, general features of the distribution of times for the polymer to pass through the pore may be deduced. We also introduce a more microscopic model. This model provides a physically reasonable scenario in which, as in experiments, the polymer's speed depends sensitively on its chemical composition, and even on its orientation in the channel. Finally, we point out that the experimental distribution of times for the polymer to pass through the pore is much broader than expected from simple estimates, and speculate on why this might be. PMID:10512806

Lubensky, D K; Nelson, D R

1999-01-01

365

Limbus Impact on Off-angle Iris Degradation  

SciTech Connect

The accuracy of iris recognition depends on the quality of data capture and is negatively affected by several factors such as angle, occlusion, and dilation. Off-angle iris recognition is a new research focus in biometrics that tries to address several issues including corneal refraction, complex 3D iris texture, and blur. In this paper, we present an additional significant challenge that degrades the performance of the off-angle iris recognition systems, called the limbus effect . The limbus is the region at the border of the cornea where the cornea joins the sclera. The limbus is a semitransparent tissue that occludes a side portion of the iris plane. The amount of occluded iris texture on the side nearest the camera increases as the image acquisition angle increases. Without considering the role of the limbus effect, it is difficult to design an accurate off-angle iris recognition system. To the best of our knowledge, this is the first work that investigates the limbus effect in detail from a biometrics perspective. Based on results from real images and simulated experiments with real iris texture, the limbus effect increases the hamming distance score between frontal and off-angle iris images ranging from 0.05 to 0.2 depending upon the limbus height.
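
The abstract reports limbus-induced increases of 0.05 to 0.2 in the Hamming-distance score; below is a toy sketch of the fractional Hamming distance commonly used for iris codes, with synthetic random bit strings standing in for real iris templates (this is not the authors' pipeline).

```python
import numpy as np

def fractional_hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of disagreeing bits among bits valid in both codes
    (0 = identical codes, ~0.5 = unrelated codes)."""
    valid = mask_a & mask_b
    return ((code_a ^ code_b) & valid).sum() / valid.sum()

rng = np.random.default_rng(0)
frontal = rng.integers(0, 2, 2048).astype(bool)   # toy "frontal" iris code
flips = rng.random(2048) < 0.10                   # pretend 10% of bits change off-angle
off_angle = frontal ^ flips
mask = np.ones(2048, dtype=bool)                  # all bits treated as valid here

print(f"HD(frontal, off-angle) = {fractional_hamming_distance(frontal, off_angle, mask, mask):.3f}")
```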

Karakaya, Mahmut [ORNL; Barstow, Del R [ORNL; Santos-Villalobos, Hector J [ORNL; Thompson, Joseph W [ORNL; Bolme, David S [ORNL; Boehnen, Chris Bensing [ORNL

2013-01-01

366

An Instability in Narrow Planetary Rings  

NASA Astrophysics Data System (ADS)

We will present our work investigating the behavior of narrow planetary rings with low dispersion velocities. Such a narrow ring will initially be unstable to self-gravitational collapse. After the collapse, the ring is collisionally very dense. At this stage, it is subject to a new instability. Waves appear on the inner and outer edges of the ring within half of an orbital period. The ring then breaks apart radially, taking approximately a quarter of an orbital period to do so. As clumps of ring particles expand radially away from the dense ring, Kepler shear causes these clumps to stretch out azimuthally, and eventually collapse into a new set of dense rings. Small-scale repetitions of the original instability in these new rings eventually lead to a stabilized broad ring with higher dispersion velocities than the initial ring. Preliminary results indicate that this instability may be operating on small scales in broad rings in the wake-like features seen by Salo and others. Some intriguing properties have been observed during this instability. The most significant is a coherence in the epicyclic phases of the particles. Both self-gravity and collisions in the ring operated to create and enforce this coherence. The coherence might also be responsible for the instability to radial expansion. We also observe that guiding centers of the particles do not migrate to the center of the ring during the collapse phase of the ring. In fact, guiding centers move radially away from the core of the ring during this phase, consistent with global conservation of angular momentum. We will show the results of our simulations to date, including movies of the evolution of various parameters. (Audience members wanting popcorn are advised to bring their own.) This work is supported by a NASA Graduate Student Research Program grant and by the Cassini mission.

Weiss, J. W.; Stewart, G. R.

2003-08-01

367

THz emission spectroscopy of narrow bandgap semiconductors  

NASA Astrophysics Data System (ADS)

This dissertation presents a model for emission of electromagnetic transients in the terahertz (THz) region from optically excited narrow bandgap semiconductors. This model explains the THz emission from the surface field acceleration mechanism and from the photo-Dember effect independently. It relates intrinsic parameters of the semiconductor, namely the majority and minority carrier concentrations and the mobilities, to the radiated THz field. The conditions that enhance the THz emission process in the case of surface field acceleration and in the case of the photo-Dember effect have been clearly identified. In this work three types of narrow bandgap semiconductors were investigated as sources of THz radiation. First the THz emission from a set of Te doped GaSb samples was studied. GaSb:Te is an interesting material because samples can be grown with a broad range of carrier concentrations. A GaxIn1-xSb ingot was also studied. In this material system the electron mobility and the bandgap range from the GaSb values (Eg = 0.73 eV) to the InSb values (Eg = 0.17 eV). This is important in order to understand the extent to which the reduced bandgap of InSb is favorable for the THz emission process. This ingot was also used as a demonstration of how the THz time-domain spectroscopy technique can reveal the native defect density distributions in this ingot. Additionally the THz emission from InN was studied. The recently revised bandgap value of InN makes it a good material for optically excited THz emission. In order to investigate the ultrafast scattering mechanisms in InN, an ultrafast photo-reflection setup was used. Subpicosecond non-radiative recombination was observed in silicon doped InN films.

Ascazubi, Ricardo

368

Mission report on the Orbiter Camera Payload System (OCPS) Large Format Camera (LFC) and Attitude Reference System (ARS)  

NASA Technical Reports Server (NTRS)

The Orbiter Camera Payload System (OCPS) is an integrated photographic system which is carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), a precision wide-angle cartographic instrument that is capable of producing high resolution stereo photography of great geometric fidelity in multiple base-to-height (B/H) ratios. A secondary, supporting system to the LFC is the Attitude Reference System (ARS), which is a dual lens Stellar Camera Array (SCA) and camera support structure. The SCA is a 70-mm film system which is rigidly mounted to the LFC lens support structure and which, through the simultaneous acquisition of two star fields with each earth-viewing LFC frame, makes it possible to determine precisely the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with Shuttle launch conditions and the on-orbit environment. The full-up OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on October 5, 1984, as a major payload aboard mission STS 41-G. This report documents the system design, the ground testing, the flight configuration, and an analysis of the results obtained during the Challenger mission STS 41-G.

Mollberg, Bernard H.; Schardt, Bruton B.

1988-01-01

369

Narrow linewidth operation of the RILIS titanium: Sapphire laser at ISOLDE/CERN  

NASA Astrophysics Data System (ADS)

A narrow linewidth operating mode for the Ti:sapphire laser of the CERN ISOLDE Resonance Ionization Laser Ion Source (RILIS) has been developed. This satisfies the laser requirements for the programme of in-source resonance ionization spectroscopy measurements and improves the selectivity for isomer separation using RILIS. A linewidth reduction from typically 10 GHz down to 1 GHz was achieved by the intra-cavity insertion of a second (thick) Fabry-Pérot etalon. Reliable operation during a laser scan was achieved through motorized control of the tilt angle of each etalon. A scanning, stabilization and mode cleaning procedure was developed and implemented in LabVIEW. The narrow linewidth operation was confirmed in a high resolution spectroscopy study of francium isotopes by the Collinear Resonance Ionization Spectroscopy experiment. The resulting laser scans demonstrate the suitability of the laser, in terms of linewidth, spectral purity and stability for high resolution in-source spectroscopy and isomer selective ionization using the RILIS.
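
The linewidth narrowing above relies on an intra-cavity Fabry-Pérot etalon; as a sketch, the standard relations FSR = c/(2nL) and FWHM = FSR/finesse show how etalon thickness and coating reflectivity set the frequency selectivity. The thicknesses and reflectivity below are hypothetical illustrative values, not the actual RILIS parts.

```python
import math

C = 2.998e8   # speed of light, m/s

def etalon_fsr_hz(thickness_m, n=1.0):
    """Free spectral range of a Fabry-Perot etalon: c / (2 n L)."""
    return C / (2.0 * n * thickness_m)

def etalon_fwhm_hz(thickness_m, reflectivity, n=1.0):
    """Transmission-peak width: FSR divided by the reflectivity finesse."""
    finesse = math.pi * math.sqrt(reflectivity) / (1.0 - reflectivity)
    return etalon_fsr_hz(thickness_m, n) / finesse

# Hypothetical thin and thick intra-cavity etalons (illustrative values only).
for label, t, r in [("thin ", 0.3e-3, 0.3), ("thick", 6.0e-3, 0.3)]:
    print(f"{label}: FSR = {etalon_fsr_hz(t)/1e9:6.1f} GHz, "
          f"peak FWHM = {etalon_fwhm_hz(t, r)/1e9:6.1f} GHz")
```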

Rothe, S.; Fedosseev, V. N.; Kron, T.; Marsh, B. A.; Rossel, R. E.; Wendt, K. D. A.

2013-12-01

370

Heterodyne Interferometer Angle Metrology  

NASA Technical Reports Server (NTRS)

A compact, high-resolution angle measurement instrument has been developed that is based on a heterodyne interferometer. The common-path heterodyne interferometer metrology is used to measure displacements of a reflective target surface. In the interferometer setup, an optical mask is used to sample the measurement laser beam reflecting back from a target surface. Angular rotations, around two orthogonal axes in a plane perpendicular to the measurement-beam propagation direction, are determined simultaneously from the relative displacement measurement of the target surface. The device is used in a tracking telescope system where pitch and yaw measurements of a flat mirror were simultaneously performed with a sensitivity of 0.1 nrad per second and a measuring range of 0.15 mrad at a working distance on the order of a meter. The nonlinearity of the device was also measured to be less than one percent over the measurement range.
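
A minimal sketch of the geometry behind such measurements: if the sampled beam footprints on the target are separated by a known baseline, a tilt shows up as a difference between the two displacement readings. All numbers below are hypothetical.

```python
baseline_m = 0.02    # separation of the two sampled footprints on the mirror (hypothetical)
d_left_m = 1.2e-9    # displacement measured at one footprint
d_right_m = 4.2e-9   # displacement measured at the other footprint

# Small-angle approximation: tilt about the axis perpendicular to the baseline.
tilt_rad = (d_right_m - d_left_m) / baseline_m
print(f"tilt = {tilt_rad:.2e} rad = {tilt_rad * 1e9:.0f} nrad")
```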

Hahn, Inseob; Weilert, Mark A.; Wang, Xu; Goullioud, Renaud

2010-01-01

371

Wide dynamic range video camera  

NASA Technical Reports Server (NTRS)

A television camera apparatus is disclosed in which bright objects are attenuated to fit within the dynamic range of the system, while dim objects are not. The apparatus receives linearly polarized light from an object scene, the light being passed by a beam splitter and focused on the output plane of a liquid crystal light valve. The light valve is oriented such that, with no excitation from the cathode ray tube, all light is rotated 90 deg and focused on the input plane of the video sensor. The light is then converted to an electrical signal, which is amplified and used to excite the CRT. The resulting image is collected and focused by a lens onto the light valve which rotates the polarization vector of the light to an extent proportional to the light intensity from the CRT. The overall effect is to selectively attenuate the image pattern focused on the sensor.

Craig, G. D. (inventor)

1985-01-01

372

Mountain glaciers caught on camera  

NASA Astrophysics Data System (ADS)

Many glaciers around the world are melting, and new research is showing some of the dramatic details. Ulyana Horodyskyj, a graduate student at the Cooperative Institute for Research in Environmental Sciences (CIRES), University of Colorado at Boulder, set up cameras to take time-lapse photographs of three lakes on a glacier in Nepal. This allowed her and her colleagues to see the supraglacial lake drain in real time for the first time, making it possible to estimate how much water was involved and how long it took for the lake to drain and refill. Horodyskyj said in a press conference at the AGU Fall Meeting that such observations of supraglacial lakes are valuable because in a warming climate, melting glaciers can lead to formation of supraglacial lakes.

Balcerak, Ernie

2011-12-01

373

Infrared polarimetric camera system development  

NASA Astrophysics Data System (ADS)

The Defence Evaluation and Research Agency (DERA) has a requirement for an IRPC system to detect surface-laid and buried anti-tank landmines in support of Phase 2 of the REmote Minefield Detection System (REMIDS) Technology Demonstration Program. Nichols Research Corporation is currently under contract to DERA to design and fabricate the IRPC system for integration in the REMIDS TDP. The IRPC is a Stokes 4-vector IR camera system designed to operate from a static tower, a moving elevated surface platform or a moving airborne platform, and will be used to demonstrate the usefulness of passive IR polarimetry for mine and minefield detection. DERA will use the IRPC system to investigate the feasibility of using polarimetric techniques to detect buried and surface-laid mines from an airborne platform when operated in conjunction with an ultra-wideband SAR.

Barnes, Howard B.; Jones, Michael W.; Bishop, Paul K.

1999-08-01

374

A Second Generation Multi-Angle Imaging SpectroRadiometer (MISR-2)  

NASA Technical Reports Server (NTRS)

The Multi-angle Imaging SpectroRadiometer (MISR) has been in Earth orbit since December 1999 on NASA's Terra spacecraft. This instrument provides new ways of looking at the Earth's atmosphere, clouds, and surface for the purpose of understanding the Earth's ecology, environment, and climate. To facilitate the potential future continuation of MISR's multi-angle observations, a study was undertaken in 1999 and 2000 under the Instrument Incubator Program (IIP) of NASA Code Y's Earth Science Technology Office (ESTO) to investigate and demonstrate the feasibility of a successor to MISR that will have greatly reduced size and mass. The kernel of the program was the design, construction, and testing of a highly miniaturized camera, one of the nine that would probably be used on a future space borne MISR-like instrument. This demonstrated that the size and mass reduction of the optical system and camera electronics are possible and that filters can be assembled to meet the miniaturized packaging requirements. An innovative, reflective optics design was used, enabling the wavelength range to be extended into the shortwave infrared. This was the smallest all-reflective camera ever produced by the contractor. A study was undertaken to determine the feasibility of implementing nine (multi-angle) cameras within a single structure. This resulted in several possible configurations. It would also be possible to incorporate one of the cameras into an airborne instrument.

Bothwell, Graham; Diner, David J.; Pagano, Thomas S.; Duval, Valerie G.; Beregovski, Yuri; Hovland, Larry E.; Preston, Daniel J.

2001-01-01

375

Wall Angle Effects on Nozzle Separation Stability  

NASA Astrophysics Data System (ADS)

The presence of asymmetric side loads due to unstable separation within over-expanded rocket nozzles is well documented. Although progress has been made in developing understanding of this phenomenon through numerical and experimental means, the causes of these side loads have yet to be fully explained. The hypothesis examined within this paper is that there is a relationship between the nozzle wall angle at the point of separation and the stability of the flow separation. This was examined through an experimental investigation of a series of subscale over-expanded conical nozzles with half-angles of 8.3°, 10.4°, 12.6° and 14.8°. All had overall area ratios of 16:1, with separation occurring at approximately half the nozzle length (i.e. an area ratio of 4:1) under an overall pressure ratio of approximately 7:1 using air as the working fluid. The structure of the exhaust flow was observed and analysed by use of an optimised Schlieren visualisation system, coupled with a high speed digital camera. The exhaust flows of the 12.6° and 14.8° nozzles were seen to be stable throughout the recorded test period of 10 seconds. However, a small number of large fluctuations in the jet angle were seen to be present within the flowfield of the 10.4° nozzle, occurring at apparently random intervals through the test period. The flowfield of the 8.3° nozzle demonstrated near continuous, large angle deviations in the jet, with flow patterns containing thickened shear layers and apparent reattachment to the wall, something not previously identified in conical nozzles. These results were used to design a truncated ideal contour with an exit angle of over 10 degrees, in order to assess the possibility of designing conventional nozzles that separate stably over a wide range of pressure ratios. These tests were successful, potentially providing a simpler, cheaper alternative to altitude compensating nozzle devices. However, more work determining the nature of the separation and its causes is required.
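
As a sketch of why these nozzles run separated: with an area ratio of roughly 4:1 at the observed separation location and the quoted overall pressure ratio of about 7:1, one-dimensional isentropic relations put the wall pressure far below ambient. The implied criterion (attached flow cannot sustain a wall pressure much below ambient) is a standard rule of thumb, not the paper's own analysis.

```python
GAMMA = 1.4   # ratio of specific heats for air

def area_ratio(mach):
    """Isentropic area ratio A/A* as a function of Mach number."""
    term = (2.0 / (GAMMA + 1.0)) * (1.0 + 0.5 * (GAMMA - 1.0) * mach**2)
    return term ** ((GAMMA + 1.0) / (2.0 * (GAMMA - 1.0))) / mach

def supersonic_mach(ar, lo=1.0, hi=10.0):
    """Supersonic root of the area-Mach relation, found by bisection."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if area_ratio(mid) < ar else (lo, mid)
    return 0.5 * (lo + hi)

ar_at_separation = 4.0   # approximate area ratio where separation was observed
npr = 7.0                # chamber-to-ambient pressure ratio quoted above

m = supersonic_mach(ar_at_separation)
p_over_p0 = (1.0 + 0.5 * (GAMMA - 1.0) * m**2) ** (-GAMMA / (GAMMA - 1.0))
print(f"M ~ {m:.2f}, p_wall/p0 ~ {p_over_p0:.3f}, p_wall/p_ambient ~ {p_over_p0 * npr:.2f}")
# A wall pressure well below ambient is consistent with an over-expanded, separating flow.
```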

Aghababaie, A.; Taylor, N.

376

Investigating at the Moon With new Eyes: The Lunar Reconnaissance Orbiter Mission Camera (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Mission Camera (LROC) H. Hiesinger (1,2), M.S. Robinson (3), A.S. McEwen (4), E.P. Turtle (4), E.M. Eliason (4), B.L. Jolliff (5), M.C. Malin (6), and P.C. Thomas (7) (1) Brown Univ., Dept. of Geological Sciences, Providence RI 02912, Harald_Hiesinger@brown.edu, (2) Westfaelische Wilhelms-University, (3) Northwestern Univ., (4) LPL, Univ. of Arizona, (5) Washington Univ., (6) Malin Space Science Systems, (7) Cornell Univ. The Lunar Reconnaissance Orbiter (LRO) mission is scheduled for launch in October 2008 as a first step to return humans to the Moon by 2018. The main goals of the Lunar Reconnaissance Orbiter Camera (LROC) are to: 1) assess meter and smaller- scale features for safety analyses for potential lunar landing sites near polar resources, and elsewhere on the Moon; and 2) acquire multi-temporal images of the poles to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near permanent illumination over a full lunar year. In addition, LROC will return six high-value datasets such as 1) meter-scale maps of regions of permanent or near permanent illumination of polar massifs; 2) high resolution topography through stereogrammetric and photometric stereo analyses for potential landing sites; 3) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 4) a global 100-m/pixel basemap with incidence angles (60-80 degree) favorable for morphologic interpretations; 5) images of a variety of geologic units at sub-meter resolution to investigate physical properties and regolith variability; and 6) meter-scale coverage overlapping with Apollo Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to estimate hazards for future surface operations. LROC consists of two narrow-angle cameras (NACs) which will provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera (WAC) to acquire images at about 100 m/pixel in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). Each NAC has a 700-mm-focal-length optic that images onto a 5000-pixel CCD line-array, providing a cross-track field-of-view (FOV) of 2.86 degree. The NAC readout noise is better than 100 e- , and the data are sampled at 12 bits. Its internal buffer holds 256 MB of uncompressed data, enough for a full-swath image 25-km long or a 2x2 binned image 100-km long. The WAC has two 6-mm- focal-length lenses imaging onto the same 1000 x 1000 pixel, electronically shuttered CCD area-array, one imaging in the visible/near IR, and the other in the UV. Each has a cross-track FOV of 90 degree. From the nominal 50-km orbit, the WAC will have a resolution of 100 m/pixel in the visible, and a swath width of ˜100 km. The seven-band color capability of the WAC is achieved by color filters mounted directly 1 over the detector, providing different sections of the CCD with different filters [1]. The readout noise is less than 40 e- , and, as with the NAC, pixel values are digitized to 12-bits and may be subsequently converted to 8-bit values. The total mass of the LROC system is about 12 kg; the total LROC power consumption averages at 22 W (30 W peak). Assuming a downlink with lossless compression, LRO will produce a total of 20 TeraBytes (TB) of raw data. 
Production of higher-level data products will result in a total of 70 TB for Planetary Data System (PDS) archiving, 100 times larger than any previous mission. [1] Malin et al., JGR, 106, 17651-17672, 2001.
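
A quick arithmetic check of the NAC swath figures quoted above (a sketch using only numbers given in the abstract):

```python
# NAC swath from the quoted pixel scale and line-array width (nominal 50-km orbit).
pixel_scale_m = 0.5      # m/pixel
pixels_per_line = 5000   # CCD line-array width per NAC
n_cameras = 2            # two NACs imaging side by side

swath_per_nac_km = pixel_scale_m * pixels_per_line / 1000.0
print(f"{swath_per_nac_km:.1f} km per NAC, ~{swath_per_nac_km * n_cameras:.0f} km combined "
      "(matching the quoted 5-km panchromatic swath)")
```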

Hiesinger, H.; Robinson, M. S.; McEwen, A. S.; Turtle, E. P.; Eliason, E. M.; Jolliff, B. L.; Malin, M. C.; Thomas, P. C.

377

Equilibrium contact angle or the most-stable contact angle?  

PubMed

It is well-established that the equilibrium contact angle in a thermodynamic framework is an "unattainable" contact angle. Instead, the most-stable contact angle obtained from mechanical stimuli of the system is indeed experimentally accessible. Monitoring the susceptibility of a sessile drop to a mechanical stimulus makes it possible to identify the most-stable drop configuration within the practical range of contact angle hysteresis. Two different stimuli may be used with sessile drops: mechanical vibration and tilting. The drop most stable against vibration should reveal an unchanging contact angle, whereas the drop most stable against gravity should reveal the highest resistance to sliding down. After the corresponding mechanical stimulus, once the excited drop configuration is examined, the focus will be on the contact angle of the initial drop configuration. This methodology requires extensive mapping of the static drop configurations with different stable contact angles. The most-stable contact angle, together with the advancing and receding contact angles, completes the description of physically realizable configurations of a solid-liquid system. Since the most-stable contact angle is energetically significant, it may be used in the Wenzel, Cassie or Cassie-Baxter equations accordingly, or for surface energy evaluation. PMID:24140073
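
For reference, the Wenzel and Cassie-Baxter relations mentioned above in a minimal sketch, evaluated with hypothetical inputs (the Young angle, roughness factor, and solid fraction are illustrative values, not from the paper):

```python
import math

def wenzel(theta_young_deg, roughness):
    """Wenzel relation: cos(theta_W) = r * cos(theta_Y), with roughness factor r >= 1."""
    c = roughness * math.cos(math.radians(theta_young_deg))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def cassie_baxter(theta_young_deg, solid_fraction):
    """Cassie-Baxter relation for a composite solid/air interface:
    cos(theta_CB) = f * (cos(theta_Y) + 1) - 1."""
    c = solid_fraction * (math.cos(math.radians(theta_young_deg)) + 1.0) - 1.0
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

# Hypothetical inputs: intrinsic Young angle of 110 deg on a rough or composite surface.
print(f"Wenzel (r = 1.5):        {wenzel(110.0, 1.5):.1f} deg")
print(f"Cassie-Baxter (f = 0.3): {cassie_baxter(110.0, 0.3):.1f} deg")
```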

Montes Ruiz-Cabello, F J; Rodríguez-Valverde, M A; Cabrerizo-Vílchez, M A

2014-04-01

378

Digital Cameras in the K-12 Classroom.  

ERIC Educational Resources Information Center

This paper discusses the use of digital cameras in K-12 education. Examples are provided of the integration of the digital camera and visual images into: reading and writing; science, social studies, and mathematics; projects; scientific experiments; desktop publishing; visual arts; data analysis; computer literacy; classroom atmosphere; and…

Clark, Kenneth; Hosticka, Alice; Bedell, Jacqueline

379

Calibrating and characterizing intensified video cameras radiometrically  

Microsoft Academic Search

Multispectral, hyperspectral, and polarization filters have been shown to provide additional discriminants when searching for mines and other obstacles, but they demand more illumination for the sensing system. Conventional CCD video cameras, when used through such filters, fail at sunset or soon after. It is tempting to employ an automatic-gain intensified camera to push this time deeper into the night

Harold R. Suiter; Chuong N. Pham; Kenneth R. Tinsley

2003-01-01

380

Using a digital camera to study motion  

Microsoft Academic Search

A digital camera can easily be used to make a video record of a range of motions and interactions of objects - shm, free-fall and collisions, both elastic and inelastic. The video record allows measurements of displacement and time, and hence calculation of velocities, and practice with the standard formulas for motions and collisions. The camera extends the range of

Andrew J. McNeil; Steven Daniel

381

Smart Camera Networks in Virtual Reality  

Microsoft Academic Search

This paper presents our research towards smart camera networks capable of carrying out advanced surveillance tasks with little or no human supervision. A unique centerpiece of our work is the combination of computer graphics, artificial life, and computer vision simulation technologies to develop such networks and experiment with them. Specifically, we demonstrate a smart camera network comprising static and active

Faisal Qureshi; Demetri Terzopoulos

2008-01-01

382

Social practice of Camera Phone in Japan  

Microsoft Academic Search

The camera phone makes it possible to take and share pictures of the stream of people, places, pets and objects in the flow of everyday life. The work reported here is the emergent practice of camera phone use in Japan, providing concrete examples from

Daisuke Okabe

2005-01-01

383

Automatic focus control for facsimile camera  

NASA Technical Reports Server (NTRS)

Focus control performs the function of automatically focusing the facsimile camera throughout the object field being scanned. It does this by determining and adjusting the focus of the imaging sensor accordingly. Since a facsimile camera images a scene by scanning discrete strips, it is possible to have the entire three-dimensional scene in perfect focus at the point of imaging by use of focus control.

Sinclair, A. R.; Katzberg, S. J.; Burcher, E. E.

1973-01-01

384

Creating and Using a Camera Obscura  

ERIC Educational Resources Information Center

The camera obscura (Latin for "darkened room") is the earliest optical device and goes back over 2500 years. The small pinhole or lens at the front of the room allows light to enter and this is then "projected" onto a screen inside the room. This differs from a camera, which projects its image onto light-sensitive material. Originally images were…

Quinnell, Justin

2012-01-01

385

Holography cameras for art and education  

NASA Astrophysics Data System (ADS)

Holographers expect that holography might become the photography of tomorrow. At present, however, holography is not yet popular because of the difficulty of the hologram-making process. In my experience teaching holography at an art university, good results were obtained by using a holography camera instead of a conventional optical kit. This paper outlines the holo-camera and its use.

Ishikawa, Jun

1995-04-01

386

Digital video camera workshop Sony VX2000  

E-print Network

Workshop slides for the Sony VX2000 and Sony DSR-PDX10 digital video cameras, covering borrowing eligibility, the MiniDV tape format (60 min SP / 90 min LP), camcorder kits with FireWire, and basic camera operation such as installing the battery in the VX2000.

387

Wand-based Multiple Camera Studio Calibration  

Microsoft Academic Search

To meet the demands of the many emerging multiple camera studio systems in entertainment content production, a novel wand-based system is presented for calibration of both intrinsic (focal length, lens distortion) and extrinsic (position, orientation) parameters of multiple cameras. Full metric calibration is obtained solely from observations of a wand comprising two visible markers at a known fixed distance. It

Joel Mitchelson; Adrian Hilton

2003-01-01

388

Visual Servoing from Spheres with Paracatadioptric Cameras  

E-print Network

A paracatadioptric camera consists of the coupling of a parabolic mirror with a telecentric lens, which realizes an orthographic projection onto the image sensor. This type of camera provides

Paris-Sud XI, Université de

389

Flat-field spectrograph camera designs.  

PubMed

The final designs for the short-focus spectrographic cameras for the 213-cm Kitt Peak reflector are here defined. Duplicates of some of these cameras will be used also on the 152-cm telescope of the Cerro Tololo Inter-American Observatory in Chile. PMID:20048874

Schulte, D H

1966-03-01

390

Solid State Replacement of Rotating Mirror Cameras  

SciTech Connect

Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to more than a hundred frames. Electron tube gated cameras dominate the sub-microsecond regime but are frame-count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all solid state architecture, dubbed "In-situ Storage Image Sensor" or "ISIS" by Prof. Goji Etoh, has made its first appearance in the market and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluation of the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, though there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

Frank, A M; Bartolick, J M

2006-08-25

391

Rotating drum cameras for high speed photography  

Microsoft Academic Search

The operation of a rotating drum, rotating mirror framing camera is based on an optical shuttering technique eliminating complex mechanisms and improving reliability. Such cameras offer continuous access capability, constant recording rates, economy of operation, and the rapid sequencing, processing, and analysis of shots. Applications include investigations in cloud physics, holography, observing failure mechanisms in high-speed machinery, ballistics, gas combustion,

G. R. van Horn

1977-01-01

392

Cameras Monitor Spacecraft Integrity to Prevent Failures  

NASA Technical Reports Server (NTRS)

The Jet Propulsion Laboratory contracted Malin Space Science Systems Inc. to outfit Curiosity with four of its cameras using the latest commercial imaging technology. The company parlayed the knowledge gained while working with NASA to develop an off-the-shelf line of cameras, along with a digital video recorder, designed to help troubleshoot problems that may arise on satellites in space.

2014-01-01

393

An interpretation of the narrow positron annihilation feature from X-ray nova Muscae 1991  

NASA Technical Reports Server (NTRS)

The physical mechanism responsible for the narrow redshifted positron annihilation gamma-ray line from the X-ray nova Muscae 1991 is studied. The orbital inclination angle of the system is estimated and its black hole mass is constrained under the assumptions that the annihilation line centroid redshift is purely gravitational and that the line width is due to the combined effect of temperature broadening and disk rotation. The large black hole mass lower limit of 8 solar masses and the high binary mass ratio it implies raise a serious challenge to theoretical models of the formation and evolution of massive binaries.

Chen, Wan; Gehrels, Neil; Cheng, F. H.

1993-01-01

394

Opposed bubbly jets at different impact angles: Jet structure and bubble properties  

Microsoft Academic Search

The structure of two colliding water jets containing small gas bubbles is studied experimentally. The effects of the separation distance between jets, as well as the orientation angle, on the spatial distribution of bubbles have been considered. Results on the global structure of the final jet and bubble properties have been obtained using a high-speed video camera, and measurements of

Francesc Suñol; Ricard González-Cinca

2010-01-01

395

A novel digital shearography with wide angle of view for nondestructive inspection  

NASA Astrophysics Data System (ADS)

Digital shearography is widely accepted in non-destructive inspection of honeycomb sandwich structures due to its advantages of validity, non-contact operation, simple setup and robustness. In digital shearography, the Michelson shear interferometer (MSI) is the dominant shearing device because it makes it easy to change the shearing amount and direction. However, conventional digital shearography based on the MSI suffers from a small angle of view, which limits its use for full-field inspection of a large sample at a short working distance. A novel digital shearography with a wide angle of view is introduced in this paper to overcome this disadvantage. In the new optical arrangement, the imaging lens is separated from the camera and located at the front of the system. A 4f imaging system is used to transmit the image of the object from the imaging lens to the camera. The shearing device, the MSI, is located between the imaging lens and the camera. The angle of view of this shearography is not limited by the setup; it depends on several parameters, such as the focal length of the imaging lens and the size of the imaging device inside the camera. Thus a wide angle of view can easily be achieved by changing those parameters. Using this novel digital shearography, full-field inspection of large honeycomb sandwich structures can be rapidly conducted at a short working distance.

Wu, Sijin; Yang, Lianxiang

2010-12-01

396

A novel digital shearography with wide angle of view for nondestructive inspection  

NASA Astrophysics Data System (ADS)

Digital shearography is widely accepted in non-destructive inspection of honeycomb sandwich structures due to its advantages of validity, non-contact operation, simple setup and robustness. In digital shearography, the Michelson shear interferometer (MSI) is the dominant shearing device because it makes it easy to change the shearing amount and direction. However, conventional digital shearography based on the MSI suffers from a small angle of view, which limits its use for full-field inspection of a large sample at a short working distance. A novel digital shearography with a wide angle of view is introduced in this paper to overcome this disadvantage. In the new optical arrangement, the imaging lens is separated from the camera and located at the front of the system. A 4f imaging system is used to transmit the image of the object from the imaging lens to the camera. The shearing device, the MSI, is located between the imaging lens and the camera. The angle of view of this shearography is not limited by the setup; it depends on several parameters, such as the focal length of the imaging lens and the size of the imaging device inside the camera. Thus a wide angle of view can easily be achieved by changing those parameters. Using this novel digital shearography, full-field inspection of large honeycomb sandwich structures can be rapidly conducted at a short working distance.

Wu, Sijin; Yang, Lianxiang

2011-05-01

397

ARNICA, the Arcetri Near-Infrared Camera  

NASA Astrophysics Data System (ADS)

ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the Infrared Telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at the TIRGO (December, 1992 to December 1993), and an observing run at the William Herschel Telescope, Canary Islands (December, 1993). System performance is defined in terms of efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources. (SECTION: Astronomical Instrumentation)

Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.

1996-04-01

398

The QUEST Large Area CCD Camera  

E-print Network

We have designed, constructed and put into operation a very large area CCD camera that covers the field of view of the 1.2 m Samuel Oschin Schmidt Telescope at the Palomar Observatory. The camera consists of 112 CCDs arranged in a mosaic of four rows with 28 CCDs each. The CCDs are 600 x 2400 pixel Sarnoff thinned, back illuminated devices with 13 um x 13 um pixels. The camera covers an area of 4.6 deg x 3.6 deg on the sky with an active area of 9.6 square degrees. This camera has been installed at the prime focus of the telescope, commissioned, and scientific quality observations on the Palomar-QUEST Variability Sky Survey were started in September of 2003. The design considerations, construction features, and performance parameters of this camera are described in this paper.

Charlie Baltay; David Rabinowitz; Peter Andrews; Anne Bauer; Nancy Ellman; William Emmet; Rebecca Hudson; Thomas Hurteau; Jonathan Jerke; Rochelle Lauer; Julia Silge; Andrew Szymkowiak; Brice Adams; Mark Gebhard; James Musser; Michael Doyle; Harold Petrie; Roger Smith; Robert Thicksten; John Geary

2007-02-21

399

Architecture of PAU survey camera readout electronics  

NASA Astrophysics Data System (ADS)

PAUCam is a new camera for studying the physics of the accelerating universe. The camera will consist of eighteen 2Kx4K HPK CCDs: sixteen for science and two for guiding. The camera will be installed at the prime focus of the WHT (William Herschel Telescope). In this contribution, the architecture of the readout electronics system is presented. Back-End and Front-End electronics are described. The Back-End consists of clock, bias and video processing boards, mounted on Monsoon crates. The Front-End is based on patch panel boards. These boards are plugged outside the camera feed-through panel for signal distribution. Inside the camera, individual preamplifier boards plus kapton cables complete the path to connect to each CCD. The overall signal distribution and grounding scheme is shown in this paper.

Castilla, Javier; Cardiel-Sas, Laia; De Vicente, Juan; Illa, Joseph; Jimenez, Jorge; Maiorino, Marino; Martinez, Gustavo

2012-07-01

400

VLSI-distributed architectures for smart cameras  

NASA Astrophysics Data System (ADS)

Smart cameras use video/image processing algorithms to capture images as objects, not as pixels. This paper describes architectures for smart cameras that take advantage of VLSI to improve the capabilities and performance of smart camera systems. Advances in VLSI technology aid in the development of smart cameras in two ways. First, VLSI allows us to integrate large amounts of processing power and memory along with image sensors. CMOS sensors are rapidly improving in performance, allowing us to integrate sensors, logic, and memory on the same chip. As we become able to build chips with hundreds of millions of transistors, we will be able to include powerful multiprocessors on the same chip as the image sensors. We call these image sensor/multiprocessor systems image processors. Second, VLSI allows us to put a large number of these powerful sensor/processor systems in a single scene. VLSI factories will produce large quantities of these image processors, making it cost-effective to use a large number of them in a single location. Image processors will be networked into distributed cameras that use many sensors as well as the full computational resources of all the available multiprocessors. Multiple cameras make a number of image recognition tasks easier: we can select the best view of an object, eliminate occlusions, and use 3D information to improve the accuracy of object recognition. This paper outlines approaches to distributed camera design: architectures for image processors and distributed cameras, algorithms to run on distributed smart cameras, and applications of VLSI distributed camera systems.

Wolf, Wayne H.

2001-03-01

401

Development of high-speed video cameras  

NASA Astrophysics Data System (ADS)

Presented in this paper is an outline of the R&D activities on high-speed video cameras that have been carried out at Kinki University for more than ten years and are currently proceeding as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been done, (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and hearings, and (2) on the current availability of cameras of this sort, by searches of journals and websites. Both support the need to develop a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996. The sensor is the same one as developed for the previous camera. The frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a video camera of 1 million fps with an ISIS (In-situ Storage Image Sensor) was first proposed in 1993 and has been continuously improved. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Currently, design of a prototype ISIS is going on and it will, hopefully, be fabricated in the near future. Epoch-making cameras developed by others in the history of high-speed video cameras are also briefly reviewed.

Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

2001-04-01

402

Measures on Mixing Angles  

E-print Network

We address the problem of the apparently very small magnitude of CP violation in the standard model, measured by the Jarlskog invariant J. In order to make statements about probabilities for certain values of J, we seek to find a natural measure on the space of Kobayashi-Maskawa matrices, the double quotient U(1)^2\SU(3)/U(1)^2. We review several possible, geometrically motivated choices of the measure, and compute expectation values for powers of J for these measures. We find that different choices of the measure generically make the observed magnitude of CP violation appear finely tuned. Since the quark masses and the mixing angles are determined by the same set of Yukawa couplings, we then do a second calculation in which we take the known quark mass hierarchy into account. We construct the simplest measure on the space of 3 x 3 Hermitian matrices which reproduces this known hierarchy. Calculating expectation values for powers of J in this second approach, we find that values of J close to the observed val...
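As a rough illustration of the kind of computation described above, the sketch below estimates expectation values of even powers of the Jarlskog invariant J by Monte Carlo sampling of Haar-random 3x3 unitary matrices, one natural geometrically motivated choice of measure; the paper's specific measures on the double quotient and its hierarchy-weighted measure are not reproduced here, and the function names and sample size are illustrative assumptions.

    import numpy as np

    def random_unitary(n, rng):
        # Haar-distributed unitary via QR of a complex Ginibre matrix,
        # with the usual phase fix on the diagonal of R.
        z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
        q, r = np.linalg.qr(z)
        d = np.diag(r)
        return q * (d / np.abs(d))

    def jarlskog(v):
        # J = Im(V_us V_cb V_ub* V_cs*) for a 3x3 mixing matrix V.
        return np.imag(v[0, 1] * v[1, 2] * np.conj(v[0, 2]) * np.conj(v[1, 1]))

    rng = np.random.default_rng(0)
    samples = np.array([jarlskog(random_unitary(3, rng)) for _ in range(50_000)])
    for k in (2, 4):
        print(f"<J^{k}> ~ {np.mean(samples ** k):.3e}")

Because J is invariant under the U(1) rephasings of rows and columns, sampling the full unitary group rather than the double quotient leaves the distribution of J unchanged.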

Gibbons, Gary W; Pope, C N; Turok, Neil

2008-01-01

403

Cerebellopontine Angle Epidermoids  

PubMed Central

Epidermoids, or congenital cholesteatomas, constitute about 0.2% to 1.5% of intracranial tumors, and 3% to 5% of tumors of the cerebellopontine angle (CPA). We review the surgical management of CPA epidermoids in 13 patients at the House Ear Clinic for the years 1978 to 1993. There were seven male and six female patients, ranging in age from 27 to 59 years (average, 40 years). Tumors ranged in size from 3.5 cm to 7.0 cm, and the surgical approach was tailored to the tumor extent and location. All patients complained at presentation of unilateral hearing loss, and nine had poor speech discrimination (less than 50%) preoperatively. Serviceable hearing was preserved in two patients. Two patients presented with facial nerve symptoms, and four cases had postoperative permanent facial nerve paralysis (House-Brackmann Grade V or VI). There were no surgical deaths. Four patients required second surgeries to remove residual cholesteatoma. Compared with prior series, we describe a higher rate of total tumor removal, as well as a higher rate of second operations, indicating a more aggressive approach to these lesions. PMID:17170950

Doyle, Karen Jo; De la Cruz, Antonio

1996-01-01

404

Tunable pulsed narrow bandwidth light source  

DOEpatents

A tunable pulsed narrow bandwidth light source and a method of operating a light source are provided. The light source includes a pump laser, first and second non-linear optical crystals, a tunable filter, and light pulse directing optics. The method includes the steps of operating the pump laser to generate a pulsed pump beam characterized by a nanosecond pulse duration and arranging the light pulse directing optics so as to (i) split the pulsed pump beam into primary and secondary pump beams; (ii) direct the primary pump beam through an input face of the first non-linear optical crystal such that a primary output beam exits from an output face of the first non-linear optical crystal; (iii) direct the primary output beam through the tunable filter to generate a sculpted seed beam; and (iv) direct the sculpted seed beam and the secondary pump beam through an input face of the second non-linear optical crystal such that a secondary output beam characterized by at least one spectral bandwidth on the order of about 0.1 cm^-1 and below exits from an output face of the second non-linear optical crystal.

Powers, Peter E. (Dayton, OH); Kulp, Thomas J. (Livermore, CA)

2002-01-01

405

History of the Tacoma Narrows Bridge  

NSDL National Science Digital Library

The Department of Special Collections at the University of Washington has created an excellent online exhibit documenting the rise and (literal) fall of the Tacoma Narrows bridge in Washington State, an event referred to as the Pearl Harbor of engineering. The massive structure was built between 1938 and 1940 and, at the time of its completion, was the third longest suspension bridge in the world. The bridge displayed some notable wavelike motions during the final stages of construction, but no one was prepared for what happened on November 7, 1940, when the entire structure began to buckle, and shortly collapsed into the water below. Amazingly, the only fatality was a dog that was trapped in one of the vehicles on the main span of the bridge. The online exhibit documents this amazing event, with numerous photographs of the bridge under construction, and most incredibly, dramatic shots of the bridge buckling and its fall taken by several bystanders. This exhibit will be of particular interest to engineers, particularly those working in the field of bridge construction.

1999-01-01

406

Be Foil "Filter Knee Imaging" NSTX Plasma with Fast Soft X-ray Camera  

SciTech Connect

A fast soft x-ray (SXR) pinhole camera has been implemented on the National Spherical Torus Experiment (NSTX). This paper presents observations and describes the Be foil Filter Knee Imaging (FKI) technique for reconstructions of a m/n=1/1 mode on NSTX. The SXR camera has a wide-angle (28°) field of view of the plasma. The camera images nearly the entire diameter of the plasma and a comparable region in the vertical direction. SXR photons pass through a beryllium foil and are imaged by a pinhole onto a P47 scintillator deposited on a fiber optic faceplate. An electrostatic image intensifier demagnifies the visible image by 6:1 to match it to the size of the charge-coupled device (CCD) chip. A pair of lenses couples the image to the CCD chip.

B.C. Stratton; S. von Goeler; D. Stutman; K. Tritz; L.E. Zakharov

2005-08-08

407

Performance characterization of commercially available uncooled micro-bolometer thermal cameras for varying camera temperatures  

NASA Astrophysics Data System (ADS)

Uncooled infrared (IR) microbolometer cameras are gaining popularity in a variety of military and commercial applications due to their simplicity, compactness and reduced cost when compared to photon detectors. Three commercially available IR microbolometer cameras have been investigated for use in a system. The cameras have been characterized in terms of camera response and noise as a function of camera temperature, with the aim of modelling the cameras for use in simulation. Ideally, the camera systems, consisting of a detector, electronics, and optics, should be modelled from a low-level physical point of view and measurements should be performed for verification. However, the detector and electronic design parameters are not available for the commercially acquired cameras, and a black-box approach to the systems was adopted for modelling and characterization. The black-box approach entails empirical mathematical modelling of the camera response and noise through measurements and subsequent data analysis. A 3D noise model was employed to characterize camera noise in terms of orthogonal noise components, and an empirical temperature-dependent model was deduced for each component. The method of modelling through measurement is discussed, and the accuracy of specifically the empirical noise models is shown. The cameras are also compared in terms of measured noise performance.
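A black-box noise characterization of this kind can be sketched in a few lines. The snippet below splits a stack of frames into a temporal and a spatial (fixed-pattern) noise estimate; it is a simplified two-component stand-in for the full 3D noise decomposition, and the synthetic data and numbers are assumptions for illustration only.

    import numpy as np

    def simple_noise_split(frames):
        """Split a (T, H, W) stack of frames into a temporal and a spatial
        (fixed-pattern) noise estimate; a simplified stand-in for the full
        seven-component 3D noise decomposition."""
        frames = np.asarray(frames, dtype=np.float64)
        per_pixel_mean = frames.mean(axis=0)          # average over time
        sigma_temporal = frames.std(axis=0).mean()    # mean temporal std per pixel
        sigma_spatial = per_pixel_mean.std()          # fixed-pattern (spatial) std
        return sigma_temporal, sigma_spatial

    # Example with synthetic data: 50 frames of 64x64 pixels.
    rng = np.random.default_rng(1)
    fixed_pattern = rng.normal(0.0, 2.0, size=(64, 64))            # per-pixel offsets
    stack = 100.0 + fixed_pattern + rng.normal(0.0, 5.0, size=(50, 64, 64))
    print(simple_noise_split(stack))   # roughly (5.0, 2.0)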

Minnaar, I. J.

2014-06-01

408

Optimum Projection Angle for Attaining Maximum Distance in a Rugby Place Kick  

PubMed Central

This study investigated the effect of projection angle on the distance attained in a rugby place kick. A male rugby player performed 49 maximum-effort kicks using projection angles of between 20 and 50°. The kicks were recorded by a video camera at 50 Hz and a 2 D biomechanical analysis was conducted to obtain measures of the projection velocity and projection angle of the ball. The player’s optimum projection angle was calculated by substituting a mathematical expression for the relationship between projection velocity and projection angle into the equations for the aerodynamic flight of a rugby ball. We found that the player’s calculated optimum projection angle (30.6°, 95% confidence limits ± 1.9°) was in close agreement with his preferred projection angle (mean value 30.8°, 95% confidence limits ± 2.1°). The player’s calculated optimum projection angle was also similar to projection angles previously reported for skilled rugby players. The optimum projection angle in a rugby place kick is considerably less than 45° because the projection velocity that a player can produce decreases substantially as projection angle is increased. Aerodynamic forces and the requirement to clear the crossbar have little effect on the optimum projection angle. Key Points The optimum projection angle in a rugby place kick is about 30°. The optimum projection angle is considerably less than 45° because the projection velocity that a player can produce decreases substantially as projection angle is increased. Aerodynamic forces and the requirement to clear the crossbar have little effect on the optimum projection angle. PMID:24570626
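The optimisation described above can be reproduced in miniature: assume a (hypothetical) linear decrease of launch speed with projection angle and maximise the vacuum range, which is roughly what remains once aerodynamics and crossbar clearance are neglected (the abstract notes they have little effect). The constants v0 and k below are illustrative stand-ins for the player's measured velocity-angle relationship, not values from the study.

    import numpy as np

    g = 9.81  # m/s^2

    def launch_speed(theta_deg, v0=26.0, k=0.20):
        # Hypothetical stand-in for the measured velocity-angle relation:
        # launch speed drops roughly linearly as the projection angle increases.
        return v0 - k * theta_deg

    def flight_range(theta_deg):
        v = launch_speed(theta_deg)
        theta = np.radians(theta_deg)
        return v**2 * np.sin(2 * theta) / g   # vacuum range on level ground

    angles = np.linspace(20.0, 50.0, 301)
    best = angles[np.argmax(flight_range(angles))]
    print(f"optimum projection angle ~ {best:.1f} deg")   # ~30 deg, well below 45 deg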

Linthorne, Nicholas P.; Stokes, Thomas G.

2014-01-01

409

Generalization of the Euler Angles  

NASA Technical Reports Server (NTRS)

It is shown that the Euler angles can be generalized to axes other than members of an orthonormal triad. As first shown by Davenport, the three generalized Euler axes, hereafter: Davenport axes, must still satisfy the constraint that the first two and the last two axes be mutually perpendicular if these axes are to define a universal set of attitude parameters. Expressions are given which relate the generalized Euler angles, hereafter: Davenport angles, to the 3-1-3 Euler angles of an associated direction-cosine matrix. The computation of the Davenport angles from the attitude matrix and their kinematic equation are presented. The present work offers a more direct development of the Davenport angles than Davenport's original publication and offers additional results.
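For reference, the sketch below extracts the classical 3-1-3 Euler angles from a direction-cosine matrix, the representation the Davenport angles are related to in the abstract. It assumes the attitude-matrix convention A = R3(psi) R1(theta) R3(phi); the function names are illustrative.

    import numpy as np

    def R1(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[1, 0, 0], [0, c, s], [0, -s, c]])

    def R3(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, s, 0], [-s, c, 0], [0, 0, 1]])

    def euler313_from_dcm(A):
        # A = R3(psi) @ R1(theta) @ R3(phi)  (attitude-matrix convention)
        theta = np.arccos(np.clip(A[2, 2], -1.0, 1.0))
        phi = np.arctan2(A[2, 0], -A[2, 1])
        psi = np.arctan2(A[0, 2], A[1, 2])
        return phi, theta, psi

    phi, theta, psi = 0.3, 0.7, -1.1
    A = R3(psi) @ R1(theta) @ R3(phi)
    print(euler313_from_dcm(A))   # recovers (0.3, 0.7, -1.1)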

Bauer, Frank H. (Technical Monitor); Shuster, Malcolm D.; Markley, F. Landis

2002-01-01

410

Axial-Cones: Modeling Spherical Catadioptric Cameras for Wide-Angle Light Field Rendering  

E-print Network

Figure panels: Setup, Captured Photo, Focus Back (Aliasing), Focus Ball (Anti-Aliased), All-in-Focus. The system enables wide-FOV digital refocusing and dense depth estimation using an array of spherical mirrors. The depth map is used for aliasing removal, surface-dependent refocusing, and all-in-focus rendering. (With Ramesh Raskar, MIT Media Lab.)

Agrawal, Amit

411

Performing fish counts with a wide-angle camera, a promising approach reducing divers' limitations  

E-print Network

to physiological effects related to SCUBA diving (Baddeley, 1971), the effect of ocean waves and currents (Harmelin, 2012)

Teixeira, Sara

412

3/9/2007 Mohanty 1 A Secure Digital Camera (SDC) forA Secure Digital Camera (SDC) for  

E-print Network

Presentation slides (3/9/2007): A Secure Digital Camera (SDC) for real-time DRM; the proposed SDC and a low-power watermarking chip for the SDC.

Mohanty, Saraju P.

413

3/20/2007 Mohanty 1 A Secure Digital Camera (SDC) forA Secure Digital Camera (SDC) for  

E-print Network

Presentation slides (3/20/2007): A Secure Digital Camera (SDC) for real-time DRM; the proposed SDC and a low-power watermarking chip for the SDC.

Mohanty, Saraju P.

414

Cooperative resonance linewidth narrowing in a planar metamaterial  

E-print Network

We theoretically analyze the experimental observations of a spectral line collapse in a metamaterial array of asymmetric split ring resonators [Fedotov et al., Phys. Rev. Lett. 104, 223901 (2010)]. We show that the ensemble of closely-spaced resonators exhibits cooperative response, explaining the observed system-size dependent narrowing of the transmission resonance linewidth. We further show that this cooperative narrowing depends sensitively on the lattice spacing and that significantly stronger narrowing could be achieved in media with suppressed ohmic losses.

S. D. Jenkins; J. Ruostekoski

2011-06-28

415

8.G Find the Angle  

NSDL National Science Digital Library

This is a task from the Illustrative Mathematics website that is one part of a complete illustration of the standard to which it is aligned. Each task has at least one solution and some commentary that addresses important aspects of the task and its potential use. Here are the first few lines of the commentary for this task: In triangle $\Delta ABC$, point $M$ is the point of intersection of the bisectors of angles $\angle BAC$, $\angle ABC$, and $\angle ACB$. The measure o...

416

The Critical Angle Can Override the Brewster Angle  

ERIC Educational Resources Information Center

As a culminating activity in their study of optics, my students investigate polarized light and the Brewster angle. In this exercise they encounter a situation in which it is impossible to measure the Brewster angle for light reflecting from a particular surface. This paper describes the activity and explains the students' observations.

Froehle, Peter H.

2009-01-01

417

Performance Analysis of a Low-Cost Triangulation-Based 3d Camera: Microsoft Kinect System  

NASA Astrophysics Data System (ADS)

Recent technological advancements have made active imaging sensors popular for 3D modelling and motion tracking. The 3D coordinates of signalised targets are traditionally estimated by matching conjugate points in overlapping images. Current 3D cameras can acquire point clouds at video frame rates from a single exposure station. In the area of 3D cameras, Microsoft and PrimeSense have collaborated and developed an active 3D camera based on the triangulation principle, known as the Kinect system. This off-the-shelf system costs less than 150 USD and has drawn a lot of attention from the robotics, computer vision, and photogrammetry disciplines. In this paper, the prospect of using the Kinect system for precise engineering applications was evaluated. The geometric quality of the Kinect system as a function of the scene (i.e. variation of depth, ambient light conditions, incidence angle, and object reflectivity) and the sensor (i.e. warm-up time and distance averaging) were analysed quantitatively. This system's potential in human body measurements was tested against a laser scanner and 3D range camera. A new calibration model for simultaneously determining the exterior orientation parameters, interior orientation parameters, boresight angles, leverarm, and object space features parameters was developed and the effectiveness of this calibration approach was explored.

Chow, J. C. K.; Ang, K. D.; Lichti, D. D.; Teskey, W. F.

2012-07-01

418

SLAM-Based Automatic Extrinsic Calibration of a Multi-Camera Rig Gerardo Carrera, Adrien Angeli and Andrew J. Davison  

E-print Network

cameras with special optics such as fish-eye lenses or catadioptric ... in robotics, however, since ongoing progress in computer vision indicates that the detailed photometric ... of highly separated angles cannot be simultaneous, and the robot loses the 'eyes in the back of its head'

Davison, Andrew

419

Correction of dark current in consumer cameras  

NASA Astrophysics Data System (ADS)

A study of dark current in digital imagers in digital single-lens reflex (DSLR) and compact consumer-grade digital cameras is presented. Dark current is shown to vary with temperature, exposure time, and ISO setting. Further, dark current is shown to increase in successive images during a series of images. DSLR and compact consumer cameras are often designed such that they are contained within a densely packed camera body, and therefore the digital imagers within the camera frame are prone to heat generated by the sensor as well as nearby elements within the camera body. It is the scope of this work to characterize the dark current in such cameras and to show that the dark current, in part due to heat generated by the camera itself, can be corrected by using hot pixels on the imager. This method generates computed dark frames based on the dark current indicator value of the hottest pixels on the chip. We compare this method to standard methods of dark current correction.
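The correction idea, as described, can be sketched as follows: scale a master dark frame so that its hot pixels match the hot-pixel levels seen in the current exposure, then subtract. The linear-scaling rule and all names below are assumptions for illustration, not the authors' exact algorithm.

    import numpy as np

    def computed_dark_frame(master_dark, current_frame, hot_mask):
        """Scale a master dark frame so that its hot pixels match the hot-pixel
        levels observed in the current exposure (assumed linear scaling)."""
        hot_now = current_frame[hot_mask].astype(np.float64)
        hot_ref = master_dark[hot_mask].astype(np.float64)
        scale = np.median(hot_now / np.maximum(hot_ref, 1e-9))
        return scale * master_dark

    # Usage sketch with synthetic data.
    rng = np.random.default_rng(2)
    master = rng.gamma(2.0, 5.0, size=(512, 512))
    hot_mask = master > np.percentile(master, 99.9)              # "hottest" pixels
    frame = 1.8 * master + rng.normal(0, 2, size=master.shape)   # warmer sensor
    corrected = frame - computed_dark_frame(master, frame, hot_mask)
    print(np.mean(np.abs(corrected)))   # small residual after correction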

Dunlap, Justin C.; Bodegom, Erik; Widenhorn, Ralf

2010-01-01

420

Narrow Escape, Part II: The circular disk  

E-print Network

We consider Brownian motion in a circular disk $\Omega$, whose boundary $\partial\Omega$ is reflecting, except for a small arc, $\partial\Omega_a$, which is absorbing. As $\epsilon=|\partial\Omega_a|/|\partial\Omega|$ decreases to zero the mean time to absorption in $\partial\Omega_a$, denoted $E\tau$, becomes infinite. The narrow escape problem is to find an asymptotic expansion of $E\tau$ for $\epsilon\ll 1$. We find the first two terms in the expansion and an estimate of the error. The results are extended in a straightforward manner to planar domains and two-dimensional Riemannian manifolds that can be mapped conformally onto the disk. Our results improve the previously derived expansion for a general smooth domain, $E\tau = \frac{|\Omega|}{D\pi}\left[\log\frac{1}{\epsilon}+O(1)\right]$ ($D$ is the diffusion coefficient), in the case of a circular disk. We find that the mean first passage time from the center of the disk is $E[\tau \mid x(0)=0]=\frac{R^2}{D}\left[\log\frac{1}{\epsilon} + \log 2 + \frac{1}{4} + O(\epsilon)\right]$. The second term in the expansion is needed in real life applications, such as trafficking of receptors on neuronal spines, because $\log\frac{1}{\epsilon}$ is not necessarily large, even when $\epsilon$ is small. We also find the singular behavior of the probability flux profile into $\partial\Omega_a$ at the endpoints of $\partial\Omega_a$, and find the value of the flux near the center of the window.
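The practical point about the second term can be checked numerically: for moderate $\epsilon$, $\log(1/\epsilon)$ is only a few times larger than $\log 2 + 1/4$, so the correction matters. A minimal sketch, using the expansion quoted above with R and D set to 1:

    import numpy as np

    def narrow_escape_mfpt_center(R, D, eps):
        # Two-term expansion for escape from the disk center:
        # E[tau | x(0)=0] = (R^2/D) * (log(1/eps) + log(2) + 1/4) + O(eps)
        return (R**2 / D) * (np.log(1.0 / eps) + np.log(2.0) + 0.25)

    def leading_order(R, D, eps):
        # One-term expansion |Omega|/(D*pi) * log(1/eps), with |Omega| = pi R^2.
        return (R**2 / D) * np.log(1.0 / eps)

    R, D = 1.0, 1.0
    for eps in (0.1, 0.03, 0.01):
        print(eps, leading_order(R, D, eps), narrow_escape_mfpt_center(R, D, eps))

At eps = 0.1 the two-term value exceeds the leading-order estimate by roughly 40%, which is the abstract's point about the second term.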

A. Singer; Z. Schuss; D. Holcman

2004-12-15

421

Mitsubishi Electric Research Labs (MERL) Computational Cameras Amit Agrawal  

E-print Network

Presentation slides: Computational Cameras: Exploiting Spatial-Angular-Temporal Tradeoffs in Photography. Amit Agrawal, Mitsubishi Electric Research Labs (MERL), Cambridge, MA, USA.

Agrawal, Amit

422

Deviation from Snell's law for beams transmitted near the critical angle: application to microcavity lasers  

Microsoft Academic Search

We show that when a narrow beam is incident upon a dielectric interface near the critical angle for total internal reflection it will be transmitted into the far-field with an angular deflection from the direction predicted by Snell's Law, due to a phenomenon we call "Fresnel filtering."

H. E. Tureci; A. Douglas Stone

2002-01-01

423

Deviation from Snell's law for beams transmitted near the critical angle: application to microcavity lasers  

Microsoft Academic Search

We show that when a narrow beam is incident upon a dielectric interface near the critical angle for total internal reflection it will be transmitted into the far field with an angular deflection from the direction predicted by Snell's law, because of a phenomenon that we call ``Fresnel filtering.'' This effect can be quite large for the parameter range that

H. E. Tureci; A. Douglas Stone

2002-01-01

424

Deviation from Snell's Law for Beams Transmitted Near the Critical Angle: Application to Microcavity Lasers  

E-print Network

We show that when a narrow beam is incident upon a dielectric interface near the critical angle for total internal reflection it will be transmitted into the far-field with an angular deflection from the direction predicted by Snell's Law, due to a phenomenon we call "Fresnel Filtering". This effect can be quite large for the parameter range relevant to dielectric microcavity lasers.
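For context, the geometric-optics baseline these papers correct is plain Snell's law, under which the transmitted direction swings toward 90° as the (internal) incidence angle approaches the critical angle; the Fresnel-filtering deflection of a narrow beam is an additional effect not computed here. The refractive index below is an assumed example value.

    import numpy as np

    n = 1.5                                    # refractive index inside the dielectric (assumed)
    theta_c = np.degrees(np.arcsin(1.0 / n))   # critical angle ~ 41.8 deg

    for theta_i in (30.0, 38.0, 41.0, 41.7):   # internal incidence angles in degrees
        s = n * np.sin(np.radians(theta_i))
        theta_t = np.degrees(np.arcsin(s))     # Snell's-law transmission angle
        print(f"{theta_i:5.1f} deg -> transmitted at {theta_t:5.1f} deg")
    print(f"critical angle = {theta_c:.1f} deg")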

H. E. Tureci; A. D. Stone

2001-09-24

425

Radionuclide annular single crystal scintillator camera with rotating collimator  

SciTech Connect

A radionuclide emission tomography camera is described for sensing gamma ray emissions from a source within the field of view consisting of: a fixed, position-sensitive detector means, responsive to the gamma ray emissions and surrounding the field of view for detecting the contact position and the trajectory from which a gamma ray emission originates, the fixed, position-sensitive detector including a single continuous stationary scintillation crystal; rotatable collimator means, disposed between the fixed, position-sensitive detector means and the field of view, and including at least one array of collimator elements, for restricting and collimating the gamma ray emissions; and means for rotating the collimator means relative to the fixed, position-sensitive detector, for exposing different sections of the position-sensitive detector to the gamma ray emissions in order to view the source from different angles.

Genna, S.; Pang, S.-C.

1986-04-22

426

Modified plenoptic camera for phase and amplitude wavefront sensing  

NASA Astrophysics Data System (ADS)

Shack-Hartmann sensors have been widely applied in wavefront sensing. However, they are limited to measuring slightly distorted wavefronts whose local tilt doesn't surpass the numerical aperture of the micro-lens array, and cross-talk of incident waves on the micro-lens array must be strictly avoided. In medium to strong turbulence cases of optical communication, where large jitter in the angle of arrival and local interference caused by break-up of the beam are common phenomena, Shack-Hartmann sensors no longer serve as effective tools in revealing distortions in a signal wave. Our design of a modified plenoptic camera shows great potential in observing and extracting useful information from severely disturbed wavefronts. Furthermore, by separating complex interference patterns into several minor interference cases, it may also be capable of telling the regional phase difference of coherently illuminated objects.

Wu, Chensheng; Davis, Christopher C.

2013-09-01

427

The Atlases of Vesta derived from Dawn Framing Camera images  

NASA Astrophysics Data System (ADS)

The Dawn Framing Camera acquired during its two HAMO (High Altitude Mapping Orbit) phases in 2011 and 2012 about 6,000 clear filter images with a resolution of about 60 m/pixel. We combined these images in a global ortho-rectified mosaic of Vesta (60 m/pixel resolution). Only very small areas near the northern pole were still in darkness and are missing in the mosaic. The Dawn Framing Camera also acquired about 10,000 high-resolution clear filter images (about 20 m/pixel) of Vesta during its Low Altitude Mapping Orbit (LAMO). Unfortunately, the northern part of Vesta was still in darkness during this phase; good illumination (incidence angle < 70°) was only available for 66.8% of the surface [1]. We used the LAMO images to calculate another global mosaic of Vesta, this time with 20 m/pixel resolution. Both global mosaics were used to produce atlases of Vesta: a HAMO atlas with 15 tiles at a scale of 1:500,000 and a LAMO atlas with 30 tiles at a scale between 1:200,000 and 1:225,180. The nomenclature used in these atlases is based on names and places historically associated with the Roman goddess Vesta, and is compliant with the rules of the IAU. 65 names for geological features were already approved by the IAU; 39 additional names are currently under review. Selected examples of both atlases will be shown in this presentation. Reference: [1] Roatsch, Th., et al., High-resolution Vesta Low Altitude Mapping Orbit Atlas derived from Dawn Framing Camera images. Planetary and Space Science (2013), http://dx.doi.org/10.1016/j.pss.2013.06.024i

Roatsch, T.; Kersten, E.; Matz, K.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

2013-12-01

428

Close-range photogrammetry with video cameras  

NASA Technical Reports Server (NTRS)

Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

Burner, A. W.; Snow, W. L.; Goad, W. K.

1985-01-01

429

Close-Range Photogrammetry with Video Cameras  

NASA Technical Reports Server (NTRS)

Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

Burner, A. W.; Snow, W. L.; Goad, W. K.

1983-01-01

430

Optical metrology for the filter set for the Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS)  

Microsoft Academic Search

The Hubble Space Telescope (HST) advanced camera for surveys (ACS) employs a wide variety of spectral filtration components including narrow band, medium band, wide band, and far UV (FUV) long pass filters, spatially-variable filters, VIS/IR polarizers, NUV polarizers, FUV prisms, and a grism. These components are spread across ACS's wide field, high resolution, and solar blind channels which provide diffraction-limited

Douglas B. Leviton; Rene A. Boucarut; Frank D. Bush; Ritva A. Keski-Kuha; Catherine Kral; Carolyn A. Krebs; Timothy J. Madison; Kimberly I. Mehalick; Linda A. Miner; Todd A. Norton; Peter Petrone; Bernard P. Puc; Clive Standley; Zlatan Tsvetanov; Frank Varosi

1998-01-01

431

Clementine High Resolution Camera Mosaicking Project  

NASA Astrophysics Data System (ADS)

This report constitutes the final report for NASA Contract NASW-5054. This project processed Clementine I high resolution images of the Moon, mosaicked these images together, and created a 22-disk set of compact disk read-only memory (CD-ROM) volumes. The mosaics were produced through semi-automated registration and calibration of the high resolution (HiRes) camera's data against the geometrically and photometrically controlled Ultraviolet/Visible (UV/Vis) Basemap Mosaic produced by the US Geological Survey (USGS). The HiRes mosaics were compiled from non-uniformity corrected, 750 nanometer ("D") filter high resolution nadir-looking observations. The images were spatially warped using the sinusoidal equal-area projection at a scale of 20 m/pixel for sub-polar mosaics (below 80 deg. latitude) and using the stereographic projection at a scale of 30 m/pixel for polar mosaics. Only images with emission angles less than approximately 50 were used. Images from non-mapping cross-track slews, which tended to have large SPICE errors, were generally omitted. The locations of the resulting image population were found to be offset from the UV/Vis basemap by up to 13 km (0.4 deg.). Geometric control was taken from the 100 m/pixel global and 150 m/pixel polar USGS Clementine Basemap Mosaics compiled from the 750 nm Ultraviolet/Visible Clementine imaging system. Radiometric calibration was achieved by removing the image nonuniformity dominated by the HiRes system's light intensifier. Also provided are offset and scale factors, achieved by a fit of the HiRes data to the corresponding photometrically calibrated UV/Vis basemap, that approximately transform the 8-bit HiRes data to photometric units. The sub-polar mosaics are divided into tiles that cover approximately 1.75 deg. of latitude and span the longitude range of the mosaicked frames. Images from a given orbit are map projected using the orbit's nominal central latitude. Polar mosaics are tiled into squares 2250 pixels on a side, which spans approximately 2.2 deg. Two mosaics are provided for each pole: one corresponding to data acquired while periapsis was in the south, the other while periapsis was in the north. The CD-ROMs also contain ancillary data files that support the HiRes mosaic. These files include browse images with UV/Vis context stored in a Joint Photographic Experts Group (JPEG) format, index files ('imgindx.tab' and 'srcindx.tab') that tabulate the contents of the CD, and documentation files.
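The sub-polar map geometry mentioned above is the forward sinusoidal (equal-area) projection at 20 m/pixel. A minimal sketch, assuming the mean lunar radius and an arbitrary reference longitude per tile (the mosaics use each orbit's nominal values):

    import numpy as np

    MOON_RADIUS_M = 1_737_400.0     # mean lunar radius (assumed value)
    SCALE_M_PER_PX = 20.0           # sub-polar mosaic scale quoted above

    def sinusoidal_to_pixel(lat_deg, lon_deg, lon0_deg=0.0):
        """Forward sinusoidal (equal-area) projection: lat/lon -> (x, y) in
        pixels at 20 m/pixel. lon0_deg is an assumed per-tile reference
        longitude, not a value taken from the report."""
        lat = np.radians(lat_deg)
        dlon = np.radians(lon_deg - lon0_deg)
        x_m = MOON_RADIUS_M * dlon * np.cos(lat)
        y_m = MOON_RADIUS_M * lat
        return x_m / SCALE_M_PER_PX, y_m / SCALE_M_PER_PX

    print(sinusoidal_to_pixel(10.0, 20.5, lon0_deg=20.0))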

1998-10-01

432

High speed web printing inspection with multiple linear cameras  

NASA Astrophysics Data System (ADS)

Purpose: To detect defects during the high-speed web printing process, such as smudges, doctor streaks, pin holes, character misprints, foreign matter, hazing, wrinkles, etc., which are the main factors affecting the quality of printed work. Methods: A novel machine vision system is used to detect the defects. This system consists of distributed data processing with multiple linear cameras, an effective anti-blooming illumination design and a fast image processing algorithm based on blob searching. Pattern matching adapted to paper tension and snake-moving of the web is also emphasized. Results: Experimental results verify the speed, reliability and accuracy of the proposed system, by which most of the main defects are inspected in real time at a speed of 300 m/min. Conclusions: High-speed quality inspection of large-size webs requires multiple linear cameras to construct a distributed data processing system. The material characteristics of the printed matter should also be considered in the optical design, so that tiny web defects can be inspected with variable angles of illumination.
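The blob-searching step can be sketched as a threshold-and-label operation on the difference between a scanned line image and a golden reference. The library calls, thresholds and the synthetic example below are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from scipy import ndimage

    def find_defect_blobs(scan, reference, diff_threshold=30, min_area=9):
        """Very simplified blob search for print-defect inspection: threshold
        the |scan - reference| difference image and return bounding boxes of
        connected components with at least min_area pixels."""
        diff = np.abs(scan.astype(np.int32) - reference.astype(np.int32))
        mask = diff > diff_threshold
        labels, _ = ndimage.label(mask)
        boxes = []
        for sl in ndimage.find_objects(labels):
            if mask[sl].sum() >= min_area:
                boxes.append((sl[0].start, sl[1].start, sl[0].stop, sl[1].stop))
        return boxes

    # Synthetic example: a clean web image plus one smudge.
    ref = np.full((200, 400), 128, dtype=np.uint8)
    scan = ref.copy()
    scan[50:60, 100:120] = 40     # dark smudge
    print(find_defect_blobs(scan, ref))   # one box around the smudge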

Shi, Hui; Yu, Wenyong

2011-12-01

433

Depth map from focus for cell-phone cameras  

NASA Astrophysics Data System (ADS)

Cell-phone cameras generally use mini lenses that are wide-angle and fixed-focal length (4-6 mm) with a fixed aperture (usually f/2.8). As a result, these mini lenses have very short hyper-focal lengths (e.g., the estimated hyper-focal length for a 3.1-MP cell-phone camera module with a 5.6-mm mini lens is only about 109 cm which covers focused-object distances from about 55 cm to infinity). This combination of optical characteristics can be used effectively to achieve: (a) a faster process for auto-focusing based on a small number of pre-defined non-uniform lens-position intervals; and (b) a depth map generation (coarse or fine depending on the number of focus regions of interest--ROIs) which can be used for different image capture/processing operations such as flash/no-flash decision-making. The above two processes were implemented, tested and validated under different lighting conditions and scene contents.
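The hyperfocal arithmetic quoted above follows from the standard thin-lens relations. In the sketch below the circle of confusion is an assumed value chosen so that a 5.6-mm f/2.8 lens lands near the quoted ~109 cm hyperfocal distance; focusing at that distance then gives sharpness from roughly 55 cm to infinity.

    def hyperfocal_mm(f_mm, f_number, coc_mm):
        # H = f^2 / (N * c) + f
        return f_mm**2 / (f_number * coc_mm) + f_mm

    def dof_limits_mm(s_mm, f_mm, f_number, coc_mm):
        # Approximate thin-lens depth-of-field limits for focus distance s.
        H = hyperfocal_mm(f_mm, f_number, coc_mm)
        near = H * s_mm / (H + (s_mm - f_mm))
        far = H * s_mm / (H - (s_mm - f_mm)) if s_mm < H else float("inf")
        return near, far

    f, N, c = 5.6, 2.8, 0.0103   # focal length (mm), f-number, assumed circle of confusion (mm)
    H = hyperfocal_mm(f, N, c)
    near, far = dof_limits_mm(H, f, N, c)
    print(f"hyperfocal distance ~ {H / 10:.0f} cm")                     # ~109 cm
    print(f"focused at H: sharp from ~{near / 10:.0f} cm to infinity")  # ~55 cm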

Safaee-Rad, R.; Aleksic, M.

2008-02-01

434

New design of a gamma camera detector with reduced edge effect for breast imaging  

NASA Astrophysics Data System (ADS)

In recent years, there has been a growing interest in developing small gamma cameras dedicated to breast imaging. We designed a new detector with trapezoidal shape to expand the field of view (FOV) of camera without increasing its dimensions. To find optimal parameters, images of point sources at the edge area as functions of the angle and optical treatment of crystal side surface were simulated by using a DETECT2000. Our detector employs monolithic CsI(Tl) with dimensions of 48.0×48.0×6.0 mm coupled to an array of photo-sensors. Side surfaces of crystal were treated with three different surface finishes: black absorber, metal reflector and white reflector. The trapezoidal angle varied from 45° to 90° in steps of 15°. Gamma events were generated on 15 evenly spaced points with 1.0 mm spacing in the X-axis starting 1.0 mm away from the side surface. Ten thousand gamma events were simulated at each location and images were formed by calculating the Anger-logic. The results demonstrated that all the 15 points could be identified only for the crystal with trapezoidal shape having 45° angle and white reflector on the side surface. In conclusion, our new detector proved to be a reliable design to expand the FOV of small gamma camera for breast imaging.
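The Anger-logic step referred to above is a light-weighted centroid of the photo-sensor signals. A minimal sketch, with an assumed 4x4 photo-sensor grid on the 48x48 mm crystal and a toy Gaussian light spread; for events near the crystal edge the centroid is biased toward the centre, which is the edge effect the trapezoidal design addresses.

    import numpy as np

    def anger_position(signals, sensor_x, sensor_y):
        """Anger-logic position estimate: centroid of the photo-sensor
        positions, weighted by the light each sensor collects."""
        signals = np.asarray(signals, dtype=np.float64)
        total = signals.sum()
        return (signals * sensor_x).sum() / total, (signals * sensor_y).sum() / total

    # Assumed 4x4 photo-sensor array on a 12 mm pitch covering the 48x48 mm crystal.
    pitch = 12.0
    xs, ys = np.meshgrid(np.arange(4) * pitch + 6.0, np.arange(4) * pitch + 6.0)
    signals = np.exp(-((xs - 20.0)**2 + (ys - 31.0)**2) / (2 * 8.0**2))  # toy light spread
    print(anger_position(signals.ravel(), xs.ravel(), ys.ravel()))       # near (20, 31)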

Yeon Hwang, Ji; Lee, Seung-Jae; Baek, Cheol-Ha; Hyun Kim, Kwang; Hyun Chung, Yong

2011-05-01

435

Visualization of explosion phenomena using a high-speed video camera with an uncoupled objective lens by fiber-optic  

NASA Astrophysics Data System (ADS)

Visualization of explosion phenomena is essential for evaluating the performance of explosive effects. The phenomena, however, generate blast waves and fragments from cases, and we must protect our visualizing equipment from any form of impact. In the tests described here, the front lens was separated from the camera head by means of a fiber-optic cable in order to be able to use the camera, a Shimadzu Hypervision HPV-1, for tests in a severe blast environment, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to the images taken by the camera with the lens directly coupled to the camera head. It could be confirmed that this system is very useful for the visualization of dangerous events, e.g., at an explosion site, and for visualizations at angles that would be unachievable under normal circumstances.

Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Yasushi, Kondo

2008-11-01

436

Naskah Angling Darma Ambya Madura  

Microsoft Academic Search

ABSTRACT: Angling Darma is a Javanese tale that recounts the journey of a king who is forced to leave his kingdom as a dharma to atone for the sins he has committed. Angling Darma Ambya Madura was written in the Sumenep dialect of Madurese. Because its writing process is thought to have proceeded through adaptation, this Madurese version makes very extensive use of words and expressions adapted from Javanese, so that words and expressions

A. SYUKUR GHAZALI

437

Spinning angle optical calibration apparatus  

DOEpatents

An optical calibration apparatus is provided for calibrating and reproducing spinning angles in cross-polarization, nuclear magnetic resonance spectroscopy. An illuminated magnifying apparatus enables optical setting and accurate reproducing of spinning "magic angles" in cross-polarization, nuclear magnetic resonance spectroscopy experiments. A reference mark scribed on an edge of a spinning angle test sample holder is illuminated by a light source and viewed through a magnifying scope. When the "magic angle" of a sample material used as a standard is attained by varying the angular position of the sample holder, the coordinate position of the reference mark relative to a graduation or graduations on a reticle in the magnifying scope is noted. Thereafter, the spinning "magic angle" of a test material having similar nuclear properties to the standard is attained by returning the sample holder back to the originally noted coordinate position.

Beer, Stephen K. (Morgantown, WV); Pratt, II, Harold R. (Morgantown, WV)

1991-01-01

438

Keyboard before Head Tracking Depresses User Success in Remote Camera Control  

NASA Astrophysics Data System (ADS)

In remote mining, operators of complex machinery have more tasks or devices to control than they have hands. For example, operating a rock breaker requires two handed joystick control to position and fire the jackhammer, leaving the camera control to either automatic control or require the operator to switch between controls. We modelled such a teleoperated setting by performing experiments using a simple physical game analogue, being a half size table soccer game with two handles. The complex camera angles of the mining application were modelled by obscuring the direct view of the play area and the use of a Pan-Tilt-Zoom (PTZ) camera. The camera control was via either a keyboard or via head tracking using two different sets of head gestures called “head motion” and “head flicking” for turning camera motion on/off. Our results show that the head motion control was able to provide a comparable performance to using a keyboard, while head flicking was significantly worse. In addition, the sequence of use of the three control methods is highly significant. It appears that use of the keyboard first depresses successful use of the head tracking methods, with significantly better results when one of the head tracking methods was used first. Analysis of the qualitative survey data collected supports that the worst (by performance) method was disliked by participants. Surprisingly, use of that worst method as the first control method significantly enhanced performance using the other two control methods.

Zhu, Dingyun; Gedeon, Tom; Taylor, Ken

439

InfraCAM (trade mark): A Hand-Held Commercial Infrared Camera Modified for Spaceborne Applications  

NASA Technical Reports Server (NTRS)

In 1994, Inframetrics introduced the InfraCAM(TM), a high resolution hand-held thermal imager. As the world's smallest, lightest and lowest power PtSi based infrared camera, the InfraCAM is ideal for a wide range of industrial, non-destructive testing, surveillance and scientific applications. In addition to numerous commercial applications, the light weight and low power consumption of the InfraCAM make it extremely valuable for adaptation to space borne applications. Consequently, the InfraCAM has been selected by NASA Lewis Research Center (LeRC) in Cleveland, Ohio, for use as part of the DARTFire (Diffusive and Radiative Transport in Fires) space borne experiment. In this experiment, a solid fuel is ignited in a low gravity environment. The combustion period is recorded by both visible and infrared cameras. The infrared camera measures the emission from polymethyl methacrylate (PMMA) and combustion products in six distinct narrow spectral bands. Four cameras successfully completed all qualification tests at Inframetrics and at NASA Lewis. They are presently being used for ground based testing in preparation for space flight in the fall of 1995.

Manitakos, Daniel; Jones, Jeffrey; Melikian, Simon

1996-01-01

440

Characteristics of slug flow in narrow rectangular channels under vertical condition  

NASA Astrophysics Data System (ADS)

Gas-liquid slug flow is widely encountered in many practical industrial applications. A detailed understanding of the hydrodynamics of the gas slug has important significance for modeling of slug flow. Non-intrusive flow visualization using a high-speed video camera system is applied to study the characteristics of slug flow in a vertical narrow rectangular channel (3.25×40 mm²). Ideal Taylor bubbles are hardly observed, and most of the gas slugs are deformed, much more seriously at high liquid superficial velocity. The liquid film thicknesses on the left and right narrow sides surrounding a gas slug are divergent and wavy, but this has a weak effect on the liquid film velocity. The gas and liquid velocities as well as the length of the gas slug have a significant effect on the separating liquid film thickness. The separating liquid film velocity decreases with increasing gas superficial velocity at low liquid velocity, and increases with increasing liquid superficial velocity. The film stops descending at high liquid velocity (jL ≥ 1.204 m/s), where the gas superficial velocity has no significant effect on the liquid film separating velocity, which is mainly determined by the liquid flow rate. The shape of the slug nose has a significant effect on its velocity, while the effect of its length is very weak. The Ishii & Jones-Zuber drift flux correlation predicts the slug velocity well, except at low liquid superficial velocity, where the calculated drift velocity is less than the experimental values.
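The slug-velocity comparison mentioned last uses the drift-flux form v_slug = C0·j + v_gj, with j the total superficial velocity. The sketch below only encodes this generic form; the distribution parameter C0 and drift velocity v_gj must be supplied by a correlation (e.g. Ishii & Jones-Zuber for a narrow rectangular channel), and the numbers used here are placeholders.

    def slug_velocity_drift_flux(j_g, j_l, c0, v_gj):
        """Generic drift-flux estimate of the slug (gas) velocity:
        v_slug = C0 * j + v_gj, with j = j_g + j_l the total superficial
        velocity. C0 and v_gj must come from a correlation; the values
        used below are placeholders for illustration only."""
        return c0 * (j_g + j_l) + v_gj

    print(slug_velocity_drift_flux(j_g=0.2, j_l=1.2, c0=1.2, v_gj=0.1))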

Wang, Yang; Yan, Changqi; Sun, Licheng; Xing, Dianchuan; Yan, Chaoxing; Tian, Daogui

2013-07-01

441

X-ray imaging using a consumer-grade digital camera  

NASA Astrophysics Data System (ADS)

The recent advancements in consumer-grade digital camera technology and the introduction of high-resolution, high sensitivity CsBr:Eu²⁺ storage phosphor imaging plates make possible a new cost-effective technique for X-ray imaging. The imaging plate is bathed with red stimulating light by high-intensity light-emitting diodes, and the photostimulated image is captured with a digital single-lens reflex (SLR) camera. A blue band-pass optical filter blocks the stimulating red light but transmits the blue photostimulated luminescence. Using a Canon D5 Mk II camera and an f1.4 wide-angle lens, the optical image of a 240×180 mm² Konica CsBr:Eu²⁺ imaging plate from a position 230 mm in front of the camera lens can be focussed so as to laterally fill the 35×23.3 mm² camera sensor, and recorded in 2808×1872 pixel elements, corresponding to an equivalent pixel size on the plate of 88 µm. The analogue-to-digital conversion from the camera electronics is 13 bits, but the dynamic range of the imaging system as a whole is limited in practice by noise to about 2.5 orders of magnitude. The modulation transfer function falls to 0.2 at a spatial frequency of 2.2 line pairs/mm. The limiting factor of the spatial resolution is light scattering in the plate rather than the camera optics. The limiting factors for signal-to-noise ratio are shot noise in the light, and dark noise in the CMOS sensor. Good quality images of high-contrast objects can be recorded with doses of approximately 1 mGy. The CsBr:Eu²⁺ plate has approximately three times the readout sensitivity of a similar BaFBr:Eu²⁺ plate.

Winch, N. M.; Edgar, A.

2011-10-01

442

Riparian deforestation, stream narrowing, and loss of stream ecosystem services  

Microsoft Academic Search

A study of 16 streams in eastern North America shows that riparian deforestation causes channel narrowing, which reduces the total amount of stream habitat and ecosystem per unit channel length and compromises in-stream processing of pollutants. Wide forest reaches had more macroinvertebrates, total ecosystem processing of organic matter, and nitrogen uptake per unit channel length than contiguous narrow deforested reaches.

Bernard W. Sweeney; Thomas L. Bott; John K. Jackson; Louis A. Kaplan; J. Denis Newbold; Laurel J. Standley; W. Cully Hession; Richard J. Horwitz

2004-01-01

443

Narrow pulsed voltage generator for liquid food sterilization  

Microsoft Academic Search

This paper first reviews mechanism and waveform consideration for liquid food sterilization, and then presents how to generate bipolar narrow pulses. In this research, a bidirectional flyback converter with energy recovery circuits is proposed to generate bipolar narrow pulsed electric fields for liquid food sterilization, which includes sterilization of drinking water, alcoholic beverages, fruit juice, etc. Since in the converter,

T.-F. Wu; S.-Y. Tseng; M.-W. Wu; Y.-M. Chen

2006-01-01

444

Spectral diffusion dephasing and motional narrowing in single semiconductor quantum  

E-print Network

Spectral diffusion dominates the decoherence in semiconductor quantum dots at cryogenic temperature. We discuss the limits of narrowing in standard semiconductor quantum dots at low incident power and temperature, that makes

Paris-Sud XI, Université de

445

A new mathematical explanation of the Tacoma Narrows Bridge collapse  

E-print Network

a new mathematical model for the study of the dynamical behavior of suspension bridges which provides ... The spectacular collapse of the Tacoma Narrows Bridge, which occurred in 1940, has attracted the attention

446

Geometric Model of a Narrow Tilting CAR using Robotics formalism  

E-print Network

The use of an electrical narrow tilting car instead of a large gasoline car should dramatically decrease traffic congestion, pollution and parking problems. The aim of this paper is to give a unique presentation of the geometric

Boyer, Edmond

447

Geometric model of a narrow tilting CAR using robotics formalism  

Microsoft Academic Search

The use of an Electrical narrow tilting car instead of a large gasoline car should dramatically decrease traffic congestion, pollution and parking problem. The aim of this paper is to give a unique presentation of the geometric modeling issue of a new narrow tilting car. The modeling is based on the modified Denavit Hartenberg geometric description, which is commonly used

Salim Maakaroun; Wisama Khalil; Maxime Gautier; Philippe Chevrel

2010-01-01

448

Plant Responses of Ultra Narrow Row Cotton to Nitrogen Fertilization  

Microsoft Academic Search

Recent developments in cotton (Gossypium hirsutum L.) production technology in the Mississippi River Delta region include drill planting cotton. Production systems that include drill planting cotton are referred to as ultra narrow row (UNR). Ultra narrow row cotton production is a low input system designed to maximize economic returns. Cotton grown under UNR systems is generally lower yielding and lower

J. S. McConnell; P. B. Francis; C. R. Stark; R. E. Glover

2008-01-01

449

The narrow escape problem for diffusion in cellular microdomains  

E-print Network

it can escape. We call the calculation of the mean escape time the narrow escape problem. We present asymptotic formulas for the mean escape time in several cases, including regular domains in two

Singer, Amit

450

On Narrowing Strategies for Partial Non-Strict Functions  

Microsoft Academic Search

We study completeness of narrowing strategies for a class of programs defining (possibly partial and non-strict) functions by means of equations, with a lazy semantics, so that infinite values are also admissible. We consider a syntactical restriction introduced by Echahed, under which he proved that any narrowing strategy is complete for specifications defining total functions with finite values. Unfortunately things

David De Frutos-escrig; María-inés Fernández-camacho

1991-01-01

451

Electronic structure and thermoelectric properties of narrow band gap chalcogenides  

Microsoft Academic Search

In recent years there have been a revival of interest in discovering and understanding the physical properties of novel thermoelectric (TE) systems with high figure of merit. These systems are primarily narrow band gap semiconductors. In this thesis, electronic structure calculations were carried out for several narrow band gap chalcogenide TE materials in order to understand their electronic and transport

Daniel Bilc

2005-01-01

452

Recording Images Using a Simple Pinhole Camera  

NSDL National Science Digital Library

In this lesson, students develop and expand their observational skills and technological understanding by building and operating a pinhole camera. The interdisciplinary connections are in the realm of application in this motivating activity. The lesson pr

Eichinger, John

2009-05-30

453

Activity based matching in distributed camera networks.  

PubMed

In this paper, we consider the problem of finding correspondences between distributed cameras that have partially overlapping field of views. When multiple cameras with adaptable orientations and zooms are deployed, as in many wide area surveillance applications, identifying correspondence between different activities becomes a fundamental issue. We propose a correspondence method based upon activity features that, unlike photometric features, have certain geometry independence properties. The proposed method is robust to pose, illumination and geometric effects, unsupervised (does not require any calibration objects). In addition, these features are amenable to low communication bandwidth and distributed network applications. We present quantitative and qualitative results with synthetic and real life examples, and compare the proposed method with scale invariant feature transform (SIFT) based method. We show that our method significantly outperforms the SIFT method when cameras have significantly different orientations. We then describe extensions of our method in a number of directions including topology reconstruction, camera calibration, and distributed anomaly detection. PMID:20550993

Ermis, Erhan Baki; Clarot, Pierre; Jodoin, Pierre-Marc; Saligrama, Venkatesh

2010-10-01

454

X-Ray Shadowgraph Camera Design

SciTech Connect

An imaging camera that is used with X-Ray radiography systems in high explosive experiments has been built and fielded. The camera uses a 40mm diameter Micro-