Sample records for "complex camera facing"

  1. MTR AND ETR COMPLEXES. CAMERA FACING EASTERLY TOWARD CHEMICAL PROCESSING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR AND ETR COMPLEXES. CAMERA FACING EASTERLY TOWARD CHEMICAL PROCESSING PLANT. MTR AND ITS ATTACHMENTS IN FOREGROUND. ETR BEYOND TO RIGHT. INL NEGATIVE NO. 56-4100. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  2. Contextual view of building 926 west elevation; camera facing east. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 926 west elevation; camera facing east. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  3. Detail of main hall porch on east elevation; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main hall porch on east elevation; camera facing west. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  4. LOFT complex, camera facing west. Mobile entry (TAN624) is positioned ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT complex, camera facing west. Mobile entry (TAN-624) is positioned next to containment building (TAN-650). Shielded roadway entrance in view just below and to right of stack. Borated water tank has been covered with weather shelter and is no longer visible. ANP hangar (TAN-629) in view beyond LOFT. Date: 1974. INEEL negative no. 74-4191 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  5. ETR COMPLEX. CAMERA FACING SOUTH. FROM BOTTOM OF VIEW TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR COMPLEX. CAMERA FACING SOUTH. FROM BOTTOM OF VIEW TO TOP: MTR, MTR SERVICE BUILDING, ETR CRITICAL FACILITY, ETR CONTROL BUILDING (ATTACHED TO ETR), ETR BUILDING (HIGH-BAY), COMPRESSOR BUILDING (ATTACHED AT LEFT OF ETR), HEAT EXCHANGER BUILDING (JUST BEYOND COMPRESSOR BUILDING), COOLING TOWER PUMP HOUSE, COOLING TOWER. OTHER BUILDINGS ARE CONTRACTORS' CONSTRUCTION BUILDINGS. INL NEGATIVE NO. 56-4105. Unknown Photographer, ca. 1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  6. Building, north side (original front), detail of original entrance. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Building, north side (original front), detail of original entrance. Camera facing south - Naval Supply Center, Broadway Complex, Administration Storehouse, 911 West Broadway, San Diego, San Diego County, CA

  7. Detail of second story balcony porch at southeast corner; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of second story balcony porch at southeast corner; camera facing northwest. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  8. DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM ESOUTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, HB-3, FACING SOUTHWEST - Cape Canaveral Air Force Station, Launch Complex 39, Vehicle Assembly Building, VAB Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  9. LOFT complex in 1975 awaits renewed mission. Aerial view. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT complex in 1975 awaits renewed mission. Aerial view. Camera facing southwesterly. Left to right: stack, entry building (TAN-624), door shroud, duct shroud and filter hatches, dome (painted white), pre-amp building, equipment and piping building, shielded control room (TAN-630), airplane hangar (TAN-629). Date: 1975. INEEL negative no. 75-3690 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  10. ETR AND MTR COMPLEXES IN CONTEXT. CAMERA FACING NORTHERLY. FROM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR AND MTR COMPLEXES IN CONTEXT. CAMERA FACING NORTHERLY. FROM BOTTOM TO TOP: ETR COOLING TOWER, ELECTRICAL BUILDING AND LOW-BAY SECTION OF ETR BUILDING, HEAT EXCHANGER BUILDING (WITH U SHAPED YARD), COMPRESSOR BUILDING. MTR REACTOR SERVICES BUILDING IS ATTACHED TO SOUTH WALL OF MTR. WING A IS ATTACHED TO BALCONY FLOOR OF MTR. NEAR UPPER RIGHT CORNER OF VIEW IS MTR PROCESS WATER BUILDING. WING B IS AT FAR WEST END OF COMPLEX. NEAR MAIN GATE IS GAMMA FACILITY, WITH "COLD" BUILDINGS BEYOND: RAW WATER STORAGE TANKS, STEAM PLANT, MTR COOLING TOWER PUMP HOUSE AND COOLING TOWER. INL NEGATIVE NO. 56-4101. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. ETR COMPLEX. CAMERA FACING EAST. FROM LEFT TO RIGHT: ETRCRITICAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR COMPLEX. CAMERA FACING EAST. FROM LEFT TO RIGHT: ETR-CRITICAL FACILITY BUILDING, ETR CONTROL BUILDING (ATTACHED TO HIGH-BAY ETR), ETR, ONE-STORY SECTION OF ETR BUILDING, ELECTRICAL BUILDING, COOLING TOWER PUMP HOUSE, COOLING TOWER. COMPRESSOR AND HEAT EXCHANGER BUILDING ARE PARTLY IN VIEW ABOVE ETR. DARK-COLORED DUCTS PROCEED FROM GROUND CONNECTION TO ETR WASTE GAS STACK. OTHER STACK IS MTR STACK WITH FAN HOUSE IN FRONT OF IT. RECTANGULAR STRUCTURE NEAR TOP OF VIEW IS SETTLING BASIN. INL NEGATIVE NO. 56-4102. Unknown Photographer, ca. 1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  12. ETR, TRA642. ETR COMPLEX NEARLY COMPLETE. CAMERA FACES NORTHWEST, PROBABLY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR, TRA-642. ETR COMPLEX NEARLY COMPLETE. CAMERA FACES NORTHWEST, PROBABLY FROM TOP DECK OF COOLING TOWER. SHADOW IS CAST BY COOLING TOWER UNITS OFF LEFT OF VIEW. HIGH-BAY REACTOR BUILDING IS SURROUNDED BY ITS ATTACHED SERVICES: ELECTRICAL (TRA-648), HEAT EXCHANGER (TRA-644 WITH U-SHAPED YARD), AND COMPRESSOR (TRA-643). THE CONTROL BUILDING (TRA-647) ON THE NORTH SIDE IS HIDDEN FROM VIEW. AT UPPER RIGHT IS MTR BUILDING, TRA-603. INL NEGATIVE NO. 56-3798. Jack L. Anderson, Photographer, 11/26/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  13. Contextual view of building, with building #11 in right foreground. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building, with building #11 in right foreground. Camera facing east - Naval Supply Center, Broadway Complex, Administration Storehouse, 911 West Broadway, San Diego, San Diego County, CA

  14. 14. VIEW OF MST, FACING SOUTHEAST, AND LAUNCH PAD TAKEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. VIEW OF MST, FACING SOUTHEAST, AND LAUNCH PAD TAKEN FROM NORTHEAST PHOTO TOWER WITH WINDOW OPEN. FEATURES LEFT TO RIGHT: SOUTH TELEVISION CAMERA TOWER, SOUTHWEST PHOTO TOWER, LAUNCHER, UMBILICAL MAST, MST, AND OXIDIZER APRON. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 East, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  15. Building, roof, with machinery penthouses on left and harbor control ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Building, roof, with machinery penthouses on left and harbor control tower on right. Camera facing south - Naval Supply Center, Broadway Complex, Warehouse, 911 West Broadway, San Diego, San Diego County, CA

  16. Contextual view of the rear of building 926 from the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of the rear of building 926 from the hillside; camera facing east. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  17. Contextual view of building, with building #12 in right background ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building, with building #12 in right background and building #11 in right foreground. Camera facing east-southeast - Naval Supply Center, Broadway Complex, Administration Storehouse, 911 West Broadway, San Diego, San Diego County, CA

  18. Integration of multispectral face recognition and multi-PTZ camera automated surveillance for security applications

    NASA Astrophysics Data System (ADS)

    Chen, Chung-Hao; Yao, Yi; Chang, Hong; Koschan, Andreas; Abidi, Mongi

    2013-06-01

    Due to increasing security concerns, a complete security system should consist of two major components, a computer-based face-recognition system and a real-time automated video surveillance system. A computer-based face-recognition system can be used in gate access control for identity authentication. In recent studies, multispectral imaging and fusion of multispectral narrow-band images in the visible spectrum have been employed and proven to enhance the recognition performance over conventional broad-band images, especially when the illumination changes. Thus, we present an automated method that specifies the optimal spectral ranges under the given illumination. Experimental results verify the consistent performance of our algorithm via the observation that an identical set of spectral band images is selected under all tested conditions. Our discovery can be practically used for a new customized sensor design associated with given illuminations for an improved face recognition performance over conventional broad-band images. In addition, once a person is authorized to enter a restricted area, we still need to continuously monitor his/her activities for the sake of security. Because pan-tilt-zoom (PTZ) cameras are capable of covering a panoramic area and maintaining high-resolution imagery for real-time behavior understanding, research in automated surveillance systems with multiple PTZ cameras has become increasingly important. Most existing algorithms require prior knowledge of the intrinsic parameters of the PTZ camera to infer the relative positioning and orientation among multiple PTZ cameras. To overcome this limitation, we propose a novel mapping algorithm that derives the relative positioning and orientation between two PTZ cameras based on a unified polynomial model. This reduces the dependence on knowledge of the PTZ cameras' intrinsic parameters and relative positions. Experimental results demonstrate that our proposed algorithm presents substantially reduced computational complexity and improved flexibility at the cost of slightly decreased pixel accuracy as compared to Chen and Wang's method [18].
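
    The record above pairs multispectral band selection for face recognition with a polynomial mapping between pan-tilt-zoom (PTZ) cameras. The abstract does not spell out the "unified polynomial model", so the sketch below only illustrates the general idea under stated assumptions: fit a low-order polynomial regression that maps one camera's (pan, tilt) readings to the other's from paired observations of the same target. The sample angles and names are illustrative, not the authors' values.

      # Minimal sketch (not the authors' algorithm): learn a polynomial mapping
      # from camera A's (pan, tilt) angles to camera B's using paired observations.
      # Assumes numpy and scikit-learn are available; the sample angles are made up.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import PolynomialFeatures

      pan_tilt_a = np.array([[10.0, 5.0], [20.0, 7.5], [35.0, 12.0], [50.0, 15.0]])
      pan_tilt_b = np.array([[14.2, 6.1], [25.9, 9.0], [42.3, 14.8], [58.8, 18.6]])

      # Second-order polynomial regression from A's angles to B's angles.
      model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
      model.fit(pan_tilt_a, pan_tilt_b)

      # Where should camera B point when camera A reports pan 30, tilt 10 degrees?
      print(model.predict([[30.0, 10.0]]))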

  19. Contextual view showing building 926 north wing at left and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing building 926 north wing at left and hospital historic district at right; camera facing north. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  20. ETR WASTE GAS EXITED THE ETR COMPLEX FROM THE NORTH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR WASTE GAS EXITED THE ETR COMPLEX FROM THE NORTH SIDE THROUGH A TUNNEL AND THEN TO A FILTER PIT. TUNNEL EXIT IS UNDER CONSTRUCTION WHILE CONTROL BUILDING IS BEING FORMED BEYOND. CAMERA FACING WEST. INL NEGATIVE NO. 56-1238. Jack L. Anderson, Photographer, 4/17/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  1. Face recognition system for set-top box-based intelligent TV.

    PubMed

    Lee, Won Oh; Kim, Yeong Gon; Hong, Hyung Gil; Park, Kang Ryoung

    2014-11-18

    Despite the prevalence of smart TVs, many consumers continue to use conventional TVs with supplementary set-top boxes (STBs) because of the high cost of smart TVs. However, because the processing power of an STB is quite low, the smart TV functionalities that can be implemented in an STB are very limited. Because of this, negligible research has been conducted regarding face recognition for conventional TVs with supplementary STBs, even though many such studies have been conducted with smart TVs. In terms of camera sensors, previous face recognition systems have used high-resolution cameras, cameras with high-magnification zoom lenses, or camera systems with panning and tilting devices that can be used for face recognition from various positions. However, these cameras and devices cannot be used in intelligent TV environments because of limitations related to size and cost, and only small, low-cost web-cameras can be used. The resulting face recognition performance is degraded because of the limited resolution and quality levels of the images. Therefore, we propose a new face recognition system for intelligent TVs in order to overcome the limitations associated with low-resource set-top boxes and low-cost web-cameras. We implement the face recognition system using a software algorithm that does not require special devices or cameras. Our research has the following four novelties: first, the candidate regions in a viewer's face are detected in an image captured by a camera connected to the STB via low-processing background subtraction and face color filtering; second, the detected candidate face regions are transmitted to a server that has high processing power in order to detect face regions accurately; third, in-plane rotations of the face regions are compensated based on similarities between the left and right half sub-regions of the face regions; fourth, various poses of the viewer's face region are identified using five templates obtained during the initial user registration stage and multi-level local binary pattern matching. Experimental results indicate that the recall, precision, and genuine acceptance rate were about 95.7%, 96.2%, and 90.2%, respectively.
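
    The pipeline described above keeps only lightweight processing on the set-top box (background subtraction plus face-color filtering) and offloads accurate detection to a server. The snippet below is a rough sketch of that first, low-cost stage using OpenCV; the HSV skin range and area threshold are illustrative assumptions, not the authors' values.

      # Sketch of the lightweight candidate-region stage: background subtraction
      # followed by skin-color filtering. Assumes OpenCV (cv2) and numpy.
      import cv2
      import numpy as np

      back_sub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

      def candidate_face_regions(frame_bgr):
          """Return bounding boxes of moving, skin-colored regions in one frame."""
          motion_mask = back_sub.apply(frame_bgr)
          hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
          skin_mask = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 180, 255]))
          combined = cv2.bitwise_and(motion_mask, skin_mask)
          contours, _ = cv2.findContours(combined, cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)
          # Keep only regions large enough to plausibly contain a face;
          # these boxes would then be sent to the server-side detector.
          return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 400]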

  2. 3. VIEW OF NORTHEAST CORNER OF MST. NOTE: ENVIRONMENTAL DOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VIEW OF NORTHEAST CORNER OF MST. NOTE: ENVIRONMENTAL DOOR ON THE LOWER EAST SIDE OF THE NORTH FACE IS MISSING. NORTH CAMERA TOWER IN FOREGROUND. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 East, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  3. LOFT complex, aerial view taken on the same day ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT complex, aerial view taken on the same day as HAER photo ID-33-E-376. Camera facing south. Note curve of rail track toward hot shop (TAN-607). Earth shielding on control building (TAN-630) is partly removed, showing edge of concrete structure. Great southern butte on horizon. Date: 1975. INEEL negative no. 75-3693 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  4. Towards next generation 3D cameras

    NASA Astrophysics Data System (ADS)

    Gupta, Mohit

    2017-03-01

    We are in the midst of a 3D revolution. Robots enabled by 3D cameras are beginning to autonomously drive cars, perform surgeries, and manage factories. However, when deployed in the real-world, these cameras face several challenges that prevent them from measuring 3D shape reliably. These challenges include large lighting variations (bright sunlight to dark night), presence of scattering media (fog, body tissue), and optically complex materials (metal, plastic). Due to these factors, 3D imaging is often the bottleneck in widespread adoption of several key robotics technologies. I will talk about our work on developing 3D cameras based on time-of-flight and active triangulation that addresses these long-standing problems. This includes designing `all-weather' cameras that can perform high-speed 3D scanning in harsh outdoor environments, as well as cameras that recover shape of objects with challenging material properties. These cameras are, for the first time, capable of measuring detailed (<100 microns resolution) scans in extremely demanding scenarios with low-cost components. Several of these cameras are making a practical impact in industrial automation, being adopted in robotic inspection and assembly systems.
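
    As general background to the time-of-flight cameras mentioned above (not a description of the speaker's systems), depth follows directly from the round-trip travel time of light:

      # Basic time-of-flight relation: depth = c * round_trip_time / 2.
      C = 299_792_458.0  # speed of light in m/s

      def tof_depth_m(round_trip_s):
          """Depth in metres for a measured round-trip time in seconds."""
          return C * round_trip_s / 2.0

      # A round trip of about 6.67 nanoseconds corresponds to roughly 1 m of depth.
      print(tof_depth_m(6.67e-9))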

  5. Face detection assisted auto exposure: supporting evidence from a psychophysical study

    NASA Astrophysics Data System (ADS)

    Jin, Elaine W.; Lin, Sheng; Dharumalingam, Dhandapani

    2010-01-01

    Face detection has been implemented in many digital still cameras and camera phones with the promise of enhancing existing camera functions (e.g. auto exposure) and adding new features to cameras (e.g. blink detection). In this study we examined the use of face detection algorithms in assisting auto exposure (AE). The set of 706 images used in this study was captured using Canon Digital Single Lens Reflex cameras and subsequently processed with an image processing pipeline. A psychophysical study was performed to obtain optimal exposure along with the upper and lower bounds of exposure for all 706 images. Three methods of marking faces were utilized: manual marking, face detection algorithm A (FD-A), and face detection algorithm B (FD-B). The manual marking method found 751 faces in 426 images, which served as the ground-truth for face regions of interest. The remaining images either do not contain faces or the faces are too small to be considered detectable. The two face detection algorithms are different in resource requirements and in performance. FD-A uses less memory and a lower gate count than FD-B, but FD-B detects more faces and has fewer false positives. A face detection assisted auto exposure algorithm was developed and tested against the evaluation results from the psychophysical study. The AE test results showed noticeable improvement when faces were detected and used in auto exposure. However, the presence of false positives would negatively impact the added benefit.
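
    As a hedged illustration of the idea evaluated above (not the paper's algorithm), a face-assisted auto-exposure metric can simply weight detected face boxes more heavily when computing mean luminance and then nudge exposure toward a mid-grey target:

      # Illustrative face-weighted auto-exposure metric. The face weight and
      # mid-grey target are assumptions for this sketch. Assumes numpy.
      import numpy as np

      def exposure_correction(gray, face_boxes, face_weight=4.0, target=118.0):
          """gray: 2-D uint8 image; face_boxes: list of (x, y, w, h) rectangles.
          Returns an exposure correction in EV stops (positive = brighten)."""
          weights = np.ones_like(gray, dtype=np.float64)
          for x, y, w, h in face_boxes:
              weights[y:y + h, x:x + w] = face_weight  # emphasise face regions
          mean_luma = np.average(gray, weights=weights)
          return np.log2(target / max(mean_luma, 1e-3))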

  6. 9. DETAIL VIEW OF BRIDGE CRANE ON WEST SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. DETAIL VIEW OF BRIDGE CRANE ON WEST SIDE OF BUILDING. CAMERA FACING NORTHEAST. CONTAMINATED AIR FILTERS LOADED IN TRANSPORT CASKS WERE TRANSFERRED TO VEHICLES AND SENT TO RADIOACTIVE WASTE MANAGEMENT COMPLEX FOR STORAGE. INEEL PROOF NUMBER HD-17-1. - Idaho National Engineering Laboratory, Old Waste Calcining Facility, Scoville, Butte County, ID

  7. 1. GENERAL VIEW OF SLC3W SHOWING SOUTH FACE AND EAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. GENERAL VIEW OF SLC-3W SHOWING SOUTH FACE AND EAST SIDE OF A-FRAME MOBILE SERVICE TOWER (MST). MST IN SERVICE POSITION OVER LAUNCHER AND FLAME BUCKET. CABLE TRAYS BETWEEN LAUNCH OPERATIONS BUILDING (BLDG. 763) AND SLC-3W IN FOREGROUND. LIQUID OXYGEN APRON VISIBLE IMMEDIATELY EAST (RIGHT) OF MST; FUEL APRON VISIBLE IMMEDIATELY WEST (LEFT) OF MST. A PORTION OF THE FLAME BUCKET VISIBLE BELOW THE SOUTH FACE OF THE MST. CAMERA TOWERS VISIBLE EAST OF MST BETWEEN ROAD AND CABLE TRAY, AND SOUTH OF MST NEAR LEFT MARGIN OF PHOTOGRAPH. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 West, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  8. Interior view showing south entrance; camera facing south. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view showing south entrance; camera facing south. - Mare Island Naval Shipyard, Machine Shop, California Avenue, southwest corner of California Avenue & Thirteenth Street, Vallejo, Solano County, CA

  9. PBF Reactor Building (PER620). Camera faces southeast. Concrete placement will ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera faces southeast. Concrete placement will leave opening for neutron camera to be installed later. Note vertical piping within rebar. Photographer: John Capek. Date: July 6, 1967. INEEL negative no. 67-3514 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  10. VIEW OF EAST ELEVATION; CAMERA FACING WEST Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF EAST ELEVATION; CAMERA FACING WEST - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  11. VIEW OF SOUTH ELEVATION; CAMERA FACING NORTH Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF SOUTH ELEVATION; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  12. VIEW OF WEST ELEVATION: CAMERA FACING NORTHEAST Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF WEST ELEVATION: CAMERA FACING NORTHEAST - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  13. VIEW OF NORTH ELEVATION; CAMERA FACING SOUTH Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF NORTH ELEVATION; CAMERA FACING SOUTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  14. View of south elevation; camera facing northeast. Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of south elevation; camera facing northeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  15. View of north elevation; camera facing southeast. Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of north elevation; camera facing southeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  16. Detail of main entrance; camera facing southwest. Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main entrance; camera facing southwest. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  17. Contextual view of building 733; camera facing southeast. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 733; camera facing southeast. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  18. Interior detail of tower space; camera facing southwest. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of tower space; camera facing southwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  19. Oblique view of southeast corner; camera facing northwest. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Oblique view of southeast corner; camera facing northwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  20. Detail of stairway at north elevation; camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of stairway at north elevation; camera facing southwest. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  1. Interior view of second floor sleeping area; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of second floor sleeping area; camera facing south. - Mare Island Naval Shipyard, Marine Barracks, Cedar Avenue, west side between Twelfth & Fourteenth Streets, Vallejo, Solano County, CA

  2. Interior detail of lobby ceiling design; camera facing east. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of lobby ceiling design; camera facing east. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  3. Interior detail of stairway in tower; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of stairway in tower; camera facing south. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  4. View of camera station located northeast of Building 70022, facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of camera station located northeast of Building 70022, facing northwest - Naval Ordnance Test Station Inyokern, Randsburg Wash Facility Target Test Towers, Tower Road, China Lake, Kern County, CA

  5. Interior view of second floor lobby; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of second floor lobby; camera facing south. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  6. Interior detail of first floor lobby; camera facing northeast. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of first floor lobby; camera facing northeast. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  7. Interior view of second floor space; camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of second floor space; camera facing southwest. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  8. Detail of columns, cornice and eaves; camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of columns, cornice and eaves; camera facing southwest. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  9. Detail of cupola on south wing; camera facing southeast. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of cupola on south wing; camera facing southeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  10. Interior view of north wing, south wall offices; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of north wing, south wall offices; camera facing south. - Mare Island Naval Shipyard, Smithery, California Avenue, west side at California Avenue & Eighth Street, Vallejo, Solano County, CA

  11. Interior detail of main entry with railroad tracks; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of main entry with railroad tracks; camera facing east. - Mare Island Naval Shipyard, Mechanics Shop, Waterfront Avenue, west side between A Street & Third Street, Vallejo, Solano County, CA

  12. DETAIL OF LAMP ABOVE SOUTH SIDE ENTRANCE; CAMERA FACING EAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF LAMP ABOVE SOUTH SIDE ENTRANCE; CAMERA FACING EAST - Mare Island Naval Shipyard, Bachelor Enlisted Quarters & Offices, Walnut Avenue, east side between D Street & C Street, Vallejo, Solano County, CA

  13. Detail of main doors on east elevation; camera facing west. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main doors on east elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  14. Detail of central portion of southeast elevation; camera facing west. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of central portion of southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  15. Detail of windows at center of west elevation; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of windows at center of west elevation; camera facing east. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  16. Interior view of hallway on second floor; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of hallway on second floor; camera facing south. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  17. Detail of balcony and windows on west elevation; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of balcony and windows on west elevation; camera facing northeast. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  18. Contextual view of building 733 along Cedar Avenue; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 733 along Cedar Avenue; camera facing southwest. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  19. View of main terrace with mature tree, camera facing southeast ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of main terrace with mature tree, camera facing southeast - Naval Training Station, Senior Officers' Quarters District, Naval Station Treasure Island, Yerba Buena Island, San Francisco, San Francisco County, CA

  20. Detail of main entry on east elevation; camera facing west. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main entry on east elevation; camera facing west. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  1. Detail of south wing south elevation wall section; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of south wing south elevation wall section; camera facing northwest - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  2. Detail of large industrial doors on north elevation; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of large industrial doors on north elevation; camera facing south. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  3. View of steel warehouses, building 710 north sidewalk; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses, building 710 north sidewalk; camera facing east. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  4. Interior detail of main stairway from first floor; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of main stairway from first floor; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  5. Interior detail of arched doorway at second floor; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of arched doorway at second floor; camera facing north. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  6. INTERIOR DETAIL OF STAIRWAY AT SOUTH WING ENTRANCE; CAMERA FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR DETAIL OF STAIRWAY AT SOUTH WING ENTRANCE; CAMERA FACING SOUTH - Mare Island Naval Shipyard, Bachelor Enlisted Quarters & Offices, Walnut Avenue, east side between D Street & C Street, Vallejo, Solano County, CA

  7. A&M. Hot liquid waste building (TAN616) under construction. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste building (TAN-616) under construction. Camera facing northeast. Date: November 25, 1953. INEEL negative no. 9232 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  8. LPT. Low power assembly and test building (TAN640). Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Low power assembly and test building (TAN-640). Camera facing west. Rollup doors to each test cell face east. Concrete walls poured in place. Apparatus at right of view was part of a post-ANP program. INEEL negative no. HD-40-1-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  9. Emotion Recognition - the need for a complete analysis of the phenomenon of expression formation

    NASA Astrophysics Data System (ADS)

    Bobkowska, Katarzyna; Przyborski, Marek; Skorupka, Dariusz

    2018-01-01

    This article shows how complex emotions are. This has been proven by the analysis of the changes that occur on the face. The authors present the problem of image analysis for the purpose of identifying emotions. In addition, they point out the importance of recording the phenomenon of the development of emotions on the human face with the use of high-speed cameras, which allows the detection of micro-expressions. The work that was prepared for this article was based on analyzing the parallax pair correlation coefficients for specific faces. In the article, the authors propose dividing the facial image into 8 characteristic segments. With this approach, it was confirmed that, at different moments of an emotion, the pace of expression and the maximum change characteristic of a particular emotion differ for each part of the face.
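
    The abstract mentions splitting the face image into eight characteristic segments and comparing correlation coefficients; the exact segmentation is not given in the record. The sketch below assumes a simple 2x4 grid and computes a per-segment Pearson correlation between two frames, so that low correlation flags the parts of the face that changed most.

      # Assumed 2x4 segmentation (not necessarily the authors'): compare the same
      # eight segments of two grayscale face images. Assumes numpy.
      import numpy as np

      def segment_correlations(face_a, face_b, rows=2, cols=4):
          """face_a, face_b: equal-sized 2-D grayscale arrays; returns 8 correlations."""
          h, w = face_a.shape
          sh, sw = h // rows, w // cols
          corrs = []
          for r in range(rows):
              for c in range(cols):
                  seg_a = face_a[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw].ravel()
                  seg_b = face_b[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw].ravel()
                  corrs.append(float(np.corrcoef(seg_a, seg_b)[0, 1]))
          return corrs  # low values -> strong local expression change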

  10. SPERTI, Instrument Cell Building (PER606). West facade. Camera facing northeast. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SPERT-I, Instrument Cell Building (PER-606). West facade. Camera facing northeast. Date: August 2003. INEEL negative no. HD-35-3-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  11. SPERTI, Instrument Cell Building (PER606). East facade. Camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SPERT-I, Instrument Cell Building (PER-606). East facade. Camera facing southwest. Date: August 2003. INEEL negative no. HD-35-3-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  12. 4. First floor interior. Camera facing west. Mirrored interior supports ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. First floor interior. Camera facing west. Mirrored interior supports mark the former division between 11 and 13 North Broad Street. - 11-15 North Broad Street (Commercial Building), 11-15 North Broad Street, Trenton, Mercer County, NJ

  13. 2. View from same camera position facing 232 degrees southwest ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. View from same camera position facing 232 degrees southwest showing abandoned section of old grade - Oak Creek Administrative Center, One half mile east of Zion-Mount Carmel Highway at Oak Creek, Springdale, Washington County, UT

  14. Automated face detection for occurrence and occupancy estimation in chimpanzees.

    PubMed

    Crunchant, Anne-Sophie; Egerer, Monika; Loos, Alexander; Burghardt, Tilo; Zuberbühler, Klaus; Corogenes, Katherine; Leinert, Vera; Kulik, Lars; Kühl, Hjalmar S

    2017-03-01

    Surveying endangered species is necessary to evaluate conservation effectiveness. Camera trapping and biometric computer vision are recent technological advances. They have impacted the methods applicable to field surveys, and these methods have gained significant momentum over the last decade. Yet, most researchers inspect footage manually and few studies have used automated semantic processing of video trap data from the field. The particular aim of this study is to evaluate methods that incorporate automated face detection technology as an aid to estimate site use of two chimpanzee communities based on camera trapping. As a comparative baseline we employ traditional manual inspection of footage. Our analysis focuses specifically on the basic parameter of occurrence where we assess the performance and practical value of chimpanzee face detection software. We found that the semi-automated data processing required only 2-4% of the time compared to the purely manual analysis. This is a non-negligible increase in efficiency that is critical when assessing the feasibility of camera trap occupancy surveys. Our evaluations suggest that our methodology estimates the proportion of sites used relatively reliably. Chimpanzees are mostly detected when they are present and when videos are filmed in high resolution: the highest recall rate was 77%, for a false alarm rate of 2.8% for videos containing only chimpanzee frontal face views. Certainly, our study is only a first step for transferring face detection software from the lab into field application. Our results are promising and indicate that the current limitation of detecting chimpanzees in camera trap footage due to lack of suitable face views can be easily overcome on the level of field data collection, that is, by the combined placement of multiple high-resolution cameras facing reverse directions. This will make it possible to routinely conduct chimpanzee occupancy surveys based on camera trapping and semi-automated processing of footage. Using semi-automated ape face detection technology for processing camera trap footage requires only 2-4% of the time compared to manual analysis and allows site use by chimpanzees to be estimated relatively reliably. © 2017 Wiley Periodicals, Inc.
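
    The headline figure above is the proportion of sites used. As a deliberately naive illustration (ignoring imperfect detection, which real occupancy models address), that proportion is just the share of camera-trap sites with at least one confirmed detection:

      # Naive site-use summary only; the detection counts per site are made up.
      detections_per_site = {"site_a": 3, "site_b": 0, "site_c": 7, "site_d": 1}

      occupied = sum(1 for n in detections_per_site.values() if n > 0)
      proportion_used = occupied / len(detections_per_site)
      print(f"naive proportion of sites used: {proportion_used:.2f}")  # 0.75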

  15. Multiview face detection based on position estimation over multicamera surveillance system

    NASA Astrophysics Data System (ADS)

    Huang, Ching-chun; Chou, Jay; Shiu, Jia-Hou; Wang, Sheng-Jyh

    2012-02-01

    In this paper, we propose a multi-view face detection system that locates head positions and indicates the direction of each face in 3-D space over a multi-camera surveillance system. To locate 3-D head positions, conventional methods relied on face detection in 2-D images and projected the face regions back to 3-D space for correspondence. However, inevitable false face detections and rejections usually degrade system performance. Instead, our system searches for the heads and face directions over the 3-D space using a sliding cube. Each searched 3-D cube is projected onto the 2-D camera views to determine the existence and direction of human faces. Moreover, a pre-process to estimate the locations of candidate targets is illustrated to speed up the search process over the 3-D space. In summary, our proposed method can efficiently fuse multi-camera information and suppress the ambiguity caused by detection errors. Our evaluation shows that the proposed approach can efficiently indicate the head position and face direction on real video sequences even under serious occlusion.
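
    The abstract above searches 3-D space with a sliding cube and projects each cube into the camera views. As a minimal sketch under a standard pinhole-camera assumption (the paper's exact formulation is not reproduced in the record), projecting a cube's eight corners with a known 3x4 projection matrix looks like this:

      # Project the corners of an axis-aligned cube into one camera view using a
      # 3x4 pinhole projection matrix P. Illustrates only the projection step.
      import numpy as np

      def project_cube(P, center, size):
          """Return an 8x2 array of pixel coordinates for the cube's corners."""
          cx, cy, cz = center
          h = size / 2.0
          corners = np.array([[cx + sx * h, cy + sy * h, cz + sz * h, 1.0]
                              for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)])
          proj = corners @ P.T              # homogeneous image coordinates (8x3)
          return proj[:, :2] / proj[:, 2:3]

      # Example camera: identity pose, 800-pixel focal length, principal point at
      # the centre of a 640x480 image.
      P = np.array([[800.0, 0.0, 320.0, 0.0],
                    [0.0, 800.0, 240.0, 0.0],
                    [0.0, 0.0, 1.0, 0.0]])
      print(project_cube(P, center=(0.0, 0.0, 4.0), size=0.3))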

  16. Remote repair appliance

    DOEpatents

    Heumann, Frederick K.; Wilkinson, Jay C.; Wooding, David R.

    1997-01-01

    A remote appliance for supporting a tool for performing work at a worksite on a substantially circular bore of a workpiece and for providing video signals of the worksite to a remote monitor comprising: a baseplate having an inner face and an outer face; a plurality of rollers, wherein each roller is rotatably and adjustably attached to the inner face of the baseplate and positioned to roll against the bore of the workpiece when the baseplate is positioned against the mouth of the bore such that the appliance may be rotated about the bore in a plane substantially parallel to the baseplate; a tool holding means for supporting the tool, the tool holding means being adjustably attached to the outer face of the baseplate such that the working end of the tool is positioned on the inner face side of the baseplate; a camera for providing video signals of the worksite to the remote monitor; and a camera holding means for supporting the camera on the inner face side of the baseplate, the camera holding means being adjustably attached to the outer face of the baseplate. In a preferred embodiment, roller guards are provided to protect the rollers from debris and a bore guard is provided to protect the bore from wear by the rollers and damage from debris.

  17. An Implementation of Privacy Protection for a Surveillance Camera Using ROI Coding of JPEG2000 with Face Detection

    NASA Astrophysics Data System (ADS)

    Muneyasu, Mitsuji; Odani, Shuhei; Kitaura, Yoshihiro; Namba, Hitoshi

    When a surveillance camera is used, there are cases where privacy protection should be considered. This paper proposes a new privacy protection method by automatically degrading the face region in surveillance images. The proposed method consists of ROI coding of JPEG2000 and a face detection method based on template matching. Experimental results show that the face region can be detected and hidden correctly.
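
    The method above hides the detected face region through ROI coding in JPEG2000, which needs a dedicated codec; the sketch below substitutes pixelation so the detect-then-degrade idea can be shown with OpenCV alone. Template matching stands in for the paper's detector, and the threshold and block size are assumptions.

      # Detect a face by template matching, then pixelate that region as a
      # stand-in for JPEG2000 ROI degradation. Assumes OpenCV (cv2).
      import cv2

      def hide_face(frame, face_template, threshold=0.7, block=16):
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          tmpl = cv2.cvtColor(face_template, cv2.COLOR_BGR2GRAY)
          scores = cv2.matchTemplate(gray, tmpl, cv2.TM_CCOEFF_NORMED)
          _, best, _, (x, y) = cv2.minMaxLoc(scores)
          if best < threshold:
              return frame  # no confident match; leave the frame unchanged
          h, w = tmpl.shape
          roi = frame[y:y + h, x:x + w]
          # Pixelate: shrink the region, then enlarge with nearest-neighbour scaling.
          small = cv2.resize(roi, (max(1, w // block), max(1, h // block)))
          frame[y:y + h, x:x + w] = cv2.resize(small, (w, h),
                                               interpolation=cv2.INTER_NEAREST)
          return frame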

  18. SOUTH WING, TRA661. SOUTH SIDE. CAMERA FACING NORTH. MTR HIGH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SOUTH WING, TRA-661. SOUTH SIDE. CAMERA FACING NORTH. MTR HIGH BAY BEYOND. INL NEGATIVE NO. HD46-45-3. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  19. PBF Cooling Tower (PER720). Camera faces south to show north ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower (PER-720). Camera faces south to show north facade. Note enclosed stairway. Date: August 2003. INEEL negative no. HD-35-10-3 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  20. A&M. Radioactive parts security storage area. camera facing northwest. Outdoor ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Radioactive parts security storage area. Camera facing northwest. Outdoor storage of concrete storage casks. Photographer: M. Holmes. Date: November 21, 1959. INEEL negative no. 59-6081 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  1. A&M. Hot liquid waste holding tanks. Camera faces southeast. Located ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste holding tanks. Camera faces southeast. Located in vicinity of TAN-616, hot liquid waste treatment plant. Date: November 13, 1953. INEEL negative no. 9159 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  2. 24. ARAIII Reactor building ARA608 interior. Camera facing south. Chalk ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. ARA-III Reactor building ARA-608 interior. Camera facing south. Chalk marks on wall indicate presence or absence of spot contamination. Ineel photo no. 3-2. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  3. Barrier Coverage for 3D Camera Sensor Networks

    PubMed Central

    Wu, Chengdong; Zhang, Yunzhou; Jia, Zixi; Ji, Peng; Chu, Hao

    2017-01-01

    Barrier coverage, an important research area with respect to camera sensor networks, consists of a number of camera sensors to detect intruders that pass through the barrier area. Existing works on barrier coverage such as local face-view barrier coverage and full-view barrier coverage typically assume that each intruder is considered as a point. However, the crucial feature (e.g., size) of the intruder should be taken into account in the real-world applications. In this paper, we propose a realistic resolution criterion based on a three-dimensional (3D) sensing model of a camera sensor for capturing the intruder’s face. Based on the new resolution criterion, we study the barrier coverage of a feasible deployment strategy in camera sensor networks. Performance results demonstrate that our barrier coverage with more practical considerations is capable of providing a desirable surveillance level. Moreover, compared with local face-view barrier coverage and full-view barrier coverage, our barrier coverage is more reasonable and closer to reality. To the best of our knowledge, our work is the first to propose barrier coverage for 3D camera sensor networks. PMID:28771167

  4. Barrier Coverage for 3D Camera Sensor Networks.

    PubMed

    Si, Pengju; Wu, Chengdong; Zhang, Yunzhou; Jia, Zixi; Ji, Peng; Chu, Hao

    2017-08-03

    Barrier coverage, an important research area with respect to camera sensor networks, consists of a number of camera sensors to detect intruders that pass through the barrier area. Existing works on barrier coverage such as local face-view barrier coverage and full-view barrier coverage typically assume that each intruder is considered as a point. However, the crucial feature (e.g., size) of the intruder should be taken into account in the real-world applications. In this paper, we propose a realistic resolution criterion based on a three-dimensional (3D) sensing model of a camera sensor for capturing the intruder's face. Based on the new resolution criterion, we study the barrier coverage of a feasible deployment strategy in camera sensor networks. Performance results demonstrate that our barrier coverage with more practical considerations is capable of providing a desirable surveillance level. Moreover, compared with local face-view barrier coverage and full-view barrier coverage, our barrier coverage is more reasonable and closer to reality. To the best of our knowledge, our work is the first to propose barrier coverage for 3D camera sensor networks.
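
    The two barrier-coverage records above hinge on a resolution criterion: an intruder's face must be imaged with enough pixels to be usable, which depends on the camera's 3-D sensing geometry. The papers' exact criterion is not reproduced in the abstracts, so the check below uses a generic pinhole approximation with an assumed 60-pixel minimum face width.

      # Generic pinhole approximation: a face of physical width face_w_m metres at
      # distance dist_m metres spans about focal_px * face_w_m / dist_m pixels.
      def face_pixels(focal_px, face_w_m, dist_m):
          return focal_px * face_w_m / dist_m

      def meets_resolution(focal_px, face_w_m, dist_m, min_px=60.0):
          """True if the projected face width reaches the assumed pixel minimum."""
          return face_pixels(focal_px, face_w_m, dist_m) >= min_px

      # Example: 1200 px focal length, 0.16 m face width, intruder 3 m from the camera.
      print(face_pixels(1200.0, 0.16, 3.0), meets_resolution(1200.0, 0.16, 3.0))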

  5. 3. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY, CAMERA FACING NORTHEAST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY, CAMERA FACING NORTHEAST. SHOWS RELATIONSHIP BETWEEN DECONTAMINATION ROOM, ADSORBER REMOVAL HATCHES (FLAT ON GRADE), AND BRIDGE CRANE. INEEL PROOF NUMBER HD-17-2. - Idaho National Engineering Laboratory, Old Waste Calcining Facility, Scoville, Butte County, ID

  6. PBF Cooling Tower detail. Camera facing southwest. Wood fill rises ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower detail. Camera facing southwest. Wood fill rises from foundation piers of cold water basin. Photographer: Kirsh. Date: May 1, 1969. INEEL negative no. 69-2826 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  7. PBF (PER620) west facade. Camera facing east. Note 1980 addition ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) west facade. Camera facing east. Note 1980 addition on south side of west wall. Date: March 2004. INEEL negative no. HD-41-3-3 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  8. ETRMTR MECHANICAL SERVICES BUILDING, TRA653. CAMERA FACING NORTHWEST AS BUILDING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR-MTR MECHANICAL SERVICES BUILDING, TRA-653. CAMERA FACING NORTHWEST AS BUILDING WAS NEARLY COMPLETE. INL NEGATIVE NO. 57-3653. K. Mansfield, Photographer, 7/22/1957 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. Have a Nice Spring! MOC Revisits "Happy Face" Crater

    NASA Image and Video Library

    2005-05-16

    Smile! Spring has sprung in the martian southern hemisphere. With it comes the annual retreat of the winter polar frost cap. This view of "Happy Face Crater"--officially named "Galle Crater"--shows patches of white water ice frost in and around the crater's south-facing slopes. Slopes that face south will retain frost longer than north-facing slopes because they do not receive as much sunlight in early spring. This picture is a composite of images taken by the Mars Global Surveyor Mars Orbiter Camera (MOC) red and blue wide angle cameras. The wide angle cameras were designed to monitor the changing weather, frost, and wind patterns on Mars. Galle Crater is located on the east rim of the Argyre Basin and is about 215 kilometers (134 miles) across. In this picture, illumination is from the upper left and north is up. http://photojournal.jpl.nasa.gov/catalog/PIA02325

  10. PBF (PER620) north facade. Camera facing south. Small metal shed ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) north facade. Camera facing south. Small metal shed at right is Stack Gas Monitor Building, PER-629. Date: March 2004. INEEL negative no. HD-41-2-4 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  11. MTR BUILDING AND BALCONY FLOORS. CAMERA FACING EASTERLY. PHOTOGRAPHER DID ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR BUILDING AND BALCONY FLOORS. CAMERA FACING EASTERLY. PHOTOGRAPHER DID NOT EXPLAIN DARK CLOUD. MTR WING WILL ATTACH TO GROUND FLOOR. INL NEGATIVE NO. 1567. Unknown Photographer, 2/28/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  12. A&M. Hot liquid waste treatment building (TAN616). Camera facing east. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616). Camera facing east. Showing west facades of structure. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-1-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  13. PBF Control Building (PER619) south facade. Camera faces north. Note ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Control Building (PER-619) south facade. Camera faces north. Note buried tanks with bollards protecting their access hatches. Date: July 2004. INEEL negative no. HD-41-10-4 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  14. Remote repair appliance

    DOEpatents

    Heumann, F.K.; Wilkinson, J.C.; Wooding, D.R.

    1997-12-16

    A remote appliance for supporting a tool for performing work at a work site on a substantially circular bore of a work piece and for providing video signals of the work site to a remote monitor comprises: a base plate having an inner face and an outer face; a plurality of rollers, wherein each roller is rotatably and adjustably attached to the inner face of the base plate and positioned to roll against the bore of the work piece when the base plate is positioned against the mouth of the bore such that the appliance may be rotated about the bore in a plane substantially parallel to the base plate; a tool holding means for supporting the tool, the tool holding means being adjustably attached to the outer face of the base plate such that the working end of the tool is positioned on the inner face side of the base plate; a camera for providing video signals of the work site to the remote monitor; and a camera holding means for supporting the camera on the inner face side of the base plate, the camera holding means being adjustably attached to the outer face of the base plate. In a preferred embodiment, roller guards are provided to protect the rollers from debris and a bore guard is provided to protect the bore from wear by the rollers and damage from debris. 5 figs.

  15. Technology survey on video face tracking

    NASA Astrophysics Data System (ADS)

    Zhang, Tong; Gomes, Herman Martins

    2014-03-01

    With the pervasiveness of monitoring cameras installed in public areas, schools, hospitals, workplaces and homes, video analytics technologies for interpreting this video content are becoming increasingly relevant to people's lives. Among such technologies, human face detection and tracking (and face identification in many cases) are particularly useful in various application scenarios. While plenty of research has been conducted on face tracking and many promising approaches have been proposed, there are still significant challenges in recognizing and tracking people in videos with uncontrolled capturing conditions, largely due to pose and illumination variations, as well as occlusions and cluttered backgrounds. It is especially complex to track and identify multiple people simultaneously in real time due to the large amount of computation involved. In this paper, we present a survey of literature and software published or developed in recent years on the face tracking topic. The survey covers the following topics: 1) mainstream and state-of-the-art face tracking methods, including features used to model the targets and metrics used for tracking; 2) face identification and face clustering from face sequences; and 3) software packages or demonstrations that are available for algorithm development or trial. A number of publicly available databases for face tracking are also introduced.

  16. LPT. Shield test control building (TAN645), north facade. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Shield test control building (TAN-645), north facade. Camera facing south. Obsolete sign dating from post-1970 program says "Energy and Systems Technology Experimental Facility, INEL." INEEL negative no. HD-40-5-4 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  17. MTR WING, TRA604, INTERIOR. BASEMENT. WEST CORRIDOR. CAMERA FACES NORTH. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR WING, TRA-604, INTERIOR. BASEMENT. WEST CORRIDOR. CAMERA FACES NORTH. HVAC AREA IS AT RIGHT OF CORRIDOR. INL NEGATIVE NO. HD46-13-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  18. PBF Reactor Building (PER620). Camera in first basement, facing south ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera in first basement, facing south and upward toward main floor. Cable trays being erected. Photographer: Kirsh. Date: May 20, 1969. INEEL negative no. 69-3110 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  19. PBF (PER620) south facade. Camera facing north. Note pedestrian bridge ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) south facade. Camera facing north. Note pedestrian bridge crossing over conduit. Central high bay contains reactor room and canal. Date: March 2004. INEEL negative no. HD-41-2-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  20. MTR STACK, TRA710, CONTEXTUAL VIEW, CAMERA FACING SOUTH. PERIMETER SECURITY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR STACK, TRA-710, CONTEXTUAL VIEW, CAMERA FACING SOUTH. PERIMETER SECURITY FENCE AND SECURITY LIGHTING IN VIEW AT LEFT. INL NEGATIVE NO. HD52-1-1. Mike Crane, Photographer, 5/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  1. PBF Cooling Tower Auxiliary Building (PER624) interior. Camera facing north. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower Auxiliary Building (PER-624) interior. Camera facing north. Deluge valves and automatic fire protection piping for Cooling Tower. Photographer: Holmes. Date: May 20, 1970. INEEL negative no. 70-2323 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  2. A&M. Hot liquid waste treatment building (TAN616). Camera facing northeast. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616). Camera facing northeast. South wall with oblique views of west sides of structure. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-1-2 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  3. A&M. Hot liquid waste treatment building (TAN616). Camera facing north. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616). Camera facing north. Detail of personnel entrance door, stoop, and stairway. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-2-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  4. Skin Color Segmentation Using Coarse-to-Fine Region on Normalized RGB Chromaticity Diagram for Face Detection

    NASA Astrophysics Data System (ADS)

    Soetedjo, Aryuanto; Yamada, Koichi

    This paper describes a new color segmentation based on a normalized RGB chromaticity diagram for face detection. Face skin is extracted from color images using a coarse skin region with fixed boundaries, followed by a fine skin region with variable boundaries. Two newly developed histograms with prominent peaks for skin and non-skin colors are employed to adjust the boundaries of the skin region. The proposed approach does not need a skin color model, which typically depends on specific camera parameters and is usually limited to particular environmental conditions, and it requires no sample images. Experimental results using color face images of various races under varying lighting conditions and complex backgrounds, obtained from four different sources on the Internet, show a high detection rate of 87%. The detection rate and computation time are comparable to the well-known real-time face detection method of Viola and Jones [11], [12].
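
    As a rough illustration of the coarse stage described above, the following sketch (Python) converts an RGB image to the normalized rg chromaticity diagram and keeps pixels inside a fixed rectangular region. The boundary values are illustrative placeholders, not the paper's thresholds, and the histogram-driven fine stage is not reproduced here.

      import numpy as np

      def coarse_skin_mask(image_rgb, r_bounds=(0.36, 0.60), g_bounds=(0.25, 0.38)):
          """Coarse skin segmentation on the normalized RGB (rg) chromaticity diagram.

          image_rgb is an HxWx3 uint8 array; the fixed boundaries are assumed values
          used only for illustration.
          """
          rgb = image_rgb.astype(np.float64)
          total = rgb.sum(axis=2) + 1e-6            # avoid division by zero
          r = rgb[..., 0] / total                   # normalized red chromaticity
          g = rgb[..., 1] / total                   # normalized green chromaticity
          return ((r >= r_bounds[0]) & (r <= r_bounds[1]) &
                  (g >= g_bounds[0]) & (g <= g_bounds[1]))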

  5. Applications of digital image acquisition in anthropometry

    NASA Technical Reports Server (NTRS)

    Woolford, B.; Lewis, J. L.

    1981-01-01

    A description is given of a video kinesimeter, a device for the automatic real-time collection of kinematic and dynamic data. Based on the detection of a single bright spot by three TV cameras, the system provides automatic real-time recording of three-dimensional position and force data. It comprises three cameras, two incandescent lights, a voltage comparator circuit, a central control unit, and a mass storage device. The control unit determines the signal threshold for each camera before testing, sequences the lights, synchronizes and analyzes the scan voltages from the three cameras, digitizes force from a dynamometer, and codes the data for transmission to a floppy disk for recording. Two of the three cameras face each other along the 'X' axis; the third camera, which faces the center of the line between the first two, defines the 'Y' axis. An image from the 'Y' camera and either 'X' camera is necessary for determining the three-dimensional coordinates of the point.
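
    The following minimal sketch illustrates the idea of combining the two orthogonal camera views into a 3-D coordinate. It assumes an idealized orthographic model and invented parameter names; it is not the instrument's actual calibration or scan-voltage processing.

      def combine_orthogonal_views(x_cam_point, y_cam_point, scale=1.0):
          """Merge one 'X'-axis camera view with the 'Y'-axis camera view.

          The X camera (looking along the X axis) resolves the bright spot's Y and Z,
          while the Y camera (looking along the Y axis) resolves X and Z.
          scale is an assumed pixels-to-length calibration factor.
          """
          y_from_x_cam, z_from_x_cam = x_cam_point
          x_from_y_cam, z_from_y_cam = y_cam_point
          z = 0.5 * (z_from_x_cam + z_from_y_cam)   # average the shared coordinate
          return (scale * x_from_y_cam, scale * y_from_x_cam, scale * z)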

  6. A system for tracking and recognizing pedestrian faces using a network of loosely coupled cameras

    NASA Astrophysics Data System (ADS)

    Gagnon, L.; Laliberté, F.; Foucher, S.; Branzan Albu, A.; Laurendeau, D.

    2006-05-01

    A face recognition module has been developed for an intelligent multi-camera video surveillance system. The module can recognize a pedestrian face in terms of six basic emotions and the neutral state. Face and facial feature detection (eyes, nasal root, nose, and mouth) is first performed using cascades of boosted classifiers. These features are used to normalize the pose and dimensions of the face image. Gabor filters are then sampled on a regular grid covering the face image to build a facial feature vector that feeds a nearest-neighbor classifier with a cosine similarity measure for facial expression interpretation and face model construction. A graphical user interface allows the user to adjust the module parameters.
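
    A minimal sketch of the final classification step, assuming the sampled Gabor responses have already been concatenated into feature vectors; the gallery structure and expression labels are invented for illustration.

      import numpy as np

      def cosine_similarity(a, b):
          """Cosine similarity between two feature vectors (e.g., Gabor responses)."""
          return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

      def classify_expression(query_vec, gallery):
          """Nearest-neighbour label for a query feature vector.

          gallery maps an expression label (e.g. 'happy', 'neutral') to a list of
          stored feature vectors; this structure is an assumption for the sketch.
          """
          best_label, best_sim = None, -1.0
          for label, vectors in gallery.items():
              for v in vectors:
                  sim = cosine_similarity(query_vec, v)
                  if sim > best_sim:
                      best_label, best_sim = label, sim
          return best_label, best_sim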

  7. LPT. Shield test facility test building interior (TAN646). Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Shield test facility test building interior (TAN-646). Camera facing south. Distant pool contained EBOR reactor; near pool was intended for fuel rod storage. Other post-1970 activity equipment remains in pool. INEEL negative no. HD-40-9-4 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  8. LOFT. Interior, control room in control building (TAN630). Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Interior, control room in control building (TAN-630). Camera facing north. Sign says "This control console is partially active. Do not operate any switch handle without authorization." Date: May 2004. INEEL negative no. HD-39-14-3 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  9. PBF Cooling Tower contextual view. Camera facing southwest. West wing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower contextual view. Camera facing southwest. West wing and north facade (rear) of Reactor Building (PER-620) is at left; Cooling Tower to right. Photographer: Kirsh. Date: November 2, 1970. INEEL negative no. 70-4913 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  10. PBF Reactor Building (PER620). Camera faces north into highbay/reactor pit ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera faces north into high-bay/reactor pit area. Inside form for reactor enclosure is in place. Photographer: John Capek. Date: March 15, 1967. INEEL negative no. 67-1769 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  11. PBF Reactor Building (PER620). Camera facing south end of high ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera facing south end of high bay. Vertical-lift door is being installed. Later, pneumatic seals will be installed around door. Photographer: Kirsh. Date: September 31, 1968. INEEL negative no. 68-3176 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  12. ETR CRITICAL FACILITY (ETRCF), TRA654. SOUTH SIDE. CAMERA FACING NORTH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR CRITICAL FACILITY (ETR-CF), TRA-654. SOUTH SIDE. CAMERA FACING NORTH AND ROLL-UP DOOR. ORIGINAL SIDING HAS BEEN REPLACED WITH STUCCO-LIKE MATERIAL. INL NEGATIVE NO. HD46-40-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  13. PBF Cooling Tower. Camera facing southwest. Round piers will support ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower. Camera facing southwest. Round piers will support Tower's wood "fill" or "packing." Black-topped stack in far distance is at Idaho Chemical Processing Plant. Photographer: John Capek. Date: October 16, 1968. INEEL negative no. 68-4097 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  14. ADM. Change House (TAN606) as completed. Camera facing northerly. Note ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ADM. Change House (TAN-606) as completed. Camera facing northerly. Note proximity to shielding berm. Part of hot shop (A&M Building, TAN-607) at left of view beyond berm. Date: October 29, 1954. INEEL negative no. 12705 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  15. PBF Reactor Building (PER620). Cubicle 10 detail. Camera facing west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Cubicle 10 detail. Camera facing west toward brick shield wall. Valve stems against wall penetrate through east wall of cubicle. Photographer: John Capek. Date: August 19, 1970. INEEL negative no. 70-3469 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  16. LPT. Low power test control building (TAN641) interior. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Low power test control building (TAN-641) interior. Camera facing northeast at what remains of control room console. Cut in wall at right of view shows west wall of northern test cell. INEEL negative no. HD-40-4-4 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  17. Cooperative multisensor system for real-time face detection and tracking in uncontrolled conditions

    NASA Astrophysics Data System (ADS)

    Marchesotti, Luca; Piva, Stefano; Turolla, Andrea; Minetti, Deborah; Regazzoni, Carlo S.

    2005-03-01

    The presented work describes an innovative architecture for multi-sensor distributed video surveillance applications. The aim of the system is to track moving objects in outdoor environments with a cooperative strategy exploiting two video cameras. The system also focuses its attention on the faces of detected pedestrians, collecting snapshot frames of face images by segmenting and tracking them over time at different resolutions. The system is designed to employ two video cameras in a cooperative client/server structure: the first camera monitors the entire area of interest and detects the moving objects using change detection techniques. The detected objects are tracked over time and their position is indicated on a map representing the monitored area. The objects' coordinates are sent to the server sensor in order to point its zooming optics towards the moving object. The second camera tracks the objects at high resolution. Like the client camera, this sensor is calibrated, and the position of the object detected in the image-plane reference system is translated into coordinates referred to the same area map. In the map's common reference system, data fusion techniques are applied to achieve a more precise and robust estimation of the objects' tracks and to perform face detection and tracking. The work's novelty and strength reside in the cooperative multi-sensor approach, in the high-resolution long-distance tracking, and in the automatic collection of biometric data such as a person's face clip for recognition purposes.

  18. Performance analysis of visual tracking algorithms for motion-based user interfaces on mobile devices

    NASA Astrophysics Data System (ADS)

    Winkler, Stefan; Rangaswamy, Karthik; Tedjokusumo, Jefry; Zhou, ZhiYing

    2008-02-01

    Determining the self-motion of a camera is useful for many applications. A number of visual motion-tracking algorithms have been developed to date, each with its own advantages and restrictions. Some have also made their foray into the mobile world, powering augmented reality-based applications on phones with built-in cameras. In this paper, we compare the performance of three feature- or landmark-guided motion tracking algorithms, namely marker-based tracking with MXRToolkit, face tracking based on CamShift, and MonoSLAM. We analyze and compare the complexity, accuracy, sensitivity, robustness, and restrictions of each of the above methods. Our performance tests are conducted over two stages: the first stage uses video sequences created with simulated camera movements along the six degrees of freedom in order to compare tracking accuracy, while the second stage analyzes the robustness of the algorithms by testing manipulative factors such as image scaling and frame-skipping.

  19. Minimum Requirements for Taxicab Security Cameras.

    PubMed

    Zeng, Shengke; Amandus, Harlan E; Amendola, Alfred A; Newbraugh, Bradley H; Cantis, Douglas M; Weaver, Darlene

    2014-07-01

    The homicide rate of the taxicab industry is 20 times greater than that of all workers. A NIOSH study showed that cities with taxicab security cameras experienced a significant reduction in taxicab driver homicides. Minimum technical requirements and a standard test protocol for taxicab security cameras for effective facial identification were determined. The study took more than 10,000 photographs of human-face charts in a simulated taxicab with various photographic resolutions, dynamic ranges, lens distortions, and motion blurs under various light and cab-seat conditions. Thirteen volunteer photograph evaluators evaluated these face photographs and voted on the minimum technical requirements for taxicab security cameras. Five worst-case-scenario photographic image quality thresholds were suggested: XGA-format resolution, highlight dynamic range of 1 EV, twilight dynamic range of 3.3 EV, lens distortion of 30%, and shutter speed of 1/30 second. These minimum requirements will help taxicab regulators and fleets identify effective taxicab security cameras, and help taxicab security camera manufacturers improve camera facial identification capability.

  20. 44. ARAIII Fuel oil tank ARA710. Camera facing west. Perimeter ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    44. ARA-III Fuel oil tank ARA-710. Camera facing west. Perimeter fence at left side of view. Gable-roofed building beyond tank on right is ARA-622. Gable-roofed building beyond tank on left is ARA-610. Ineel photo no. 3-16. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  1. MTR WING A, TRA604. SOUTH SIDE. CAMERA FACING NORTH. THIS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR WING A, TRA-604. SOUTH SIDE. CAMERA FACING NORTH. THIS VIEW TYPIFIES TENDENCY FOR EXPANSIONS TO TAKE THE FORM OF PROJECTIONS AND INFILL USING AVAILABLE YARD SPACES. INL NEGATIVE NO. HD47-44-3. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  2. PBF Cooling Tower (PER720), and Auxiliary Building (PER624). Camera faces ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower (PER-720), and Auxiliary Building (PER-624). Camera faces north to show south facades. Oblong vertical structure at left of center is weather shield for stairway. Date: August 2003. INEEL negative no. HD-35-10-4 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  3. REACTOR SERVICE BUILDING, TRA635. CROWDED MOCKUP AREA. CAMERA FACES EAST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    REACTOR SERVICE BUILDING, TRA-635. CROWDED MOCK-UP AREA. CAMERA FACES EAST. PHOTOGRAPHER'S NOTE SAYS "PICTURE REQUESTED BY IDO IN SUPPORT OF FY '58 BUILDING PROJECTS." INL NEGATIVE NO. 56-3025. R.G. Larsen, Photographer, 9/13/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  4. WATER PUMP HOUSE, TRA619, PUMP INSTALLATION. CAMERA FACING NORTHEAST CORNER. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    WATER PUMP HOUSE, TRA-619, PUMP INSTALLATION. CAMERA FACING NORTHEAST CORNER. CARD IN LOWER RIGHT WAS INSERTED BY INL PHOTOGRAPHER TO COVER AN OBSOLETE SECURITY RESTRICTION PRINTED ON THE ORIGINAL NEGATIVE. INL NEGATIVE NO. 3998. Unknown Photographer, 12/28/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  5. LPT. Low power test (TAN640) interior. Basement level. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Low power test (TAN-640) interior. Basement level. Camera facing north. Cable trays and conduit cross tunnel between critical experiment cell and critical experiment control room. Construction 93% complete. Photographer: Jack L. Anderson. Date: October 23, 1957. INEEL negative no. 57-5339 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  6. Construct and face validity of a virtual reality-based camera navigation curriculum.

    PubMed

    Shetty, Shohan; Panait, Lucian; Baranoski, Jacob; Dudrick, Stanley J; Bell, Robert L; Roberts, Kurt E; Duffy, Andrew J

    2012-10-01

    Camera handling and navigation are essential skills in laparoscopic surgery. Surgeons rely on camera operators, usually the least experienced members of the team, for visualization of the operative field. Essential skills for camera operators include maintaining orientation, an effective horizon, appropriate zoom control, and a clean lens. Virtual reality (VR) simulation may be a useful adjunct to developing camera skills in a novice population. No standardized VR-based camera navigation curriculum is currently available. We developed and implemented a novel curriculum on the LapSim VR simulator platform for our residents and students. We hypothesize that our curriculum will demonstrate construct and face validity in our trainee population, distinguishing levels of laparoscopic experience as part of a realistic training curriculum. Overall, 41 participants with various levels of laparoscopic training completed the curriculum. Participants included medical students, surgical residents (Postgraduate Years 1-5), fellows, and attendings. We stratified subjects into three groups (novice, intermediate, and advanced) based on previous laparoscopic experience. We assessed face validity with a questionnaire. The proficiency-based curriculum consists of three modules: camera navigation, coordination, and target visualization using 0° and 30° laparoscopes. Metrics include time, target misses, drift, path length, and tissue contact. We analyzed data using analysis of variance and Student's t-test. We noted significant differences in repetitions required to complete the curriculum: 41.8 for novices, 21.2 for intermediates, and 11.7 for the advanced group (P < 0.05). In the individual modules, coordination required 13.3 attempts for novices, 4.2 for intermediates, and 1.7 for the advanced group (P < 0.05). Target visualization required 19.3 attempts for novices, 13.2 for intermediates, and 8.2 for the advanced group (P < 0.05). Participants believe that training improves camera handling skills (95%), is relevant to surgery (95%), and is a valid training tool (93%). Graphics (98%) and realism (93%) were highly regarded. The VR-based camera navigation curriculum demonstrates construct and face validity for our training population. Camera navigation simulation may be a valuable tool that can be integrated into training protocols for residents and medical students during their surgery rotations. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Collaborative real-time scheduling of multiple PTZ cameras for multiple object tracking in video surveillance

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Che; Huang, Chung-Lin

    2013-03-01

    This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. Three main concerns of the algorithm are (1) imagery of the human object's face for biometric purposes, (2) optimal video quality of the human objects, and (3) minimum hand-off time. Here, we define an objective function based on the expected capture conditions such as the camera-subject distance, pan/tilt angles of capture, face visibility, and others. Such an objective function serves to effectively balance the number of captures per subject and the quality of captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
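
    A toy version of such an objective function is sketched below; the terms, weights, and normalizations are assumptions chosen for illustration, not the authors' formulation.

      def capture_score(distance_m, pan_deg, tilt_deg, face_visible,
                        weights=(0.4, 0.2, 0.2, 0.2), max_distance_m=30.0):
          """Score a candidate PTZ capture of a subject (higher is better)."""
          w_d, w_p, w_t, w_f = weights
          d_term = max(0.0, 1.0 - distance_m / max_distance_m)    # closer is better
          p_term = 1.0 - min(abs(pan_deg), 90.0) / 90.0           # head-on pan preferred
          t_term = 1.0 - min(abs(tilt_deg), 45.0) / 45.0          # level tilt preferred
          f_term = 1.0 if face_visible else 0.0                   # reward a visible face
          return w_d * d_term + w_p * p_term + w_t * t_term + w_f * f_term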

  8. PBF Cooling Tower (PER720). Camera faces east to show west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower (PER-720). Camera faces east to show west facade. Sloped (louvered) panels in this and opposite facade allow air to enter tower and cool water falling on splash bars within. Date: August 2003. INEEL negative no. HD-35-10-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  9. ETR HEAT EXCHANGER BUILDING, TRA644. EAST SIDE. CAMERA FACING WEST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR HEAT EXCHANGER BUILDING, TRA-644. EAST SIDE. CAMERA FACING WEST. NOTE COURSE OF PIPE FROM GROUND AND FOLLOWING ROOF OF BUILDING. MTR BUILDING IN BACKGROUND AT RIGHT EDGE OF VIEW. INL NEGATIVE NO. HD46-36-3. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  10. PBF Reactor Building (PER620). Camera facing north toward south facade. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera facing north toward south facade. Note west-wing siding on concrete block; high-bay siding of metal. Excavation and forms for signal and cable trenches proceed from building. Photographer: Kirsh. Date: August 20, 1968. INEEL negative no. 68-3332 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  11. LOFT, TAN650. Camera facing southeast. From left to right: stack ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT, TAN-650. Camera facing southeast. From left to right: stack in distance, pre-amp wing, dome, north side of loft "service building." Note poured concrete wall of pre-amp wing on lower section; pumice block above. Date: May 2004. INEEL negative no. HD-39-19-3 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  12. PBF Reactor Building (PER620). Camera is facing east and down ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera is facing east and down into canal and storage pit for fuel rod assemblies. Stainless steel liner is being applied, temporarily covered with plywood for protection. Photographer: John Capek. Date: August 29, 1969. INEEL negative no. 69-4641 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  13. A&M. Hot liquid waste treatment building (TAN616). Camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616). Camera facing southwest. Oblique view of east and north walls. Note three corrugated pipes at lower left indicating location of underground hot waste storage tanks. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-1-4 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  14. LOFT. Containment building (TAN650) detail. Camera facing east. Service building ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Containment building (TAN-650) detail. Camera facing east. Service building corner is at left of view above personnel access. Round feature at left of dome is tank that will contain borated water. Metal stack at right of view. Date: 1973. INEEL negative no. 73-1085 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  15. PBF Reactor Building (PER620). Cubicle 10. Camera facing southeast. Loop ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Cubicle 10. Camera facing southeast. Loop pressurizer on right. Other equipment includes loop strainer, control valves, loop piping, pressurizer interchanger, and cleanup system cooler. High-density shielding brick walls. Photographer: Kirsh. Date: November 2, 1970. INEEL negative no. 70-4908 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  16. Intelligent person identification system using stereo camera-based height and stride estimation

    NASA Astrophysics Data System (ADS)

    Ko, Jung-Hwan; Jang, Jae-Hun; Kim, Eun-Soo

    2005-05-01

    In this paper, a stereo camera-based intelligent person identification system is suggested. In the proposed method, the face area of the moving target person is extracted from the left image of the input stereo image pair by thresholding in the YCbCr color model; by correlating this segmented face area with the right input image, the location coordinates of the target face can be acquired, and these values are then used to control the pan/tilt system through a modified PID-based recursive controller. Also, by using the geometric parameters between the target face and the stereo camera system, the vertical distance between the target and the stereo camera system can be calculated through a triangulation method. Using this calculated vertical distance and the pan and tilt angles, the target's real position in world space can be acquired, and from it the target's height and stride values can finally be extracted. Experiments with video images of 16 moving persons show that a person could be identified with these extracted height and stride parameters.
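
    A simplified sketch of the geometry described, using stereo triangulation for the camera-to-target distance and the pan/tilt angles for a world-space position. The pinhole and ideal-gimbal assumptions, and all parameter names, are ours rather than the paper's.

      import math

      def target_position(disparity_px, focal_px, baseline_m, pan_deg, tilt_deg):
          """Rough world-space position of a tracked face from stereo and pan/tilt."""
          depth = focal_px * baseline_m / max(disparity_px, 1e-6)  # triangulated distance
          pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
          x = depth * math.cos(tilt) * math.sin(pan)
          y = depth * math.cos(tilt) * math.cos(pan)
          z = depth * math.sin(tilt)                                # vertical component
          return x, y, z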

  17. A multicenter prospective cohort study on camera navigation training for key user groups in minimally invasive surgery.

    PubMed

    Graafland, Maurits; Bok, Kiki; Schreuder, Henk W R; Schijven, Marlies P

    2014-06-01

    Untrained laparoscopic camera assistants in minimally invasive surgery (MIS) may cause a suboptimal view of the operating field, thereby increasing the risk of errors. Camera navigation is often performed by the least experienced member of the operating team, such as inexperienced surgical residents, operating room nurses, and medical students. Operating room nurses and medical students are currently not included as key user groups in structured laparoscopic training programs. A new virtual reality laparoscopic camera navigation (LCN) module was specifically developed for these key user groups. This multicenter prospective cohort study assesses the face validity and construct validity of the LCN module on the Simendo virtual reality simulator. Face validity was assessed through a questionnaire on resemblance to reality and perceived usability of the instrument among experts and trainees. Construct validity was assessed by comparing scores of groups with different levels of experience on outcome parameters of speed and movement proficiency. The results show uniform and positive evaluation of the LCN module among expert users and trainees, signifying face validity. The expert and intermediate-experience groups performed significantly better in task time and camera stability over three repetitions than the less experienced user groups (P < .007). Comparison of learning curves showed significant improvement of proficiency in time and camera stability for all groups over three repetitions (P < .007). The results of this study show face validity and construct validity of the LCN module. The module is suitable for use in training curricula for operating room nurses and novice surgical trainees, aimed at improving team performance in minimally invasive surgery. © The Author(s) 2013.

  18. Application of real-time single camera SLAM technology for image-guided targeting in neurosurgery

    NASA Astrophysics Data System (ADS)

    Chang, Yau-Zen; Hou, Jung-Fu; Tsao, Yi Hsiang; Lee, Shih-Tseng

    2012-10-01

    In this paper, we propose an application of augmented reality technology for targeting tumors or anatomical structures inside the skull. The application is a combination of MonoSLAM (Single Camera Simultaneous Localization and Mapping) and computer graphics. A stereo vision system is developed to construct geometric data of the human face for registration with CT images. Reliability and accuracy of the application are enhanced by the use of fiduciary markers fixed to the skull. MonoSLAM keeps track of the current location of the camera with respect to an augmented reality (AR) marker using the extended Kalman filter. The fiduciary markers provide a reference when the AR marker is invisible to the camera. The relationship between the markers on the face and the augmented reality marker is obtained through a registration procedure performed by the stereo vision system and is updated on-line. A commercially available Android-based tablet PC equipped with a 320×240 front-facing camera was used for implementation. The system is able to provide a live view of the patient overlaid by solid models of tumors or anatomical structures, as well as the missing part of the tool inside the skull.

  19. Unconstrained face detection and recognition based on RGB-D camera for the visually impaired

    NASA Astrophysics Data System (ADS)

    Zhao, Xiangdong; Wang, Kaiwei; Yang, Kailun; Hu, Weijian

    2017-02-01

    It is highly important for visually impaired people (VIP) to be aware of the human beings around them, so correctly recognizing people in VIP-assisting apparatus provides great convenience. However, in classical face recognition technology, the faces used in training and prediction are usually frontal, and acquiring face images requires subjects to get close to the camera so that a frontal pose and adequate illumination are guaranteed. Meanwhile, face labels are defined manually rather than automatically, and labels belonging to different classes usually need to be input one by one. These constraints limit assistive applications for VIP in practice. In this article, a face recognition system for unconstrained environments is proposed. Specifically, it does not require the frontal pose or uniform illumination demanded by previous algorithms. The contributions of this work lie in three aspects. First, real-time frontal-face synthesis is implemented, and the synthesized frontal faces help to increase the recognition rate, as the experimental results confirm. Second, an RGB-D camera plays a significant role in the system: both color and depth information are used to achieve real-time face tracking, which not only raises the detection rate but also makes it possible to label faces automatically. Finally, neural networks are used to train the face recognition system, with Principal Component Analysis (PCA) applied to pre-refine the input data. The system is expected to help VIP become familiar with others and to let them recognize people once the system has been sufficiently trained.

  20. The ideal subject distance for passport pictures.

    PubMed

    Verhoff, Marcel A; Witzel, Carsten; Kreutz, Kerstin; Ramsthaler, Frank

    2008-07-04

    In an age of global combat against terrorism, the recognition and identification of people on document images is of increasing significance. Experiments and calculations have shown that the camera-to-subject distance - not the focal length of the lens - can have a significant effect on facial proportions. Modern passport pictures should be able to function as a reference image for automatic and manual picture comparisons. This requires a defined subject distance. It is completely unclear which subject distance, in the taking of passport photographs, is ideal for the recognition of the actual person. We show here that the camera-to-subject distance that is perceived as ideal is dependent on the face being photographed, even if the distance of 2 m was most frequently preferred. So far the problem of the ideal camera-to-subject distance for faces has only been approached through technical calculations. We have, for the first time, answered this question experimentally with a double-blind experiment. Even if there is apparently no ideal camera-to-subject distance valid for every face, 2 m can be proposed as ideal for the taking of passport pictures. The first step would actually be the determination of a camera-to-subject distance for the taking of passport pictures within the standards. From an anthropological point of view it would be interesting to find out which facial features lead to a preference for a shorter camera-to-subject distance and which lead to a preference for a longer one.

  1. Minimum Requirements for Taxicab Security Cameras*

    PubMed Central

    Zeng, Shengke; Amandus, Harlan E.; Amendola, Alfred A.; Newbraugh, Bradley H.; Cantis, Douglas M.; Weaver, Darlene

    2015-01-01

    Problem: The homicide rate of the taxicab industry is 20 times greater than that of all workers. A NIOSH study showed that cities with taxicab security cameras experienced a significant reduction in taxicab driver homicides. Methods: Minimum technical requirements and a standard test protocol for taxicab security cameras for effective facial identification were determined. The study took more than 10,000 photographs of human-face charts in a simulated taxicab with various photographic resolutions, dynamic ranges, lens distortions, and motion blurs under various light and cab-seat conditions. Thirteen volunteer photograph evaluators evaluated these face photographs and voted on the minimum technical requirements for taxicab security cameras. Results: Five worst-case-scenario photographic image quality thresholds were suggested: XGA-format resolution, highlight dynamic range of 1 EV, twilight dynamic range of 3.3 EV, lens distortion of 30%, and shutter speed of 1/30 second. Practical Applications: These minimum requirements will help taxicab regulators and fleets identify effective taxicab security cameras, and help taxicab security camera manufacturers improve camera facial identification capability. PMID:26823992

  2. ETR COMPRESSOR BUILDING, TRA643. CAMERA FACES NORTHEAST. WATER HEAT EXCHANGER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR COMPRESSOR BUILDING, TRA-643. CAMERA FACES NORTHEAST. WATER HEAT EXCHANGER IS IN LEFT FOREGROUND. A PARTIALLY ASSEMBLED PLANT AIR CONDITIONER IS AT CENTER. WORKERS AT RIGHT ASSEMBLE 4000-HORSEPOWER COMPRESSOR DRIVE MOTOR. INL NEGATIVE NO. 56-3714. R.G. Larsen, Photographer, 11/13/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  3. PBF Reactor Building (PER620). Camera faces south toward verticallift door, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera faces south toward vertical-lift door, which is closed. Note crane and its trolley positioned near door; its rails along side walls. Reactor vessel and lifting beams are positioned above reactor pit. Photographer: John Capek. Date: January 9, 1970. INEEL negative no. 70-132 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  4. MTR BUILDING INTERIOR, TRA603. BASEMENT. CAMERA IN WEST CORRIDOR FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR BUILDING INTERIOR, TRA-603. BASEMENT. CAMERA IN WEST CORRIDOR FACING SOUTH. FREIGHT ELEVATOR IS AT RIGHT OF VIEW. AT CENTER VIEW IS MTR VAULT NO. 1, USED TO STORE SPECIAL OR FISSIONABLE MATERIALS. INL NEGATIVE NO. HD46-6-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  5. Human Age Estimation Method Robust to Camera Sensor and/or Face Movement

    PubMed Central

    Nguyen, Dat Tien; Cho, So Ra; Pham, Tuyen Danh; Park, Kang Ryoung

    2015-01-01

    Human age information can be employed in many useful real-life applications, such as customer service systems, automatic vending machines, entertainment, etc. In order to obtain age information, image-based age estimation systems have been developed using information from the human face. However, current age estimation systems are limited by various factors such as camera motion, optical blurring, facial expressions, gender, etc. Motion blurring is usually introduced into face images by movement of the camera sensor and/or movement of the face during image acquisition. Therefore, the facial features in captured images can be distorted according to the amount of motion, which degrades the performance of age estimation systems. In this paper, the problem caused by motion blurring is addressed and a solution is proposed in order to make age estimation systems robust to the effects of motion blurring. Experimental results show that our method enhances age estimation performance compared with systems that do not employ it. PMID:26334282

  6. CONTEXTUAL AERIAL VIEW OF "COLD" NORTH HALF OF MTR COMPLEX. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    CONTEXTUAL AERIAL VIEW OF "COLD" NORTH HALF OF MTR COMPLEX. CAMERA FACING EASTERLY. FOREGROUND CORNER CONTAINS OIL STORAGE TANKS. WATER TANKS AND WELL HOUSES ARE BEYOND THEM TO THE LEFT. LARGE LIGHT-COLORED BUILDING IN CENTER OF VIEW IS STEAM PLANT. DEMINERALIZER AND WATER STORAGE TANK ARE BEYOND. SIX-CELL COOLING TOWER AND ITS PUMP HOUSE ARE ABOVE IT IN VIEW. SERVICE BUILDINGS INCLUDING CANTEEN ARE ON NORTH SIDE OF ROAD. "EXCLUSION" AREA IS BEYOND ROAD. COMPARE LOCATION OF EXCLUSION-AREA GATE WITH PHOTO ID-33-G-202. INL NEGATIVE NO. 3608. Unknown Photographer, 10/30/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. Alternative images for perpendicular parking : a usability test of a multi-camera parking assistance system.

    DOT National Transportation Integrated Search

    2004-10-01

    The parking assistance system evaluated consisted of four outward facing cameras whose images could be presented on a monitor on the center console. The images presented varied in the location of the virtual eye point of the camera (the height above ...

  8. A&M. A&M building (TAN607). Camera facing east. From left to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. A&M building (TAN-607). Camera facing east. From left to right, pool section, hot shop, cold shop, and machine shop. Biparting doors to hot shop are in open position behind shroud. Four rail tracks lead to hot shop and cold shop. Date: August 20, 1954. INEEL negative no. 11706 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  9. Cross-Platform Mobile Application Development: A Pattern-Based Approach

    DTIC Science & Technology

    2012-03-01

    Additionally, developers should be aware of different hardware capabilities such as external SD cards and forward facing cameras. Finally, each... applications are written. ... iTunes library, allowing the user to update software and manage content on each device. However, in iOS5, the PC Free feature removes this constraint

  10. PROCESS WATER BUILDING, TRA605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS WATER BUILDING AND ETR STACK ARE IN LEFT HALF OF VIEW. TRA-666 IS NEAR CENTER, ABUTTED BY SECURITY BUILDING; TRA-626, AT RIGHT EDGE OF VIEW BEHIND BUS. INL NEGATIVE NO. HD46-34-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. 1. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY. CAMERA FACING NORTHEAST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY. CAMERA FACING NORTHEAST. ON RIGHT OF VIEW IS PART OF EARTH/GRAVEL SHIELDING FOR BIN SET. AERIAL STRUCTURE MOUNTED ON POLES IS PNEUMATIC TRANSFER SYSTEM FOR DELIVERY OF SAMPLES BEING SENT FROM NEW WASTE CALCINING FACILITY TO THE CPP REMOTE ANALYTICAL LABORATORY. INEEL PROOF NUMBER HD-17-1. - Idaho National Engineering Laboratory, Old Waste Calcining Facility, Scoville, Butte County, ID

  12. MTR BUILDING, TRA603. EAST SIDE. CAMERA FACING WEST. CORRUGATED IRON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR BUILDING, TRA-603. EAST SIDE. CAMERA FACING WEST. CORRUGATED IRON BUILDING MARKED WITH "X" IS TRA-651. TRA-626, TO ITS RIGHT, HOUSED COMPRESSOR EQUIPMENT FOR THE AIRCRAFT NUCLEAR PROPULSION PROGRAM. LATER, IT WAS USED FOR STORAGE. INL NEGATIVE NO. HD46-42-4. Mike Crane, Photographer, April 2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  13. ENGINEERING TEST REACTOR (ETR) BUILDING, TRA642. CONTEXTUAL VIEW, CAMERA FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ENGINEERING TEST REACTOR (ETR) BUILDING, TRA-642. CONTEXTUAL VIEW, CAMERA FACING EAST. VERTICAL METAL SIDING. ROOF IS SLIGHTLY ELEVATED AT CENTER LINE FOR DRAINAGE. WEST SIDE OF ETR COMPRESSOR BUILDING, TRA-643, PROJECTS TOWARD LEFT AT FAR END OF ETR BUILDING. INL NEGATIVE NO. HD46-37-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  14. A multi-camera system for real-time pose estimation

    NASA Astrophysics Data System (ADS)

    Savakis, Andreas; Erhard, Matthew; Schimmel, James; Hnatow, Justin

    2007-04-01

    This paper presents a multi-camera system that performs face detection and pose estimation in real-time and may be used for intelligent computing within a visual sensor network for surveillance or human-computer interaction. The system consists of a Scene View Camera (SVC), which operates at a fixed zoom level, and an Object View Camera (OVC), which continuously adjusts its zoom level to match objects of interest. The SVC is set to survey the whole field of view. Once a region has been identified by the SVC as a potential object of interest, e.g. a face, the OVC zooms in to locate specific features. In this system, face candidate regions are selected based on skin color and face detection is accomplished using a Support Vector Machine classifier. The locations of the eyes and mouth are detected inside the face region using neural network feature detectors. Pose estimation is performed based on a geometrical model, where the head is modeled as a spherical object that rotates about the vertical axis. The triangle formed by the mouth and eyes defines a vertical plane that intersects the head sphere. By projecting the eyes-mouth triangle onto a two-dimensional viewing plane, equations were obtained that describe the change in its angles as the yaw pose angle increases. These equations are then combined and used for efficient pose estimation. The system achieves real-time performance for live video input. Testing results assessing system performance are presented for both still images and video.
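
    A heavily simplified sketch of yaw estimation under a spherical-head model like the one described: as the head turns about the vertical axis, the projected eyes-mouth triangle shifts horizontally relative to the head outline. The arcsin mapping and its inputs are illustrative assumptions, not the paper's derived equations.

      import math

      def estimate_yaw_deg(face_center_x, face_half_width, eyes_mouth_centroid_x):
          """Approximate yaw angle (degrees) from the horizontal offset of the
          eyes-mouth centroid within the detected face region."""
          offset = (eyes_mouth_centroid_x - face_center_x) / max(face_half_width, 1e-6)
          offset = max(-1.0, min(1.0, offset))      # clamp to a valid arcsin argument
          return math.degrees(math.asin(offset))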

  15. Why are faces denser in the visual experiences of younger than older infants?

    PubMed Central

    Jayaraman, Swapnaa; Fausey, Caitlin M.; Smith, Linda B.

    2017-01-01

    Recent evidence from studies using head cameras suggests that the frequency of faces directly in front of infants declines over the first year and a half of life, a result that has implications for the development of and evolutionary constraints on face processing. Two experiments tested two opposing hypotheses about this observed age-related decline in the frequency of faces in infant views. By the People-input hypothesis, there are more faces in view for younger infants because people are more often physically in front of younger than older infants. This hypothesis predicts that not just faces but views of other body parts will decline with age. By the Face-input hypothesis, the decline is strictly about faces, not people or other body parts in general. Two experiments, one using a time-sampling method (84 infants 3 to 24 months in age) and the other analyses of head camera images (36 infants 1 to 24 months) provide strong support for the Face-input hypothesis. The results suggest developmental constraints on the environment that ensure faces are prevalent early in development. PMID:28026190

  16. Development of camera technology for monitoring nests. Chapter 15

    Treesearch

    W. Andrew Cox; M. Shane Pruett; Thomas J. Benson; Scott J. Chiavacci; Frank R., III Thompson

    2012-01-01

    Photo and video technology has become increasingly useful in the study of avian nesting ecology. However, researchers interested in using camera systems are often faced with insufficient information on the types and relative advantages of available technologies. We reviewed the literature for studies of nests that used cameras and summarized them based on study...

  17. PBF Reactor Building (PER620). Camera faces south along west wall. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera faces south along west wall. Gap between native lava rock and concrete basement walls is being backfilled and compacted. Wire mesh protects workers from falling rock. Note penetrations for piping that will carry secondary coolant water to Cooling Tower. Photographer: Holmes. Date: June 15, 1967. INEEL negative no. 67-3665 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  18. ETR BUILDING, TRA642, INTERIOR. BASEMENT. CAMERA FACES SOUTH AND LOOKS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. BASEMENT. CAMERA FACES SOUTH AND LOOKS AT DOOR TO M-3 CUBICLE. CUBICLE WALLS ARE MADE OF LEAD SHIELDING BRICKS. VALVE HANDLES AND STEMS PERTAIN TO SAMPLING. METAL SHIELDING DOOR. NOTE GLOVE BOX TO RIGHT OF CUBICLE DOOR. INL NEGATIVE NO. HD-46-21-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  19. ETR HEAT EXCHANGER BUILDING, TRA644. SOUTH SIDE. CAMERA FACING NORTH. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR HEAT EXCHANGER BUILDING, TRA-644. SOUTH SIDE. CAMERA FACING NORTH. NOTE POURED CONCRETE WALLS. ETR IS AT LEFT OF VIEW. NOTE DRIVEWAY INSET AT RIGHT FORMED BY DEMINERALIZER WING AT RIGHT. SOUTHEAST CORNER OF ETR, TRA-642, IN VIEW AT UPPER LEFT. INL NEGATIVE NO. HD46-36-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  20. 9. INTERIOR, WAREHOUSE SPACE AT EAST END OF BUILDING, CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. INTERIOR, WAREHOUSE SPACE AT EAST END OF BUILDING, CAMERA FACING NORTHEAST. - U.S. Coast Guard Support Center Alameda, Warehouse, Spencer Road & Icarrus Drive, Coast Guard Island, Alameda, Alameda County, CA

  1. Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas

    PubMed Central

    2018-01-01

    This paper presents a method of fusing the ego-motion of a robot or a land vehicle estimated from an upward-facing camera with Global Navigation Satellite System (GNSS) signals for navigation purposes in urban environments. A sky-pointing camera is mounted on the top of a car and synchronized with a GNSS receiver. The advantages of this configuration are two-fold: firstly, for the GNSS signals, the upward-facing camera will be used to classify the acquired images into sky and non-sky (also known as segmentation). A satellite falling into the non-sky areas (e.g., buildings, trees) will be rejected and not considered for the final position solution computation. Secondly, the sky-pointing camera (with a field of view of about 90 degrees) is helpful for urban area ego-motion estimation in the sense that it does not see most of the moving objects (e.g., pedestrians, cars) and thus is able to estimate the ego-motion with fewer outliers than is typical with a forward-facing camera. The GNSS and visual information systems are tightly-coupled in a Kalman filter for the final position solution. Experimental results demonstrate the ability of the system to provide satisfactory navigation solutions and better accuracy than the GNSS-only and the loosely-coupled GNSS/vision, 20 percent and 82 percent (in the worst case) respectively, in a deep urban canyon, even in conditions with fewer than four GNSS satellites. PMID:29673230

  2. Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas.

    PubMed

    Gakne, Paul Verlaine; O'Keefe, Kyle

    2018-04-17

    This paper presents a method of fusing the ego-motion of a robot or a land vehicle estimated from an upward-facing camera with Global Navigation Satellite System (GNSS) signals for navigation purposes in urban environments. A sky-pointing camera is mounted on the top of a car and synchronized with a GNSS receiver. The advantages of this configuration are two-fold: firstly, for the GNSS signals, the upward-facing camera will be used to classify the acquired images into sky and non-sky (also known as segmentation). A satellite falling into the non-sky areas (e.g., buildings, trees) will be rejected and not considered for the final position solution computation. Secondly, the sky-pointing camera (with a field of view of about 90 degrees) is helpful for urban area ego-motion estimation in the sense that it does not see most of the moving objects (e.g., pedestrians, cars) and thus is able to estimate the ego-motion with fewer outliers than is typical with a forward-facing camera. The GNSS and visual information systems are tightly-coupled in a Kalman filter for the final position solution. Experimental results demonstrate the ability of the system to provide satisfactory navigation solutions and better accuracy than the GNSS-only and the loosely-coupled GNSS/vision, 20 percent and 82 percent (in the worst case) respectively, in a deep urban canyon, even in conditions with fewer than four GNSS satellites.
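
    A minimal sketch of the satellite-rejection idea: project a satellite's azimuth/elevation into the upward-facing image under an assumed equidistant fisheye model and test the segmented sky mask. The projection model and parameter names are assumptions, not the authors' implementation.

      import math

      def satellite_in_sky(sky_mask, azimuth_deg, elevation_deg, fov_deg=90.0):
          """Return True if the satellite's line of sight falls on a sky pixel.

          sky_mask is a 2-D boolean array (True = sky) from image segmentation;
          the equidistant projection below is an assumed camera model.
          """
          h, w = len(sky_mask), len(sky_mask[0])
          cx, cy = w / 2.0, h / 2.0
          zenith = 90.0 - elevation_deg                      # angle from straight up
          radius = (zenith / (fov_deg / 2.0)) * min(cx, cy)  # equidistant mapping
          az = math.radians(azimuth_deg)
          u = int(cx + radius * math.sin(az))
          v = int(cy - radius * math.cos(az))
          if not (0 <= u < w and 0 <= v < h):
              return False                                   # outside the camera's view
          return bool(sky_mask[v][u])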

  3. Investigation into the use of photoanthropometry in facial image comparison.

    PubMed

    Moreton, Reuben; Morley, Johanna

    2011-10-10

    Photoanthropometry is a metric-based facial image comparison technique. Measurements of the face are taken from an image using predetermined facial landmarks. Measurements are then converted to proportionality indices (PIs) and compared to PIs from another facial image. Photoanthropometry has been presented as a facial image comparison technique in UK courts for over 15 years. It is generally accepted that extrinsic factors (e.g. orientation of the head, camera angle and distance from the camera) can cause discrepancies in anthropometric measurements of the face from photographs. However, there has been limited empirical research into quantifying the influence of such variables. The aim of this study was to determine the reliability of photoanthropometric measurements between different images of the same individual taken with different angulations of the camera. The study examined the facial measurements of 25 individuals from high-resolution photographs, taken at different horizontal and vertical camera angles in a controlled environment. Results show that the degree of variability in facial measurements of the same individual due to variations in camera angle can be as great as the variability of facial measurements between different individuals. Results suggest that photoanthropometric facial comparison, as it is currently practiced, is unsuitable for elimination purposes. Preliminary investigations into the effects of distance from camera and image resolution in poor quality images suggest that such images are not an accurate representation of an individual's face; however, further work is required. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
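
    As a rough illustration of the proportionality-index idea (not the study's actual landmark set), the sketch below converts pixel measurements between hypothetical landmarks into scale-free indices and compares two faces by the mean absolute index difference.

    ```python
    import numpy as np
    from itertools import combinations

    def proportionality_indices(landmarks, ref_pair=("pupil_L", "pupil_R")):
        """landmarks: dict of name -> (x, y) in pixels.
        Each pairwise distance is divided by a reference distance (here the
        inter-pupillary distance), so the indices are independent of image scale."""
        ref = np.linalg.norm(np.subtract(landmarks[ref_pair[0]], landmarks[ref_pair[1]]))
        indices = {}
        for a, b in combinations(sorted(landmarks), 2):
            d = np.linalg.norm(np.subtract(landmarks[a], landmarks[b]))
            indices[(a, b)] = d / ref
        return indices

    def compare_faces(pi_probe, pi_ref):
        """Mean absolute difference over the shared proportionality indices."""
        keys = pi_probe.keys() & pi_ref.keys()
        return sum(abs(pi_probe[k] - pi_ref[k]) for k in keys) / len(keys)
    ```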

  4. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks.

    PubMed

    Zhang, Cuicui; Liang, Xuefeng; Matsuyama, Takashi

    2014-12-08

    Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine if a given individual has already appeared over the camera network. Individual recognition often uses faces as a biometric cue and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitation of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the "small sample size" (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still leave two questions open: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0-1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system.

  5. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks

    PubMed Central

    Zhang, Cuicui; Liang, Xuefeng; Matsuyama, Takashi

    2014-01-01

    Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine if a given individual has already appeared over the camera network. Individual recognition often uses faces as a biometric cue and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitation of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the “small sample size” (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still leave two questions open: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0–1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system. PMID:25494350
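
    One plausible way to cast base-classifier selection as a 0-1 knapsack problem is sketched below: each classifier contributes a diversity "value" and costs an error "weight", and the ensemble is chosen to maximize total diversity within an error budget. The paper's actual objective and weights may differ; this only illustrates the knapsack mechanics.

    ```python
    def select_classifiers(diversity, error, budget, scale=100):
        """0-1 knapsack by dynamic programming.
        diversity[i]: value of adding classifier i to the ensemble (higher = better)
        error[i]:     its error rate in [0, 1]; weights are error*scale, as integers
        budget:       total allowed (unscaled) error across the selected classifiers
        Returns the indices of the selected classifiers."""
        n = len(diversity)
        weights = [int(round(e * scale)) for e in error]
        cap = int(round(budget * scale))
        # best[w] = (total value, chosen indices) achievable with total weight <= w
        best = [(0.0, ())] * (cap + 1)
        for i in range(n):
            for w in range(cap, weights[i] - 1, -1):   # downward sweep: 0-1, not unbounded
                cand_val = best[w - weights[i]][0] + diversity[i]
                if cand_val > best[w][0]:
                    best[w] = (cand_val, best[w - weights[i]][1] + (i,))
        return list(best[cap][1])

    # Example: 5 base classifiers, keep total error under 0.5
    chosen = select_classifiers(
        diversity=[0.9, 0.6, 0.8, 0.4, 0.7],
        error=[0.20, 0.15, 0.25, 0.10, 0.18],
        budget=0.5,
    )
    ```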

  6. Mastcam Telephoto of a Martian Dune Downwind Face

    NASA Image and Video Library

    2016-01-04

    This view combines multiple images from the telephoto-lens camera of the Mast Camera (Mastcam) on NASA's Curiosity Mars rover to reveal fine details of the downwind face of "Namib Dune." The site is part of the dark-sand "Bagnold Dunes" field along the northwestern flank of Mount Sharp. Images taken from orbit have shown that dunes in the Bagnold field move as much as about 3 feet (1 meter) per Earth year. Sand on this face of Namib Dune has cascaded down a slope of about 26 to 28 degrees. The top of the face is about 13 to 17 feet (4 to 5 meters) above the rocky ground at its base. http://photojournal.jpl.nasa.gov/catalog/PIA20283

  7. DETAIL OF DOORS ON EAST ELEVATION AT SOUTH END; CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF DOORS ON EAST ELEVATION AT SOUTH END; CAMERA FACING WEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  8. INTERIOR VIEW OF FIRST STORY SPACE SHOWING CONCRETE BEAMS; CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF FIRST STORY SPACE SHOWING CONCRETE BEAMS; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  9. IET. Weather instrumentation tower, located south of control building. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Weather instrumentation tower, located south of control building. Camera facing west. Date: August 17, 1955. INEEL negative no. 55-2414 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  10. Detail of main entry at center of southeast elevation; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main entry at center of southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  11. Detail of basement level concrete beams at southwest corner; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of basement level concrete beams at southwest corner; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  12. Detail of windows at north portion of west elevation; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of windows at north portion of west elevation; camera facing east. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  13. Detail of portico at main entry on east elevation; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of portico at main entry on east elevation; camera facing southwest. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  14. Interior detail of lobby terrazzo floor inlaid NPG insignia; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of lobby terrazzo floor inlaid NPG insignia; camera facing west. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  15. Interior detail of fourpart moveable doors in east hallway; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of four-part moveable doors in east hallway; camera facing south. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  16. Detail of southeast corner with spandrel and window pattern; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of southeast corner with spandrel and window pattern; camera facing northeast. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  17. Detail of tower and entry doors on north elevation; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of tower and entry doors on north elevation; camera facing southwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  18. Detail of northeast corner with spandrel and window pattern; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of northeast corner with spandrel and window pattern; camera facing southwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  19. Interior detail of main entrance doors on south wall; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of main entrance doors on south wall; camera facing south. - Mare Island Naval Shipyard, Old Administrative Offices, Eighth Street, north side between Railroad Avenue & Walnut Avenue, Vallejo, Solano County, CA

  20. EAST FACE OF REACTOR BASE. COMING TOWARD CAMERA IS EXCAVATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    EAST FACE OF REACTOR BASE. COMING TOWARD CAMERA IS EXCAVATION FOR MTR CANAL. CAISSONS FLANK EACH SIDE. COUNTERFORT (SUPPORT PERPENDICULAR TO WHAT WILL BE THE LONG WALL OF THE CANAL) RESTS ATOP LEFT CAISSON. IN LOWER PART OF VIEW, DRILLERS PREPARE TRENCHES FOR SUPPORT BEAMS THAT WILL LIE BENEATH CANAL FLOOR. INL NEGATIVE NO. 739. Unknown Photographer, 10/6/1950 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  1. PBF Reactor Building (PER620). Camera facing southeast in second basement. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera facing southeast in second basement. Round form and reinforcing steel surround reactor vessel pit, which will be heavily shielded by several feet of concrete. Block-out is for door to sub-pile room. Rectangular form and rebar beyond pit is for canal wall. Photographer: John Capek. Date: March 10, 1967. INEEL negative no. 67-1643 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  2. FAST CHOPPER BUILDING, TRA665. CAMERA FACING NORTH. NOTE BRICKEDIN WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FAST CHOPPER BUILDING, TRA-665. CAMERA FACING NORTH. NOTE BRICKED-IN WINDOW ON RIGHT SIDE (BELOW PAINTED NUMERALS "665"). SLIDING METAL DOOR ON COVERED RAIL AT UPPER LEVEL. SHELTERED ENTRANCE TO STEEL SHIELDING DOOR. DOOR INTO MTR SERVICE BUILDING, TRA-635, STANDS OPEN. MTR BEHIND CHOPPER BUILDING. INL NEGATIVE NO. HD42-1. Mike Crane, Photographer, 3/2004 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  3. ETR COMPRESSOR BUILDING, TRA643. CAMERA FACES NORTH. AIR HEATERS LINE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR COMPRESSOR BUILDING, TRA-643. CAMERA FACES NORTH. AIR HEATERS LINE UP AGAINST WALL, TO BE USED IN CONNECTION WITH ETR EXPERIMENTS. EACH HAD A HEAT OUTPUT OF 8 MILLION BTU PER HOUR, OPERATED AT 1260 DEGREES F. AND A PRESSURE OF 320 PSI. NOTE METAL WALLS AND ROOF. INL NEGATIVE NO. 56-3709. R.G. Larsen, Photographer, 11/13/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  4. A 3D camera for improved facial recognition

    NASA Astrophysics Data System (ADS)

    Lewin, Andrew; Orchard, David A.; Scott, Andrew M.; Walton, Nicholas A.; Austin, Jim

    2004-12-01

    We describe a camera capable of recording 3D images of objects. It does this by projecting thousands of spots onto an object and then measuring the range to each spot by determining the parallax from a single frame. A second frame can be captured to record a conventional image, which can then be projected onto the surface mesh to form a rendered skin. The camera is able to locate the images of the spots to a precision of better than one tenth of a pixel, and from this it can determine range to an accuracy of less than 1 mm at 1 meter. The data can be recorded as a set of two images, and is reconstructed by forming a 'wire mesh' of range points and morphing the 2D image over this structure. The camera can be used to record the images of faces and reconstruct the shape of the face, which allows viewing of the face from various angles. This allows images to be more critically inspected for the purpose of identifying individuals. Multiple images can be stitched together to create full panoramic images of head-sized objects that can be viewed from any direction. The system is being tested with a graph matching system capable of fast and accurate shape comparisons for facial recognition. It can also be used with "models" of heads and faces to provide a means of obtaining biometric data.
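
    A minimal sketch of the range-from-parallax principle behind such a structured-spot camera: with a projector-camera baseline B and a focal length f expressed in pixels, a spot's disparity d relative to its reference position gives Z ≈ f·B/d, and sub-pixel spot localization can come from an intensity-weighted centroid. The numbers below are illustrative assumptions, not the paper's design values.

    ```python
    import numpy as np

    def spot_centroid(patch):
        """Sub-pixel spot location as the intensity-weighted centroid of a patch."""
        ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
        w = patch.astype(np.float64)
        w_sum = w.sum()
        return (xs * w).sum() / w_sum, (ys * w).sum() / w_sum

    def range_from_parallax(x_observed, x_reference, baseline_m, focal_px):
        """Triangulated range from the disparity (pixels) between the observed spot
        and its reference position (e.g. calibrated at a known distance)."""
        disparity = abs(x_observed - x_reference)
        if disparity < 1e-6:
            return float("inf")
        return focal_px * baseline_m / disparity

    # Illustrative check: with f = 1400 px and baseline = 0.08 m, a target at 1 m
    # gives about 112 px of disparity, so a 0.1-px localization error corresponds
    # to roughly 1 mm of range error, consistent with the accuracy quoted above.
    ```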

  5. PBF Cooling Tower detail. Camera facing southwest into north side ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower detail. Camera facing southwest into north side of Tower. Five horizontal layers of splash bars constitute fill decks, which will break up falling water into droplets, promoting evaporative cooling. Louvered faces, through which air enters tower, are on east and west sides. Louvers have been installed. Support framework for one of two venturi-shaped fan stacks (or "vents") is in center top. Orifices in hot basins (not in view) will distribute water over fill. Photographer: Kirsh. Date: May 15, 1969. INEEL negative no. 69-3032 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  6. Application of robust face recognition in video surveillance systems

    NASA Astrophysics Data System (ADS)

    Zhang, De-xin; An, Peng; Zhang, Hao-xiang

    2018-03-01

    In this paper, we propose a video searching system that utilizes face recognition as its search indexing feature. As the applications of video cameras have greatly increased in recent years, face recognition makes a perfect fit for searching targeted individuals within the vast amount of video data. However, the performance of such searching depends on the quality of face images recorded in the video signals. Since surveillance video cameras record videos without fixed postures of the subject, face occlusion is very common in everyday video. The proposed system builds a model for occluded faces using fuzzy principal component analysis (FPCA), and reconstructs the human faces with the available information. Experimental results show that the system has very high efficiency in processing real-life videos, and it is very robust to various kinds of face occlusions. Hence it can relieve human reviewers from sitting in front of the monitors and greatly enhances efficiency as well. The proposed system has been installed and applied in various environments and has already demonstrated its power by helping solve real cases.

  7. Face Liveness Detection Using Defocus

    PubMed Central

    Kim, Sooyeon; Ban, Yuseok; Lee, Sangyoun

    2015-01-01

    In order to develop security systems for identity authentication, face recognition (FR) technology has been applied. One of the main problems of applying FR technology is that the systems are especially vulnerable to attacks with spoofing faces (e.g., 2D pictures). To defend against these attacks and to enhance the reliability of FR systems, many anti-spoofing approaches have been recently developed. In this paper, we propose a method for face liveness detection using the effect of defocus. From two images sequentially taken at different focuses, three features, namely focus, power histogram, and gradient location and orientation histogram (GLOH), are extracted. Afterwards, we detect forged faces through the feature-level fusion approach. For reliable performance verification, we develop two databases with a handheld digital camera and a webcam. The proposed method achieves a 3.29% half total error rate (HTER) at a given depth of field (DoF) and can be extended to camera-equipped devices, like smartphones. PMID:25594594
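
    A minimal sketch of one defocus-based cue, assuming the variance of the Laplacian as the per-region focus measure (the paper's focus feature may be defined differently): sharpness differences between two shots taken at different focus settings are roughly uniform on a flat spoof but vary with depth on a real face.

    ```python
    import numpy as np
    from scipy.ndimage import laplace

    def sharpness(gray):
        """Variance of the Laplacian: a common, simple focus measure."""
        return laplace(gray.astype(np.float64)).var()

    def defocus_feature(img_focus_near, img_focus_far, grid=4):
        """Split both grayscale images into a grid of regions and return the
        per-region difference in sharpness between the two focus settings."""
        h, w = img_focus_near.shape
        feats = []
        for i in range(grid):
            for j in range(grid):
                r = slice(i * h // grid, (i + 1) * h // grid)
                c = slice(j * w // grid, (j + 1) * w // grid)
                feats.append(sharpness(img_focus_near[r, c]) - sharpness(img_focus_far[r, c]))
        return np.array(feats)
    ```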

  8. PBF Reactor Building (PER620). In subpile room, camera faces southeast ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). In sub-pile room, camera faces southeast and looks up toward bottom of reactor vessel. Upper assembly in center of view is in-pile tube as it connects to vessel. Lower lateral constraints and rotating control cable are in position. Other connections have been bolted together. Note light bulbs for scale. Photographer: John Capek. Date: August 21, 1970. INEEL negative no. 70-3494 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  9. PBF Reactor Building (PER620). Camera on main floor faces south ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera on main floor faces south (open) doorway. In foreground is canal gate, lined with stainless steel and painted with protective coatings. Reactor pit is round form discernible beyond. Lifting beams and rigging are in place for a load test before reactor vessel arrives. Photographer: John Capek. Date: January 26, 1970. INEEL negative no. 70-347 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  10. HOT CELL BUILDING, TRA632. EAST END OF BUILDING. CAMERA FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT CELL BUILDING, TRA-632. EAST END OF BUILDING. CAMERA FACING WEST. TRUCK ENCLOSURE (1986) TO THE LEFT, SMALL ADDITION IN ITS SHADOW IS ENCLOSURE OVER METAL PORT INTO HOT CELL NO. 1 (THE OLDEST HOT CELL). NOTE PERSONNEL LADDER AND PLATFORM AT LOFT LEVEL USED WHEN SERVICING AIR FILTERS AND VENTS OF CELL NO. 1. INL NEGATIVE NO. HD46-32-4. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. REACTOR SERVICE BUILDING, TRA635, INTERIOR. CAMERA FACES NORTHWEST TOWARDS INTERIOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    REACTOR SERVICE BUILDING, TRA-635, INTERIOR. CAMERA FACES NORTHWEST TOWARDS INTERIOR WALL ENCLOSING STORAGE AND OFFICE SPACE ALONG THE WEST SIDE. AT RIGHT EDGE IS DOOR TO MTR BUILDING. FROM RIGHT TO LEFT, SPACE WAS PLANNED FOR A LOCKER ROOM, MTR ISSUE ROOM, AND STORAGE AREAS AND RELATED OFFICES. NOTE SECOND "MEZZANINE" FLOOR ABOVE. INL NEGATIVE NO. 10227. Unknown Photographer, 3/23/1954 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  12. DEMINERALIZER BUILDING,TRA608. CAMERA FACES EAST ALONG SOUTH WALL. INSTRUMENT PANEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DEMINERALIZER BUILDING, TRA-608. CAMERA FACES EAST ALONG SOUTH WALL. INSTRUMENT PANEL BOARD IS IN RIGHT HALF OF VIEW, WITH FOUR PUMPS BEYOND. SMALLER PUMPS FILL DEMINERALIZED WATER TANK ON SOUTH SIDE OF BUILDING. CARD IN LOWER RIGHT WAS INSERTED BY INL PHOTOGRAPHER TO COVER AN OBSOLETE SECURITY RESTRICTION PRINTED ON ORIGINAL NEGATIVE. INL NEGATIVE NO. 3997A. Unknown Photographer, 12/28/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  13. Moving human full body and body parts detection, tracking, and applications on human activity estimation, walking pattern and face recognition

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Mike

    2016-05-01

    We have developed a new way for detection and tracking of human full-body and body-parts with color (intensity) patch morphological segmentation and adaptive thresholding for security surveillance cameras. An adaptive threshold scheme has been developed for dealing with body size changes, illumination condition changes, and cross camera parameter changes. Tests with the PETS 2009 and 2014 datasets show that we can obtain high probability of detection and low probability of false alarm for full-body. Test results indicate that our human full-body detection method can considerably outperform the current state-of-the-art methods in both detection performance and computational complexity. Furthermore, in this paper, we have developed several methods using color features for detection and tracking of human body-parts (arms, legs, torso, and head, etc.). For example, we have developed a human skin color sub-patch segmentation algorithm by first conducting an RGB to YIQ transformation and then applying a Subtractive I/Q image Fusion with morphological operations. With this method, we can reliably detect and track human skin color related body-parts such as face, neck, arms, and legs. Reliable body-parts (e.g. head) detection allows us to continuously track the individual person even in the case that multiple closely spaced persons are merged. Accordingly, we have developed a new algorithm to split a merged detection blob back to individual detections based on the detected head positions. Detected body-parts also allow us to extract important local constellation features of the body-parts positions and angles related to the full-body. These features are useful for human walking gait pattern recognition and human pose (e.g. standing or falling down) estimation for potential abnormal behavior and accidental event detection, as evidenced by our experimental tests. Furthermore, based on the reliable head (face) tracking, we have applied a super-resolution algorithm to enhance the face resolution for improved human face recognition performance.
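
    A minimal sketch of the RGB-to-YIQ step and a subtractive I/Q fusion followed by thresholding and a morphological opening, as one possible reading of the skin segmentation described above; the threshold value is illustrative, not from the paper.

    ```python
    import numpy as np
    from scipy.ndimage import binary_opening

    # Standard NTSC RGB -> YIQ conversion matrix
    RGB_TO_YIQ = np.array([[0.299,  0.587,  0.114],
                           [0.596, -0.274, -0.322],
                           [0.211, -0.523,  0.312]])

    def skin_mask(img_rgb, thresh=20.0):
        """Rough skin detection: convert to YIQ, fuse the chrominance planes by
        subtraction (I - Q), threshold, then clean up with a morphological opening."""
        yiq = img_rgb.astype(np.float64) @ RGB_TO_YIQ.T
        fused = yiq[..., 1] - yiq[..., 2]          # subtractive I/Q fusion
        mask = fused > thresh
        return binary_opening(mask, structure=np.ones((3, 3)))
    ```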

  14. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    NASA Astrophysics Data System (ADS)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
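
    A minimal sketch of the Z-distance idea under a pinhole model: the eye-to-camera distance follows from the iris diameter observed in the WFOV image and an average anthropometric iris size, and that estimate selects the WFOV-to-NFOV transformation calibrated for the nearest distance. The average diameter and the helper names are assumptions for illustration.

    ```python
    import numpy as np

    AVG_IRIS_DIAMETER_MM = 11.7   # typical anthropometric value; individual irises vary

    def estimate_z_mm(iris_diameter_px, focal_px):
        """Pinhole estimate of the eye-to-camera distance from the iris size
        observed in the wide-field-of-view (WFOV) image."""
        return focal_px * AVG_IRIS_DIAMETER_MM / iris_diameter_px

    def pick_transform(z_mm, calibrated):
        """calibrated: list of (z_mm_at_calibration, 3x3 matrix WFOV->NFOV).
        Use the matrix whose calibration distance is closest to the estimate."""
        z_cal, H = min(calibrated, key=lambda item: abs(item[0] - z_mm))
        return H

    def wfov_to_nfov(point_xy, H):
        """Map a WFOV pixel into the NFOV image with the selected matrix."""
        x, y = point_xy
        p = H @ np.array([x, y, 1.0])
        return p[:2] / p[2]
    ```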

  15. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  17. Public Speaking Anxiety: Comparing Face-to-Face and Web-Based Speeches

    ERIC Educational Resources Information Center

    Campbell, Scott; Larson, James

    2013-01-01

    This study aims to determine whether or not students have a different level of anxiety when giving a speech to a group of people in a traditional face-to-face classroom setting versus giving a speech into a camera to an audience (visible on a projected screen) using distance or web-based technology. The study included approximately 70 students.…

  18. Parting Moon Shots from NASAs GRAIL Mission

    NASA Image and Video Library

    2013-01-10

    Video of the moon taken by the NASA GRAIL mission's MoonKam (Moon Knowledge Acquired by Middle School Students) camera aboard the Ebb spacecraft on Dec. 14, 2012. Features forward-facing and rear-facing views.

  19. Cross-modal face recognition using multi-matcher face scores

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Blasch, Erik

    2015-05-01

    The performance of face recognition can be improved using information fusion of multimodal images and/or multiple algorithms. When multimodal face images are available, cross-modal recognition is meaningful for security and surveillance applications. For example, a probe face is a thermal image (especially at nighttime), while only visible face images are available in the gallery database. Matching a thermal probe face onto the visible gallery faces requires cross-modal matching approaches. A few such studies were implemented in facial feature space with medium recognition performance. In this paper, we propose a cross-modal recognition approach, where multimodal faces are cross-matched in feature space and the recognition performance is enhanced with stereo fusion at image, feature and/or score level. In the proposed scenario, there are two cameras for stereo imaging, two face imagers (visible and thermal images) in each camera, and three recognition algorithms (circular Gaussian filter, face pattern byte, linear discriminant analysis). A score vector is formed with three cross-matched face scores from the aforementioned three algorithms. A classifier (e.g., k-nearest neighbor, support vector machine, binomial logistic regression [BLR]) is trained and then tested with the score vectors using 10-fold cross-validation. The proposed approach was validated with a multispectral stereo face dataset from 105 subjects. Our experiments show very promising results: ACR (accuracy rate) = 97.84%, FAR (false accept rate) = 0.84% when cross-matching the fused thermal faces onto the fused visible faces by using three face scores and the BLR classifier.
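
    A minimal sketch of the score-level fusion step with synthetic data: each cross-modal comparison yields three matcher scores stacked into a feature vector, and a binomial logistic regression is trained and evaluated with 10-fold cross-validation. The score distributions below are fabricated purely to make the example runnable.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Hypothetical data: one row per thermal-vs-visible comparison, columns are the
    # scores from three matchers (e.g. circular Gaussian filter, face pattern byte, LDA).
    rng = np.random.default_rng(0)
    genuine  = rng.normal(loc=[0.80, 0.70, 0.75], scale=0.1, size=(200, 3))
    impostor = rng.normal(loc=[0.40, 0.35, 0.40], scale=0.1, size=(200, 3))
    X = np.vstack([genuine, impostor])
    y = np.hstack([np.ones(200), np.zeros(200)])

    clf = LogisticRegression()                   # binomial logistic regression
    acc = cross_val_score(clf, X, y, cv=10)      # 10-fold cross-validation
    print("mean accuracy over folds:", acc.mean())
    ```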

  20. Facial expression identification using 3D geometric features from Microsoft Kinect device

    NASA Astrophysics Data System (ADS)

    Han, Dongxu; Al Jawad, Naseer; Du, Hongbo

    2016-05-01

    Facial expression identification is an important part of face recognition and closely related to emotion detection from face images. Various solutions have been proposed in the past using different types of cameras and features. Microsoft Kinect device has been widely used for multimedia interactions. More recently, the device has been increasingly deployed for supporting scientific investigations. This paper explores the effectiveness of using the device in identifying emotional facial expressions such as surprise, smile, sad, etc. and evaluates the usefulness of 3D data points on a face mesh structure obtained from the Kinect device. We present a distance-based geometric feature component that is derived from the distances between points on the face mesh and selected reference points in a single frame. The feature components extracted across a sequence of frames starting and ending by neutral emotion represent a whole expression. The feature vector eliminates the need for complex face orientation correction, simplifying the feature extraction process and making it more efficient. We applied the kNN classifier that exploits a feature component based similarity measure following the principle of dynamic time warping to determine the closest neighbors. Preliminary tests on a small scale database of different facial expressions show promises of the newly developed features and the usefulness of the Kinect device in facial expression identification.
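
    A minimal sketch of the two ingredients described above, under assumed shapes and names: a per-frame feature built from distances between mesh points and reference points, and a dynamic-time-warping dissimilarity used by a k-nearest-neighbour vote over expression sequences.

    ```python
    import numpy as np

    def frame_features(mesh_points, reference_points):
        """Per-frame feature: Euclidean distances from selected 3D mesh points to a
        few reference points (e.g. the nose tip); shapes (M, 3) and (K, 3) -> (M*K,)."""
        diffs = mesh_points[:, None, :] - reference_points[None, :, :]
        return np.linalg.norm(diffs, axis=-1).ravel()

    def dtw_distance(seq_a, seq_b):
        """Classic dynamic-time-warping distance between two sequences of
        per-frame feature vectors."""
        n, m = len(seq_a), len(seq_b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def knn_label(query_seq, train_seqs, train_labels, k=3):
        """k-nearest-neighbour vote using the DTW dissimilarity."""
        dists = [dtw_distance(query_seq, s) for s in train_seqs]
        nearest = np.argsort(dists)[:k]
        labels = [train_labels[i] for i in nearest]
        return max(set(labels), key=labels.count)
    ```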

  1. Applications of a shadow camera system for energy meteorology

    NASA Astrophysics Data System (ADS)

    Kuhn, Pascal; Wilbert, Stefan; Prahl, Christoph; Garsche, Dominik; Schüler, David; Haase, Thomas; Ramirez, Lourdes; Zarzalejo, Luis; Meyer, Angela; Blanc, Philippe; Pitz-Paal, Robert

    2018-02-01

    Downward-facing shadow cameras might play a major role in future energy meteorology. Shadow cameras directly image shadows on the ground from an elevated position. They are used to validate other systems (e.g. all-sky imager based nowcasting systems, cloud speed sensors or satellite forecasts) and can potentially provide short term forecasts for solar power plants. Such forecasts are needed for electricity grids with high penetrations of renewable energy and can help to optimize plant operations. In this publication, two key applications of shadow cameras are briefly presented.

  2. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  3. Late afternoon view of the interior of the westernmost wall ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Late afternoon view of the interior of the westernmost wall section to be removed; camera facing north. (Note: lowered camera position significantly to minimize background distractions including the porta-john, building, and telephone pole) - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  4. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  5. What are we missing? Advantages of more than one viewpoint to estimate fish assemblages using baited video

    PubMed Central

    Huveneers, Charlie; Fairweather, Peter G.

    2018-01-01

    Counting errors can bias assessments of species abundance and richness, which can affect assessments of stock structure, population structure and monitoring programmes. Many methods for studying ecology use fixed viewpoints (e.g. camera traps, underwater video), but there is little known about how this biases the data obtained. In the marine realm, most studies using baited underwater video, a common method for monitoring fish and nekton, have previously only assessed fishes using a single bait-facing viewpoint. To investigate the biases stemming from using fixed viewpoints, we added cameras to cover 360° views around the units. We found similar species richness for all observed viewpoints but the bait-facing viewpoint recorded the highest fish abundance. Sightings of infrequently seen and shy species increased with the additional cameras and the extra viewpoints allowed the abundance estimates of highly abundant schooling species to be up to 60% higher. We specifically recommend the use of additional cameras for studies focusing on shyer species or those particularly interested in increasing the sensitivity of the method by avoiding saturation in highly abundant species. Studies may also benefit from using additional cameras to focus observation on the downstream viewpoint. PMID:29892386

  6. 1. VARIABLEANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  7. LOFT. Interior view of entry (TAN624) rollup door. Camera is ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Interior view of entry (TAN-624) rollup door. Camera is inside entry building facing south. Rollup door was a modification of the original ANP door arrangement. Date: March 2004. INEEL negative no. HD-39-5-2 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  8. Identifying People with Soft-Biometrics at Fleet Week

    DTIC Science & Technology

    2013-03-01

    onboard sensors. This included:  Color Camera: Located in the right eye, Octavia stored 640x480 RGB images at ~4 Hz from a Point Grey Firefly camera. A...Face Detection The Fleet Week experiments demonstrated the potential of soft biometrics for recognition, but all of the existing algorithms currently

  9. 1. VIEW OF ARVFS BUNKER TAKEN FROM GROUND ELEVATION. CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VIEW OF ARVFS BUNKER TAKEN FROM GROUND ELEVATION. CAMERA FACING NORTH. VIEW SHOWS PROFILE OF BUNKER IN RELATION TO NATURAL GROUND ELEVATION. TOP OF BUNKER HAS APPROXIMATELY THREE FEET OF EARTH COVER. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  10. IET. Aerial view of project, 95 percent complete. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Aerial view of project, 95 percent complete. Camera facing east. Left to right: stack, duct, mobile test cell building (TAN-624), four-rail track, dolly. Retaining wall between mobile test building and shielded control building (TAN-620) just beyond. North of control building are tank building (TAN-627) and fuel-transfer pump building (TAN-625). Guard house at upper right along exclusion fence. Construction vehicles and temporary warehouse in view near guard house. Date: June 6, 1955. INEEL negative no. 55-1462 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  11. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  12. Accurate measurement of imaging photoplethysmographic signals based camera using weighted average

    NASA Astrophysics Data System (ADS)

    Pang, Zongguang; Kong, Lingqin; Zhao, Yuejin; Sun, Huijuan; Dong, Liquan; Hui, Mei; Liu, Ming; Liu, Xiaohua; Liu, Lingling; Li, Xiaohui; Li, Rongji

    2018-01-01

    Imaging Photoplethysmography (IPPG) is an emerging technique for the extraction of vital signs of human beings using video recordings. IPPG technology, with its advantages of non-contact measurement, low cost and easy operation, has become one research hot spot in the field of biomedicine. However, the noise disturbance caused by non-microarterial areas cannot be removed because of the uneven distribution of micro-arteries and the different signal strength of each region, which results in a low signal-to-noise ratio of IPPG signals and low accuracy of heart rate. In this paper, we propose a method of improving the signal-to-noise ratio of camera-based IPPG signals of each sub-region of the face using a weighted average. Firstly, we obtain the regions of interest (ROI) of a subject's face from the camera. Secondly, each region of interest is tracked and feature-matched in each frame of the video. Each tracked region of the face is divided into 60x60 pixel blocks. Thirdly, the weights of the PPG signal of each sub-region are calculated based on the signal-to-noise ratio of each sub-region. Finally, we combine the IPPG signals from all the tracked ROIs using a weighted average. Compared with the existing approaches, the results show that the proposed method yields a modest but significant improvement in the signal-to-noise ratio of the camera-based PPG estimate and in the accuracy of heart rate measurement.
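
    A minimal sketch of the SNR-weighted combination, assuming a crude SNR definition (spectral power inside a plausible heart-rate band divided by power outside it); the paper's exact SNR definition and band limits may differ.

    ```python
    import numpy as np

    def snr(signal, fs, band=(0.7, 4.0)):
        """Crude SNR estimate: spectral power inside the plausible heart-rate band
        divided by power outside it."""
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        power = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        return power[in_band].sum() / (power[~in_band].sum() + 1e-12)

    def weighted_ippg(region_signals, fs):
        """region_signals: array (n_regions, n_samples) of per-block mean intensities.
        Each region's trace is weighted by its SNR, then the traces are averaged."""
        region_signals = np.asarray(region_signals, dtype=np.float64)
        weights = np.array([snr(s, fs) for s in region_signals])
        weights /= weights.sum()
        return weights @ region_signals
    ```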

  13. Why Are Faces Denser in the Visual Experiences of Younger than Older Infants?

    ERIC Educational Resources Information Center

    Jayaraman, Swapnaa; Fausey, Caitlin M.; Smith, Linda B.

    2017-01-01

    Recent evidence from studies using head cameras suggests that the frequency of faces directly in front of infants "declines" over the first year and a half of life, a result that has implications for the development of and evolutionary constraints on face processing. Two experiments tested 2 opposing hypotheses about this observed…

  14. Field test studies of our infrared-based human temperature screening system embedded with a parallel measurement approach

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Chaitavon, Kosom

    2009-07-01

    This paper introduces a parallel measurement approach for fast infrared-based human temperature screening suitable for use in a large public area. Our key idea is based on the combination of simple image processing algorithms, infrared technology, and human flow management. With this multidisciplinary concept, we arrange as many people as possible in a two-dimensional space in front of a thermal imaging camera and then highlight all human facial areas through simple image filtering, image morphology, and particle analysis processes. In this way, an individual's face in a live thermal image can be located and the maximum facial skin temperature can be monitored and displayed. Our experiment shows a measured 1 ms processing time in highlighting all human face areas. With a thermal imaging camera having an FOV lens of 24° × 18° and 320 × 240 active pixels, the maximum facial skin temperatures from three people's faces located at 1.3 m from the camera can also be simultaneously monitored and displayed at a measured rate of 31 fps, limited by the looping process in determining coordinates of all faces. For our 3-day test under the ambient temperature of 24-30 °C, 57-72% relative humidity, and weak wind from outside the hospital building, hyperthermic patients can be identified with 100% sensitivity and 36.4% specificity when the temperature threshold level and the offset temperature value are appropriately chosen. Appropriately locating our system away from the building doors, air conditioners and electric fans in order to eliminate wind blowing toward the camera lens can significantly help improve our system's specificity.
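
    A minimal sketch of the per-frame screening step as described: threshold the calibrated temperature image at skin-like values, clean the mask with a morphological opening, label connected components ("particle analysis"), and report each face-sized blob's maximum temperature. All threshold and size values here are illustrative, not the study's settings.

    ```python
    import numpy as np
    from scipy.ndimage import binary_opening, label, find_objects

    def screen_frame(temp_c, skin_min=30.0, skin_max=40.0, min_area=200, alarm=37.5):
        """temp_c: 2-D array of calibrated brightness temperatures (deg C).
        Returns (bounding slices, max temperature, alarm flag) per face-sized blob."""
        mask = (temp_c >= skin_min) & (temp_c <= skin_max)
        mask = binary_opening(mask, structure=np.ones((3, 3)))   # remove speckle
        labels, n = label(mask)                                  # "particle analysis"
        results = []
        for idx, region in enumerate(find_objects(labels), start=1):
            if region is None:
                continue
            blob_mask = labels[region] == idx
            if blob_mask.sum() < min_area:                       # too small to be a face
                continue
            t_max = temp_c[region][blob_mask].max()
            results.append((region, float(t_max), t_max >= alarm))
        return results
    ```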

  15. SOUTH WING, MTR661. INTERIOR DETAIL INSIDE LAB ROOM 131. CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SOUTH WING, MTR-661. INTERIOR DETAIL INSIDE LAB ROOM 131. CAMERA FACING NORTHEAST. NOTE CONCRETE BLOCK WALLS. SAFETY SHOWER AND EYE WASHER AT REAR WALL. INL NEGATIVE NO. HD46-7-2. Mike Crane, Photographer, 2/2005. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  16. PBF Reactor Building (PER620). Aerial view of early construction. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Aerial view of early construction. Camera facing northwest. Excavation and concrete placement in two basements are underway. Note exposed lava rock. Photographer: Farmer. Date: March 22, 1965. INEEL negative no. 65-2219 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  17. 26. VIEW OF METAL SHED OVER SHIELDING TANK WITH CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    26. VIEW OF METAL SHED OVER SHIELDING TANK WITH CAMERA FACING SOUTHWEST. SHOWS OPEN SIDE OF SHED ROOF, HERCULON SHEET, AND HAND-OPERATED CRANE. TAKEN IN 1983. INEL PHOTO NUMBER 83-476-2-9, TAKEN IN 1983. PHOTOGRAPHER NOT NAMED. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  18. HOT CELL BUILDING, TRA632, INTERIOR. CELL 3, "HEAVY" CELL. CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT CELL BUILDING, TRA-632, INTERIOR. CELL 3, "HEAVY" CELL. CAMERA FACES WEST TOWARD BUILDING EXIT. OBSERVATION WINDOW AT LEFT EDGE OF VIEW. INL NEGATIVE NO. HD46-28-4. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  19. A&M. Guard house (TAN638), contextual view. Built in 1968. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Guard house (TAN-638), contextual view. Built in 1968. Camera faces south. Guard house controlled access to radioactive waste storage tanks beyond and to left of view. Date: February 4, 2003. INEEL negative no. HD-33-4-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  20. The Potential of Low-Cost Rpas for Multi-View Reconstruction of Sub-Vertical Rock Faces

    NASA Astrophysics Data System (ADS)

    Thoeni, K.; Guccione, D. E.; Santise, M.; Giacomini, A.; Roncella, R.; Forlani, G.

    2016-06-01

    The current work investigates the potential of two low-cost off-the-shelf quadcopters for multi-view reconstruction of sub-vertical rock faces. The two platforms used are a DJI Phantom 1 equipped with a Gopro Hero 3+ Black and a DJI Phantom 3 Professional with integrated camera. The study area is a small sub-vertical rock face. Several flights were performed with both cameras set in time-lapse mode. Hence, images were taken automatically but the flights were performed manually as the investigated rock face is very irregular which required manual adjustment of the yaw and roll for optimal coverage. The digital images were processed with commercial SfM software packages. Several processing settings were investigated in order to find out the one providing the most accurate 3D reconstruction of the rock face. To this aim, all 3D models produced with both platforms are compared to a point cloud obtained with a terrestrial laser scanner. Firstly, the difference between the use of coded ground control targets and the use of natural features was studied. Coded targets generally provide the best accuracy, but they need to be placed on the surface, which is not always possible, as sub-vertical rock faces are not easily accessible. Nevertheless, natural features can provide a good alternative if wisely chosen as shown in this work. Secondly, the influence of using fixed interior orientation parameters or self-calibration was investigated. The results show that, in the case of the used sensors and camera networks, self-calibration provides better results. To support such empirical finding, a numerical investigation using a Monte Carlo simulation was performed.

  1. Next-generation digital camera integration and software development issues

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Peters, Ken; Hecht, Richard

    1998-04-01

    This paper investigates the complexities associated with the development of next-generation digital cameras due to requirements in connectivity and interoperability. Each successive generation of digital camera improves drastically in cost, performance, resolution, image quality and interoperability features. This is being accomplished by advancements in a number of areas: research, silicon, standards, etc. As the capabilities of these cameras increase, so do the requirements for both hardware and software. Today, there are two single-chip camera solutions on the market, including the Motorola MPC 823 and LSI DCAM-101. Real-time constraints for a digital camera may be defined by the maximum time allowable between capture of images. Constraints in the design of an embedded digital camera include processor architecture, memory, processing speed and the real-time operating systems. This paper will present the LSI DCAM-101, a single-chip digital camera solution. It will present an overview of the architecture and the challenges in hardware and software for supporting streaming video in such a complex device. Issues presented include the development of the data flow software architecture, testing and integration on this complex silicon device. The strategy for optimizing performance on the architecture will also be presented.

  2. Computer-generated hologram calculation for real scenes using a commercial portable plenoptic camera

    NASA Astrophysics Data System (ADS)

    Endo, Yutaka; Wakunami, Koki; Shimobaba, Tomoyoshi; Kakue, Takashi; Arai, Daisuke; Ichihashi, Yasuyuki; Yamamoto, Kenji; Ito, Tomoyoshi

    2015-12-01

    This paper shows the process used to calculate a computer-generated hologram (CGH) for real scenes under natural light using a commercial portable plenoptic camera. In the CGH calculation, a light field captured with the commercial plenoptic camera is converted into a complex amplitude distribution. Then the converted complex amplitude is propagated to a CGH plane. We tested both numerical and optical reconstructions of the CGH and showed that the CGH calculation from captured data with the commercial plenoptic camera was successful.
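
    A minimal sketch of the propagation step, assuming the standard angular-spectrum method for moving the converted complex amplitude to the hologram plane and a simple on-axis reference wave for the final interference pattern; the wavelength, pixel pitch, and distance are illustrative, not the paper's values.

    ```python
    import numpy as np

    def angular_spectrum_propagate(u0, wavelength, pitch, distance):
        """Propagate a complex field u0 (N x M samples, pitch in metres) over the
        given distance using the angular-spectrum method."""
        n, m = u0.shape
        fy = np.fft.fftfreq(n, d=pitch)[:, None]
        fx = np.fft.fftfreq(m, d=pitch)[None, :]
        arg = 1.0 - (wavelength * fx) ** 2 - (wavelength * fy) ** 2
        kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
        H = np.exp(1j * kz * distance) * (arg > 0)      # drop evanescent components
        return np.fft.ifft2(np.fft.fft2(u0) * H)

    def amplitude_hologram(object_field, reference_amplitude=1.0):
        """Simple amplitude CGH: interfere the propagated object field with an
        on-axis plane reference wave and keep the intensity pattern."""
        return np.abs(object_field + reference_amplitude) ** 2

    # Illustrative parameters: 532 nm light, 8 um pixel pitch, field propagated
    # 0.1 m from the converted light-field plane to the CGH plane.
    # cgh = amplitude_hologram(angular_spectrum_propagate(u0, 532e-9, 8e-6, 0.1))
    ```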

  3. 8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  4. HEAT EXCHANGER BUILDING, TRA644. NORTHEAST CORNER. CAMERA IS ON PIKE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HEAT EXCHANGER BUILDING, TRA-644. NORTHEAST CORNER. CAMERA IS ON PIKE STREET FACING SOUTHWEST. ATTACHED STRUCTURE AT RIGHT OF VIEW IS ETR COMPRESSOR BUILDING, TRA-643. INL NEGATIVE NO. HD46-36-4. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  5. PBF Cooling Tower (PER720) and its Auxiliary Building (PER625). Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower (PER-720) and its Auxiliary Building (PER-625). Camera facing west shows east facades. Center pipe carried secondary coolant water from reactor building to distributor basin. Date: August 2003. INEEL negative no. HD-35-10-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  6. ETRMTR MECHANICAL SERVICES BUILDING, TRA653, INTERIOR. CAMERA IS INSIDE MEN'S ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR-MTR MECHANICAL SERVICES BUILDING, TRA-653, INTERIOR. CAMERA IS INSIDE MEN'S LAVATORY AND SHOWER FACING SOUTHEAST. SHOWER AND TOILET STALLS ARE IN PLACE. ROUND COMMUNAL SINK AT LEFT. INL NEGATIVE NO. 57-3652. K. Mansfield, Photographer, 7/22/1957 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. Faces of the Fleet | Navy Live

    Science.gov Websites

    Photo captions from an annual training exercise at Ft. Knox, Ky.: a coxswain assigned to Coastal Riverine Squadron Four (CRS-4) navigates a waterway during the exercise (U.S. Navy Combat Camera photos by a Mass Communication Specialist).

  8. A Framework for People Re-Identification in Multi-Camera Surveillance Systems

    ERIC Educational Resources Information Center

    Ammar, Sirine; Zaghden, Nizar; Neji, Mahmoud

    2017-01-01

    People re-identification has been a very active research topic recently in computer vision. It is an important application in surveillance systems with disjoint cameras. This paper is focused on the implementation of a human re-identification system. First, the face of each detected person is divided into three parts and some soft-biometric traits are…

  9. Validation of Viewing Reports: Exploration of a Photographic Method.

    ERIC Educational Resources Information Center

    Fletcher, James E.; Chen, Charles Chao-Ping

    A time lapse camera loaded with Super 8 film was employed to photographically record the area in front of a conventional television receiver in selected homes. The camera took one picture each minute for three days, including in the same frame the face of the television receiver. Family members kept a conventional viewing diary of their viewing…

  10. Jupiter's Water Worlds

    NASA Technical Reports Server (NTRS)

    Pappalardo, R. T.

    2004-01-01

    When the twin Voyager spacecraft cruised past Jupiter in 1979, they did more than rewrite the textbooks on the giant planet. Their cameras also unveiled the astounding diversity of the four planet-size moons of ice and stone known as the Galilean satellites. The Voyagers revealed the cratered countenance of Callisto, the valleys and ridges of Ganymede, the cracked face of Europa, and the spewing volcanoes of Io. But it would take a spacecraft named for Italian scientist Galileo, who discovered the moons in 1610, to reveal the true complexity of these worlds and to begin to divulge their interior secrets. Incredibly, the Galileo data strongly suggest that Jupiter's three large icy moons (all but rocky Io) hide interior oceans.

  11. Bispectral infrared forest fire detection and analysis using classification techniques

    NASA Astrophysics Data System (ADS)

    Aranda, Jose M.; Melendez, Juan; de Castro, Antonio J.; Lopez, Fernando

    2004-01-01

    Infrared cameras are well established as a useful tool for fire detection, but their use for quantitative forest fire measurements faces difficulties, due to the complex spatial and spectral structure of fires. In this work it is shown that some of these difficulties can be overcome by applying classification techniques, a standard tool for the analysis of satellite multispectral images, to bi-spectral images of fires. Images were acquired by two cameras that operate in the medium infrared (MIR) and thermal infrared (TIR) bands. They provide simultaneous and co-registered images, calibrated in brightness temperatures. The MIR-TIR scatterplot of these images can be used to classify the scene into different fire regions (background, ashes, and several ember and flame regions). It is shown that classification makes it possible to obtain quantitative measurements of physical fire parameters like rate of spread, ember temperature, and radiated power in the MIR and TIR bands. An estimation of total radiated power and heat release per unit area is also made and compared with values derived from heat of combustion and fuel consumption.
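
    A minimal sketch of a per-pixel classification in the MIR-TIR scatterplot using hand-set decision rules; the class boundaries and temperature thresholds below are purely illustrative and are not the values derived in the paper.

    ```python
    import numpy as np

    def classify_bispectral(t_mir, t_tir):
        """Assign each pixel a coarse fire class from its MIR/TIR brightness
        temperatures (kelvin). Later assignments override earlier ones."""
        classes = np.full(t_mir.shape, "background", dtype=object)
        classes[(t_tir > 320) & (t_mir < 500)] = "ashes"
        classes[(t_mir >= 500) & (t_mir < 800)] = "embers"
        classes[(t_mir >= 800) | ((t_mir - t_tir) > 400)] = "flames"
        return classes
    ```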

  12. Three-dimensional face model reproduction method using multiview images

    NASA Astrophysics Data System (ADS)

    Nagashima, Yoshio; Agawa, Hiroshi; Kishino, Fumio

    1991-11-01

    This paper describes a method of reproducing three-dimensional face models using multi-view images for a virtual space teleconferencing system that achieves a realistic visual presence for teleconferencing. The goal of this research, as an integral component of a virtual space teleconferencing system, is to generate a three-dimensional face model from facial images, synthesize images of the model virtually viewed from different angles, and with natural shadow to suit the lighting conditions of the virtual space. The proposed method is as follows: first, front and side view images of the human face are taken by TV cameras. The 3D data of facial feature points are obtained from front- and side-views by an image processing technique based on the color, shape, and correlation of face components. Using these 3D data, the prepared base face models, representing typical Japanese male and female faces, are modified to approximate the input facial image. The personal face model, representing the individual character, is then reproduced. Next, an oblique view image is taken by TV camera. The feature points of the oblique view image are extracted using the same image processing technique. A more precise personal model is reproduced by fitting the boundary of the personal face model to the boundary of the oblique view image. The modified boundary of the personal face model is determined by using face direction, namely rotation angle, which is detected based on the extracted feature points. After the 3D model is established, the new images are synthesized by mapping facial texture onto the model.

  13. PBF Control Building (PER619). Interior of control room shows control ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Control Building (PER-619). Interior of control room shows control console from direction facing visitors room and its observation window. Camera facing northeast. Date: May 2004. INEEL negative no. HD-41-7-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  14. Real Time 3D Facial Movement Tracking Using a Monocular Camera

    PubMed Central

    Dong, Yanchao; Wang, Yanming; Yue, Jiguang; Hu, Zhencheng

    2016-01-01

    The paper proposes a robust framework for 3D facial movement tracking in real time using a monocular camera. It is designed to estimate the 3D face pose and local facial animation such as eyelid movement and mouth movement. The framework first uses the Discriminative Shape Regression method to locate the facial feature points in the 2D image and fuses the 2D data with a 3D face model using an Extended Kalman Filter to yield 3D facial movement information. An alternating optimization strategy is adopted to fit different persons automatically. Experiments show that the proposed framework can track 3D facial movement across various poses and illumination conditions. Given the real face scale, the framework can track the eyelid with an error of 1 mm and the mouth with an error of 2 mm. The tracking result is reliable for expression analysis or mental state inference. PMID:27463714
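
    A minimal sketch of the filtering step only: a linear constant-velocity Kalman filter tracking one 2-D landmark illustrates the predict/update cycle used when fusing noisy per-frame detections over time. The paper's Extended Kalman Filter additionally links measurements to a 3-D face model, which is not reproduced here; the noise parameters and frame rate below are assumptions.

    ```python
    # Minimal sketch of the predict/update cycle used when fusing 2-D landmark
    # measurements over time. A linear constant-velocity Kalman filter on one
    # landmark (x, y, vx, vy); the 3-D face model coupling from the paper is omitted.
    import numpy as np

    dt = 1.0 / 30.0                        # assumed frame interval
    F = np.eye(4); F[0, 2] = F[1, 3] = dt  # state transition (constant velocity)
    H = np.eye(2, 4)                       # we observe position only
    Q = np.eye(4) * 1e-3                   # process noise (assumed)
    R = np.eye(2) * 2.0                    # measurement noise in pixels (assumed)

    x = np.zeros(4)                        # state: [x, y, vx, vy]
    P = np.eye(4) * 10.0

    def kalman_step(x, P, z):
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with landmark measurement z = (u, v) from the 2-D detector
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        return x, P

    for z in [np.array([100.0, 120.0]), np.array([101.5, 121.0])]:
        x, P = kalman_step(x, P, z)
    print(x)
    ```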

  15. Real Time 3D Facial Movement Tracking Using a Monocular Camera.

    PubMed

    Dong, Yanchao; Wang, Yanming; Yue, Jiguang; Hu, Zhencheng

    2016-07-25

    The paper proposes a robust framework for 3D facial movement tracking in real time using a monocular camera. It is designed to estimate the 3D face pose and local facial animation such as eyelid movement and mouth movement. The framework first uses the Discriminative Shape Regression method to locate the facial feature points in the 2D image and fuses the 2D data with a 3D face model using an Extended Kalman Filter to yield 3D facial movement information. An alternating optimization strategy is adopted to fit different persons automatically. Experiments show that the proposed framework can track 3D facial movement across various poses and illumination conditions. Given the real face scale, the framework can track the eyelid with an error of 1 mm and the mouth with an error of 2 mm. The tracking result is reliable for expression analysis or mental state inference.

  16. Computer vision research with new imaging technology

    NASA Astrophysics Data System (ADS)

    Hou, Guangqi; Liu, Fei; Sun, Zhenan

    2015-12-01

    Light field imaging is capable of capturing dense multi-view 2D images in one snapshot, recording both the intensity values and the directions of rays simultaneously. As an emerging 3D device, the light field camera has been widely used in digital refocusing, depth estimation, stereoscopic display, etc. Traditional multi-view stereo (MVS) methods perform well only on strongly textured surfaces, and the resulting depth maps contain numerous holes and large ambiguities in textureless or low-textured regions. In this paper, we exploit light field imaging technology for 3D face modeling in computer vision. Based on a 3D morphable model, we estimate the pose parameters from facial feature points. The depth map is then estimated through the epipolar plane image (EPI) method. Finally, a high-quality 3D face model is recovered via a fusion strategy. We evaluate the effectiveness and robustness on face images captured by a light field camera at different poses.
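
    A minimal sketch of the EPI idea mentioned above: a scene point traces a line in the epipolar plane image, and its slope (disparity per view) follows from brightness constancy, d = -I_s / I_u. The synthetic EPI, the gradient-ratio estimator, and the camera parameters below are illustrative assumptions, not the paper's depth-estimation pipeline.

    ```python
    # Minimal sketch: estimating disparity (line slope) in a synthetic epipolar
    # plane image from the gradient ratio d = -I_s / I_u, then converting to depth.
    import numpy as np

    n_views, width = 9, 64
    true_disp = 0.5                         # pixels of shift per view (synthetic)
    u = np.arange(width)
    epi = np.zeros((n_views, width))
    for s in range(n_views):
        # A single scene point rendered as a Gaussian blob sliding across views.
        epi[s] = np.exp(-0.5 * ((u - (20 + true_disp * s)) / 2.0) ** 2)

    Iu = np.gradient(epi, axis=1)           # derivative along the spatial axis u
    Is = np.gradient(epi, axis=0)           # derivative along the view axis s

    mask = np.abs(Iu) > 1e-3
    disparity = -(Is[mask] / Iu[mask]).mean()   # approximate slope estimate

    focal_px, baseline_m = 600.0, 0.001     # assumed camera parameters
    depth_m = focal_px * baseline_m / disparity
    print(round(disparity, 3), round(depth_m, 3))
    ```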

  17. Three-dimensional face pose detection and tracking using monocular videos: tool and application.

    PubMed

    Dornaika, Fadi; Raducanu, Bogdan

    2009-08-01

    Recently, we have proposed a real-time tracker that simultaneously tracks the 3-D head pose and facial actions in monocular video sequences that can be provided by low quality cameras. This paper has two main contributions. First, we propose an automatic 3-D face pose initialization scheme for the real-time tracker by adopting a 2-D face detector and an eigenface system. Second, we use the proposed methods (initialization and tracking) for enhancing the human-machine interaction functionality of an AIBO robot. More precisely, we show how the orientation of the robot's camera (or any active vision system) can be controlled through the estimation of the user's head pose. Applications based on head-pose imitation such as telepresence, virtual reality, and video games can directly exploit the proposed techniques. Experiments on real videos confirm the robustness and usefulness of the proposed methods.

  18. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F.; Herkenhoff, K. E.; Squyres, S. W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ≈4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.

  19. Power Burst Facility (PBF), PER620, contextual and oblique view. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Power Burst Facility (PBF), PER-620, contextual and oblique view. Camera facing northwest. South and east facade. The 1980 west-wing expansion is left of center bay. Concrete structure at right is PER-730. Date: March 2004. INEEL negative no. HD-41-2-3 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  20. NGEE Arctic Webcam Photographs, Barrow Environmental Observatory, Barrow, Alaska

    DOE Data Explorer

    Bob Busey; Larry Hinzman

    2012-04-01

    The NGEE Arctic Webcam (PTZ Camera) captures two views of seasonal transitions from its generally south-facing position on a tower located at the Barrow Environmental Observatory near Barrow, Alaska. Images are captured every 30 minutes. Historical images are available for download. The camera is operated by the U.S. DOE sponsored Next Generation Ecosystem Experiments - Arctic (NGEE Arctic) project.

  1. Camera-Based Microswitch Technology for Eyelid and Mouth Responses of Persons with Profound Multiple Disabilities: Two Case Studies

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff

    2010-01-01

    These two studies assessed camera-based microswitch technology for eyelid and mouth responses of two persons with profound multiple disabilities and minimal motor behavior. This technology, in contrast with the traditional optic microswitches used for those responses, did not require support frames on the participants' face but only small color…

  2. A&M. Hot liquid waste treatment building (TAN616), south side. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616), south side. Camera facing north. Personnel door at left side of wall. Partial view of outdoor stairway to upper level platform. Note concrete construction. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-1-3 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  3. ADM. Aerial view of administration area. Camera facing westerly. From ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ADM. Aerial view of administration area. Camera facing westerly. From left to right in foregound: Substation (TAN-605), Warehouse (TAN-628), Gate House (TAN-601), Administration Building (TAN-602). Left to right middle ground: Service Building (TAN-603), Warehouse (later known as Maintenance Shop or Craft Shop, TAN-604), Water Well Pump Houses, Fuel Tanks and Fuel Pump Houses, and Water Storage Tanks. Change House (TAN-606) on near side of berm. Large building beyond berm is A&M. Building, TAN-607. Railroad tracks beyond lead from (unseen) turntable to the IET. Date: June 6, 1955. INEEL negative no. 13201 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  4. PBF (PER620) interior of Reactor Room. Camera facing south from ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) interior of Reactor Room. Camera facing south from stairway platform in southwest corner (similar to platform in view at left). Reactor was beneath water in circular tank. Fuel was stored in the canal north of it. Platform and apparatus at right is reactor bridge with control rod mechanisms and actuators. The entire apparatus swung over the reactor and pool during operations. Personnel in view are involved with decontamination and preparation of facility for demolition. Note rails near ceiling for crane; motor for rollup door at upper center of view. Date: March 2004. INEEL negative no. HD-41-3-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  5. From a Million Miles Away, NASA Camera Shows Moon Crossing Face of Earth

    NASA Image and Video Library

    2015-08-05

    This animation still image shows the far side of the moon, illuminated by the sun, as it crosses between the DSCOVR spacecraft's Earth Polychromatic Imaging Camera (EPIC) and telescope, and the Earth, one million miles away. Credits: NASA/NOAA A NASA camera aboard the Deep Space Climate Observatory (DSCOVR) satellite captured a unique view of the moon as it moved in front of the sunlit side of Earth last month. The series of test images shows the fully illuminated “dark side” of the moon that is never visible from Earth. The images were captured by NASA’s Earth Polychromatic Imaging Camera (EPIC), a four megapixel CCD camera and telescope on the DSCOVR satellite orbiting 1 million miles from Earth. From its position between the sun and Earth, DSCOVR conducts its primary mission of real-time solar wind monitoring for the National Oceanic and Atmospheric Administration (NOAA).

  6. Can we match ultraviolet face images against their visible counterparts?

    NASA Astrophysics Data System (ADS)

    Narang, Neeru; Bourlai, Thirimachos; Hornak, Lawrence A.

    2015-05-01

    In law enforcement and security applications, the acquisition of face images is critical in producing key trace evidence for the successful identification of potential threats. However, face recognition (FR) for face images captured using different camera sensors, under variable illumination conditions, and with varying expressions is very challenging. In this paper, we investigate the advantages and limitations of the heterogeneous problem of matching ultraviolet (UV, 100 nm to 400 nm in wavelength) face images against their visible (VIS) counterparts, when all face images are captured under controlled conditions. The contributions of our work are three-fold: (i) we used a camera sensor designed with the capability to acquire UV images at short ranges, and generated a dual-band (VIS and UV) database composed of multiple, full frontal, face images of 50 subjects, collected in two sessions spanning a period of 2 months; (ii) for each dataset, we determined which set of face image pre-processing algorithms is more suitable for face matching; and, finally, (iii) we determined which FR algorithm better matches cross-band face images, resulting in high rank-1 identification rates. Experimental results show that our cross-spectral matching algorithms (the heterogeneous problem, where gallery and probe sets consist of face images acquired in different spectral bands) achieve sufficient identification performance. However, we also conclude that the problem under study is very challenging, and it requires further investigation to address real-world law enforcement or military applications. To the best of our knowledge, this is the first time in the open literature that the problem of cross-spectral matching of UV against VIS band face images has been investigated.

  7. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum †

    PubMed Central

    Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi

    2016-01-01

    During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from thermal information alone is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in the thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step, while the second step processes patches in an image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations. PMID:27110781
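
    A minimal sketch of the CCA component only, assuming paired thermal/visible training patches: scikit-learn's CCA learns the shared subspace, and a least-squares map from canonical variates back to visible patches gives a crude reconstruction. The random placeholder data and the reconstruction step are assumptions; the paper's two-step (whole image, then patches) pipeline is not reproduced.

    ```python
    # Minimal sketch: learning a thermal-to-visible relationship with CCA on
    # paired training patches, then reconstructing a visible estimate for a new
    # thermal patch. Data here are random placeholders.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    X_thermal = rng.normal(size=(200, 64))   # 200 flattened thermal patches
    Y_visible = rng.normal(size=(200, 64))   # corresponding visible patches

    cca = CCA(n_components=16)
    cca.fit(X_thermal, Y_visible)

    # Project a new thermal patch into the canonical space...
    t_new = rng.normal(size=(1, 64))
    u = cca.transform(t_new)

    # ...and map canonical variates back to visible space by least squares
    # (one plausible post-processing choice, assumed here).
    U_train, V_train = cca.transform(X_thermal, Y_visible)
    W, *_ = np.linalg.lstsq(U_train, Y_visible, rcond=None)
    y_est = u @ W
    print(y_est.shape)
    ```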

  8. Teleoperated control system for underground room and pillar mining

    DOEpatents

    Mayercheck, William D.; Kwitowski, August J.; Brautigam, Albert L.; Mueller, Brian K.

    1992-01-01

    A teleoperated mining system is provided for remotely controlling the various machines involved with thin seam mining. A thin seam continuous miner located at a mining face includes a camera mounted thereon and a slave computer for controlling the miner and the camera. A plurality of sensors relay information about the miner and the face to the slave computer. A slave-computer-controlled ventilation sub-system removes combustible material from the mining face. A haulage sub-system removes material mined by the continuous miner from the mining face to a collection site and is also controlled by the slave computer. A base station, which controls the supply of power and water to the continuous miner, haulage, and ventilation systems, includes a cable/hose handling module for winding or unwinding cables/hoses connected to the miner, an operator control module, and a hydraulic power and air compressor module for supplying air to the miner. An operator-controlled host computer housed in the operator control module is connected to the slave computer via a two-wire communications line.

  9. LOFT. Interior view of entry to reactor building, TAN650. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Interior view of entry to reactor building, TAN-650. Camera is inside entry (TAN-624) and facing north. At far end of domed chamber are penetrations in wall for electrical and other connections. Reactor and other equipment has been removed. Date: March 2004. INEEL negative no. HD-39-5-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  10. MTR BUILDING, TRA603. SOUTHEAST CORNER, EAST SIDE FACING TOWARD RIGHT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR BUILDING, TRA-603. SOUTHEAST CORNER, EAST SIDE FACING TOWARD RIGHT OF VIEW. CAMERA FACING NORTHWEST. LIGHT-COLORED PROJECTION AT LEFT IS ENGINEERING SERVICES BUILDING, TRA-635. SMALL CONCRETE BLOCK BUILDING AT CENTER OF VIEW IS FAST CHOPPER DETECTOR HOUSE, TRA-665. INL NEGATIVE NO. HD46-43-3. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. ETR, TRA642, CAMERA IS BELOW, BUT NEAR THE CEILING OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR, TRA-642, CAMERA IS BELOW, BUT NEAR THE CEILING OF THE GROUND FLOOR, AND LOOKS DOWN TOWARD THE CONSOLE FLOOR. CAMERA FACES WESTERLY. THE REACTOR PIT IS IN THE CENTER OF THE VIEW. BEYOND IT TO THE LEFT IS THE SOUTH SIDE OF THE WORKING CANAL. IN THE FOREGROUND ON THE RIGHT IS THE SHIELDING FOR THE PROCESS WATER TUNNEL AND PIPING. SPIRAL STAIRCASE AT LEFT OF VIEW. INL NEGATIVE NO. 56-2237. Jack L. Anderson, Photographer, 7/6/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  12. A multiple camera tongue switch for a child with severe spastic quadriplegic cerebral palsy.

    PubMed

    Leung, Brian; Chau, Tom

    2010-01-01

    The present study proposed a video-based access technology that facilitated a non-contact tongue protrusion access modality for a 7-year-old boy with severe spastic quadriplegic cerebral palsy (GMFCS level 5). The proposed system featured a centre camera and two peripheral cameras to extend coverage of the frontal face view of this user for longer durations. The child participated in a descriptive case study. The participant underwent 3 months of tongue protrusion training while the multiple camera tongue switch prototype was being prepared. Later, the participant was brought back for five experiment sessions where he worked on a single-switch picture matching activity, using the multiple camera tongue switch prototype in a controlled environment. The multiple camera tongue switch achieved an average sensitivity of 82% and specificity of 80%. In three of the experiment sessions, the peripheral cameras were associated with most of the true positive switch activations. These activations would have been missed by a centre-camera-only setup. The study demonstrated proof-of-concept of a non-contact tongue access modality implemented by a video-based system involving three cameras and colour video processing.

  13. An Application for Driver Drowsiness Identification based on Pupil Detection using IR Camera

    NASA Astrophysics Data System (ADS)

    Kumar, K. S. Chidanand; Bhowmick, Brojeshwar

    A driver drowsiness identification system is proposed that generates an alarm when the driver falls asleep while driving. A number of different physical phenomena can be monitored and measured in order to detect drowsiness of the driver in a vehicle. This paper presents a methodology for driver drowsiness identification using an IR camera by detecting and tracking the pupils. The face region is first determined using the Euler number and template matching. Pupils are then located in the face region. In subsequent frames of video, the pupils are tracked in order to determine whether the eyes are open or closed. If the eyes are closed for several consecutive frames, it is concluded that the driver is fatigued and an alarm is generated.
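
    A minimal sketch of the decision logic only: if the eyes are judged closed for several consecutive frames, an alarm is raised. The frame threshold and the per-frame eye-state input are assumptions; the Euler-number/template-matching pupil detector itself is not reproduced.

    ```python
    # Minimal sketch of the alarm logic: raise an alarm after a run of
    # consecutive closed-eye frames. Threshold and input stream are assumed.
    CLOSED_FRAMES_THRESHOLD = 15  # e.g. ~0.5 s at 30 fps (assumed)

    def drowsiness_monitor(eye_open_stream):
        """eye_open_stream yields True (eyes open) / False (eyes closed) per frame."""
        closed_run = 0
        for frame_idx, eyes_open in enumerate(eye_open_stream):
            closed_run = 0 if eyes_open else closed_run + 1
            if closed_run >= CLOSED_FRAMES_THRESHOLD:
                yield frame_idx  # frame at which an alarm would be generated

    # Toy stream: eyes open for 20 frames, then closed for 20 frames.
    alarms = list(drowsiness_monitor([True] * 20 + [False] * 20))
    print(alarms)  # alarms start once 15 consecutive closed frames have been seen
    ```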

  14. High-emulation mask recognition with high-resolution hyperspectral video capture system

    NASA Astrophysics Data System (ADS)

    Feng, Jiao; Fang, Xiaojing; Li, Shoufeng; Wang, Yongjin

    2014-11-01

    We present a method for distinguishing a human face from a high-emulation mask, which is increasingly used by criminals for activities such as stealing card numbers and passwords at ATMs. Traditional facial recognition techniques have difficulty detecting such camouflaged criminals. In this paper, we use a high-resolution hyperspectral video capture system to detect high-emulation masks. An RGB camera is used for traditional facial recognition. A prism and a gray-scale camera are used to capture spectral information of the observed face. Experiments show that a mask made of silica gel has a different spectral reflectance from human skin. As a multispectral image offers additional spectral information about physical characteristics, a high-emulation mask can be easily recognized.

  15. Brief Report: Using a Point-of-View Camera to Measure Eye Gaze in Young Children with Autism Spectrum Disorder during Naturalistic Social Interactions--A Pilot Study

    ERIC Educational Resources Information Center

    Edmunds, Sarah R.; Rozga, Agata; Li, Yin; Karp, Elizabeth A.; Ibanez, Lisa V.; Rehg, James M.; Stone, Wendy L.

    2017-01-01

    Children with autism spectrum disorder (ASD) show reduced gaze to social partners. Eye contact during live interactions is often measured using stationary cameras that capture various views of the child, but determining a child's precise gaze target within another's face is nearly impossible. This study compared eye gaze coding derived from…

  16. ETR, TRA642. CONSOLE FLOOR. CAMERA IS ON WEST SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR, TRA-642. CONSOLE FLOOR. CAMERA IS ON WEST SIDE OF FLOOR AND FACES NORTH. OUTER WALL OF STORAGE CANAL IS AT RIGHT. SHIELDING IS THICKER AT LOWER LEVEL, WHERE SPENT FUEL ELEMENTS WILL COOL AFTER REMOVAL FROM REACTOR. INL NEGATIVE NO. 56-1401. Jack L. Anderson, Photographer, 5/1/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  17. PBF Reactor Building (PER620). Camera is in cab of electricpowered ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera is in cab of electric-powered rail crane and facing east. Reactor pit and storage canal have been shaped. Floors for wings on east and west side are above and below reactor in view. Photographer: Larry Page. Date: August 23, 1967. INEEL negative no. 67-4403 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  18. IET. Aerial view of snaptran destructive experiment in 1964. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Aerial view of snaptran destructive experiment in 1964. Camera facing north. Test cell building (TAN-624) is positioned away from coupling station. Weather tower in right foreground. Divided duct just beyond coupling station. Air intake structure on south side of shielded control room. Experiment is on dolly at coupling station. Date: 1964. INEEL negative no. 64-1736 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  19. 5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  20. 32. DETAIL VIEW OF CAMERA PIT SOUTH OF LAUNCH PAD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    32. DETAIL VIEW OF CAMERA PIT SOUTH OF LAUNCH PAD WITH CAMERA AIMED AT LAUNCH DECK; VIEW TO NORTHEAST. - Cape Canaveral Air Station, Launch Complex 17, Facility 28402, East end of Lighthouse Road, Cape Canaveral, Brevard County, FL

  1. Face pose tracking using the four-point algorithm

    NASA Astrophysics Data System (ADS)

    Fung, Ho Yin; Wong, Kin Hong; Yu, Ying Kin; Tsui, Kwan Pang; Kam, Ho Chuen

    2017-06-01

    In this paper, we have developed an algorithm to track the pose of a human face robustly and efficiently. Face pose estimation is very useful in many applications such as building virtual reality systems and creating an alternative input method for the disabled. Firstly, we have modified a face detection toolbox called DLib for the detection of a face in front of a camera. The detected face features are passed to a pose estimation method, known as the four-point algorithm, for pose computation. The theory applied and the technical problems encountered during system development are discussed in the paper. It is demonstrated that the system is able to track the pose of a face in real time using a consumer grade laptop computer.
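
    A minimal sketch of pose recovery from a handful of facial landmarks, using OpenCV's generic PnP solver rather than the paper's specific four-point algorithm. The 3-D model coordinates, camera intrinsics, and landmark pixels below are illustrative assumptions.

    ```python
    # Minimal sketch: head pose from four facial landmarks with a generic PnP
    # solver (EPnP). Model points, intrinsics, and detections are placeholders.
    import numpy as np
    import cv2

    # Rough 3-D positions (mm) of four facial points in a head-centred frame (assumed).
    model_points = np.array([
        [0.0,   0.0,    0.0],    # nose tip
        [-30.0, -30.0, -30.0],   # left eye outer corner
        [30.0,  -30.0, -30.0],   # right eye outer corner
        [0.0,   40.0,  -20.0],   # chin
    ], dtype=np.float64)

    # Corresponding 2-D detections (pixels), e.g. from a DLib landmark detector.
    image_points = np.array([
        [320.0, 240.0], [280.0, 210.0], [360.0, 210.0], [322.0, 300.0],
    ], dtype=np.float64)

    # Simple pinhole intrinsics for a 640x480 camera (assumed, no distortion).
    K = np.array([[600.0, 0, 320.0], [0, 600.0, 240.0], [0, 0, 1.0]])
    dist = np.zeros(4)

    ok, rvec, tvec = cv2.solvePnP(model_points, image_points, K, dist,
                                  flags=cv2.SOLVEPNP_EPNP)
    R, _ = cv2.Rodrigues(rvec)   # rotation matrix of the head pose
    print(ok, R, tvec, sep="\n")
    ```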

  2. Music-Elicited Emotion Identification Using Optical Flow Analysis of Human Face

    NASA Astrophysics Data System (ADS)

    Kniaz, V. V.; Smirnova, Z. N.

    2015-05-01

    Human emotion identification from image sequences is in high demand nowadays. The range of possible applications varies from the automatic smile shutter function of consumer-grade digital cameras to Biofied Building technologies, which enable communication between a building space and its residents. The highly perceptual nature of human emotions makes their classification and identification complex. The main question arises from the subjective quality of the emotional classification of events that elicit human emotions. A variety of methods for the formal classification of emotions have been developed in musical psychology. This work is focused on the identification of human emotions evoked by musical pieces using human face tracking and optical flow analysis. The facial feature tracking algorithm used for facial feature speed and position estimation is presented. Facial features were extracted from each image sequence using human face tracking with local binary pattern (LBP) features. Accurate relative speeds of facial features were estimated using optical flow analysis. The obtained relative positions and speeds were used as the output facial emotion vector. The algorithm was tested using original software and recorded image sequences. The proposed technique proves to give a robust identification of human emotions elicited by musical pieces. The estimated models could be used for human emotion identification from image sequences in such fields as emotion-based musical backgrounds or mood-dependent radio.
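
    A minimal sketch of the optical-flow step: tracking facial feature points between two consecutive frames with pyramidal Lucas-Kanade flow and turning displacements into speeds. The LBP-based face tracking and the emotion model are not reproduced; the frames, points, and frame rate are hypothetical inputs.

    ```python
    # Minimal sketch: per-point optical flow between two frames and the
    # resulting feature speeds (pixels per second). Inputs are placeholders.
    import numpy as np
    import cv2

    prev_gray = np.zeros((480, 640), dtype=np.uint8)   # previous frame (grayscale)
    next_gray = np.zeros((480, 640), dtype=np.uint8)   # current frame (grayscale)
    points = np.array([[[300.0, 200.0]], [[340.0, 200.0]], [[320.0, 260.0]]],
                      dtype=np.float32)                # facial feature points (u, v)

    next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, points, None)

    fps = 30.0  # assumed frame rate
    displacement = (next_pts - points).reshape(-1, 2)
    speed = np.linalg.norm(displacement, axis=1) * fps   # pixels per second
    valid = status.ravel() == 1
    print(speed[valid])  # relative feature speeds, one ingredient of the emotion vector
    ```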

  3. Adapting Local Features for Face Detection in Thermal Image.

    PubMed

    Ma, Chao; Trung, Ngo Thanh; Uchiyama, Hideaki; Nagahara, Hajime; Shimada, Atsushi; Taniguchi, Rin-Ichiro

    2017-11-27

    A thermal camera captures the temperature distribution of a scene as a thermal image. In thermal images, the facial appearances of different people under different lighting conditions are similar, because the facial temperature distribution is generally constant and not affected by lighting conditions. This similarity in face appearance is advantageous for face detection. To detect faces in thermal images, cascade classifiers with Haar-like features are generally used. However, there are few studies exploring local features for face detection in thermal images. In this paper, we introduce two approaches relying on local features for face detection in thermal images. First, we create new feature types by extending Multi-Block LBP. We consider a margin around the reference and the generally constant distribution of facial temperature. In this way, we make the features more robust to image noise and more effective for face detection in thermal images. Second, we propose an AdaBoost-based training method to obtain cascade classifiers with multiple types of local features. These feature types have different advantages, and combining them enhances the descriptive power of the local features. We carried out a hold-out validation experiment and a field experiment. In the hold-out validation experiment, we captured a dataset from 20 participants, comprising 14 males and 6 females. For each participant, we captured 420 images with 10 variations in camera distance, 21 poses, and 2 appearances (participant with/without glasses). We compared the performance of cascade classifiers trained with different sets of features. The results showed that the proposed approaches effectively improve the performance of face detection in thermal images. In the field experiment, we compared face detection performance in realistic scenes using thermal and RGB images, and discussed the results.
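
    A minimal sketch of a Multi-Block LBP code with a tolerance margin around the reference (centre) block, in the spirit of the extension described above. Treating "within a margin of the centre mean" as equal is an assumption here; the authors' exact margin rule and their AdaBoost training are not reproduced.

    ```python
    # Minimal sketch: 8-bit Multi-Block LBP code over a 3x3 block grid, with an
    # assumed tolerance margin around the centre-block mean.
    import numpy as np

    def mb_lbp_code(patch, margin=0.0):
        """patch: 2-D array whose height and width are at least 3 pixels."""
        h, w = patch.shape
        bh, bw = h // 3, w // 3
        # Mean intensity of each of the 3x3 blocks.
        means = patch[:3 * bh, :3 * bw].reshape(3, bh, 3, bw).mean(axis=(1, 3))
        centre = means[1, 1]
        # Neighbouring blocks in a fixed clockwise order starting at the top-left.
        order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
        code = 0
        for bit, (r, c) in enumerate(order):
            if means[r, c] >= centre - margin:   # margin makes the code noise-tolerant
                code |= 1 << bit
        return code

    patch = np.arange(81, dtype=float).reshape(9, 9)
    print(mb_lbp_code(patch, margin=2.0))
    ```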

  4. Heart rate measurement based on face video sequence

    NASA Astrophysics Data System (ADS)

    Xu, Fang; Zhou, Qin-Wu; Wu, Peng; Chen, Xing; Yang, Xiaofeng; Yan, Hong-jian

    2015-03-01

    This paper proposes a new non-contact heart rate measurement method based on photoplethysmography (PPG) theory. With this method we can measure heart rate remotely with a camera and ambient light. We collected video sequences of subjects, and detected remote PPG signals through video sequences. Remote PPG signals were analyzed with two methods, Blind Source Separation Technology (BSST) and Cross Spectral Power Technology (CSPT). BSST is a commonly used method, and CSPT is used for the first time in the study of remote PPG signals in this paper. Both of the methods can acquire heart rate, but compared with BSST, CSPT has clearer physical meaning, and the computational complexity of CSPT is lower than that of BSST. Our work shows that heart rates detected by CSPT method have good consistency with the heart rates measured by a finger clip oximeter. With good accuracy and low computational complexity, the CSPT method has a good prospect for the application in the field of home medical devices and mobile health devices.
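
    A minimal sketch of the basic remote-PPG idea behind both analyses: take a face-region signal over time (here a synthetic stand-in for the mean green-channel value), restrict to a physiological frequency band, and read the heart rate from the dominant spectral peak. This illustrates neither BSST nor CSPT from the paper; the signal, frame rate, and band limits are assumptions.

    ```python
    # Minimal sketch: heart rate from the dominant FFT peak of a (synthetic)
    # face-region brightness signal, limited to a plausible heart-rate band.
    import numpy as np

    fps = 30.0
    t = np.arange(0, 30, 1 / fps)                    # 30 s of frames
    true_hr_hz = 1.2                                  # 72 bpm, synthetic ground truth
    signal = 0.05 * np.sin(2 * np.pi * true_hr_hz * t) + np.random.normal(0, 0.02, t.size)

    signal = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1 / fps)

    # Restrict to a physiological band, e.g. 0.7-3.0 Hz (42-180 bpm).
    band = (freqs >= 0.7) & (freqs <= 3.0)
    hr_hz = freqs[band][np.argmax(spectrum[band])]
    print(round(hr_hz * 60, 1), "bpm")
    ```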

  5. Serial fusion of Eulerian and Lagrangian approaches for accurate heart-rate estimation using face videos.

    PubMed

    Gupta, Puneet; Bhowmick, Brojeshwar; Pal, Arpan

    2017-07-01

    Camera-equipped devices are ubiquitous and proliferating in day-to-day life. Accurate heart rate (HR) estimation from face videos acquired by low-cost cameras in a non-contact manner can be used in many real-world scenarios and hence requires rigorous exploration. This paper presents an accurate and near real-time HR estimation system using such face videos. It is based on the phenomenon that the color and motion variations in the face video are closely related to the heart beat. The variations also contain noise due to facial expressions, respiration, eye blinking, and environmental factors, which are handled by the proposed system. Neither Eulerian nor Lagrangian temporal signals can provide accurate HR in all cases. The cases where Eulerian temporal signals perform spuriously are determined using a novel poorness measure, and then both the Eulerian and Lagrangian temporal signals are employed for better HR estimation. Such a fusion is referred to as serial fusion. Experimental results reveal that the error introduced by the proposed algorithm is 1.8 ± 3.6, which is significantly lower than that of existing well-known systems.

  6. A&M. TAN607. Detail of fuel storage pool under construction. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. TAN-607. Detail of fuel storage pool under construction. Camera is on berm and facing northwest. Note depth of excavation. Formwork underway for floor and concrete walls of pool; wall between pool and vestibule. At center left of view, foundation for liquid waste treatment plant is poured. Date: August 25, 1953. INEEL negative no. 8541 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  7. REACTOR SERVICE BUILDING, TRA635, CONTEXTUAL VIEW DURING CONSTRUCTION. CAMERA IS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    REACTOR SERVICE BUILDING, TRA-635, CONTEXTUAL VIEW DURING CONSTRUCTION. CAMERA IS ATOP MTR BUILDING AND LOOKING SOUTHERLY. FOUNDATION AND DRAINS ARE UNDER CONSTRUCTION. THE BUILDING WILL BUTT AGAINST CHARGING FACE OF PLUG STORAGE BUILDING. HOT CELL BUILDING, TRA-632, IS UNDER CONSTRUCTION AT TOP CENTER OF VIEW. INL NEGATIVE NO. 8518. Unknown Photographer, 8/25/1953 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  8. ETR BUILDING, TRA642, INTERIOR. CONSOLE FLOOR, NORTH HALF. CAMERA IS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. CONSOLE FLOOR, NORTH HALF. CAMERA IS NEAR NORTHWEST CORNER AND FACING SOUTH ALONG WEST CORRIDOR. STORAGE CANAL IS ALONG LEFT OF VIEW; PERIMETER WALL, ALONG RIGHT. CORRIDOR WAS ONE MEANS OF WALKING FROM NORTH TO SOUTH SIDE OF CONSOLE FLOOR. INL NEGATIVE NO. HD46-18-1. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. Acquisition of the spatial temperature distribution of rock faces by using infrared thermography

    NASA Astrophysics Data System (ADS)

    Beham, Michael; Rode, Matthias; Schnepfleitner, Harald; Sass, Oliver

    2013-04-01

    Rock temperature plays a central role in weathering and therefore influences the risk potential originating from rockfall processes. So far, mainly point-based measuring methods have been used to acquire temperature, and accordingly, two-dimensional temperature data are rare. To overcome this limitation, an infrared camera was used to collect and analyse data on the spatial temperature distribution of 10 x 10 m sections of rock faces in the Gesäuse (900 m a.s.l.) and in the Dachsteingebirge (2700 m a.s.l.) within the framework of the research project ROCKING ALPS (FWF-P24244). The advantage of infrared thermography for capturing area-wide temperatures has hardly ever been used in this context. In order to investigate the differences between north-facing and south-facing rock faces at about the same period of time, it was necessary to move the camera between the sites. The resulting offset of the time-lapse infrared images made it necessary to develop a methodology to rectify the captured images in order to create matching datasets for later analysis. With the relatively simple camera used, one of the main challenges was to find a way to convert the colour-scale or grey-scale values of the rectified image back to temperature values. The processing steps were mainly carried out with MATLAB. South-facing rock faces generally experienced higher temperatures and amplitudes compared to the north-facing ones. With regard to the spatial temperature distribution, the temperatures of shady areas were clearly below those of sunny ones, with the latter also showing the highest amplitudes. Joints and sun-shaded areas were characterised by attenuated diurnal temperature fluctuations that closely paralleled the air temperature. The temperature of protruding rock parts and of loose debris responded very quickly to changes in radiation and air temperature, while massive rock reacted more slowly. The potential effects of temperature on weathering could only be assessed in a qualitative way so far. However, the variability of temperatures and amplitudes on a rather small and homogeneous section of a rockwall is surprisingly high, which challenges any statements on weathering effectiveness based on point measurements. In simple terms, the use of infrared thermography has proven its value in the presented pilot study and promises to be a useful tool for research into rock weathering.
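
    A minimal sketch of the recalibration step mentioned above: mapping the grey-scale values of a rectified infrared image back to temperatures, assuming a linear grey scale between the scene's known minimum and maximum temperatures. This is one plausible approach, not necessarily the authors' MATLAB procedure; the values below are placeholders.

    ```python
    # Minimal sketch: linear grey-scale-to-temperature conversion for a
    # rectified infrared image, given the scene's temperature range (assumed known).
    import numpy as np

    def gray_to_temperature(gray, t_min, t_max):
        """gray: uint8 array (0-255) from the rectified image; returns deg C."""
        return t_min + (gray.astype(float) / 255.0) * (t_max - t_min)

    gray = np.array([[0, 128, 255]], dtype=np.uint8)
    print(gray_to_temperature(gray, t_min=-5.0, t_max=30.0))  # approx [-5.0, 12.6, 30.0]
    ```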

  10. Research on the electro-optical assistant landing system based on the dual camera photogrammetry algorithm

    NASA Astrophysics Data System (ADS)

    Mi, Yuhe; Huang, Yifan; Li, Lin

    2015-08-01

    Based on the location technique of beacon photogrammetry, a Dual Camera Photogrammetry (DCP) algorithm was used to assist helicopters landing on a ship. In this paper, ZEMAX was used to simulate two Charge Coupled Device (CCD) cameras imaging four beacons on both sides of the helicopter and to output the images to MATLAB. Target coordinate systems, image pixel coordinate systems, world coordinate systems, and camera coordinate systems were established respectively. According to the ideal pin-hole imaging model, the rotation matrix and translation vector between the target coordinate systems and the camera coordinate systems can be obtained by using MATLAB to process the image information and solve the linear equations. On this basis, the ambient temperature and the positions of the beacons and cameras were varied in ZEMAX to test the accuracy of the DCP algorithm in complex sea states. The numerical simulation shows that in complex sea states, the position measurement accuracy can meet the requirements of the project.

  11. Contextual view showing northeastern eucalyptus windbreak and portion of citrus ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing northeastern eucalyptus windbreak and portion of citrus orchard. Camera facing 118° east-southeast. - Goerlitz House, 9893 Highland Avenue, Rancho Cucamonga, San Bernardino County, CA

  12. Accuracy of Wearable Cameras to Track Social Interactions in Stroke Survivors.

    PubMed

    Dhand, Amar; Dalton, Alexandra E; Luke, Douglas A; Gage, Brian F; Lee, Jin-Moo

    2016-12-01

    Social isolation after a stroke is related to poor outcomes. However, a full study of social networks on stroke outcomes is limited by the current metrics available. Typical measures of social networks rely on self-report, which is vulnerable to response bias and measurement error. We aimed to test the accuracy of an objective measure, wearable cameras, to capture face-to-face social interactions in stroke survivors. If accurate and usable in real-world settings, this technology would allow improved examination of social factors on stroke outcomes. In this prospective study, 10 stroke survivors each wore 2 wearable cameras: Autographer (OMG Life Limited, Oxford, United Kingdom) and Narrative Clip (Narrative, Linköping, Sweden). Each camera automatically took a picture every 20-30 seconds. Patients mingled with healthy controls for 5 minutes of 1-on-1 interactions followed by 5 minutes of no interaction for 2 hours. After the event, 2 blinded judges assessed whether photograph sequences identified interactions or noninteractions. Diagnostic accuracy statistics were calculated. A total of 8776 photographs were taken and adjudicated. In distinguishing interactions, the Autographer's sensitivity was 1.00 and specificity was .98. The Narrative Clip's sensitivity was .58 and specificity was 1.00. The receiver operating characteristic curves of the 2 devices were statistically different (Z = 8.26, P < .001). Wearable cameras can accurately detect social interactions of stroke survivors. Likely because of its large field of view, the Autographer was more sensitive than the Narrative Clip for this purpose. Copyright © 2016 National Stroke Association. Published by Elsevier Inc. All rights reserved.
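
    A minimal sketch of the diagnostic-accuracy arithmetic used above: sensitivity = TP / (TP + FN) and specificity = TN / (TN + FP), computed from adjudicated photograph sequences. The counts below are placeholders, not the study's actual confusion-matrix entries.

    ```python
    # Sensitivity and specificity from hypothetical adjudication counts.
    def sensitivity(tp, fn):
        return tp / (tp + fn)

    def specificity(tn, fp):
        return tn / (tn + fp)

    # Placeholder adjudication of photograph sequences for one camera:
    tp, fn, tn, fp = 58, 2, 57, 3
    print(round(sensitivity(tp, fn), 2), round(specificity(tn, fp), 2))
    ```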

  13. 9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  14. Mars Orbiter Camera Views the 'Face on Mars' - Best View from Viking

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility 9:15 AM and the raw image immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS.

    The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970s. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long.

    This Viking Orbiter image is one of the best Viking pictures of the area Cydonia where the 'Face' is located. Marked on the image are the 'footprint' of the high resolution (narrow angle) Mars Orbiter Camera image and the area seen in enlarged views (dashed box). See PIA01440-1442 for these images in raw and processed form.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  15. LOFT. Reactor support apparatus inside containment building (TAN650). Camera is ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Reactor support apparatus inside containment building (TAN-650). Camera is on crane rail level and facing northerly. View shows top two banks of round conduit openings on wall for electrical and other connections to control room. Ladders and platforms provide access to reactor instrumentation. Note hatch in floor and drain at edge of floor near wall. Date: 1974. INEEL negative no. 74-219 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  16. 24/7 security system: 60-FPS color EMCCD camera with integral human recognition

    NASA Astrophysics Data System (ADS)

    Vogelsong, T. L.; Boult, T. E.; Gardner, D. W.; Woodworth, R.; Johnson, R. C.; Heflin, B.

    2007-04-01

    An advanced surveillance/security system is being developed for unattended 24/7 image acquisition and automated detection, discrimination, and tracking of humans and vehicles. The low-light video camera incorporates an electron multiplying CCD sensor with a programmable on-chip gain of up to 1000:1, providing effective noise levels of less than 1 electron. The EMCCD camera operates in full color mode under sunlit and moonlit conditions, and monochrome under quarter-moonlight to overcast starlight illumination. Sixty frame per second operation and progressive scanning minimizes motion artifacts. The acquired image sequences are processed with FPGA-compatible real-time algorithms, to detect/localize/track targets and reject non-targets due to clutter under a broad range of illumination conditions and viewing angles. The object detectors that are used are trained from actual image data. Detectors have been developed and demonstrated for faces, upright humans, crawling humans, large animals, cars and trucks. Detection and tracking of targets too small for template-based detection is achieved. For face and vehicle targets the results of the detection are passed to secondary processing to extract recognition templates, which are then compared with a database for identification. When combined with pan-tilt-zoom (PTZ) optics, the resulting system provides a reliable wide-area 24/7 surveillance system that avoids the high life-cycle cost of infrared cameras and image intensifiers.

  17. Low-complexity camera digital signal imaging for video document projection system

    NASA Astrophysics Data System (ADS)

    Hsia, Shih-Chang; Tsai, Po-Shien

    2011-04-01

    We present high-performance and low-complexity algorithms for real-time camera imaging applications. The main functions of the proposed camera digital signal processing (DSP) involve color interpolation, white balance, adaptive binary processing, auto gain control, and edge and color enhancement for video projection systems. A series of simulations demonstrate that the proposed method can achieve good image quality while keeping computation cost and memory requirements low. On the basis of the proposed algorithms, the cost-effective hardware core is developed using Verilog HDL. The prototype chip has been verified with one low-cost programmable device. The real-time camera system can achieve 1270 × 792 resolution with the combination of extra components and can demonstrate each DSP function.
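
    A minimal sketch of one of the DSP stages listed above, a simple gray-world white balance. This is a generic textbook version, not the paper's low-complexity design or its Verilog implementation; `rgb` is a placeholder image.

    ```python
    # Minimal sketch: gray-world white balance, one stage of a camera DSP pipeline.
    import numpy as np

    def gray_world_white_balance(rgb):
        """rgb: float array (H, W, 3); scales each channel so the channel means match."""
        means = rgb.reshape(-1, 3).mean(axis=0)
        gains = means.mean() / means          # per-channel gain toward a neutral gray
        return np.clip(rgb * gains, 0.0, 1.0)

    rgb = np.random.rand(480, 640, 3)
    balanced = gray_world_white_balance(rgb)
    print(balanced.reshape(-1, 3).mean(axis=0))  # channel means are now (nearly) equal
    ```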

  18. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    NASA Technical Reports Server (NTRS)

    Javidi, Bahram

    1997-01-01

    Document fraud, including unauthorized duplication of identification cards and credit cards, is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development, and publications in security technology. Some ID cards, credit cards, and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured by a CCD camera) and a new hologram synthesized using commercially available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied to security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern, which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. The proposed optical processing device is designed to identify both the random phase mask and the primary pattern [1-3]. We have demonstrated experimentally an optical processor for security verification of objects, products, and persons. This demonstration is very important to encourage industries to consider the proposed system for research and development.

  19. 13. 22'X34' original vellum, VariableAngle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4'=1'-0' (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  20. 10. 22'X34' original blueprint, VariableAngle Launcher, 'SIDE VIEW CAMERA CARSTEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BOURD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  1. Characterizing Microbial Mat Morphology with Structure from Motion Techniques in Ice-Covered Lake Joyce, McMurdo Dry Valleys, Antarctica

    NASA Astrophysics Data System (ADS)

    Mackey, T. J.; Leidman, S. Z.; Allen, B.; Hawes, I.; Lawrence, J.; Jungblut, A. D.; Krusor, M.; Coleman, L.; Sumner, D. Y.

    2015-12-01

    Structure from Motion (SFM) techniques can provide quantitative morphological documentation of otherwise inaccessible benthic ecosystems such as microbial mats in Lake Joyce, a perennially ice-covered lake of the Antarctic McMurdo Dry Valleys (MDV). Microbial mats are a key ecosystem of MDV lakes, and diverse mat morphologies like pinnacles emerge from interactions among microbial behavior, mineralization, and environmental conditions. Environmental gradients can be isolated to test mat growth models, but assessment of mat morphology along these gradients is complicated by their inaccessibility: the Lake Joyce ice cover is 4-5 m thick, water depths containing diverse pinnacle morphologies are 9-14 m, and relevant mat features are cm-scale. In order to map mat pinnacle morphology in different sedimentary settings, we deployed drop cameras (SeaViewer and GoPro) through 29 GPS referenced drill holes clustered into six stations along a transect spanning 880 m. Once under the ice cover, a boom containing a second GoPro camera was unfurled and rotated to collect oblique images of the benthic mats within dm of the mat-water interface. This setup allowed imaging from all sides over a ~1.5 m diameter area of the lake bottom. Underwater lens parameters were determined for each camera in Agisoft Lens; images were reconstructed and oriented in space with the SFM software Agisoft Photoscan, using the drop camera axis of rotation as up. The reconstructions were compared to downward facing images to assess accuracy, and similar images of an object with known geometry provided a test for expected error in reconstructions. Downward facing images identify decreasing pinnacle abundance in higher sedimentation settings, and quantitative measurements of 3D reconstructions in KeckCAVES LidarViewer supplement these mat morphological facies with measurements of pinnacle height and orientation. Reconstructions also help isolate confounding variables for mat facies trends with measurements of lake bottom slope and underlying relief that could influence pinnacle growth. Comparison of 3D reconstructions to downward-facing drop camera images demonstrate that SFM is a powerful tool for documenting diverse mat morphologies across environmental gradients in ice-covered lakes.

  2. A reduced-dimensionality approach to uncovering dyadic modes of body motion in conversations.

    PubMed

    Gaziv, Guy; Noy, Lior; Liron, Yuvalal; Alon, Uri

    2017-01-01

    Face-to-face conversations are central to human communication and a fascinating example of joint action. Beyond verbal content, one of the primary ways in which information is conveyed in conversations is body language. Body motion in natural conversations has been difficult to study precisely due to the large number of coordinates at play. There is need for fresh approaches to analyze and understand the data, in order to ask whether dyads show basic building blocks of coupled motion. Here we present a method for analyzing body motion during joint action using depth-sensing cameras, and use it to analyze a sample of scientific conversations. Our method consists of three steps: defining modes of body motion of individual participants, defining dyadic modes made of combinations of these individual modes, and lastly defining motion motifs as dyadic modes that occur significantly more often than expected given the single-person motion statistics. As a proof-of-concept, we analyze the motion of 12 dyads of scientists measured using two Microsoft Kinect cameras. In our sample, we find that out of many possible modes, only two were motion motifs: synchronized parallel torso motion in which the participants swayed from side to side in sync, and still segments where neither person moved. We find evidence of dyad individuality in the use of motion modes. For a randomly selected subset of 5 dyads, this individuality was maintained for at least 6 months. The present approach to simplify complex motion data and to define motion motifs may be used to understand other joint tasks and interactions. The analysis tools developed here and the motion dataset are publicly available.
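
    A minimal sketch of the reduced-dimensionality step only: defining individual body-motion modes via PCA over per-frame joint coordinates from a depth camera. The dyadic-mode combination and the motif-significance test described above are not reproduced; the data below are random placeholders.

    ```python
    # Minimal sketch: PCA "motion modes" from per-frame skeleton coordinates.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    frames = rng.normal(size=(1800, 60))   # 60 s at 30 fps, 20 joints x 3 coordinates

    pca = PCA(n_components=3)
    mode_activations = pca.fit_transform(frames)   # per-frame activation of each mode

    print(pca.explained_variance_ratio_)           # how much motion each mode captures
    print(mode_activations.shape)                  # (n_frames, n_modes)
    ```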

  3. A reduced-dimensionality approach to uncovering dyadic modes of body motion in conversations

    PubMed Central

    Noy, Lior; Liron, Yuvalal; Alon, Uri

    2017-01-01

    Face-to-face conversations are central to human communication and a fascinating example of joint action. Beyond verbal content, one of the primary ways in which information is conveyed in conversations is body language. Body motion in natural conversations has been difficult to study precisely due to the large number of coordinates at play. There is need for fresh approaches to analyze and understand the data, in order to ask whether dyads show basic building blocks of coupled motion. Here we present a method for analyzing body motion during joint action using depth-sensing cameras, and use it to analyze a sample of scientific conversations. Our method consists of three steps: defining modes of body motion of individual participants, defining dyadic modes made of combinations of these individual modes, and lastly defining motion motifs as dyadic modes that occur significantly more often than expected given the single-person motion statistics. As a proof-of-concept, we analyze the motion of 12 dyads of scientists measured using two Microsoft Kinect cameras. In our sample, we find that out of many possible modes, only two were motion motifs: synchronized parallel torso motion in which the participants swayed from side to side in sync, and still segments where neither person moved. We find evidence of dyad individuality in the use of motion modes. For a randomly selected subset of 5 dyads, this individuality was maintained for at least 6 months. The present approach to simplify complex motion data and to define motion motifs may be used to understand other joint tasks and interactions. The analysis tools developed here and the motion dataset are publicly available. PMID:28141861

  4. Watching elderly and disabled person's physical condition by remotely controlled monorail robot

    NASA Astrophysics Data System (ADS)

    Nagasaka, Yasunori; Matsumoto, Yoshinori; Fukaya, Yasutoshi; Takahashi, Tomoichi; Takeshita, Toru

    2001-10-01

    We are developing a nursing system using robots and cameras. The cameras are mounted on a remote-controlled monorail robot which moves inside a room and watches the elderly. It is necessary to pay attention to the elderly at home or in nursing homes at all times, which requires staff to watch them constantly. The purpose of our system is to help those staff members and to improve this situation. A host computer controls the monorail robot to move in front of the elderly person using the images taken by cameras on the ceiling. A CCD camera is mounted on the monorail robot to take pictures of their facial expressions or movements. The robot sends the images to the host computer, which checks whether something unusual has happened. We propose a simple calibration method for positioning the monorail robot to track the movements of the elderly and keep their faces at the center of the camera view. We built a small experimental system and evaluated our camera calibration method and image processing algorithm.
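
    A minimal sketch of the face-centering idea: a proportional rule that turns the pixel offset of the detected face from the image centre into a motion command. The gain, the command interface, and the face-detection input are hypothetical placeholders, not the system described above.

    ```python
    # Minimal sketch: proportional command to keep a detected face at the
    # centre of the camera view. All parameters are assumed for illustration.
    def centering_command(face_center, frame_size, gain=0.002):
        """Return a command proportional to the pixel offset from the image centre."""
        cx, cy = face_center
        half_w, half_h = frame_size[0] / 2.0, frame_size[1] / 2.0
        error_x, error_y = cx - half_w, cy - half_h   # pixel offset from image centre
        return gain * error_x, gain * error_y         # e.g. rail velocity, camera tilt

    print(centering_command((400, 260), (640, 480)))  # (0.16, 0.04)
    ```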

  5. Morning view, brick post detail; view also shows dimensional wallconstruction ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, brick post detail; view also shows dimensional wall-construction detail. North wall, with the camera facing northwest. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  6. 2. HEALTH CENTER OFFICE SOUTH BACK AND EAST SIDE, FROM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. HEALTH CENTER OFFICE SOUTH BACK AND EAST SIDE, FROM PASSAGE BEHIND COURTHOUSE, CAMERA FACING NORTHWEST. - Lancaster County Center, Health Center Office, 4845 Cedar Avenue, Lancaster, Los Angeles County, CA

  7. Contextual view showing drainage culvert in foreground boarding east side ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing drainage culvert in foreground bordering east side of knoll with eucalyptus windbreak. Camera facing 278° southwest. - Goerlitz House, 9893 Highland Avenue, Rancho Cucamonga, San Bernardino County, CA

  8. Driver face recognition as a security and safety feature

    NASA Astrophysics Data System (ADS)

    Vetter, Volker; Giefing, Gerd-Juergen; Mai, Rudolf; Weisser, Hubert

    1995-09-01

    We present a driver face recognition system for comfortable access control and individual settings of automobiles. The primary goals are the prevention of car thefts and of heavy accidents caused by unauthorized use (joy-riders), as well as an increase in safety through optimal settings, e.g. of the mirrors and the seat position. The person sitting in the driver's seat is observed automatically by a small video camera in the dashboard. All the driver has to do is behave cooperatively, i.e. look into the camera. A classification system validates the driver's access. Only after a positive identification can the car be used and the driver-specific environment (e.g. seat position, mirrors, etc.) be set up to ensure the driver's comfort and safety. The driver identification system has been integrated in a Volkswagen research car. Recognition results are presented.
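
    As a rough illustration of the verification step, the hedged sketch below compares a face embedding from the dashboard camera against enrolled driver templates by cosine similarity; the embedding function and threshold are assumptions, not the system's actual classifier.

```python
# Sketch of driver verification by template matching (assumed components).
import numpy as np

def verify_driver(frame, enrolled, embed_face, threshold=0.8):
    """enrolled: dict mapping driver name -> stored embedding vector.
    embed_face: assumed feature extractor returning a 1-D vector."""
    probe = embed_face(frame)
    probe = probe / np.linalg.norm(probe)
    best_name, best_score = None, -1.0
    for name, template in enrolled.items():
        t = template / np.linalg.norm(template)
        score = float(np.dot(probe, t))          # cosine similarity
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name                         # unlock and load settings
    return None                                  # access denied
```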

  9. Integrating daylighting into a 3,000 seat church auditorium and network quality television production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holder, L.M. III; Holder, L.M. IV

    1999-07-01

    The project was designed by the Overland Partners Architectural Firm for Riverbend Church of Austin as an auditorium for Sunday services and a venue for special theatrical presentations for the church and the community as well. It is an amphitheater on a hillside overlooking the Colorado River Valley. The amphitheater was selected as the building form to keep the audience closer to the speaker. A 175 ft wide by 60 ft tall arched window was installed on the north face to allow the audience to see the panoramic views of the tree-covered hills on the other side of the valley in the Texas Hill Country. Although the design is quite effective in achieving the program goals, these characteristics make it difficult to achieve effective daylighting without glare for the audience and television cameras, since both face the north glazing. The design team was faced with providing quality daylighting for the audience and television cameras from the wall behind the stage. Most television studios have carefully controlled lighting systems with the major lighting component from behind the cameras. Virtually all television facilities with daylight contributing to the production lighting are in a building with high shading coefficient glass producing illumination on all areas equally, or almost all glass with daylighting from skylights and clerestories above. All television networks have requirements for control of the quality of the video images to parallel those conditions for the program to be aired.

  10. Multi-modal low cost mobile indoor surveillance system on the Robust Artificial Intelligence-based Defense Electro Robot (RAIDER)

    NASA Astrophysics Data System (ADS)

    Nair, Binu M.; Diskin, Yakov; Asari, Vijayan K.

    2012-10-01

    We present an autonomous system capable of performing security check routines. The surveillance machine, the Clearpath Husky robotic platform, is equipped with three IP cameras with different orientations for the surveillance tasks of face recognition, human activity recognition, autonomous navigation and 3D reconstruction of its environment. Combining the computer vision algorithms onto a robotic machine has given birth to the Robust Artificial Intelligence-based Defense Electro-Robot (RAIDER). The end purpose of the RAIDER is to conduct a patrolling routine on a single floor of a building several times a day. As the RAIDER travels down the corridors, off-line algorithms use two of the RAIDER's side-mounted cameras to perform 3D reconstruction from a monocular vision technique that updates a 3D model to the most current state of the indoor environment. Using frames from the front-mounted camera, positioned at human eye level, the system performs face recognition with real-time training of unknown subjects. A human activity recognition algorithm will also be implemented, in which each detected person is assigned to a set of action classes picked to classify ordinary and harmful student activities in a hallway setting. The system is designed to detect changes and irregularities within an environment as well as to familiarize itself with regular faces and actions in order to distinguish potentially dangerous behavior. In this paper, we present the various algorithms and their modifications which, when implemented on the RAIDER, serve the purpose of indoor surveillance.

  11. Lip boundary detection techniques using color and depth information

    NASA Astrophysics Data System (ADS)

    Kim, Gwang-Myung; Yoon, Sung H.; Kim, Jung H.; Hur, Gi Taek

    2002-01-01

    This paper presents our approach to using a stereo camera to obtain 3-D image data to improve existing lip boundary detection techniques. We show that depth information, as provided by our approach, can be used to significantly improve boundary detection systems. Our system detects the face and mouth area in the image by using color, geometric location, and additional depth information for the face. Initially, color and depth information are used to localize the face. Then we determine the lip region from the intensity information and the detected eye locations. The system has successfully been used to extract approximate lip regions using RGB color information of the mouth area. Merely using color information is not robust, because the quality of the results may vary depending on lighting conditions, background, and the subject's race. To overcome this problem, we used a stereo camera to obtain 3-D facial images. 3-D data constructed from the depth information, along with color information, can provide more accurate lip boundary detection results compared to color-only techniques.
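
    A rough sketch of the color-plus-depth gating idea, under the assumption that a face depth estimate is already available from the stereo camera; the chromatic test and tolerances are illustrative, not the authors' exact rules.

```python
# Sketch: gate a crude color-based lip mask with depth so background pixels at
# a different distance than the face cannot leak into the lip region.
import numpy as np

def lip_candidate_mask(rgb, depth, face_depth_m, depth_tol_m=0.10):
    """rgb: HxWx3 float array in [0,1]; depth: HxW distances in meters."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Lips tend to be redder than surrounding skin; a crude chromatic test.
    color_mask = (r > g * 1.15) & (r > b * 1.15)
    # Keep only pixels near the face plane reported by the stereo camera.
    depth_mask = np.abs(depth - face_depth_m) < depth_tol_m
    return color_mask & depth_mask
```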

  12. Applied learning-based color tone mapping for face recognition in video surveillance system

    NASA Astrophysics Data System (ADS)

    Yew, Chuu Tian; Suandi, Shahrel Azmin

    2012-04-01

    In this paper, we present an applied learning-based color tone mapping technique for video surveillance systems. This technique can be applied to both color and grayscale surveillance images. The basic idea is to learn the color or intensity statistics from a training dataset of photorealistic images of the candidates appearing in the surveillance images, and to remap the color or intensity of the input image so that its color or intensity statistics match those in the training dataset. It is well known that differences in commercial surveillance camera models and in the signal processing chipsets used by different manufacturers cause the color and intensity of the images to differ from one another, creating additional challenges for face recognition in video surveillance systems. Using Multi-Class Support Vector Machines as the classifier on a publicly available video surveillance camera database, namely the SCface database, this approach is validated and compared to the results of using a holistic approach on grayscale images. The results show that this technique is suitable for improving the color or intensity quality of video surveillance systems for face recognition.
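
    The remapping step described above is essentially histogram matching. The sketch below is a minimal grayscale version, assuming a single reference image stands in for the learned training-set statistics.

```python
# Sketch: remap a surveillance frame so its cumulative intensity distribution
# matches that of a reference (photorealistic) image.
import numpy as np

def match_intensity(source, reference):
    """source, reference: 2-D uint8 grayscale images."""
    src_vals, src_idx, src_counts = np.unique(source.ravel(),
                                              return_inverse=True,
                                              return_counts=True)
    ref_vals, ref_counts = np.unique(reference.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_counts) / source.size
    ref_cdf = np.cumsum(ref_counts) / reference.size
    # For each source intensity, find the reference intensity at the same CDF.
    mapped = np.interp(src_cdf, ref_cdf, ref_vals)
    return mapped[src_idx].reshape(source.shape).astype(np.uint8)
```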

  13. ETR BUILDING, TRA642, INTERIOR. BASEMENT. CAMERA IS AT MIDPOINT OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. BASEMENT. CAMERA IS AT MIDPOINT OF SOUTH CORRIDOR AND FACES EAST, OPPOSITE DIRECTION FROM VIEWS ID-33-G-98 AND ID-33-G-99. STEEL DOOR AT LEFT OPENS BY ROLLING IT INTO CORRIDOR ON RAILS. TANK AT FAR END OF CORRIDOR IS EMERGENCY CORE COOLING CATCH TANK FOR A TEST LOOP. INL NEGATIVE NO. HD46-30-4. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  14. HOT CELL BUILDING, TRA632. CONTEXTUAL VIEW ALONG WALLEYE AVENUE, CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT CELL BUILDING, TRA-632. CONTEXTUAL VIEW ALONG WALLEYE AVENUE, CAMERA FACING EASTERLY. HOT CELL BUILDING IS AT CENTER LEFT OF VIEW; THE LOW-BAY PROJECTION WITH LADDER IS THE TEST TRAIN ASSEMBLY FACILITY, ADDED IN 1968. MTR BUILDING IS IN LEFT OF VIEW. HIGH-BAY BUILDING AT RIGHT IS THE ENGINEERING TEST REACTOR BUILDING, TRA-642. INL NEGATIVE NO. HD46-32-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  15. CAMERA: An integrated strategy for compound spectra extraction and annotation of LC/MS data sets

    PubMed Central

    Kuhl, Carsten; Tautenhahn, Ralf; Böttcher, Christoph; Larson, Tony R.; Neumann, Steffen

    2013-01-01

    Liquid chromatography coupled to mass spectrometry is routinely used for metabolomics experiments. In contrast to the fairly routine and automated data acquisition steps, subsequent compound annotation and identification require extensive manual analysis and thus form a major bottleneck in data interpretation. Here we present CAMERA, a Bioconductor package integrating algorithms to extract compound spectra, annotate isotope and adduct peaks, and propose the accurate compound mass even in highly complex data. To evaluate the algorithms, we compared the annotation of CAMERA against a manually defined annotation for a mixture of known compounds spiked into a complex matrix at different concentrations. CAMERA successfully extracted accurate masses for 89.7% and 90.3% of the annotatable compounds in positive and negative ion mode, respectively. Furthermore, we present a novel annotation approach that combines spectral information of data acquired in opposite ion modes to further improve the annotation rate. We demonstrate the utility of CAMERA in two different, easily adoptable plant metabolomics experiments, where the application of CAMERA drastically reduced the amount of manual analysis. PMID:22111785

  16. Smartphone based face recognition tool for the blind.

    PubMed

    Kramer, K M; Hedin, D S; Rolkosky, D J

    2010-01-01

    The inability to identify people during group meetings is a disadvantage for blind people in many professional and educational situations. To explore the efficacy of face recognition using smartphones in these settings, we have prototyped and tested a face recognition tool for blind users. The tool utilizes smartphone technology in conjunction with a wireless network to provide audio feedback about the people in front of the blind user. Testing indicated that the face recognition technology can tolerate up to a 40 degree angle between the direction a person is looking and the camera's axis, while achieving a 96% success rate with no false positives. Future work will be done to further develop the technology for local face recognition on the smartphone in addition to remote server-based face recognition.

  17. The Faces in Infant-Perspective Scenes Change over the First Year of Life

    PubMed Central

    Jayaraman, Swapnaa; Fausey, Caitlin M.; Smith, Linda B.

    2015-01-01

    Mature face perception has its origins in the face experiences of infants. However, little is known about the basic statistics of faces in early visual environments. We used head cameras to capture and analyze over 72,000 infant-perspective scenes from 22 infants aged 1-11 months as they engaged in daily activities. The frequency of faces in these scenes declined markedly with age: for the youngest infants, faces were present 15 minutes in every waking hour but only 5 minutes for the oldest infants. In general, the available faces were well characterized by three properties: (1) they belonged to relatively few individuals; (2) they were close and visually large; and (3) they presented views showing both eyes. These three properties most strongly characterized the face corpora of our youngest infants and constitute environmental constraints on the early development of the visual system. PMID:26016988

  18. Late afternoon view of the interior of the westcentral wall ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Late afternoon view of the interior of the west-central wall section to be removed; camera facing north. Gravestones in the foreground. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  19. Faces in water bubbles

    NASA Image and Video Library

    2013-07-12

    NASA astronaut Karen Nyberg, Expedition 36 flight engineer, watches a water bubble float freely between her and the camera, showing her image refracted in the droplet, while in the Node 1 (Unity) module of the International Space Station.

  20. Faces in water bubbles

    NASA Image and Video Library

    2013-07-12

    ISS036-E-018302 (12 July 2013) --- NASA astronaut Chris Cassidy, Expedition 36 flight engineer, watches a water bubble float freely between him and the camera, showing his image refracted, in the Unity node of the International Space Station.

  1. 8. DETAIL OF WINDOW AT EAST END OF NORTH ELEVATION, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. DETAIL OF WINDOW AT EAST END OF NORTH ELEVATION, CAMERA FACING SOUTH. - U.S. Coast Guard Support Center Alameda, Warehouse, Spencer Road & Icarrus Drive, Coast Guard Island, Alameda, Alameda County, CA

  2. 7. DETAIL OF WINDOW AT WEST END OF SOUTH ELEVATION, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. DETAIL OF WINDOW AT WEST END OF SOUTH ELEVATION, CAMERA FACING NORTH. - U.S. Coast Guard Support Center Alameda, Warehouse, Spencer Road & Icarrus Drive, Coast Guard Island, Alameda, Alameda County, CA

  3. Interior view showing split levels with buildings 87 windows in ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view showing split levels with buildings 87 windows in distance; camera facing west. - Mare Island Naval Shipyard, Mechanics Shop, Waterfront Avenue, west side between A Street & Third Street, Vallejo, Solano County, CA

  4. INTERIOR DETAIL OF, CEILINGS OF EAST BEDROOM, NORTH WING, SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR DETAIL OF, CEILINGS OF EAST BEDROOM, NORTH WING, SHOWING PART OF MOUNTAIN LION MURAL; CAMERA FACING NORTHEAST - Harry Carey Ranch, Ranch House, 28515 San Francisquito Canyon Road, Saugus, Los Angeles County, CA

  5. Detail of entry and stairway at center of west elevation; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of entry and stairway at center of west elevation; camera facing east. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  6. Interior detail of stairway at first floor near main entry; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of stairway at first floor near main entry; camera facing south. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  7. Interior view of typical room on second floor, west side; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of typical room on second floor, west side; camera facing north. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  8. Overview in two parts: Right view showing orchard path on ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Overview in two parts: Right view showing orchard path on left, eucalyptus windbreak bordering knoll on right. Camera facing 278° west. - Goerlitz House, 9893 Highland Avenue, Rancho Cucamonga, San Bernardino County, CA

  9. View of Chapel Park, showing bomb shelters at right foreground, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Chapel Park, showing bomb shelters at right foreground, from building 746 parking lot across Walnut Avenue; camera facing north. - Mare Island Naval Shipyard, East of Nave Drive, Vallejo, Solano County, CA

  10. Contextual view of building 505 showing west elevation from marsh; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 505 showing west elevation from marsh; camera facing east. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  11. Interior detail of trusses and high windows in north wing; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of trusses and high windows in north wing; camera facing southwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  12. View of steel warehouses (building 710 second in on right); ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses (building 710 second in on right); camera facing south. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  13. View of steel warehouses (building 710 second in on left); ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses (building 710 second in on left); camera facing west. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  14. Interior view of main entry on south elevation, showing railroad ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of main entry on south elevation, showing railroad tracks; camera facing south. - Mare Island Naval Shipyard, Boiler Shop, Waterfront Avenue, west side between A Street & Third Street, Vallejo, Solano County, CA

  15. Interior view of main entry on south elevation, showing railroad ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of main entry on south elevation, showing railroad tracks; camera facing south. - Mare Island Naval Shipyard, Machine Shop, Waterfront Avenue, west side between A Street & Third Street, Vallejo, Solano County, CA

  16. Detail of second floor window, south wall of north wing; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of second floor window, south wall of north wing; camera facing south. - Mare Island Naval Shipyard, Marine Prison, Suisun Avenue, west side between Mesa Road & San Pablo, Vallejo, Solano County, CA

  17. Human-Robot Interaction

    NASA Technical Reports Server (NTRS)

    Sandor, Aniko; Cross, E. Vincent, II; Chang, Mai Lee

    2015-01-01

    Human-robot interaction (HRI) is a discipline investigating the factors affecting the interactions between humans and robots. It is important to evaluate how the design of interfaces affect the human's ability to perform tasks effectively and efficiently when working with a robot. By understanding the effects of interface design on human performance, workload, and situation awareness, interfaces can be developed to appropriately support the human in performing tasks with minimal errors and with appropriate interaction time and effort. Thus, the results of research on human-robot interfaces have direct implications for the design of robotic systems. For efficient and effective remote navigation of a rover, a human operator needs to be aware of the robot's environment. However, during teleoperation, operators may get information about the environment only through a robot's front-mounted camera causing a keyhole effect. The keyhole effect reduces situation awareness which may manifest in navigation issues such as higher number of collisions, missing critical aspects of the environment, or reduced speed. One way to compensate for the keyhole effect and the ambiguities operators experience when they teleoperate a robot is adding multiple cameras and including the robot chassis in the camera view. Augmented reality, such as overlays, can also enhance the way a person sees objects in the environment or in camera views by making them more visible. Scenes can be augmented with integrated telemetry, procedures, or map information. Furthermore, the addition of an exocentric (i.e., third-person) field of view from a camera placed in the robot's environment may provide operators with the additional information needed to gain spatial awareness of the robot. Two research studies investigated possible mitigation approaches to address the keyhole effect: 1) combining the inclusion of the robot chassis in the camera view with augmented reality overlays, and 2) modifying the camera frame of reference. The first study investigated the effects of inclusion and exclusion of the robot chassis along with superimposing a simple arrow overlay onto the video feed of operator task performance during teleoperation of a mobile robot in a driving task. In this study, the front half of the robot chassis was made visible through the use of three cameras, two side-facing and one forward-facing. The purpose of the second study was to compare operator performance when teleoperating a robot from an egocentric-only and combined (egocentric plus exocentric camera) view. Camera view parameters that are found to be beneficial in these laboratory experiments can be implemented on NASA rovers and tested in a real-world driving and navigation scenario on-site at the Johnson Space Center.

  18. Combining Deep and Handcrafted Image Features for Presentation Attack Detection in Face Recognition Systems Using Visible-Light Camera Sensors

    PubMed Central

    Nguyen, Dat Tien; Pham, Tuyen Danh; Baek, Na Rae; Park, Kang Ryoung

    2018-01-01

    Although face recognition systems have wide application, they are vulnerable to presentation attack samples (fake samples). Therefore, a presentation attack detection (PAD) method is required to enhance the security level of face recognition systems. Most of the previously proposed PAD methods for face recognition systems have focused on using handcrafted image features, which are designed by expert knowledge of designers, such as Gabor filter, local binary pattern (LBP), local ternary pattern (LTP), and histogram of oriented gradients (HOG). As a result, the extracted features reflect limited aspects of the problem, yielding a detection accuracy that is low and varies with the characteristics of presentation attack face images. The deep learning method has been developed in the computer vision research community, which is proven to be suitable for automatically training a feature extractor that can be used to enhance the ability of handcrafted features. To overcome the limitations of previously proposed PAD methods, we propose a new PAD method that uses a combination of deep and handcrafted features extracted from the images by visible-light camera sensor. Our proposed method uses the convolutional neural network (CNN) method to extract deep image features and the multi-level local binary pattern (MLBP) method to extract skin detail features from face images to discriminate the real and presentation attack face images. By combining the two types of image features, we form a new type of image features, called hybrid features, which has stronger discrimination ability than single image features. Finally, we use the support vector machine (SVM) method to classify the image features into real or presentation attack class. Our experimental results indicate that our proposed method outperforms previous PAD methods by yielding the smallest error rates on the same image databases. PMID:29495417
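
    A hedged sketch of the fusion idea (deep features concatenated with a handcrafted LBP histogram, then classified by an SVM). The CNN extractor is a placeholder assumption, and the parameters do not reproduce the authors' trained models.

```python
# Sketch: hybrid presentation-attack-detection features = deep + handcrafted.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(gray, P=8, R=1.0):
    """Uniform LBP histogram as the handcrafted skin-detail descriptor."""
    codes = local_binary_pattern(gray, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def hybrid_features(gray, cnn_extractor):
    """Concatenate deep and handcrafted features into one vector.
    cnn_extractor: assumed callable returning a 1-D feature vector."""
    return np.concatenate([cnn_extractor(gray), lbp_histogram(gray)])

def train_pad_classifier(images, labels, cnn_extractor):
    """labels: 1 for real faces, 0 for presentation attacks."""
    X = np.stack([hybrid_features(img, cnn_extractor) for img in images])
    return SVC(kernel="rbf").fit(X, labels)
```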

  19. Combining Deep and Handcrafted Image Features for Presentation Attack Detection in Face Recognition Systems Using Visible-Light Camera Sensors.

    PubMed

    Nguyen, Dat Tien; Pham, Tuyen Danh; Baek, Na Rae; Park, Kang Ryoung

    2018-02-26

    Although face recognition systems have wide application, they are vulnerable to presentation attack samples (fake samples). Therefore, a presentation attack detection (PAD) method is required to enhance the security level of face recognition systems. Most of the previously proposed PAD methods for face recognition systems have focused on using handcrafted image features, which are designed by expert knowledge of designers, such as Gabor filter, local binary pattern (LBP), local ternary pattern (LTP), and histogram of oriented gradients (HOG). As a result, the extracted features reflect limited aspects of the problem, yielding a detection accuracy that is low and varies with the characteristics of presentation attack face images. The deep learning method has been developed in the computer vision research community, which is proven to be suitable for automatically training a feature extractor that can be used to enhance the ability of handcrafted features. To overcome the limitations of previously proposed PAD methods, we propose a new PAD method that uses a combination of deep and handcrafted features extracted from the images by visible-light camera sensor. Our proposed method uses the convolutional neural network (CNN) method to extract deep image features and the multi-level local binary pattern (MLBP) method to extract skin detail features from face images to discriminate the real and presentation attack face images. By combining the two types of image features, we form a new type of image features, called hybrid features, which has stronger discrimination ability than single image features. Finally, we use the support vector machine (SVM) method to classify the image features into real or presentation attack class. Our experimental results indicate that our proposed method outperforms previous PAD methods by yielding the smallest error rates on the same image databases.

  20. Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System

    NASA Astrophysics Data System (ADS)

    Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki

    In this paper, we present an automatic vision-based traffic sign recognition system that can detect and classify traffic signs at long distance under different lighting conditions. To realize this purpose, the traffic sign recognition is developed in an originally proposed dual-focal active camera system. In this system, a telephoto camera is equipped as an assistant to a wide angle camera. The telephoto camera can capture a high accuracy image of an object of interest in the field of view of the wide angle camera, and this image provides enough information for recognition when the traffic sign appears at too low a resolution in the wide angle camera. In the proposed system, traffic sign detection and classification are processed separately on the images from the wide angle camera and the telephoto camera. In addition, in order to detect traffic signs against complex backgrounds in different lighting conditions, we propose a type of color transformation which is invariant to lighting changes. This color transformation highlights the pattern of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the image from the wide angle camera. After detection, the system actively captures a high accuracy image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on the information from the wide angle camera. In classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high accuracy image from the telephoto camera. Finally, a set of experiments in the domain of traffic sign recognition is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs at low resolution in different lighting conditions.
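
    The paper's specific color transformation is not reproduced here; as an illustrative stand-in, the sketch below uses normalized chromaticity, which discards overall brightness so that sign colors can be thresholded more consistently across lighting conditions.

```python
# Sketch of a lighting-tolerant color cue for sign detection (stand-in only).
import numpy as np

def normalized_chromaticity(bgr):
    """bgr: HxWx3 uint8 image. Returns per-pixel r and g channel proportions."""
    img = bgr.astype(np.float64) + 1e-6        # avoid division by zero
    chroma = img / img.sum(axis=2, keepdims=True)
    return chroma[..., 2], chroma[..., 1]      # (r, g) for an OpenCV BGR frame

def red_sign_mask(bgr, r_thresh=0.45, g_thresh=0.30):
    """Crude detector for predominantly red regions (e.g., prohibition signs)."""
    r, g = normalized_chromaticity(bgr)
    return (r > r_thresh) & (g < g_thresh)
```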

  1. New camera-based microswitch technology to monitor small head and mouth responses of children with multiple disabilities.

    PubMed

    Lancioni, Giulio E; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N; O'Reilly, Mark F; Green, Vanessa A; Furniss, Fred

    2014-06-01

    This study assessed a new camera-based microswitch technology that did not require the use of color marks on the participants' faces. Two children with extensive multiple disabilities participated. The responses selected for them consisted of small, lateral head movements and mouth closing or opening. The intervention was carried out according to a multiple probe design across responses. The technology involved a computer with a CPU using a 2-GHz clock, a USB video camera with a 16-mm lens, a USB cable connecting the camera and the computer, and a special software program written in ISO C++ language. The new technology was satisfactorily used with both children. Large increases in their responding were observed during the intervention periods (i.e. when the responses were followed by preferred stimulation). The new technology may be an important resource for persons with multiple disabilities and minimal motor behavior.
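
    A minimal sketch of how a camera can act as a "microswitch" for small movements, using frame differencing over a monitored region of interest; this is an illustration of the idea, not the study's ISO C++ software.

```python
# Sketch: fire a camera-based microswitch when enough pixels change in a
# region of interest between consecutive frames.
import numpy as np

def microswitch_triggered(prev_gray, curr_gray, roi, diff_thresh=18,
                          min_changed_fraction=0.02):
    """roi: (top, bottom, left, right) pixel bounds of the monitored region."""
    t, b, l, r = roi
    prev_patch = prev_gray[t:b, l:r].astype(np.int16)
    curr_patch = curr_gray[t:b, l:r].astype(np.int16)
    changed = np.abs(curr_patch - prev_patch) > diff_thresh
    return changed.mean() > min_changed_fraction   # True -> deliver stimulation
```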

  2. A Prompt Methodology to Georeference Complex Hypogea Environments

    NASA Astrophysics Data System (ADS)

    Troisi, S.; Baiocchi, V.; Del Pizzo, S.; Giannone, F.

    2017-02-01

    Complex underground structures and facilities occupy a large part of our cities, and most of them are unsurveyed; cable ducts and drainage systems are no exception. Furthermore, several inspection operations are performed in critical air conditions that do not allow, or make more difficult, a conventional survey. In this scenario, a prompt methodology to survey and georeference such facilities is often indispensable. A visual-based approach is proposed in this paper; the methodology provides a 3D model of the environment and the path followed by the camera using conventional photogrammetric/Structure-from-Motion software tools. The key role is played by the camera lens: a fisheye system was employed to obtain a very wide field of view (FOV) and therefore high overlap among the frames. The camera geometry corresponds to forward motion along the camera axis; consequently, to avoid instability of the bundle adjustment algorithm, a preliminary calibration of the camera was carried out. A specific case study is reported together with the accuracy achieved.

  3. Improving the color fidelity of cameras for advanced television systems

    NASA Astrophysics Data System (ADS)

    Kollarits, Richard V.; Gibbon, David C.

    1992-08-01

    In this paper we compare the accuracy of the color information obtained from television cameras using three and five wavelength bands. This comparison is based on real digital camera data. The cameras are treated as colorimeters whose characteristics are not linked to those of the display. The color matrices for both cameras were obtained by identical optimization procedures that minimized the color error. The color error for the five band camera is 2.5 times smaller than that obtained from the three band camera. Visual comparison of color matches on a characterized color monitor indicates that the five band camera is capable of color measurements that produce no significant visual error on the display. Because the outputs from the five band camera are reduced to the normal three channels conventionally used for display, there need be no increase in signal handling complexity outside the camera. Likewise, it is possible to construct a five band camera using only three sensors, as in conventional cameras. The principal drawback of the five band camera is a reduction in effective camera sensitivity of about 3/4 of an f-stop.
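
    The matrix optimization mentioned above can be illustrated with an ordinary least-squares fit from camera band responses to target colorimetric values; the perceptual error weighting used in the paper is omitted here.

```python
# Sketch: fit a color matrix mapping B camera bands to 3 colorimetric channels.
import numpy as np

def fit_color_matrix(camera_responses, target_xyz):
    """camera_responses: N x B (B = 3 or 5 bands); target_xyz: N x 3.
    Returns the B x 3 matrix M minimizing ||camera_responses @ M - target_xyz||."""
    M, *_ = np.linalg.lstsq(camera_responses, target_xyz, rcond=None)
    return M

def color_error(camera_responses, target_xyz, M):
    """RMS error of the matrixed camera values against the targets."""
    pred = camera_responses @ M
    return float(np.sqrt(np.mean((pred - target_xyz) ** 2)))
```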

  4. Morning view, contextual view showing unpaved corridor down the westernmost ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view showing unpaved corridor down the westernmost lane where the wall section (E) will be removed; camera facing north-northwest. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  5. Morning view, contextual view showing the road and gate to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view showing the road and gate to be widened; view taken from the statue area with the camera facing north. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  6. Morning view of the exterior of the westernmost wall section ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view of the exterior of the westernmost wall section to be removed; camera facing south. Unpaved road in foreground; tree canopy in background. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  7. Detail of window and lamp at entrance on north side ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of window and lamp at entrance on north side of north wing; camera facing south. - Mare Island Naval Shipyard, Administrative Offices, Walnut Avenue, east side between Seventh & Eighth Streets, Vallejo, Solano County, CA

  8. INTERIOR DETAIL OF SECOND FLOOR DOORS AT NORTH END OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR DETAIL OF SECOND FLOOR DOORS AT NORTH END OF BUILDING; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  9. INTERIOR VIEW OF SECOND STORY SPACE LOOKING TOWARD SECOND FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF SECOND STORY SPACE LOOKING TOWARD SECOND FLOOR DOORS; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  10. INTERIOR VIEW OF SECOND STORY SPACE, NORTH END OF BUILDING; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF SECOND STORY SPACE, NORTH END OF BUILDING; CAMERA FACING SOUTHEAST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  11. CONTEXTUAL VIEW OF BUILDING 231 SHOWING WEST AND SOUTH ELEVATIONS; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    CONTEXTUAL VIEW OF BUILDING 231 SHOWING WEST AND SOUTH ELEVATIONS; CAMERA FACING NORTHEAST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  12. DETAIL OF FIRST STORY WINDOWS ON NORTH END OF EAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF FIRST STORY WINDOWS ON NORTH END OF EAST ELEVATION; CAMERA FACING WEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  13. DETAIL OF WINDOW AND ROOF VENT AT EAST ELEVATION GABLE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF WINDOW AND ROOF VENT AT EAST ELEVATION GABLE END; CAMERA FACING WEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  14. CONTEXTUAL VIEW OF BUILDING 231 SHOWING EAST AND NORTH ELEVATIONS; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    CONTEXTUAL VIEW OF BUILDING 231 SHOWING EAST AND NORTH ELEVATIONS; CAMERA FACING SOUTHWEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  15. DETAIL OF GABLE END WITH ARCHED WINDOW, SHOWING SOFFIT OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF GABLE END WITH ARCHED WINDOW, SHOWING SOFFIT OF OVERHANG; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  16. Contextual view of summer kitchen, showing blacksmith shop downhill at ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of summer kitchen, showing blacksmith shop downhill at right and cottage at center (between the trees); camera facing northeast - Lemmon-Anderson-Hixson Ranch, Summer Kitchen, 11220 North Virginia Street, Reno, Washoe County, NV

  17. Detail of second story windows at northwest corner of southeast ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of second story windows at northwest corner of southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  18. Detail of parapet, frieze, and castings at windows. South elevation; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of parapet, frieze, and castings at windows. South elevation; camera facing north. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  19. Detail of second floor windows and chimney on southeast elevation; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of second floor windows and chimney on southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  20. Interior view of typical ward on second floor, south wing; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of typical ward on second floor, south wing; camera facing northwest. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  1. Interior view of first floor lobby with detail of columns; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of first floor lobby with detail of columns; camera facing north. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  2. Interior detail of exit door on second floor at north ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of exit door on second floor at north end; camera facing north. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  3. Interior detail of bathroom on first floor north end, west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of bathroom on first floor north end, west side, camera facing west. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  4. Contextual view of Goerlitz Property, showing eucalyptus trees along west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Goerlitz Property, showing eucalyptus trees along west side of driveway; parking lot and utility pole in foreground. Camera facing 38° northeast - Goerlitz House, 9893 Highland Avenue, Rancho Cucamonga, San Bernardino County, CA

  5. View of main terrace retaining wall with mature tree on ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of main terrace retaining wall with mature tree on left center, camera facing southeast - Naval Training Station, Senior Officers' Quarters District, Naval Station Treasure Island, Yerba Buena Island, San Francisco, San Francisco County, CA

  6. Contextual view of building 505 Cedar avenue, showing south and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 505 Cedar avenue, showing south and east elevations; camera facing northwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  7. Detail of north wing with rollup door on north elevation; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of north wing with roll-up door on north elevation; camera facing south. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  8. View of steel warehouses at Gilmore Avenue (building 710 second ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses at Gilmore Avenue (building 710 second in on left); camera facing east. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  9. View of steel warehouses on Ellsberg Drive, building 710 full ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses on Ellsberg Drive, building 710 full building at center; camera facing southeast. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  10. View of steel warehouses (from left: building 807, 808, 809, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses (from left: building 807, 808, 809, 810, 811); camera facing east. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  11. Intermediate view synthesis for eye-gazing

    NASA Astrophysics Data System (ADS)

    Baek, Eu-Ttuem; Ho, Yo-Sung

    2015-01-01

    Nonverbal communication, also known as body language, is an important form of communication. Nonverbal behaviors such as posture, eye contact, and gestures send strong messages. With regard to nonverbal communication, eye contact is one of the most important forms that an individual can use. However, lack of eye contact occurs when we use a video conferencing system: the disparity between the locations of the eyes and the camera gets in the way of eye contact, and the lack of eye gaze can give an unapproachable and unpleasant feeling. In this paper, we propose an eye-gaze correction method for video conferencing. We use two cameras installed at the top and the bottom of the television. The two captured images are rendered with 2D warping at a virtual position. We apply view morphing to the detected face, and synthesize the face and the warped image. Experimental results verify that the proposed system is effective in generating natural gaze-corrected images.
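
    True view morphing requires correspondence-based warping of the detected face; as a much-simplified stand-in, the sketch below merely blends the top- and bottom-camera frames to approximate a virtual viewpoint between the two cameras.

```python
# Crude stand-in for an intermediate view between two vertically offset cameras.
import numpy as np

def intermediate_view(top_frame, bottom_frame, alpha=0.5):
    """alpha=0 returns the top-camera frame, alpha=1 the bottom-camera frame."""
    top = top_frame.astype(np.float32)
    bottom = bottom_frame.astype(np.float32)
    blended = (1.0 - alpha) * top + alpha * bottom
    return blended.astype(top_frame.dtype)
```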

  12. PBF Cooling Tower and its Auxiliary Building (PER624) to left ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower and its Auxiliary Building (PER-624) to left of tower. Camera facing west toward the east louvered face of the tower. Details include secondary coolant water riser piping and flow control valves (butterfly valves) to distribute water evenly to all sections of tower. Photographer: Holmes. Date: May 20, 1970. INEEL negative no. 70-2322 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  13. Examining wildlife responses to phenology and wildfire using a landscape-scale camera trap network

    USGS Publications Warehouse

    Villarreal, Miguel L.; Gass, Leila; Norman, Laura; Sankey, Joel B.; Wallace, Cynthia S.A.; McMacken, Dennis; Childs, Jack L.; Petrakis, Roy E.

    2012-01-01

    Between 2001 and 2009, the Borderlands Jaguar Detection Project deployed 174 camera traps in the mountains of southern Arizona to record jaguar activity. In addition to jaguars, the motion-activated cameras, placed along known wildlife travel routes, recorded occurrences of ~ 20 other animal species. We examined temporal relationships of white-tailed deer (Odocoileus virginianus) and javelina (Pecari tajacu) to landscape phenology (as measured by monthly Normalized Difference Vegetation Index data) and the timing of wildfire (Alambre Fire of 2007). Mixed model analyses suggest that temporal dynamics of these two species were related to vegetation phenology and natural disturbance in the Sky Island region, information important for wildlife managers faced with uncertainty regarding changing climate and disturbance regimes.
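
    As a loose illustration of the analysis type named above (a mixed model relating detections to NDVI and fire timing), the sketch below uses statsmodels with made-up column names; it is not the authors' model specification.

```python
# Sketch: mixed-effects model of monthly detections vs. NDVI with a random
# intercept per camera trap (column names are illustrative assumptions).
import pandas as pd
import statsmodels.formula.api as smf

def fit_detection_model(df: pd.DataFrame):
    """df columns assumed: detections, ndvi, post_fire (0/1), camera_id."""
    model = smf.mixedlm("detections ~ ndvi + post_fire", df,
                        groups=df["camera_id"])
    return model.fit()
```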

  14. The Dartmouth Database of Children’s Faces: Acquisition and Validation of a New Face Stimulus Set

    PubMed Central

    Dalrymple, Kirsten A.; Gomez, Jesse; Duchaine, Brad

    2013-01-01

    Facial identity and expression play critical roles in our social lives. Faces are therefore frequently used as stimuli in a variety of areas of scientific research. Although several extensive and well-controlled databases of adult faces exist, few databases include children’s faces. Here we present the Dartmouth Database of Children’s Faces, a set of photographs of 40 male and 40 female Caucasian children between 6 and 16 years-of-age. Models posed eight facial expressions and were photographed from five camera angles under two lighting conditions. Models wore black hats and black gowns to minimize extra-facial variables. To validate the images, independent raters identified facial expressions, rated their intensity, and provided an age estimate for each model. The Dartmouth Database of Children’s Faces is freely available for research purposes and can be downloaded by contacting the corresponding author by email. PMID:24244434

  15. Mars Global Surveyor observations of Martian fretted terrain

    USGS Publications Warehouse

    Carr, M.H.

    2001-01-01

    The Martian fretted terrain between latitudes 30° and 50° N and between 315° and 360° W has been reexamined in light of new Mars Orbiter Camera (MOC) and Mars Orbiter Laser Altimeter (MOLA) data from Mars Global Surveyor. Much of the terrain in the 30°-50° latitude belt in both hemispheres has a characteristic stippled or pitted texture at MOC (1.5 m) scale. The texture appears to result from partial removal of a formerly smooth, thin deposit as a result of sublimation and deflation. A complex history of deposition and exhumation is indicated by remnants of a former, thicker cover of layered deposits. In some hollows and on some slopes, particularly those facing the pole, are smooth textured deposits outlined by an outward facing escarpment. Throughout the study area are numerous escarpments with debris flows at their base. The escarpments typically have slopes in the 20°-30° range. At the base of the escarpment is commonly a deposit with striae oriented at right angles to the escarpment. Outside this deposit is the main debris apron with a surface that typically slopes 2°-3° and complex surface textures suggestive of compression, sublimation, and deflation. The presence of undeformed impact craters indicates that the debris flows are no longer forming. Fretted valleys contain lineated fill and are poorly graded. They likely form from fluvial valleys that were initially like those elsewhere on the planet but were subsequently widened and filled by the same mass-wasting processes that formed the debris aprons. Slope reversals indicate that downvalley flow of the lineated fill is minor. The ubiquitous presence of breaks in slope formed by mass wasting and the complex surface textures that result from mass wasting, deflation, and sublimation decreases the recognizability of the shorelines formerly proposed for this area.

  16. Imaging Dot Patterns for Measuring Gossamer Space Structures

    NASA Technical Reports Server (NTRS)

    Dorrington, A. A.; Danehy, P. M.; Jones, T. W.; Pappa, R. S.; Connell, J. W.

    2005-01-01

    A paper describes a photogrammetric method for measuring the changing shape of a gossamer (membrane) structure deployed in outer space. Such a structure is typified by a solar sail comprising a transparent polymeric membrane aluminized on its Sun-facing side and coated black on the opposite side. Unlike some prior photogrammetric methods, this method does not require an artificial light source or the attachment of retroreflectors to the gossamer structure. In a basic version of the method, the membrane contains a fluorescent dye, and the front and back coats are removed in matching patterns of dots. The dye in the dots absorbs some sunlight and fluoresces at a longer wavelength in all directions, thereby enabling acquisition of high-contrast images from almost any viewing angle. The fluorescent dots are observed by one or more electronic camera(s) on the Sun side, the shade side, or both sides. Filters that pass the fluorescent light and suppress most of the solar spectrum are placed in front of the camera(s) to increase the contrast of the dots against the background. The dot image(s) in the camera(s) are digitized, then processed by use of commercially available photogrammetric software.
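
    A small sketch of the dot-measurement front end: threshold the fluorescence-filtered image and take the centroid of each bright blob. The threshold rule and minimum blob size are assumptions for illustration, not NASA's photogrammetric software.

```python
# Sketch: locate fluorescent dot targets as centroids of bright blobs.
import numpy as np
from scipy import ndimage

def find_dot_centroids(gray, threshold=None, min_pixels=4):
    """gray: 2-D float/uint8 image taken through the fluorescence filter."""
    if threshold is None:
        threshold = gray.mean() + 3 * gray.std()   # keep only bright outliers
    mask = gray > threshold
    labels, n = ndimage.label(mask)                # connected bright blobs
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return [c for c, s in zip(centroids, sizes) if s >= min_pixels]
```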

  17. Center for Automatic Target Recognition Research. Delivery Order 0005: Image Georegistration, Camera Calibration, and Dismount Categorization in Support of DEBU from Layered Sensing

    DTIC Science & Technology

    2011-07-01

    Figure 17: CAESAR Data. The leftmost image is a color polygon rendering of a subject using 316,691 polygon faces and 161,951 points. The small white dots on the surface of the subject are landmark points.

  18. A Portable Shoulder-Mounted Camera System for Surgical Education in Spine Surgery.

    PubMed

    Pham, Martin H; Ohiorhenuan, Ifije E; Patel, Neil N; Jakoi, Andre M; Hsieh, Patrick C; Acosta, Frank L; Wang, Jeffrey C; Liu, John C

    2017-02-07

    The past several years have demonstrated an increased recognition of operative videos as an important adjunct for resident education. Currently lacking, however, are effective methods to record video for the purposes of illustrating the techniques of minimally invasive (MIS) and complex spine surgery. We describe here our experiences developing and using a shoulder-mounted camera system for recording surgical video. Our requirements for an effective camera system included wireless portability to allow for movement around the operating room, camera mount location for comfort and loupes/headlight usage, battery life for long operative days, and sterile control of on/off recording. With this in mind, we created a shoulder-mounted camera system utilizing a GoPro™ HERO3+, its Smart Remote (GoPro, Inc., San Mateo, California), a high-capacity external battery pack, and a commercially available shoulder-mount harness. This shoulder-mounted system was more comfortable to wear for long periods of time in comparison to existing head-mounted and loupe-mounted systems. Without requiring any wired connections, the surgeon was free to move around the room as needed. Over the past several years, we have recorded numerous MIS and complex spine surgeries for the purposes of surgical video creation for resident education. Surgical videos serve as a platform to distribute important operative nuances in rich multimedia. Effective and practical camera system setups are needed to encourage the continued creation of videos to illustrate the surgical maneuvers in minimally invasive and complex spinal surgery. We describe here a novel portable shoulder-mounted camera system setup specifically designed to be worn and used for long periods of time in the operating room.

  19. Morning view of the exterior of the gate and white ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view of the exterior of the gate and white posts to be reworked/widened; camera facing south looking into the cemetery toward the statue. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  20. INTERIOR VIEW OF FIRST FLOOR SPACE AT NORTH END, LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF FIRST FLOOR SPACE AT NORTH END, LOOKING AT WEST WALL; CAMERA FACING NORTHWEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  1. Contextual view of building H70 showing southeast and northeast elevations; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building H70 showing southeast and northeast elevations; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  2. Detail of concrete pillars and steps leading to main entry ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of concrete pillars and steps leading to main entry at southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  3. Detail of first and second floor windows at east corner ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of first and second floor windows at east corner of south elevation; camera facing northwest. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  4. Interior view of office with fireplace on second floor off ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of office with fireplace on second floor off south lobby; camera facing southeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  5. 1. BUILDING L (LEFT OF CENTER) EAST END AND SOUTH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. BUILDING L (LEFT OF CENTER) EAST END AND SOUTH SIDE (BUILDING K IS ON RIGHT, BUILDING M IS ON LEFT), CAMERA FACING NORTHWEST - Buffalo Ranch, Office Building, 2418 MacArthur Boulevard, Irvine, Orange County, CA

  6. Interior detail view showing worn threshold in doorway between kitchen ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail view showing worn threshold in doorway between kitchen and west room in north addition. Camera facing west. - Warner Ranch, Ranch House, San Felipe Road (State Highway S2), Warner Springs, San Diego County, CA

  7. Printed products for digital cameras and mobile devices

    NASA Astrophysics Data System (ADS)

    Fageth, Reiner; Schmidt-Sacht, Wulf

    2005-01-01

    Digital photography is no longer simply a successor to film. The digital market is now driven by additional devices such as mobile phones with camera and video functions (camphones) as well as innovative products derived from digital files. A large number of consumers do not print their images and non-printing has become the major enemy of wholesale printers, home printing suppliers and retailers. This paper addresses the challenge facing our industry, namely how to encourage the consumer to print images easily and conveniently from all types of digital media.

  8. PROCESS WATER BUILDING, TRA605, INTERIOR. FIRST FLOOR. CAMERA IS IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605, INTERIOR. FIRST FLOOR. CAMERA IS IN SOUTHEAST CORNER AND FACES NORTHWEST. CONTROL ROOM AT RIGHT. CRANE MONORAIL IS OVER FLOOR HATCHES AND FLOOR OPENINGS. SIX VALVE HANDWHEELS ALONG FAR WALL IN LEFT CENTER VIEW. SEAL TANK IS ON OTHER SIDE OF WALL; PROCESS WATER PIPES ARE BELOW VALVE WHEELS. NOTE CURBS AROUND FLOOR OPENINGS. INL NEGATIVE NO. HD46-26-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. PBF Reactor Building (PER620) under construction. Aerial view with camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620) under construction. Aerial view with camera facing northeast. Steel framework is exposed for west wing and high bay. Concrete block siding on east wing. Railroad crane set up on west side. Note trenches proceeding from front of building. Left trench is for secondary coolant and will lead to Cooling Tower. Shorter trench will contain cables leading to control area. Photographer: Larry Page. Date: March 22, 1967. INEEL negative no. 67-5025 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  10. ETR CRITICAL FACILITY, TRA654. CONTEXTUAL VIEW. CAMERA ON ROOF OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR CRITICAL FACILITY, TRA-654. CONTEXTUAL VIEW. CAMERA ON ROOF OF MTR BUILDING AND FACING SOUTH. ETR AND ITS COOLANT BUILDING AT UPPER PART OF VIEW. ETR COOLING TOWER NEAR TOP EDGE OF VIEW. EXCAVATION AT CENTER IS FOR ETR CF. CENTER OF WHICH WILL CONTAIN POOL FOR REACTOR. NOTE CHOPPER TUBE PROCEEDING FROM MTR IN LOWER LEFT OF VIEW, DIAGONAL TOWARD LEFT. INL NEGATIVE NO. 56-4227. Jack L. Anderson, Photographer, 12/18/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. Comparison of parameters of modern cooled and uncooled thermal cameras

    NASA Astrophysics Data System (ADS)

    Bareła, Jarosław; Kastek, Mariusz; Firmanty, Krzysztof; Krupiński, Michał

    2017-10-01

    During the design of a system employing thermal cameras, one always faces the problem of choosing the camera types best suited for the task. In many cases such a choice is far from optimal, and there are several reasons for that. System designers often favor tried-and-tested solutions they are used to. They do not follow the latest developments in the field of infrared technology, and sometimes their choices are based on prejudice rather than facts. The paper presents the results of measurements of basic parameters of MWIR and LWIR thermal cameras, carried out in a specialized testing laboratory. The measured parameters are decisive in terms of the image quality generated by thermal cameras. All measurements were conducted according to current procedures and standards. However, the camera settings were not optimized for specific test conditions or parameter measurements. Instead, the real settings used in normal camera operation were applied to obtain realistic camera performance figures. For example, there were significant differences between measured values of noise parameters and catalogue data provided by manufacturers, due to the application of edge detection filters to increase detection and recognition ranges. The purpose of this paper is to provide help in choosing the optimal thermal camera for a particular application, answering the question whether to opt for a cheaper microbolometer device or apply a slightly better (in terms of specifications) yet more expensive cooled unit. Measurements and analysis were performed by qualified personnel with several dozen years of experience in both designing and testing of thermal camera systems with both cooled and uncooled focal plane arrays. Cameras of similar array sizes and optics were compared, and for each tested group the best performing devices were selected.

  12. PBF Cooling Tower. View from highbay roof of Reactor Building ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower. View from high-bay roof of Reactor Building (PER-620). Camera faces northwest. East louvered face has been installed. Inlet pipes protrude from fan deck. Two redwood vents under construction at top. Note piping, control, and power lines at sub-grade level in trench leading to Reactor Building. Photographer: Kirsh. Date: June 6, 1969. INEEL negative no. 69-3466 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  13. Speckle-learning-based object recognition through scattering media.

    PubMed

    Ando, Takamasa; Horisaki, Ryoichi; Tanida, Jun

    2015-12-28

    We experimentally demonstrated object recognition through scattering media based on direct machine learning of a number of speckle intensity images. In the experiments, speckle intensity images of amplitude or phase objects on a spatial light modulator between scattering plates were captured by a camera. We used the support vector machine for binary classification of the captured speckle intensity images of face and non-face data. The experimental results showed that speckles are sufficient for machine learning.
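
    A minimal sketch of the classification step just described, assuming the speckle intensity frames have already been captured and labeled as face or non-face; the array shapes, random placeholder data, and linear-kernel choice are illustrative assumptions, not details from the paper.

    ```python
    # Sketch: binary classification of speckle intensity images with an SVM.
    # `speckle_images` and `labels` are placeholder stand-ins for captured data.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    speckle_images = rng.random((200, 64, 64))   # placeholder camera frames
    labels = rng.integers(0, 2, size=200)        # placeholder face / non-face labels

    # Flatten each speckle frame into a feature vector; the classifier learns
    # directly on intensity patterns without descrambling the scattering media.
    X = speckle_images.reshape(len(speckle_images), -1)

    X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.25, random_state=0)
    clf = SVC(kernel="linear").fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
    ```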

  14. Validation of the facial assessment by computer evaluation (FACE) program for software-aided eyelid measurements.

    PubMed

    Choi, Catherine J; Lefebvre, Daniel R; Yoon, Michael K

    2016-06-01

    The aim of this article is to validate the accuracy of the Facial Assessment by Computer Evaluation (FACE) program in eyelid measurements. Sixteen subjects between the ages of 27 and 65 were included with IRB approval. Clinical measurements of upper eyelid margin reflex distance (MRD1) and inter-palpebral fissure (IPF) were obtained. Photographs were then taken with a digital single lens reflex camera with built-in pop-up flash (dSLR-pop) and a dSLR with lens-mounted ring flash (dSLR-ring), with the cameras upright, rotated 90, 180, and 270 degrees. The images were analyzed using both the FACE and ImageJ software to measure MRD1 and IPF. Thirty-two eyes of sixteen subjects were included. Comparison of clinical measurements of MRD1 and IPF with FACE measurements of photos in the upright position showed no statistically significant differences for dSLR-pop (MRD1: p = 0.0912, IPF: p = 0.334) or for dSLR-ring (MRD1: p = 0.105, IPF: p = 0.538). One-to-one comparison of MRD1 and IPF measurements in four positions obtained with FACE versus ImageJ for dSLR-pop showed moderate to substantial agreement for MRD1 (intraclass correlation coefficient = 0.534 upright, 0.731 in 90 degree rotation, 0.627 in 180 degree rotation, 0.477 in 270 degree rotation) and substantial to excellent agreement in IPF (ICC = 0.740, 0.859, 0.849, 0.805). In photos taken with dSLR-ring, there was excellent agreement of all MRD1 (ICC = 0.916, 0.932, 0.845, 0.812) and IPF (ICC = 0.937, 0.938, 0.917, 0.888) values. The FACE program is a valid method for measuring margin reflex distance and inter-palpebral fissure.

  15. 41. ARAIII Prototype assembly and evaluation building ARA630. West end ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    41. ARA-III Prototype assembly and evaluation building ARA-630. West end and south side of building. Camera facing northeast. Ineel photo no. 3-22. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  16. 4. INTERIOR VIEW OF CLUB HOUSE REFRIGERATION UNIT, SHOWING COOLING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. INTERIOR VIEW OF CLUB HOUSE REFRIGERATION UNIT, SHOWING COOLING COILS AND CORK-LINED ROOM. CAMERA IS BETWEEN SEVEN AND EIGHT FEET ABOVE FLOOR LEVEL, FACING SOUTHEAST. - Swan Falls Village, Clubhouse 011, Snake River, Kuna, Ada County, ID

  17. Contextual view showing H1 on left and H270 in background; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing H1 on left and H270 in background; camera facing north. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  18. View looking across to building H1 from third floor porch ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View looking across to building H1 from third floor porch over entrance; camera facing south. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  19. Interior detail of scrolled brackets on post, west side of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of scrolled brackets on post, west side of first floor by rear entrance; camera facing north. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  20. Contextual view looking down clubhouse drive. Showing west elevation of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view looking down clubhouse drive. Showing west elevation of H1 on right; camera facing east. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  1. INTERIOR OF BOILER BUILDING, FIRST LEVEL, EAST SIDE, SHOWING STEAMDRIVEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR OF BOILER BUILDING, FIRST LEVEL, EAST SIDE, SHOWING STEAM-DRIVEN PISTON PUMPS FOR FUEL OIL, CAMERA FACING EAST. - New Haven Rail Yard, Central Steam Plant and Oil Storage, Vicinity of Union Avenue, New Haven, New Haven County, CT

  2. Impact Flash Monitoring Facility on the Deep Space Gateway

    NASA Astrophysics Data System (ADS)

    Needham, D. H.; Moser, D. E.; Suggs, R. M.; Cooke, W. J.; Kring, D. A.; Neal, C. R.; Fassett, C. I.

    2018-02-01

    Cameras mounted to the Deep Space Gateway exterior will detect flashes caused by impacts on the lunar surface. Observed flashes will help constrain the current lunar impact flux and assess hazards faced by crews living and working in cislunar space.

  3. 104. ARAIII. Interior view of room 110 in ARA607 used ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    104. ARA-III. Interior view of room 110 in ARA-607 used as data acquisition control room. Camera facing northeast. Ineel photo no. 81-103. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  4. Use of 3-dimensional surface acquisition to study facial morphology in 5 populations.

    PubMed

    Kau, Chung How; Richmond, Stephen; Zhurov, Alexei; Ovsenik, Maja; Tawfik, Wael; Borbely, Peter; English, Jeryl D

    2010-04-01

    The aim of this study was to assess the use of 3-dimensional facial averages for determining morphologic differences among various population groups. We recruited 473 subjects from 5 populations. Three-dimensional images of the subjects were obtained in a reproducible and controlled environment with a commercially available stereo-photogrammetric camera capture system. Minolta VI-900 (Konica Minolta, Tokyo, Japan) and 3dMDface (3dMD LLC, Atlanta, Ga) systems were used. Each image was obtained as a facial mesh and orientated along a triangulated axis. All faces were overlaid, one on top of the other, and a complex mathematical algorithm was performed until average composite faces of 1 man and 1 woman were achieved for each subgroup. These average facial composites were superimposed based on a previously validated superimposition method, and the facial differences were quantified. Distinct facial differences were observed among the groups. The linear differences between surface shells ranged from 0.37 to 1.00 mm for the male groups. The linear differences ranged from 0.28 to 0.87 mm for the women. The color histograms showed that the similarities in facial shells between the subgroups by sex ranged from 26.70% to 70.39% for men and 36.09% to 79.83% for women. The average linear distance from the signed color histograms for the male subgroups ranged from -6.30 to 4.44 mm. The female subgroups ranged from -6.32 to 4.25 mm. Average faces can be efficiently and effectively created from a sample of 3-dimensional faces. Average faces can be used to compare differences in facial morphologies for various populations and sexes. Facial morphologic differences were greatest when totally different ethnic variations were compared. Facial morphologic similarities were present in comparable groups, but there were large variations in concentrated areas of the face. Copyright 2010 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  5. Practical image registration concerns overcome by the weighted and filtered mutual information metric

    NASA Astrophysics Data System (ADS)

    Keane, Tommy P.; Saber, Eli; Rhody, Harvey; Savakis, Andreas; Raj, Jeffrey

    2012-04-01

    Contemporary research in automated panorama creation utilizes camera calibration or extensive knowledge of camera locations and relations to each other to achieve successful results. Research in image registration attempts to restrict these same camera parameters or apply complex point-matching schemes to overcome the complications found in real-world scenarios. This paper presents a novel automated panorama creation algorithm by developing an affine transformation search based on maximized mutual information (MMI) for region-based registration. Standard MMI techniques have been limited to applications with airborne/satellite imagery or medical images. We show that a novel MMI algorithm can approximate an accurate registration between views of realistic scenes of varying depth distortion. The proposed algorithm has been developed using stationary, color, surveillance video data for a scenario with no a priori camera-to-camera parameters. This algorithm is robust for strict- and nearly-affine-related scenes, while providing a useful approximation for the overlap regions in scenes related by a projective homography or a more complex transformation, allowing for a set of efficient and accurate initial conditions for pixel-based registration.
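
    As a concrete illustration of the maximized-mutual-information idea, the sketch below computes the mutual information of two grayscale views from their joint histogram and runs a toy shift search; the bin count, the synthetic views, and the shift-only search are assumptions for illustration, not the paper's actual affine search strategy.

    ```python
    # Sketch: mutual information between two grayscale images via a joint
    # histogram, the quantity an MMI registration search would maximize.
    import numpy as np

    def mutual_information(img_a, img_b, bins=64):
        joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0                                   # avoid log(0)
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    rng = np.random.default_rng(1)
    view_a = rng.random((240, 320))                    # placeholder overlapping views
    view_b = np.roll(view_a, 5, axis=1) + 0.05 * rng.random((240, 320))

    # Toy "search": try a few horizontal shifts and keep the one with highest MI.
    best = max(range(-10, 11), key=lambda s: mutual_information(view_a, np.roll(view_b, s, axis=1)))
    print("best shift:", best)
    ```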

  6. On the performances of computer vision algorithms on mobile platforms

    NASA Astrophysics Data System (ADS)

    Battiato, S.; Farinella, G. M.; Messina, E.; Puglisi, G.; Ravì, D.; Capra, A.; Tomaselli, V.

    2012-01-01

    Computer Vision enables mobile devices to extract the meaning of the observed scene from the information acquired with the onboard sensor cameras. Nowadays, there is a growing interest in Computer Vision algorithms able to work on mobile platforms (e.g., phone cameras, point-and-shoot cameras, etc.). Indeed, bringing Computer Vision capabilities to mobile devices opens new opportunities in different application contexts. The implementation of vision algorithms on mobile devices is still a challenging task, since these devices have poor image sensors and optics as well as limited processing power. In this paper we have considered different algorithms covering classic Computer Vision tasks: keypoint extraction, face detection, image segmentation. Several tests have been done to compare the performances of the involved mobile platforms: Nokia N900, LG Optimus One, Samsung Galaxy SII.

  7. DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER OF THE MLP - Cape Canaveral Air Force Station, Launch Complex 39, Mobile Launcher Platforms, Launcher Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  8. The potential of low-cost RPAS for multi-view reconstruction of rock cliffs

    NASA Astrophysics Data System (ADS)

    Ettore Guccione, Davide; Thoeni, Klaus; Santise, Marina; Giacomini, Anna; Roncella, Riccardo; Forlani, Gianfranco

    2016-04-01

    RPAS, also known as drones or UAVs, have been used in military applications for many years. Nevertheless, the technology has become accessible to everyone only in recent years (Westoby et al., 2012; Nex and Remondino, 2014). Electric multirotor helicopters or multicopters have become one of the most exciting developments, and several off-the-shelf platforms (including camera) are now available. In particular, RPAS can provide 3D models of sub-vertical rock faces, which for instance are needed for rockfall hazard assessments along road cuts and very steep mountains. The current work investigates the potential of two low-cost off-the-shelf quadcopters equipped with digital cameras for multi-view reconstruction of sub-vertical rock cliffs. The two platforms used are a DJI Phantom 1 (P1) equipped with a GoPro Hero 3+ (12MP) and a DJI Phantom 3 Professional (P3). The latter comes with an integrated 12MP camera mounted on a 3-axis gimbal. Both platforms cost less than €1,500 including the camera. The study area is a small rock cliff near the Callaghan Campus of the University of Newcastle (Thoeni et al., 2014). The wall is partly smooth with some evident geological features such as non-persistent joints and sharp edges. Several flights were performed with both cameras set in time-lapse mode. Hence, images were taken automatically, but the flights were performed manually: the investigated rock face is very irregular, which required adjusting the yaw and roll for optimal coverage since the flights were performed very close to the cliff face. The digital images were processed with a commercial SfM software package. Thereby, several processing options and camera networks were investigated in order to define the most accurate configuration. Firstly, the difference between the use of coded ground control targets versus natural features was studied. Coded targets generally provide the best accuracy, but they need to be placed on the surface, which is not always possible as rock cliffs are not easily accessible. Nevertheless, natural features can provide a good alternative if chosen wisely. Secondly, the influence of using fixed interior orientation parameters versus self-calibration was investigated. The results show that, in the case of the used sensors and camera networks, self-calibration provides better results. This can mainly be attributed to the fact that the object distance is not constant and rather small (less than 10 m) and that both cameras do not provide an option for fixing the interior orientation parameters. Finally, the results of both platforms are also compared to a point cloud obtained with a terrestrial laser scanner, where generally a very good agreement is observed. References: Nex, F., Remondino, F. (2014) UAV for 3D mapping applications: a review. Applied Geomatics 6(1), 1-15. Thoeni, K., Giacomini, A., Murtagh, R., Kniest, E. (2014) A comparison of multi-view 3D reconstruction of a rock wall using several cameras and a laser scanner. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XL-5, 573-580. Westoby, M.J., Brasington, J., Glasser, N.F., Hambrey, M.J., Reynolds, J.M. (2012) 'Structure-from-Motion' photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology 179, 300-314.

  9. Development of a safe ultraviolet camera system to enhance awareness by showing effects of UV radiation and UV protection of the skin (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Verdaasdonk, Rudolf M.; Wedzinga, Rosaline; van Montfrans, Bibi; Stok, Mirte; Klaessens, John; van der Veen, Albert

    2016-03-01

    The significant increase of skin cancer occurring in the western world is attributed to longer sun exposure during leisure time. For prevention, people should become aware of the risks of UV light exposure by being shown skin damage and the protective effect of sunscreen with a UV camera. A UV awareness imaging system optimized for 365 nm (UV-A) was developed using consumer components, designed to be interactive, safe and mobile. A Sony NEX5t camera was adapted to the full spectral range. In addition, UV-transparent lenses and filters were selected based on measured spectral characteristics (Schott S8612 and Hoya U-340 filters) to obtain the highest contrast for, e.g., melanin spots and wrinkles on the skin. For uniform UV illumination, 2 facial tanner units were adapted with UV 365 nm black light fluorescent tubes. Safety of the UV illumination was determined relative to the sun and with absolute irradiance measurements at the working distance. A maximum exposure time of over 15 minutes was calculated according to the international safety standards. The UV camera was successfully demonstrated during the Dutch National Skin Cancer day and was well received by dermatologists and the participating public. In particular, the 'black paint' effect of putting sunscreen on the face was dramatic and contributed to awareness of regions on the face that are likely to be missed when applying sunscreen. The UV imaging system shows promise for diagnostics and clinical studies in dermatology and potentially in other areas (dentistry and ophthalmology).

  10. Face verification system for Android mobile devices using histogram based features

    NASA Astrophysics Data System (ADS)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

    This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and then face detection is implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated by a binary Vector Quantization (VQ) histogram using DCT coefficients in low frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.
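
    The pipeline described above can be approximated on a desktop with OpenCV and scikit-image; the sketch below is a hedged stand-in for the Android implementation, using the stock Haar cascade for detection and a uniform LBP histogram as the spatial-domain feature. The input file name is hypothetical, and the DCT-domain VQ histogram and the weighted fusion step are omitted.

    ```python
    # Sketch: Haar-cascade face detection followed by an LBP histogram feature,
    # approximating the spatial-domain half of the verification pipeline.
    import cv2
    import numpy as np
    from skimage.feature import local_binary_pattern

    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    image = cv2.imread("captured_frame.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input frame
    assert image is not None, "provide a real image path"

    faces = cascade.detectMultiScale(image, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        face = cv2.resize(image[y:y + h, x:x + w], (128, 128))
        # Uniform LBP with 8 neighbours at radius 1; the histogram is the descriptor.
        lbp = local_binary_pattern(face, P=8, R=1, method="uniform")
        hist, _ = np.histogram(lbp, bins=np.arange(0, 11), density=True)
        # Verification would compare `hist` against an enrolled template
        # (e.g. chi-square distance) and fuse that score with the DCT/VQ score.
        print("LBP histogram:", np.round(hist, 3))
    ```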

  11. Hyperspectral image analysis using artificial color

    NASA Astrophysics Data System (ADS)

    Fu, Jian; Caulfield, H. John; Wu, Dongsheng; Tadesse, Wubishet

    2010-03-01

    By definition, HSC (HyperSpectral Camera) images are much richer in spectral data than, say, a COTS (Commercial-Off-The-Shelf) color camera. But data are not information. If we do the task right, useful information can be derived from the data in HSC images. Nature faced essentially the identical problem. The incident light is so complex spectrally that measuring it with high resolution would provide far more data than animals can handle in real time. Nature's solution was to do irreversible POCS (Projections Onto Convex Sets) to achieve huge reductions in data with minimal reduction in information. Thus we can arrange for our manmade systems to do what nature did - project the HSC image onto two or more broad, overlapping curves. The task we have undertaken in the last few years is to develop this idea that we call Artificial Color. What we report here is the use of the measured HSC image data projected onto two or three convex, overlapping, broad curves in analogy with the sensitivity curves of human cone cells. Testing two quite different HSC images in that manner produced the desired result: good discrimination or segmentation that can be done very simply and hence is likely to be doable in real time with specialized computers. Using POCS on the HSC data to reduce the processing complexity produced excellent discrimination in those two cases. For technical reasons discussed here, the figures of merit for the kind of pattern recognition we use are incommensurate with the figures of merit of conventional pattern recognition. We used some force fitting to make a comparison nevertheless, because it shows what is also obvious qualitatively. In our tasks, our method works better.
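
    A minimal sketch of the projection step described above, assuming a hyperspectral cube laid out as (rows, cols, bands) and three broad, overlapping response curves modeled as Gaussians; the band count and curve shapes are illustrative assumptions, not the curves actually used in the paper.

    ```python
    # Sketch: project a hyperspectral cube onto a few broad, overlapping response
    # curves (in analogy with cone sensitivities), reducing data while retaining
    # what is needed for simple discrimination.
    import numpy as np

    bands = 100
    wavelengths = np.linspace(400, 1000, bands)            # nm, illustrative band centers

    def gaussian_curve(center, width):
        return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

    # Three broad, overlapping "artificial color" channels (assumed shapes).
    curves = np.stack([gaussian_curve(c, 120.0) for c in (500, 650, 800)], axis=1)  # (bands, 3)

    rng = np.random.default_rng(2)
    cube = rng.random((64, 64, bands))                      # placeholder HSC image cube

    # Irreversible projection: each pixel's spectrum collapses to 3 channel values.
    artificial_color = (cube.reshape(-1, bands) @ curves).reshape(64, 64, 3)
    print(artificial_color.shape)                           # (64, 64, 3)
    ```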

  12. User-assisted visual search and tracking across distributed multi-camera networks

    NASA Astrophysics Data System (ADS)

    Raja, Yogesh; Gong, Shaogang; Xiang, Tao

    2011-11-01

    Human CCTV operators face several challenges in their task which can lead to missed events, people or associations, including: (a) data overload in large distributed multi-camera environments; (b) short attention span; (c) limited knowledge of what to look for; and (d) lack of access to non-visual contextual intelligence to aid search. Developing a system to aid human operators and alleviate such burdens requires addressing the problem of automatic re-identification of people across disjoint camera views, a matching task made difficult by factors such as lighting, viewpoint and pose changes and for which absolute scoring approaches are not best suited. Accordingly, we describe a distributed multi-camera tracking (MCT) system to visually aid human operators in associating people and objects effectively over multiple disjoint camera views in a large public space. The system comprises three key novel components: (1) relative measures of ranking rather than absolute scoring to learn the best features for matching; (2) multi-camera behaviour profiling as higher-level knowledge to reduce the search space and increase the chance of finding correct matches; and (3) human-assisted data mining to interactively guide search and in the process recover missing detections and discover previously unknown associations. We provide an extensive evaluation of the greater effectiveness of the system as compared to existing approaches on industry-standard i-LIDS multi-camera data.

  13. Late afternoon view of the interior of the eastcentral wall ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Late afternoon view of the interior of the east-central wall section to be removed; camera facing north. Stubby crape myrtle in front of wall. Metal Quonset hut in background. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  14. Morning view, contextual view of the exterior west side of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view of the exterior west side of the north wall along the unpaved road; camera facing west, positioned in road approximately 8 posts west of the gate. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  15. Contextual view of Treasure Island showing Palace of Fine and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Treasure Island showing Palace of Fine and Decorative Arts (building 3) at right, and Port of the Trade Winds in the foreground; camera facing north. - Golden Gate International Exposition, Treasure Island, San Francisco, San Francisco County, CA

  16. 80. ARAIII. Forming of the mechanical equipment pit in reactor ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    80. ARA-III. Forming of the mechanical equipment pit in reactor building (ARA-608). Camera facing northwest. September 22, 1958. Ineel photo no. 58-4675. Photographer: Jack L. Anderson. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  17. Contextual view showing west elevations of building H81 on right ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing west elevations of building H81 on right and H1 in middle; camera facing northeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  18. OVERVIEW OF CENTRAL HEATING PLANT, WITH OIL STORAGE ON LEFT, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    OVERVIEW OF CENTRAL HEATING PLANT, WITH OIL STORAGE ON LEFT, BOILER BUILDING ON RIGHT, SOUTH AND EAST ELEVATIONS, CAMERA FACING NORTH. - New Haven Rail Yard, Central Steam Plant and Oil Storage, Vicinity of Union Avenue, New Haven, New Haven County, CT

  19. Simultaneous Calibration: A Joint Optimization Approach for Multiple Kinect and External Cameras.

    PubMed

    Liao, Yajie; Sun, Ying; Li, Gongfa; Kong, Jianyi; Jiang, Guozhang; Jiang, Du; Cai, Haibin; Ju, Zhaojie; Yu, Hui; Liu, Honghai

    2017-06-24

    Camera calibration is a crucial problem in many applications, such as 3D reconstruction, structure from motion, object tracking and face alignment. Numerous methods have been proposed to solve the above problem with good performance in the last few decades. However, few methods are targeted at joint calibration of multi-sensors (more than four devices), which normally is a practical issue in real-time systems. In this paper, we propose a novel method and a corresponding workflow framework to simultaneously calibrate relative poses of a Kinect and three external cameras. By optimizing the final cost function and adding corresponding weights to the external cameras in different locations, an effective joint calibration of multiple devices is constructed. Furthermore, the method is tested on a practical platform, and experimental results show that the proposed joint calibration method can achieve satisfactory performance in a practical real-time system and that its accuracy is higher than that of the manufacturer's calibration.

  20. Simultaneous Calibration: A Joint Optimization Approach for Multiple Kinect and External Cameras

    PubMed Central

    Liao, Yajie; Sun, Ying; Li, Gongfa; Kong, Jianyi; Jiang, Guozhang; Jiang, Du; Cai, Haibin; Ju, Zhaojie; Yu, Hui; Liu, Honghai

    2017-01-01

    Camera calibration is a crucial problem in many applications, such as 3D reconstruction, structure from motion, object tracking and face alignment. Numerous methods have been proposed to solve the above problem with good performance in the last few decades. However, few methods are targeted at joint calibration of multi-sensors (more than four devices), which normally is a practical issue in real-time systems. In this paper, we propose a novel method and a corresponding workflow framework to simultaneously calibrate relative poses of a Kinect and three external cameras. By optimizing the final cost function and adding corresponding weights to the external cameras in different locations, an effective joint calibration of multiple devices is constructed. Furthermore, the method is tested on a practical platform, and experimental results show that the proposed joint calibration method can achieve satisfactory performance in a practical real-time system and that its accuracy is higher than that of the manufacturer’s calibration. PMID:28672823
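
    To make the weighted-cost idea concrete, the sketch below jointly refines the poses of two hypothetical external cameras against a shared set of 3D calibration points by minimizing a weighted sum of reprojection errors; the intrinsics, per-camera weights, and synthetic point data are illustrative assumptions, not the paper's actual setup or cost function.

    ```python
    # Sketch: joint refinement of two external-camera poses by minimizing a
    # weighted sum of reprojection errors against shared 3D calibration points.
    import numpy as np
    import cv2
    from scipy.optimize import least_squares

    rng = np.random.default_rng(3)
    points_3d = rng.uniform(-0.5, 0.5, (30, 3)).astype(np.float64)   # hypothetical board points (m)
    K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])      # assumed shared intrinsics
    dist = np.zeros(5)

    def project(points, rvec, tvec):
        proj, _ = cv2.projectPoints(points, rvec, tvec, K, dist)
        return proj.reshape(-1, 2)

    # Ground-truth poses used only to synthesize "observed" image points.
    true_poses = [(np.array([0.1, 0.0, 0.0]), np.array([0.0, 0.0, 2.0])),
                  (np.array([0.0, -0.2, 0.0]), np.array([0.3, 0.0, 2.2]))]
    observations = [project(points_3d, r, t) + rng.normal(0, 0.5, (30, 2)) for r, t in true_poses]
    weights = [1.0, 0.5]                                             # assumed per-camera weights

    def residuals(x):
        res = []
        for i, (obs, w) in enumerate(zip(observations, weights)):
            rvec, tvec = x[6 * i:6 * i + 3], x[6 * i + 3:6 * i + 6]
            res.append(np.sqrt(w) * (project(points_3d, rvec, tvec) - obs).ravel())
        return np.concatenate(res)

    x0 = np.zeros(12)
    x0[5], x0[11] = 2.0, 2.0                                         # rough initial depths
    result = least_squares(residuals, x0)
    print("final cost:", result.cost)
    ```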

  1. Use of a microscope-mounted wide-angle point of view camera to record optimal hand position in ocular surgery.

    PubMed

    Gooi, Patrick; Ahmed, Yusuf; Ahmed, Iqbal Ike K

    2014-07-01

    We describe the use of a microscope-mounted wide-angle point-of-view camera to record optimal hand positions in ocular surgery. The camera is mounted close to the objective lens beneath the surgeon's oculars and faces the same direction as the surgeon, providing a surgeon's view. A wide-angle lens enables viewing of both hands simultaneously and does not require repositioning the camera during the case. Proper hand positioning and instrument placement through microincisions are critical for effective and atraumatic handling of tissue within the eye. Our technique has potential in the assessment and training of optimal hand position for surgeons performing intraocular surgery. It is an innovative way to routinely record instrument and operating hand positions in ophthalmic surgery and has minimal requirements in terms of cost, personnel, and operating-room space. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2014 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  2. From a Million Miles Away, NASA Camera Shows Moon Crossing Face of Earth

    NASA Image and Video Library

    2015-08-05

    This animation shows images of the far side of the moon, illuminated by the sun, as it crosses between the DSCOVR spacecraft's Earth Polychromatic Imaging Camera (EPIC) and telescope, and the Earth - one million miles away. Credits: NASA/NOAA. A NASA camera aboard the Deep Space Climate Observatory (DSCOVR) satellite captured a unique view of the moon as it moved in front of the sunlit side of Earth last month. The series of test images shows the fully illuminated “dark side” of the moon that is never visible from Earth. The images were captured by NASA’s Earth Polychromatic Imaging Camera (EPIC), a four megapixel CCD camera and telescope on the DSCOVR satellite orbiting 1 million miles from Earth. From its position between the sun and Earth, DSCOVR conducts its primary mission of real-time solar wind monitoring for the National Oceanic and Atmospheric Administration (NOAA). Read more: www.nasa.gov/feature/goddard/from-a-million-miles-away-na...

  3. From a Million Miles Away, NASA Camera Shows Moon Crossing Face of Earth

    NASA Image and Video Library

    2017-12-08

    This animation still image shows the far side of the moon, illuminated by the sun, as it crosses between the DSCOVR spacecraft's Earth Polychromatic Imaging Camera (EPIC) and telescope, and the Earth - one million miles away. Credits: NASA/NOAA. A NASA camera aboard the Deep Space Climate Observatory (DSCOVR) satellite captured a unique view of the moon as it moved in front of the sunlit side of Earth last month. The series of test images shows the fully illuminated “dark side” of the moon that is never visible from Earth. The images were captured by NASA’s Earth Polychromatic Imaging Camera (EPIC), a four megapixel CCD camera and telescope on the DSCOVR satellite orbiting 1 million miles from Earth. From its position between the sun and Earth, DSCOVR conducts its primary mission of real-time solar wind monitoring for the National Oceanic and Atmospheric Administration (NOAA). Read more: www.nasa.gov/feature/goddard/from-a-million-miles-away-na...

  4. A TV Camera System Which Extracts Feature Points For Non-Contact Eye Movement Detection

    NASA Astrophysics Data System (ADS)

    Tomono, Akira; Iida, Muneo; Kobayashi, Yukio

    1990-04-01

    This paper proposes a highly efficient camera system which extracts, irrespective of background, feature points such as the pupil, corneal reflection image and dot-marks pasted on a human face in order to detect human eye movement by image processing. Two eye movement detection methods are suggested: one utilizing face orientation as well as pupil position, the other utilizing pupil and corneal reflection images. A method of extracting these feature points using LEDs as illumination devices and a new TV camera system designed to record eye movement are proposed. Two kinds of infra-red LEDs are used. These LEDs are set up a short distance apart and emit polarized light of different wavelengths. One light source beams from near the optical axis of the lens and the other is some distance from the optical axis. The LEDs are operated in synchronization with the camera. The camera includes 3 CCD image pick-up sensors and a prism system with 2 boundary layers. Incident rays are separated into 2 wavelengths by the first boundary layer of the prism. One set of rays forms an image on CCD-3. The other set is split by the half-mirror layer of the prism and forms one image that includes the regularly reflected component (a polarizing filter is placed in front of CCD-1) and another image that excludes it (no polarizing filter in front of CCD-2). Thus, three images with different reflection characteristics are obtained by the three CCDs. Through the experiment, it is shown that two kinds of subtraction operations between the three images output from the CCDs accentuate three kinds of feature points: the pupil and corneal reflection images and the dot-marks. Since the S/N ratio of the subtracted image is extremely high, the thresholding process is simple and allows reducing the intensity of the infra-red illumination. A high-speed image processing apparatus using this camera system is described. Real-time processing of the subtraction, thresholding and gravity position calculation of the feature points is possible.
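
    The subtraction, thresholding, and gravity-position (centroid) steps enabled by the camera can be illustrated in a few lines; the arrays below stand in for two of the three CCD outputs, and the threshold value is an assumption.

    ```python
    # Sketch: accentuate feature points by subtracting two differently-filtered
    # frames, threshold the result, and take the centroid of each blob.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(4)
    frame_with_reflection = rng.random((240, 320)) * 0.2   # placeholder for one CCD output
    frame_without = frame_with_reflection.copy()           # placeholder for another CCD output
    frame_with_reflection[118:124, 158:164] += 0.8         # synthetic pupil/reflection highlight

    difference = frame_with_reflection - frame_without     # background largely cancels
    mask = difference > 0.5                                # assumed threshold; high S/N keeps this simple

    labels, count = ndimage.label(mask)
    centroids = ndimage.center_of_mass(mask, labels, range(1, count + 1))
    print("feature-point centroids (row, col):", centroids)
    ```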

  5. 3D FaceCam: a fast and accurate 3D facial imaging device for biometrics applications

    NASA Astrophysics Data System (ADS)

    Geng, Jason; Zhuang, Ping; May, Patrick; Yi, Steven; Tunnell, David

    2004-08-01

    Human faces are fundamentally three-dimensional (3D) objects, and each face has its unique 3D geometric profile. The 3D geometric features of a human face can be used, together with its 2D texture, for rapid and accurate face recognition purposes. Due to the lack of low-cost and robust 3D sensors and effective 3D facial recognition (FR) algorithms, almost all existing FR systems use 2D face images. Genex has developed 3D solutions that overcome the inherent problems in 2D while also addressing limitations in other 3D alternatives. One important aspect of our solution is a unique 3D camera (the 3D FaceCam) that combines multiple imaging sensors within a single compact device to provide instantaneous, ear-to-ear coverage of a human face. This 3D camera uses three high-resolution CCD sensors and a color encoded pattern projection system. The RGB color information from each pixel is used to compute the range data and generate an accurate 3D surface map. The imaging system uses no moving parts and combines multiple 3D views to provide detailed and complete 3D coverage of the entire face. Images are captured within a fraction of a second and full-frame 3D data is produced within a few seconds. This described method provides much better data coverage and accuracy in feature areas with sharp features or details (such as the nose and eyes). Using this 3D data, we have been able to demonstrate that a 3D approach can significantly improve the performance of facial recognition. We have conducted tests in which we have varied the lighting conditions and angle of image acquisition in the "field." These tests have shown that the matching results are significantly improved when enrolling a 3D image rather than a single 2D image. With its 3D solutions, Genex is working toward unlocking the promise of powerful 3D FR and transferring FR from a lab technology into a real-world biometric solution.

  6. Two-step superresolution approach for surveillance face image through radial basis function-partial least squares regression and locality-induced sparse representation

    NASA Astrophysics Data System (ADS)

    Jiang, Junjun; Hu, Ruimin; Han, Zhen; Wang, Zhongyuan; Chen, Jun

    2013-10-01

    Face superresolution (SR), or face hallucination, refers to the technique of generating a high-resolution (HR) face image from a low-resolution (LR) one with the help of a set of training examples. It aims at transcending the limitations of electronic imaging systems. Applications of face SR include video surveillance, in which the individual of interest is often far from cameras. A two-step method is proposed to infer a high-quality and HR face image from a low-quality and LR observation. First, we establish the nonlinear relationship between LR face images and HR ones, according to radial basis function and partial least squares (RBF-PLS) regression, to transform the LR face into the global face space. Then, a locality-induced sparse representation (LiSR) approach is presented to enhance the local facial details once all the global faces for each LR training face are constructed. A comparison of some state-of-the-art SR methods shows the superiority of the proposed two-step approach, RBF-PLS global face regression followed by LiSR-based local patch reconstruction. Experiments also demonstrate the effectiveness under both simulation conditions and some real conditions.

  7. Energy conservation using face detection

    NASA Astrophysics Data System (ADS)

    Deotale, Nilesh T.; Kalbande, Dhananjay R.; Mishra, Akassh A.

    2011-10-01

    Computerized face detection is concerned with the difficult task of locating human faces within a video signal. It has several applications, such as face recognition, simultaneous multiple face processing, biometrics, security, video surveillance, human-computer interfaces, image database management, autofocus in digital cameras, and selecting regions of interest in photo slideshows that use pan-and-scale effects. The present paper deals with energy conservation using face detection. Automating the process on a computer requires the use of various image processing techniques. There are various methods that can be used for face detection, such as contour tracking, template matching, controlled background, model-based, motion-based, and color-based methods. Basically, the video of the subject is converted into images, which are further selected manually for processing. However, several factors such as poor illumination, movement of the face, viewpoint-dependent physical appearance, acquisition geometry, imaging conditions, and compression artifacts make face detection difficult. This paper reports an algorithm for conservation of energy using face detection for various devices. The present paper suggests that energy conservation can be achieved by detecting the face, reducing the brightness of the complete image, and then adjusting the brightness of the particular area of the image where the face is located using histogram equalization.
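
    A hedged sketch of the scheme just described: detect the face, dim the rest of the frame, and equalize the face region. The dimming factor, cascade choice, and file names are assumptions for illustration, not the paper's implementation.

    ```python
    # Sketch: reduce brightness outside the detected face and apply histogram
    # equalization to the face region itself.
    import cv2
    import numpy as np

    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    frame = cv2.imread("webcam_frame.jpg")                      # hypothetical input frame
    assert frame is not None, "provide a real image path"
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    output = (frame * 0.4).astype(np.uint8)                     # dim whole frame (assumed factor)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        face_gray = cv2.equalizeHist(gray[y:y + h, x:x + w])    # enhance the face region
        output[y:y + h, x:x + w] = cv2.cvtColor(face_gray, cv2.COLOR_GRAY2BGR)

    cv2.imwrite("energy_saving_frame.jpg", output)
    ```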

  8. Structure Formation in Complex Plasma

    DTIC Science & Technology

    2011-08-24

    Figure content (apparatus schematics): particles are suspended in a glass Dewar bottle (upper figures) or in the vapor of liquid helium (lower figures). Labeled components include a ring electrode, acrylic particles, a green laser, the RF plasma, a CCD camera, a prism mirror, a glass tube, liquid N2, the glass Dewar, helium gas, and pressure.

  9. 67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST OF ASSISTANT LAUNCH CONDUCTOR PANEL SHOWN IN CA-133-1-A-66 - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  10. Trench Reveals Two Faces of Soils

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This approximate true-color image mosaic from the panoramic camera on the Mars Exploration Rover Opportunity shows a trench dug by the rover in the vicinity of the 'Anatolia' region. Two imprints from the rover's Mossbauer spectrometer instrument were left in the exposed soils. Detailed comparisons between soils exposed at the surface and those found at depth reveal that surface soils have higher levels of hematite while subsurface soils show fine particles derived from basalt. The trench is approximately 11 centimeters deep. This image was taken on sol 81 with the panoramic camera's 430-, 530- and 750-nanometer filters.

  11. 'El Capitan's' Scientific Gems

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This mosaic of images taken by the panoramic camera onboard the Mars Exploration Rover Opportunity shows the rock region dubbed 'El Capitan,' which lies within the larger outcrop near the rover's landing site. 'El Capitan' is being studied in great detail using the scientific instruments on the rover's arm; images from the panoramic camera help scientists choose the locations for this compositional work. The millimeter-scale detail of the lamination covering these rocks can be seen. The face of the rock to the right of the mosaic may be a future target for grinding with the rover's rock abrasion tool.

  12. UCXp camera imaging principle and key technologies of data post-processing

    NASA Astrophysics Data System (ADS)

    Yuan, Fangyan; Li, Guoqing; Zuo, Zhengli; Liu, Jianmin; Wu, Liang; Yu, Xiaoping; Zhao, Haitao

    2014-03-01

    The large format digital aerial camera product UCXp was introduced into the Chinese market in 2008; the image consists of 17310 columns and 11310 rows with a pixel size of 6 μm. The UCXp camera has many advantages compared with cameras of the same generation, with multiple lenses exposed almost at the same time and no oblique lens. The camera has a complex imaging process, whose principle will be detailed in this paper. On the other hand, the UCXp image post-processing method, including data pre-processing and orthophoto production, will be emphasized in this article. Based on data from the new Beichuan County, this paper will describe the data processing and its effects.

  13. LPT. Low power test control building (TAN641) east facade. Sign ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Low power test control building (TAN-641) east facade. Sign says "Energy and Systems Technology Laboratory, INEL" (Post-ANP-use). Camera facing west. INEEL negative no. HD-40-3-2 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  14. 20. VIEW OF TEST FACILITY IN 1967 WHEN EQUIPPED FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. VIEW OF TEST FACILITY IN 1967 WHEN EQUIPPED FOR DOSIMETER TEST BY HEALTH PHYSICISTS. CAMERA FACING EAST. INEL PHOTO NUMBER 76-2853, TAKEN MAY 16, 1967. PHOTOGRAPHER: CAPEK. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  15. Contextual view of Fyffe Avenue and Boone Drive. Dispensary (Naval ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Fyffe Avenue and Boone Drive. Dispensary (Naval Medical Center Oakland and Dental Clinic San Francisco Branch Clinics, Building no. 417) is shown at left. Camera facing northwest. - Naval Supply Annex Stockton, Rough & Ready Island, Stockton, San Joaquin County, CA

  16. Contextual view of Fyffe Avenue and Boone Drive. Dispensary (Naval ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Fyffe Avenue and Boone Drive. Dispensary (Naval Medical Center Oakland and Dental Clinic San Francisco Branch Clinics, building no. 417) is shown at the center. Camera facing northeast. - Naval Supply Annex Stockton, Rough & Ready Island, Stockton, San Joaquin County, CA

  17. SPERTI, Instrument Cell Building (PER606). Oblique view of north and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SPERT-I, Instrument Cell Building (PER-606). Oblique view of north and east facades. Camera facing southwest. Date: August 2003. INEEL negative no. HD-35-4-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  18. 6. CONSTRUCTION PROGRESS VIEW (EXTERIOR) OF TANK, CABLE CHASE, AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. CONSTRUCTION PROGRESS VIEW (EXTERIOR) OF TANK, CABLE CHASE, AND MOUNDED BUNKER. CONSTRUCTION WAS 99 PERCENT COMPLETE. CAMERA IS FACING WEST. INEL PHOTO NUMBER 65-5435, TAKEN OCTOBER 20, 1965. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  19. Contextual view showing building H70 at left with building H81 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing building H70 at left with building H81 at right in background; camera facing northeast. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  20. 3. VIEW OF ARVFS BUNKER TAKEN FROM APPROXIMATELY 150 FEET ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VIEW OF ARVFS BUNKER TAKEN FROM APPROXIMATELY 150 FEET EAST OF BUNKER DOOR. CAMERA FACING WEST. VIEW SHOWS EARTH MOUND COVERING CONTROL BUNKER AND REMAINS OF CABLE CHASE. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  1. FAST CHOPPER BUILDING, TRA665. DETAIL OF STEEL DOOR ENTRY TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FAST CHOPPER BUILDING, TRA-665. DETAIL OF STEEL DOOR ENTRY TO LOWER LEVEL. CAMERA FACING NORTH. INL NEGATIVE NO. HD42-1. Mike Crane, Photographer, 3/2004 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  2. KENNEDY SPACE CENTER, FLA. - NASA Vehicle Manager Scott Thurston (facing camera) talks to the media in the Orbiter Processing Facility. The media was invited to see the orbiter Atlantis as it is being prepared for Return to Flight. Both local and national reporters representing print and TV networks were able to see work in progress on Atlantis, including the reinstallation of the Reinforced Carbon-Carbon panels on the orbiter’s wing leading edge; wiring inspections; and checks of the engines in the Orbital Maneuvering System.

    NASA Image and Video Library

    2003-09-26

    KENNEDY SPACE CENTER, FLA. - NASA Vehicle Manager Scott Thurston (facing camera) talks to the media in the Orbiter Processing Facility. The media was invited to see the orbiter Atlantis as it is being prepared for Return to Flight. Both local and national reporters representing print and TV networks were able to see work in progress on Atlantis, including the reinstallation of the Reinforced Carbon-Carbon panels on the orbiter’s wing leading edge; wiring inspections; and checks of the engines in the Orbital Maneuvering System.

  3. Covariance analysis for evaluating head trackers

    NASA Astrophysics Data System (ADS)

    Kang, Donghoon

    2017-10-01

    Existing methods for evaluating the performance of head trackers usually rely on publicly available face databases, which contain facial images and the ground truths of their corresponding head orientations. However, most of the existing publicly available face databases are constructed by assuming that a frontal head orientation can be determined by compelling the person under examination to look straight ahead at the camera on the first video frame. Since nobody can accurately direct one's head toward the camera, this assumption may be unrealistic. Rather than obtaining estimation errors, we present a method for computing the covariance of estimation error rotations to evaluate the reliability of head trackers. As an uncertainty measure of estimators, the Schatten 2-norm of a square root of error covariance (or the algebraic average of relative error angles) can be used. The merit of the proposed method is that it does not disturb the person under examination by asking him to direct his head toward certain directions. Experimental results using real data validate the usefulness of our method.
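
    The uncertainty measure mentioned above reduces to a one-line computation once the error rotations are expressed as rotation vectors: the Schatten 2-norm (Frobenius norm) of the covariance square root equals the square root of the trace of the covariance. The sample error data below are an illustrative assumption.

    ```python
    # Sketch: Schatten 2-norm of the square root of the error covariance,
    # computed from error rotations expressed as rotation vectors.
    import numpy as np

    rng = np.random.default_rng(5)
    # Placeholder error rotation vectors (radians), e.g. tracker estimate vs. reference.
    error_rotvecs = rng.normal(0.0, np.deg2rad(2.0), size=(500, 3))

    cov = np.cov(error_rotvecs, rowvar=False)             # 3x3 covariance of error rotations
    # ||P^(1/2)||_S2 = sqrt(trace(P)): Frobenius norm of the covariance square root.
    uncertainty = np.sqrt(np.trace(cov))
    print("uncertainty (rad):", uncertainty, "=", np.rad2deg(uncertainty), "deg")
    ```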

  4. 3D for the people: multi-camera motion capture in the field with consumer-grade cameras and open source software

    PubMed Central

    Evangelista, Dennis J.; Ray, Dylan D.; Hedrick, Tyson L.

    2016-01-01

    ABSTRACT Ecological, behavioral and biomechanical studies often need to quantify animal movement and behavior in three dimensions. In laboratory studies, a common tool to accomplish these measurements is the use of multiple, calibrated high-speed cameras. Until very recently, the complexity, weight and cost of such cameras have made their deployment in field situations risky; furthermore, such cameras are not affordable to many researchers. Here, we show how inexpensive, consumer-grade cameras can adequately accomplish these measurements both within the laboratory and in the field. Combined with our methods and open source software, the availability of inexpensive, portable and rugged cameras will open up new areas of biological study by providing precise 3D tracking and quantification of animal and human movement to researchers in a wide variety of field and laboratory contexts. PMID:27444791
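
    The core reconstruction step behind such multi-camera measurements can be illustrated with standard linear (DLT) triangulation; the sketch below assumes two calibrated cameras with known 3x4 projection matrices and is generic textbook geometry, not the authors' specific pipeline.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Triangulate one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: matched pixel coords (u, v)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)       # least-squares solution of A X = 0
    X = Vt[-1]
    return X[:3] / X[3]               # homogeneous -> Euclidean coordinates
```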

  5. The sequence measurement system of the IR camera

    NASA Astrophysics Data System (ADS)

    Geng, Ai-hui; Han, Hong-xia; Zhang, Hai-bo

    2011-08-01

    IR cameras are currently widely used in opto-electronic tracking, opto-electronic measurement, fire control, and opto-electronic countermeasure applications. However, the output timing (sequence) of most IR cameras applied in engineering projects is complex, and the timing documents supplied by the manufacturers are not detailed. Because downstream continuous image transmission and image processing systems need the detailed timing of the IR camera, a sequence measurement system for IR cameras was designed and a detailed procedure for measuring the timing of the applied IR camera was carried out. FPGA programming combined with online observation using SignalTap is applied in the measurement system, the precise timing of the IR camera's output signal is obtained, and the detailed timing document is supplied to the continuous image transmission system, the image processing system, and so on. The sequence measurement system consists of a Camera Link input interface, an LVDS input interface, an FPGA, a Camera Link output interface, and other parts, of which the FPGA is the key component. The system accepts video signals in both Camera Link and LVDS formats, and because image processing and image memory cards usually use Camera Link as their input interface, the output of the sequence measurement system is also designed as a Camera Link interface. The system therefore performs the timing measurement while also acting as an interface converter for some cameras. Inside the FPGA, the sequence measurement program, the pixel clock modification, the SignalTap file configuration, and the SignalTap online observation are integrated to realize precise measurement of the IR camera. The measurement program, written in Verilog and combined with SignalTap online observation, counts the number of lines in one frame and the number of pixels in one line, and also measures the line offset and row offset of the image. For the complex timing of the IR camera's output signal, the sequence measurement system accurately measures the timing of the project-applied camera, supplies the detailed timing document to downstream systems such as the image processing and image transmission systems, and gives the concrete parameters of fval, lval, pixclk, line offset, and row offset. Experiments show that the sequence measurement system obtains precise timing measurements and works stably, laying a foundation for the downstream systems.
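
    As a rough software analogue of what the FPGA/SignalTap measurement recovers, the toy sketch below counts the lines in a frame and the pixels in each line from sampled frame-valid (fval) and line-valid (lval) signals. The signal names follow the abstract; the code itself is only an illustration of the counting logic, not the Verilog implementation.

```python
def measure_timing(fval, lval):
    """Count lines per frame and pixels per line from per-pixel-clock samples
    of fval and lval (assumed to cover a single frame)."""
    lines, pixels_per_line = 0, []
    prev_lval, count = 0, 0
    for f, l in zip(fval, lval):
        if not f:
            prev_lval = l
            continue                    # outside the active frame
        if l and not prev_lval:         # rising edge of lval: a new line starts
            lines += 1
            count = 0
        if l:
            count += 1                  # one pixel clock of valid data
        if prev_lval and not l:         # falling edge of lval: line finished
            pixels_per_line.append(count)
        prev_lval = l
    return lines, pixels_per_line
```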

  6. Strategic options towards an affordable high-performance infrared camera

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.

    2016-05-01

    The promise of infrared (IR) imaging attaining low cost akin to the success of CMOS sensors has been hampered, despite well-documented advantages, by the inability to achieve the cost reductions necessary for crossover from military and industrial applications into the consumer and mass-scale commercial realm. Banpil Photonics is developing affordable IR cameras by adopting new strategies to speed up the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: a 640x512-pixel uncooled InGaAs system with high sensitivity, low noise (<50 e-), high dynamic range (100 dB), high frame rates (>500 frames per second (FPS)) at full resolution, and low power consumption (<1 W) in a compact system. This camera paves the way towards mass-market adoption by not only demonstrating the high-performance IR imaging capability demanded by military and industrial applications, but also illuminating a path towards the justifiable price points essential for consumer-facing industries such as automotive, medical, and security imaging. The strategic options presented include new sensor manufacturing technologies that scale favorably towards automation, readout electronics compatible with multiple focal-plane arrays, and dense or ultra-small pixel pitch devices.

  7. Study of Cryogenic Complex Plasma

    DTIC Science & Technology

    2007-04-26

    enabled us to detect the formation of the Coulomb crystals as shown in Fig. 2. [Remainder of this record is figure-label residue from Fig. 2; the apparatus labels include: ring electrode, green laser, RF plasma, CCD camera, prism mirror, liquid He, glass tube, liquid N2, glass Dewar, acrylic particles, helium gas, pressure.]

  8. Standardized rendering from IR surveillance motion imagery

    NASA Astrophysics Data System (ADS)

    Prokoski, F. J.

    2014-06-01

    Government agencies, including defense and law enforcement, increasingly make use of video from surveillance systems and camera phones owned by non-government entities. Making advanced and standardized motion imaging technology available to private and commercial users at cost-effective prices would benefit all parties. In particular, incorporating thermal infrared into commercial surveillance systems offers substantial benefits beyond night vision capability. Face rendering is a process to facilitate exploitation of thermal infrared surveillance imagery from the general area of a crime scene, to assist investigations with and without cooperating eyewitnesses. Face rendering automatically generates greyscale representations, similar to police artist sketches, for faces in surveillance imagery collected from locations and times proximate to a crime under investigation. Near-real-time generation of face renderings can provide law enforcement with an investigation tool to assess witness memory and credibility, and to integrate reports from multiple eyewitnesses. Renderings can be quickly disseminated through social media to warn of a person who may pose an immediate threat, and to solicit the public's help in identifying possible suspects and witnesses. Renderings are pose-standardized so as not to divulge the presence and location of eyewitnesses and surveillance cameras. Incorporation of thermal infrared imaging into commercial surveillance systems will significantly improve system performance, and reduce manual review times, at an incremental cost that will continue to decrease. Benefits to criminal justice would include improved reliability of eyewitness testimony and improved accuracy of distinguishing among minority groups in eyewitness and surveillance identifications.

  9. Reverse alignment "mirror image" visualization as a laparoscopic training tool improves task performance.

    PubMed

    Dunnican, Ward J; Singh, T Paul; Ata, Ashar; Bendana, Emma E; Conlee, Thomas D; Dolce, Charles J; Ramakrishnan, Rakesh

    2010-06-01

    Reverse alignment (mirror image) visualization is a disconcerting situation occasionally faced during laparoscopic operations. This occurs when the camera faces back at the surgeon in the opposite direction from which the surgeon's body and instruments are facing. Most surgeons will attempt to optimize trocar and camera placement to avoid this situation. The authors' objective was to determine whether the intentional use of reverse alignment visualization during laparoscopic training would improve performance. A standard box trainer was configured for reverse alignment, and 34 medical students and junior surgical residents were randomized to train with either forward alignment (DIRECT) or reverse alignment (MIRROR) visualization. Enrollees were tested on both modalities before and after a 4-week structured training program specific to their modality. Student's t test was used to determine differences in task performance between the 2 groups. Twenty-one participants completed the study (10 DIRECT, 11 MIRROR). There were no significant differences in performance time between DIRECT or MIRROR participants during forward or reverse alignment initial testing. At final testing, DIRECT participants had improved times only in forward alignment performance; they demonstrated no significant improvement in reverse alignment performance. MIRROR participants had significant time improvement in both forward and reverse alignment performance at final testing. Reverse alignment imaging for laparoscopic training improves task performance for both reverse alignment and forward alignment tasks. This may be translated into improved performance in the operating room when faced with reverse alignment situations. Minimal lab training can account for drastic adaptation to this environment.

  10. Real-time vehicle matching for multi-camera tunnel surveillance

    NASA Astrophysics Data System (ADS)

    Jelača, Vedran; Niño Castañeda, Jorge Oswaldo; Frías-Velázquez, Andrés; Pižurica, Aleksandra; Philips, Wilfried

    2011-03-01

    Tracking multiple vehicles with multiple cameras is a challenging problem of great importance in tunnel surveillance. One of the main challenges is accurate vehicle matching across the cameras with non-overlapping fields of view. Since systems dedicated to this task can contain hundreds of cameras which observe dozens of vehicles each, for a real-time performance computational efficiency is essential. In this paper, we propose a low complexity, yet highly accurate method for vehicle matching using vehicle signatures composed of Radon transform like projection profiles of the vehicle image. The proposed signatures can be calculated by a simple scan-line algorithm, by the camera software itself and transmitted to the central server or to the other cameras in a smart camera environment. The amount of data is drastically reduced compared to the whole image, which relaxes the data link capacity requirements. Experiments on real vehicle images, extracted from video sequences recorded in a tunnel by two distant security cameras, validate our approach.
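
    A minimal sketch of the signature idea described above: horizontal and vertical projection profiles (simple scan-line sums, in the spirit of a Radon transform) form a compact vehicle signature that can be compared by normalized correlation. It assumes grayscale vehicle crops resampled to a common size; the details are illustrative, not the paper's exact algorithm.

```python
import numpy as np

def signature(vehicle_img):
    """Projection-profile signature of a grayscale vehicle crop."""
    img = vehicle_img.astype(np.float64)
    return np.concatenate([img.sum(axis=0), img.sum(axis=1)])  # column + row sums

def match_score(sig_a, sig_b):
    """Normalized correlation between two equal-length signatures (1.0 = identical shape)."""
    a = (sig_a - sig_a.mean()) / (sig_a.std() + 1e-9)
    b = (sig_b - sig_b.mean()) / (sig_b.std() + 1e-9)
    return float(np.mean(a * b))
```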

  11. Image quality assessment for selfies with and without super resolution

    NASA Astrophysics Data System (ADS)

    Kubota, Aya; Gohshi, Seiichi

    2018-04-01

    With the advent of cellphone cameras, in particular on smartphones, many people now take photos of themselves, alone or with others in the frame; such photos are popularly known as "selfies". Most smartphones are equipped with two cameras: a front-facing and a rear camera. The camera located on the back of the smartphone is referred to as the "out-camera," whereas the one located on the front is called the "in-camera." In-cameras are mainly used for selfies. Some smartphones feature high-resolution cameras; however, the full image quality cannot be obtained because smartphone cameras often have low-performance lenses. Super resolution (SR) is one of the recent technological advancements that has increased image resolution, and we developed a new SR technology that can be processed on smartphones. Smartphones with the new SR technology are currently available on the market and have already registered sales. However, the effectiveness of the new SR technology has not yet been verified. Comparing the image quality with and without SR on a smartphone display is necessary to confirm the usefulness of this new technology. Methods based on objective and subjective assessment are required to quantitatively measure image quality. It is known that typical objective assessment values, such as the Peak Signal to Noise Ratio (PSNR), do not always agree with how we perceive image and video quality. When digital broadcasting started, the standard was determined using subjective assessment. Although subjective assessment usually comes at a high cost because of personnel expenses for observers, the results are highly reproducible when the tests are conducted under the right conditions and analyzed statistically. In this study, the subjective assessment results for selfie images are reported.
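
    For reference, the objective metric mentioned above (PSNR) is straightforward to compute; the short sketch below is the standard definition for 8-bit images and is included only to make the objective/subjective contrast concrete.

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB between two same-sized images."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float('inf')               # images are identical
    return 10.0 * np.log10(peak ** 2 / mse)
```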

  12. 2. EXTERIOR VIEW OF DOWNSTREAM SIDE OF COTTAGE 191 TAKEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. EXTERIOR VIEW OF DOWNSTREAM SIDE OF COTTAGE 191 TAKEN FROM ROOF OF GARAGE 393. CAMERA FACING SOUTHEAST. COTTAGE 181 AND CHILDREN'S PLAY AREA VISIBLE ON EITHER SIDE OF ROOF. GRAPE ARBOR IN FOREGROUND. - Swan Falls Village, Cottage 191, Snake River, Kuna, Ada County, ID

  13. Contextual view of the Hall of Transportation from Yerba Buena ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of the Hall of Transportation from Yerba Buena Island, showing Palace of Fine and Decorative Arts (Building 3) at far right, camera facing northwest - Golden Gate International Exposition, Hall of Transportation, 440 California Avenue, Treasure Island, San Francisco, San Francisco County, CA

  14. FAN HOUSE INTERIOR. THREE MOTOR DRIVES FOR POSITIVE DISPLACEMENT BLOWERS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FAN HOUSE INTERIOR. THREE MOTOR DRIVES FOR POSITIVE DISPLACEMENT BLOWERS LINE UP ON NORTH WALL. CONCRETE PEDESTALS. CAMERA FACES NORTHEAST. INL NEGATIVE NO. 4291. Unknown Photographer, 2/26/1952 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  15. Interior view of west main room in original two-room portion. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of west main room in original two-room portion. Note muslin ceiling temporarily tacked up by the HABS team to afford clearer view. Camera facing west. - Warner Ranch, Ranch House, San Felipe Road (State Highway S2), Warner Springs, San Diego County, CA

  16. Methodology of the determination of the uncertainties by using the biometric device the broadway 3D

    NASA Astrophysics Data System (ADS)

    Jasek, Roman; Talandova, Hana; Adamek, Milan

    2016-06-01

    Biometric identification by face is among the most widely used methods of biometric identification because it provides fast and accurate identification, and it has therefore been adopted in the security area. A 3D face reader from the manufacturer Broadway was used for the measurements. It is equipped with a 3D camera system that uses structured-light scanning and saves the template as a 3D model of the face. The obtained data were evaluated with the Turnstile Enrolment Application (TEA) software. The measurements used the Broadway 3D face reader. First, the person was scanned and stored in the database; thereafter, the person was compared with the template stored in the database for each method. Finally, a measure of reliability was evaluated for the Broadway 3D face reader.

  17. Dark Spots on Titan

    NASA Image and Video Library

    2005-05-02

    This recent image of Titan reveals more complex patterns of bright and dark regions on the surface, including a small, dark, circular feature, completely surrounded by brighter material. During the two most recent flybys of Titan, on March 31 and April 16, 2005, Cassini captured a number of images of the hemisphere of Titan that faces Saturn. The image at the left is taken from a mosaic of images obtained in March 2005 (see PIA06222) and shows the location of the more recently acquired image at the right. The new image shows intriguing details in the bright and dark patterns near an 80-kilometer-wide (50-mile) crater seen first by Cassini's synthetic aperture radar experiment during a Titan flyby in February 2005 (see PIA07368) and subsequently seen by the imaging science subsystem cameras as a dark spot (center of the image at the left). Interestingly, a smaller, roughly 20-kilometer-wide (12-mile), dark and circular feature can be seen within an irregularly-shaped, brighter ring, and is similar to the larger dark spot associated with the radar crater. However, the imaging cameras see only brightness variations, and without topographic information, the identity of this feature as an impact crater cannot be conclusively determined from this image. The visual infrared mapping spectrometer, which is sensitive to longer wavelengths where Titan's atmospheric haze is less obscuring -- observed this area simultaneously with the imaging cameras, so those data, and perhaps future observations by Cassini's radar, may help to answer the question of this feature's origin. The new image at the right consists of five images that have been added together and enhanced to bring out surface detail and to reduce noise, although some camera artifacts remain. These images were taken with the Cassini spacecraft narrow-angle camera using a filter sensitive to wavelengths of infrared light centered at 938 nanometers -- considered to be the imaging science subsystem's best spectral filter for observing the surface of Titan. This view was acquired from a distance of 33,000 kilometers (20,500 miles). The pixel scale of this image is 390 meters (0.2 miles) per pixel, although the actual resolution is likely to be several times larger. http://photojournal.jpl.nasa.gov/catalog/PIA06234

  18. An Orientation Sensor-Based Head Tracking System for Driver Behaviour Monitoring.

    PubMed

    Zhao, Yifan; Görne, Lorenz; Yuen, Iek-Man; Cao, Dongpu; Sullman, Mark; Auger, Daniel; Lv, Chen; Wang, Huaji; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros

    2017-11-22

    Although at present legislation does not allow drivers in a Level 3 autonomous vehicle to engage in a secondary task, there may come a time when it does. Monitoring the behaviour of drivers engaging in various non-driving activities (NDAs) is crucial to decide how well the driver will be able to take over control of the vehicle. One limitation of the commonly used face-based head tracking systems, which use cameras, is that sufficient features of the face must be visible, which limits the detectable angle of head movement and thereby the measurable NDAs, unless multiple cameras are used. This paper proposes a novel orientation-sensor-based head tracking system that includes twin devices, one of which measures the movement of the vehicle while the other measures the absolute movement of the head. Measurement errors in the shaking and nodding axes were less than 0.4°, while the error in the rolling axis was less than 2°. Comparison with a camera-based system, through in-house tests and on-road tests, showed that the main advantage of the proposed system is the ability to detect angles larger than 20° in the shaking and nodding axes. Finally, a case study demonstrated that the measurement of the shaking and nodding angles produced by the proposed system can effectively characterise the drivers' behaviour while engaged in the NDAs of chatting to a passenger and playing on a smartphone.
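
    A minimal sketch of the twin-device idea, assuming both sensors report absolute orientation as unit quaternions (w, x, y, z): removing the vehicle's orientation from the head sensor's reading leaves the head-in-vehicle rotation, from which nod/shake/roll angles can be extracted. The quaternion convention and function names are assumptions, not taken from the paper.

```python
import numpy as np

def quat_conj(q):
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def head_relative_to_vehicle(q_head_abs, q_vehicle_abs):
    """Head orientation expressed in the vehicle frame (unit quaternions)."""
    return quat_mul(quat_conj(q_vehicle_abs), q_head_abs)
```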

  19. Nonintrusive iris image acquisition system based on a pan-tilt-zoom camera and light stripe projection

    NASA Astrophysics Data System (ADS)

    Yoon, Soweon; Jung, Ho Gi; Park, Kang Ryoung; Kim, Jaihie

    2009-03-01

    Although iris recognition is one of the most accurate biometric technologies, it has not yet been widely used in practical applications. This is mainly due to user inconvenience during the image acquisition phase. Specifically, users must try to adjust their eye position within a small capture volume at a close distance from the system. To overcome these problems, we propose a novel iris image acquisition system that provides users with an unconstrained environment: a large operating range, movement allowed from a standing posture, and good-quality iris images captured in an acceptable time. The proposed system makes the following three contributions compared with previous works: (1) the capture volume is significantly increased by using a pan-tilt-zoom (PTZ) camera guided by light stripe projection, (2) the iris location in the large capture volume is found quickly by 1-D vertical face searching from the user's horizontal position obtained by the light stripe projection, and (3) zooming and focusing on the user's irises at a distance are accurate and fast using the 3-D position of the face estimated by the light stripe projection and the PTZ camera. Experimental results show that the proposed system can capture good-quality iris images in 2.479 s on average at a distance of 1.5 to 3 m, while allowing a limited amount of movement by the user.

  20. Automatic 2.5-D Facial Landmarking and Emotion Annotation for Social Interaction Assistance.

    PubMed

    Zhao, Xi; Zou, Jianhua; Li, Huibin; Dellandrea, Emmanuel; Kakadiaris, Ioannis A; Chen, Liming

    2016-09-01

    People with low vision, Alzheimer's disease, and autism spectrum disorder experience difficulties in perceiving or interpreting facial expression of emotion in their social lives. Though automatic facial expression recognition (FER) methods on 2-D videos have been extensively investigated, their performance was constrained by challenges in head pose and lighting conditions. The shape information in 3-D facial data can reduce or even overcome these challenges. However, high expenses of 3-D cameras prevent their widespread use. Fortunately, 2.5-D facial data from emerging portable RGB-D cameras provide a good balance for this dilemma. In this paper, we propose an automatic emotion annotation solution on 2.5-D facial data collected from RGB-D cameras. The solution consists of a facial landmarking method and a FER method. Specifically, we propose building a deformable partial face model and fit the model to a 2.5-D face for localizing facial landmarks automatically. In FER, a novel action unit (AU) space-based FER method has been proposed. Facial features are extracted using landmarks and further represented as coordinates in the AU space, which are classified into facial expressions. Evaluated on three publicly accessible facial databases, namely EURECOM, FRGC, and Bosphorus databases, the proposed facial landmarking and expression recognition methods have achieved satisfactory results. Possible real-world applications using our algorithms have also been discussed.

  1. High-accuracy and robust face recognition system based on optical parallel correlator using a temporal image sequence

    NASA Astrophysics Data System (ADS)

    Watanabe, Eriko; Ishikawa, Mami; Ohta, Maiko; Kodate, Kashiko

    2005-09-01

    Face recognition is used in a wide range of security systems, such as monitoring credit card use, searching for individuals with street cameras via the Internet, and maintaining immigration control. There are still many technical subjects under study. For instance, the number of images that can be stored is limited under the current system, and the rate of recognition must be improved to account for photo shots taken at different angles under various conditions. We implemented a fully automatic Fast Face Recognition Optical Correlator (FARCO) system by using a 1000 frame/s optical parallel correlator designed and assembled by us. Operational speed for the 1:N (i.e. matching a pair of images among N, where N refers to the number of images in the database) identification experiment (4000 face images) amounts to less than 1.5 seconds, including the pre/post processing. From trial 1:N identification experiments using FARCO, we acquired low error rates of 2.6% False Reject Rate and 1.3% False Accept Rate. By making the most of the high-speed data-processing capability of this system, much more robustness can be achieved for various recognition conditions when large-category data are registered for a single person. We propose a face recognition algorithm for the FARCO that employs a temporal image sequence of moving images. Applying this algorithm to natural postures, we achieved a recognition rate two times higher than that of our conventional system. The system has high potential for future use in a variety of applications, such as searching for criminal suspects using street and airport video cameras, registration of babies at hospitals, or handling an immeasurable number of images in a database.

  2. I spy with my little eye: typical, daily exposure to faces documented from a first-person infant perspective.

    PubMed

    Sugden, Nicole A; Mohamed-Ali, Marwan I; Moulson, Margaret C

    2014-02-01

    Exposure to faces is known to shape and change the face processing system; however, no study has yet documented infants' natural daily first-hand exposure to faces. One- and three-month-old infants' visual experience was recorded through head-mounted cameras. The video recordings were coded for faces to determine: (1) How often are infants exposed to faces? (2) To what type of faces are they exposed? and (3) Do frequently encountered face types reflect infants' typical pattern of perceptual narrowing? As hypothesized, infants spent a large proportion of their time (25%) exposed to faces; these faces were primarily female (70%), own-race (96%), and adult-age (81%). Infants were exposed to more individual exemplars of female, own-race, and adult-age faces than to male, other-race, and child- or older-adult-age faces. Each exposure to own-race faces was longer than to other-race faces. There were no differences in exposure duration related to the gender or age of the face. Previous research has found that the face types frequently experienced by our participants are preferred over and more successfully recognized than other face types. The patterns of face exposure revealed in the current study coincide with the known trajectory of perceptual narrowing seen later in infancy. © 2013 The Authors. Developmental Psychobiology Published by Wiley Periodicals, Inc.

  3. HOT CELL BUILDING, TRA632. CONTEXTUAL AERIAL VIEW OF HOT CELL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT CELL BUILDING, TRA-632. CONTEXTUAL AERIAL VIEW OF HOT CELL BUILDING, IN VIEW AT LEFT, AS YET WITHOUT ROOF. PLUG STORAGE BUILDING LIES BETWEEN IT AND THE SOUTH SIDE OF THE MTR BUILDING AND ITS WING. NOTE CONCRETE DRIVE BETWEEN ROLL-UP DOOR IN MTR BUILDING AND CHARGING FACE OF PLUG STORAGE. REACTOR SERVICES BUILDING (TRA-635) WILL COVER THIS DRIVE AND BUTT UP TO CHARGING FACE. DOTTED LINE IS ON ORIGINAL NEGATIVE. TRA PARKING LOT IN LEFT CORNER OF THE VIEW. CAMERA FACING NORTHWESTERLY. INL NEGATIVE NO. 8274. Unknown Photographer, 7/2/1953 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  4. Instruments for Imaging from Far to Near

    NASA Technical Reports Server (NTRS)

    Mungas, Greg; Boynton, John; Sepulveda, Cesar

    2009-01-01

    The acronym CHAMP (signifying camera, hand lens, and microscope ) denotes any of several proposed optoelectronic instruments that would be capable of color imaging at working distances that could be varied continuously through a range from infinity down to several millimeters. As in any optical instrument, the magnification, depth of field, and spatial resolution would vary with the working distance. For example, in one CHAMP version, at a working distance of 2.5 m, the instrument would function as an electronic camera with a magnification of 1/100, whereas at a working distance of 7 mm, the instrument would function as a microscope/electronic camera with a magnification of 4.4. Moreover, as described below, when operating at or near the shortest-working-distance/highest-magnification combination, a CHAMP could be made to perform one or more spectral imaging functions. CHAMPs were originally intended to be used in robotic geological exploration of the Moon and Mars. The CHAMP concept also has potential for diverse terrestrial applications that could include remotely controlled or robotic geological exploration, prospecting, field microbiology, environmental surveying, and assembly- line inspection. A CHAMP (see figure) would include two lens cells: (1) a distal cell corresponding to the objective lens assembly of a conventional telescope or microscope and (2) a proximal cell that would contain the focusing camera lens assembly and the camera electronic image-detector chip, which would be of the active-pixel-sensor (APS) type. The distal lens cell would face outward from a housing, while the proximal lens cell would lie in a clean environment inside the housing. The proximal lens cell would contain a beam splitter that would enable simultaneous use of the imaging optics (that is, proximal and distal lens assemblies) for imaging and illumination of the field of view. The APS chip would be mounted on a focal plane on a side face of the beam splitter, while light for illuminating the field of view would enter the imaging optics via the end face of the beam splitter. The proximal lens cell would be mounted on a sled that could be translated along the optical axis for focus adjustment. The position of the CHAMP would initially be chosen at the desired working distance of the distal lens from (corresponding to an approximate desired magnification of) an object to be examined. During subsequent operation, the working distance would ordinarily remain fixed at the chosen value and the position of the proximal lens cell within the instrument would be adjusted for focus as needed.

  5. ETR ELECTRICAL BUILDING, TRA648. EMERGENCY STANDBY GENERATOR AND DIESEL UNIT. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR ELECTRICAL BUILDING, TRA-648. EMERGENCY STANDBY GENERATOR AND DIESEL UNIT. METAL ROOF AND PUMICE BLOCK WALLS. CAMERA FACING SOUTHWEST. INL NEGATIVE NO. 56-3708. R.G. Larsen, Photographer, 11/13/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  6. ETR BUILDING, TRA642, INTERIOR. CONSOLE FLOOR, SOUTH HALF. CABLE TUNNEL. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. CONSOLE FLOOR, SOUTH HALF. CABLE TUNNEL. CAMERA FACING SOUTH INTO ETR ELECTRICAL BUILDING (TRA-648). INL NEGATIVE NO. HD46-20-2. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. 4. CONSTRUCTION PROGRESS VIEW OF EQUIPMENT IN FRONT PART OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. CONSTRUCTION PROGRESS VIEW OF EQUIPMENT IN FRONT PART OF CONTROL BUNKER (TRANSFORMER, HYDRAULIC TANK, PUMP, MOTOR). SHOWS UNLINED CORRUGATED METAL WALL. CAMERA FACING EAST. INEL PHOTO NUMBER 65-5433, TAKEN OCTOBER 20, 1965. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  8. ETR HEAT EXCHANGER BUILDING, TRA644. DETAIL OF SOUTH SIDE BUILDING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR HEAT EXCHANGER BUILDING, TRA-644. DETAIL OF SOUTH SIDE BUILDING INSET. DEMINERALIZER WING AT RIGHT. CAMERA FACING NORTH. INL NEGATIVE NO. HD46-36-2. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. SPERTI Terminal Building (PER604) with view into interior. Storage tanks ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SPERT-I Terminal Building (PER-604) with view into interior. Storage tanks and equipment in view. Camera facing west. Photographer: R.G. Larsen. Date: May 20, 1955. INEEL negative no. 55-1291 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  10. How Does Your Face Compute?

    ERIC Educational Resources Information Center

    Mathes, Len

    1999-01-01

    Offers a solution to overcoming students' lack of motivation for self-portrait assignments and complaints about their final product. Explains that a digital camera, computer, and a flatbed scanner provide students with the means to create self-portraits in their exact likeness by transforming a photograph of themselves using photograph enhancing…

  11. 24. VIEW OF CANYON TAKEN FROM NORTH CANYON RIM AROUND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. VIEW OF CANYON TAKEN FROM NORTH CANYON RIM AROUND 1920. CAMERA FACES SOUTH. VILLAGE IS TREE-COVERED AREA TO LEFT OF DAM AND POWERHOUSE. SUPERINTENDENT SAM GLASS'S ORCHARD IS DOWNSTREAM OF DAM ABOUT A QUARTER OF A MILE. - Swan Falls Village, Snake River, Kuna, Ada County, ID

  12. Detail of west side, showing the second story of the two-story ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of west side, showing the second story of the two-story bay and standing-seam metal roof, camera facing northeast - Naval Training Station, Senior Officers' Quarters District, Quarters No. 1, Naval Station Treasure Island, 1 Whiting Way, Yerba Buena Island, San Francisco, San Francisco County, CA

  13. A&M. TAN607. Southern sections added in expansion project of 1957. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. TAN-607. Southern sections added in expansion project of 1957. Camera facing northwest. Concrete decontamination section on left end. Photographer: Jack L. Anderson. Date: October 23, 1957. INEEL negative no. 57-5337 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  14. Center of parcel with picture tube wall along walkway. Leaning ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Center of parcel with picture tube wall along walkway. Leaning Tower of Bottle Village at frame right; oblique view of Rumpus Room, remnants of Little Hut destroyed by Northridge earthquake at frame left. Camera facing northeast. - Grandma Prisbrey's Bottle Village, 4595 Cochran Street, Simi Valley, Ventura County, CA

  15. The So-Called 'Face on Mars' in Infrared

    NASA Technical Reports Server (NTRS)

    2002-01-01

    [figure removed for brevity, see original site] (Released 24 July 2002) This set of THEMIS infrared images shows the so-called 'face on Mars' landform located in the northern plains of Mars near 40° N, 10° W (350° E). The 'face' is located near the center of the image approximately 1/6 of the way down from the top, and is one of a large number of knobs, mesas, hills, and buttes that are visible in this THEMIS image. The THEMIS infrared camera has ten different filters between 6.2 and 15 micrometers - nine view the surface and one views the CO2 atmosphere. The calibrated and geometrically projected data from all of the nine surface-viewing filters are shown in this figure. The major differences seen in this region are due to temperature effects -- sunlit slopes are warm (bright), whereas those in shadow are cold (dark). The temperature in this scene ranges from -50° C (darkest) to -15° C (brightest). The major differences between the different filters are due to the expected variation in the amount of energy emitted from the surface at different wavelengths. Minor spectral differences (infrared 'color') also exist between the different filters, but these differences are small in this region due to the uniform composition of the rocks and soils exposed at the surface. The THEMIS infrared camera provides an excellent regional view of Mars - this image covers an area 32 kilometers (20 miles) by approximately 200 kilometers (125 miles) at a resolution of 100 meters per picture element ('pixel'). This image provides a broad perspective of the landscape and geology of the Cydonia region, showing numerous knobs and hills that have been eroded into a remarkable array of different shapes. In this 'big picture' view the Cydonia region is seen to be covered with dozens of interesting knobs and mesas that are similar in many ways to the knob named the 'face' - so many in fact that it requires care to discover the 'face' among this jumble of knobs and hills. The 3-km long 'face' knob was first imaged by the Viking spacecraft in the 1970's and was seen by some to resemble a face carved into the rocks of Mars. Since that time the Mars Orbiter Camera on the Mars Global Surveyor spacecraft has provided detailed views of this hill that clearly show that it is a normal geologic feature with slopes and ridges carved by eons of wind and downslope motion due to gravity. Many of the knobs in Cydonia, including the 'face', have several flat ledges partway up the hill slopes. These ledges are made of more resistant layers of rock and are the last remnants of layers that once were continuous across this entire region. Erosion has completely removed these layers in most places, leaving behind only the small isolated hills and knobs seen today.

  16. Researches on hazard avoidance cameras calibration of Lunar Rover

    NASA Astrophysics Data System (ADS)

    Li, Chunyan; Wang, Li; Lu, Xin; Chen, Jihua; Fan, Shenghong

    2017-11-01

    China's Lunar Lander and Rover will be launched in 2013 to accomplish the mission objectives of lunar soft landing and patrol (roving) exploration. The Lunar Rover has a forward-facing stereo camera pair (Hazcams) for hazard avoidance, and Hazcam calibration is essential for stereo vision. The Hazcam optics are f-theta fish-eye lenses with a 120°×120° horizontal/vertical field of view (FOV) and a 170° diagonal FOV. They introduce significant distortion, and the acquired images are so warped that conventional camera calibration algorithms no longer work well. A photogrammetric calibration method for the geometric model of this type of optical fish-eye construction is investigated in this paper. In the method, the Hazcam model is represented by collinearity equations with interior orientation and exterior orientation parameters [1] [2]. For high-precision applications, the accurate calibration model is formulated with radial symmetric distortion and decentering distortion, as well as parameters to model affinity and shear, based on the fisheye deformation model [3] [4]. The proposed method has been applied to the stereo camera calibration system for the Lunar Rover.
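
    For orientation, an equidistant (f-theta) fisheye projection with a simple radial-distortion polynomial looks roughly like the sketch below; the parameterization and names are illustrative and deliberately simpler than the full collinearity-equation model with decentering, affinity, and shear terms cited in the abstract.

```python
import numpy as np

def fisheye_project(point_cam, f, cx, cy, k=(0.0, 0.0)):
    """Project a 3D point in camera coordinates with an f-theta fisheye model."""
    x, y, z = point_cam
    theta = np.arctan2(np.hypot(x, y), z)                   # angle from the optical axis
    theta_d = theta * (1 + k[0]*theta**2 + k[1]*theta**4)   # radial distortion polynomial
    phi = np.arctan2(y, x)
    r = f * theta_d                                         # f-theta mapping
    return cx + r * np.cos(phi), cy + r * np.sin(phi)
```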

  17. Thermal imaging as a smartphone application: exploring and implementing a new concept

    NASA Astrophysics Data System (ADS)

    Yanai, Omer

    2014-06-01

    Today's world is going mobile. Smartphone devices have become an important part of everyday life for billions of people around the globe. Thermal imaging cameras have been around for half a century and are now making their way into our daily lives. Originally built for military applications, thermal cameras are starting to be considered for personal use, enabling enhanced vision and temperature mapping for different groups of professionals. Through a revolutionary concept that turns smartphones into fully functional thermal cameras, we have explored how these two worlds can converge by utilizing the best of each technology. We will present the thought process, design considerations, and outcome of our development process, resulting in a low-power, high-resolution, lightweight USB thermal imaging device that turns Android smartphones into thermal cameras. We will discuss the technological challenges that we faced during the development of the product and the system design decisions taken during implementation. We will provide some insights we came across during this development process. Finally, we will discuss the opportunities that this innovative technology brings to the market.

  18. Efficient large-scale graph data optimization for intelligent video surveillance

    NASA Astrophysics Data System (ADS)

    Shang, Quanhong; Zhang, Shujun; Wang, Yanbo; Sun, Chen; Wang, Zepeng; Zhang, Luming

    2017-08-01

    Society is rapidly adopting cameras in a wide variety of locations and applications: site traffic monitoring, parking lot surveillance, cars, and smart spaces. These cameras provide data every day that must be analyzed in an effective way. Recent advances in sensor manufacturing, communications, and computing are stimulating the development of new applications that can transform traditional vision systems into ubiquitous smart camera networks. The analysis of visual cues in multi-camera networks enables a wide range of applications, from smart home and office automation to large-area surveillance and traffic monitoring. Much prior research has addressed dense camera networks, in which most cameras have large overlapping fields of view; here we focus on sparse camera networks. A sparse camera network performs large-area surveillance using as few cameras as possible, so most cameras do not overlap each other's fields of view. This task is challenging because of the lack of knowledge of the network topology, the changes in target appearance and motion across different views, and the difficulty of understanding complex events in the network. In this review paper, we present a comprehensive survey of recent results on topology learning, object appearance modeling, and global activity understanding in sparse camera networks. In addition, some current open research issues are discussed.

  19. Integrated Lloyd's mirror on planar waveguide facet as a spectrometer.

    PubMed

    Morand, Alain; Benech, Pierre; Gri, Martine

    2017-12-10

    A low-cost and simple Fourier transform spectrometer based on the Lloyd's mirror configuration is proposed in order to obtain a very stable interferogram. A planar waveguide coupled to a fiber injection is used to spatially disperse the optical beam. A second beam, superposed on the first, is obtained by total reflection of the incident beam on a vertical glass face integrated in the chip by dicing with a specific circular precision saw. The interferogram at the waveguide output is imaged on a near-infrared camera with an objective lens. The contrast and the fringe period thus depend on the fiber type and position and can be optimized for the pixel size and length of the camera sensor. A spectral resolution close to λ/Δλ=80 is reached with a camera with 320 pixels of 25 μm width in a wavelength range spanning the O to L bands.

  20. How much camera separation should be used for the capture and presentation of 3D stereoscopic imagery on binocular HMDs?

    NASA Astrophysics Data System (ADS)

    McIntire, John; Geiselman, Eric; Heft, Eric; Havig, Paul

    2011-06-01

    Designers, researchers, and users of binocular stereoscopic head- or helmet-mounted displays (HMDs) face the tricky issue of what imagery to present in their particular displays, and how to do so effectively. Stereoscopic imagery must often be created in-house with a 3D graphics program or from within a 3D virtual environment, or stereoscopic photos/videos must be carefully captured, perhaps for relaying to an operator in a teleoperative system. In such situations, the question arises as to what camera separation (real or virtual) is appropriate or desirable for end-users and operators. We review some of the relevant literature regarding the question of stereo pair camera separation using desk-mounted or larger scale stereoscopic displays, and apply our findings to potential HMD applications, including command & control, teleoperation, information and scientific visualization, and entertainment.
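
    The geometric trade-off at the heart of the camera-separation question can be made concrete with the textbook parallel-camera (off-axis) stereo model sketched below; it relates camera separation, focal length, convergence distance, and object depth to screen parallax. The names and the simplified model are standard stereo geometry, not taken from this review.

```python
def screen_parallax_mm(separation_mm, focal_mm, convergence_mm, depth_mm, display_mag):
    """Screen parallax for a point at depth_mm, given parallel cameras converged by
    image shift at convergence_mm. Positive: behind the screen; negative: in front."""
    sensor_disparity = focal_mm * separation_mm * (1.0 / convergence_mm - 1.0 / depth_mm)
    return display_mag * sensor_disparity
```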

  1. Feasibility evaluation of a motion detection system with face images for stereotactic radiosurgery.

    PubMed

    Yamakawa, Takuya; Ogawa, Koichi; Iyatomi, Hitoshi; Kunieda, Etsuo

    2011-01-01

    In stereotactic radiosurgery we can irradiate a targeted volume precisely with a narrow high-energy x-ray beam, and thus the motion of a targeted area may cause side effects to normal organs. This paper describes our motion detection system with three USB cameras. To reduce the effect of change in illuminance in a tracking area we used an infrared light and USB cameras that were sensitive to the infrared light. The motion detection of a patient was performed by tracking his/her ears and nose with three USB cameras, where pattern matching between a predefined template image for each view and acquired images was done by an exhaustive search method with a general-purpose computing on a graphics processing unit (GPGPU). The results of the experiments showed that the measurement accuracy of our system was less than 0.7 mm, amounting to less than half of that of our previous system.
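
    The per-camera tracking step described above (exhaustive matching of predefined ear/nose templates) can be sketched with OpenCV's normalized cross-correlation; this CPU version stands in for the GPGPU search and uses illustrative names.

```python
import cv2

def track_landmark(frame_gray, template_gray):
    """Locate a predefined template in a grayscale frame by exhaustive
    normalized cross-correlation; returns the match center and its score."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    h, w = template_gray.shape[:2]
    center = (max_loc[0] + w // 2, max_loc[1] + h // 2)
    return center, max_val
```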

  2. Design and realization of an AEC&AGC system for the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Feng, Bing; Wang, Peng; Li, Yan; Wei, Hao yun

    2015-08-01

    An AEC and AGC (Automatic Exposure Control and Automatic Gain Control) system was designed for a CCD aerial camera with a fixed aperture and an electronic shutter. The usual AEC and AGC algorithms are not suitable for the aerial camera, since the camera always takes high-resolution photographs while moving at high speed. The AEC and AGC system adjusts the electronic shutter and camera gain automatically according to the target brightness and the moving speed of the aircraft. An automatic Gamma correction is applied before the image is output so that the image is better suited for viewing and analysis by human eyes. The AEC and AGC system can avoid underexposure, overexposure, or image blurring caused by fast motion or environmental vibration. A series of tests proved that the system meets the requirements of the camera system, with fast adjustment speed, high adaptability, and high reliability in severe and complex environments.
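
    A toy control step in the spirit of the abstract, adjusting the electronic shutter first and the gain only as a last resort to reach a target mean brightness; all thresholds, limits, and names are illustrative assumptions, not values from the paper.

```python
def auto_exposure_step(mean_brightness, shutter_us, gain_db,
                       target=120.0, shutter_max_us=2000.0, gain_max_db=18.0):
    """One AEC/AGC update based on the current mean image brightness."""
    error = target - mean_brightness
    # proportional update of exposure time, clamped to the allowed range
    shutter_us = min(max(shutter_us * (1.0 + 0.5 * error / target), 10.0), shutter_max_us)
    # raise gain only once the shutter has hit its ceiling (keeps noise low);
    # back the gain off when the scene is bright enough
    if error > 0 and shutter_us >= shutter_max_us:
        gain_db = min(gain_db + 1.0, gain_max_db)
    elif error < 0 and gain_db > 0.0:
        gain_db = max(gain_db - 1.0, 0.0)
    return shutter_us, gain_db
```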

  3. Virtual Vision

    NASA Astrophysics Data System (ADS)

    Terzopoulos, Demetri; Qureshi, Faisal Z.

    Computer vision and sensor networks researchers are increasingly motivated to investigate complex multi-camera sensing and control issues that arise in the automatic visual surveillance of extensive, highly populated public spaces such as airports and train stations. However, they often encounter serious impediments to deploying and experimenting with large-scale physical camera networks in such real-world environments. We propose an alternative approach called "Virtual Vision", which facilitates this type of research through the virtual reality simulation of populated urban spaces, camera sensor networks, and computer vision on commodity computers. We demonstrate the usefulness of our approach by developing two highly automated surveillance systems comprising passive and active pan/tilt/zoom cameras that are deployed in a virtual train station environment populated by autonomous, lifelike virtual pedestrians. The easily reconfigurable virtual cameras distributed in this environment generate synthetic video feeds that emulate those acquired by real surveillance cameras monitoring public spaces. The novel multi-camera control strategies that we describe enable the cameras to collaborate in persistently observing pedestrians of interest and in acquiring close-up videos of pedestrians in designated areas.

  4. Error analysis for creating 3D face templates based on cylindrical quad-tree structure

    NASA Astrophysics Data System (ADS)

    Gutfeter, Weronika

    2015-09-01

    The development of new biometric algorithms parallels advances in sensing-device technology. Some of the limitations of current face recognition systems may be eliminated by integrating 3D sensors into these systems. Depth sensing devices can capture the spatial structure of the face in addition to its texture and color. Such data, however, is usually very voluminous and requires a large amount of computing resources to process (face scans obtained with typical depth cameras contain more than 150 000 points per face). That is why defining efficient data structures for processing spatial images is crucial for the further development of 3D face recognition methods. The concept described in this work fulfills the aforementioned demands. A modification of the quad-tree structure was chosen because it can be easily transformed into lower-dimensional data structures and maintains spatial relations between data points. We are able to interpret data stored in the tree as a pyramid of features, which allows us to analyze face images using a coarse-to-fine strategy often exploited in biometric recognition systems.

  5. Face antispoofing based on frame difference and multilevel representation

    NASA Astrophysics Data System (ADS)

    Benlamoudi, Azeddine; Aiadi, Kamal Eddine; Ouafi, Abdelkrim; Samai, Djamel; Oussalah, Mourad

    2017-07-01

    Due to advances in technology, today's biometric systems have become vulnerable to spoof attacks made with fake faces. These attacks occur when an intruder attempts to fool an established face-based recognition system by presenting a fake face (e.g., a print photo or replay attack) in front of the camera instead of the intruder's genuine face. As a result, face antispoofing has become a hot topic in the face analysis literature, and several applications with an antispoofing task have emerged recently. We propose a solution for distinguishing between real faces and fake ones. Our approach is based on extracting features from the difference between successive frames instead of from individual frames. We also use a multilevel representation that divides the frame difference into multiple multi-blocks. Different texture descriptors (local binary patterns, local phase quantization, and binarized statistical image features) are then applied to each block. After the feature extraction step, a Fisher score is applied to sort the features in ascending order according to the associated weights. Finally, a support vector machine is used to differentiate between real and fake faces. We tested our approach on three publicly available databases: the CASIA Face Antispoofing database, the Replay-Attack database, and the MSU Mobile Face Spoofing database. The proposed approach outperforms other state-of-the-art methods on different media and quality metrics.
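
    A condensed sketch of the feature pipeline described above, using only the LBP descriptor for brevity: successive frames are differenced, the difference image is split into blocks, a uniform-LBP histogram is computed per block, and a linear SVM separates real from fake faces. The grid size, bin counts, and other parameters are illustrative assumptions.

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def frame_diff_features(frames, grid=(3, 3)):
    """LBP histograms over a multi-block split of successive frame differences."""
    feats = []
    for prev, cur in zip(frames[:-1], frames[1:]):
        diff = np.abs(cur.astype(np.int16) - prev.astype(np.int16)).astype(np.uint8)
        h, w = diff.shape
        bh, bw = h // grid[0], w // grid[1]
        for i in range(grid[0]):
            for j in range(grid[1]):
                block = diff[i*bh:(i+1)*bh, j*bw:(j+1)*bw]
                lbp = local_binary_pattern(block, P=8, R=1, method='uniform')
                hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
                feats.append(hist)
    return np.concatenate(feats)

# classifier stage: a linear SVM trained on real vs. fake feature vectors
clf = SVC(kernel='linear')
```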

  6. Appearance-based multimodal human tracking and identification for healthcare in the digital home.

    PubMed

    Yang, Mau-Tsuen; Huang, Shen-Yen

    2014-08-05

    There is an urgent need for intelligent home surveillance systems to provide home security, monitor health conditions, and detect emergencies of family members. One of the fundamental problems to realize the power of these intelligent services is how to detect, track, and identify people at home. Compared to RFID tags that need to be worn all the time, vision-based sensors provide a natural and nonintrusive solution. Observing that body appearance and body build, as well as face, provide valuable cues for human identification, we model and record multi-view faces, full-body colors and shapes of family members in an appearance database by using two Kinects located at a home's entrance. Then the Kinects and another set of color cameras installed in other parts of the house are used to detect, track, and identify people by matching the captured color images with the registered templates in the appearance database. People are detected and tracked by multisensor fusion (Kinects and color cameras) using a Kalman filter that can handle duplicate or partial measurements. People are identified by multimodal fusion (face, body appearance, and silhouette) using a track-based majority voting. Moreover, the appearance-based human detection, tracking, and identification modules can cooperate seamlessly and benefit from each other. Experimental results show the effectiveness of the human tracking across multiple sensors and human identification considering the information of multi-view faces, full-body clothes, and silhouettes. The proposed home surveillance system can be applied to domestic applications in digital home security and intelligent healthcare.
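
    The sensor-fusion step mentioned above can be illustrated with a minimal constant-velocity Kalman filter over 2D position measurements; the matrices, noise levels, and frame rate are illustrative choices rather than values from the paper, and skipping the update when a sensor drops out is how partial measurements are handled in this sketch.

```python
import numpy as np

# State is [x, y, vx, vy]; we observe position only.
dt = 1.0 / 30.0                                   # assumed frame interval
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)        # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)         # measurement model
Q = np.eye(4) * 1e-2                              # process noise
R = np.eye(2) * 1e-1                              # measurement noise

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Apply one position measurement z = [x, y] from any of the sensors."""
    y = z - H @ x                                  # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P
```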

  7. Appearance-Based Multimodal Human Tracking and Identification for Healthcare in the Digital Home

    PubMed Central

    Yang, Mau-Tsuen; Huang, Shen-Yen

    2014-01-01

    There is an urgent need for intelligent home surveillance systems to provide home security, monitor health conditions, and detect emergencies of family members. One of the fundamental problems to realize the power of these intelligent services is how to detect, track, and identify people at home. Compared to RFID tags that need to be worn all the time, vision-based sensors provide a natural and nonintrusive solution. Observing that body appearance and body build, as well as face, provide valuable cues for human identification, we model and record multi-view faces, full-body colors and shapes of family members in an appearance database by using two Kinects located at a home's entrance. Then the Kinects and another set of color cameras installed in other parts of the house are used to detect, track, and identify people by matching the captured color images with the registered templates in the appearance database. People are detected and tracked by multisensor fusion (Kinects and color cameras) using a Kalman filter that can handle duplicate or partial measurements. People are identified by multimodal fusion (face, body appearance, and silhouette) using a track-based majority voting. Moreover, the appearance-based human detection, tracking, and identification modules can cooperate seamlessly and benefit from each other. Experimental results show the effectiveness of the human tracking across multiple sensors and human identification considering the information of multi-view faces, full-body clothes, and silhouettes. The proposed home surveillance system can be applied to domestic applications in digital home security and intelligent healthcare. PMID:25098207

  8. Using a Smartphone Camera for Nanosatellite Attitude Determination

    NASA Astrophysics Data System (ADS)

    Shimmin, R.

    2014-09-01

    The PhoneSat project at NASA Ames Research Center has repeatedly flown a commercial cellphone in space. As this project continues, additional utility is being extracted from the cell phone hardware to enable more complex missions. The camera in particular shows great potential as an instrument for position and attitude determination, but this requires complex image processing. This paper outlines progress towards that image processing capability. Initial tests on a small collection of sample images have demonstrated the determination of a Moon vector from an image by automatic thresholding and centroiding, allowing the calibration of existing attitude control systems. Work has been undertaken on a further set of sample images towards horizon detection using a variety of techniques including thresholding, edge detection, applying a Hough transform, and circle fitting. Ultimately it is hoped this will allow calculation of an Earth vector for attitude determination and an approximate altitude. A quick discussion of work towards using the camera as a star tracker is then presented, followed by an introduction to further applications of the camera on space missions.
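
    The threshold-and-centroid step described above can be sketched as follows, assuming a simple pinhole camera model with known intrinsics (fx, fy, cx, cy); the names and the threshold value are illustrative.

```python
import numpy as np

def moon_vector(image_gray, fx, fy, cx, cy, threshold=200):
    """Estimate a unit direction vector to the Moon in the camera frame by
    thresholding the image and centroiding the bright blob."""
    ys, xs = np.nonzero(image_gray >= threshold)
    if len(xs) == 0:
        return None                        # Moon not in the frame
    u, v = xs.mean(), ys.mean()            # centroid of the bright pixels
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return ray / np.linalg.norm(ray)       # unit vector via the pinhole model
```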

  9. I spy with my little eye: Typical, daily exposure to faces documented from a first‐person infant perspective

    PubMed Central

    Mohamed‐Ali, Marwan I.; Moulson, Margaret C.

    2013-01-01

    ABSTRACT Exposure to faces is known to shape and change the face processing system; however, no study has yet documented infants' natural daily first‐hand exposure to faces. One‐ and three‐month‐old infants' visual experience was recorded through head‐mounted cameras. The video recordings were coded for faces to determine: (1) How often are infants exposed to faces? (2) To what type of faces are they exposed? and (3) Do frequently encountered face types reflect infants' typical pattern of perceptual narrowing? As hypothesized, infants spent a large proportion of their time (25%) exposed to faces; these faces were primarily female (70%), own‐race (96%), and adult‐age (81%). Infants were exposed to more individual exemplars of female, own‐race, and adult‐age faces than to male, other‐race, and child‐ or older‐adult‐age faces. Each exposure to own‐race faces was longer than to other‐race faces. There were no differences in exposure duration related to the gender or age of the face. Previous research has found that the face types frequently experienced by our participants are preferred over and more successfully recognized than other face types. The patterns of face exposure revealed in the current study coincide with the known trajectory of perceptual narrowing seen later in infancy. © 2013 The Authors. Developmental Psychobiology Published by Wiley Periodicals, Inc. Dev Psychobiol 56: 249–261, 2014. PMID:24285109

  10. Keyboard before Head Tracking Depresses User Success in Remote Camera Control

    NASA Astrophysics Data System (ADS)

    Zhu, Dingyun; Gedeon, Tom; Taylor, Ken

    In remote mining, operators of complex machinery have more tasks or devices to control than they have hands. For example, operating a rock breaker requires two-handed joystick control to position and fire the jackhammer, leaving the camera either under automatic control or requiring the operator to switch between controls. We modelled such a teleoperated setting by performing experiments using a simple physical game analogue: a half-size table soccer game with two handles. The complex camera angles of the mining application were modelled by obscuring the direct view of the play area and using a Pan-Tilt-Zoom (PTZ) camera. The camera was controlled either via a keyboard or via head tracking, using two different sets of head gestures called “head motion” and “head flicking” for turning camera motion on/off. Our results show that head motion control provided performance comparable to the keyboard, while head flicking was significantly worse. In addition, the sequence in which the three control methods were used is highly significant. It appears that using the keyboard first depresses successful use of the head tracking methods, with significantly better results when one of the head tracking methods was used first. Analysis of the qualitative survey data collected confirms that the worst-performing method was disliked by participants. Surprisingly, using that worst method as the first control method significantly enhanced performance with the other two control methods.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolme, David S; Tokola, Ryan A; Boehnen, Chris Bensing

    Automatic recognition systems are a valuable tool for identifying unknown deceased individuals. Immediately after death, fingerprint and face biometric samples are easy to collect using standard sensors and cameras and can readily be matched to ante-mortem biometric samples. Even though post-mortem fingerprints and faces have been used for decades, there are no studies that track these biometrics through the later stages of decomposition to determine how long they remain viable. This paper discusses a multimodal dataset of fingerprints, faces, and irises from 14 human cadavers that decomposed outdoors under natural conditions. Results include predictive models relating time and temperature, measured as Accumulated Degree Days (ADD), and season (winter, spring, summer) to the predicted probability of automatic verification using a commercial algorithm.
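
    Accumulated Degree Days is a simple running sum of daily mean temperatures, and the paper's predictive models map ADD and season to a verification probability. The sketch below assumes a base temperature of 0 °C and uses made-up logistic coefficients purely to illustrate the shape of such a model; it is not the paper's fitted model.

```python
import math

def accumulated_degree_days(daily_mean_temps_c, base_c=0.0):
    """Accumulated Degree Days: sum of daily mean temperatures above a base
    temperature (0 degC is a common convention in decomposition studies)."""
    return sum(max(t - base_c, 0.0) for t in daily_mean_temps_c)

def verification_probability(add, intercept=4.0, slope=-0.02):
    """Illustrative logistic decay of verification probability with ADD.
    Coefficients are hypothetical; the paper fits its own models per modality
    and per season."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * add)))

temps = [22.0, 25.0, 19.0, 30.0, 28.0, 24.0, 21.0]   # one week of daily means (degC)
add = accumulated_degree_days(temps)
print(add, verification_probability(add))
```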

  12. CUTS FOR MTR EXCAVATION ILLUSTRATE SEDIMENTARY MANTLE OF SOIL AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    CUTS FOR MTR EXCAVATION ILLUSTRATE SEDIMENTARY MANTLE OF SOIL AND GRAVEL OVERLAYING LAVA ROCK FIFTY FEET BELOW. SAGEBRUSH HAS BEEN SCOURED FROM REST OF SITE. CAMERA PROBABLY FACES SOUTHWEST. INL NEGATIVE NO. 67. Unknown Photographer, 6/4/1950 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  13. 6. COMPRESSOR CONTROL PANELS: AT LEFT, 6,000 P.S.I. PANEL, CIRCA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. COMPRESSOR CONTROL PANELS: AT LEFT, 6,000 P.S.I. PANEL, CIRCA 1957; AT RIGHT, FACING CAMERA, 10,000 P.S.I. PANEL. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Helium Compression Plant, Test Area 1-115, intersection of Altair & Saturn Boulevards, Boron, Kern County, CA

  14. PROCESS WATER BUILDING, TRA605, INTERIOR. FIRST FLOOR. ELECTRICAL EQUIPMENT IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605, INTERIOR. FIRST FLOOR. ELECTRICAL EQUIPMENT IN LEFT HALF OF VIEW. CAMERA IS IN NORTHWEST CORNER FACING SOUTHEAST. INL NEGATIVE NO. HD46-27-1. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  15. ETR BUILDING, TRA642, INTERIOR. FIRST FLOOR. INSIDE UTILITY CORRIDOR ALONG ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. FIRST FLOOR. INSIDE UTILITY CORRIDOR ALONG SOUTH PERIMETER WALL (COMMON TO ELECTRICAL BUILDING, TRA-648). CAMERA FACES WEST. INL NEGATIVE NO. HD46-16-2. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  16. Morning view, contextual view showing the role of the brick ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view showing the role of the brick walls along the boundary of the cemetery; interior view taken from midway down the paved west road with the camera facing west to capture the morning light on the west wall. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  17. PBF Cooling Tower. View of stairway to fan deck. Vents ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower. View of stairway to fan deck. Vents are made of redwood. Camera facing southwest toward north side of Cooling Tower. Siding is corrugated asbestos concrete. Photographer: Kirsh. Date: June 6, 1969. INEEL negative no. 69-3463 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  18. Detail view of northwest side of Signal Corps Radar (S.C.R.) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail view of northwest side of Signal Corps Radar (S.C.R.) 296 Station 5 Transmitter Building foundation, showing portion of concrete gutter drainage system and asphalt floor tiles, camera facing north - Fort Barry, Signal Corps Radar 296, Station 5, Transmitter Building Foundation, Point Bonita, Marin Headlands, Sausalito, Marin County, CA

  19. LPT. Shield test facility (TAN646) exterior, as modified for EBOR. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Shield test facility (TAN-646) exterior, as modified for EBOR. Camera facing northeast. Heat exchange fans, helium storage tanks, and completed EBOR perimeter road. Photographer: Page Comisky. Date: ca. August 20, 1965. INEEL negative no. 65-4328 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  20. 70. VIEW OF UNIT 2 THROUGH ACCESS DOOR, LOOKING DOWN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    70. VIEW OF UNIT 2 THROUGH ACCESS DOOR, LOOKING DOWN AT MAIN SHAFT. NOTE WELDER'S SIGNATURE IN SHADOWS IN UPPER LEFT CORNER AND PHOTOGRAPHER'S STROBE POWER CABLE IN LOWER RIGHT CORNER. ORIENTATION OF CAMERA IS FACING LEFT BANK, PERPENDICULAR TO RIVER FLOW - Swan Falls Dam, Snake River, Kuna, Ada County, ID

  1. PBF. Oblique and contextual view of PBF Cooling Tower, PER720. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF. Oblique and contextual view of PBF Cooling Tower, PER-720. Camera facing northeast. Auxiliary Building (PER-624) abuts Cooling Tower. Demolition equipment has arrived. Date: August 2003. INEEL negative no. HD-35-11-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  2. PBF (PER620) interior. Detail view of door in north wall ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) interior. Detail view of door in north wall of reactor bay. Camera facing north. Note tonnage weighting of hatch covers in floor. Date: May 2004. INEEL negative no. HD-41-8-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  3. PBF (PER620) interior. Detail view of actuator platform and control ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) interior. Detail view of actuator platform and control rod mechanism. Camera facing easterly from floor level. Reactor pool at lower left of view. Date: March 2004. INEEL negative no. HD-41-3-3 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  4. Lights, Cameras, Mirrors, Action!

    ERIC Educational Resources Information Center

    Purcell, John

    2012-01-01

    In this article, the author describes how his first-grade students made their own compositions based on James Rosenquist's collage series in which long shards of faces were painted over a background that appeared to be abstract. The background was made up of enlarged details of things such as flowers, leaves, fire, and water. The students'…

  5. Location of Spirit's Home

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This image shows where Earth would set on the martian horizon from the perspective of the Mars Exploration Rover Spirit if it were facing northwest atop its lander at Gusev Crater. Earth cannot be seen in this image, but engineers have mapped its location. This image mosaic was taken by the hazard-identification camera onboard Spirit.

  6. LOFT. Reactor arrives at containment building (TAN650), now being pushed ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Reactor arrives at containment building (TAN-650), now being pushed by locomotive. Camera facing northerly. Note "Hello Dolly" and "PWR MTA No. 1" (pressurized water reactor mobile test assembly) signs. Date: 1973. INEEL negative no. 73-3710 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  7. Third International Workshop on Grid Simulator Testing of Wind Turbine

    Science.gov Websites

    Third International Workshop on Grid Simulator Testing of Wind Turbine Drivetrains, Grid Modernization, NREL. For technical questions about the workshop, contact Vahan Gevorgian. Photo caption: attendees and speakers for the Third International Workshop standing in a large group, facing the camera.

  8. MTR WING A, TRA604, INTERIOR. MAIN FLOOR. DETAIL VIEW INSIDE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR WING A, TRA-604, INTERIOR. MAIN FLOOR. DETAIL VIEW INSIDE LABORATORY 114. CAMERA FACING NORTH. DISPOSAL OF RADIOACTIVE MATERIALS IS UNDERWAY. INL NEGATIVE NO. HD46-12-4. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. ETR BUILDING, TRA642, INTERIOR. CONSOLE FLOOR, SOUTH HALF. SOUTH SIDE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. CONSOLE FLOOR, SOUTH HALF. SOUTH SIDE OF ETR REACTOR, CAMERA FACING NORTH. CABINET CONTAINING "NUCLEAR INSTRUMENT SYSTEMS" IS RESTRICTED. INL NEGATIVE NO. HD46-18-4. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  10. LOFT. Reactor apparatus leaves A&M building (TAN607). Shielded locomotive has ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Reactor apparatus leaves A&M building (TAN-607). Shielded locomotive, bearing the Aerojet logo that replaced the old General Electric logo, pulls reactor from assembly shop on dolly. Camera facing easterly. Date: 1973. INEEL negative no. 73-3700 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  11. View of Signal Corps Radar (S.C.R.) 296 Station 5 Transmitter ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Signal Corps Radar (S.C.R.) 296 Station 5 Transmitter Building foundation, showing Fire Control Stations (Buildings 621 and 622) and concrete stairway (top left) camera facing southwest - Fort Barry, Signal Corps Radar 296, Station 5, Transmitter Building Foundation, Point Bonita, Marin Headlands, Sausalito, Marin County, CA

  12. FAST CHOPPER BUILDING, TRA665. CONTEXTUAL VIEW: CHOPPER BUILDING IN CENTER. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FAST CHOPPER BUILDING, TRA-665. CONTEXTUAL VIEW: CHOPPER BUILDING IN CENTER. MTR REACTOR SERVICES BUILDING,TRA-635, TO LEFT; MTR BUILDING TO RIGHT. CAMERA FACING WEST. INL NEGATIVE NO. HD42-1. Mike Crane, Photographer, 3/2004 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  13. Contextual view of Warner's Ranch. Third of three sequential views ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Warner's Ranch. Third of three sequential views (from west to east) of the buildings in relation to the surrounding geography. Note approximate location of Overland Trail crossing left to right. Camera facing northeast - Warner Ranch, Ranch House, San Felipe Road (State Highway S2), Warner Springs, San Diego County, CA

  14. Cleopatra's Bedroom west facade with 12' scale (in tenths) with ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Cleopatra's Bedroom west facade with 12' scale (in tenths) with picture tube wall along walkway. Structure is made solely of amber colored bottles. Roof supported by telephone poles. Areas of wall collapsed in the 1994 Northridge earthquake. Camera facing east. - Grandma Prisbrey's Bottle Village, 4595 Cochran Street, Simi Valley, Ventura County, CA

  15. Cleopatra's Bedroom oblique with picture tube wall along walkway. Structure ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Cleopatra's Bedroom oblique with picture tube wall along walkway. Structure is made solely of amber colored bottles. Roof supported by telephone poles. Areas of bottle wall above window opening collapsed in the 1994 Northridge earthquake. Camera facing northeast. - Grandma Prisbrey's Bottle Village, 4595 Cochran Street, Simi Valley, Ventura County, CA

  16. View of structures at rear of parcel with 12' scale ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of structures at rear of parcel with 12' scale (in tenths). From right: edge of Round House, Pencil house, Shell House, edge of School House. Heart Shrine made from mortared car headlights at frame left. Camera facing east. - Grandma Prisbrey's Bottle Village, 4595 Cochran Street, Simi Valley, Ventura County, CA

  17. Estimating Clothing Thermal Insulation Using an Infrared Camera

    PubMed Central

    Lee, Jeong-Hoon; Kim, Young-Keun; Kim, Kyung-Soo; Kim, Soohyun

    2016-01-01

    In this paper, a novel algorithm for estimating clothing insulation is proposed to assess thermal comfort, based on non-contact, real-time measurement of face and clothing temperatures with an infrared camera. The proposed method can accurately measure the clothing insulation of various garments under different clothing fits and sitting postures. A paired t-test at the 99% confidence level shows that the method distinguishes clothing insulation effectively across different seasonal clothing conditions. Temperatures simulated with the estimated insulation values are closer to the actual temperatures than those simulated with individual garment insulation values: in indoor working scenarios, the upper-clothing temperature error is within 3% and the lower-clothing temperature error is 3.7%~6.2%. Because the estimate is derived from the face and clothing temperatures, the algorithm can also reflect the effect of the air layer, which changes the effective insulation. In the future, the proposed method is expected to be applied to evaluating customized passenger comfort. PMID:27005625
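
    The paper's estimation algorithm is not reproduced here; as a rough illustration of how face and clothing surface temperatures can yield an insulation value, the sketch below applies a textbook steady-state heat balance, treats face temperature as a skin-temperature proxy, and uses assumed indoor convective and radiative coefficients. All numerical constants are assumptions, not the paper's values.

```python
def estimate_clo(t_face_c, t_cloth_c, t_air_c, t_mrt_c=None, h_c=3.1, h_r=4.7):
    """Rough clothing-insulation estimate from IR temperatures.

    Face temperature stands in for skin temperature, and h_c, h_r are typical
    indoor convective/radiative coefficients (W/m^2K); both are assumptions.
    """
    if t_mrt_c is None:
        t_mrt_c = t_air_c                    # assume mean radiant temp = air temp
    # Heat flux leaving the clothing surface by convection and radiation (W/m^2).
    q = h_c * (t_cloth_c - t_air_c) + h_r * (t_cloth_c - t_mrt_c)
    if q <= 0:
        raise ValueError("no outward heat flux; temperatures are inconsistent")
    # Resistance of the clothing + trapped air layer, converted to clo units.
    r_cloth = (t_face_c - t_cloth_c) / q     # m^2K/W
    return r_cloth / 0.155                   # 1 clo = 0.155 m^2K/W

# Example: 34 degC face, 28 degC clothing surface, 22 degC room -> ~0.8 clo.
print(round(estimate_clo(34.0, 28.0, 22.0), 2))
```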

  18. An Orientation Sensor-Based Head Tracking System for Driver Behaviour Monitoring

    PubMed Central

    Görne, Lorenz; Yuen, Iek-Man; Cao, Dongpu; Sullman, Mark; Auger, Daniel; Lv, Chen; Wang, Huaji; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros

    2017-01-01

    Although at present legislation does not allow drivers in a Level 3 autonomous vehicle to engage in a secondary task, there may come a time when it does. Monitoring the behaviour of drivers engaging in various non-driving activities (NDAs) is crucial to decide how well the driver will be able to take over control of the vehicle. One limitation of the commonly used face-based head tracking system, using cameras, is that sufficient features of the face must be visible, which limits the detectable angle of head movement and thereby measurable NDAs, unless multiple cameras are used. This paper proposes a novel orientation sensor based head tracking system that includes twin devices, one of which measures the movement of the vehicle while the other measures the absolute movement of the head. Measurement errors in the shaking and nodding axes were less than 0.4°, while error in the rolling axis was less than 2°. Comparison with a camera-based system, through in-house tests and on-road tests, showed that the main advantage of the proposed system is the ability to detect angles larger than 20° in the shaking and nodding axes. Finally, a case study demonstrated that the measurement of the shaking and nodding angles, produced from the proposed system, can effectively characterise the drivers’ behaviour while engaged in the NDAs of chatting to a passenger and playing on a smartphone. PMID:29165331
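
    The twin-device idea reduces to removing the vehicle's orientation from the head sensor's absolute orientation. A minimal sketch using rotation composition is shown below; the Euler convention, axis naming, and example angles are assumptions for illustration, not the authors' firmware.

```python
from scipy.spatial.transform import Rotation as R

def head_relative_to_vehicle(head_euler_deg, vehicle_euler_deg):
    """Remove the vehicle's motion from the head sensor reading.

    Both inputs are (yaw, pitch, roll) in degrees from the two orientation
    sensors; the 'zyx' convention is an assumed choice. Returns the head's
    (shake, nod, roll) angles relative to the cabin.
    """
    head = R.from_euler("zyx", head_euler_deg, degrees=True)        # absolute head pose
    vehicle = R.from_euler("zyx", vehicle_euler_deg, degrees=True)  # absolute vehicle pose
    relative = vehicle.inv() * head                                 # head in vehicle frame
    return relative.as_euler("zyx", degrees=True)

# Vehicle turning 10 deg while the driver shakes the head 35 deg to the side:
print(head_relative_to_vehicle((45.0, 0.0, 0.0), (10.0, 0.0, 0.0)))  # -> ~[35, 0, 0]
```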

  19. 2D surface temperature measurement of plasma facing components with modulated active pyrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiel, S.; Loarer, T.; Pocheau, C.

    2014-10-01

    In nuclear fusion devices such as Tore Supra, the plasma-facing components (PFCs) are made of carbon. These components are exposed to very high heat fluxes, so surface temperature measurement is mandatory for the safety of the device and for efficient plasma scenario development. Besides being essential for evaluating these heat fluxes and for a better understanding of the physics of plasma-wall interaction, the measurement is also required to monitor the fatigue of PFCs. Infrared (IR) systems are used to measure surface temperature in real time. For carbon PFCs, the emissivity is high and known (ε ~ 0.8), so the reflected flux from the environment collected by the IR cameras can be neglected. However, future tokamaks such as WEST and ITER will be equipped with metal PFCs (W and Be/W, respectively) with low and variable emissivities (ε ~ 0.1–0.4). Consequently, the reflected flux will contribute significantly to the flux collected by the IR camera. The modulated active pyrometry proposed in this paper, using a bicolor camera, allows a 2D surface temperature measurement independent of the reflected fluxes and the emissivity. Experimental results with a tungsten sample are reported and compared with simultaneous measurements performed with classical pyrometry (monochromatic and bichromatic), with and without reflected flux, demonstrating the efficiency of this method for surface temperature measurement independent of the reflected flux and the emissivity.
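
    The modulated active pyrometry itself is not reproduced here; the sketch below shows the classical two-color (bichromatic) pyrometry the paper uses as a comparison, under the Wien approximation and a gray-body assumption so that emissivity cancels in the ratio of the two spectral signals. The wavelengths and test temperature are illustrative choices, and no reflected flux is modelled.

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_radiance(wavelength_m, temp_k, emissivity=1.0):
    """Spectral radiance under the Wien approximation (common arbitrary scale)."""
    return emissivity * wavelength_m ** -5 * math.exp(-C2 / (wavelength_m * temp_k))

def two_color_temperature(signal_1, signal_2, wl_1, wl_2):
    """Ratio (two-color) pyrometry: recover temperature from the ratio of two
    spectral signals, assuming a gray body so the emissivity cancels."""
    ratio = signal_1 / signal_2
    return C2 * (1.0 / wl_2 - 1.0 / wl_1) / (math.log(ratio) - 5.0 * math.log(wl_2 / wl_1))

# Forward-simulate a 1500 K low-emissivity (0.3) surface at two assumed
# near-infrared wavelengths, then invert: the emissivity drops out of the ratio.
wl_1, wl_2 = 1.6e-6, 2.0e-6
s1 = wien_radiance(wl_1, 1500.0, emissivity=0.3)
s2 = wien_radiance(wl_2, 1500.0, emissivity=0.3)
print(round(two_color_temperature(s1, s2, wl_1, wl_2), 1))  # -> 1500.0
```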

  20. DeitY-TU face database: its design, multiple camera capturing, characteristics, and evaluation

    NASA Astrophysics Data System (ADS)

    Bhowmik, Mrinal Kanti; Saha, Kankan; Saha, Priya; Bhattacharjee, Debotosh

    2014-10-01

    The development of the latest face databases is giving researchers different and realistic problems that play an important role in developing efficient algorithms for the automatic recognition of human faces. This paper presents the creation of a new visual face database, named the Department of Electronics and Information Technology-Tripura University (DeitY-TU) face database. It contains face images of 524 persons belonging to different non-tribes and Mongolian tribes of north-east India, along with their anthropometric measurements for identification. Database images are captured within a room with controlled variations in illumination, expression, and pose, along with variability in age, gender, accessories, make-up, and partial occlusion. Each image contains the combined primary challenges of face recognition, i.e., illumination, expression, and pose. The database also offers some new features: soft biometric traits such as moles, freckles, scars, etc., and facial anthropometric variations that may be helpful for biometric recognition. It also provides a comparative study of existing two-dimensional face image databases. The database has been tested using two baseline algorithms, linear discriminant analysis and principal component analysis, whose results may serve as control performance scores for other researchers.
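
    As a reminder of what the two named baseline algorithms look like in practice, here is a minimal eigenfaces-style PCA + LDA pipeline in scikit-learn; the DeitY-TU images are not publicly bundled, so random placeholder data stands in for flattened face images, and the array sizes and component count are arbitrary assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Placeholder data standing in for flattened face images:
# 200 "images" of 32x32 pixels from 10 subjects (20 images each).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32 * 32))
y = np.repeat(np.arange(10), 20)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# PCA (eigenfaces) for dimensionality reduction followed by an LDA classifier,
# mirroring the two baseline algorithms used to evaluate the database.
baseline = make_pipeline(PCA(n_components=50), LinearDiscriminantAnalysis())
baseline.fit(X_train, y_train)
print("baseline accuracy:", baseline.score(X_test, y_test))  # ~chance on random data
```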
