Sample records for construction camera facing

  1. A&M. Hot liquid waste building (TAN616) under construction. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste building (TAN-616) under construction. Camera facing northeast. Date: November 25, 1953. INEEL negative no. 9232 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  2. A multicenter prospective cohort study on camera navigation training for key user groups in minimally invasive surgery.

    PubMed

    Graafland, Maurits; Bok, Kiki; Schreuder, Henk W R; Schijven, Marlies P

    2014-06-01

    Untrained laparoscopic camera assistants in minimally invasive surgery (MIS) may cause suboptimal view of the operating field, thereby increasing risk for errors. Camera navigation is often performed by the least experienced member of the operating team, such as inexperienced surgical residents, operating room nurses, and medical students. The operating room nurses and medical students are currently not included as key user groups in structured laparoscopic training programs. A new virtual reality laparoscopic camera navigation (LCN) module was specifically developed for these key user groups. This multicenter prospective cohort study assesses face validity and construct validity of the LCN module on the Simendo virtual reality simulator. Face validity was assessed through a questionnaire on resemblance to reality and perceived usability of the instrument among experts and trainees. Construct validity was assessed by comparing scores of groups with different levels of experience on outcome parameters of speed and movement proficiency. The results obtained show uniform and positive evaluation of the LCN module among expert users and trainees, signifying face validity. Experts and intermediate experience groups performed significantly better in task time and camera stability during three repetitions, compared to the less experienced user groups (P < .007). Comparison of learning curves showed significant improvement of proficiency in time and camera stability for all groups during three repetitions (P < .007). The results of this study show face validity and construct validity of the LCN module. The module is suitable for use in training curricula for operating room nurses and novice surgical trainees, aimed at improving team performance in minimally invasive surgery. © The Author(s) 2013.

  3. LPT. Low power test (TAN640) interior. Basement level. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Low power test (TAN-640) interior. Basement level. Camera facing north. Cable trays and conduit cross tunnel between critical experiment cell and critical experiment control room. Construction 93% complete. Photographer: Jack L. Anderson. Date: October 23, 1957. INEEL negative no. 57-5339 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  4. PBF Reactor Building (PER620). Aerial view of early construction. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Aerial view of early construction. Camera facing northwest. Excavation and concrete placement in two basements are underway. Note exposed lava rock. Photographer: Farmer. Date: March 22, 1965. INEEL negative no. 65-2219 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  5. REACTOR SERVICE BUILDING, TRA635, CONTEXTUAL VIEW DURING CONSTRUCTION. CAMERA IS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    REACTOR SERVICE BUILDING, TRA-635, CONTEXTUAL VIEW DURING CONSTRUCTION. CAMERA IS ATOP MTR BUILDING AND LOOKING SOUTHERLY. FOUNDATION AND DRAINS ARE UNDER CONSTRUCTION. THE BUILDING WILL BUTT AGAINST CHARGING FACE OF PLUG STORAGE BUILDING. HOT CELL BUILDING, TRA-632, IS UNDER CONSTRUCTION AT TOP CENTER OF VIEW. INL NEGATIVE NO. 8518. Unknown Photographer, 8/25/1953 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  6. Construct and face validity of a virtual reality-based camera navigation curriculum.

    PubMed

    Shetty, Shohan; Panait, Lucian; Baranoski, Jacob; Dudrick, Stanley J; Bell, Robert L; Roberts, Kurt E; Duffy, Andrew J

    2012-10-01

    Camera handling and navigation are essential skills in laparoscopic surgery. Surgeons rely on camera operators, usually the least experienced members of the team, for visualization of the operative field. Essential skills for camera operators include maintaining orientation, an effective horizon, appropriate zoom control, and a clean lens. Virtual reality (VR) simulation may be a useful adjunct to developing camera skills in a novice population. No standardized VR-based camera navigation curriculum is currently available. We developed and implemented a novel curriculum on the LapSim VR simulator platform for our residents and students. We hypothesize that our curriculum will demonstrate construct and face validity in our trainee population, distinguishing levels of laparoscopic experience as part of a realistic training curriculum. Overall, 41 participants with various levels of laparoscopic training completed the curriculum. Participants included medical students, surgical residents (Postgraduate Years 1-5), fellows, and attendings. We stratified subjects into three groups (novice, intermediate, and advanced) based on previous laparoscopic experience. We assessed face validity with a questionnaire. The proficiency-based curriculum consists of three modules: camera navigation, coordination, and target visualization using 0° and 30° laparoscopes. Metrics include time, target misses, drift, path length, and tissue contact. We analyzed data using analysis of variance and Student's t-test. We noted significant differences in repetitions required to complete the curriculum: 41.8 for novices, 21.2 for intermediates, and 11.7 for the advanced group (P < 0.05). In the individual modules, coordination required 13.3 attempts for novices, 4.2 for intermediates, and 1.7 for the advanced group (P < 0.05). Target visualization required 19.3 attempts for novices, 13.2 for intermediates, and 8.2 for the advanced group (P < 0.05). Participants believe that training improves camera handling skills (95%), is relevant to surgery (95%), and is a valid training tool (93%). Graphics (98%) and realism (93%) were highly regarded. The VR-based camera navigation curriculum demonstrates construct and face validity for our training population. Camera navigation simulation may be a valuable tool that can be integrated into training protocols for residents and medical students during their surgery rotations. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. A system for tracking and recognizing pedestrian faces using a network of loosely coupled cameras

    NASA Astrophysics Data System (ADS)

    Gagnon, L.; Laliberté, F.; Foucher, S.; Branzan Albu, A.; Laurendeau, D.

    2006-05-01

    A face recognition module has been developed for an intelligent multi-camera video surveillance system. The module can recognize a pedestrian face in terms of six basic emotions and the neutral state. Detection of the face and facial features (eyes, nasal root, nose, and mouth) is first performed using cascades of boosted classifiers. These features are used to normalize the pose and dimensions of the face image. Gabor filters are then sampled on a regular grid covering the face image to build a facial feature vector that feeds a nearest-neighbor classifier with a cosine-distance similarity measure for facial expression interpretation and face model construction. A graphical user interface allows the user to adjust the module parameters.
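
    The pipeline summarized in this record (boosted-cascade detection, a grid-sampled Gabor feature vector, and a cosine-distance nearest-neighbour match) can be approximated with standard OpenCV building blocks. The sketch below is illustrative only, not the authors' implementation; the cascade file, normalized face size, grid spacing, and Gabor parameters are assumptions.

```python
# Illustrative sketch: cascade face detection, grid-sampled Gabor features,
# and cosine-distance nearest-neighbour matching. Parameters are assumptions.
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def gabor_bank(ksize=21, sigmas=(4.0,), thetas=np.linspace(0, np.pi, 4, endpoint=False)):
    """Build a small bank of Gabor kernels (scale x orientation)."""
    return [cv2.getGaborKernel((ksize, ksize), s, t, 10.0, 0.5)
            for s in sigmas for t in thetas]

def gabor_features(gray_face, bank, grid=8):
    """Filter the normalized face and sample responses on a regular grid."""
    face = cv2.resize(gray_face, (64, 64)).astype(np.float32)
    feats = []
    for k in bank:
        resp = cv2.filter2D(face, cv2.CV_32F, k)
        step = face.shape[0] // grid
        feats.append(resp[::step, ::step].ravel())
    v = np.concatenate(feats)
    return v / (np.linalg.norm(v) + 1e-9)

def nearest_neighbour(query, gallery):
    """Cosine-distance NN over unit-norm gallery rows -> (index, distance)."""
    dists = 1.0 - gallery @ query
    i = int(np.argmin(dists))
    return i, float(dists[i])

img = cv2.imread("frame.jpg")              # hypothetical input frame
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
bank = gabor_bank()
for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
    vec = gabor_features(gray[y:y+h, x:x+w], bank)
    # match `vec` against a gallery of expression / identity model vectors here
```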

  8. 6. CONSTRUCTION PROGRESS VIEW (EXTERIOR) OF TANK, CABLE CHASE, AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. CONSTRUCTION PROGRESS VIEW (EXTERIOR) OF TANK, CABLE CHASE, AND MOUNDED BUNKER. CONSTRUCTION WAS 99 PERCENT COMPLETE. CAMERA IS FACING WEST. INEL PHOTO NUMBER 65-5435, TAKEN OCTOBER 20, 1965. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  9. A&M. TAN607. Detail of fuel storage pool under construction. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. TAN-607. Detail of fuel storage pool under construction. Camera is on berm and facing northwest. Note depth of excavation. Formwork underway for floor and concrete walls of pool; wall between pool and vestibule. At center left of view, foundation for liquid waste treatment plant is poured. Date: August 25, 1953. INEEL negative no. 8541 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  10. ETR COMPLEX. CAMERA FACING SOUTH. FROM BOTTOM OF VIEW TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR COMPLEX. CAMERA FACING SOUTH. FROM BOTTOM OF VIEW TO TOP: MTR, MTR SERVICE BUILDING, ETR CRITICAL FACILITY, ETR CONTROL BUILDING (ATTACHED TO ETR), ETR BUILDING (HIGH-BAY), COMPRESSOR BUILDING (ATTACHED AT LEFT OF ETR), HEAT EXCHANGER BUILDING (JUST BEYOND COMPRESSOR BUILDING), COOLING TOWER PUMP HOUSE, COOLING TOWER. OTHER BUILDINGS ARE CONTRACTORS' CONSTRUCTION BUILDINGS. INL NEGATIVE NO. 56-4105. Unknown Photographer, ca. 1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. 4. CONSTRUCTION PROGRESS VIEW OF EQUIPMENT IN FRONT PART OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. CONSTRUCTION PROGRESS VIEW OF EQUIPMENT IN FRONT PART OF CONTROL BUNKER (TRANSFORMER, HYDRAULIC TANK, PUMP, MOTOR). SHOWS UNLINED CORRUGATED METAL WALL. CAMERA FACING EAST. INEL PHOTO NUMBER 65-5433, TAKEN OCTOBER 20, 1965. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  12. A&M. Hot liquid waste treatment building (TAN616), south side. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616), south side. Camera facing north. Personnel door at left side of wall. Partial view of outdoor stairway to upper level platform. Note concrete construction. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-1-3 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  13. PBF Reactor Building (PER620) under construction. Aerial view with camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620) under construction. Aerial view with camera facing northeast. Steel framework is exposed for west wing and high bay. Concrete block siding on east wing. Railroad crane set up on west side. Note trenches proceeding from front of building. Left trench is for secondary coolant and will lead to Cooling Tower. Shorter trench will contain cables leading to control area. Photographer: Larry Page. Date: March 22, 1967. INEEL negative no. 67-5025 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  14. Application of real-time single camera SLAM technology for image-guided targeting in neurosurgery

    NASA Astrophysics Data System (ADS)

    Chang, Yau-Zen; Hou, Jung-Fu; Tsao, Yi Hsiang; Lee, Shih-Tseng

    2012-10-01

    In this paper, we propose an application of augmented reality technology for targeting tumors or anatomical structures inside the skull. The application is a combination of the technologies of MonoSLAM (Single Camera Simultaneous Localization and Mapping) and computer graphics. A stereo vision system is developed to construct geometric data of the human face for registration with CT images. The reliability and accuracy of the application are enhanced by the use of fiduciary markers fixed to the skull. MonoSLAM keeps track of the current location of the camera with respect to an augmented reality (AR) marker using the extended Kalman filter. The fiduciary markers provide a reference when the AR marker is invisible to the camera. The relationship between the markers on the face and the augmented reality marker is obtained through a registration procedure using the stereo vision system and is updated on-line. A commercially available Android-based tablet PC equipped with a 320×240 front-facing camera was used for implementation. The system is able to provide a live view of the patient overlaid with the solid models of tumors or anatomical structures, as well as the missing part of the tool inside the skull.
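
    One ingredient of the record above, recovering the camera pose relative to a square AR marker under a pinhole model, can be sketched as follows. This is not the authors' MonoSLAM/EKF pipeline; the intrinsics, marker size, and corner ordering are assumptions.

```python
# Minimal sketch: camera pose from a square AR marker via PnP (assumed values,
# not the authors' MonoSLAM/EKF implementation).
import cv2
import numpy as np

K = np.array([[800.0, 0, 160], [0, 800.0, 120], [0, 0, 1]])  # assumed intrinsics (320x240)
dist = np.zeros(5)                                           # assume no lens distortion

marker_len = 0.05                                            # assumed 5 cm marker
obj_pts = np.array([[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]],
                   dtype=np.float32) * (marker_len / 2)

def camera_pose_from_marker(corners_2d):
    """corners_2d: 4x2 detected marker corners, ordered like obj_pts."""
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners_2d.astype(np.float32), K, dist)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    cam_pos = -R.T @ tvec        # camera position in the marker (world) frame
    return R, tvec, cam_pos
```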

  15. IET. Aerial view of project, 95 percent complete. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Aerial view of project, 95 percent complete. Camera facing east. Left to right: stack, duct, mobile test cell building (TAN-624), four-rail track, dolly. Retaining wall between mobile test building and shielded control building (TAN-620) just beyond. North of control building are tank building (TAN-627) and fuel-transfer pump building (TAN-625). Guard house at upper right along exclusion fence. Construction vehicles and temporary warehouse in view near guard house. Date: June 6, 1955. INEEL negative no. 55-1462 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  16. Morning view, brick post detail; view also shows dimensional wall-construction ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, brick post detail; view also shows dimensional wall-construction detail. North wall, with the camera facing northwest. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  17. 79. ARAIII. Early construction view of GCRE reactor building (ARA608) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    79. ARA-III. Early construction view of GCRE reactor building (ARA-608) showing deep excavation, reinforcing steel, and forms for concrete placement for reactor and other pits. Camera facing southeast. July 22, 1958. Ineel photo no. 58-3466. Photographer: Ken Mansfield. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  18. PBF Cooling Tower under construction. Cold water basin is five ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower under construction. Cold water basin is five feet deep. Foundation and basin walls are reinforced concrete. Camera facing west. Pipe openings through wall in front are outlets for return flow of cool water to reactor building. Photographer: John Capek. Date: September 4, 1968. INEEL negative no. 68-3473 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  19. PBF Cooling Tower. View from highbay roof of Reactor Building ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower. View from high-bay roof of Reactor Building (PER-620). Camera faces northwest. East louvered face has been installed. Inlet pipes protrude from fan deck. Two redwood vents under construction at top. Note piping, control, and power lines at sub-grade level in trench leading to Reactor Building. Photographer: Kirsh. Date: June 6, 1969. INEEL negative no. 69-3466 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  20. STS-98 U.S. Lab Destiny rests in Atlantis' payload bay

    NASA Technical Reports Server (NTRS)

    2001-01-01

    KENNEDY SPACE CENTER, Fla. -- In this view from Level 5, wing platform, of Atlantis' payload bay, the U.S. Lab Destiny can be seen near the bottom. A key element in the construction of the International Space Station, Destiny is 28 feet long and weighs 16 tons. Destiny will be attached to the Unity node of the ISS using the Shuttle's robot arm, seen here on the left with the help of an elbow camera, facing left. Measurements of the elbow camera revealed only a one-inch clearance from the U.S. Lab payload, which is under review. Destiny will fly on STS-98, the seventh construction flight to the ISS. Launch of STS-98 is scheduled for Jan. 19 at 2:11 a.m. EST.

  1. WATER PUMP HOUSE, TRA619. VIEW OF PUMP HOUSE UNDER CONSTRUCTION. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    WATER PUMP HOUSE, TRA-619. VIEW OF PUMP HOUSE UNDER CONSTRUCTION. CAMERA IS ON WATER TOWER AND FACES NORTHWEST. TWO RESERVOIR TANKS ALREADY ARE COMPLETED. NOTE EXCAVATIONS FOR PIPE LINES EXITING FROM BELOW GROUND ON SOUTH SIDE OF PUMP HOUSE. BUILDING AT LOWER RIGHT IS ELECTRICAL CONTROL BUILDING, TRA-623. SWITCHYARD IS IN LOWER RIGHT CORNER OF VIEW. INL NEGATIVE NO. 2753. Unknown Photographer, ca. 6/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  2. Face recognition system for set-top box-based intelligent TV.

    PubMed

    Lee, Won Oh; Kim, Yeong Gon; Hong, Hyung Gil; Park, Kang Ryoung

    2014-11-18

    Despite the prevalence of smart TVs, many consumers continue to use conventional TVs with supplementary set-top boxes (STBs) because of the high cost of smart TVs. However, because the processing power of an STB is quite low, the smart TV functionalities that can be implemented in an STB are very limited. Because of this, negligible research has been conducted regarding face recognition for conventional TVs with supplementary STBs, even though many such studies have been conducted with smart TVs. In terms of camera sensors, previous face recognition systems have used high-resolution cameras, cameras with high magnification zoom lenses, or camera systems with panning and tilting devices that can be used for face recognition from various positions. However, these cameras and devices cannot be used in intelligent TV environments because of limitations related to size and cost, and only small, low-cost web cameras can be used. The resulting face recognition performance is degraded because of the limited resolution and quality levels of the images. Therefore, we propose a new face recognition system for intelligent TVs in order to overcome the limitations associated with low-resource set-top boxes and low-cost web cameras. We implement the face recognition system using a software algorithm that does not require special devices or cameras. Our research has the following four novelties: first, candidate regions of a viewer's face are detected in an image captured by a camera connected to the STB via low-processing background subtraction and face color filtering; second, the detected candidate face regions are transmitted to a server that has high processing power in order to detect face regions accurately; third, in-plane rotations of the face regions are compensated based on similarities between the left and right half sub-regions of the face regions; fourth, various poses of the viewer's face region are identified using five templates obtained during the initial user registration stage and multi-level local binary pattern matching. Experimental results indicate that the recall, precision, and genuine acceptance rate were about 95.7%, 96.2%, and 90.2%, respectively.
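
    The first, set-top-box-side stage described above (low-cost candidate detection via background subtraction plus face-colour filtering) can be roughed out with OpenCV as below. The thresholds, colour bounds, and minimum area are assumptions rather than the paper's values.

```python
# Rough sketch of STB-side candidate face detection: motion mask (background
# subtraction) AND skin-colour mask, then contour bounding boxes. Assumed values.
import cv2
import numpy as np

bg_sub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

def candidate_face_regions(frame_bgr, min_area=400):
    """Return bounding boxes of moving, skin-coloured regions."""
    motion = bg_sub.apply(frame_bgr)
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    # Commonly used approximate skin range in CrCb; an assumption here.
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    mask = cv2.bitwise_and(motion, skin)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
# The candidate crops would then be sent to the server for accurate detection,
# rotation compensation, and multi-level LBP matching, as the record describes.
```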

  3. Lip boundary detection techniques using color and depth information

    NASA Astrophysics Data System (ADS)

    Kim, Gwang-Myung; Yoon, Sung H.; Kim, Jung H.; Hur, Gi Taek

    2002-01-01

    This paper presents our approach to using a stereo camera to obtain 3-D image data to be used to improve existing lip boundary detection techniques. We show that depth information as provided by our approach can be used to significantly improve boundary detection systems. Our system detects the face and mouth area in the image by using color, geometric location, and additional depth information for the face. Initially, color and depth information can be used to localize the face. Then we can determine the lip region from the intensity information and the detected eye locations. The system has successfully been used to extract approximate lip regions using RGB color information of the mouth area. Merely using color information is not robust because the quality of the results may vary depending on lighting conditions, background, and the subject's ethnicity. To overcome this problem, we used a stereo camera to obtain 3-D facial images. 3-D data constructed from the depth information along with color information can provide more accurate lip boundary detection results compared to color-only techniques.

  4. PROCESS WATER BUILDING, TRA605. FLASH EVAPORATORS ARE PLACED ON UPPER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605. FLASH EVAPORATORS ARE PLACED ON UPPER LEVEL OF EAST SIDE OF BUILDING. WALLS WILL BE FORMED AROUND THEM. WORKING RESERVOIR BEYOND. CAMERA FACING EASTERLY. EXHAUST AIR STACK IS UNDER CONSTRUCTION AT RIGHT OF VIEW. INL NEGATIVE NO. 2579. Unknown Photographer, 6/18/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  5. ETR WASTE GAS EXITED THE ETR COMPLEX FROM THE NORTH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR WASTE GAS EXITED THE ETR COMPLEX FROM THE NORTH SIDE THROUGH A TUNNEL AND THEN TO A FILTER PIT. TUNNEL EXIT IS UNDER CONSTRUCTION WHILE CONTROL BUILDING IS BEING FORMED BEYOND. CAMERA FACING WEST. INL NEGATIVE NO. 56-1238. Jack L. Anderson, Photographer, 4/17/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  6. Face detection assisted auto exposure: supporting evidence from a psychophysical study

    NASA Astrophysics Data System (ADS)

    Jin, Elaine W.; Lin, Sheng; Dharumalingam, Dhandapani

    2010-01-01

    Face detection has been implemented in many digital still cameras and camera phones with the promise of enhancing existing camera functions (e.g., auto exposure) and adding new features to cameras (e.g., blink detection). In this study we examined the use of face detection algorithms in assisting auto exposure (AE). The set of 706 images used in this study was captured using Canon Digital Single Lens Reflex cameras and subsequently processed with an image processing pipeline. A psychophysical study was performed to obtain the optimal exposure, along with the upper and lower bounds of exposure, for all 706 images. Three methods of marking faces were utilized: manual marking, face detection algorithm A (FD-A), and face detection algorithm B (FD-B). The manual marking method found 751 faces in 426 images, which served as the ground truth for face regions of interest. The remaining images either contain no faces or contain faces too small to be considered detectable. The two face detection algorithms differ in resource requirements and in performance. FD-A uses less memory and a smaller gate count than FD-B, but FD-B detects more faces and has fewer false positives. A face-detection-assisted auto exposure algorithm was developed and tested against the evaluation results from the psychophysical study. The AE test results showed noticeable improvement when faces were detected and used in auto exposure. However, the presence of false positives would negatively impact the added benefit.
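
    A toy version of face-detection-assisted AE consistent with the idea above is to meter on detected face regions and convert their mean luminance into an EV correction. The target level and clamp below are assumptions, not values from the study.

```python
# Toy illustration of face-assisted auto exposure: meter on detected faces when
# present, otherwise globally, and derive a clamped EV correction. Assumed values.
import numpy as np

def face_weighted_ev_correction(gray, face_boxes, target=118.0, max_ev=2.0):
    """gray: 8-bit luminance image; face_boxes: list of (x, y, w, h)."""
    if face_boxes:
        means = [gray[y:y+h, x:x+w].mean() for (x, y, w, h) in face_boxes]
        measured = float(np.mean(means))          # meter on face regions
    else:
        measured = float(gray.mean())             # fall back to global metering
    ev = np.log2(target / max(measured, 1.0))     # positive -> brighten
    return float(np.clip(ev, -max_ev, max_ev))
```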

  7. Simultaneous Calibration: A Joint Optimization Approach for Multiple Kinect and External Cameras.

    PubMed

    Liao, Yajie; Sun, Ying; Li, Gongfa; Kong, Jianyi; Jiang, Guozhang; Jiang, Du; Cai, Haibin; Ju, Zhaojie; Yu, Hui; Liu, Honghai

    2017-06-24

    Camera calibration is a crucial problem in many applications, such as 3D reconstruction, structure from motion, object tracking and face alignment. Numerous methods have been proposed to solve the above problem with good performance in the last few decades. However, few methods target the joint calibration of multiple sensors (more than four devices), which is a common practical issue in real-time systems. In this paper, we propose a novel method and a corresponding workflow framework to simultaneously calibrate the relative poses of a Kinect and three external cameras. By optimizing the final cost function and adding corresponding weights to the external cameras in different locations, an effective joint calibration of multiple devices is constructed. Furthermore, the method is tested on a practical platform, and experimental results show that the proposed joint calibration method can achieve satisfactory performance in a real-time project system and that its accuracy is higher than the manufacturer's calibration.

  8. Simultaneous Calibration: A Joint Optimization Approach for Multiple Kinect and External Cameras

    PubMed Central

    Liao, Yajie; Sun, Ying; Li, Gongfa; Kong, Jianyi; Jiang, Guozhang; Jiang, Du; Cai, Haibin; Ju, Zhaojie; Yu, Hui; Liu, Honghai

    2017-01-01

    Camera calibration is a crucial problem in many applications, such as 3D reconstruction, structure from motion, object tracking and face alignment. Numerous methods have been proposed to solve the above problem with good performance in the last few decades. However, few methods target the joint calibration of multiple sensors (more than four devices), which is a common practical issue in real-time systems. In this paper, we propose a novel method and a corresponding workflow framework to simultaneously calibrate the relative poses of a Kinect and three external cameras. By optimizing the final cost function and adding corresponding weights to the external cameras in different locations, an effective joint calibration of multiple devices is constructed. Furthermore, the method is tested on a practical platform, and experimental results show that the proposed joint calibration method can achieve satisfactory performance in a real-time project system and that its accuracy is higher than the manufacturer's calibration. PMID:28672823
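
    The weighted joint-calibration cost described in the two records above can be sketched conceptually as a per-camera weighted reprojection residual refined jointly. The parameterization, the weights, and the use of SciPy's least_squares are assumptions; this is not the authors' formulation.

```python
# Conceptual sketch (assumed, not the authors' formulation) of a weighted joint
# calibration cost: reprojection residuals from each external camera are scaled
# by a per-camera weight and refined jointly against the Kinect frame.
import numpy as np
import cv2
from scipy.optimize import least_squares

def joint_residuals(params, obs, obj_pts, K_list, weights):
    """params: stacked (rvec, tvec) per camera, relative to the Kinect.
    obs[i]: Nx2 observed board corners in camera i; obj_pts: Nx3 board points."""
    dist = np.zeros(5)                      # assume distortion handled elsewhere
    res = []
    for i, (K, w) in enumerate(zip(K_list, weights)):
        rvec = params[6 * i: 6 * i + 3]
        tvec = params[6 * i + 3: 6 * i + 6]
        proj, _ = cv2.projectPoints(obj_pts, rvec, tvec, K, dist)
        res.append(w * (proj.reshape(-1, 2) - obs[i]).ravel())
    return np.concatenate(res)

# Hypothetical usage: x0 stacks initial per-camera extrinsics from single-camera
# calibration; the joint refinement then distributes error according to the weights.
# sol = least_squares(joint_residuals, x0, args=(obs, obj_pts, K_list, weights))
```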

  9. Experimental study of 3-D structure and evolution of foam

    NASA Astrophysics Data System (ADS)

    Thoroddsen, S. T.; Tan, E.; Bauer, J. M.

    1998-11-01

    Liquid foam coarsens due to diffusion of gas between adjacent foam cells. This evolution process is slow, but leads to rapid topological changes taking place during localized rearrangements of Plateau borders or disappearance of small cells. We are developing a new imaging technique to construct the three-dimensional topology of real soap foam contained in a small glass container. The technique uses 3 video cameras equipped with lenses having narrow depth-of-field. These cameras are moved with respect to the container, in effect obtaining numerous slices through the foam. Preliminary experimental results showing typical rearrangement events will also be presented. These events involve for example disappearance of either triangular or rectangular cell faces.

  10. Covariance analysis for evaluating head trackers

    NASA Astrophysics Data System (ADS)

    Kang, Donghoon

    2017-10-01

    Existing methods for evaluating the performance of head trackers usually rely on publicly available face databases, which contain facial images and the ground truths of their corresponding head orientations. However, most of the existing publicly available face databases are constructed by assuming that a frontal head orientation can be determined by compelling the person under examination to look straight ahead at the camera on the first video frame. Since nobody can accurately direct one's head toward the camera, this assumption may be unrealistic. Rather than obtaining estimation errors, we present a method for computing the covariance of estimation error rotations to evaluate the reliability of head trackers. As an uncertainty measure of estimators, the Schatten 2-norm of a square root of error covariance (or the algebraic average of relative error angles) can be used. The merit of the proposed method is that it does not disturb the person under examination by asking him to direct his head toward certain directions. Experimental results using real data validate the usefulness of our method.
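
    The uncertainty measure mentioned above, the Schatten 2-norm of a square root of the error covariance, can be illustrated numerically; the error rotations below are synthetic data invented for illustration, parameterized as small axis-angle errors.

```python
# Small numeric illustration (synthetic data) of the uncertainty measure above:
# the Schatten 2-norm (Frobenius norm) of a square root of the covariance of
# the estimation error rotations.
import numpy as np

rng = np.random.default_rng(0)
err_axis_angle = rng.normal(scale=np.deg2rad(2.0), size=(500, 3))  # synthetic errors, radians

cov = np.cov(err_axis_angle, rowvar=False)     # 3x3 error covariance
sqrt_cov = np.linalg.cholesky(cov)             # one valid square root of the covariance
uncertainty = np.linalg.norm(sqrt_cov, "fro")  # Schatten 2-norm of the square root
print(f"tracker uncertainty ~ {np.rad2deg(uncertainty):.2f} deg")
```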

  11. Two-step superresolution approach for surveillance face image through radial basis function-partial least squares regression and locality-induced sparse representation

    NASA Astrophysics Data System (ADS)

    Jiang, Junjun; Hu, Ruimin; Han, Zhen; Wang, Zhongyuan; Chen, Jun

    2013-10-01

    Face superresolution (SR), or face hallucination, refers to the technique of generating a high-resolution (HR) face image from a low-resolution (LR) one with the help of a set of training examples. It aims at transcending the limitations of electronic imaging systems. Applications of face SR include video surveillance, in which the individual of interest is often far from cameras. A two-step method is proposed to infer a high-quality and HR face image from a low-quality and LR observation. First, we establish the nonlinear relationship between LR face images and HR ones, according to radial basis function and partial least squares (RBF-PLS) regression, to transform the LR face into the global face space. Then, a locality-induced sparse representation (LiSR) approach is presented to enhance the local facial details once all the global faces for each LR training face are constructed. A comparison of some state-of-the-art SR methods shows the superiority of the proposed two-step approach, RBF-PLS global face regression followed by LiSR-based local patch reconstruction. Experiments also demonstrate the effectiveness under both simulation conditions and some real conditions.

  12. Multifunctional microcontrollable interface module

    NASA Astrophysics Data System (ADS)

    Spitzer, Mark B.; Zavracky, Paul M.; Rensing, Noa M.; Crawford, J.; Hockman, Angela H.; Aquilino, P. D.; Girolamo, Henry J.

    2001-08-01

    This paper reports the development of a complete eyeglass-mounted computer interface system including display, camera and audio subsystems. The display system provides an SVGA image with a 20 degree horizontal field of view. The camera system has been optimized for face recognition and provides a 19 degree horizontal field of view. A microphone and built-in pre-amp optimized for voice recognition and a speaker on an articulated arm are included for audio. An important feature of the system is a high degree of adjustability and reconfigurability. The system has been developed for testing by the Military Police, in a complete system comprising the eyeglass-mounted interface, a wearable computer, and an RF link. Details of the design, construction, and performance of the eyeglass-based system are discussed.

  13. Interior view showing south entrance; camera facing south. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view showing south entrance; camera facing south. - Mare Island Naval Shipyard, Machine Shop, California Avenue, southwest corner of California Avenue & Thirteenth Street, Vallejo, Solano County, CA

  14. PBF Reactor Building (PER620). Camera faces southeast. Concrete placement will ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera faces southeast. Concrete placement will leave opening for neutron camera to be installed later. Note vertical piping within rebar. Photographer: John Capek. Date: July 6, 1967. INEEL negative no. 67-3514 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  15. VIEW OF EAST ELEVATION; CAMERA FACING WEST Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF EAST ELEVATION; CAMERA FACING WEST - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  16. VIEW OF SOUTH ELEVATION; CAMERA FACING NORTH Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF SOUTH ELEVATION; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  17. VIEW OF WEST ELEVATION: CAMERA FACING NORTHEAST Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF WEST ELEVATION: CAMERA FACING NORTHEAST - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  18. VIEW OF NORTH ELEVATION; CAMERA FACING SOUTH Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF NORTH ELEVATION; CAMERA FACING SOUTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  19. View of south elevation; camera facing northeast. Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of south elevation; camera facing northeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  20. View of north elevation; camera facing southeast. Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of north elevation; camera facing southeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  1. Detail of main entrance; camera facing southwest. Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main entrance; camera facing southwest. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  2. Contextual view of building 733; camera facing southeast. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 733; camera facing southeast. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  3. Interior detail of tower space; camera facing southwest. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of tower space; camera facing southwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  4. Oblique view of southeast corner; camera facing northwest. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Oblique view of southeast corner; camera facing northwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  5. Researches on hazard avoidance cameras calibration of Lunar Rover

    NASA Astrophysics Data System (ADS)

    Li, Chunyan; Wang, Li; Lu, Xin; Chen, Jihua; Fan, Shenghong

    2017-11-01

    China's Lunar Lander and Rover will be launched in 2013 to carry out the mission objectives of lunar soft landing and patrol exploration. The Lunar Rover has a forward-facing stereo camera pair (Hazcams) for hazard avoidance, and Hazcam calibration is essential for stereo vision. The Hazcam optics are f-theta fish-eye lenses with a 120°×120° horizontal/vertical field of view (FOV) and a 170° diagonal FOV. They introduce significant distortion, and the acquired images are so warped that conventional camera calibration algorithms no longer work well. A photogrammetric calibration method for the geometric model of this type of fish-eye optics is investigated in this paper. In the method, the Hazcam model is represented by collinearity equations with interior orientation and exterior orientation parameters [1] [2]. For high-precision applications, the calibration model is formulated with radially symmetric distortion and decentering distortion, as well as parameters modeling affinity and shear, based on the fisheye deformation model [3] [4]. The proposed method has been applied to the stereo camera calibration system for the Lunar Rover.
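
    A hedged stand-in for calibrating f-theta fish-eye optics like the Hazcams is OpenCV's equidistant fisheye model with a checkerboard target; this substitutes for, and is not, the collinearity-equation model with radial and decentering distortion described in the record. The board geometry, file paths, and flags below are assumptions.

```python
# Sketch of fisheye (equidistant / f-theta-like) calibration with OpenCV's
# fisheye module and a checkerboard. Board size, square size, paths, and flags
# are assumptions; this is not the authors' photogrammetric model.
import cv2
import numpy as np
import glob

board = (9, 6)                     # inner corners, assumed
square = 0.03                      # 3 cm squares, assumed
objp = np.zeros((1, board[0] * board[1], 3), np.float32)
objp[0, :, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib/*.png"):                # hypothetical image set
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    ok, corners = cv2.findChessboardCorners(gray, board)
    if ok:
        obj_pts.append(objp)
        img_pts.append(corners.reshape(1, -1, 2).astype(np.float32))
        size = gray.shape[::-1]

K = np.zeros((3, 3))
D = np.zeros((4, 1))
flags = cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC + cv2.fisheye.CALIB_FIX_SKEW
rms, K, D, rvecs, tvecs = cv2.fisheye.calibrate(
    obj_pts, img_pts, size, K, D, flags=flags,
    criteria=(cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 1e-6))
print("RMS reprojection error:", rms)
```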

  6. Detail of stairway at north elevation; camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of stairway at north elevation; camera facing southwest. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  7. Interior view of second floor sleeping area; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of second floor sleeping area; camera facing south. - Mare Island Naval Shipyard, Marine Barracks, Cedar Avenue, west side between Twelfth & Fourteenth Streets, Vallejo, Solano County, CA

  8. Interior detail of lobby ceiling design; camera facing east. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of lobby ceiling design; camera facing east. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  9. Interior detail of stairway in tower; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of stairway in tower; camera facing south. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  10. View of camera station located northeast of Building 70022, facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of camera station located northeast of Building 70022, facing northwest - Naval Ordnance Test Station Inyokern, Randsburg Wash Facility Target Test Towers, Tower Road, China Lake, Kern County, CA

  11. Interior view of second floor lobby; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of second floor lobby; camera facing south. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  12. Interior detail of first floor lobby; camera facing northeast. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of first floor lobby; camera facing northeast. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  13. Interior view of second floor space; camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of second floor space; camera facing southwest. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  14. Detail of columns, cornice and eaves; camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of columns, cornice and eaves; camera facing southwest. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  15. Detail of cupola on south wing; camera facing southeast. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of cupola on south wing; camera facing southeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  16. Interior view of north wing, south wall offices; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of north wing, south wall offices; camera facing south. - Mare Island Naval Shipyard, Smithery, California Avenue, west side at California Avenue & Eighth Street, Vallejo, Solano County, CA

  17. Interior detail of main entry with railroad tracks; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of main entry with railroad tracks; camera facing east. - Mare Island Naval Shipyard, Mechanics Shop, Waterfront Avenue, west side between A Street & Third Street, Vallejo, Solano County, CA

  18. DETAIL OF LAMP ABOVE SOUTH SIDE ENTRANCE; CAMERA FACING EAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF LAMP ABOVE SOUTH SIDE ENTRANCE; CAMERA FACING EAST - Mare Island Naval Shipyard, Bachelor Enlisted Quarters & Offices, Walnut Avenue, east side between D Street & C Street, Vallejo, Solano County, CA

  19. Contextual view of building 926 west elevation; camera facing east. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 926 west elevation; camera facing east. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  20. Detail of main doors on east elevation; camera facing west. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main doors on east elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  1. Detail of main hall porch on east elevation; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main hall porch on east elevation; camera facing west. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  2. Detail of central portion of southeast elevation; camera facing west. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of central portion of southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  3. Detail of windows at center of west elevation; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of windows at center of west elevation; camera facing east. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  4. Interior view of hallway on second floor; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of hallway on second floor; camera facing south. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  5. Detail of balcony and windows on west elevation; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of balcony and windows on west elevation; camera facing northeast. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  6. Contextual view of building 733 along Cedar Avenue; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 733 along Cedar Avenue; camera facing southwest. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  7. View of main terrace with mature tree, camera facing southeast ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of main terrace with mature tree, camera facing southeast - Naval Training Station, Senior Officers' Quarters District, Naval Station Treasure Island, Yerba Buena Island, San Francisco, San Francisco County, CA

  8. Detail of main entry on east elevation; camera facing west. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main entry on east elevation; camera facing west. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  9. Detail of south wing south elevation wall section; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of south wing south elevation wall section; camera facing northwest - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  10. Detail of large industrial doors on north elevation; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of large industrial doors on north elevation; camera facing south. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  11. View of steel warehouses, building 710 north sidewalk; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses, building 710 north sidewalk; camera facing east. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  12. Interior detail of main stairway from first floor; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of main stairway from first floor; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  13. Interior detail of arched doorway at second floor; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of arched doorway at second floor; camera facing north. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  14. INTERIOR DETAIL OF STAIRWAY AT SOUTH WING ENTRANCE; CAMERA FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR DETAIL OF STAIRWAY AT SOUTH WING ENTRANCE; CAMERA FACING SOUTH - Mare Island Naval Shipyard, Bachelor Enlisted Quarters & Offices, Walnut Avenue, east side between D Street & C Street, Vallejo, Solano County, CA

  15. LPT. Low power assembly and test building (TAN640). Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Low power assembly and test building (TAN-640). Camera facing west. Rollup doors to each test cell face east. Concrete walls poured in place. Apparatus at right of view was part of a post-ANP program. INEEL negative no. HD-40-1-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  16. SPERTI, Instrument Cell Building (PER606). West facade. Camera facing northeast. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SPERT-I, Instrument Cell Building (PER-606). West facade. Camera facing northeast. Date: August 2003. INEEL negative no. HD-35-3-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  17. SPERTI, Instrument Cell Building (PER606). East facade. Camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SPERT-I, Instrument Cell Building (PER-606). East facade. Camera facing southwest. Date: August 2003. INEEL negative no. HD-35-3-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  18. 4. First floor interior. Camera facing west. Mirrored interior supports ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. First floor interior. Camera facing west. Mirrored interior supports mark the former division between 11 and 13 North Broad Street. - 11-15 North Broad Street (Commercial Building), 11-15 North Broad Street, Trenton, Mercer County, NJ

  19. 2. View from same camera position facing 232 degrees southwest ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. View from same camera position facing 232 degrees southwest showing abandoned section of old grade - Oak Creek Administrative Center, One half mile east of Zion-Mount Carmel Highway at Oak Creek, Springdale, Washington County, UT

  20. Automated face detection for occurrence and occupancy estimation in chimpanzees.

    PubMed

    Crunchant, Anne-Sophie; Egerer, Monika; Loos, Alexander; Burghardt, Tilo; Zuberbühler, Klaus; Corogenes, Katherine; Leinert, Vera; Kulik, Lars; Kühl, Hjalmar S

    2017-03-01

    Surveying endangered species is necessary to evaluate conservation effectiveness. Camera trapping and biometric computer vision are recent technological advances that have influenced the methods applicable to field surveys, and these methods have gained significant momentum over the last decade. Yet most researchers inspect footage manually, and few studies have used automated semantic processing of video trap data from the field. The particular aim of this study is to evaluate methods that incorporate automated face detection technology as an aid to estimating site use by two chimpanzee communities based on camera trapping. As a comparative baseline we employ traditional manual inspection of footage. Our analysis focuses specifically on the basic parameter of occurrence, where we assess the performance and practical value of chimpanzee face detection software. We found that the semi-automated data processing required only 2-4% of the time compared to the purely manual analysis. This is a non-negligible increase in efficiency that is critical when assessing the feasibility of camera trap occupancy surveys. Our evaluations suggest that our methodology estimates the proportion of sites used relatively reliably. Chimpanzees are mostly detected when they are present and when videos are filmed in high resolution: the highest recall rate was 77%, for a false alarm rate of 2.8%, for videos containing only chimpanzee frontal face views. Certainly, our study is only a first step toward transferring face detection software from the lab into field application. Our results are promising and indicate that the current limitation of detecting chimpanzees in camera trap footage due to a lack of suitable face views can easily be overcome at the level of field data collection, that is, by the combined placement of multiple high-resolution cameras facing in opposite directions. This will make it possible to routinely conduct chimpanzee occupancy surveys based on camera trapping and semi-automated processing of footage. Using semi-automated ape face detection technology for processing camera trap footage requires only 2-4% of the time compared to manual analysis and allows site use by chimpanzees to be estimated relatively reliably. © 2017 Wiley Periodicals, Inc.

  1. Multiview face detection based on position estimation over multicamera surveillance system

    NASA Astrophysics Data System (ADS)

    Huang, Ching-chun; Chou, Jay; Shiu, Jia-Hou; Wang, Sheng-Jyh

    2012-02-01

    In this paper, we propose a multi-view face detection system that locates head positions and indicates the direction of each face in 3-D space over a multi-camera surveillance system. To locate 3-D head positions, conventional methods rely on face detection in 2-D images and project the face regions back to 3-D space for correspondence. However, the inevitable false detections and rejections usually degrade system performance. Instead, our system searches for heads and face directions over the 3-D space using a sliding cube. Each searched 3-D cube is projected onto the 2-D camera views to determine the existence and direction of human faces. Moreover, a pre-processing step that estimates the locations of candidate targets is introduced to speed up the search over the 3-D space. In summary, our proposed method can efficiently fuse multi-camera information and suppress the ambiguity caused by detection errors. Our evaluation shows that the proposed approach can efficiently indicate the head position and face direction in real video sequences even under serious occlusion.
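
    The core geometric step described above, projecting a candidate 3-D head cube into each calibrated view to obtain per-camera regions for a face classifier, can be sketched as follows; the projection matrices, cube size, and axis-aligned cube shape are assumptions.

```python
# Simplified sketch: project the corners of a candidate 3-D head cube into each
# calibrated camera view to get the 2-D regions a per-view face classifier
# would then inspect. Projection matrices and cube size are assumptions.
import numpy as np

def cube_corners(center, size=0.25):
    """8 corners of an axis-aligned cube (e.g. a 25 cm head region) around center."""
    c = np.asarray(center, float)
    offs = np.array([[dx, dy, dz] for dx in (-1, 1) for dy in (-1, 1) for dz in (-1, 1)])
    return c + offs * (size / 2.0)

def project_to_view(P, pts3d):
    """P: 3x4 camera projection matrix; returns Nx2 pixel coordinates."""
    homo = np.hstack([pts3d, np.ones((len(pts3d), 1))])
    uvw = (P @ homo.T).T
    return uvw[:, :2] / uvw[:, 2:3]

def candidate_rois(center, projections):
    """Bounding box of the projected cube in every camera view."""
    corners = cube_corners(center)
    rois = []
    for P in projections:
        uv = project_to_view(P, corners)
        x0, y0 = uv.min(axis=0)
        x1, y1 = uv.max(axis=0)
        rois.append((x0, y0, x1 - x0, y1 - y0))
    return rois
```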

  2. Remote repair appliance

    DOEpatents

    Heumann, Frederick K.; Wilkinson, Jay C.; Wooding, David R.

    1997-01-01

    A remote appliance for supporting a tool for performing work at a worksite on a substantially circular bore of a workpiece and for providing video signals of the worksite to a remote monitor comprising: a baseplate having an inner face and an outer face; a plurality of rollers, wherein each roller is rotatably and adjustably attached to the inner face of the baseplate and positioned to roll against the bore of the workpiece when the baseplate is positioned against the mouth of the bore such that the appliance may be rotated about the bore in a plane substantially parallel to the baseplate; a tool holding means for supporting the tool, the tool holding means being adjustably attached to the outer face of the baseplate such that the working end of the tool is positioned on the inner face side of the baseplate; a camera for providing video signals of the worksite to the remote monitor; and a camera holding means for supporting the camera on the inner face side of the baseplate, the camera holding means being adjustably attached to the outer face of the baseplate. In a preferred embodiment, roller guards are provided to protect the rollers from debris and a bore guard is provided to protect the bore from wear by the rollers and damage from debris.

  3. Semi-automatic 2D-to-3D conversion of human-centered videos enhanced by age and gender estimation

    NASA Astrophysics Data System (ADS)

    Fard, Mani B.; Bayazit, Ulug

    2014-01-01

    In this work, we propose a feasible 3D video generation method that enables high quality visual perception using a monocular uncalibrated camera. Anthropometric distances between standard facial landmarks are approximated based on the person's age and gender. These measurements are used in a two-stage approach to facilitate the construction of binocular stereo images. Specifically, one view of the background is registered in the initial stage of video shooting, followed by an automatically guided displacement of the camera toward its secondary position. At the secondary position, real-time capturing starts and the foreground (viewed person) region is extracted for each frame. After an accurate parallax estimation, the extracted foreground is placed in front of the background image captured at the initial position. The constructed full view of the initial position, combined with the view of the secondary (current) position, forms the complete binocular pair during real-time video shooting. The subjective evaluation results indicate competent depth perception quality for the proposed system.
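
    As a rough illustration of the parallax step, the sketch below infers a coarse subject depth from the apparent face width and converts it into a horizontal disparity for compositing the second view; the focal length, baseline, and face width are assumed placeholder values, not the paper's calibration.

        # Coarse depth-from-face-width and disparity sketch (assumed parameters).
        import numpy as np

        def depth_from_face_width(face_px, focal_px, face_width_m=0.15):
            # coarse subject depth from the apparent face width (assumed ~0.15 m wide)
            return focal_px * face_width_m / face_px

        def disparity_px(depth_m, focal_px, baseline_m=0.065):
            # horizontal shift of the foreground layer for an eye-like baseline
            return focal_px * baseline_m / depth_m

        def compose_second_view(background, foreground, mask, shift_px):
            # paste the segmented foreground onto the background, shifted horizontally
            shift = int(round(shift_px))
            out = background.copy()
            shifted_mask = np.roll(mask, shift, axis=1)
            out[shifted_mask] = np.roll(foreground, shift, axis=1)[shifted_mask]
            return out

        focal_px = 900.0                      # assumed focal length in pixels
        Z = depth_from_face_width(face_px=110.0, focal_px=focal_px)
        print(f"estimated depth {Z:.2f} m, foreground disparity {disparity_px(Z, focal_px):.1f} px")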

  4. Validation of a virtual reality-based simulator for shoulder arthroscopy.

    PubMed

    Rahm, Stefan; Germann, Marco; Hingsammer, Andreas; Wieser, Karl; Gerber, Christian

    2016-05-01

    The purpose of this study was to determine the face and construct validity of a new virtual reality-based shoulder arthroscopy simulator that uses passive haptic feedback. Fifty-one participants, including 25 novices (<20 shoulder arthroscopies) and 26 experts (>100 shoulder arthroscopies), completed two tests. For the assessment of face validity, a questionnaire was filled out concerning the quality of the simulated reality and the training potential, using a 7-point Likert scale (range 1-7). Construct validity was tested by comparing simulator metrics (operation time in seconds, camera and grasper pathway in centimetres, and grasper openings) between novices' and experts' test results. Overall simulated reality was rated high, with a median value of 5.5 (range 2.8-7) points. Training capacity scored a median value of 5.8 (range 3-7) points. Experts were significantly faster in the diagnostic test, with a median of 91 (range 37-208) s versus 177 (range 81-383) s for novices (p < 0.0001), and in the therapeutic test, with 102 (range 58-283) s versus 229 (range 114-399) s (p < 0.0001). Similar results were seen in the other metric values except for the camera pathway in the therapeutic test. The tested simulator achieved high scores in terms of realism and training capability and reliably discriminated between novices and experts. Further improvements of the simulator, especially in the field of therapeutic arthroscopy, might improve its value as a training and assessment tool for shoulder arthroscopy skills. Level of evidence: II.

  5. An Implementation of Privacy Protection for a Surveillance Camera Using ROI Coding of JPEG2000 with Face Detection

    NASA Astrophysics Data System (ADS)

    Muneyasu, Mitsuji; Odani, Shuhei; Kitaura, Yoshihiro; Namba, Hitoshi

    When surveillance cameras are used, privacy protection must sometimes be considered. This paper proposes a new privacy protection method that automatically degrades the face region in surveillance images. The proposed method consists of ROI coding of JPEG2000 and a face detection method based on template matching. The experimental results show that the face region can be detected and hidden correctly.
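
    A simplified stand-in for this pipeline is sketched below: the original combines template matching with JPEG2000 ROI coding, whereas the sketch uses a stock OpenCV cascade and a heavy Gaussian blur to degrade the detected face region. The input filename is a hypothetical example.

        # Detect faces and degrade the regions (blur as a stand-in for ROI coding).
        import cv2

        img = cv2.imread("surveillance_frame.jpg")          # hypothetical input frame
        if img is None:
            raise SystemExit("provide a frame to anonymize")

        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
            roi = img[y:y + h, x:x + w]
            img[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)   # degrade face region

        cv2.imwrite("surveillance_frame_private.jpg", img)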

  6. SOUTH WING, TRA661. SOUTH SIDE. CAMERA FACING NORTH. MTR HIGH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SOUTH WING, TRA-661. SOUTH SIDE. CAMERA FACING NORTH. MTR HIGH BAY BEYOND. INL NEGATIVE NO. HD46-45-3. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. PBF Cooling Tower (PER720). Camera faces south to show north ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower (PER-720). Camera faces south to show north facade. Note enclosed stairway. Date: August 2003. INEEL negative no. HD-35-10-3 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  8. MTR AND ETR COMPLEXES. CAMERA FACING EASTERLY TOWARD CHEMICAL PROCESSING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR AND ETR COMPLEXES. CAMERA FACING EASTERLY TOWARD CHEMICAL PROCESSING PLANT. MTR AND ITS ATTACHMENTS IN FOREGROUND. ETR BEYOND TO RIGHT. INL NEGATIVE NO. 56-4100. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. A&M. Radioactive parts security storage area. camera facing northwest. Outdoor ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Radioactive parts security storage area. camera facing northwest. Outdoor storage of concrete storage casks. Photographer: M. Holmes. Date: November 21, 1959. INEEL negative no. 59-6081 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  10. A&M. Hot liquid waste holding tanks. Camera faces southeast. Located ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste holding tanks. Camera faces southeast. Located in vicinity of TAN-616, hot liquid waste treatment plant. Date: November 13, 1953. INEEL negative no. 9159 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  11. 24. ARAIII Reactor building ARA608 interior. Camera facing south. Chalk ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. ARA-III Reactor building ARA-608 interior. Camera facing south. Chalk marks on wall indicate presence or absence of spot contamination. Ineel photo no. 3-2. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  12. Barrier Coverage for 3D Camera Sensor Networks

    PubMed Central

    Wu, Chengdong; Zhang, Yunzhou; Jia, Zixi; Ji, Peng; Chu, Hao

    2017-01-01

    Barrier coverage, an important research area with respect to camera sensor networks, uses a number of camera sensors to detect intruders that pass through the barrier area. Existing works on barrier coverage, such as local face-view barrier coverage and full-view barrier coverage, typically treat each intruder as a point. However, crucial features of the intruder (e.g., size) should be taken into account in real-world applications. In this paper, we propose a realistic resolution criterion based on a three-dimensional (3D) sensing model of a camera sensor for capturing the intruder's face. Based on the new resolution criterion, we study the barrier coverage of a feasible deployment strategy in camera sensor networks. Performance results demonstrate that our barrier coverage, with its more practical considerations, is capable of providing a desirable surveillance level. Moreover, compared with local face-view barrier coverage and full-view barrier coverage, our barrier coverage is more reasonable and closer to reality. To the best of our knowledge, our work is the first to propose barrier coverage for 3D camera sensor networks. PMID:28771167
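
    The sketch below illustrates one simple resolution criterion of this general kind under a pinhole model (projected face width in pixels versus a required minimum); it is not the paper's exact 3-D criterion, and the camera parameters and pixel threshold are assumed values.

        # Pinhole resolution check: how many pixels a face spans at a given distance.
        import math

        def face_pixels(focal_px, face_w_m, distance_m):
            return focal_px * face_w_m / distance_m

        def meets_resolution(focal_px, face_w_m, distance_m, min_pixels=60):
            """True if the projected face width reaches the required pixel count."""
            return face_pixels(focal_px, face_w_m, distance_m) >= min_pixels

        # Example: 1080p camera with a 60-degree horizontal field of view.
        width_px, hfov_deg = 1920, 60
        focal_px = width_px / (2 * math.tan(math.radians(hfov_deg) / 2))
        for d in (2, 5, 10, 20):
            print(d, "m:", round(face_pixels(focal_px, 0.16, d), 1), "px,",
                  "sufficient" if meets_resolution(focal_px, 0.16, d) else "insufficient")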

  13. Barrier Coverage for 3D Camera Sensor Networks.

    PubMed

    Si, Pengju; Wu, Chengdong; Zhang, Yunzhou; Jia, Zixi; Ji, Peng; Chu, Hao

    2017-08-03

    Barrier coverage, an important research area with respect to camera sensor networks, uses a number of camera sensors to detect intruders that pass through the barrier area. Existing works on barrier coverage, such as local face-view barrier coverage and full-view barrier coverage, typically treat each intruder as a point. However, crucial features of the intruder (e.g., size) should be taken into account in real-world applications. In this paper, we propose a realistic resolution criterion based on a three-dimensional (3D) sensing model of a camera sensor for capturing the intruder's face. Based on the new resolution criterion, we study the barrier coverage of a feasible deployment strategy in camera sensor networks. Performance results demonstrate that our barrier coverage, with its more practical considerations, is capable of providing a desirable surveillance level. Moreover, compared with local face-view barrier coverage and full-view barrier coverage, our barrier coverage is more reasonable and closer to reality. To the best of our knowledge, our work is the first to propose barrier coverage for 3D camera sensor networks.

  14. 3. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY, CAMERA FACING NORTHEAST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY, CAMERA FACING NORTHEAST. SHOWS RELATIONSHIP BETWEEN DECONTAMINATION ROOM, ADSORBER REMOVAL HATCHES (FLAT ON GRADE), AND BRIDGE CRANE. INEEL PROOF NUMBER HD-17-2. - Idaho National Engineering Laboratory, Old Waste Calcining Facility, Scoville, Butte County, ID

  15. PBF Cooling Tower detail. Camera facing southwest. Wood fill rises ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower detail. Camera facing southwest. Wood fill rises from foundation piers of cold water basin. Photographer: Kirsh. Date: May 1, 1969. INEEL negative no. 69-2826 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  16. PBF (PER620) west facade. Camera facing east. Note 1980 addition ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) west facade. Camera facing east. Note 1980 addition on south side of west wall. Date: March 2004. INEEL negative no. HD-41-3-3 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  17. ETRMTR MECHANICAL SERVICES BUILDING, TRA653. CAMERA FACING NORTHWEST AS BUILDING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR-MTR MECHANICAL SERVICES BUILDING, TRA-653. CAMERA FACING NORTHWEST AS BUILDING WAS NEARLY COMPLETE. INL NEGATIVE NO. 57-3653. K. Mansfield, Photographer, 7/22/1957 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  18. Have a Nice Spring! MOC Revisits "Happy Face" Crater

    NASA Image and Video Library

    2005-05-16

    Smile! Spring has sprung in the martian southern hemisphere. With it comes the annual retreat of the winter polar frost cap. This view of "Happy Face Crater"--officially named "Galle Crater"--shows patches of white water ice frost in and around the crater's south-facing slopes. Slopes that face south will retain frost longer than north-facing slopes because they do not receive as much sunlight in early spring. This picture is a composite of images taken by the Mars Global Surveyor Mars Orbiter Camera (MOC) red and blue wide angle cameras. The wide angle cameras were designed to monitor the changing weather, frost, and wind patterns on Mars. Galle Crater is located on the east rim of the Argyre Basin and is about 215 kilometers (134 miles) across. In this picture, illumination is from the upper left and north is up. http://photojournal.jpl.nasa.gov/catalog/PIA02325

  19. PBF (PER620) north facade. Camera facing south. Small metal shed ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) north facade. Camera facing south. Small metal shed at right is Stack Gas Monitor Building, PER-629. Date: March 2004. INEEL negative no. HD-41-2-4 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  20. MTR BUILDING AND BALCONY FLOORS. CAMERA FACING EASTERLY. PHOTOGRAPHER DID ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR BUILDING AND BALCONY FLOORS. CAMERA FACING EASTERLY. PHOTOGRAPHER DID NOT EXPLAIN DARK CLOUD. MTR WING WILL ATTACH TO GROUND FLOOR. INL NEGATIVE NO. 1567. Unknown Photographer, 2/28/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  1. A&M. Hot liquid waste treatment building (TAN616). Camera facing east. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616). Camera facing east. Showing west facades of structure. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-1-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  2. PBF Control Building (PER619) south facade. Camera faces north. Note ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Control Building (PER-619) south facade. Camera faces north. Note buried tanks with bollards protecting their access hatches. Date: July 2004. INEEL negative no. HD-41-10-4 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  3. Remote repair appliance

    DOEpatents

    Heumann, F.K.; Wilkinson, J.C.; Wooding, D.R.

    1997-12-16

    A remote appliance for supporting a tool for performing work at a work site on a substantially circular bore of a work piece and for providing video signals of the work site to a remote monitor comprises: a base plate having an inner face and an outer face; a plurality of rollers, wherein each roller is rotatably and adjustably attached to the inner face of the base plate and positioned to roll against the bore of the work piece when the base plate is positioned against the mouth of the bore such that the appliance may be rotated about the bore in a plane substantially parallel to the base plate; a tool holding means for supporting the tool, the tool holding means being adjustably attached to the outer face of the base plate such that the working end of the tool is positioned on the inner face side of the base plate; a camera for providing video signals of the work site to the remote monitor; and a camera holding means for supporting the camera on the inner face side of the base plate, the camera holding means being adjustably attached to the outer face of the base plate. In a preferred embodiment, roller guards are provided to protect the rollers from debris and a bore guard is provided to protect the bore from wear by the rollers and damage from debris. 5 figs.

  4. Extending the depth of field in a fixed focus lens using axial colour

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Niamh; Dainty, Christopher; Goncharov, Alexander V.

    2017-11-01

    We propose a method of extending the depth of field (EDOF) of conventional lenses for a low-cost front-facing smartphone camera intended for iris recognition. Longitudinal chromatic aberration (LCA) can be induced in the lens by means of dual wavelength illumination. The EDOF region is then constructed from the sum of the adjacent depths of field from each wavelength's illumination. The lens parameters can be found analytically with paraxial raytracing. The extended depth of field is dependent on the glass chosen and the position of the near object point.
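
    A thin-lens sketch of the idea is given below: with the sensor position fixed, each wavelength focuses at a different object distance because of the LCA-shifted focal length, and the extended depth of field is the union of the two bands. The focal lengths, f-number, and circle of confusion are illustrative, not the paper's design values.

        # Thin-lens illustration of axial-colour EDOF (illustrative numbers only).
        def focused_distance(f_mm, image_mm):
            """Object distance in focus for a fixed image (sensor) distance."""
            return f_mm * image_mm / (image_mm - f_mm)

        def dof_limits(f_mm, N, coc_mm, subject_mm):
            H = f_mm**2 / (N * coc_mm) + f_mm                      # hyperfocal distance
            near = subject_mm * (H - f_mm) / (H + subject_mm - 2 * f_mm)
            far = subject_mm * (H - f_mm) / (H - subject_mm) if subject_mm < H else float("inf")
            return near, far

        image_mm = 4.10                                            # fixed sensor position
        for label, f in (("lambda_1", 4.00), ("lambda_2", 4.02)):  # LCA-shifted focal lengths
            s = focused_distance(f, image_mm)
            near, far = dof_limits(f, N=2.4, coc_mm=0.004, subject_mm=s)
            print(f"{label}: in focus at {s:.0f} mm, DOF {near:.0f}-{far:.0f} mm")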

  5. LPT. Shield test control building (TAN645), north facade. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Shield test control building (TAN-645), north facade. Camera facing south. Obsolete sign dating from post-1970 program says "Energy and Systems Technology Experimental Facility, INEL." INEEL negative no. HD-40-5-4 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  6. MTR WING, TRA604, INTERIOR. BASEMENT. WEST CORRIDOR. CAMERA FACES NORTH. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR WING, TRA-604, INTERIOR. BASEMENT. WEST CORRIDOR. CAMERA FACES NORTH. HVAC AREA IS AT RIGHT OF CORRIDOR. INL NEGATIVE NO. HD46-13-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. PBF Reactor Building (PER620). Camera in first basement, facing south ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera in first basement, facing south and upward toward main floor. Cable trays being erected. Photographer: Kirsh. Date: May 20, 1969. INEEL negative no. 69-3110 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  8. PBF (PER620) south facade. Camera facing north. Note pedestrian bridge ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) south facade. Camera facing north. Note pedestrian bridge crossing over conduit. Central high bay contains reactor room and canal. Date: March 2004. INEEL negative no. HD-41-2-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  9. MTR STACK, TRA710, CONTEXTUAL VIEW, CAMERA FACING SOUTH. PERIMETER SECURITY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR STACK, TRA-710, CONTEXTUAL VIEW, CAMERA FACING SOUTH. PERIMETER SECURITY FENCE AND SECURITY LIGHTING IN VIEW AT LEFT. INL NEGATIVE NO. HD52-1-1. Mike Crane, Photographer, 5/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  10. PBF Cooling Tower Auxiliary Building (PER624) interior. Camera facing north. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower Auxiliary Building (PER-624) interior. Camera facing north. Deluge valves and automatic fire protection piping for Cooling Tower. Photographer: Holmes. Date: May 20, 1970. INEEL negative no. 70-2323 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  11. A&M. Hot liquid waste treatment building (TAN616). Camera facing northeast. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616). Camera facing northeast. South wall with oblique views of west sides of structure. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-1-2 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  12. A&M. Hot liquid waste treatment building (TAN616). Camera facing north. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616). Camera facing north. Detail of personnel entrance door, stoop, and stairway. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-2-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  13. Applications of digital image acquisition in anthropometry

    NASA Technical Reports Server (NTRS)

    Woolford, B.; Lewis, J. L.

    1981-01-01

    A description is given of a video kinesimeter, a device for the automatic real-time collection of kinematic and dynamic data. Based on the detection of a single bright spot by three TV cameras, the system provides automatic real-time recording of three-dimensional position and force data. It comprises three cameras, two incandescent lights, a voltage comparator circuit, a central control unit, and a mass storage device. The control unit determines the signal threshold for each camera before testing, sequences the lights, synchronizes and analyzes the scan voltages from the three cameras, digitizes force from a dynamometer, and codes the data for transmission to a floppy disk for recording. Two of the three cameras face each other along the 'X' axis; the third camera, which faces the center of the line between the first two, defines the 'Y' axis. An image from the 'Y' camera and either 'X' camera is necessary for determining the three-dimensional coordinates of the point.

  14. A Comparison of the Visual Attention Patterns of People With Aphasia and Adults Without Neurological Conditions for Camera-Engaged and Task-Engaged Visual Scenes.

    PubMed

    Thiessen, Amber; Beukelman, David; Hux, Karen; Longenecker, Maria

    2016-04-01

    The purpose of the study was to compare the visual attention patterns of adults with aphasia and adults without neurological conditions when viewing visual scenes with 2 types of engagement. Eye-tracking technology was used to measure the visual attention patterns of 10 adults with aphasia and 10 adults without neurological conditions. Participants viewed camera-engaged (i.e., human figure facing camera) and task-engaged (i.e., human figure looking at and touching an object) visual scenes. Participants with aphasia responded to engagement cues by focusing on objects of interest more for task-engaged scenes than camera-engaged scenes; however, the differences in their responses to these scenes were not as pronounced as those observed in adults without neurological conditions. In addition, people with aphasia spent more time looking at background areas of interest and less time looking at person areas of interest for camera-engaged scenes than did control participants. Results indicate people with aphasia visually attend to scenes differently than adults without neurological conditions. As a consequence, augmentative and alternative communication (AAC) facilitators may have different visual attention behaviors than the people with aphasia for whom they are constructing or selecting visual scenes. Further examination of the visual attention of people with aphasia may help optimize visual scene selection.

  15. LPT. Shield test facility test building interior (TAN646). Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Shield test facility test building interior (TAN-646). Camera facing south. Distant pool contained EBOR reactor; near pool was intended for fuel rod storage. Other post-1970 activity equipment remains in pool. INEEL negative no. HD-40-9-4 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  16. LOFT. Interior, control room in control building (TAN630). Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Interior, control room in control building (TAN-630). Camera facing north. Sign says "This control console is partially active. Do not operate any switch handle without authorization." Date: May 2004. INEEL negative no. HD-39-14-3 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  17. PBF Cooling Tower contextual view. Camera facing southwest. West wing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower contextual view. Camera facing southwest. West wing and north facade (rear) of Reactor Building (PER-620) is at left; Cooling Tower to right. Photographer: Kirsh. Date: November 2, 1970. INEEL negative no. 70-4913 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  18. PBF Reactor Building (PER620). Camera faces north into highbay/reactor pit ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera faces north into high-bay/reactor pit area. Inside form for reactor enclosure is in place. Photographer: John Capek. Date: March 15, 1967. INEEL negative no. 67-1769 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  19. PBF Reactor Building (PER620). Camera facing south end of high ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera facing south end of high bay. Vertical-lift door is being installed. Later, pneumatic seals will be installed around door. Photographer: Kirsh. Date: September 31, 1968. INEEL negative no. 68-3176 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  20. ETR CRITICAL FACILITY (ETRCF), TRA654. SOUTH SIDE. CAMERA FACING NORTH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR CRITICAL FACILITY (ETR-CF), TRA-654. SOUTH SIDE. CAMERA FACING NORTH AND ROLL-UP DOOR. ORIGINAL SIDING HAS BEEN REPLACED WITH STUCCO-LIKE MATERIAL. INL NEGATIVE NO. HD46-40-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  1. PBF Cooling Tower. Camera facing southwest. Round piers will support ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower. Camera facing southwest. Round piers will support Tower's wood "fill" or "packing." Black-topped stack in far distance is at Idaho Chemical Processing Plant. Photographer: John Capek. Date: October 16, 1968. INEEL negative no. 68-4097 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  2. ADM. Change House (TAN606) as completed. Camera facing northerly. Note ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ADM. Change House (TAN-606) as completed. Camera facing northerly. Note proximity to shielding berm. Part of hot shop (A&M Building, TAN-607) at left of view beyond berm. Date: October 29, 1954. INEEL negative no. 12705 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  3. PBF Reactor Building (PER620). Cubicle 10 detail. Camera facing west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Cubicle 10 detail. Camera facing west toward brick shield wall. Valve stems against wall penetrate through east wall of cubicle. Photographer: John Capek. Date: August 19, 1970. INEEL negative no. 70-3469 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  4. LPT. Low power test control building (TAN641) interior. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Low power test control building (TAN-641) interior. Camera facing northeast at what remains of control room console. Cut in wall at right of view shows west wall of northern test cell. INEEL negative no. HD-40-4-4 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  5. Cooperative multisensor system for real-time face detection and tracking in uncontrolled conditions

    NASA Astrophysics Data System (ADS)

    Marchesotti, Luca; Piva, Stefano; Turolla, Andrea; Minetti, Deborah; Regazzoni, Carlo S.

    2005-03-01

    The presented work describes an innovative architecture for multi-sensor distributed video surveillance applications. The aim of the system is to track moving objects in outdoor environments with a cooperative strategy exploiting two video cameras. The system also exhibits the capacity to focus its attention on the faces of detected pedestrians, collecting snapshot frames of face images by segmenting and tracking them over time at different resolutions. The system is designed to employ two video cameras in a cooperative client/server structure: the first camera monitors the entire area of interest and detects the moving objects using change detection techniques. The detected objects are tracked over time and their position is indicated on a map representing the monitored area. The objects' coordinates are sent to the server sensor in order to point its zooming optics toward the moving object. The second camera tracks the objects at high resolution. Like the client camera, this sensor is calibrated, and the position of the object detected in the image plane reference system is translated into coordinates referred to the same area map. In the common reference system of the map, data fusion techniques are applied to achieve a more precise and robust estimation of the objects' tracks and to perform face detection and tracking. The work's novelty and strength reside in the cooperative multi-sensor approach, in the high-resolution long-distance tracking, and in the automatic collection of biometric data such as a person's face clip for recognition purposes.

  6. Minimum Requirements for Taxicab Security Cameras.

    PubMed

    Zeng, Shengke; Amandus, Harlan E; Amendola, Alfred A; Newbraugh, Bradley H; Cantis, Douglas M; Weaver, Darlene

    2014-07-01

    The homicide rate of the taxicab industry is 20 times greater than that of all workers. A NIOSH study showed that cities with taxicab security cameras experienced a significant reduction in taxicab driver homicides. Minimum technical requirements and a standard test protocol for taxicab security cameras for effective facial identification were determined. The study took more than 10,000 photographs of human-face charts in a simulated taxicab with various photographic resolutions, dynamic ranges, lens distortions, and motion blurs under various light and cab-seat conditions. Thirteen volunteer photograph evaluators evaluated these face photographs and voted on the minimum technical requirements for taxicab security cameras. Five worst-case-scenario photographic image quality thresholds were suggested: XGA-format resolution, highlight dynamic range of 1 EV, twilight dynamic range of 3.3 EV, lens distortion of 30%, and shutter speed of 1/30 second. These minimum requirements will help taxicab regulators and fleets identify effective taxicab security cameras, and help taxicab security camera manufacturers improve camera facial identification capability.
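
    A small screening helper along these lines is sketched below, encoding the suggested thresholds and interpreting the 1/30 second shutter value as a maximum exposure time; the example camera specification is made up.

        # Screen a candidate camera spec against the suggested worst-case thresholds.
        THRESHOLDS = {
            "h_pixels": 1024, "v_pixels": 768,        # XGA format
            "highlight_dr_ev": 1.0, "twilight_dr_ev": 3.3,
            "max_distortion_pct": 30.0, "max_exposure_s": 1 / 30,   # 1/30 s shutter or faster
        }

        def meets_requirements(spec):
            return (spec["h_pixels"] >= THRESHOLDS["h_pixels"]
                    and spec["v_pixels"] >= THRESHOLDS["v_pixels"]
                    and spec["highlight_dr_ev"] >= THRESHOLDS["highlight_dr_ev"]
                    and spec["twilight_dr_ev"] >= THRESHOLDS["twilight_dr_ev"]
                    and spec["distortion_pct"] <= THRESHOLDS["max_distortion_pct"]
                    and spec["exposure_s"] <= THRESHOLDS["max_exposure_s"])

        camera_a = {"h_pixels": 1280, "v_pixels": 960, "highlight_dr_ev": 1.2,
                    "twilight_dr_ev": 3.5, "distortion_pct": 18.0, "exposure_s": 1 / 60}
        print("Camera A acceptable:", meets_requirements(camera_a))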

  7. 44. ARAIII Fuel oil tank ARA710. Camera facing west. Perimeter ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    44. ARA-III Fuel oil tank ARA-710. Camera facing west. Perimeter fence at left side of view. Gable-roofed building beyond tank on right is ARA-622. Gable-roofed building beyond tank on left is ARA-610. Ineel photo no. 3-16. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  8. MTR WING A, TRA604. SOUTH SIDE. CAMERA FACING NORTH. THIS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR WING A, TRA-604. SOUTH SIDE. CAMERA FACING NORTH. THIS VIEW TYPIFIES TENDENCY FOR EXPANSIONS TO TAKE THE FORM OF PROJECTIONS AND INFILL USING AVAILABLE YARD SPACES. INL NEGATIVE NO. HD47-44-3. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. PBF Cooling Tower (PER720), and Auxiliary Building (PER624). Camera faces ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower (PER-720), and Auxiliary Building (PER-624). Camera faces north to show south facades. Oblong vertical structure at left of center is weather shield for stairway. Date: August 2003. INEEL negative no. HD-35-10-4 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  10. REACTOR SERVICE BUILDING, TRA635. CROWDED MOCKUP AREA. CAMERA FACES EAST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    REACTOR SERVICE BUILDING, TRA-635. CROWDED MOCK-UP AREA. CAMERA FACES EAST. PHOTOGRAPHER'S NOTE SAYS "PICTURE REQUESTED BY IDO IN SUPPORT OF FY '58 BUILDING PROJECTS." INL NEGATIVE NO. 56-3025. R.G. Larsen, Photographer, 9/13/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. WATER PUMP HOUSE, TRA619, PUMP INSTALLATION. CAMERA FACING NORTHEAST CORNER. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    WATER PUMP HOUSE, TRA-619, PUMP INSTALLATION. CAMERA FACING NORTHEAST CORNER. CARD IN LOWER RIGHT WAS INSERTED BY INL PHOTOGRAPHER TO COVER AN OBSOLETE SECURITY RESTRICTION PRINTED ON THE ORIGINAL NEGATIVE. INL NEGATIVE NO. 3998. Unknown Photographer, 12/28/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  12. A Highly Accurate Face Recognition System Using Filtering Correlation

    NASA Astrophysics Data System (ADS)

    Watanabe, Eriko; Ishikawa, Sayuri; Kodate, Kashiko

    2007-09-01

    The authors previously constructed a highly accurate fast face recognition optical correlator (FARCO) [E. Watanabe and K. Kodate: Opt. Rev. 12 (2005) 460] and subsequently developed an improved, super high-speed FARCO (S-FARCO), which is able to process several hundred thousand frames per second. The principal advantage of the new system is its wide applicability to any correlation scheme. Three different configurations were proposed, each depending on correlation speed. This paper describes and evaluates a software correlation filter. The face recognition function proved highly accurate even though a low-resolution facial image size (64 × 64 pixels) was used. An operation speed of less than 10 ms was achieved using a personal computer with a central processing unit (CPU) of 3 GHz and 2 GB of memory. When the software correlation filter was applied to a high-security cellular phone face recognition system, experiments on 30 female students over a period of three months yielded low error rates: a 0% false acceptance rate and a 2% false rejection rate. The filtering correlation therefore works effectively when applied to low-resolution images such as web-based images or faces captured by a monitoring camera.
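
    As a rough illustration of correlation-based matching (not the FARCO filter itself), the sketch below compares two 64 x 64 images through an FFT-based normalized correlation peak, with random arrays standing in for real face images.

        # FFT-based normalized correlation peak between two low-resolution images.
        import numpy as np

        def correlation_peak(a, b):
            """Peak of the circular cross-correlation of two zero-mean, unit-variance images."""
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
            return corr.max() / a.size          # 1.0 for identical, aligned images

        rng = np.random.default_rng(0)
        enrolled = rng.random((64, 64))
        probe_same = enrolled + 0.05 * rng.random((64, 64))
        probe_other = rng.random((64, 64))
        print("same face :", round(correlation_peak(enrolled, probe_same), 3))
        print("other face:", round(correlation_peak(enrolled, probe_other), 3))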

  13. Collaborative real-time scheduling of multiple PTZ cameras for multiple object tracking in video surveillance

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Che; Huang, Chung-Lin

    2013-03-01

    This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. The three main concerns of the algorithm are (1) imagery of the human object's face for biometric purposes, (2) optimal video quality of the human objects, and (3) minimum hand-off time. We define an objective function based on expected capture conditions such as the camera-subject distance, pan-tilt angles of capture, face visibility, and others. This objective function serves to effectively balance the number of captures per subject and the quality of the captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
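
    A hedged sketch of an objective function of this general shape is shown below; the weighting terms, the preferred working distance, and the greedy per-camera assignment are assumptions made for illustration, not the authors' formulation.

        # Score (camera, subject) pairs and greedily assign each camera its best subject.
        import math

        def capture_score(distance_m, pan_deg, tilt_deg, face_visibility, n_prior_captures):
            d_term = math.exp(-abs(distance_m - 5.0) / 5.0)       # prefer a ~5 m working range
            angle_term = math.cos(math.radians(pan_deg)) * math.cos(math.radians(tilt_deg))
            balance = 1.0 / (1.0 + n_prior_captures)              # balance captures per subject
            return max(angle_term, 0.0) * d_term * face_visibility * balance

        # (camera, subject) -> (distance, pan, tilt, face visibility, prior captures)
        candidates = {
            ("cam1", "subjA"): (4.0, 10, 5, 0.9, 0),
            ("cam1", "subjB"): (9.0, 40, 8, 0.6, 2),
            ("cam2", "subjB"): (5.5, 15, 3, 0.8, 2),
        }
        best = {}
        for (cam, subj), args in candidates.items():
            score = capture_score(*args)
            if cam not in best or score > best[cam][1]:
                best[cam] = (subj, score)
        print(best)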

  14. PBF Cooling Tower (PER720). Camera faces east to show west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower (PER-720). Camera faces east to show west facade. Sloped (louvered) panels in this and opposite facade allow air to enter tower and cool water falling on splash bars within. Date: August 2003. INEEL negative no. HD-35-10-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  15. ETR HEAT EXCHANGER BUILDING, TRA644. EAST SIDE. CAMERA FACING WEST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR HEAT EXCHANGER BUILDING, TRA-644. EAST SIDE. CAMERA FACING WEST. NOTE COURSE OF PIPE FROM GROUND AND FOLLOWING ROOF OF BUILDING. MTR BUILDING IN BACKGROUND AT RIGHT EDGE OF VIEW. INL NEGATIVE NO. HD46-36-3. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  16. PBF Reactor Building (PER620). Camera facing north toward south facade. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera facing north toward south facade. Note west-wing siding on concrete block; high-bay siding of metal. Excavation and forms for signal and cable trenches proceed from building. Photographer: Kirsh. Date: August 20, 1968. INEEL negative no. 68-3332 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  17. LOFT, TAN650. Camera facing southeast. From left to right: stack ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT, TAN-650. Camera facing southeast. From left to right: stack in distance, pre-amp wing, dome, north side of loft "service building." Note poured concrete wall of pre-amp wing on lower section; pumice block above. Date: May 2004. INEEL negative no. HD-39-19-3 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  18. PBF Reactor Building (PER620). Camera is facing east and down ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera is facing east and down into canal and storage pit for fuel rod assemblies. Stainless steel liner is being applied, temporarily covered with plywood for protection. Photographer: John Capek. Date: August 29, 1969. INEEL negative no. 69-4641 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  19. A&M. Hot liquid waste treatment building (TAN616). Camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Hot liquid waste treatment building (TAN-616). Camera facing southwest. Oblique view of east and north walls. Note three corrugated pipes at lower left indicating location of underground hot waste storage tanks. Photographer: Ron Paarmann. Date: September 22, 1997. INEEL negative no. HD-20-1-4 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  20. LOFT. Containment building (TAN650) detail. Camera facing east. Service building ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Containment building (TAN-650) detail. Camera facing east. Service building corner is at left of view above personnel access. Round feature at left of dome is tank that will contain borated water. Metal stack at right of view. Date: 1973. INEEL negative no. 73-1085 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  1. PBF Reactor Building (PER620). Cubicle 10. Camera facing southeast. Loop ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Cubicle 10. Camera facing southeast. Loop pressurizer on right. Other equipment includes loop strainer, control valves, loop piping, pressurizer interchanger, and cleanup system cooler. High-density shielding brick walls. Photographer: Kirsh. Date: November 2, 1970. INEEL negative no. 70-4908 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  2. Intelligent person identification system using stereo camera-based height and stride estimation

    NASA Astrophysics Data System (ADS)

    Ko, Jung-Hwan; Jang, Jae-Hun; Kim, Eun-Soo

    2005-05-01

    In this paper, a stereo camera-based intelligent person identification system is suggested. In the proposed method, the face area of the moving target person is extracted from the left image of the input stereo image pair by using a threshold value in the YCbCr color model. By carrying out correlation between this segmented face area and the right input image, the location coordinates of the target face are acquired, and these values are used to control the pan/tilt system through a modified PID-based recursive controller. Also, by using the geometric parameters between the target face and the stereo camera system, the vertical distance between the target and the stereo camera system can be calculated through a triangulation method. Using this calculated vertical distance and the pan and tilt angles, the target's real position in world space can be acquired, and from it the target's height and stride values can finally be extracted. Experiments with video images of 16 moving persons show that a person could be identified from these extracted height and stride parameters.
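
    The distance and height steps can be illustrated with a simple triangulation sketch, shown below; the focal length, baseline, disparity, camera height, and tilt angle are placeholder values rather than the paper's calibration.

        # Depth from stereo disparity, then an approximate head height from camera geometry.
        import math

        def depth_from_disparity(focal_px, baseline_m, disparity_px):
            return focal_px * baseline_m / disparity_px

        def head_height(cam_height_m, depth_m, tilt_up_deg):
            # positive tilt means the camera is tilted upward toward the detected face
            return cam_height_m + depth_m * math.tan(math.radians(tilt_up_deg))

        Z = depth_from_disparity(focal_px=700.0, baseline_m=0.12, disparity_px=24.0)
        print(f"depth ~ {Z:.2f} m, estimated head height ~ {head_height(1.2, Z, 8.0):.2f} m")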

  3. Unconstrained face detection and recognition based on RGB-D camera for the visually impaired

    NASA Astrophysics Data System (ADS)

    Zhao, Xiangdong; Wang, Kaiwei; Yang, Kailun; Hu, Weijian

    2017-02-01

    It is highly important for visually impaired people (VIP) to be aware of the human beings around them, so correctly recognizing people in VIP-assisting apparatus provides great convenience. However, in classical face recognition technology, the faces used in training and prediction are usually frontal, and acquiring face images requires subjects to get close to the camera so that a frontal pose and adequate illumination are guaranteed. Meanwhile, face labels are defined manually rather than automatically, and most of the time labels belonging to different classes need to be input one by one. These constraints prevent practical assistive applications for VIP. In this article, a face recognition system for unconstrained environments is proposed. Specifically, it does not require the frontal pose or uniform illumination required by previous algorithms. The contributions of this work lie in three aspects. First, real-time frontal-face synthesis is implemented, and the synthesized frontal faces help to increase the recognition rate, as the experimental results confirm. Second, an RGB-D camera plays a significant role in our system: both color and depth information are used to achieve real-time face tracking, which not only raises the detection rate but also provides a way to label faces automatically. Finally, we propose to train the face recognition system with neural networks, applying Principal Component Analysis (PCA) to pre-refine the input data. This system is expected to help VIP become familiar with others and enable them to recognize people once the system is sufficiently trained.
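
    The PCA pre-refinement step can be sketched in a few lines of NumPy, as below; random arrays stand in for real RGB-D face crops, and the number of retained components is an arbitrary choice.

        # PCA pre-refinement: project flattened face crops onto the leading components.
        import numpy as np

        rng = np.random.default_rng(1)
        train = rng.random((200, 64 * 64))               # 200 flattened 64x64 face crops (stand-in data)

        mean = train.mean(axis=0)
        centered = train - mean
        # SVD of the centered data gives the principal axes in Vt
        _, _, Vt = np.linalg.svd(centered, full_matrices=False)
        components = Vt[:50]                             # keep the top 50 components

        def pca_features(face_vec):
            """Project one flattened face image onto the PCA basis."""
            return (face_vec - mean) @ components.T

        probe = rng.random(64 * 64)
        print("feature vector length:", pca_features(probe).shape[0])   # 50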

  4. The ideal subject distance for passport pictures.

    PubMed

    Verhoff, Marcel A; Witzel, Carsten; Kreutz, Kerstin; Ramsthaler, Frank

    2008-07-04

    In an age of global combat against terrorism, the recognition and identification of people on document images is of increasing significance. Experiments and calculations have shown that the camera-to-subject distance - not the focal length of the lens - can have a significant effect on facial proportions. Modern passport pictures should be able to function as a reference image for automatic and manual picture comparisons. This requires a defined subject distance. It is completely unclear which subject distance, in the taking of passport photographs, is ideal for the recognition of the actual person. We show here that the camera-to-subject distance that is perceived as ideal is dependent on the face being photographed, even if the distance of 2m was most frequently preferred. So far the problem of the ideal camera-to-subject distance for faces has only been approached through technical calculations. We have, for the first time, answered this question experimentally with a double-blind experiment. Even if there is apparently no ideal camera-to-subject distance valid for every face, 2m can be proposed as ideal for the taking of passport pictures. The first step would actually be the determination of a camera-to-subject distance for the taking of passport pictures within the standards. From an anthropological point of view it would be interesting to find out which facial features allow the preference of a shorter camera-to-subject distance and which allow the preference of a longer camera-to-subject distance.

  5. Minimum Requirements for Taxicab Security Cameras*

    PubMed Central

    Zeng, Shengke; Amandus, Harlan E.; Amendola, Alfred A.; Newbraugh, Bradley H.; Cantis, Douglas M.; Weaver, Darlene

    2015-01-01

    Problem: The homicide rate of the taxicab industry is 20 times greater than that of all workers. A NIOSH study showed that cities with taxicab security cameras experienced a significant reduction in taxicab driver homicides. Methods: Minimum technical requirements and a standard test protocol for taxicab security cameras for effective facial identification were determined. The study took more than 10,000 photographs of human-face charts in a simulated taxicab with various photographic resolutions, dynamic ranges, lens distortions, and motion blurs under various light and cab-seat conditions. Thirteen volunteer photograph evaluators evaluated these face photographs and voted on the minimum technical requirements for taxicab security cameras. Results: Five worst-case-scenario photographic image quality thresholds were suggested: XGA-format resolution, highlight dynamic range of 1 EV, twilight dynamic range of 3.3 EV, lens distortion of 30%, and shutter speed of 1/30 second. Practical Applications: These minimum requirements will help taxicab regulators and fleets identify effective taxicab security cameras, and help taxicab security camera manufacturers improve camera facial identification capability. PMID:26823992

  6. ETR COMPRESSOR BUILDING, TRA643. CAMERA FACES NORTHEAST. WATER HEAT EXCHANGER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR COMPRESSOR BUILDING, TRA-643. CAMERA FACES NORTHEAST. WATER HEAT EXCHANGER IS IN LEFT FOREGROUND. A PARTIALLY ASSEMBLED PLANT AIR CONDITIONER IS AT CENTER. WORKERS AT RIGHT ASSEMBLE A 4000 HORSEPOWER COMPRESSOR DRIVE MOTOR. INL NEGATIVE NO. 56-3714. R.G. Larsen, Photographer, 11/13/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. PBF Reactor Building (PER620). Camera faces south toward verticallift door, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera faces south toward vertical-lift door, which is closed. Note crane and its trolley positioned near door; its rails along side walls. Reactor vessel and lifting beams are positioned above reactor pit. Photographer: John Capek. Date: January 9, 1970. INEEL negative no. 70-132 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  8. MTR BUILDING INTERIOR, TRA603. BASEMENT. CAMERA IN WEST CORRIDOR FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR BUILDING INTERIOR, TRA-603. BASEMENT. CAMERA IN WEST CORRIDOR FACING SOUTH. FREIGHT ELEVATOR IS AT RIGHT OF VIEW. AT CENTER VIEW IS MTR VAULT NO. 1, USED TO STORE SPECIAL OR FISSIONABLE MATERIALS. INL NEGATIVE NO. HD46-6-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. Human Age Estimation Method Robust to Camera Sensor and/or Face Movement

    PubMed Central

    Nguyen, Dat Tien; Cho, So Ra; Pham, Tuyen Danh; Park, Kang Ryoung

    2015-01-01

    Human age can be employed in many useful real-life applications, such as customer service systems, automatic vending machines, entertainment, etc. In order to obtain age information, image-based age estimation systems have been developed using information from the human face. However, current age estimation systems are limited by various factors such as camera motion, optical blurring, facial expression, gender, etc. Motion blur usually appears in face images because of movement of the camera sensor and/or movement of the face during image acquisition. The facial features in captured images are therefore transformed according to the amount of motion, which degrades the performance of age estimation systems. In this paper, the problem caused by motion blurring is addressed and a solution is proposed in order to make age estimation systems robust to its effects. Experimental results show that our method enhances age estimation performance compared with systems that do not employ it. PMID:26334282

  10. Alternative images for perpendicular parking : a usability test of a multi-camera parking assistance system.

    DOT National Transportation Integrated Search

    2004-10-01

    The parking assistance system evaluated consisted of four outward facing cameras whose images could be presented on a monitor on the center console. The images presented varied in the location of the virtual eye point of the camera (the height above ...

  11. A&M. A&M building (TAN607). Camera facing east. From left to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. A&M building (TAN-607). Camera facing east. From left to right, pool section, hot shop, cold shop, and machine shop. Biparting doors to hot shop are in open position behind shroud. Four rail tracks lead to hot shop and cold shop. Date: August 20, 1954. INEEL negative no. 11706 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  12. Cross-Platform Mobile Application Development: A Pattern-Based Approach

    DTIC Science & Technology

    2012-03-01

    Additionally, developers should be aware of different hardware capabilities such as external SD cards and forward facing cameras. Finally, each... applications are written. ... iTunes library, allowing the user to update software and manage content on each device. However, in iOS5, the PC Free feature removes this constraint

  13. PROCESS WATER BUILDING, TRA605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS WATER BUILDING AND ETR STACK ARE IN LEFT HALF OF VIEW. TRA-666 IS NEAR CENTER, ABUTTED BY SECURITY BUILDING; TRA-626, AT RIGHT EDGE OF VIEW BEHIND BUS. INL NEGATIVE NO. HD46-34-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  14. 1. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY. CAMERA FACING NORTHEAST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY. CAMERA FACING NORTHEAST. ON RIGHT OF VIEW IS PART OF EARTH/GRAVEL SHIELDING FOR BIN SET. AERIAL STRUCTURE MOUNTED ON POLES IS PNEUMATIC TRANSFER SYSTEM FOR DELIVERY OF SAMPLES BEING SENT FROM NEW WASTE CALCINING FACILITY TO THE CPP REMOTE ANALYTICAL LABORATORY. INEEL PROOF NUMBER HD-17-1. - Idaho National Engineering Laboratory, Old Waste Calcining Facility, Scoville, Butte County, ID

  15. MTR BUILDING, TRA603. EAST SIDE. CAMERA FACING WEST. CORRUGATED IRON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR BUILDING, TRA-603. EAST SIDE. CAMERA FACING WEST. CORRUGATED IRON BUILDING MARKED WITH "X" IS TRA-651. TRA-626, TO ITS RIGHT, HOUSED COMPRESSOR EQUIPMENT FOR THE AIRCRAFT NUCLEAR PROPULSION PROGRAM. LATER, IT WAS USED FOR STORAGE. INL NEGATIVE NO. HD46-42-4. Mike Crane, Photographer, April 2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  16. ENGINEERING TEST REACTOR (ETR) BUILDING, TRA642. CONTEXTUAL VIEW, CAMERA FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ENGINEERING TEST REACTOR (ETR) BUILDING, TRA-642. CONTEXTUAL VIEW, CAMERA FACING EAST. VERTICAL METAL SIDING. ROOF IS SLIGHTLY ELEVATED AT CENTER LINE FOR DRAINAGE. WEST SIDE OF ETR COMPRESSOR BUILDING, TRA-643, PROJECTS TOWARD LEFT AT FAR END OF ETR BUILDING. INL NEGATIVE NO. HD46-37-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  17. Integration of multispectral face recognition and multi-PTZ camera automated surveillance for security applications

    NASA Astrophysics Data System (ADS)

    Chen, Chung-Hao; Yao, Yi; Chang, Hong; Koschan, Andreas; Abidi, Mongi

    2013-06-01

    Due to increasing security concerns, a complete security system should consist of two major components: a computer-based face-recognition system and a real-time automated video surveillance system. A computer-based face-recognition system can be used in gate access control for identity authentication. In recent studies, multispectral imaging and fusion of multispectral narrow-band images in the visible spectrum have been employed and proven to enhance recognition performance over conventional broad-band images, especially when the illumination changes. Thus, we present an automated method that specifies the optimal spectral ranges under the given illumination. Experimental results verify the consistent performance of our algorithm via the observation that an identical set of spectral band images is selected under all tested conditions. Our discovery can be practically used for a new customized sensor design associated with given illuminations for improved face recognition performance over conventional broad-band images. In addition, once a person is authorized to enter a restricted area, we still need to continuously monitor his/her activities for the sake of security. Because pan-tilt-zoom (PTZ) cameras are capable of covering a panoramic area and maintaining high resolution imagery for real-time behavior understanding, research in automated surveillance systems with multiple PTZ cameras has become increasingly important. Most existing algorithms require prior knowledge of the intrinsic parameters of the PTZ cameras to infer the relative positioning and orientation among multiple PTZ cameras. To overcome this limitation, we propose a novel mapping algorithm that derives the relative positioning and orientation between two PTZ cameras based on a unified polynomial model. This reduces the dependence on knowledge of the intrinsic parameters of the PTZ cameras and their relative positions. Experimental results demonstrate that our proposed algorithm offers substantially reduced computational complexity and improved flexibility at the cost of slightly decreased pixel accuracy as compared to Chen and Wang's method [18].

  18. Conditions that influence the accuracy of anthropometric parameter estimation for human body segments using shape-from-silhouette

    NASA Astrophysics Data System (ADS)

    Mundermann, Lars; Mundermann, Annegret; Chaudhari, Ajit M.; Andriacchi, Thomas P.

    2005-01-01

    Anthropometric parameters are fundamental for a wide variety of applications in biomechanics, anthropology, medicine and sports. Recent technological advancements provide methods for constructing 3D surfaces directly. Of these new technologies, visual hull construction may be the most cost-effective yet sufficiently accurate method. However, the conditions influencing the accuracy of anthropometric measurements based on visual hull reconstruction are unknown. The purpose of this study was to evaluate the conditions that influence the accuracy of 3D shape-from-silhouette reconstruction of body segments, depending on the number of cameras, camera resolution and object contours. The results demonstrate that the visual hulls lacked accuracy in concave regions and narrow spaces, but setups with a high number of cameras reconstructed a human form with an average accuracy of 1.0 mm. In general, setups with fewer than 8 cameras yielded largely inaccurate visual hull constructions, while setups with 16 or more cameras provided good volume estimations. Body segment volumes were obtained with an average error of 10% at a 640x480 resolution using 8 cameras. Changes in resolution did not significantly affect the average error. However, substantial decreases in error were observed with increasing number of cameras (33.3% using 4 cameras; 10.5% using 8 cameras; 4.1% using 16 cameras; 1.2% using 64 cameras).
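
    As a minimal sketch of shape-from-silhouette, the function below carves a voxel grid against binary silhouettes from calibrated cameras; the projection matrices, voxel grid and masks are assumed inputs, and (as the study notes) concave regions cannot be recovered by this construction.

```python
import numpy as np

def visual_hull(silhouettes, projections, grid):
    """Carve a voxel grid using binary silhouettes from calibrated cameras.

    silhouettes: list of (H, W) boolean masks
    projections: list of (3, 4) camera projection matrices
    grid: (N, 3) array of voxel-centre world coordinates
    Returns a boolean array marking voxels that project inside every silhouette.
    """
    homog = np.hstack([grid, np.ones((grid.shape[0], 1))])   # (N, 4)
    inside = np.ones(grid.shape[0], dtype=bool)
    for mask, P in zip(silhouettes, projections):
        uvw = homog @ P.T                                    # (N, 3)
        u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)
        v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)
        h, w = mask.shape
        valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        hit = np.zeros_like(inside)
        hit[valid] = mask[v[valid], u[valid]]
        inside &= hit        # a voxel survives only if every camera sees it as foreground
    return inside
```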

  19. A multi-camera system for real-time pose estimation

    NASA Astrophysics Data System (ADS)

    Savakis, Andreas; Erhard, Matthew; Schimmel, James; Hnatow, Justin

    2007-04-01

    This paper presents a multi-camera system that performs face detection and pose estimation in real-time and may be used for intelligent computing within a visual sensor network for surveillance or human-computer interaction. The system consists of a Scene View Camera (SVC), which operates at a fixed zoom level, and an Object View Camera (OVC), which continuously adjusts its zoom level to match objects of interest. The SVC is set to survey the whole field of view. Once a region has been identified by the SVC as a potential object of interest, e.g. a face, the OVC zooms in to locate specific features. In this system, face candidate regions are selected based on skin color and face detection is accomplished using a Support Vector Machine classifier. The locations of the eyes and mouth are detected inside the face region using neural network feature detectors. Pose estimation is performed based on a geometrical model, where the head is modeled as a spherical object that rotates upon the vertical axis. The triangle formed by the mouth and eyes defines a vertical plane that intersects the head sphere. By projecting the eyes-mouth triangle onto a two-dimensional viewing plane, equations were obtained that describe the change in its angles as the yaw pose angle increases. These equations are then combined and used for efficient pose estimation. The system achieves real-time performance for live video input. Testing results assessing system performance are presented for both still images and video.
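
    The paper's projection equations are not reproduced in the abstract, so the following is only a simplified spherical-head approximation under assumed 2D landmark coordinates: the horizontal offset of the mouth relative to the eye midpoint, normalised by half the inter-ocular distance, is treated as sin(yaw).

```python
import numpy as np

def estimate_yaw(left_eye, right_eye, mouth):
    """Rough yaw estimate (degrees) from 2D eye and mouth positions in pixels.

    Simplified spherical-head assumption: under pure yaw the mouth slides
    horizontally relative to the eye midpoint, and that offset normalised
    by half the inter-ocular distance approximates sin(yaw).
    """
    left_eye, right_eye, mouth = map(np.asarray, (left_eye, right_eye, mouth))
    eye_mid = (left_eye + right_eye) / 2.0
    half_iod = np.linalg.norm(right_eye - left_eye) / 2.0
    offset = (mouth[0] - eye_mid[0]) / half_iod
    return np.degrees(np.arcsin(np.clip(offset, -1.0, 1.0)))

# frontal face -> ~0 degrees; mouth shifted right of the eye midpoint -> positive yaw
print(estimate_yaw((100, 100), (140, 100), (120, 140)),
      estimate_yaw((100, 100), (140, 100), (126, 140)))
```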

  20. Deep Space Detection of Oriented Ice Crystals

    NASA Astrophysics Data System (ADS)

    Marshak, A.; Varnai, T.; Kostinski, A. B.

    2017-12-01

    The deep space climate observatory (DSCOVR) spacecraft resides at the first Lagrangian point about one million miles from Earth. A polychromatic imaging camera onboard delivers nearly hourly observations of the entire sun-lit face of the Earth. Many images contain unexpected bright flashes of light over both ocean and land. We constructed a yearlong time series of flash latitudes, scattering angles and oxygen absorption to demonstrate conclusively that the flashes over land are specular reflections off tiny ice crystals floating in the air nearly horizontally. Such deep space detection of tropospheric ice can be used to constrain the likelihood of oriented crystals and their contribution to Earth albedo.

  1. Building, north side (original front), detail of original entrance. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Building, north side (original front), detail of original entrance. Camera facing south - Naval Supply Center, Broadway Complex, Administration Storehouse, 911 West Broadway, San Diego, San Diego County, CA

  2. Why are faces denser in the visual experiences of younger than older infants?

    PubMed Central

    Jayaraman, Swapnaa; Fausey, Caitlin M.; Smith, Linda B.

    2017-01-01

    Recent evidence from studies using head cameras suggests that the frequency of faces directly in front of infants declines over the first year and a half of life, a result that has implications for the development of and evolutionary constraints on face processing. Two experiments tested two opposing hypotheses about this observed age-related decline in the frequency of faces in infant views. By the People-input hypothesis, there are more faces in view for younger infants because people are more often physically in front of younger than older infants. This hypothesis predicts that not just faces but views of other body parts will decline with age. By the Face-input hypothesis, the decline is strictly about faces, not people or other body parts in general. Two experiments, one using a time-sampling method (84 infants 3 to 24 months in age) and the other analyses of head camera images (36 infants 1 to 24 months) provide strong support for the Face-input hypothesis. The results suggest developmental constraints on the environment that ensure faces are prevalent early in development. PMID:28026190

  3. Development of camera technology for monitoring nests. Chapter 15

    Treesearch

    W. Andrew Cox; M. Shane Pruett; Thomas J. Benson; Scott J. Chiavacci; Frank R., III Thompson

    2012-01-01

    Photo and video technology has become increasingly useful in the study of avian nesting ecology. However, researchers interested in using camera systems are often faced with insufficient information on the types and relative advantages of available technologies. We reviewed the literature for studies of nests that used cameras and summarized them based on study...

  4. PBF Reactor Building (PER620). Camera faces south along west wall. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera faces south along west wall. Gap between native lava rock and concrete basement walls is being backfilled and compacted. Wire mesh protects workers from falling rock. Note penetrations for piping that will carry secondary coolant water to Cooling Tower. Photographer: Holmes. Date: June 15, 1967. INEEL negative no. 67-3665 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  5. LOFT complex, camera facing west. Mobile entry (TAN624) is position ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT complex, camera facing west. Mobile entry (TAN-624) is positioned next to containment building (TAN-650). Shielded roadway entrance in view just below and to right of stack. Borated water tank has been covered with weather shelter and is no longer visible. ANP hangar (TAN-629) in view beyond LOFT. Date: 1974. INEEL negative no. 74-4191 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  6. ETR BUILDING, TRA642, INTERIOR. BASEMENT. CAMERA FACES SOUTH AND LOOKS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. BASEMENT. CAMERA FACES SOUTH AND LOOKS AT DOOR TO M-3 CUBICLE. CUBICLE WALLS ARE MADE OF LEAD SHIELDING BRICKS. VALVE HANDLES AND STEMS PERTAIN TO SAMPLING. METAL SHIELDING DOOR. NOTE GLOVE BOX TO RIGHT OF CUBICLE DOOR. INL NEGATIVE NO. HD-46-21-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. ETR HEAT EXCHANGER BUILDING, TRA644. SOUTH SIDE. CAMERA FACING NORTH. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR HEAT EXCHANGER BUILDING, TRA-644. SOUTH SIDE. CAMERA FACING NORTH. NOTE POURED CONCRETE WALLS. ETR IS AT LEFT OF VIEW. NOTE DRIVEWAY INSET AT RIGHT FORMED BY DEMINERALIZER WING AT RIGHT. SOUTHEAST CORNER OF ETR, TRA-642, IN VIEW AT UPPER LEFT. INL NEGATIVE NO. HD46-36-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  8. 9. INTERIOR, WAREHOUSE SPACE AT EAST END OF BUILDING, CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. INTERIOR, WAREHOUSE SPACE AT EAST END OF BUILDING, CAMERA FACING NORTHEAST. - U.S. Coast Guard Support Center Alameda, Warehouse, Spencer Road & Icarrus Drive, Coast Guard Island, Alameda, Alameda County, CA

  9. Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas

    PubMed Central

    2018-01-01

    This paper presents a method of fusing the ego-motion of a robot or a land vehicle estimated from an upward-facing camera with Global Navigation Satellite System (GNSS) signals for navigation purposes in urban environments. A sky-pointing camera is mounted on the top of a car and synchronized with a GNSS receiver. The advantages of this configuration are two-fold: firstly, for the GNSS signals, the upward-facing camera will be used to classify the acquired images into sky and non-sky (also known as segmentation). A satellite falling into the non-sky areas (e.g., buildings, trees) will be rejected and not considered for the final position solution computation. Secondly, the sky-pointing camera (with a field of view of about 90 degrees) is helpful for urban area ego-motion estimation in the sense that it does not see most of the moving objects (e.g., pedestrians, cars) and thus is able to estimate the ego-motion with fewer outliers than is typical with a forward-facing camera. The GNSS and visual information systems are tightly-coupled in a Kalman filter for the final position solution. Experimental results demonstrate the ability of the system to provide satisfactory navigation solutions in a deep urban canyon, even in conditions with fewer than four GNSS satellites, with accuracy better than the GNSS-only and loosely-coupled GNSS/vision solutions by 20 percent and 82 percent (in the worst case), respectively. PMID:29673230
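
    A minimal sketch of the satellite-rejection step described above, assuming the sky/non-sky segmentation and the projection of each satellite's azimuth/elevation into image coordinates are already available; the PRN numbers and mask below are illustrative only.

```python
import numpy as np

def reject_non_sky_satellites(sky_mask, sat_pixels):
    """Keep only satellites whose line of sight falls on 'sky' pixels.

    sky_mask: (H, W) boolean image from the upward-facing camera,
              True where the segmentation labelled sky.
    sat_pixels: dict {prn: (u, v)} image coordinates of each tracked
                satellite after projecting its azimuth/elevation into
                the camera frame (projection not shown here).
    Returns the list of PRNs accepted for the position solution.
    """
    h, w = sky_mask.shape
    accepted = []
    for prn, (u, v) in sat_pixels.items():
        u, v = int(round(u)), int(round(v))
        if 0 <= u < w and 0 <= v < h and sky_mask[v, u]:
            accepted.append(prn)          # likely unobstructed line-of-sight signal
    return accepted

# toy example: a mask that is sky in the upper half of the image only
mask = np.zeros((480, 640), dtype=bool)
mask[:240, :] = True
print(reject_non_sky_satellites(mask, {5: (320, 100), 12: (320, 400)}))
```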

  10. Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas.

    PubMed

    Gakne, Paul Verlaine; O'Keefe, Kyle

    2018-04-17

    This paper presents a method of fusing the ego-motion of a robot or a land vehicle estimated from an upward-facing camera with Global Navigation Satellite System (GNSS) signals for navigation purposes in urban environments. A sky-pointing camera is mounted on the top of a car and synchronized with a GNSS receiver. The advantages of this configuration are two-fold: firstly, for the GNSS signals, the upward-facing camera will be used to classify the acquired images into sky and non-sky (also known as segmentation). A satellite falling into the non-sky areas (e.g., buildings, trees) will be rejected and not considered for the final position solution computation. Secondly, the sky-pointing camera (with a field of view of about 90 degrees) is helpful for urban area ego-motion estimation in the sense that it does not see most of the moving objects (e.g., pedestrians, cars) and thus is able to estimate the ego-motion with fewer outliers than is typical with a forward-facing camera. The GNSS and visual information systems are tightly-coupled in a Kalman filter for the final position solution. Experimental results demonstrate the ability of the system to provide satisfactory navigation solutions in a deep urban canyon, even in conditions with fewer than four GNSS satellites, with accuracy better than the GNSS-only and loosely-coupled GNSS/vision solutions by 20 percent and 82 percent (in the worst case), respectively.

  11. Investigation into the use of photoanthropometry in facial image comparison.

    PubMed

    Moreton, Reuben; Morley, Johanna

    2011-10-10

    Photoanthropometry is a metric-based facial image comparison technique. Measurements of the face are taken from an image using predetermined facial landmarks. Measurements are then converted to proportionality indices (PIs) and compared to PIs from another facial image. Photoanthropometry has been presented as a facial image comparison technique in UK courts for over 15 years. It is generally accepted that extrinsic factors (e.g. orientation of the head, camera angle and distance from the camera) can cause discrepancies in anthropometric measurements of the face from photographs. However, there has been limited empirical research into quantifying the influence of such variables. The aim of this study was to determine the reliability of photoanthropometric measurements between different images of the same individual taken with different angulations of the camera. The study examined the facial measurements of 25 individuals from high-resolution photographs, taken at different horizontal and vertical camera angles in a controlled environment. Results show that the degree of variability in facial measurements of the same individual due to variations in camera angle can be as great as the variability of facial measurements between different individuals. Results suggest that photoanthropometric facial comparison, as it is currently practiced, is unsuitable for elimination purposes. Preliminary investigations into the effects of distance from camera and image resolution in poor-quality images suggest that such images are not an accurate representation of an individual's face; however, further work is required. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
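
    A minimal sketch of how proportionality indices (PIs) are formed from landmark measurements; the landmark names and the choice of reference distance below are hypothetical, not taken from the study.

```python
import numpy as np

def proportionality_indices(landmarks, pairs, reference_pair):
    """Convert inter-landmark distances into proportionality indices (PIs).

    landmarks: dict {name: (x, y)} facial landmark positions in pixels
    pairs: list of (name_a, name_b) distances to express as PIs
    reference_pair: landmark pair whose distance is the denominator
    """
    def dist(a, b):
        return float(np.linalg.norm(np.subtract(landmarks[a], landmarks[b])))

    ref = dist(*reference_pair)
    return {(a, b): dist(a, b) / ref for a, b in pairs}

# hypothetical landmarks: inner eye corners (en_l, en_r), nasion (n), subnasale (sn)
lm = {"en_l": (100, 120), "en_r": (160, 120), "n": (130, 110), "sn": (130, 190)}
print(proportionality_indices(lm, [("en_l", "en_r"), ("n", "sn")], ("en_l", "en_r")))
```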

  12. Efficient workflows for 3D building full-color model reconstruction using LIDAR long-range laser and image-based modeling techniques

    NASA Astrophysics Data System (ADS)

    Shih, Chihhsiong

    2005-01-01

    Two efficient workflows are developed for the reconstruction of a 3D full-color building model. One uses a point-wise sensing device to sample an unknown object densely and attaches color textures from a digital camera separately. The other uses an image-based approach to reconstruct the model with color texture attached automatically. The point-wise sensing device reconstructs the CAD model using a modified best-view algorithm that collects the maximum number of construction faces in one view. The partial views of the point cloud data are then glued together using a common face between two consecutive views. Typical overlapping-mesh removal and coarsening procedures are applied to generate a unified 3D mesh shell structure. A post-processing step then combines the digital image content from a separate camera with the 3D mesh shell surfaces. An indirect uv-mapping procedure first divides the model faces into groups within which every face shares the same normal direction. The corresponding images of the faces in a group are then adjusted using the uv map as guidance. The final assembled image is then glued back onto the 3D mesh to present a full-color building model. The result is a virtual building that reflects the true dimensions and surface material conditions of a real-world campus building. The image-based modeling procedure uses a commercial photogrammetry package to reconstruct the 3D model. A novel view-planning algorithm is developed to guide the photo-taking procedure. This algorithm generates a minimum set of view angles; the set of pictures taken at these view angles guarantees that each model face appears in at least two of the pictures and no more than three. The 3D model can then be reconstructed with a minimum amount of labor spent correlating picture pairs. The finished model is compared with the original object in both topological and dimensional aspects. All test cases show the exact same topology and a reasonably low dimensional error ratio, again proving the applicability of the algorithm.
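
    The abstract's indirect uv-mapping step groups model faces that share a normal direction; the sketch below shows one simple way to form such groups by quantising unit normals, with the rounding precision as an assumed tolerance rather than the paper's actual criterion.

```python
import numpy as np
from collections import defaultdict

def group_faces_by_normal(normals, decimals=2):
    """Group mesh face indices by (quantised) unit normal direction.

    normals: (F, 3) per-face normal vectors. Faces whose normals round to
    the same direction end up in one group, which can then share a single
    uv patch when textures are assembled.
    """
    normals = np.asarray(normals, dtype=float)
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    groups = defaultdict(list)
    for idx, n in enumerate(np.round(normals, decimals)):
        groups[tuple(n)].append(idx)
    return dict(groups)

# two faces pointing +z and one pointing +x end up in two groups
print(group_faces_by_normal([[0, 0, 1], [0, 0, 0.999], [1, 0, 0]]))
```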

  13. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks.

    PubMed

    Zhang, Cuicui; Liang, Xuefeng; Matsuyama, Takashi

    2014-12-08

    Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine if a given individual has already appeared over the camera network. Individual recognition often uses faces as a trial and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitation of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the "small sample size" (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still leave two questions open: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0-1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system.
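
    The paper's tailored 0-1 knapsack formulation is not spelled out in the abstract; the sketch below therefore shows only a generic 0-1 knapsack selection of base classifiers, with validation accuracy as the value and an assumed integer redundancy cost as the weight.

```python
def select_classifiers(accuracies, redundancies, budget):
    """Pick a subset of base classifiers via a generic 0-1 knapsack.

    accuracies:   list of validation accuracies (the knapsack 'values')
    redundancies: list of integer redundancy costs (the knapsack 'weights')
    budget:       maximum total redundancy allowed in the ensemble
    Returns indices of the selected classifiers.
    """
    n = len(accuracies)
    # dp[w] = (best total accuracy, chosen indices) using redundancy budget w
    dp = [(0.0, [])] + [(float("-inf"), []) for _ in range(budget)]
    for i in range(n):
        for w in range(budget, redundancies[i] - 1, -1):
            cand = dp[w - redundancies[i]][0] + accuracies[i]
            if cand > dp[w][0]:
                dp[w] = (cand, dp[w - redundancies[i]][1] + [i])
    return max(dp, key=lambda t: t[0])[1]

# hypothetical accuracies and redundancy costs for four base classifiers
print(select_classifiers([0.81, 0.78, 0.80, 0.72], [3, 2, 2, 1], budget=5))
```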

  14. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks

    PubMed Central

    Zhang, Cuicui; Liang, Xuefeng; Matsuyama, Takashi

    2014-01-01

    Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine if a given individual has already appeared over the camera network. Individual recognition often uses faces as a trial and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitation of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the “small sample size” (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still leave two questions open: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0–1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system. PMID:25494350

  15. FaceWarehouse: a 3D facial expression database for visual computing.

    PubMed

    Cao, Chen; Weng, Yanlin; Zhou, Shun; Tong, Yiying; Zhou, Kun

    2014-03-01

    We present FaceWarehouse, a database of 3D facial expressions for visual computing applications. We use Kinect, an off-the-shelf RGBD camera, to capture 150 individuals aged 7-80 from various ethnic backgrounds. For each person, we captured the RGBD data of her different expressions, including the neutral expression and 19 other expressions such as mouth-opening, smile, kiss, etc. For every RGBD raw data record, a set of facial feature points on the color image such as eye corners, mouth contour, and the nose tip are automatically localized, and manually adjusted if better accuracy is required. We then deform a template facial mesh to fit the depth data as closely as possible while matching the feature points on the color image to their corresponding points on the mesh. Starting from these fitted face meshes, we construct a set of individual-specific expression blendshapes for each person. These meshes with consistent topology are assembled as a rank-3 tensor to build a bilinear face model with two attributes: identity and expression. Compared with previous 3D facial databases, for every person in our database, there is a much richer matching collection of expressions, enabling depiction of most human facial actions. We demonstrate the potential of FaceWarehouse for visual computing with four applications: facial image manipulation, face component transfer, real-time performance-based facial image animation, and facial animation retargeting from video to image.
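
    A minimal sketch of evaluating a bilinear face model assembled as a rank-3 tensor (vertices x identities x expressions); the core tensor and weights below are random stand-ins, not the FaceWarehouse data.

```python
import numpy as np

# Illustrative dimensions only: 1000 vertices (x, y, z), 50 identities, 20 expressions.
n_verts, n_id, n_expr = 3 * 1000, 50, 20
core = np.random.default_rng(1).standard_normal((n_verts, n_id, n_expr))

def synthesize_face(core, w_id, w_expr):
    """Contract the core tensor with identity and expression weight vectors."""
    return np.einsum("vie,i,e->v", core, w_id, w_expr)

w_id = np.random.default_rng(2).dirichlet(np.ones(n_id))   # blend of identities
w_expr = np.eye(n_expr)[3]                                  # pick expression index 3
mesh_vertices = synthesize_face(core, w_id, w_expr).reshape(-1, 3)
print(mesh_vertices.shape)   # (1000, 3)
```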

  16. Mastcam Telephoto of a Martian Dune Downwind Face

    NASA Image and Video Library

    2016-01-04

    This view combines multiple images from the telephoto-lens camera of the Mast Camera (Mastcam) on NASA's Curiosity Mars rover to reveal fine details of the downwind face of "Namib Dune." The site is part of the dark-sand "Bagnold Dunes" field along the northwestern flank of Mount Sharp. Images taken from orbit have shown that dunes in the Bagnold field move as much as about 3 feet (1 meter) per Earth year. Sand on this face of Namib Dune has cascaded down a slope of about 26 to 28 degrees. The top of the face is about 13 to 17 feet (4 to 5 meters) above the rocky ground at its base. http://photojournal.jpl.nasa.gov/catalog/PIA20283

  17. DETAIL OF DOORS ON EAST ELEVATION AT SOUTH END; CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF DOORS ON EAST ELEVATION AT SOUTH END; CAMERA FACING WEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  18. INTERIOR VIEW OF FIRST STORY SPACE SHOWING CONCRETE BEAMS; CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF FIRST STORY SPACE SHOWING CONCRETE BEAMS; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  19. IET. Weather instrumentation tower, located south of control building. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Weather instrumentation tower, located south of control building. Camera facing west. Date: August 17, 1955. INEEL negative no. 55-2414 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  20. Detail of second story balcony porch at southeast corner; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of second story balcony porch at southeast corner; camera facing northwest. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  1. Detail of main entry at center of southeast elevation; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of main entry at center of southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  2. Detail of basement level concrete beams at southwest corner; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of basement level concrete beams at southwest corner; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  3. Detail of windows at north portion of west elevation; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of windows at north portion of west elevation; camera facing east. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  4. Detail of portico at main entry on east elevation; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of portico at main entry on east elevation; camera facing southwest. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  5. Interior detail of lobby terrazzo floor inlaid NPG insignia; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of lobby terrazzo floor inlaid NPG insignia; camera facing west. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  6. Interior detail of fourpart moveable doors in east hallway; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of four-part moveable doors in east hallway; camera facing south. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  7. Detail of southeast corner with spandrel and window pattern; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of southeast corner with spandrel and window pattern; camera facing northeast. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  8. Detail of tower and entry doors on north elevation; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of tower and entry doors on north elevation; camera facing southwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  9. Detail of northeast corner with spandrel and window pattern; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of northeast corner with spandrel and window pattern; camera facing southwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  10. Interior detail of main entrance doors on south wall; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of main entrance doors on south wall; camera facing south. - Mare Island Naval Shipyard, Old Administrative Offices, Eighth Street, north side between Railroad Avenue & Walnut Avenue, Vallejo, Solano County, CA

  11. EAST FACE OF REACTOR BASE. COMING TOWARD CAMERA IS EXCAVATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    EAST FACE OF REACTOR BASE. COMING TOWARD CAMERA IS EXCAVATION FOR MTR CANAL. CAISSONS FLANK EACH SIDE. COUNTERFORT (SUPPORT PERPENDICULAR TO WHAT WILL BE THE LONG WALL OF THE CANAL) RESTS ATOP LEFT CAISSON. IN LOWER PART OF VIEW, DRILLERS PREPARE TRENCHES FOR SUPPORT BEAMS THAT WILL LIE BENEATH CANAL FLOOR. INL NEGATIVE NO. 739. Unknown Photographer, 10/6/1950 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  12. PBF Reactor Building (PER620). Camera facing southeast in second basement. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera facing southeast in second basement. Round form and reinforcing steel surround reactor vessel pit, which will be heavily shielded by several feet of concrete. Block-out is for door to sub-pile room. Rectangular form and rebar beyond pit is for canal wall. Photographer: John Capek. Date: March 10, 1967. INEEL negative no. 67-1643 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  13. FAST CHOPPER BUILDING, TRA665. CAMERA FACING NORTH. NOTE BRICKEDIN WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FAST CHOPPER BUILDING, TRA-665. CAMERA FACING NORTH. NOTE BRICKED-IN WINDOW ON RIGHT SIDE (BELOW PAINTED NUMERALS "665"). SLIDING METAL DOOR ON COVERED RAIL AT UPPER LEVEL. SHELTERED ENTRANCE TO STEEL SHIELDING DOOR. DOOR INTO MTR SERVICE BUILDING, TRA-635, STANDS OPEN. MTR BEHIND CHOPPER BUILDING. INL NEGATIVE NO. HD42-1. Mike Crane, Photographer, 3/2004 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  14. ETR COMPRESSOR BUILDING, TRA643. CAMERA FACES NORTH. AIR HEATERS LINE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR COMPRESSOR BUILDING, TRA-643. CAMERA FACES NORTH. AIR HEATERS LINE UP AGAINST WALL, TO BE USED IN CONNECTION WITH ETR EXPERIMENTS. EACH HAD A HEAT OUTPUT OF 8 MILLION BTU PER HOUR, OPERATED AT 1260 DEGREES F. AND A PRESSURE OF 320 PSI. NOTE METAL WALLS AND ROOF. INL NEGATIVE NO. 56-3709. R.G. Larsen, Photographer, 11/13/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  15. A 3D camera for improved facial recognition

    NASA Astrophysics Data System (ADS)

    Lewin, Andrew; Orchard, David A.; Scott, Andrew M.; Walton, Nicholas A.; Austin, Jim

    2004-12-01

    We describe a camera capable of recording 3D images of objects. It does this by projecting thousands of spots onto an object and then measuring the range to each spot by determining the parallax from a single frame. A second frame can be captured to record a conventional image, which can then be projected onto the surface mesh to form a rendered skin. The camera is capable of locating the images of the spots to a precision of better than one tenth of a pixel, and from this it can determine range to an accuracy of less than 1 mm at 1 meter. The data can be recorded as a set of two images, and is reconstructed by forming a 'wire mesh' of range points and morphing the 2D image over this structure. The camera can be used to record images of faces and reconstruct the shape of the face, which allows viewing of the face from various angles. This allows images to be more critically inspected for the purpose of identifying individuals. Multiple images can be stitched together to create full panoramic images of head-sized objects that can be viewed from any direction. The system is being tested with a graph-matching system capable of fast and accurate shape comparisons for facial recognition. It can also be used with "models" of heads and faces to provide a means of obtaining biometric data.
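
    For a sense of scale, the triangulation behind a projected-spot range camera follows Z = f*B/d; the focal length and baseline below are assumed values chosen so that a tenth-of-a-pixel disparity error corresponds to roughly half a millimetre at 1 m, in line with the sub-millimetre accuracy quoted above.

```python
# Assumed, illustrative parameters (not the instrument's actual values).
focal_length_px = 2000.0      # camera focal length in pixels
baseline_m = 0.10             # projector-camera baseline in metres

def spot_range(disparity_px):
    """Range in metres from a sub-pixel spot disparity via Z = f * B / d."""
    return focal_length_px * baseline_m / disparity_px

# a 0.1-pixel disparity error near d = 200 px changes the range by ~0.5 mm at 1 m
print(spot_range(200.0), spot_range(200.1))
```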

  16. PBF Cooling Tower detail. Camera facing southwest into north side ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower detail. Camera facing southwest into north side of Tower. Five horizontal layers of splash bars constitute fill decks, which will break up falling water into droplets, promoting evaporative cooling. Louvered faces, through which air enters tower, are on east and west sides. Louvers have been installed. Support framework for one of two venturi-shaped fan stacks (or "vents") is in center top. Orifices in hot basins (not in view) will distribute water over fill. Photographer: Kirsh. Date: May 15, 1969. INEEL negative no. 69-3032 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  17. Face and construct validation of a next generation virtual reality (Gen2-VR) surgical simulator.

    PubMed

    Sankaranarayanan, Ganesh; Li, Baichun; Manser, Kelly; Jones, Stephanie B; Jones, Daniel B; Schwaitzberg, Steven; Cao, Caroline G L; De, Suvranu

    2016-03-01

    Surgical performance is affected by distractors and interruptions to surgical workflow that exist in the operating room. However, traditional surgical simulators are used to train surgeons in a skills laboratory that does not recreate these conditions. To overcome this limitation, we have developed a novel, immersive virtual reality (Gen2-VR) system to train surgeons in these environments. The aim of this study was to establish face and construct validity of our system. The study was a within-subjects design, with subjects repeating a virtual peg transfer task under three different conditions: Case I: traditional VR; Case II: Gen2-VR with no distractions and Case III: Gen2-VR with distractions and interruptions. In Case III, to simulate the effects of distractions and interruptions, music was played intermittently, the camera lens was fogged for 10 s and tools malfunctioned for 15 s at random points in time during the simulation. At the completion of the study subjects filled in a 5-point Likert scale feedback questionnaire. A total of sixteen subjects participated in this study. Friedman test showed significant difference in scores between the three conditions (p < 0.0001). Post hoc analysis using Wilcoxon signed-rank tests with Bonferroni correction further showed that all three conditions were significantly different from each other (Case I, Case II, p < 0.0001), (Case I, Case III, p < 0.0001) and (Case II, Case III, p = 0.009). Subjects rated that fog (mean 4.18) and tool malfunction (median 4.56) significantly hindered their performance. The results showed that the Gen2-VR simulator has both face and construct validity and that it can accurately and realistically present distractions and interruptions in a simulated OR, in spite of limitations of the current HMD hardware technology.

  18. Face and Construct Validation of a Next Generation Virtual Reality (Gen2-VR©) Surgical Simulator

    PubMed Central

    Sankaranarayanan, Ganesh; Li, Baichun; Manser, Kelly; Jones, Stephanie B.; Jones, Daniel B.; Schwaitzberg, Steven; Cao, Caroline G. L.; De, Suvranu

    2015-01-01

    Introduction Surgical performance is affected by distractors and interruptions to surgical workflow that exist in the operating room. However, traditional surgical simulators are used to train surgeons in a skills lab that does not recreate these conditions. To overcome this limitation, we have developed a novel, immersive virtual reality (Gen2-VR©) system to train surgeons in these environments. The aim of this study was to establish face and construct validity of our system. Methods and Procedures The study was a within-subjects design, with subjects repeating a virtual peg transfer task under three different conditions: CASE I: traditional VR; CASE II: Gen2-VR© with no distractions and CASE III: Gen2-VR© with distractions and interruptions. In Case III, to simulate the effects of distractions and interruptions, music was played intermittently, the camera lens was fogged for 10 seconds and tools malfunctioned for 15 seconds at random points in time during the simulation. At the completion of the study subjects filled in a 5-point Likert scale feedback questionnaire. A total of sixteen subjects participated in this study. Results Friedman test showed significant difference in scores between the three conditions (p < 0.0001). Post hoc analysis using Wilcoxon Signed Rank tests with Bonferroni correction further showed that all three conditions were significantly different from each other (Case I, Case II, p < 0.001), (Case I, Case III, p < 0.001) and (Case II, Case III, p = 0.009). Subjects rated that fog (mean = 4.18) and tool malfunction (median = 4.56) significantly hindered their performance. Conclusion The results showed that the Gen2-VR© simulator has both face and construct validity and it can accurately and realistically present distractions and interruptions in a simulated OR, in spite of limitations of the current HMD hardware technology. PMID:26092010

  19. Fringe projection profilometry with portable consumer devices

    NASA Astrophysics Data System (ADS)

    Liu, Danji; Pan, Zhipeng; Wu, Yuxiang; Yue, Huimin

    2018-01-01

    Fringe projection profilometry (FPP) using portable consumer devices is attractive because it can bring optical three-dimensional (3D) measurement to ordinary consumers in their daily lives. We demonstrate an FPP system using a camera in a smart mobile phone and a digital consumer mini projector. In our experiment testing the smart phone (iphone7) camera performance, the rear-facing camera in the iphone7 gives the FPP a fringe contrast ratio of 0.546, a nonlinear carrier phase aberration value of 0.6 rad, a nonlinear phase error of 0.08 rad, and an RMS random phase error of 0.033 rad. In contrast, the FPP using the industrial camera has a fringe contrast ratio of 0.715, a nonlinear carrier phase aberration value of 0.5 rad, a nonlinear phase error of 0.05 rad, and an RMS random phase error of 0.011 rad. Good performance is achieved by using the FPP composed of an iphone7 and a mini projector. 3D information of a facemask sized for an adult is also measured by using the FPP that uses portable consumer devices. After system calibration, the 3D absolute information of the facemask is obtained. The measured results are in good agreement with those obtained in a traditional way. Our results show that it is possible to use portable consumer devices to construct a good FPP, which is useful for ordinary people to get 3D information in their daily lives.
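
    The abstract reports phase-error figures without the underlying formulas; the sketch below shows the standard N-step phase-shifting recovery of the wrapped phase and a simple per-pixel fringe contrast ratio, applied to synthetic fringes rather than to the phone or projector named above.

```python
import numpy as np

def wrapped_phase(frames):
    """Recover the wrapped phase from N equally phase-shifted fringe images.

    frames: (N, H, W) intensity images with phase shifts 2*pi*k/N.
    Standard N-step phase-shifting formula; calibration and error analysis
    from the paper are not modelled here.
    """
    frames = np.asarray(frames, dtype=float)
    n = frames.shape[0]
    k = np.arange(n).reshape(-1, 1, 1)
    num = np.sum(frames * np.sin(2 * np.pi * k / n), axis=0)
    den = np.sum(frames * np.cos(2 * np.pi * k / n), axis=0)
    return -np.arctan2(num, den)

def fringe_contrast(frames):
    """Per-pixel fringe contrast ratio (Imax - Imin) / (Imax + Imin)."""
    i_max, i_min = frames.max(axis=0), frames.min(axis=0)
    return (i_max - i_min) / (i_max + i_min + 1e-12)

# synthetic 4-step fringes over a smooth phase ramp
h, w, n = 64, 64, 4
phi = np.linspace(0, 6 * np.pi, w)[None, :].repeat(h, axis=0)
frames = np.stack([1 + 0.7 * np.cos(phi + 2 * np.pi * k / n) for k in range(n)])
print(np.allclose(np.angle(np.exp(1j * (wrapped_phase(frames) - phi))), 0, atol=1e-6))
```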

  20. Application of robust face recognition in video surveillance systems

    NASA Astrophysics Data System (ADS)

    Zhang, De-xin; An, Peng; Zhang, Hao-xiang

    2018-03-01

    In this paper, we propose a video searching system that utilizes face recognition as its searching indexing feature. As the applications of video cameras have greatly increased in recent years, face recognition makes a perfect fit for searching for targeted individuals within the vast amount of video data. However, the performance of such searching depends on the quality of face images recorded in the video signals. Since surveillance video cameras record videos without fixed postures for the object, face occlusion is very common in everyday video. The proposed system builds a model for occluded faces using fuzzy principal component analysis (FPCA), and reconstructs the human faces with the available information. Experimental results show that the system has very high efficiency in processing real-life videos, and it is very robust to various kinds of face occlusions. Hence it can relieve human reviewers from sitting in front of the monitors and greatly enhances efficiency as well. The proposed system has been installed and applied in various environments and has already demonstrated its power by helping solve real cases.
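
    A minimal sketch of reconstructing occluded pixels from a subspace model, with ordinary PCA standing in for the paper's fuzzy PCA (FPCA): coefficients are fitted on the visible pixels only and the model's prediction fills the occluded region.

```python
import numpy as np

def reconstruct_occluded(face, mask, mean, components):
    """Fill occluded pixels of a vectorised face image from a PCA model.

    face: (D,) face vector; occluded entries may hold arbitrary values
    mask: (D,) boolean, True where pixels are visible
    mean: (D,) training mean face
    components: (K, D) principal components (one per row)
    """
    A = components[:, mask].T                       # model restricted to visible pixels
    b = face[mask] - mean[mask]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)  # fit coefficients on visible pixels
    full = mean + components.T @ coeffs             # reconstruct the whole face
    out = face.copy()
    out[~mask] = full[~mask]                        # replace only the occluded region
    return out
```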

  1. The Large Synoptic Survey Telescope (LSST) Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.

  2. Face Liveness Detection Using Defocus

    PubMed Central

    Kim, Sooyeon; Ban, Yuseok; Lee, Sangyoun

    2015-01-01

    In order to develop security systems for identity authentication, face recognition (FR) technology has been applied. One of the main problems of applying FR technology is that the systems are especially vulnerable to attacks with spoofing faces (e.g., 2D pictures). To defend from these attacks and to enhance the reliability of FR systems, many anti-spoofing approaches have been recently developed. In this paper, we propose a method for face liveness detection using the effect of defocus. From two images sequentially taken at different focuses, three features, focus, power histogram and gradient location and orientation histogram (GLOH), are extracted. Afterwards, we detect forged faces through the feature-level fusion approach. For reliable performance verification, we develop two databases with a handheld digital camera and a webcam. The proposed method achieves a 3.29% half total error rate (HTER) at a given depth of field (DoF) and can be extended to camera-equipped devices, like smartphones. PMID:25594594
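
    A minimal sketch of a defocus cue, assuming two registered grayscale captures at different focus settings: a variance-of-Laplacian focus measure is compared between the two images. The thresholding and the paper's actual focus, power-histogram and GLOH features with feature-level fusion are not reproduced here.

```python
import numpy as np

def variance_of_laplacian(img):
    """Simple focus measure: variance of a 3x3 Laplacian response."""
    img = np.asarray(img, dtype=float)
    lap = (-4 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def defocus_feature(img_near_focus, img_far_focus):
    """Ratio of focus measures between two captures at different focuses.

    A flat printed photo changes little between the two focus settings,
    while a real 3D face shows a larger difference; thresholding this
    ratio is only a crude stand-in for the paper's fused features.
    """
    f1 = variance_of_laplacian(img_near_focus)
    f2 = variance_of_laplacian(img_far_focus)
    return f1 / (f2 + 1e-12)
```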

  3. PBF Reactor Building (PER620). In subpile room, camera faces southeast ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). In sub-pile room, camera faces southeast and looks up toward bottom of reactor vessel. Upper assembly in center of view is in-pile tube as it connects to vessel. Lower lateral constraints and rotating control cable are in position. Other connections have been bolted together. Note light bulbs for scale. Photographer: John Capek. Date: August 21, 1970. INEEL negative no. 70-3494 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  4. PBF Reactor Building (PER620). Camera on main floor faces south ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera on main floor faces south (open) doorway. In foreground is canal gate, lined with stainless steel and painted with protective coatings. Reactor pit is the round form discernible beyond. Lifting beams and rigging are in place for a load test before reactor vessel arrives. Photographer: John Capek. Date: January 26, 1970. INEEL negative no. 70-347 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  5. HOT CELL BUILDING, TRA632. EAST END OF BUILDING. CAMERA FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT CELL BUILDING, TRA-632. EAST END OF BUILDING. CAMERA FACING WEST. TRUCK ENCLOSURE (1986) TO THE LEFT, SMALL ADDITION IN ITS SHADOW IS ENCLOSURE OVER METAL PORT INTO HOT CELL NO. 1 (THE OLDEST HOT CELL). NOTE PERSONNEL LADDER AND PLATFORM AT LOFT LEVEL USED WHEN SERVICING AIR FILTERS AND VENTS OF CELL NO. 1. INL NEGATIVE NO. HD46-32-4. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  6. REACTOR SERVICE BUILDING, TRA635, INTERIOR. CAMERA FACES NORTHWEST TOWARDS INTERIOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    REACTOR SERVICE BUILDING, TRA-635, INTERIOR. CAMERA FACES NORTHWEST TOWARDS INTERIOR WALL ENCLOSING STORAGE AND OFFICE SPACE ALONG THE WEST SIDE. AT RIGHT EDGE IS DOOR TO MTR BUILDING. FROM RIGHT TO LEFT, SPACE WAS PLANNED FOR A LOCKER ROOM, MTR ISSUE ROOM, AND STORAGE AREAS AND RELATED OFFICES. NOTE SECOND "MEZZANINE" FLOOR ABOVE. INL NEGATIVE NO. 10227. Unknown Photographer, 3/23/1954 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. DEMINERALIZER BUILDING,TRA608. CAMERA FACES EAST ALONG SOUTH WALL. INSTRUMENT PANEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DEMINERALIZER BUILDING, TRA-608. CAMERA FACES EAST ALONG SOUTH WALL. INSTRUMENT PANEL BOARD IS IN RIGHT HALF OF VIEW, WITH FOUR PUMPS BEYOND. SMALLER PUMPS FILL DEMINERALIZED WATER TANK ON SOUTH SIDE OF BUILDING. CARD IN LOWER RIGHT WAS INSERTED BY INL PHOTOGRAPHER TO COVER AN OBSOLETE SECURITY RESTRICTION PRINTED ON ORIGINAL NEGATIVE. INL NEGATIVE NO. 3997A. Unknown Photographer, 12/28/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  8. An automated system to measure the quantum efficiency of CCDs for astronomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, R.; Chiang, J.; Cinabro, D.

    We describe a system to measure the Quantum Efficiency in the wavelength range of 300 nm to 1100 nm of 40 × 40 mm n-channel CCD sensors for the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity in the wavelength range of interest across the face of the sensor. This allows the absolute Quantum Efficiency to be measured with an accuracy in the 1% range. Finally, this system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.
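
    A minimal sketch of the quantum-efficiency arithmetic behind such a station, assuming the photon flux is known from a calibrated photodiode sampling the same uniform illumination; all numerical values are illustrative, not measured LSST data.

```python
# Physical constants
H_PLANCK = 6.62607015e-34   # J s
C_LIGHT = 2.99792458e8      # m / s

def quantum_efficiency(signal_electrons, diode_power_w, wavelength_nm,
                       exposure_s, pixel_area_m2, diode_area_m2):
    """QE = photoelectrons per pixel / photons incident per pixel."""
    photon_energy = H_PLANCK * C_LIGHT / (wavelength_nm * 1e-9)       # J per photon
    flux = diode_power_w / (photon_energy * diode_area_m2)            # photons / m^2 / s
    photons_per_pixel = flux * pixel_area_m2 * exposure_s
    return signal_electrons / photons_per_pixel

# illustrative numbers: 10 um pixel, 1 nW on a 10 mm^2 diode, 1 s exposure at 550 nm
print(quantum_efficiency(signal_electrons=2.5e4, diode_power_w=1.0e-9,
                         wavelength_nm=550, exposure_s=1.0,
                         pixel_area_m2=1.0e-10, diode_area_m2=1.0e-5))
```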

  9. An automated system to measure the quantum efficiency of CCDs for astronomy

    DOE PAGES

    Coles, R.; Chiang, J.; Cinabro, D.; ...

    2017-04-18

    We describe a system to measure the Quantum Efficiency in the wavelength range of 300 nm to 1100 nm of 40 × 40 mm n-channel CCD sensors for the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity in the wavelength range of interest across the face of the sensor. This allows the absolute Quantum Efficiency to be measured with an accuracy in the 1% range. Finally, this system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.

  10. DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM ESOUTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, HB-3, FACING SOUTHWEST - Cape Canaveral Air Force Station, Launch Complex 39, Vehicle Assembly Building, VAB Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  11. A 3-dimensional anthropometric evaluation of facial morphology among Chinese and Greek population.

    PubMed

    Liu, Yun; Kau, Chung How; Pan, Feng; Zhou, Hong; Zhang, Qiang; Zacharopoulos, Georgios Vasileiou

    2013-07-01

    The use of 3-dimensional (3D) facial imaging has taken on greater importance as orthodontists use the soft tissue paradigm in the evaluation of skeletal disproportion. Studies have shown that faces differ between populations. To date, no anthropometric evaluations have been made of Chinese and Greek faces. The aim of this study was to compare facial morphologies of Greeks and Chinese using 3D facial anthropometric landmarks. Three-dimensional facial images were acquired via a commercially available stereophotogrammetric camera capture system. The 3dMD face system captured 245 subjects from 2 population groups (Chinese [n = 72] and Greek [n = 173]), and each population was categorized into male and female groups for evaluation. All subjects in the group were between 18 and 30 years old and had no apparent facial anomalies. Twenty-five anthropometric landmarks were identified on the 3D faces of each subject. Soft tissue nasion was set as the "zeroed" reference landmark. Twenty landmark distances were constructed and evaluated within 3 dimensions of space. Six angles, 4 proportions, and 1 construct were also calculated. Student t test was used to analyze each data set obtained within each subgroup. Distinct facial differences were noted between the subgroups evaluated. When comparing the sexes across the 2 populations (e.g., male Greeks and male Chinese), significant differences were noted in more than 80% of the landmark distances calculated. One hundred percent of the angular measurements were significantly different, and the Chinese faces were broader in width-to-height facial proportions. In evaluating the lips relative to the esthetic line, the Chinese population had more protrusive lips. There are differences in the facial morphologies of subjects obtained from a Chinese population versus those of a Greek population.

  12. The Large Synoptic Survey Telescope (LSST) Camera

    ScienceCinema

    None

    2018-06-13

    Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.

  13. Development of a camera casing suited for cryogenic and vacuum applications

    NASA Astrophysics Data System (ADS)

    Delaquis, S. C.; Gornea, R.; Janos, S.; Lüthi, M.; von Rohr, Ch Rudolf; Schenk, M.; Vuilleumier, J.-L.

    2013-12-01

    We report on the design, construction, and operation of a PID temperature controlled and vacuum-tight camera casing. The camera casing contains a commercial digital camera and a lighting system. The design of the camera casing and its components are discussed in detail. Pictures taken by this cryo-camera while immersed in argon vapour and liquid nitrogen are presented. The cryo-camera can provide a live view inside cryogenic set-ups and allows video to be recorded.

  14. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

  15. Public Speaking Anxiety: Comparing Face-to-Face and Web-Based Speeches

    ERIC Educational Resources Information Center

    Campbell, Scott; Larson, James

    2013-01-01

    This study is to determine whether or not students have a different level of anxiety between giving a speech to a group of people in a traditional face-to-face classroom setting to a speech given to an audience (visible on a projected screen) into a camera using distance or web-based technology. The study included approximately 70 students.…

  16. Viewpoint matters: objective performance metrics for surgeon endoscope control during robot-assisted surgery.

    PubMed

    Jarc, Anthony M; Curet, Myriam J

    2017-03-01

    Effective visualization of the operative field is vital to surgical safety and education. However, additional metrics for visualization are needed to complement other common measures of surgeon proficiency, such as time or errors. Unlike other surgical modalities, robot-assisted minimally invasive surgery (RAMIS) enables data-driven feedback to trainees through measurement of camera adjustments. The purpose of this study was to validate and quantify the importance of novel camera metrics during RAMIS. New (n = 18), intermediate (n = 8), and experienced (n = 13) surgeons completed 25 virtual reality simulation exercises on the da Vinci Surgical System. Three camera metrics were computed for all exercises and compared to conventional efficiency measures. Both camera metrics and efficiency metrics showed construct validity (p < 0.05) across most exercises (camera movement frequency 23/25, camera movement duration 22/25, camera movement interval 19/25, overall score 24/25, completion time 25/25). Camera metrics differentiated new and experienced surgeons across all tasks as well as efficiency metrics did. Finally, camera metrics significantly (p < 0.05) correlated with completion time (camera movement frequency 21/25, camera movement duration 21/25, camera movement interval 20/25) and overall score (camera movement frequency 20/25, camera movement duration 19/25, camera movement interval 20/25) for most exercises. We demonstrate construct validity of novel camera metrics and correlation between camera metrics and efficiency metrics across many simulation exercises. We believe camera metrics could be used to improve RAMIS proficiency-based curricula.
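
    A minimal sketch of how the three camera metrics named above could be computed from a log of camera-adjustment intervals; the event format is hypothetical and is not the da Vinci system's actual data stream.

```python
import numpy as np

def camera_metrics(events, task_duration_s):
    """Compute camera movement frequency, duration and interval metrics.

    events: list of (start_s, end_s) intervals during which the trainee
            was adjusting the endoscope (hypothetical log format).
    Returns frequency (movements per minute), mean movement duration (s)
    and mean interval between consecutive movements (s).
    """
    events = sorted(events)
    durations = [end - start for start, end in events]
    intervals = [events[i + 1][0] - events[i][1] for i in range(len(events) - 1)]
    return {
        "frequency_per_min": 60.0 * len(events) / task_duration_s,
        "mean_duration_s": float(np.mean(durations)) if durations else 0.0,
        "mean_interval_s": float(np.mean(intervals)) if intervals else 0.0,
    }

print(camera_metrics([(5.0, 6.2), (30.5, 31.0), (70.0, 71.5)], task_duration_s=120.0))
```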

  17. Parting Moon Shots from NASAs GRAIL Mission

    NASA Image and Video Library

    2013-01-10

    Video of the moon taken by the NASA GRAIL mission's MoonKam (Moon Knowledge Acquired by Middle School Students) camera aboard the Ebb spacecraft on Dec. 14, 2012. Features forward-facing and rear-facing views.

  18. Cross-modal face recognition using multi-matcher face scores

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Blasch, Erik

    2015-05-01

    The performance of face recognition can be improved using information fusion of multimodal images and/or multiple algorithms. When multimodal face images are available, cross-modal recognition is meaningful for security and surveillance applications. For example, a probe face may be a thermal image (especially at nighttime), while only visible face images are available in the gallery database. Matching a thermal probe face onto the visible gallery faces requires cross-modal matching approaches. A few such studies have been implemented in facial feature space with medium recognition performance. In this paper, we propose a cross-modal recognition approach, where multimodal faces are cross-matched in feature space and the recognition performance is enhanced with stereo fusion at image, feature and/or score level. In the proposed scenario, there are two cameras for stereo imaging, two face imagers (visible and thermal) in each camera, and three recognition algorithms (circular Gaussian filter, face pattern byte, linear discriminant analysis). A score vector is formed with three cross-matched face scores from the aforementioned three algorithms. A classifier (e.g., k-nearest neighbor, support vector machine, binomial logistic regression [BLR]) is trained and then tested with the score vectors using 10-fold cross validation. The proposed approach was validated with a multispectral stereo face dataset from 105 subjects. Our experiments show very promising results: ACR (accuracy rate) = 97.84%, FAR (false accept rate) = 0.84% when cross-matching the fused thermal faces onto the fused visible faces by using three face scores and the BLR classifier.
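
    A minimal sketch of classifying three-element cross-matched score vectors, using a plain k-nearest-neighbour vote in place of the SVM or BLR options mentioned above; the scores and labels are illustrative.

```python
import numpy as np

def knn_classify(train_scores, train_labels, probe_scores, k=3):
    """Classify 3-element cross-matched score vectors with a simple k-NN vote.

    train_scores: (N, 3) score vectors (one score per matching algorithm)
    train_labels: (N,) 1 = genuine pair, 0 = impostor pair
    probe_scores: (M, 3) score vectors to classify
    """
    train_scores = np.asarray(train_scores, float)
    probe_scores = np.asarray(probe_scores, float)
    labels = np.asarray(train_labels)
    preds = []
    for s in probe_scores:
        d = np.linalg.norm(train_scores - s, axis=1)
        nearest = labels[np.argsort(d)[:k]]
        preds.append(int(round(nearest.mean())))   # majority vote for 0/1 labels
    return np.array(preds)

# illustrative genuine (label 1) and impostor (label 0) score vectors
train = [[0.9, 0.8, 0.85], [0.88, 0.9, 0.8], [0.2, 0.3, 0.1], [0.25, 0.2, 0.3]]
labels = [1, 1, 0, 0]
print(knn_classify(train, labels, [[0.87, 0.82, 0.8], [0.3, 0.25, 0.2]]))
```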

  19. Applications of a shadow camera system for energy meteorology

    NASA Astrophysics Data System (ADS)

    Kuhn, Pascal; Wilbert, Stefan; Prahl, Christoph; Garsche, Dominik; Schüler, David; Haase, Thomas; Ramirez, Lourdes; Zarzalejo, Luis; Meyer, Angela; Blanc, Philippe; Pitz-Paal, Robert

    2018-02-01

    Downward-facing shadow cameras might play a major role in future energy meteorology. Shadow cameras directly image shadows on the ground from an elevated position. They are used to validate other systems (e.g. all-sky imager based nowcasting systems, cloud speed sensors or satellite forecasts) and can potentially provide short term forecasts for solar power plants. Such forecasts are needed for electricity grids with high penetrations of renewable energy and can help to optimize plant operations. In this publication, two key applications of shadow cameras are briefly presented.

  20. Late afternoon view of the interior of the westernmost wall ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Late afternoon view of the interior of the westernmost wall section to be removed; camera facing north. (Note: lowered camera position significantly to minimize background distractions including the porta-john, building, and telephone pole) - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  1. What are we missing? Advantages of more than one viewpoint to estimate fish assemblages using baited video

    PubMed Central

    Huveneers, Charlie; Fairweather, Peter G.

    2018-01-01

    Counting errors can bias assessments of species abundance and richness, which can affect assessments of stock structure, population structure and monitoring programmes. Many methods for studying ecology use fixed viewpoints (e.g. camera traps, underwater video), but there is little known about how this biases the data obtained. In the marine realm, most studies using baited underwater video, a common method for monitoring fish and nekton, have previously only assessed fishes using a single bait-facing viewpoint. To investigate the biases stemming from using fixed viewpoints, we added cameras to cover 360° views around the units. We found similar species richness for all observed viewpoints but the bait-facing viewpoint recorded the highest fish abundance. Sightings of infrequently seen and shy species increased with the additional cameras and the extra viewpoints allowed the abundance estimates of highly abundant schooling species to be up to 60% higher. We specifically recommend the use of additional cameras for studies focusing on shyer species or those particularly interested in increasing the sensitivity of the method by avoiding saturation in highly abundant species. Studies may also benefit from using additional cameras to focus observation on the downstream viewpoint. PMID:29892386

  2. LOFT. Interior view of entry (TAN624) rollup door. Camera is ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Interior view of entry (TAN-624) rollup door. Camera is inside entry building facing south. Rollup door was a modification of the original ANP door arrangement. Date: March 2004. INEEL negative no. HD-39-5-2 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  3. Identifying People with Soft-Biometrics at Fleet Week

    DTIC Science & Technology

    2013-03-01

    ... onboard sensors. This included a Color Camera: located in the right eye, Octavia stored 640x480 RGB images at ~4 Hz from a Point Grey Firefly camera. ... Face Detection: The Fleet Week experiments demonstrated the potential of soft biometrics for recognition, but all of the existing algorithms currently ...

  4. 1. VIEW OF ARVFS BUNKER TAKEN FROM GROUND ELEVATION. CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VIEW OF ARVFS BUNKER TAKEN FROM GROUND ELEVATION. CAMERA FACING NORTH. VIEW SHOWS PROFILE OF BUNKER IN RELATION TO NATURAL GROUND ELEVATION. TOP OF BUNKER HAS APPROXIMATELY THREE FEET OF EARTH COVER. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  5. Plenoptic camera based on a liquid crystal microlens array

    NASA Astrophysics Data System (ADS)

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Xie, Changsheng

    2015-09-01

    A liquid crystal microlens array (LCMLA) whose focal length is tuned by the voltage signals applied between its top and bottom electrodes is fabricated, and its common optical focusing characteristics are tested. The relationship between the focal length and the applied voltage signals is given. The LCMLA is integrated with an image sensor and further coupled with a main lens so as to construct a plenoptic camera. Several raw images acquired under different applied voltage signals are compared using the LCMLA-based plenoptic camera we constructed. Our experiments demonstrate that, by utilizing an LCMLA in a plenoptic camera, the focused zone of the LCMLA-based plenoptic camera can be shifted effectively simply by changing the voltage signals applied between the electrodes of the LCMLA, which is equivalent to an extension of the depth of field.

  6. Versatile microsecond movie camera

    NASA Astrophysics Data System (ADS)

    Dreyfus, R. W.

    1980-03-01

    A laboratory-type movie camera is described which satisfies many requirements in the range 1 microsec to 1 sec. The camera consists of a He-Ne laser and compatible state-of-the-art components; the primary components are an acoustooptic modulator, an electromechanical beam deflector, and a video tape system. The present camera is distinct in its operation in that submicrosecond laser flashes freeze the image motion while still allowing the simplicity of electromechanical image deflection in the millisecond range. The gating and pulse delay circuits of an oscilloscope synchronize the modulator and scanner relative to the subject being photographed. The optical table construction and electronic control enhance the camera's versatility and adaptability. The instant replay video tape recording allows for easy synchronization and immediate viewing of the results. Economy is achieved by using off-the-shelf components, optical table construction, and short assembly time.

  7. Accurate measurement of imaging photoplethysmographic signals based camera using weighted average

    NASA Astrophysics Data System (ADS)

    Pang, Zongguang; Kong, Lingqin; Zhao, Yuejin; Sun, Huijuan; Dong, Liquan; Hui, Mei; Liu, Ming; Liu, Xiaohua; Liu, Lingling; Li, Xiaohui; Li, Rongji

    2018-01-01

    Imaging photoplethysmography (IPPG) is an emerging technique for extracting the vital signs of human beings from video recordings. With advantages such as non-contact measurement, low cost, and easy operation, IPPG has become a research hot spot in the field of biomedicine. However, noise from non-microarterial areas cannot easily be removed because of the uneven distribution of micro-arteries and the different signal strength of each region, which results in a low signal-to-noise ratio of IPPG signals and low accuracy of the estimated heart rate. In this paper, we propose a method for improving the signal-to-noise ratio of camera-based IPPG signals by combining the sub-regions of the face using a weighted average. Firstly, we obtain the regions of interest (ROI) of a subject's face from the camera images. Secondly, each region of interest is tracked and feature-matched in each frame of the video, and each tracked face region is divided into 60x60-pixel blocks. Thirdly, the weight of the PPG signal of each sub-region is calculated from the signal-to-noise ratio of that sub-region. Finally, we combine the IPPG signals from all the tracked ROIs using a weighted average. Compared with existing approaches, the results show that the proposed method yields a modest but significant improvement in the signal-to-noise ratio of the camera-based PPG estimate and in the accuracy of heart rate measurement.
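
    The following is a minimal sketch of the SNR-weighted combination described above, under our own assumptions (the paper's exact weighting is not reproduced): each face block contributes its mean-intensity trace, and blocks are averaged with weights proportional to their estimated signal-to-noise ratio. The function name and the synthetic data are illustrative.

    ```python
    # Illustrative sketch: combine per-block IPPG traces into one signal,
    # weighting each 60x60-pixel block by its signal-to-noise ratio.
    import numpy as np

    def weighted_ippg(block_signals: np.ndarray, block_snr: np.ndarray) -> np.ndarray:
        """block_signals: (n_blocks, n_frames) mean intensity traces.
        block_snr: (n_blocks,) SNR estimate for each block."""
        weights = block_snr / block_snr.sum()     # normalise weights to sum to 1
        return weights @ block_signals            # weighted average over blocks

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        t = np.linspace(0, 10, 300)               # 10 s of video at 30 fps
        pulse = np.sin(2 * np.pi * 1.2 * t)       # 72 bpm reference pulse
        # Simulated blocks: the same pulse corrupted by different noise levels.
        noise_levels = np.array([0.2, 0.5, 1.0, 2.0])
        signals = pulse + rng.normal(size=(4, t.size)) * noise_levels[:, None]
        snr = 1.0 / noise_levels                  # crude SNR proxy for the demo
        fused = weighted_ippg(signals, snr)
        print(fused.shape)                        # -> (300,)
    ```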

  8. Why Are Faces Denser in the Visual Experiences of Younger than Older Infants?

    ERIC Educational Resources Information Center

    Jayaraman, Swapnaa; Fausey, Caitlin M.; Smith, Linda B.

    2017-01-01

    Recent evidence from studies using head cameras suggests that the frequency of faces directly in front of infants "declines" over the first year and a half of life, a result that has implications for the development of and evolutionary constraints on face processing. Two experiments tested 2 opposing hypotheses about this observed…

  9. Field test studies of our infrared-based human temperature screening system embedded with a parallel measurement approach

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Chaitavon, Kosom

    2009-07-01

    This paper introduces a parallel measurement approach for fast infrared-based human temperature screening suitable for use in large public areas. Our key idea is based on the combination of simple image processing algorithms, infrared technology, and human flow management. With this multidisciplinary concept, we arrange as many people as possible in a two-dimensional space in front of a thermal imaging camera and then highlight all human facial areas through simple image filtering, morphological, and particle analysis operations. In this way, each individual's face in the live thermal image can be located and the maximum facial skin temperature can be monitored and displayed. Our experiment shows a measured 1 ms processing time for highlighting all human face areas. With a thermal imaging camera having a 24° × 18° FOV lens and 320 × 240 active pixels, the maximum facial skin temperatures of three people's faces located 1.3 m from the camera can also be simultaneously monitored and displayed at a measured rate of 31 fps, limited by the looping process that determines the coordinates of all faces. In our 3-day test under ambient temperatures of 24-30 °C, 57-72% relative humidity, and weak wind from outside the hospital building, hyperthermic patients could be identified with 100% sensitivity and 36.4% specificity when the temperature threshold level and the offset temperature value were appropriately chosen. Locating our system away from building doors, air conditioners, and electric fans, in order to eliminate wind blowing toward the camera lens, can significantly help improve the system's specificity.
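
    A minimal sketch of the screening pipeline described above, assuming a calibrated temperature image and using generic thresholding, morphological clean-up, and connected-component ("particle") analysis; it is not the authors' implementation, and the threshold and blob-size values are placeholders.

    ```python
    # Sketch: threshold a thermal frame, clean it with morphology, label "face"
    # blobs, and report each blob's maximum temperature.
    import numpy as np
    from scipy import ndimage

    def max_face_temps(frame_c: np.ndarray, skin_threshold_c: float = 34.0,
                       min_pixels: int = 50):
        mask = frame_c > skin_threshold_c                  # simple image filtering
        mask = ndimage.binary_opening(mask, iterations=2)  # morphological clean-up
        labels, n = ndimage.label(mask)                    # particle analysis
        temps = []
        for i in range(1, n + 1):
            blob = labels == i
            if blob.sum() >= min_pixels:                   # ignore tiny warm spots
                temps.append(float(frame_c[blob].max()))
        return temps

    if __name__ == "__main__":
        frame = np.full((240, 320), 28.0)                  # synthetic 28 °C background
        frame[60:110, 80:120] = 36.5                       # one simulated face
        frame[50:100, 200:245] = 38.2                      # a warmer (hyperthermic) face
        print(max_face_temps(frame))                       # -> [36.5, 38.2]
    ```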

  10. SOUTH WING, MTR661. INTERIOR DETAIL INSIDE LAB ROOM 131. CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SOUTH WING, MTR-661. INTERIOR DETAIL INSIDE LAB ROOM 131. CAMERA FACING NORTHEAST. NOTE CONCRETE BLOCK WALLS. SAFETY SHOWER AND EYE WASHER AT REAR WALL. INL NEGATIVE NO. HD46-7-2. Mike Crane, Photographer, 2/2005. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. 26. VIEW OF METAL SHED OVER SHIELDING TANK WITH CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    26. VIEW OF METAL SHED OVER SHIELDING TANK WITH CAMERA FACING SOUTHWEST. SHOWS OPEN SIDE OF SHED ROOF, HERCULON SHEET, AND HAND-OPERATED CRANE. TAKEN IN 1983. INEL PHOTO NUMBER 83-476-2-9, TAKEN IN 1983. PHOTOGRAPHER NOT NAMED. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  12. HOT CELL BUILDING, TRA632, INTERIOR. CELL 3, "HEAVY" CELL. CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT CELL BUILDING, TRA-632, INTERIOR. CELL 3, "HEAVY" CELL. CAMERA FACES WEST TOWARD BUILDING EXIT. OBSERVATION WINDOW AT LEFT EDGE OF VIEW. INL NEGATIVE NO. HD46-28-4. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  13. A&M. Guard house (TAN638), contextual view. Built in 1968. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Guard house (TAN-638), contextual view. Built in 1968. Camera faces south. Guard house controlled access to radioactive waste storage tanks beyond and to left of view. Date: February 4, 2003. INEEL negative no. HD-33-4-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  14. Evidence for Recent Liquid Water on Mars: Gullies in Sirenum Fossae Trough

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This mosaic of two Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) images shows about 20 different gullies coming down the south-facing wall of a trough in the Sirenum Fossae/Gorgonum Chaos region of the martian southern hemisphere. Each channel and its associated fan--or apron--of debris appears to have started just below the same hard, resistant layer of bedrock located approximately 100 meters (about 325 feet) below the top of the trough wall. The layer beneath this hard, resistant bedrock is interpreted to be permeable, which allows ground water to percolate through it and--at the location of this trough--seep out onto the martian surface. The channels and aprons only occur on the south-facing slope of this valley created by faults on each side of the trough. The depression is approximately 1.4 km (0.9 mi) across.

    The mosaic was constructed from two pictures taken on September 16, 1999, and May 1, 2000. The black line is a gap between the two images that was not covered by MOC. The scene covers an area approximately 5.5 kilometers (3.4 miles) wide by 4.9 km (3.0 mi) high. Sunlight illuminates the area from the upper left. The image is located near 38.5°S, 171.3°W. MOC high resolution images are taken black-and-white (grayscale); the color seen here has been synthesized from the colors of Mars observed by the MOC wide angle cameras and by the Viking Orbiters in the late 1970s.

  15. The Potential of Low-Cost Rpas for Multi-View Reconstruction of Sub-Vertical Rock Faces

    NASA Astrophysics Data System (ADS)

    Thoeni, K.; Guccione, D. E.; Santise, M.; Giacomini, A.; Roncella, R.; Forlani, G.

    2016-06-01

    The current work investigates the potential of two low-cost off-the-shelf quadcopters for multi-view reconstruction of sub-vertical rock faces. The two platforms used are a DJI Phantom 1 equipped with a Gopro Hero 3+ Black and a DJI Phantom 3 Professional with an integrated camera. The study area is a small sub-vertical rock face. Several flights were performed with both cameras set in time-lapse mode. Hence, images were taken automatically, but the flights were performed manually because the investigated rock face is very irregular, which required manual adjustment of the yaw and roll for optimal coverage. The digital images were processed with commercial SfM software packages. Several processing settings were investigated in order to find the one providing the most accurate 3D reconstruction of the rock face. To this aim, all 3D models produced with both platforms are compared to a point cloud obtained with a terrestrial laser scanner. Firstly, the difference between the use of coded ground control targets and the use of natural features was studied. Coded targets generally provide the best accuracy, but they need to be placed on the surface, which is not always possible, as sub-vertical rock faces are not easily accessible. Nevertheless, natural features can provide a good alternative if wisely chosen, as shown in this work. Secondly, the influence of using fixed interior orientation parameters or self-calibration was investigated. The results show that, in the case of the used sensors and camera networks, self-calibration provides better results. To support this empirical finding, a numerical investigation using a Monte Carlo simulation was performed.

  16. ETR COMPLEX. CAMERA FACING EAST. FROM LEFT TO RIGHT: ETRCRITICAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR COMPLEX. CAMERA FACING EAST. FROM LEFT TO RIGHT: ETR-CRITICAL FACILITY BUILDING, ETR CONTROL BUILDING (ATTACHED TO HIGH-BAY ETR), ETR, ONE-STORY SECTION OF ETR BUILDING, ELECTRICAL BUILDING, COOLING TOWER PUMP HOUSE, COOLING TOWER. COMPRESSOR AND HEAT EXCHANGER BUILDING ARE PARTLY IN VIEW ABOVE ETR. DARK-COLORED DUCTS PROCEED FROM GROUND CONNECTION TO ETR WASTE GAS STACK. OTHER STACK IS MTR STACK WITH FAN HOUSE IN FRONT OF IT. RECTANGULAR STRUCTURE NEAR TOP OF VIEW IS SETTLING BASIN. INL NEGATIVE NO. 56-4102. Unknown Photographer, ca. 1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  17. ETR, TRA642. ETR COMPLEX NEARLY COMPLETE. CAMERA FACES NORTHWEST, PROBABLY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR, TRA-642. ETR COMPLEX NEARLY COMPLETE. CAMERA FACES NORTHWEST, PROBABLY FROM TOP DECK OF COOLING TOWER. SHADOW IS CAST BY COOLING TOWER UNITS OFF LEFT OF VIEW. HIGH-BAY REACTOR BUILDING IS SURROUNDED BY ITS ATTACHED SERVICES: ELECTRICAL (TRA-648), HEAT EXCHANGER (TRA-644 WITH U-SHAPED YARD), AND COMPRESSOR (TRA-643). THE CONTROL BUILDING (TRA-647) ON THE NORTH SIDE IS HIDDEN FROM VIEW. AT UPPER RIGHT IS MTR BUILDING, TRA-603. INL NEGATIVE NO. 56-3798. Jack L. Anderson, Photographer, 11/26/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  18. Terrestrial glint seen from deep space: Oriented ice crystals detected from the Lagrangian point

    NASA Astrophysics Data System (ADS)

    Marshak, Alexander; Várnai, Tamás; Kostinski, Alexander

    2017-05-01

    The Deep Space Climate Observatory (DSCOVR) spacecraft resides at the first Lagrangian point about one million miles from Earth. A polychromatic imaging camera onboard delivers nearly hourly observations of the entire sunlit face of the Earth. Many images contain unexpected bright flashes of light over both ocean and land. We construct a yearlong time series of flash latitudes, scattering angles, and oxygen absorption to demonstrate conclusively that the flashes over land are specular reflections off tiny ice platelets floating in the air nearly horizontally. Such deep space detection of tropospheric ice can be used to constrain the likelihood of oriented crystals and their contribution to Earth albedo. These glint observations also support proposals for detecting starlight glints off faint companions in our search for habitable exoplanets.

  19. HEAT EXCHANGER BUILDING, TRA644. NORTHEAST CORNER. CAMERA IS ON PIKE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HEAT EXCHANGER BUILDING, TRA-644. NORTHEAST CORNER. CAMERA IS ON PIKE STREET FACING SOUTHWEST. ATTACHED STRUCTURE AT RIGHT OF VIEW IS ETR COMPRESSOR BUILDING, TRA-643. INL NEGATIVE NO. HD46-36-4. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  20. PBF Cooling Tower (PER720) and its Auxiliary Building (PER625). Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower (PER-720) and its Auxiliary Building (PER-625). Camera facing west shows east facades. Center pipe carried secondary coolant water from reactor building to distributor basin. Date: August 2003. INEEL negative no. HD-35-10-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  1. ETRMTR MECHANICAL SERVICES BUILDING, TRA653, INTERIOR. CAMERA IS INSIDE MEN'S ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR-MTR MECHANICAL SERVICES BUILDING, TRA-653, INTERIOR. CAMERA IS INSIDE MEN'S LAVATORY AND SHOWER FACING SOUTHEAST. SHOWER AND TOILET STALLS ARE IN PLACE. ROUND COMMUNAL SINK AT LEFT. INL NEGATIVE NO. 57-3652. K. Mansfield, Photographer, 7/22/1957 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  2. Faces of the Fleet | Navy Live

    Science.gov Websites

    A coxswain assigned to Coastal Riverine Squadron Four (CRS-4) navigates a waterway during an annual training exercise at Ft. Knox, Ky. (U.S. Navy Combat Camera photo by Mass Communication Specialist ...)

  3. A Framework for People Re-Identification in Multi-Camera Surveillance Systems

    ERIC Educational Resources Information Center

    Ammar, Sirine; Zaghden, Nizar; Neji, Mahmoud

    2017-01-01

    People re-identification has recently been a very active research topic in computer vision. It is an important application in surveillance systems with disjoint cameras. This paper is focused on the implementation of a human re-identification system. First, the face of each detected person is divided into three parts and some soft-biometric traits are…

  4. Validation of Viewing Reports: Exploration of a Photographic Method.

    ERIC Educational Resources Information Center

    Fletcher, James E.; Chen, Charles Chao-Ping

    A time lapse camera loaded with Super 8 film was employed to photographically record the area in front of a conventional television receiver in selected homes. The camera took one picture each minute for three days, including in the same frame the face of the television receiver. Family members kept a conventional viewing diary of their viewing…

  5. Multiplane and Spectrally-Resolved Single Molecule Localization Microscopy with Industrial Grade CMOS cameras.

    PubMed

    Babcock, Hazen P

    2018-01-29

    This work explores the use of industrial grade CMOS cameras for single molecule localization microscopy (SMLM). We show that industrial grade CMOS cameras approach the performance of scientific grade CMOS cameras at a fraction of the cost. This makes it more economically feasible to construct high-performance imaging systems with multiple cameras that are capable of a diversity of applications. In particular we demonstrate the use of industrial CMOS cameras for biplane, multiplane and spectrally resolved SMLM. We also provide open-source software for simultaneous control of multiple CMOS cameras and for the reduction of the movies that are acquired to super-resolution images.

  6. Evaluation of a virtual-reality-based simulator using passive haptic feedback for knee arthroscopy.

    PubMed

    Fucentese, Sandro F; Rahm, Stefan; Wieser, Karl; Spillmann, Jonas; Harders, Matthias; Koch, Peter P

    2015-04-01

    The aim of this work is to determine face validity and construct validity of a new virtual-reality-based simulator for diagnostic and therapeutic knee arthroscopy. The study tests a novel arthroscopic simulator based on passive haptics. Sixty-eight participants were grouped into novices, intermediates, and experts. All participants completed two exercises. In order to establish face validity, all participants filled out a questionnaire concerning different aspects of simulator realism, training capacity, and different statements using a seven-point Likert scale (range 1-7). Construct validity was tested by comparing various simulator metric values between novices and experts. Face validity could be established: overall realism was rated with a mean value of 5.5 points. Global training capacity scored a mean value of 5.9. Participants considered the simulator useful for procedural training of diagnostic and therapeutic arthroscopy. In the foreign body removal exercise, experts were overall significantly faster in the whole procedure (6 min 24 s vs. 8 min 24 s, p < 0.001), took less time to complete the diagnostic tour (2 min 49 s vs. 3 min 32 s, p = 0.027), and had a shorter camera path length (186 vs. 246 cm, p = 0.006). The simulator achieved high scores in terms of realism. It was regarded as a useful training tool, which is also capable of differentiating between varying levels of arthroscopic experience. Nevertheless, further improvements of the simulator, especially in the field of therapeutic arthroscopy, are desirable. In general, the findings support that virtual-reality-based simulation using passive haptics has the potential to complement conventional training of knee arthroscopy skills. Level of evidence: II.

  7. Near infrared photography with a vacuum-cold camera. [Orion nebula observation

    NASA Technical Reports Server (NTRS)

    Rossano, G. S.; Russell, R. W.; Cornett, R. H.

    1980-01-01

    Sensitized cooled plates have been obtained of the Orion nebula region and of Sh2-149 in the wavelength ranges 8000 A-9000 A and 9,000 A-11,000 A with a recently designed and constructed vacuum-cold camera. Sensitization procedures are described and the camera design is presented.

  8. Three-dimensional face model reproduction method using multiview images

    NASA Astrophysics Data System (ADS)

    Nagashima, Yoshio; Agawa, Hiroshi; Kishino, Fumio

    1991-11-01

    This paper describes a method of reproducing three-dimensional face models using multi-view images for a virtual space teleconferencing system that achieves a realistic visual presence for teleconferencing. The goal of this research, as an integral component of a virtual space teleconferencing system, is to generate a three-dimensional face model from facial images, synthesize images of the model virtually viewed from different angles, and with natural shadow to suit the lighting conditions of the virtual space. The proposed method is as follows: first, front and side view images of the human face are taken by TV cameras. The 3D data of facial feature points are obtained from front- and side-views by an image processing technique based on the color, shape, and correlation of face components. Using these 3D data, the prepared base face models, representing typical Japanese male and female faces, are modified to approximate the input facial image. The personal face model, representing the individual character, is then reproduced. Next, an oblique view image is taken by TV camera. The feature points of the oblique view image are extracted using the same image processing technique. A more precise personal model is reproduced by fitting the boundary of the personal face model to the boundary of the oblique view image. The modified boundary of the personal face model is determined by using face direction, namely rotation angle, which is detected based on the extracted feature points. After the 3D model is established, the new images are synthesized by mapping facial texture onto the model.
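
    As a hedged illustration of one step only, the sketch below shows how 3D feature coordinates can be assembled from roughly orthographic front and side views: (x, y) from the front image and depth z from the side image, with y averaged between the two. This is a simplification of the paper's image-processing pipeline, and the coordinates used are hypothetical.

    ```python
    # Sketch: merge front-view and side-view feature points into 3D coordinates.
    import numpy as np

    def features_3d(front_xy: np.ndarray, side_zy: np.ndarray) -> np.ndarray:
        """front_xy: (n, 2) pixel coords in the front view.
        side_zy:  (n, 2) coords of the same points in the side view (horizontal = depth).
        Returns (n, 3) points; y is averaged between the two views."""
        x = front_xy[:, 0]
        z = side_zy[:, 0]
        y = 0.5 * (front_xy[:, 1] + side_zy[:, 1])   # the two views should agree on height
        return np.stack([x, y, z], axis=1)

    if __name__ == "__main__":
        front = np.array([[100.0, 120.0], [140.0, 120.0], [120.0, 160.0]])  # eyes, nose tip
        side = np.array([[30.0, 121.0], [30.0, 119.0], [55.0, 161.0]])
        print(features_3d(front, side))
    ```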

  9. PBF Control Building (PER619). Interior of control room shows control ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Control Building (PER-619). Interior of control room shows control console from direction facing visitors room and its observation window. Camera facing northeast. Date: May 2004. INEEL negative no. HD-41-7-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  10. Real Time 3D Facial Movement Tracking Using a Monocular Camera

    PubMed Central

    Dong, Yanchao; Wang, Yanming; Yue, Jiguang; Hu, Zhencheng

    2016-01-01

    The paper proposes a robust framework for 3D facial movement tracking in real time using a monocular camera. It is designed to estimate the 3D face pose and local facial animation such as eyelid movement and mouth movement. The framework firstly utilizes the Discriminative Shape Regression method to locate the facial feature points on the 2D image and fuses the 2D data with a 3D face model using Extended Kalman Filter to yield 3D facial movement information. An alternating optimizing strategy is adopted to fit to different persons automatically. Experiments show that the proposed framework could track the 3D facial movement across various poses and illumination conditions. Given the real face scale the framework could track the eyelid with an error of 1 mm and mouth with an error of 2 mm. The tracking result is reliable for expression analysis or mental state inference. PMID:27463714

  11. Real Time 3D Facial Movement Tracking Using a Monocular Camera.

    PubMed

    Dong, Yanchao; Wang, Yanming; Yue, Jiguang; Hu, Zhencheng

    2016-07-25

    The paper proposes a robust framework for 3D facial movement tracking in real time using a monocular camera. It is designed to estimate the 3D face pose and local facial animation such as eyelid movement and mouth movement. The framework firstly utilizes the Discriminative Shape Regression method to locate the facial feature points on the 2D image and fuses the 2D data with a 3D face model using Extended Kalman Filter to yield 3D facial movement information. An alternating optimizing strategy is adopted to fit to different persons automatically. Experiments show that the proposed framework could track the 3D facial movement across various poses and illumination conditions. Given the real face scale the framework could track the eyelid with an error of 1 mm and mouth with an error of 2 mm. The tracking result is reliable for expression analysis or mental state inference.

  12. Computer vision research with new imaging technology

    NASA Astrophysics Data System (ADS)

    Hou, Guangqi; Liu, Fei; Sun, Zhenan

    2015-12-01

    Light field imaging is capable of capturing dense multi-view 2D images in one snapshot, recording both the intensity values and the directions of rays simultaneously. As an emerging 3D device, the light field camera has been widely used in digital refocusing, depth estimation, stereoscopic display, etc. Traditional multi-view stereo (MVS) methods only perform well on strongly textured surfaces, and the resulting depth maps contain numerous holes and large ambiguities in textureless or low-textured regions. In this paper, we apply light field imaging technology to 3D face modeling in computer vision. Based on a 3D morphable model, we estimate the pose parameters from facial feature points. The depth map is then estimated through the epipolar plane image (EPI) method. Finally, a high-quality 3D face model is recovered via a fusion strategy. We evaluate the effectiveness and robustness of the approach on face images captured by a light field camera under different poses.

  13. Three-dimensional face pose detection and tracking using monocular videos: tool and application.

    PubMed

    Dornaika, Fadi; Raducanu, Bogdan

    2009-08-01

    Recently, we have proposed a real-time tracker that simultaneously tracks the 3-D head pose and facial actions in monocular video sequences that can be provided by low-quality cameras. This paper has two main contributions. First, we propose an automatic 3-D face pose initialization scheme for the real-time tracker by adopting a 2-D face detector and an eigenface system. Second, we use the proposed methods (initialization and tracking) for enhancing the human-machine interaction functionality of an AIBO robot. More precisely, we show how the orientation of the robot's camera (or any active vision system) can be controlled through the estimation of the user's head pose. Applications based on head-pose imitation such as telepresence, virtual reality, and video games can directly exploit the proposed techniques. Experiments on real videos confirm the robustness and usefulness of the proposed methods.
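
    A minimal sketch of the head-pose imitation idea (our own assumption of how such an interface could look, not the authors' AIBO code): the estimated user yaw and pitch are mapped to pan/tilt commands for the robot's camera, clipped to hypothetical joint limits.

    ```python
    # Sketch: drive an active camera head from an estimated user head pose.
    def head_pose_to_pan_tilt(yaw_deg: float, pitch_deg: float,
                              pan_limit: float = 90.0, tilt_limit: float = 45.0):
        pan = max(-pan_limit, min(pan_limit, yaw_deg))       # mirror the user's head yaw
        tilt = max(-tilt_limit, min(tilt_limit, pitch_deg))  # and pitch, within joint limits
        return pan, tilt

    print(head_pose_to_pan_tilt(25.0, -60.0))   # -> (25.0, -45.0)
    ```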

  14. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F.; Herkenhoff, K. E.; Squyres, S. W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ≈4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.

  15. Power Burst Facility (PBF), PER620, contextual and oblique view. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Power Burst Facility (PBF), PER-620, contextual and oblique view. Camera facing northwest. South and east facade. The 1980 west-wing expansion is left of center bay. Concrete structure at right is PER-730. Date: March 2004. INEEL negative no. HD-41-2-3 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  16. NGEE Arctic Webcam Photographs, Barrow Environmental Observatory, Barrow, Alaska

    DOE Data Explorer

    Bob Busey; Larry Hinzman

    2012-04-01

    The NGEE Arctic Webcam (PTZ Camera) captures two views of seasonal transitions from its generally south-facing position on a tower located at the Barrow Environmental Observatory near Barrow, Alaska. Images are captured every 30 minutes. Historical images are available for download. The camera is operated by the U.S. DOE sponsored Next Generation Ecosystem Experiments - Arctic (NGEE Arctic) project.

  17. Camera-Based Microswitch Technology for Eyelid and Mouth Responses of Persons with Profound Multiple Disabilities: Two Case Studies

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff

    2010-01-01

    These two studies assessed camera-based microswitch technology for eyelid and mouth responses of two persons with profound multiple disabilities and minimal motor behavior. This technology, in contrast with the traditional optic microswitches used for those responses, did not require support frames on the participants' face but only small color…

  18. ADM. Aerial view of administration area. Camera facing westerly. From ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ADM. Aerial view of administration area. Camera facing westerly. From left to right in foreground: Substation (TAN-605), Warehouse (TAN-628), Gate House (TAN-601), Administration Building (TAN-602). Left to right middle ground: Service Building (TAN-603), Warehouse (later known as Maintenance Shop or Craft Shop, TAN-604), Water Well Pump Houses, Fuel Tanks and Fuel Pump Houses, and Water Storage Tanks. Change House (TAN-606) on near side of berm. Large building beyond berm is A&M. Building, TAN-607. Railroad tracks beyond lead from (unseen) turntable to the IET. Date: June 6, 1955. INEEL negative no. 13201 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  19. PBF (PER620) interior of Reactor Room. Camera facing south from ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) interior of Reactor Room. Camera facing south from stairway platform in southwest corner (similar to platform in view at left). Reactor was beneath water in circular tank. Fuel was stored in the canal north of it. Platform and apparatus at right is reactor bridge with control rod mechanisms and actuators. The entire apparatus swung over the reactor and pool during operations. Personnel in view are involved with decontamination and preparation of facility for demolition. Note rails near ceiling for crane; motor for rollup door at upper center of view. Date: March 2004. INEEL negative no. HD-41-3-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  20. From a Million Miles Away, NASA Camera Shows Moon Crossing Face of Earth

    NASA Image and Video Library

    2015-08-05

    This animation still image shows the far side of the moon, illuminated by the sun, as it crosses between the DSCOVR spacecraft's Earth Polychromatic Imaging Camera (EPIC) and telescope, and the Earth, one million miles away. Credits: NASA/NOAA. A NASA camera aboard the Deep Space Climate Observatory (DSCOVR) satellite captured a unique view of the moon as it moved in front of the sunlit side of Earth last month. The series of test images shows the fully illuminated “dark side” of the moon that is never visible from Earth. The images were captured by NASA’s Earth Polychromatic Imaging Camera (EPIC), a four megapixel CCD camera and telescope on the DSCOVR satellite orbiting 1 million miles from Earth. From its position between the sun and Earth, DSCOVR conducts its primary mission of real-time solar wind monitoring for the National Oceanic and Atmospheric Administration (NOAA).

  1. Can we match ultraviolet face images against their visible counterparts?

    NASA Astrophysics Data System (ADS)

    Narang, Neeru; Bourlai, Thirimachos; Hornak, Lawrence A.

    2015-05-01

    In law enforcement and security applications, the acquisition of face images is critical in producing key trace evidence for the successful identification of potential threats. However, face recognition (FR) for face images captured using different camera sensors, under variable illumination conditions and expressions, is very challenging. In this paper, we investigate the advantages and limitations of the heterogeneous problem of matching ultraviolet (UV, 100 nm to 400 nm in wavelength) face images against their visible (VIS) counterparts, when all face images are captured under controlled conditions. The contributions of our work are three-fold: (i) we used a camera sensor designed with the capability to acquire UV images at short range, and generated a dual-band (VIS and UV) database composed of multiple, full-frontal face images of 50 subjects, collected in two sessions spanning a period of 2 months; (ii) for each dataset, we determined which set of face image pre-processing algorithms is more suitable for face matching; and, finally, (iii) we determined which FR algorithm better matches cross-band face images, resulting in high rank-1 identification rates. Experimental results show that our cross-spectral matching algorithms (the heterogeneous problem, where gallery and probe sets consist of face images acquired in different spectral bands) achieve sufficient identification performance. However, we also conclude that the problem under study is very challenging and requires further investigation to address real-world law enforcement or military applications. To the best of our knowledge, this is the first time in the open literature that the problem of cross-spectral matching of UV against VIS band face images has been investigated.

  2. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum †

    PubMed Central

    Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi

    2016-01-01

    During the night or in poorly lit areas, thermal cameras are a better choice instead of normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from only thermal information is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in both thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step while the second step processes patches in an image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations. PMID:27110781
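
    The sketch below illustrates the core CCA idea on synthetic patch vectors (it is not the authors' two-step method): a CCA model is fitted on paired thermal/visible training vectors and then used to predict a visible vector from a new thermal one. The dimensions, latent model, and noise levels are placeholders.

    ```python
    # Sketch: learn a CCA mapping between thermal and visible patch vectors,
    # then predict a visible patch from an unseen thermal patch.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    n_train, dim, n_latent = 200, 64, 10          # 200 pairs of 8x8-pixel patches
    A = rng.normal(size=(n_latent, dim))          # thermal mixing matrix (synthetic)
    B = rng.normal(size=(n_latent, dim))          # visible mixing matrix (synthetic)
    latent = rng.normal(size=(n_train, n_latent)) # shared structure between spectra
    thermal = latent @ A + 0.1 * rng.normal(size=(n_train, dim))
    visible = latent @ B + 0.1 * rng.normal(size=(n_train, dim))

    cca = CCA(n_components=n_latent)
    cca.fit(thermal, visible)

    test_latent = rng.normal(size=(1, n_latent))
    test_thermal = test_latent @ A                # a new thermal patch
    predicted_visible = cca.predict(test_thermal) # reconstructed visible patch
    true_visible = test_latent @ B
    print(np.corrcoef(predicted_visible.ravel(), true_visible.ravel())[0, 1])
    ```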

  3. Design and Construction of an X-ray Lightning Camera

    NASA Astrophysics Data System (ADS)

    Schaal, M.; Dwyer, J. R.; Rassoul, H. K.; Uman, M. A.; Jordan, D. M.; Hill, J. D.

    2010-12-01

    A pinhole-type camera was designed and built for the purpose of producing high-speed images of the x-ray emissions from rocket-and-wire-triggered lightning. The camera consists of 30 7.62-cm diameter NaI(Tl) scintillation detectors, each sampling at 10 million frames per second. The steel structure of the camera is encased in 1.27-cm thick lead, which blocks x-rays that are less than 400 keV, except through a 7.62-cm diameter “pinhole” aperture located at the front of the camera. The lead and steel structure is covered in 0.16-cm thick aluminum to block RF noise, water and light. All together, the camera weighs about 550-kg and is approximately 1.2-m x 0.6-m x 0.6-m. The image plane, which is adjustable, was placed 32-cm behind the pinhole aperture, giving a field of view of about ±38° in both the vertical and horizontal directions. The elevation of the camera is adjustable between 0 and 50° from horizontal and the camera may be pointed in any azimuthal direction. In its current configuration, the camera’s angular resolution is about 14°. During the summer of 2010, the x-ray camera was located 44-m from the rocket-launch tower at the UF/Florida Tech International Center for Lightning Research and Testing (ICLRT) at Camp Blanding, FL and several rocket-triggered lightning flashes were observed. In this presentation, I will discuss the design, construction and operation of this x-ray camera.
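
    As a quick, hedged consistency check of the stated geometry (our own arithmetic, not from the paper): a 7.62 cm detector placed 32 cm behind the pinhole subtends roughly 14°, matching the quoted angular resolution, and a ±38° field of view implies an image plane extending about 25 cm from the optical axis.

    ```python
    # Back-of-the-envelope check of the pinhole camera geometry quoted above.
    import math

    L = 32.0          # pinhole-to-image-plane distance, cm
    d = 7.62          # scintillation detector diameter, cm

    # Angle subtended by one detector at the pinhole -> angular resolution.
    resolution_deg = math.degrees(2 * math.atan((d / 2) / L))
    print(f"angular resolution ~ {resolution_deg:.1f} deg")   # ~13.6 deg, i.e. "about 14 deg"

    # A +/-38 deg field of view implies this image-plane half-width.
    half_width_cm = L * math.tan(math.radians(38))
    print(f"image-plane half-width ~ {half_width_cm:.1f} cm") # ~25 cm
    ```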

  4. Teleoperated control system for underground room and pillar mining

    DOEpatents

    Mayercheck, William D.; Kwitowski, August J.; Brautigam, Albert L.; Mueller, Brian K.

    1992-01-01

    A teleoperated mining system is provided for remotely controlling the various machines involved with thin-seam mining. A thin-seam continuous miner located at a mining face includes a camera mounted thereon and a slave computer for controlling the miner and the camera. A plurality of sensors relay information about the miner and the face to the slave computer. A slave-computer-controlled ventilation sub-system removes combustible material from the mining face. A haulage sub-system removes material mined by the continuous miner from the mining face to a collection site and is also controlled by the slave computer. A base station, which controls the supply of power and water to the continuous miner, haulage, and ventilation sub-systems, includes a cable/hose handling module for winding or unwinding cables/hoses connected to the miner, an operator control module, and a hydraulic power and air compressor module for supplying air to the miner. An operator-controlled host computer housed in the operator control module is connected to the slave computer via a two-wire communications line.

  5. THE DARK ENERGY CAMERA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flaugher, B.; Diehl, H. T.; Alvarez, O.

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
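
    As an illustrative cross-check of the quoted optics (our own back-of-the-envelope, not from the paper): the plate scale in arcsec per pixel equals 206265 × pixel pitch / effective focal length, so 15 μm pixels at 0.263″ per pixel imply an effective focal length near 11.8 m, roughly f/2.9 on the 4 m Blanco.

    ```python
    # Derive the effective focal length implied by the quoted plate scale and pixel size.
    pixel_pitch_m = 15e-6          # 15 micron pixels
    plate_scale_arcsec = 0.263     # arcsec per pixel

    focal_length_m = 206265 * pixel_pitch_m / plate_scale_arcsec
    print(f"implied effective focal length ~ {focal_length_m:.1f} m")        # ~11.8 m
    print(f"implied focal ratio on the 4 m Blanco ~ f/{focal_length_m/4:.1f}")  # ~f/2.9
    ```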

  6. The Dark Energy Camera

    DOE PAGES

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  7. Plenoptic Imager for Automated Surface Navigation

    NASA Technical Reports Server (NTRS)

    Zollar, Byron; Milder, Andrew; Milder, Andrew; Mayo, Michael

    2010-01-01

    An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved the feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem comprised of a main aperture lens, a mechanical structure that holds an array of micro lenses at the focal distance from the main lens, and a structure that mates a CMOS imaging sensor at the correct distance from the micro lenses. The demonstrator also featured embedded electronics for camera readout, and a post-processor executing image-processing algorithms to provide ranging information.

  8. LOFT. Interior view of entry to reactor building, TAN650. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Interior view of entry to reactor building, TAN-650. Camera is inside entry (TAN-624) and facing north. At far end of domed chamber are penetrations in wall for electrical and other connections. Reactor and other equipment has been removed. Date: March 2004. INEEL negative no. HD-39-5-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  9. LOFT complex in 1975 awaits renewed mission. Aerial view. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT complex in 1975 awaits renewed mission. Aerial view. Camera facing southwesterly. Left to right: stack, entry building (TAN-624), door shroud, duct shroud and filter hatches, dome (painted white), pre-amp building, equipment and piping building, shielded control room (TAN-630), airplane hangar (TAN-629). Date: 1975. INEEL negative no. 75-3690 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  10. MTR BUILDING, TRA603. SOUTHEAST CORNER, EAST SIDE FACING TOWARD RIGHT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR BUILDING, TRA-603. SOUTHEAST CORNER, EAST SIDE FACING TOWARD RIGHT OF VIEW. CAMERA FACING NORTHWEST. LIGHT-COLORED PROJECTION AT LEFT IS ENGINEERING SERVICES BUILDING, TRA-635. SMALL CONCRETE BLOCK BUILDING AT CENTER OF VIEW IS FAST CHOPPER DETECTOR HOUSE, TRA-665. INL NEGATIVE NO. HD46-43-3. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. Towards next generation 3D cameras

    NASA Astrophysics Data System (ADS)

    Gupta, Mohit

    2017-03-01

    We are in the midst of a 3D revolution. Robots enabled by 3D cameras are beginning to autonomously drive cars, perform surgeries, and manage factories. However, when deployed in the real-world, these cameras face several challenges that prevent them from measuring 3D shape reliably. These challenges include large lighting variations (bright sunlight to dark night), presence of scattering media (fog, body tissue), and optically complex materials (metal, plastic). Due to these factors, 3D imaging is often the bottleneck in widespread adoption of several key robotics technologies. I will talk about our work on developing 3D cameras based on time-of-flight and active triangulation that addresses these long-standing problems. This includes designing `all-weather' cameras that can perform high-speed 3D scanning in harsh outdoor environments, as well as cameras that recover shape of objects with challenging material properties. These cameras are, for the first time, capable of measuring detailed (<100 microns resolution) scans in extremely demanding scenarios with low-cost components. Several of these cameras are making a practical impact in industrial automation, being adopted in robotic inspection and assembly systems.

  12. Evaluation of Acquisition Strategies for Image-Based Construction Site Monitoring

    NASA Astrophysics Data System (ADS)

    Tuttas, S.; Braun, A.; Borrmann, A.; Stilla, U.

    2016-06-01

    Construction site monitoring is an essential task for keeping track of the ongoing construction work and providing up-to-date information for a Building Information Model (BIM). The BIM contains the as-planned states (geometry, schedule, costs, ...) of a construction project. For updating, the as-built state has to be acquired repeatedly and compared to the as-planned state. In the approach presented here, a 3D representation of the as-built state is calculated from photogrammetric images using multi-view stereo reconstruction. On construction sites one has to cope with several difficulties such as security aspects, limited accessibility, occlusions, and construction activity. Different acquisition strategies and techniques, namely (i) terrestrial acquisition with a hand-held camera, (ii) aerial acquisition using an Unmanned Aerial Vehicle (UAV), and (iii) acquisition using a fixed stereo camera pair on the boom of the crane, are tested on three test sites. They are assessed considering the special needs of the monitoring tasks and the limitations on construction sites. The three scenarios are evaluated based on the degree of automation, the required acquisition effort, the necessary equipment and its maintenance, the disturbance of the construction work, and the accuracy and completeness of the resulting point clouds. Based on the experiences during the test cases, the following conclusions can be drawn: terrestrial acquisition has the lowest requirements on the device setup but lacks automation and coverage. The crane camera shows the lowest flexibility but the highest grade of automation. The UAV approach can provide the best coverage by combining nadir and oblique views, but can be limited by obstacles and security aspects. The accuracy of the point clouds is evaluated based on plane fitting of selected building parts. The RMS errors of the fitted parts range from 1 to a few cm for the UAV and the hand-held scenarios. First results show that the crane camera approach has the potential to reach the same accuracy level.
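
    A minimal sketch of the accuracy measure mentioned above, under our own assumptions: fit a plane to the points belonging to one planar building part (via SVD) and report the RMS of the point-to-plane distances. The data and noise level below are synthetic placeholders.

    ```python
    # Sketch: plane-fitting RMS for a (planar) building part of a point cloud.
    import numpy as np

    def plane_fit_rms(points: np.ndarray) -> float:
        """points: (n, 3) array of XYZ coordinates of one planar building part."""
        centroid = points.mean(axis=0)
        # The plane normal is the right singular vector with the smallest singular value.
        _, _, vt = np.linalg.svd(points - centroid)
        normal = vt[-1]
        distances = (points - centroid) @ normal     # signed point-to-plane distances
        return float(np.sqrt(np.mean(distances ** 2)))

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        # Simulated wall: points on z = 0 with ~1 cm of measurement noise.
        pts = np.column_stack([rng.uniform(0, 5, 1000), rng.uniform(0, 3, 1000),
                               rng.normal(0, 0.01, 1000)])
        print(f"RMS = {plane_fit_rms(pts) * 100:.2f} cm")
    ```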

  13. Construction of Shared Knowledge in Face-to-Face and Computer-Mediated Cooperation.

    ERIC Educational Resources Information Center

    Fischer, Frank; Mandl, Heinz

    This study examined how learners constructed and used shared knowledge in computer-mediated and face-to-face cooperative learning, investigating how to facilitate the construction and use of shared knowledge through dynamic visualization. Forty-eight college students were separated into dyads and assigned to one of four experimental conditions…

  14. 4D Animation Reconstruction from Multi-Camera Coordinates Transformation

    NASA Astrophysics Data System (ADS)

    Jhan, J. P.; Rau, J. Y.; Chou, C. M.

    2016-06-01

    Reservoir dredging issues are important for extending the life of a reservoir. The most effective and cost-reducing way is to construct a tunnel to desilt the bottom sediment. The conventional technique is to construct a cofferdam to hold back the water, construct the tunnel intake inside it, and remove the cofferdam afterwards. In Taiwan, the ZengWen reservoir dredging project will install an Elephant-trunk Steel Pipe (ETSP) in the water to connect to the desilting tunnel without building a cofferdam. Since the installation is critical to the whole project, a 1:20 model was built to simulate the installation steps in a towing tank, i.e. launching, dragging, water injection, and sinking. To increase construction safety, photogrammetric techniques are adopted to record images during the simulation, compute the transformation parameters for dynamic analysis, and reconstruct 4D animations. In this study, several Australis coded targets are fixed on the surface of the ETSP for automatic recognition and measurement. The camera orientations are computed by space resection, where the 3D coordinates of the coded targets are measured. Two approaches for computing the motion parameters are proposed, i.e. performing a 3D conformal transformation from the coordinates of the cameras, and computing the relative orientation from the orientation of a single camera. Experimental results show that the 3D conformal transformation can achieve sub-mm simulation results, and that the relative orientation computation offers flexibility for dynamic motion analysis, being easier and more efficient.
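
    As a hedged illustration of the first approach, a 3D conformal (seven-parameter similarity) transformation between two sets of corresponding points can be estimated with a generic Umeyama/Procrustes solution; the sketch below is not the authors' code, and the point sets are synthetic.

    ```python
    # Sketch: estimate scale s, rotation R, translation t with dst ~= s * R @ src + t.
    import numpy as np

    def conformal_3d(src: np.ndarray, dst: np.ndarray):
        """src, dst: (n, 3) corresponding points. Returns (s, R, t)."""
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        src_c, dst_c = src - mu_s, dst - mu_d
        H = src_c.T @ dst_c / len(src)
        U, S, Vt = np.linalg.svd(H)
        D = np.eye(3)
        if np.linalg.det(Vt.T @ U.T) < 0:          # guard against a reflection
            D[2, 2] = -1
        R = Vt.T @ D @ U.T
        s = np.trace(np.diag(S) @ D) / src_c.var(axis=0).sum()
        t = mu_d - s * R @ mu_s
        return s, R, t

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        src = rng.uniform(-1, 1, size=(10, 3))
        a = np.radians(30)                          # known rotation about z
        R_true = np.array([[np.cos(a), -np.sin(a), 0],
                           [np.sin(a),  np.cos(a), 0],
                           [0, 0, 1]])
        dst = 1.05 * src @ R_true.T + np.array([0.2, -0.1, 0.5])
        s, R, t = conformal_3d(src, dst)
        print(round(s, 3))                          # ~1.05
    ```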

  15. Cameras instead of sieves for aggregate characterization : research spotlight

    DOT National Transportation Integrated Search

    2012-01-01

    Michigan researchers explored the use of cameras and software that may eventually replace the use of screen sieves in sizing and assessing crushed aggregate for pavement construction. This research explored approaches to imaging aggregate as a way to...

  16. ETR, TRA642, CAMERA IS BELOW, BUT NEAR THE CEILING OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR, TRA-642, CAMERA IS BELOW, BUT NEAR THE CEILING OF THE GROUND FLOOR, AND LOOKS DOWN TOWARD THE CONSOLE FLOOR. CAMERA FACES WESTERLY. THE REACTOR PIT IS IN THE CENTER OF THE VIEW. BEYOND IT TO THE LEFT IS THE SOUTH SIDE OF THE WORKING CANAL. IN THE FOREGROUND ON THE RIGHT IS THE SHIELDING FOR THE PROCESS WATER TUNNEL AND PIPING. SPIRAL STAIRCASE AT LEFT OF VIEW. INL NEGATIVE NO. 56-2237. Jack L. Anderson, Photographer, 7/6/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  17. A multiple camera tongue switch for a child with severe spastic quadriplegic cerebral palsy.

    PubMed

    Leung, Brian; Chau, Tom

    2010-01-01

    The present study proposed a video-based access technology that facilitated a non-contact tongue protrusion access modality for a 7-year-old boy with severe spastic quadriplegic cerebral palsy (GMFCS level 5). The proposed system featured a centre camera and two peripheral cameras to extend coverage of the frontal face view of this user for longer durations. The child participated in a descriptive case study. The participant underwent 3 months of tongue protrusion training while the multiple camera tongue switch prototype was being prepared. Later, the participant was brought back for five experiment sessions where he worked on a single-switch picture matching activity, using the multiple camera tongue switch prototype in a controlled environment. The multiple camera tongue switch achieved an average sensitivity of 82% and specificity of 80%. In three of the experiment sessions, the peripheral cameras were associated with most of the true positive switch activations. These activations would have been missed by a centre-camera-only setup. The study demonstrated proof-of-concept of a non-contact tongue access modality implemented by a video-based system involving three cameras and colour video processing.

  18. Improving the color fidelity of cameras for advanced television systems

    NASA Astrophysics Data System (ADS)

    Kollarits, Richard V.; Gibbon, David C.

    1992-08-01

    In this paper we compare the accuracy of the color information obtained from television cameras using three and five wavelength bands. The comparison is based on real digital camera data. The cameras are treated as colorimeters whose characteristics are not linked to those of the display. The color matrices for both cameras were obtained by identical optimization procedures that minimized the color error. The color error for the five-band camera is 2.5 times smaller than that obtained from the three-band camera. Visual comparison of color matches on a characterized color monitor indicates that the five-band camera is capable of color measurements that produce no significant visual error on the display. Because the outputs from the five-band camera are reduced to the normal three channels conventionally used for display, there need be no increase in signal-handling complexity outside the camera. Likewise, it is possible to construct a five-band camera using only three sensors, as in conventional cameras. The principal drawback of the five-band camera is a reduction in effective camera sensitivity of about 3/4 of an f-stop.
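
    Color matrices of the kind mentioned above are typically obtained by a linear least-squares fit from measured sensor responses to known tristimulus values; a minimal sketch of that optimization (assuming hypothetical arrays `bands3`/`bands5` of camera band responses and `xyz` of reference tristimulus values for a chart of test colors, not the authors' procedure) follows.

    ```python
    import numpy as np

    def fit_color_matrix(sensor_responses, target_xyz):
        """Least-squares color matrix mapping camera band responses
        (N patches x B bands, B = 3 or 5) to CIE XYZ targets (N x 3)."""
        M, _, _, _ = np.linalg.lstsq(sensor_responses, target_xyz, rcond=None)
        return M.T                      # 3 x B matrix: XYZ = M @ bands

    def mean_color_error(M, sensor_responses, target_xyz):
        """RMS error of the matrixed camera output against the targets."""
        predicted = sensor_responses @ M.T
        return np.sqrt(np.mean((predicted - target_xyz) ** 2))

    # Hypothetical usage: compare a 3-band and a 5-band sensor on the same chart.
    # M3 = fit_color_matrix(bands3, xyz);  e3 = mean_color_error(M3, bands3, xyz)
    # M5 = fit_color_matrix(bands5, xyz);  e5 = mean_color_error(M5, bands5, xyz)
    ```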

  19. ETR AND MTR COMPLEXES IN CONTEXT. CAMERA FACING NORTHERLY. FROM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR AND MTR COMPLEXES IN CONTEXT. CAMERA FACING NORTHERLY. FROM BOTTOM TO TOP: ETR COOLING TOWER, ELECTRICAL BUILDING AND LOW-BAY SECTION OF ETR BUILDING, HEAT EXCHANGER BUILDING (WITH U SHAPED YARD), COMPRESSOR BUILDING. MTR REACTOR SERVICES BUILDING IS ATTACHED TO SOUTH WALL OF MTR. WING A IS ATTACHED TO BALCONY FLOOR OF MTR. NEAR UPPER RIGHT CORNER OF VIEW IS MTR PROCESS WATER BUILDING. WING B IS AT FAR WEST END OF COMPLEX. NEAR MAIN GATE IS GAMMA FACILITY, WITH "COLD" BUILDINGS BEYOND: RAW WATER STORAGE TANKS, STEAM PLANT, MTR COOLING TOWER PUMP HOUSE AND COOLING TOWER. INL NEGATIVE NO. 56-4101. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  20. An Application for Driver Drowsiness Identification based on Pupil Detection using IR Camera

    NASA Astrophysics Data System (ADS)

    Kumar, K. S. Chidanand; Bhowmick, Brojeshwar

    A driver drowsiness identification system is proposed that generates an alarm when the driver falls asleep while driving. A number of different physical phenomena can be monitored and measured in order to detect the drowsiness of a driver in a vehicle. This paper presents a methodology for driver drowsiness identification using an IR camera by detecting and tracking the pupils. The face region is first determined using the Euler number and template matching. The pupils are then located within the face region. In subsequent video frames, the pupils are tracked in order to determine whether the eyes are open or closed. If the eyes remain closed for several consecutive frames, it is concluded that the driver is fatigued and an alarm is generated.
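
    The decision rule in the last sentence reduces to a counter over consecutive closed-eye frames; the hedged sketch below assumes a hypothetical `pupils_visible(frame)` detector (standing in for the paper's Euler-number/template-matching pipeline) and raises the alarm once a threshold of consecutive closed frames is reached.

    ```python
    def drowsiness_monitor(frames, pupils_visible, closed_frames_threshold=15):
        """Yield True for every frame at which the drowsiness alarm should fire.

        `pupils_visible` is a hypothetical callable returning True when pupils
        (i.e. open eyes) are detected in the frame; the threshold is the number
        of consecutive closed-eye frames tolerated before alarming."""
        closed_run = 0
        for frame in frames:
            closed_run = 0 if pupils_visible(frame) else closed_run + 1
            yield closed_run >= closed_frames_threshold

    # Hypothetical usage with a video source and detector:
    # for alarm in drowsiness_monitor(video_frames, ir_pupil_detector):
    #     if alarm:
    #         sound_alarm()
    ```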

  1. High-emulation mask recognition with high-resolution hyperspectral video capture system

    NASA Astrophysics Data System (ADS)

    Feng, Jiao; Fang, Xiaojing; Li, Shoufeng; Wang, Yongjin

    2014-11-01

    We present a method for distinguishing a human face from a high-emulation mask, which is increasingly used by criminals for activities such as stealing card numbers and passwords at ATMs. Traditional facial recognition techniques have difficulty detecting such camouflaged criminals. In this paper, we use a high-resolution hyperspectral video capture system to detect high-emulation masks. An RGB camera is used for traditional facial recognition, while a prism and a gray-scale camera are used to capture spectral information of the observed face. Experiments show that a mask made of silica gel has a different spectral reflectance from human skin. Because multispectral images offer additional spectral information about physical characteristics, high-emulation masks can be easily recognized.
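
    One simple way to exploit the reflectance difference described above is to compare an observed per-band spectrum against a reference skin spectrum; the sketch below is only an illustrative assumption (hypothetical arrays and threshold, not the authors' pipeline) that flags a face as a likely mask when the spectral distance exceeds a threshold.

    ```python
    import numpy as np

    def is_probable_mask(observed_spectrum, skin_reference, threshold=0.15):
        """Compare a normalised per-band reflectance spectrum of the observed
        face against a reference human-skin spectrum; large deviations suggest
        a silica-gel (or similar) mask. Arrays and threshold are illustrative."""
        obs = observed_spectrum / np.linalg.norm(observed_spectrum)
        ref = skin_reference / np.linalg.norm(skin_reference)
        distance = np.sqrt(np.mean((obs - ref) ** 2))
        return distance > threshold

    # Hypothetical usage with spectra sampled from the gray-scale camera bands:
    # print(is_probable_mask(face_spectrum, mean_skin_spectrum))
    ```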

  2. Mapping Land and Water Surface Topography with instantaneous Structure from Motion

    NASA Astrophysics Data System (ADS)

    Dietrich, J.; Fonstad, M. A.

    2012-12-01

    Structure from Motion (SfM) has given researchers an invaluable tool for low-cost, high-resolution 3D mapping of the environment. SfM 3D surface models are commonly constructed from many digital photographs collected with one digital camera (either handheld or attached to an aerial platform). This works for stationary or very slowly moving objects; objects in motion, however, are impossible to capture with one-camera SfM. With multiple simultaneously triggered cameras, it becomes possible to capture multiple photographs at the same time, which allows the construction of 3D surface models of moving objects and surfaces: an instantaneous SfM (ISfM) surface model. In river science, ISfM provides a low-cost solution for measuring a number of river variables that researchers normally estimate or are unable to collect over large areas. With ISfM, sufficient coverage of the banks, and RTK-GPS control, it is possible to create a digital surface model of land and water surface elevations across an entire channel, and to derive water surface slopes at any point within the model. By setting the cameras to collect time-lapse photography of a scene, multiple surfaces can be created and compared using traditional digital surface model differencing. These water surface models could be combined with high-resolution bathymetry to create fully 3D cross sections useful in hydrologic modeling. Multiple temporal image sets could also be used in 2D or 3D particle image velocimetry to create 3D surface velocity maps of a channel. Other applications in earth science include any setting where researchers could benefit from temporal surface modeling, such as mass movements, lava flows, and dam-removal monitoring. The camera system used for this research consisted of ten pocket digital cameras (Canon A3300) equipped with wireless triggers. The triggers were constructed with an Arduino-style microcontroller and off-the-shelf handheld radios with a maximum range of several kilometers. The cameras are controlled from another microcontroller/radio combination that allows manual or automatic triggering of the cameras. The total cost of the camera system was approximately 1500 USD.
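
    Digital surface model differencing, as used above to compare repeat surfaces, amounts to a cell-by-cell subtraction with a change threshold; a minimal NumPy sketch (hypothetical co-registered grids `dsm_t0` and `dsm_t1` on the same cell size, illustrative threshold) is shown below.

    ```python
    import numpy as np

    def dsm_difference(dsm_t0, dsm_t1, min_detectable_change=0.05):
        """Cell-by-cell DSM (or water-surface) change between two co-registered
        elevation grids, masking differences below the detection threshold
        (metres). NaN cells in either grid are propagated."""
        diff = dsm_t1 - dsm_t0
        diff[np.abs(diff) < min_detectable_change] = 0.0
        return diff

    # Hypothetical usage with two time-lapse ISfM surfaces:
    # change = dsm_difference(dsm_t0, dsm_t1)
    # print("mean surface change (m):", np.nanmean(change))
    ```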

  3. A method for the real-time construction of a full parallax light field

    NASA Astrophysics Data System (ADS)

    Tanaka, Kenji; Aoki, Soko

    2006-02-01

    We designed and implemented a light field acquisition and reproduction system for dynamic objects called LiveDimension, which serves as a 3D live video system for multiple viewers. The acquisition unit consists of circularly arranged NTSC cameras surrounding an object. The display consists of circularly arranged projectors and a rotating screen. The projectors are constantly projecting images captured by the corresponding cameras onto the screen. The screen rotates around an in-plane vertical axis at a sufficient speed so that it faces each of the projectors in sequence. Since the Lambertian surfaces of the screens are covered by light-collimating plastic films with vertical louver patterns that are used for the selection of appropriate light rays, viewers can only observe images from a projector located in the same direction as the viewer. Thus, the dynamic view of an object is dependent on the viewer's head position. We evaluated the system by projecting both objects and human figures and confirmed that the entire system can reproduce light fields with a horizontal parallax to display video sequences of 430x770 pixels at a frame rate of 45 fps. Applications of this system include product design reviews, sales promotion, art exhibits, fashion shows, and sports training with form checking.

  4. 14. VIEW OF MST, FACING SOUTHEAST, AND LAUNCH PAD TAKEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. VIEW OF MST, FACING SOUTHEAST, AND LAUNCH PAD TAKEN FROM NORTHEAST PHOTO TOWER WITH WINDOW OPEN. FEATURES LEFT TO RIGHT: SOUTH TELEVISION CAMERA TOWER, SOUTHWEST PHOTO TOWER, LAUNCHER, UMBILICAL MAST, MST, AND OXIDIZER APRON. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 East, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  5. Brief Report: Using a Point-of-View Camera to Measure Eye Gaze in Young Children with Autism Spectrum Disorder during Naturalistic Social Interactions--A Pilot Study

    ERIC Educational Resources Information Center

    Edmunds, Sarah R.; Rozga, Agata; Li, Yin; Karp, Elizabeth A.; Ibanez, Lisa V.; Rehg, James M.; Stone, Wendy L.

    2017-01-01

    Children with autism spectrum disorder (ASD) show reduced gaze to social partners. Eye contact during live interactions is often measured using stationary cameras that capture various views of the child, but determining a child's precise gaze target within another's face is nearly impossible. This study compared eye gaze coding derived from…

  6. ETR, TRA642. CONSOLE FLOOR. CAMERA IS ON WEST SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR, TRA-642. CONSOLE FLOOR. CAMERA IS ON WEST SIDE OF FLOOR AND FACES NORTH. OUTER WALL OF STORAGE CANAL IS AT RIGHT. SHIELDING IS THICKER AT LOWER LEVEL, WHERE SPENT FUEL ELEMENTS WILL COOL AFTER REMOVAL FROM REACTOR. INL NEGATIVE NO. 56-1401. Jack L. Anderson, Photographer, 5/1/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. PBF Reactor Building (PER620). Camera is in cab of electricpowered ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera is in cab of electric-powered rail crane and facing east. Reactor pit and storage canal have been shaped. Floors for wings on east and west side are above and below reactor in view. Photographer: Larry Page. Date: August 23, 1967. INEEL negative no. 67-4403 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  8. IET. Aerial view of snaptran destructive experiment in 1964. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Aerial view of snaptran destructive experiment in 1964. Camera facing north. Test cell building (TAN-624) is positioned away from coupling station. Weather tower in right foreground. Divided duct just beyond coupling station. Air intake structure on south side of shielded control room. Experiment is on dolly at coupling station. Date: 1964. INEEL negative no. 64-1736 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  9. Depth estimation using a lightfield camera

    NASA Astrophysics Data System (ADS)

    Roper, Carissa

    The latest innovation to camera design has come in the form of the lightfield, or plenoptic, camera that captures 4-D radiance data rather than just the 2-D scene image via microlens arrays. With the spatial and angular light ray data now recorded on the camera sensor, it is feasible to construct algorithms that can estimate depth of field in different portions of a given scene. There are limitations to the precision due to hardware structure and the sheer number of scene variations that can occur. In this thesis, the potential of digital image analysis and spatial filtering to extract depth information is tested on the commercially available plenoptic camera.

  10. Research on inosculation between master of ceremonies or players and virtual scene in virtual studio

    NASA Astrophysics Data System (ADS)

    Li, Zili; Zhu, Guangxi; Zhu, Yaoting

    2003-04-01

    A technical approach to the construction of a virtual studio is proposed, in which an orientation tracker and a telemeter are used to augment a conventional BETACAM pickup camera and connect it with the software module of the host. A virtual camera model named the Camera & Post-camera Coupling Pair is put forward; it differs from the common model in computer graphics and is bound to the real BETACAM pickup camera used for shooting. A formula is derived to compute the foreground and background frame-buffer images of the virtual scene, whose boundary is based on the depth of the target point along the real BETACAM camera's projective ray. Real-time consistency is achieved between the video image sequences of the master of ceremonies or players and the CG image sequences of the virtual scene in spatial position, perspective relationship and image-object masking. Experimental results show that the proposed scheme for constructing a virtual studio is feasible, and is more practical and more effective than the existing approach of building a virtual studio based on color keying and image synthesis with the background using non-linear video editing techniques.

  11. Constructing a Database from Multiple 2D Images for Camera Pose Estimation and Robot Localization

    NASA Technical Reports Server (NTRS)

    Wolf, Michael; Ansar, Adnan I.; Brennan, Shane; Clouse, Daniel S.; Padgett, Curtis W.

    2012-01-01

    The LMDB (Landmark Database) Builder software identifies persistent image features (landmarks) in a scene viewed multiple times and precisely estimates the landmarks' 3D world positions. The software receives as input multiple 2D images of approximately the same scene, along with an initial guess of the camera pose for each image and a table of features matched pair-wise across frames. LMDB Builder aggregates landmarks across an arbitrarily large collection of frames with matched features. Range data from stereo vision processing can also be passed in to improve the initial guess of the 3D point estimates. The LMDB Builder aggregates the feature lists across all frames, manages the process of promoting selected features to landmarks, iteratively calculates the 3D landmark positions using the current camera pose estimates (via an optimal ray-projection method), and then improves the camera pose estimates using the 3D landmark positions. Finally, it extracts image patches for each landmark from auto-selected key frames and constructs the landmark database. The landmark database can then be used to estimate future camera poses (and therefore localize a robotic vehicle that may be carrying the cameras) by matching current imagery to landmark database image patches and using the known 3D landmark positions to estimate the current pose.
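
    At the core of the iterative landmark/pose refinement described above is triangulating a 3D landmark from the rays of several posed cameras; the sketch below shows a standard least-squares ray intersection for that step (a generic illustration, not the LMDB Builder code), with `camera_centres` and `feature_rays` as hypothetical inputs.

    ```python
    import numpy as np

    def triangulate_rays(origins, directions):
        """Least-squares intersection of rays: origins (N, 3) are camera centres,
        directions (N, 3) are viewing rays toward a matched feature.
        Returns the 3D point minimising the sum of squared ray distances."""
        I = np.eye(3)
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for c, d in zip(origins, directions):
            d = d / np.linalg.norm(d)
            P = I - np.outer(d, d)       # projector onto the ray's normal plane
            A += P
            b += P @ c
        return np.linalg.solve(A, b)

    # Hypothetical usage with three posed cameras observing one landmark:
    # X = triangulate_rays(camera_centres, feature_rays)
    ```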

  12. Face pose tracking using the four-point algorithm

    NASA Astrophysics Data System (ADS)

    Fung, Ho Yin; Wong, Kin Hong; Yu, Ying Kin; Tsui, Kwan Pang; Kam, Ho Chuen

    2017-06-01

    In this paper, we have developed an algorithm to track the pose of a human face robustly and efficiently. Face pose estimation is very useful in many applications such as building virtual reality systems and creating an alternative input method for the disabled. Firstly, we have modified a face detection toolbox called DLib for the detection of a face in front of a camera. The detected face features are passed to a pose estimation method, known as the four-point algorithm, for pose computation. The theory applied and the technical problems encountered during system development are discussed in the paper. It is demonstrated that the system is able to track the pose of a face in real time using a consumer grade laptop computer.
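
    A common way to turn a handful of detected facial landmarks into a head pose, in the spirit of the four-point approach described above, is a perspective-n-point solve; the sketch below uses OpenCV's solvePnP with four hypothetical 3D face-model points (nose tip, chin, and the two outer eye corners, in arbitrary model units) and is only an illustration under those assumptions, not the authors' implementation.

    ```python
    import numpy as np
    import cv2

    # Hypothetical 3D face-model points (nose tip, chin, left/right eye corners).
    MODEL_POINTS = np.array([
        [0.0,    0.0,   0.0],    # nose tip
        [0.0,  -63.6, -12.5],    # chin
        [-43.3,  32.7, -26.0],   # left eye outer corner
        [43.3,   32.7, -26.0],   # right eye outer corner
    ], dtype=np.float64)

    def estimate_head_pose(image_points, frame_size):
        """image_points: (4, 2) pixel coordinates of the same landmarks,
        e.g. from a dlib facial-landmark detector. Returns rotation and
        translation vectors of the face relative to the camera."""
        h, w = frame_size
        focal = w                                  # rough focal-length guess
        camera_matrix = np.array([[focal, 0, w / 2],
                                  [0, focal, h / 2],
                                  [0, 0, 1]], dtype=np.float64)
        dist_coeffs = np.zeros(4)                  # assume no lens distortion
        ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS,
                                      image_points.astype(np.float64),
                                      camera_matrix, dist_coeffs,
                                      flags=cv2.SOLVEPNP_P3P)  # exact 4-point solver
        return rvec, tvec
    ```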

  13. Utilizing Job Camera Technology in Construction Education

    ERIC Educational Resources Information Center

    Bruce, Richard D.; McCandless, David W.; Berryman, Chuck W.; Strong, Shawn D.

    2008-01-01

    One of the toughest hurdles to overcome in construction education is the varying levels of construction field experience among undergraduate students. Although an internship is a common construction management requirement, it is often completed after students complete classes in planning and scheduling. This poses a challenge for the modern…

  14. Effects of Face-to-Face and Computer-Mediated Constructive Controversy on Social Interdependence, Motivation, and Achievement

    ERIC Educational Resources Information Center

    Roseth, Cary J.; Saltarelli, Andy J.; Glass, Chris R.

    2011-01-01

    Cooperative learning capitalizes on the relational processes by which peers promote learning, yet it remains unclear whether these processes operate similarly in face-to-face and online settings. This study addresses this issue by comparing face-to-face and computer-mediated versions of "constructive controversy", a cooperative learning procedure…

  15. Adapting Local Features for Face Detection in Thermal Image.

    PubMed

    Ma, Chao; Trung, Ngo Thanh; Uchiyama, Hideaki; Nagahara, Hajime; Shimada, Atsushi; Taniguchi, Rin-Ichiro

    2017-11-27

    A thermal camera captures the temperature distribution of a scene as a thermal image. In thermal images, the facial appearances of different people under different lighting conditions are similar, because the facial temperature distribution is generally constant and not affected by lighting conditions. This similarity in facial appearance is advantageous for face detection. To detect faces in thermal images, cascade classifiers with Haar-like features are generally used; however, few studies have explored local features for face detection in thermal images. In this paper, we introduce two approaches relying on local features for face detection in thermal images. First, we create new feature types by extending Multi-Block LBP, taking into account a margin around the reference value and the generally constant distribution of facial temperature. In this way, we make the features more robust to image noise and more effective for face detection in thermal images. Second, we propose an AdaBoost-based training method to obtain cascade classifiers with multiple types of local features; since these feature types have different advantages, this enhances the descriptive power of the local features. We carried out a hold-out validation experiment and a field experiment. In the hold-out validation experiment, we captured a dataset from 20 participants, comprising 14 males and 6 females. For each participant, we captured 420 images with 10 variations in camera distance, 21 poses, and 2 appearances (with/without glasses), and we compared the performance of cascade classifiers trained with different sets of features. The results showed that the proposed approaches effectively improve the performance of face detection in thermal images. In the field experiment, we compared face detection performance in realistic scenes using thermal and RGB images, and discussed the results.
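
    The first extension described above, a Multi-Block LBP with a margin around the reference block, can be sketched as follows (pure NumPy; the block geometry, margin value and bit ordering are illustrative assumptions, not the paper's exact parameters).

    ```python
    import numpy as np

    def mb_lbp_with_margin(patch, margin=2.0):
        """Multi-Block LBP code for a patch split into a 3x3 grid of blocks.
        A neighbour block sets its bit only if its mean exceeds the centre
        block's mean by more than `margin`, which damps sensor noise in
        thermal images. Margin and ordering are illustrative choices."""
        h, w = patch.shape
        bh, bw = h // 3, w // 3
        means = np.array([[patch[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw].mean()
                           for j in range(3)] for i in range(3)])
        centre = means[1, 1]
        # 8 neighbours in clockwise order starting at top-left.
        order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
        bits = [(1 if means[i, j] > centre + margin else 0) for i, j in order]
        return sum(b << k for k, b in enumerate(bits))

    # Hypothetical usage on a thermal image region:
    # code = mb_lbp_with_margin(thermal_image[y:y+24, x:x+24])
    ```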

  16. Serial fusion of Eulerian and Lagrangian approaches for accurate heart-rate estimation using face videos.

    PubMed

    Gupta, Puneet; Bhowmick, Brojeshwar; Pal, Arpan

    2017-07-01

    Camera-equipped devices are ubiquitous and proliferating in day-to-day life. Accurate, non-contact heart rate (HR) estimation from face videos acquired with low-cost cameras can be used in many real-world scenarios and hence requires rigorous exploration. This paper presents an accurate and near real-time HR estimation system using such face videos. It is based on the phenomenon that the color and motion variations in the face video are closely related to the heart beat. The variations also contain noise due to facial expressions, respiration, eye blinking and environmental factors, which is handled by the proposed system. Neither Eulerian nor Lagrangian temporal signals can provide accurate HR in all cases. The cases where Eulerian temporal signals perform spuriously are determined using a novel poorness measure, and then both the Eulerian and Lagrangian temporal signals are employed for better HR estimation. Such a fusion is referred to as serial fusion. Experimental results reveal that the error of the proposed algorithm is 1.8 ± 3.6, which is significantly lower than that of existing well-known systems.
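
    The Eulerian color pathway mentioned above is commonly implemented by averaging a face region's green channel over time, band-pass filtering around plausible heart rates, and taking the dominant spectral peak; a minimal sketch under those assumptions (SciPy/NumPy, not the authors' serial-fusion system) follows.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def eulerian_hr_bpm(green_means, fps, lo_hz=0.7, hi_hz=4.0):
        """Estimate heart rate (beats per minute) from the per-frame mean green
        value of a face ROI. Band limits of 0.7-4 Hz (42-240 bpm) are a common,
        illustrative choice."""
        signal = np.asarray(green_means, dtype=float)
        signal = signal - signal.mean()
        b, a = butter(3, [lo_hz, hi_hz], btype="band", fs=fps)
        filtered = filtfilt(b, a, signal)
        spectrum = np.abs(np.fft.rfft(filtered))
        freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
        band = (freqs >= lo_hz) & (freqs <= hi_hz)
        peak_hz = freqs[band][np.argmax(spectrum[band])]
        return 60.0 * peak_hz

    # Hypothetical usage: green_means collected from a face tracker at 30 fps.
    # print(round(eulerian_hr_bpm(green_means, fps=30)))
    ```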

  17. ETR BUILDING, TRA642, INTERIOR. CONSOLE FLOOR, NORTH HALF. CAMERA IS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. CONSOLE FLOOR, NORTH HALF. CAMERA IS NEAR NORTHWEST CORNER AND FACING SOUTH ALONG WEST CORRIDOR. STORAGE CANAL IS ALONG LEFT OF VIEW; PERIMETER WALL, ALONG RIGHT. CORRIDOR WAS ONE MEANS OF WALKING FROM NORTH TO SOUTH SIDE OF CONSOLE FLOOR. INL NEGATIVE NO. HD46-18-1. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  18. Acquisition of the spatial temperature distribution of rock faces by using infrared thermography

    NASA Astrophysics Data System (ADS)

    Beham, Michael; Rode, Matthias; Schnepfleitner, Harald; Sass, Oliver

    2013-04-01

    Rock temperature plays a central role in weathering and therefore influences the risk potential originating from rockfall processes. So far, temperature has mainly been acquired with point-based measuring methods, and two-dimensional temperature data are accordingly rare. To overcome this limitation, an infrared camera was used to collect and analyse data on the spatial temperature distribution of 10 x 10 m sections of rock faces in the Gesäuse (900 m a.s.l.) and in the Dachsteingebirge (2700 m a.s.l.) within the framework of the research project ROCKING ALPS (FWF-P24244). The ability of infrared thermography to capture area-wide temperatures has hardly been exploited in this context. In order to investigate the differences between north-facing and south-facing rock faces at about the same time, it was necessary to move the camera between the sites. The resulting offset of the time-lapse infrared images made it necessary to develop a methodology to rectify the captured images in order to create matching datasets for further analysis. With the relatively simple camera used, one of the main challenges was to convert the colour-scale or grey-scale values of the rectified image back to temperature values after rectification. The processing steps were mainly carried out with MATLAB. South-facing rock faces generally experienced higher temperatures and amplitudes than north-facing ones. Regarding the spatial temperature distribution, the temperatures of shady areas were clearly below those of sunny ones, with the latter also showing the highest amplitudes. Joints and sun-shaded areas were characterised by attenuated diurnal temperature fluctuations closely paralleling the air temperature. The temperature of protruding rock parts and of loose debris responded very quickly to changes in radiation and air temperature, while massive rock reacted more slowly. The potential effects of temperature on weathering could so far only be assessed qualitatively. However, the variability of temperatures and amplitudes on a rather small and homogeneous section of a rockwall is surprisingly high, which challenges any statement about weathering effectiveness based on point measurements. In simple terms, the use of infrared thermography has proven its value in this pilot study and promises to be a useful tool for research into rock weathering.
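
    Converting the grey values of a rectified thermal image back to temperatures, as mentioned above, is in the simplest case a linear rescaling between the temperature limits shown on the camera's colour bar; the sketch below shows that assumption-laden minimal version (NumPy; the MATLAB workflow used in the study may differ).

    ```python
    import numpy as np

    def gray_to_temperature(gray_image, t_min, t_max, gray_min=0, gray_max=255):
        """Map 8-bit grey values of a rectified thermal image back to degrees
        Celsius, assuming a linear grey scale between the colour-bar limits
        t_min and t_max (an illustrative simplification)."""
        gray = gray_image.astype(float)
        scale = (t_max - t_min) / float(gray_max - gray_min)
        return t_min + (gray - gray_min) * scale

    # Hypothetical usage with the colour-bar limits read from the camera display:
    # temps = gray_to_temperature(rectified_gray, t_min=-5.0, t_max=35.0)
    ```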

  19. Achievable Rate Estimation of IEEE 802.11ad Visual Big-Data Uplink Access in Cloud-Enabled Surveillance Applications.

    PubMed

    Kim, Joongheon; Kim, Jong-Kook

    2016-01-01

    This paper addresses computation procedures for estimating the impact of interference in 60 GHz IEEE 802.11ad uplink access, in order to construct a visual big-data database from randomly deployed surveillance camera sensing devices. The large-scale visual information acquired from the surveillance camera devices will be used to organize the big-data database, so this estimation is essential for constructing a centralized cloud-enabled surveillance database. The performance estimation captures the interference impact on the target cloud access points from multiple interference components generated by 60 GHz wireless transmissions from nearby surveillance camera devices to their associated cloud access points. With this uplink interference scenario, the interference impact on the main wireless transmission from a target surveillance camera device to its associated target cloud access point is measured and estimated for a number of settings, taking 60 GHz radiation characteristics and antenna radiation pattern models into consideration.
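
    Interference estimation of this kind boils down to summing the received power from each interfering camera link at the target access point and forming a signal-to-interference-plus-noise ratio; the sketch below uses a generic log-distance path-loss model with isotropic gains purely as placeholder assumptions (the paper's 60 GHz antenna patterns and parameters are more elaborate).

    ```python
    import numpy as np

    def received_power_dbm(tx_power_dbm, distance_m, path_loss_exp=2.5,
                           ref_loss_db=68.0):
        """Log-distance path-loss model: reference loss at 1 m (a ballpark
        figure for 60 GHz free space) plus distance-dependent decay."""
        return tx_power_dbm - ref_loss_db - 10 * path_loss_exp * np.log10(distance_m)

    def uplink_sinr_db(signal_dist_m, interferer_dists_m,
                       tx_power_dbm=10.0, noise_dbm=-70.0):
        """SINR at the target access point for one camera uplink, with the
        other cameras treated as co-channel interferers (illustrative values)."""
        sig_mw = 10 ** (received_power_dbm(tx_power_dbm, signal_dist_m) / 10)
        interf_mw = sum(10 ** (received_power_dbm(tx_power_dbm, d) / 10)
                        for d in interferer_dists_m)
        noise_mw = 10 ** (noise_dbm / 10)
        return 10 * np.log10(sig_mw / (interf_mw + noise_mw))

    # Hypothetical deployment: target camera 5 m away, three interferers.
    # print(uplink_sinr_db(5.0, [12.0, 20.0, 35.0]))
    ```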

  20. High-immersion three-dimensional display of the numerical computer model

    NASA Astrophysics Data System (ADS)

    Xing, Shujun; Yu, Xunbo; Zhao, Tianqi; Cai, Yuanfa; Chen, Duo; Chen, Zhidong; Sang, Xinzhu

    2013-08-01

    High-immersion three-dimensional (3D) displays are valuable tools for many applications, such as designing and constructing buildings and houses, industrial architecture design, aeronautics, scientific research, entertainment, media advertisement, and military uses. However, most technologies provide the 3D display in front of a screen that is parallel to the walls, which decreases the sense of immersion. To obtain correct multi-view stereo ground images, the cameras' photosensitive surfaces should be parallel to the common focus plane, and the cameras' optical axes should be offset toward the center of the common focus plane in both the vertical and horizontal directions. It is very common to use virtual cameras, which are ideal pinhole cameras, to display a 3D model in a computer system, and virtual cameras can be used to simulate the shooting method for multi-view ground-based stereo images. Here, two virtual shooting methods for ground-based high-immersion 3D display are presented. The position of the virtual camera is determined by the viewer's eye position in the real world. When the observer stands within the circumcircle of the 3D ground display, offset perspective projection virtual cameras are used; if the observer stands outside the circumcircle, offset perspective projection virtual cameras and orthogonal projection virtual cameras are adopted. In this paper, we mainly discuss the parameter settings of the virtual cameras: the near clip plane setting is the main point in the first method, while the rotation angle of the virtual cameras is the main point in the second method. To validate the results, we use Direct3D and OpenGL to render scenes from different viewpoints and generate a stereoscopic image. A realistic visualization system for 3D models is constructed and demonstrated for horizontal viewing, which provides high-immersion 3D visualization. The displayed 3D scenes are compared with real objects in the real world.
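
    An offset (off-axis) perspective projection of the kind used by these virtual cameras is simply an asymmetric view frustum; the sketch below builds the corresponding OpenGL-style projection matrix from explicit left/right/bottom/top bounds at the near clip plane (a generic construction, not the authors' exact parameterization).

    ```python
    import numpy as np

    def offset_perspective(left, right, bottom, top, near, far):
        """OpenGL-style asymmetric-frustum projection matrix. Shifting the
        left/right/bottom/top bounds off-centre yields the offset perspective
        used for ground-based multi-view rendering."""
        return np.array([
            [2 * near / (right - left), 0, (right + left) / (right - left), 0],
            [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
            [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
            [0, 0, -1, 0],
        ])

    # Hypothetical virtual camera whose frustum is shifted toward the centre of
    # the display plane (near clip plane at 0.1, far plane at 100 scene units):
    # P = offset_perspective(-0.05, 0.15, -0.08, 0.12, 0.1, 100.0)
    ```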

  1. Visualizing Interstellar's Wormhole

    NASA Astrophysics Data System (ADS)

    James, Oliver; von Tunzelmann, Eugénie; Franklin, Paul; Thorne, Kip S.

    2015-06-01

    Christopher Nolan's science fiction movie Interstellar offers a variety of opportunities for students in elementary courses on general relativity theory. This paper describes such opportunities, including: (i) At the motivational level, the manner in which elementary relativity concepts underlie the wormhole visualizations seen in the movie; (ii) At the briefest computational level, instructive calculations with simple but intriguing wormhole metrics, including, e.g., constructing embedding diagrams for the three-parameter wormhole that was used by our visual effects team and Christopher Nolan in scoping out possible wormhole geometries for the movie; (iii) Combining the proper reference frame of a camera with solutions of the geodesic equation, to construct a light-ray-tracing map backward in time from a camera's local sky to a wormhole's two celestial spheres; (iv) Implementing this map, for example, in Mathematica, Maple or Matlab, and using that implementation to construct images of what a camera sees when near or inside a wormhole; (v) With the student's implementation, exploring how the wormhole's three parameters influence what the camera sees—which is precisely how Christopher Nolan, using our implementation, chose the parameters for Interstellar's wormhole; (vi) Using the student's implementation, exploring the wormhole's Einstein ring and particularly the peculiar motions of star images near the ring, and exploring what it looks like to travel through a wormhole.
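
    Item (ii) above, constructing an embedding diagram, can be tried on the simplest Ellis wormhole metric, ds^2 = dl^2 + (b0^2 + l^2) dphi^2 for an equatorial slice, rather than the movie's three-parameter wormhole; under that simplifying assumption the embedding surface has the closed form z(l) = b0 * asinh(l / b0), which the short sketch below tabulates for plotting (an illustrative exercise, not the visual-effects code).

    ```python
    import numpy as np

    def ellis_embedding(b0=1.0, l_max=5.0, n=201):
        """Embedding diagram of an equatorial slice of the Ellis wormhole,
        ds^2 = dl^2 + (b0^2 + l^2) dphi^2: returns the proper radial
        coordinate l, the circumferential radius r(l) and the lift
        z(l) = b0 * asinh(l / b0) of the embedded surface."""
        l = np.linspace(-l_max, l_max, n)      # proper radial coordinate
        r = np.sqrt(b0 ** 2 + l ** 2)          # circumferential radius
        z = b0 * np.arcsinh(l / b0)            # height of the embedding surface
        return l, r, z

    # Hypothetical plotting step (matplotlib assumed available):
    # l, r, z = ellis_embedding()
    # import matplotlib.pyplot as plt; plt.plot(r, z); plt.show()
    ```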

  2. Contextual view showing northeastern eucalyptus windbreak and portion of citrus ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing northeastern eucalyptus windbreak and portion of citrus orchard. Camera facing 118° east-southeast. - Goerlitz House, 9893 Highland Avenue, Rancho Cucamonga, San Bernardino County, CA

  3. Mini Compton Camera Based on an Array of Virtual Frisch-Grid CdZnTe Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Wonho; Bolotnikov, Aleksey; Lee, Taewoong

    In this study, we constructed a mini Compton camera based on an array of CdZnTe detectors and assessed its spectral and imaging properties. The entire array consisted of 6×6 Frisch-grid CdZnTe detectors, each with a size of 6×6×15 mm³. Since it is easier and more practical to grow small CdZnTe crystals rather than large monolithic ones, constructing a mosaic array of parallelepiped crystals can be an effective way to build a more efficient, large-volume detector. With the fully operational CdZnTe array, we measured the energy spectra for 133Ba, 137Cs, and 60Co radiation sources, and we also located these sources using a Compton imaging approach. Although the Compton camera was small enough to hand-carry, its intrinsic efficiency was several orders of magnitude higher than those reported in previous research using spatially separated arrays, because our camera measured the interactions inside the CZT detector array, wherein the detector elements were positioned very close to each other. Lastly, the performance of our camera was compared with that of a camera based on a pixelated detector.

  4. Mini Compton Camera Based on an Array of Virtual Frisch-Grid CdZnTe Detectors

    DOE PAGES

    Lee, Wonho; Bolotnikov, Aleksey; Lee, Taewoong; ...

    2016-02-15

    In this study, we constructed a mini Compton camera based on an array of CdZnTe detectors and assessed its spectral and imaging properties. The entire array consisted of 6×6 Frisch-grid CdZnTe detectors, each with a size of 6×6×15 mm³. Since it is easier and more practical to grow small CdZnTe crystals rather than large monolithic ones, constructing a mosaic array of parallelepiped crystals can be an effective way to build a more efficient, large-volume detector. With the fully operational CdZnTe array, we measured the energy spectra for 133Ba, 137Cs, and 60Co radiation sources, and we also located these sources using a Compton imaging approach. Although the Compton camera was small enough to hand-carry, its intrinsic efficiency was several orders of magnitude higher than those reported in previous research using spatially separated arrays, because our camera measured the interactions inside the CZT detector array, wherein the detector elements were positioned very close to each other. Lastly, the performance of our camera was compared with that of a camera based on a pixelated detector.

  5. Accuracy of Wearable Cameras to Track Social Interactions in Stroke Survivors.

    PubMed

    Dhand, Amar; Dalton, Alexandra E; Luke, Douglas A; Gage, Brian F; Lee, Jin-Moo

    2016-12-01

    Social isolation after a stroke is related to poor outcomes. However, a full study of social networks on stroke outcomes is limited by the current metrics available. Typical measures of social networks rely on self-report, which is vulnerable to response bias and measurement error. We aimed to test the accuracy of an objective measure-wearable cameras-to capture face-to-face social interactions in stroke survivors. If accurate and usable in real-world settings, this technology would allow improved examination of social factors on stroke outcomes. In this prospective study, 10 stroke survivors each wore 2 wearable cameras: Autographer (OMG Life Limited, Oxford, United Kingdom) and Narrative Clip (Narrative, Linköping, Sweden). Each camera automatically took a picture every 20-30 seconds. Patients mingled with healthy controls for 5 minutes of 1-on-1 interactions followed by 5 minutes of no interaction for 2 hours. After the event, 2 blinded judges assessed whether photograph sequences identified interactions or noninteractions. Diagnostic accuracy statistics were calculated. A total of 8776 photographs were taken and adjudicated. In distinguishing interactions, the Autographer's sensitivity was 1.00 and specificity was .98. The Narrative Clip's sensitivity was .58 and specificity was 1.00. The receiver operating characteristic curves of the 2 devices were statistically different (Z = 8.26, P < .001). Wearable cameras can accurately detect social interactions of stroke survivors. Likely because of its large field of view, the Autographer was more sensitive than the Narrative Clip for this purpose. Copyright © 2016 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  6. The construction FACE database - Codifying the NIOSH FACE reports.

    PubMed

    Dong, Xiuwen Sue; Largay, Julie A; Wang, Xuanwen; Cain, Chris Trahan; Romano, Nancy

    2017-09-01

    The National Institute for Occupational Safety and Health (NIOSH) has published reports detailing the results of investigations on selected work-related fatalities through the Fatality Assessment and Control Evaluation (FACE) program since 1982. Information from construction-related FACE reports was coded into the Construction FACE Database (CFD). Use of the CFD was illustrated by analyzing major CFD variables. A total of 768 construction fatalities were included in the CFD. Information on decedents, safety training, use of PPE, and FACE recommendations were coded. Analysis shows that one in five decedents in the CFD died within the first two months on the job; 75% and 43% of reports recommended having safety training or installing protection equipment, respectively. Comprehensive research using FACE reports may improve understanding of work-related fatalities and provide much-needed information on injury prevention. The CFD allows researchers to analyze the FACE reports quantitatively and efficiently. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.

  7. Mars Orbiter Camera Views the 'Face on Mars' - Best View from Viking

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility 9:15 AM and the raw image immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS.

    The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970s. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long.

    This Viking Orbiter image is one of the best Viking pictures of the area Cydonia where the 'Face' is located. Marked on the image are the 'footprint' of the high resolution (narrow angle) Mars Orbiter Camera image and the area seen in enlarged views (dashed box). See PIA01440-1442 for these images in raw and processed form.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  8. LOFT. Reactor support apparatus inside containment building (TAN650). Camera is ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Reactor support apparatus inside containment building (TAN-650). Camera is on crane rail level and facing northerly. View shows top two banks of round conduit openings on wall for electrical and other connections to control room. Ladders and platforms provide access to reactor instrumentation. Note hatch in floor and drain at edge of floor near wall. Date: 1974. INEEL negative no. 74-219 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  9. 24/7 security system: 60-FPS color EMCCD camera with integral human recognition

    NASA Astrophysics Data System (ADS)

    Vogelsong, T. L.; Boult, T. E.; Gardner, D. W.; Woodworth, R.; Johnson, R. C.; Heflin, B.

    2007-04-01

    An advanced surveillance/security system is being developed for unattended 24/7 image acquisition and automated detection, discrimination, and tracking of humans and vehicles. The low-light video camera incorporates an electron multiplying CCD sensor with a programmable on-chip gain of up to 1000:1, providing effective noise levels of less than 1 electron. The EMCCD camera operates in full color mode under sunlit and moonlit conditions, and monochrome under quarter-moonlight to overcast starlight illumination. Sixty frame per second operation and progressive scanning minimizes motion artifacts. The acquired image sequences are processed with FPGA-compatible real-time algorithms, to detect/localize/track targets and reject non-targets due to clutter under a broad range of illumination conditions and viewing angles. The object detectors that are used are trained from actual image data. Detectors have been developed and demonstrated for faces, upright humans, crawling humans, large animals, cars and trucks. Detection and tracking of targets too small for template-based detection is achieved. For face and vehicle targets the results of the detection are passed to secondary processing to extract recognition templates, which are then compared with a database for identification. When combined with pan-tilt-zoom (PTZ) optics, the resulting system provides a reliable wide-area 24/7 surveillance system that avoids the high life-cycle cost of infrared cameras and image intensifiers.

  10. [Environmental Education Units.] Photography for Kids. Vacant Lot Studies. Contour Mapping.

    ERIC Educational Resources Information Center

    Minneapolis Independent School District 275, Minn.

    Techniques suitable for use with elementary school students when studying field environment are described in these four booklets. Techniques for photography (construction of simple cameras, printing on blueprint and photographic paper, use of simple commercial cameras, development of exposed film); for measuring microclimatic factors (temperature,…

  11. Contextual view of building, with building #11 in right foreground. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building, with building #11 in right foreground. Camera facing east - Naval Supply Center, Broadway Complex, Administration Storehouse, 911 West Broadway, San Diego, San Diego County, CA

  12. Characterizing Microbial Mat Morphology with Structure from Motion Techniques in Ice-Covered Lake Joyce, McMurdo Dry Valleys, Antarctica

    NASA Astrophysics Data System (ADS)

    Mackey, T. J.; Leidman, S. Z.; Allen, B.; Hawes, I.; Lawrence, J.; Jungblut, A. D.; Krusor, M.; Coleman, L.; Sumner, D. Y.

    2015-12-01

    Structure from Motion (SFM) techniques can provide quantitative morphological documentation of otherwise inaccessible benthic ecosystems such as microbial mats in Lake Joyce, a perennially ice-covered lake of the Antarctic McMurdo Dry Valleys (MDV). Microbial mats are a key ecosystem of MDV lakes, and diverse mat morphologies like pinnacles emerge from interactions among microbial behavior, mineralization, and environmental conditions. Environmental gradients can be isolated to test mat growth models, but assessment of mat morphology along these gradients is complicated by their inaccessibility: the Lake Joyce ice cover is 4-5 m thick, water depths containing diverse pinnacle morphologies are 9-14 m, and relevant mat features are cm-scale. In order to map mat pinnacle morphology in different sedimentary settings, we deployed drop cameras (SeaViewer and GoPro) through 29 GPS referenced drill holes clustered into six stations along a transect spanning 880 m. Once under the ice cover, a boom containing a second GoPro camera was unfurled and rotated to collect oblique images of the benthic mats within dm of the mat-water interface. This setup allowed imaging from all sides over a ~1.5 m diameter area of the lake bottom. Underwater lens parameters were determined for each camera in Agisoft Lens; images were reconstructed and oriented in space with the SFM software Agisoft Photoscan, using the drop camera axis of rotation as up. The reconstructions were compared to downward facing images to assess accuracy, and similar images of an object with known geometry provided a test for expected error in reconstructions. Downward facing images identify decreasing pinnacle abundance in higher sedimentation settings, and quantitative measurements of 3D reconstructions in KeckCAVES LidarViewer supplement these mat morphological facies with measurements of pinnacle height and orientation. Reconstructions also help isolate confounding variables for mat facies trends with measurements of lake bottom slope and underlying relief that could influence pinnacle growth. Comparison of 3D reconstructions to downward-facing drop camera images demonstrate that SFM is a powerful tool for documenting diverse mat morphologies across environmental gradients in ice-covered lakes.

  13. Watching elderly and disabled person's physical condition by remotely controlled monorail robot

    NASA Astrophysics Data System (ADS)

    Nagasaka, Yasunori; Matsumoto, Yoshinori; Fukaya, Yasutoshi; Takahashi, Tomoichi; Takeshita, Toru

    2001-10-01

    We are developing a nursing-support system using robots and cameras. The cameras are mounted on a remote-controlled monorail robot that moves around a room and watches the elderly. Elderly people at home or in nursing homes need attention at all times, which requires staff to watch them continuously; the purpose of our system is to assist those staff members. A host computer directs the monorail robot to move in front of the elderly person using images taken by cameras on the ceiling. A CCD camera mounted on the monorail robot takes pictures of the person's facial expression and movements. The robot sends the images to the host computer, which checks whether something unusual has happened. We propose a simple calibration method for positioning the monorail robot so that it tracks the movements of the elderly person and keeps the face at the center of the camera view. We built a small experimental system and evaluated our camera calibration method and image processing algorithm.
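
    Keeping the tracked face at the center of the camera view, as described above, can be done with a simple proportional controller on the horizontal pixel error; the sketch below assumes a hypothetical `face_center_x(image)` detector and a `move_robot(velocity)` command and only illustrates the control idea, not the authors' system.

    ```python
    def centering_step(image, face_center_x, move_robot,
                       image_width=640, gain=0.002, deadband_px=15):
        """One control step: measure the horizontal offset of the detected face
        from the image centre and command a proportional rail velocity.
        `face_center_x` and `move_robot` are hypothetical interfaces."""
        cx = face_center_x(image)
        if cx is None:                     # no face detected; stop the robot
            move_robot(0.0)
            return
        error_px = cx - image_width / 2
        if abs(error_px) < deadband_px:
            move_robot(0.0)                # close enough: hold position
        else:
            move_robot(gain * error_px)    # positive error -> move toward +x
    ```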

  14. 2. HEALTH CENTER OFFICE SOUTH BACK AND EAST SIDE, FROM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. HEALTH CENTER OFFICE SOUTH BACK AND EAST SIDE, FROM PASSAGE BEHIND COURTHOUSE, CAMERA FACING NORTHWEST. - Lancaster County Center, Health Center Office, 4845 Cedar Avenue, Lancaster, Los Angeles County, CA

  15. Building, roof, with machinery penthouses on left and harbor control ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Building, roof, with machinery penthouses on left and harbor control tower on right. Camera facing south - Naval Supply Center, Broadway Complex, Warehouse, 911 West Broadway, San Diego, San Diego County, CA

  16. Contextual view showing drainage culvert in foreground boarding east side ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing drainage culvert in foreground bordering east side of knoll with eucalyptus windbreak. Camera facing 278° southwest. - Goerlitz House, 9893 Highland Avenue, Rancho Cucamonga, San Bernardino County, CA

  17. Driver face recognition as a security and safety feature

    NASA Astrophysics Data System (ADS)

    Vetter, Volker; Giefing, Gerd-Juergen; Mai, Rudolf; Weisser, Hubert

    1995-09-01

    We present a driver face recognition system for comfortable access control and individualized settings of automobiles. The primary goals are the prevention of car theft and of serious accidents caused by unauthorized use (joy-riders), as well as increased safety through optimal settings, e.g. of the mirrors and the seat position. The person sitting in the driver's seat is observed automatically by a small video camera in the dashboard. All the driver has to do is behave cooperatively, i.e. look into the camera. A classification system validates his or her access. Only after a positive identification can the car be used and the driver-specific environment (e.g. seat position, mirrors, etc.) be set up to ensure the driver's comfort and safety. The driver identification system has been integrated into a Volkswagen research car. Recognition results are presented.

  18. Integrating daylighting into a 3,000 seat church auditorium and network quality television production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holder, L.M. III; Holder, L.M. IV

    1999-07-01

    The project was designed by the Overland Partners architectural firm for Riverbend Church of Austin as an auditorium for Sunday services and as a venue for special theatrical presentations for the church and the community. It is an amphitheater on a hillside overlooking the Colorado River valley; the amphitheater form was selected to keep the audience closer to the speaker. A 175 ft wide by 60 ft tall arched window was installed on the north face to allow the audience to see panoramic views of the tree-covered hills on the other side of the valley in the Texas Hill Country. Although the design is quite effective in achieving the program goals, these characteristics make it difficult to achieve effective daylighting without glare for the audience and the television cameras, since both face the north glazing. The design team was therefore faced with providing quality daylighting for the audience and the television cameras from the wall behind the stage. Most television studios have carefully controlled lighting systems in which the major lighting component comes from behind the cameras. Virtually all television facilities in which daylight contributes to the production lighting are in buildings either with high-shading-coefficient glass that illuminates all areas nearly equally, or with almost all-glass envelopes daylit from skylights and clerestories above. All television networks have requirements for controlling the quality of the video images to parallel those conditions for a program to be aired.

  19. Multi-modal low cost mobile indoor surveillance system on the Robust Artificial Intelligence-based Defense Electro Robot (RAIDER)

    NASA Astrophysics Data System (ADS)

    Nair, Binu M.; Diskin, Yakov; Asari, Vijayan K.

    2012-10-01

    We present an autonomous system capable of performing security check routines. The surveillance machine, a Clearpath Husky robotic platform, is equipped with three IP cameras with different orientations for the surveillance tasks of face recognition, human activity recognition, autonomous navigation, and 3D reconstruction of its environment. Combining the computer vision algorithms with the robotic platform has given birth to the Robust Artificial Intelligence-based Defense Electro-Robot (RAIDER). The end purpose of the RAIDER is to conduct a patrolling routine on a single floor of a building several times a day. As the RAIDER travels down the corridors, off-line algorithms use two of its side-mounted cameras to perform 3D reconstruction from monocular vision, updating a 3D model to the most current state of the indoor environment. Using frames from the front-mounted camera, positioned at human eye level, the system performs face recognition with real-time training of unknown subjects. A human activity recognition algorithm will also be implemented, in which each detected person is assigned to a set of action classes chosen to classify ordinary and harmful student activities in a hallway setting. The system is designed to detect changes and irregularities within an environment as well as to become familiar with regular faces and actions, in order to distinguish potentially dangerous behavior. In this paper, we present the various algorithms, and their modifications, which when implemented on the RAIDER serve the purpose of indoor surveillance.

  20. Familiarity effects in the construction of facial-composite images using modern software systems.

    PubMed

    Frowd, Charlie D; Skelton, Faye C; Butt, Neelam; Hassan, Amal; Fields, Stephen; Hancock, Peter J B

    2011-12-01

    We investigate the effect of target familiarity on the construction of facial composites, as used by law enforcement to locate criminal suspects. Two popular software construction methods were investigated. Participants were shown a target face that was either familiar or unfamiliar to them and constructed a composite of it from memory using a typical 'feature' system, involving selection of individual facial features, or one of the newer 'holistic' types, involving repeated selection and breeding from arrays of whole faces. This study found that composites constructed of a familiar face were named more successfully than composites of an unfamiliar face; also, naming of composites of internal and external features was equivalent for construction of unfamiliar targets, but internal features were better named than the external features for familiar targets. These findings applied to both systems, although benefit emerged for the holistic type due to more accurate construction of internal features and evidence for a whole-face advantage. STATEMENT OF RELEVANCE: This work is of relevance to practitioners who construct facial composites with witnesses to and victims of crime, as well as for software designers to help them improve the effectiveness of their composite systems.

  1. Applied learning-based color tone mapping for face recognition in video surveillance system

    NASA Astrophysics Data System (ADS)

    Yew, Chuu Tian; Suandi, Shahrel Azmin

    2012-04-01

    In this paper, we present an applied learning-based color tone mapping technique for video surveillance systems. The technique can be applied to both color and grayscale surveillance images. The basic idea is to learn the color or intensity statistics from a training dataset of photorealistic images of the candidates appearing in the surveillance images, and to remap the color or intensity of the input image so that its statistics match those of the training dataset. It is well known that differences in commercial surveillance camera models and in the signal-processing chipsets used by different manufacturers cause the color and intensity of images to differ from one another, creating additional challenges for face recognition in video surveillance systems. Using multi-class Support Vector Machines as the classifier on a publicly available video surveillance camera database, namely the SCface database, the approach is validated and compared to the results of a holistic approach on grayscale images. The results show that the technique can improve the color or intensity quality of video surveillance imagery for face recognition.
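
    Remapping an input image so that its intensity statistics match a training set, as described above, can be sketched as a per-channel mean/standard-deviation transfer; the snippet below is a generic version of that idea (NumPy), not the paper's learned mapping.

    ```python
    import numpy as np

    def match_channel_statistics(image, target_mean, target_std):
        """Shift and scale each channel of `image` (H x W x C, 0-255 range) so
        that its mean and standard deviation match the statistics learned from
        a training dataset (per-channel arrays target_mean / target_std)."""
        img = image.astype(float)
        src_mean = img.mean(axis=(0, 1))
        src_std = img.std(axis=(0, 1)) + 1e-8      # avoid division by zero
        remapped = (img - src_mean) / src_std * target_std + target_mean
        return np.clip(remapped, 0, 255)

    # Hypothetical usage: statistics precomputed from photorealistic face images.
    # adjusted = match_channel_statistics(surveillance_frame, train_mean, train_std)
    ```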

  2. ETR BUILDING, TRA642, INTERIOR. BASEMENT. CAMERA IS AT MIDPOINT OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. BASEMENT. CAMERA IS AT MIDPOINT OF SOUTH CORRIDOR AND FACES EAST, OPPOSITE DIRECTION FROM VIEWS ID-33-G-98 AND ID-33-G-99. STEEL DOOR AT LEFT OPENS BY ROLLING IT INTO CORRIDOR ON RAILS. TANK AT FAR END OF CORRIDOR IS EMERGENCY CORE COOLING CATCH TANK FOR A TEST LOOP. INL NEGATIVE NO. HD46-30-4. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  3. HOT CELL BUILDING, TRA632. CONTEXTUAL VIEW ALONG WALLEYE AVENUE, CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT CELL BUILDING, TRA-632. CONTEXTUAL VIEW ALONG WALLEYE AVENUE, CAMERA FACING EASTERLY. HOT CELL BUILDING IS AT CENTER LEFT OF VIEW; THE LOW-BAY PROJECTION WITH LADDER IS THE TEST TRAIN ASSEMBLY FACILITY, ADDED IN 1968. MTR BUILDING IS IN LEFT OF VIEW. HIGH-BAY BUILDING AT RIGHT IS THE ENGINEERING TEST REACTOR BUILDING, TRA-642. INL NEGATIVE NO. HD46-32-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  4. Recreation of three-dimensional objects in a real-time simulated environment by means of a panoramic single lens stereoscopic image-capturing device

    NASA Astrophysics Data System (ADS)

    Wong, Erwin

    2000-03-01

    Traditional linear imaging methods limit the viewer to a single fixed-point perspective. By means of a single-lens, multiple-perspective mirror system, a 360-degree representation of the area around the camera is reconstructed. This reconstruction is used to overcome the limitations of a traditional camera by providing the viewer with many different perspectives. By shaping the mirror as a hemispherical surface with multiple focal lengths at various diameters, and by placing a parabolic mirror overhead, a stereoscopic image can be extracted from the image captured by a high-resolution camera placed beneath the mirror. Image extraction and correction are performed by computer processing of the captured image, which presents up to five distinguishable viewpoints from which a computer can extrapolate pseudo-perspective data. Geometric and depth-of-field information can be extrapolated by comparing and isolating objects within the virtual scene post-processed by the computer. Combining these data with scene-rendering software gives the viewer the ability to choose a desired viewing position, multiple dynamic perspectives, and virtually constructed perspectives based on minimal existing data. An examination of the workings of the mirror relay system is provided, including possible image extrapolation and correction methods.
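
    A first processing step for this kind of catadioptric (mirror-based) capture is unwrapping the annular mirror image into a rectangular panorama by polar-to-Cartesian resampling; the sketch below is a generic version of that remapping (NumPy, nearest-neighbour sampling), with the mirror centre and radii as hypothetical calibration inputs.

    ```python
    import numpy as np

    def unwrap_mirror_image(img, center, r_inner, r_outer, out_w=1024, out_h=256):
        """Unwrap the annular region of a catadioptric image (H x W x C array)
        into an out_h x out_w panorama by nearest-neighbour polar sampling.
        `center`, `r_inner` and `r_outer` come from a (hypothetical) mirror
        calibration."""
        cx, cy = center
        theta = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
        radius = np.linspace(r_inner, r_outer, out_h)
        rr, tt = np.meshgrid(radius, theta, indexing="ij")
        x = np.clip((cx + rr * np.cos(tt)).astype(int), 0, img.shape[1] - 1)
        y = np.clip((cy + rr * np.sin(tt)).astype(int), 0, img.shape[0] - 1)
        return img[y, x]

    # Hypothetical usage:
    # panorama = unwrap_mirror_image(frame, center=(960, 540), r_inner=120, r_outer=500)
    ```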

  5. Effects of Synchronicity and Belongingness on Face-to-Face and Computer-Mediated Constructive Controversy

    ERIC Educational Resources Information Center

    Saltarelli, Andy J.; Roseth, Cary J.

    2014-01-01

    Adapting face-to-face (FTF) pedagogies to online settings raises boundary questions about the contextual conditions in which the same instructional method stimulates different outcomes. We address this issue by examining FTF and computer-mediated communication (CMC) versions of constructive controversy, a cooperative learning procedure involving…

  6. Make Your Own Animated Movies. Yellow Ball Workshop Film Techniques.

    ERIC Educational Resources Information Center

    Anderson, Yvonne

    At the Yellow Ball Workshop, children and teenagers make animated films using simple art materials and camera equipment. Based on the animation techniques developed at the workshop, complete instructions for constructing backgrounds and characters and for animating the figures are provided. Setting up and using the camera, splicing film,…

  7. Smartphone based face recognition tool for the blind.

    PubMed

    Kramer, K M; Hedin, D S; Rolkosky, D J

    2010-01-01

    The inability to identify people during group meetings is a disadvantage for blind people in many professional and educational situations. To explore the efficacy of face recognition using smartphones in these settings, we have prototyped and tested a face recognition tool for blind users. The tool utilizes smartphone technology in conjunction with a wireless network to provide audio feedback identifying the people in front of the blind user. Testing indicated that the face recognition technology can tolerate up to a 40-degree angle between the direction a person is looking and the camera's axis, and achieved a 96% success rate with no false positives. Future work will further develop the technology for local face recognition on the smartphone in addition to remote server-based face recognition.

  8. Principal axis-based correspondence between multiple cameras for people tracking.

    PubMed

    Hu, Weiming; Hu, Min; Zhou, Xue; Tan, Tieniu; Lou, Jianguang; Maybank, Steve

    2006-04-01

    Visual surveillance using multiple cameras has attracted increasing interest in recent years. Establishing correspondence between multiple cameras is one of the most important and basic problems that multi-camera surveillance raises. In this paper, we propose a simple and robust method, based on the principal axes of people, to match people across multiple cameras. A correspondence likelihood reflecting the similarity of pairs of principal axes is constructed from the relationship between the "ground-points" of people detected in each camera view and the intersections of principal axes detected in different camera views and transformed into the same view. Our method has the following desirable properties: 1) camera calibration is not needed; 2) accurate motion detection and segmentation are less critical, owing to the robustness of the principal-axis-based feature to noise; and 3) based on the fused data derived from the correspondence results, the positions of people in each camera view can be accurately located even when the people are partially occluded in all views. Experimental results on several real video sequences from outdoor environments demonstrate the effectiveness, efficiency, and robustness of our method.
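
    A minimal sketch of the ingredients named above, in Python with NumPy: the principal axis of a silhouette from a PCA of its foreground pixels, a "ground-point" where that axis meets the lowest foreground row, and the transfer of that point to another view through a ground-plane homography. The helper names and the toy mask are hypothetical, and this is not the paper's full correspondence-likelihood construction.

    ```python
    import numpy as np

    def principal_axis(mask):
        """Return (mean point, unit direction) of a binary person silhouette,
        via PCA on the foreground pixel coordinates (row, col)."""
        ys, xs = np.nonzero(mask)
        pts = np.stack([ys, xs], axis=1).astype(float)
        mean = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - mean, full_matrices=False)
        return mean, vt[0]                      # dominant direction

    def ground_point(mask):
        """Intersection of the principal axis with the lowest foreground row,
        used as the person's footprint on the ground plane."""
        mean, d = principal_axis(mask)
        y_bottom = np.nonzero(mask)[0].max()
        t = (y_bottom - mean[0]) / (d[0] + 1e-9)
        return np.array([y_bottom, mean[1] + t * d[1]])

    def to_other_view(pt_yx, H):
        """Map a ground point (y, x) into another camera view with a
        ground-plane homography H (3x3, acting on homogeneous (x, y, 1))."""
        v = H @ np.array([pt_yx[1], pt_yx[0], 1.0])
        return np.array([v[1] / v[2], v[0] / v[2]])

    # Toy usage: a vertical bar as a "person", identity homography.
    mask = np.zeros((120, 80), dtype=bool)
    mask[30:110, 38:43] = True
    print(ground_point(mask), to_other_view(ground_point(mask), np.eye(3)))
    ```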

  9. Camera Control and Geo-Registration for Video Sensor Networks

    NASA Astrophysics Data System (ADS)

    Davis, James W.

    With the use of large video networks, there is a need to coordinate and interpret the video imagery for decision support systems with the goal of reducing the cognitive and perceptual overload of human operators. We present computer vision strategies that enable efficient control and management of cameras to effectively monitor wide-coverage areas, and examine the framework within an actual multi-camera outdoor urban video surveillance network. First, we construct a robust and precise camera control model for commercial pan-tilt-zoom (PTZ) video cameras. In addition to providing a complete functional control mapping for PTZ repositioning, the model can be used to generate wide-view spherical panoramic viewspaces for the cameras. Using the individual camera control models, we next individually map the spherical panoramic viewspace of each camera to a large aerial orthophotograph of the scene. The result provides a unified geo-referenced map representation to permit automatic (and manual) video control and exploitation of cameras in a coordinated manner. The combined framework provides new capabilities for video sensor networks that are of significance and benefit to the broad surveillance/security community.
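
    A toy sketch of the kind of mapping such a control model provides, under the simplifying assumption of an equirectangular panorama indexed directly by pan and tilt angles; the function names, panorama size, and angle ranges are assumptions, not the paper's calibrated model.

    ```python
    import numpy as np

    def pan_tilt_to_panorama(pan_deg, tilt_deg, pano_w=3600, pano_h=1800,
                             pan_range=(-180.0, 180.0), tilt_range=(-90.0, 90.0)):
        """Map a PTZ pointing direction to pixel coordinates in an
        equirectangular 'spherical viewspace' panorama."""
        u = (pan_deg - pan_range[0]) / (pan_range[1] - pan_range[0]) * (pano_w - 1)
        v = (tilt_range[1] - tilt_deg) / (tilt_range[1] - tilt_range[0]) * (pano_h - 1)
        return int(round(u)), int(round(v))

    def panorama_to_pan_tilt(u, v, pano_w=3600, pano_h=1800,
                             pan_range=(-180.0, 180.0), tilt_range=(-90.0, 90.0)):
        """Inverse mapping: a clicked panorama pixel back to a PTZ command."""
        pan = pan_range[0] + u / (pano_w - 1) * (pan_range[1] - pan_range[0])
        tilt = tilt_range[1] - v / (pano_h - 1) * (tilt_range[1] - tilt_range[0])
        return pan, tilt

    print(pan_tilt_to_panorama(45.0, 10.0))                         # panorama pixel
    print(panorama_to_pan_tilt(*pan_tilt_to_panorama(45.0, 10.0)))  # back to ~(45, 10)
    ```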

  10. System Construction of the Stilbene Compact Neutron Scatter Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsmith, John E. M.; Gerling, Mark D.; Brennan, James S.

    This report documents the construction of a stilbene-crystal-based compact neutron scatter camera. This system is essentially identical to the MINER (Mobile Imager of Neutrons for Emergency Responders) system previously built and deployed under DNN R&D funding, but with the liquid scintillator in the detection cells replaced by stilbene crystals. The availability of these two systems for side-by-side performance comparisons will enable us to unambiguously identify the performance enhancements provided by the stilbene crystals, which have only recently become commercially available in the large size required (3” diameter, 3” deep).

  11. Evaluation of the MSFC facsimile camera system as a tool for extraterrestrial geologic exploration

    NASA Technical Reports Server (NTRS)

    Wolfe, E. W.; Alderman, J. D.

    1971-01-01

    Utility of the Marshall Space Flight Center (MSFC) facsimile camera system for extraterrestrial geologic exploration was investigated during the spring of 1971 near Merriam Crater in northern Arizona. Although the system with its present hard-wired recorder operates erratically, the imagery showed that the camera could be developed as a prime imaging tool for automated missions. Its utility would be enhanced by development of computer techniques that use digital camera output to construct topographic maps, and it needs increased resolution for examining near-field details. A supplementary imaging system may be necessary for hand-specimen examination at low magnification.

  12. The Faces in Infant-Perspective Scenes Change over the First Year of Life

    PubMed Central

    Jayaraman, Swapnaa; Fausey, Caitlin M.; Smith, Linda B.

    2015-01-01

    Mature face perception has its origins in the face experiences of infants. However, little is known about the basic statistics of faces in early visual environments. We used head cameras to capture and analyze over 72,000 infant-perspective scenes from 22 infants aged 1-11 months as they engaged in daily activities. The frequency of faces in these scenes declined markedly with age: for the youngest infants, faces were present 15 minutes in every waking hour but only 5 minutes for the oldest infants. In general, the available faces were well characterized by three properties: (1) they belonged to relatively few individuals; (2) they were close and visually large; and (3) they presented views showing both eyes. These three properties most strongly characterized the face corpora of our youngest infants and constitute environmental constraints on the early development of the visual system. PMID:26016988

  13. Late afternoon view of the interior of the westcentral wall ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Late afternoon view of the interior of the west-central wall section to be removed; camera facing north. Gravestones in the foreground. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  14. Faces in water bubbles

    NASA Image and Video Library

    2013-07-12

    NASA astronaut Karen Nyberg, Expedition 36 flight engineer, watches a water bubble float freely between her and the camera, showing her image refracted in the droplet, while in the Node 1 (Unity) module of the International Space Station.

  15. Faces in water bubbles

    NASA Image and Video Library

    2013-07-12

    ISS036-E-018302 (12 July 2013) --- NASA astronaut Chris Cassidy, Expedition 36 flight engineer, watches a water bubble float freely between him and the camera, showing his image refracted, in the Unity node of the International Space Station.

  16. 8. DETAIL OF WINDOW AT EAST END OF NORTH ELEVATION, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. DETAIL OF WINDOW AT EAST END OF NORTH ELEVATION, CAMERA FACING SOUTH. - U.S. Coast Guard Support Center Alameda, Warehouse, Spencer Road & Icarrus Drive, Coast Guard Island, Alameda, Alameda County, CA

  17. 7. DETAIL OF WINDOW AT WEST END OF SOUTH ELEVATION, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. DETAIL OF WINDOW AT WEST END OF SOUTH ELEVATION, CAMERA FACING NORTH. - U.S. Coast Guard Support Center Alameda, Warehouse, Spencer Road & Icarrus Drive, Coast Guard Island, Alameda, Alameda County, CA

  18. Interior view showing split levels with buildings 87 windows in ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view showing split levels with building 87 windows in the distance; camera facing west. - Mare Island Naval Shipyard, Mechanics Shop, Waterfront Avenue, west side between A Street & Third Street, Vallejo, Solano County, CA

  19. INTERIOR DETAIL OF, CEILINGS OF EAST BEDROOM, NORTH WING, SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR DETAIL OF CEILINGS OF EAST BEDROOM, NORTH WING, SHOWING PART OF MOUNTAIN LION MURAL; CAMERA FACING NORTHEAST - Harry Carey Ranch, Ranch House, 28515 San Francisquito Canyon Road, Saugus, Los Angeles County, CA

  20. Detail of entry and stairway at center of west elevation; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of entry and stairway at center of west elevation; camera facing east. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  1. Interior detail of stairway at first floor near main entry; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of stairway at first floor near main entry; camera facing south. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  2. Interior view of typical room on second floor, west side; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of typical room on second floor, west side; camera facing north. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  3. Overview in two parts: Right view showing orchard path on ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Overview in two parts: Right view showing orchard path on left, eucalyptus windbreak bordering knoll on right. Camera facing 278° west. - Goerlitz House, 9893 Highland Avenue, Rancho Cucamonga, San Bernardino County, CA

  4. View of Chapel Park, showing bomb shelters at right foreground, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Chapel Park, showing bomb shelters at right foreground, from building 746 parking lot across Walnut Avenue; camera facing north. - Mare Island Naval Shipyard, East of Nave Drive, Vallejo, Solano County, CA

  5. Contextual view of building 505 showing west elevation from marsh; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 505 showing west elevation from marsh; camera facing east. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  6. Interior detail of trusses and high windows in north wing; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of trusses and high windows in north wing; camera facing southwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  7. View of steel warehouses (building 710 second in on right); ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses (building 710 second in on right); camera facing south. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  8. View of steel warehouses (building 710 second in on left); ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses (building 710 second in on left); camera facing west. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  9. Interior view of main entry on south elevation, showing railroad ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of main entry on south elevation, showing railroad tracks; camera facing south. - Mare Island Naval Shipyard, Boiler Shop, Waterfront Avenue, west side between A Street & Third Street, Vallejo, Solano County, CA

  10. Interior view of main entry on south elevation, showing railroad ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of main entry on south elevation, showing railroad tracks; camera facing south. - Mare Island Naval Shipyard, Machine Shop, Waterfront Avenue, west side between A Street & Third Street, Vallejo, Solano County, CA

  11. Detail of second floor window, south wall of north wing; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of second floor window, south wall of north wing; camera facing south. - Mare Island Naval Shipyard, Marine Prison, Suisun Avenue, west side between Mesa Road & San Pablo, Vallejo, Solano County, CA

  12. Human-Robot Interaction

    NASA Technical Reports Server (NTRS)

    Sandor, Aniko; Cross, E. Vincent, II; Chang, Mai Lee

    2015-01-01

    Human-robot interaction (HRI) is a discipline investigating the factors affecting the interactions between humans and robots. It is important to evaluate how the design of interfaces affects the human's ability to perform tasks effectively and efficiently when working with a robot. By understanding the effects of interface design on human performance, workload, and situation awareness, interfaces can be developed to appropriately support the human in performing tasks with minimal errors and with appropriate interaction time and effort. Thus, the results of research on human-robot interfaces have direct implications for the design of robotic systems. For efficient and effective remote navigation of a rover, a human operator needs to be aware of the robot's environment. However, during teleoperation, operators may get information about the environment only through a robot's front-mounted camera, causing a keyhole effect. The keyhole effect reduces situation awareness, which may manifest in navigation issues such as a higher number of collisions, missing critical aspects of the environment, or reduced speed. One way to compensate for the keyhole effect and the ambiguities operators experience when they teleoperate a robot is to add multiple cameras and include the robot chassis in the camera view. Augmented reality, such as overlays, can also enhance the way a person sees objects in the environment or in camera views by making them more visible. Scenes can be augmented with integrated telemetry, procedures, or map information. Furthermore, the addition of an exocentric (i.e., third-person) field of view from a camera placed in the robot's environment may provide operators with the additional information needed to gain spatial awareness of the robot. Two research studies investigated possible mitigation approaches to address the keyhole effect: 1) combining the inclusion of the robot chassis in the camera view with augmented reality overlays, and 2) modifying the camera frame of reference. The first study investigated the effects of including or excluding the robot chassis in the camera view, along with superimposing a simple arrow overlay onto the video feed, on operator task performance during teleoperation of a mobile robot in a driving task. In this study, the front half of the robot chassis was made visible through the use of three cameras, two side-facing and one forward-facing. The purpose of the second study was to compare operator performance when teleoperating a robot from an egocentric-only and a combined (egocentric plus exocentric camera) view. Camera view parameters found to be beneficial in these laboratory experiments can be implemented on NASA rovers and tested in a real-world driving and navigation scenario on-site at the Johnson Space Center.

  13. Combining Deep and Handcrafted Image Features for Presentation Attack Detection in Face Recognition Systems Using Visible-Light Camera Sensors

    PubMed Central

    Nguyen, Dat Tien; Pham, Tuyen Danh; Baek, Na Rae; Park, Kang Ryoung

    2018-01-01

    Although face recognition systems have wide application, they are vulnerable to presentation attack samples (fake samples). Therefore, a presentation attack detection (PAD) method is required to enhance the security level of face recognition systems. Most previously proposed PAD methods for face recognition systems have focused on handcrafted image features designed from expert knowledge, such as the Gabor filter, local binary pattern (LBP), local ternary pattern (LTP), and histogram of oriented gradients (HOG). As a result, the extracted features reflect limited aspects of the problem, yielding detection accuracy that is low and varies with the characteristics of presentation attack face images. Deep learning methods developed in the computer vision research community have proven suitable for automatically training feature extractors that can complement handcrafted features. To overcome the limitations of previously proposed PAD methods, we propose a new PAD method that uses a combination of deep and handcrafted features extracted from images captured by a visible-light camera sensor. Our proposed method uses a convolutional neural network (CNN) to extract deep image features and the multi-level local binary pattern (MLBP) method to extract skin detail features from face images in order to discriminate real from presentation attack face images. By combining the two types of image features, we form a new type of feature, called hybrid features, which has stronger discrimination ability than either single feature type. Finally, we use a support vector machine (SVM) to classify the image features into the real or presentation attack class. Our experimental results indicate that our proposed method outperforms previous PAD methods by yielding the smallest error rates on the same image databases. PMID:29495417

  14. Combining Deep and Handcrafted Image Features for Presentation Attack Detection in Face Recognition Systems Using Visible-Light Camera Sensors.

    PubMed

    Nguyen, Dat Tien; Pham, Tuyen Danh; Baek, Na Rae; Park, Kang Ryoung

    2018-02-26

    Although face recognition systems have wide application, they are vulnerable to presentation attack samples (fake samples). Therefore, a presentation attack detection (PAD) method is required to enhance the security level of face recognition systems. Most previously proposed PAD methods for face recognition systems have focused on handcrafted image features designed from expert knowledge, such as the Gabor filter, local binary pattern (LBP), local ternary pattern (LTP), and histogram of oriented gradients (HOG). As a result, the extracted features reflect limited aspects of the problem, yielding detection accuracy that is low and varies with the characteristics of presentation attack face images. Deep learning methods developed in the computer vision research community have proven suitable for automatically training feature extractors that can complement handcrafted features. To overcome the limitations of previously proposed PAD methods, we propose a new PAD method that uses a combination of deep and handcrafted features extracted from images captured by a visible-light camera sensor. Our proposed method uses a convolutional neural network (CNN) to extract deep image features and the multi-level local binary pattern (MLBP) method to extract skin detail features from face images in order to discriminate real from presentation attack face images. By combining the two types of image features, we form a new type of feature, called hybrid features, which has stronger discrimination ability than either single feature type. Finally, we use a support vector machine (SVM) to classify the image features into the real or presentation attack class. Our experimental results indicate that our proposed method outperforms previous PAD methods by yielding the smallest error rates on the same image databases.
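
    A minimal sketch of the hybrid-feature idea in the two abstracts above: a basic single-scale LBP histogram stands in for MLBP, a random vector stands in for the CNN features, and an SVM separates the two classes on toy data. The helpers and data are hypothetical; the real method uses a trained CNN and multi-level LBP.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def lbp_histogram(gray):
        """Basic 8-neighbour LBP histogram (256 bins) for one grayscale image.
        A stand-in for the multi-level LBP (MLBP) features used in the paper."""
        g = gray.astype(np.int32)
        c = g[1:-1, 1:-1]
        shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
        code = np.zeros_like(c)
        for bit, (dy, dx) in enumerate(shifts):
            neighbour = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
            code |= ((neighbour >= c).astype(np.int32) << bit)
        hist, _ = np.histogram(code, bins=256, range=(0, 256))
        return hist / hist.sum()

    def hybrid_feature(gray, deep_feature):
        """Concatenate handcrafted (LBP) and 'deep' features into one vector."""
        return np.concatenate([lbp_histogram(gray), deep_feature])

    # Toy usage: random 'real' vs 'attack' images with placeholder deep features.
    rng = np.random.default_rng(1)
    X = [hybrid_feature(rng.integers(0, 256, (64, 64)), rng.normal(size=128))
         for _ in range(40)]
    y = [0] * 20 + [1] * 20                      # 0 = real, 1 = presentation attack
    clf = SVC(kernel="rbf").fit(np.array(X), y)  # final SVM classifier
    print(clf.predict(np.array(X[:3])))
    ```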

  15. New camera-based microswitch technology to monitor small head and mouth responses of children with multiple disabilities.

    PubMed

    Lancioni, Giulio E; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N; O'Reilly, Mark F; Green, Vanessa A; Furniss, Fred

    2014-06-01

    This study assessed a new camera-based microswitch technology that did not require the use of color marks on the participants' faces. Two children with extensive multiple disabilities participated. The responses selected for them consisted of small lateral head movements and mouth closing or opening. The intervention was carried out according to a multiple probe design across responses. The technology involved a computer with a 2-GHz CPU, a USB video camera with a 16-mm lens, a USB cable connecting the camera and the computer, and a special software program written in ISO C++. The new technology was satisfactorily used with both children. Large increases in their responding were observed during the intervention periods (i.e., when the responses were followed by preferred stimulation). The new technology may be an important resource for persons with multiple disabilities and minimal motor behavior.
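
    A rough sketch (in Python rather than the authors' ISO C++ software) of the camera-as-microswitch idea: a response is registered when frame differencing inside a small region of interest exceeds a threshold. The function name, ROI, and threshold are assumptions.

    ```python
    import numpy as np

    def motion_triggered(prev_frame, curr_frame, roi, threshold=8.0):
        """Return True when the mean absolute intensity change inside a small
        region of interest (e.g. around the head or mouth) exceeds a threshold.

        roi = (top, bottom, left, right) in pixel coordinates.
        """
        t, b, l, r = roi
        diff = np.abs(curr_frame[t:b, l:r].astype(float) -
                      prev_frame[t:b, l:r].astype(float))
        return diff.mean() > threshold

    # Toy usage: a synthetic 'head shift' inside the ROI triggers the switch.
    prev = np.zeros((120, 160), dtype=np.uint8)
    curr = prev.copy()
    curr[40:60, 70:90] = 200                     # bright patch simulating movement
    print(motion_triggered(prev, curr, roi=(30, 70, 60, 100)))  # True
    ```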

  16. Experiment on Uav Photogrammetry and Terrestrial Laser Scanning for Ict-Integrated Construction

    NASA Astrophysics Data System (ADS)

    Takahashi, N.; Wakutsu, R.; Kato, T.; Wakaizumi, T.; Ooishi, T.; Matsuoka, R.

    2017-08-01

    In the 2016 fiscal year, the Ministry of Land, Infrastructure, Transport and Tourism of Japan started a program integrating construction and ICT in earthwork and concrete placing. The new program, named "i-Construction" and focused on productivity improvement, adopts new technologies such as UAV photogrammetry and TLS. We report a field experiment investigating whether the procedures of UAV photogrammetry and TLS following the "i-Construction" standards are feasible. In the experiment we measured an embankment of about 80 metres by 160 metres immediately after earthwork on the embankment was completed. We used two UAV-camera combinations: a larger UAV, the enRoute Zion QC730, with its onboard Sony α6000 camera, and a smaller UAV, the DJI Phantom 4, with its dedicated onboard camera. Moreover, we used a FARO Focus3D X330 terrestrial laser scanner based on the phase-shift principle. The experiment results indicate that the procedures of UAV photogrammetry using a QC730 with an α6000 and TLS using a Focus3D X330 following the "i-Construction" standards would be feasible. Furthermore, the results show that UAV photogrammetry using the lower-priced Phantom 4 was unable to satisfy the accuracy requirement for "i-Construction." The cause of the Phantom 4's low accuracy is under investigation. We also found that differences in image resolution on the ground would not greatly influence measurement accuracy in UAV photogrammetry.

  17. Thermal-depth matching in dynamic scene based on affine projection and feature registration

    NASA Astrophysics Data System (ADS)

    Wang, Hongyu; Jia, Tong; Wu, Chengdong; Li, Yongqiang

    2018-03-01

    This paper studies the construction of a 3D temperature distribution reconstruction system based on depth and thermal infrared information. A traditional calibration method cannot be used directly, because the depth and thermal infrared cameras are not sensitive to a color calibration board; we therefore design a calibration board suited to both cameras to complete their joint calibration. A local feature descriptor for thermal and depth images is also proposed. A belief propagation matching algorithm is investigated, based on space affine transformation matching and local feature matching. The 3D temperature distribution model is built by matching the 3D point cloud with the 2D thermal infrared information. Experimental results show that the method can accurately construct the 3D temperature distribution model and has strong robustness.
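
    A small sketch of the space-affine-transformation step mentioned above: warping a synthetic thermal image into the depth camera's frame with a known 2x3 affine matrix, using inverse mapping and nearest-neighbour sampling. The matrix and helper name are hypothetical; in the paper the transform would come from the joint calibration.

    ```python
    import numpy as np

    def warp_affine_nearest(src, A, out_shape):
        """Warp src into the target frame with a 2x3 affine matrix A that maps
        target (x, y) coordinates back into source coordinates (inverse map)."""
        h, w = out_shape
        ys, xs = np.mgrid[0:h, 0:w]
        ones = np.ones_like(xs)
        coords = np.stack([xs, ys, ones], axis=0).reshape(3, -1).astype(float)
        src_xy = A @ coords                              # 2 x (h*w) source coords
        sx = np.clip(src_xy[0].round().astype(int), 0, src.shape[1] - 1)
        sy = np.clip(src_xy[1].round().astype(int), 0, src.shape[0] - 1)
        return src[sy, sx].reshape(h, w)

    # Toy usage: shift a synthetic thermal image by (5, 3) px into the depth frame.
    thermal = np.zeros((100, 100), dtype=np.float32)
    thermal[20:40, 20:40] = 37.0                         # warm patch
    A = np.array([[1.0, 0.0, 5.0],                       # inverse map: x_src = x_dst + 5
                  [0.0, 1.0, 3.0]])                      #              y_src = y_dst + 3
    aligned = warp_affine_nearest(thermal, A, out_shape=(100, 100))
    print(aligned[17:37, 15:35].max())                   # warm patch now shifted
    ```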

  18. Applications of optical fibers and miniature photonic elements in medical diagnostics

    NASA Astrophysics Data System (ADS)

    Blaszczak, Urszula; Gilewski, Marian; Gryko, Lukasz; Zajac, Andrzej; Kukwa, Andrzej; Kukwa, Wojciech

    2014-05-01

    Endoscope designs known for decades, in particular small devices with diameters of a few millimetres, are based on fibre-optic imaging bundles or bundles of fibres in the illumination system (usually with a halogen source). CCD and CMOS cameras with sensor sizes of less than 5 mm emerging commercially, together with high-power LED solutions, allow modern endoscopes with many innovative properties to be designed and constructed. These constructions offer higher resolution and are also relatively cheaper, especially given the integration of most functions on a single chip. These features of CMOS sensors shorten the cycle of introducing newly developed instruments to the market. The paper includes a description of the concept of an endoscope with a miniature camera built around a CMOS detector manufactured by OmniVision. A set of LEDs located at the operator side serves as the illuminating system. A fibre-optic system and the camera lens shape the beam illuminating the observed tissue. Furthermore, to broaden the range of applications of the endoscope, the illuminator allows control of the spectral characteristics of the emitted light. The paper presents an analysis of the basic parameters of the endoscope's illumination and optical system. The possibility of adjusting the magnification of the lens, the field of view of the camera, and its spatial resolution is discussed. Special attention is drawn to issues related to the selection of light sources for the illumination in terms of energy efficiency and the ability to adjust the colour of the emitted light in order to improve the quality of the image obtained by the camera.

  19. Morning view, contextual view showing unpaved corridor down the westernmost ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view showing unpaved corridor down the westernmost lane where the wall section (E) will be removed; camera facing north-northwest. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  20. Morning view, contextual view showing the road and gate to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view showing the road and gate to be widened; view taken from the statue area with the camera facing north. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  1. Morning view of the exterior of the westernmost wall section ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view of the exterior of the westernmost wall section to be removed; camera facing south. Unpaved road in foreground; tree canopy in background. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  2. Detail of window and lamp at entrance on north side ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of window and lamp at entrance on north side of north wing; camera facing south. - Mare Island Naval Shipyard, Administrative Offices, Walnut Avenue, east side between Seventh & Eighth Streets, Vallejo, Solano County, CA

  3. INTERIOR DETAIL OF SECOND FLOOR DOORS AT NORTH END OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR DETAIL OF SECOND FLOOR DOORS AT NORTH END OF BUILDING; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  4. INTERIOR VIEW OF SECOND STORY SPACE LOOKING TOWARD SECOND FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF SECOND STORY SPACE LOOKING TOWARD SECOND FLOOR DOORS; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  5. INTERIOR VIEW OF SECOND STORY SPACE, NORTH END OF BUILDING; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF SECOND STORY SPACE, NORTH END OF BUILDING; CAMERA FACING SOUTHEAST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  6. CONTEXTUAL VIEW OF BUILDING 231 SHOWING WEST AND SOUTH ELEVATIONS; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    CONTEXTUAL VIEW OF BUILDING 231 SHOWING WEST AND SOUTH ELEVATIONS; CAMERA FACING NORTHEAST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  7. DETAIL OF FIRST STORY WINDOWS ON NORTH END OF EAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF FIRST STORY WINDOWS ON NORTH END OF EAST ELEVATION; CAMERA FACING WEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  8. DETAIL OF WINDOW AND ROOF VENT AT EAST ELEVATION GABLE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF WINDOW AND ROOF VENT AT EAST ELEVATION GABLE END; CAMERA FACING WEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  9. CONTEXTUAL VIEW OF BUILDING 231 SHOWING EAST AND NORTH ELEVATIONS; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    CONTEXTUAL VIEW OF BUILDING 231 SHOWING EAST AND NORTH ELEVATIONS; CAMERA FACING SOUTHWEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  10. DETAIL OF GABLE END WITH ARCHED WINDOW, SHOWING SOFFIT OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF GABLE END WITH ARCHED WINDOW, SHOWING SOFFIT OF OVERHANG; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  11. Contextual view of summer kitchen, showing blacksmith shop downhill at ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of summer kitchen, showing blacksmith shop downhill at right and cottage at center (between the trees); camera facing northeast - Lemmon-Anderson-Hixson Ranch, Summer Kitchen, 11220 North Virginia Street, Reno, Washoe County, NV

  12. Detail of second story windows at northwest corner of southeast ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of second story windows at northwest corner of southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  13. Detail of parapet, frieze, and castings at windows. South elevation; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of parapet, frieze, and castings at windows. South elevation; camera facing north. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  14. Detail of second floor windows and chimney on southeast elevation; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of second floor windows and chimney on southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  15. Contextual view of the rear of building 926 from the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of the rear of building 926 from the hillside; camera facing east. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  16. Interior view of typical ward on second floor, south wing; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of typical ward on second floor, south wing; camera facing northwest. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  17. Interior view of first floor lobby with detail of columns; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of first floor lobby with detail of columns; camera facing north. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  18. Interior detail of exit door on second floor at north ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of exit door on second floor at north end; camera facing north. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  19. Interior detail of bathroom on first floor north end, west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of bathroom on first floor north end, west side, camera facing west. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  20. Contextual view of Goerlitz Property, showing eucalyptus trees along west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Goerlitz Property, showing eucalyptus trees along west side of driveway; parking lot and utility pole in foreground. Camera facing 38° northeast - Goerlitz House, 9893 Highland Avenue, Rancho Cucamonga, San Bernardino County, CA

  1. View of main terrace retaining wall with mature tree on ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of main terrace retaining wall with mature tree on left center, camera facing southeast - Naval Training Station, Senior Officers' Quarters District, Naval Station Treasure Island, Yerba Buena Island, San Francisco, San Francisco County, CA

  2. Contextual view of building 505 Cedar avenue, showing south and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 505 Cedar avenue, showing south and east elevations; camera facing northwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  3. Detail of north wing with rollup door on north elevation; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of north wing with roll-up door on north elevation; camera facing south. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  4. View of steel warehouses at Gilmore Avenue (building 710 second ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses at Gilmore Avenue (building 710 second in on left); camera facing east. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  5. View of steel warehouses on Ellsberg Drive, building 710 full ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses on Ellsberg Drive, building 710 full building at center; camera facing southeast. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  6. View of steel warehouses (from left: building 807, 808, 809, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses (from left: building 807, 808, 809, 810, 811); camera facing east. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  7. Intermediate view synthesis for eye-gazing

    NASA Astrophysics Data System (ADS)

    Baek, Eu-Ttuem; Ho, Yo-Sung

    2015-01-01

    Nonverbal communication, also known as body language, is an important form of communication. Nonverbal behaviors such as posture, eye contact, and gestures send strong messages. Among forms of nonverbal communication, eye contact is one of the most important an individual can use. However, eye contact is lost when we use a video conferencing system: the disparity between the locations of the eyes and the camera gets in the way. The lack of eye contact can give an unapproachable and unpleasant impression. In this paper, we propose an eye-gaze correction method for video conferencing. We use two cameras installed at the top and the bottom of the television. The two captured images are rendered with 2D warping at a virtual position. We apply view morphing to the detected face, and synthesize the face with the warped image. Experimental results verify that the proposed system is effective in generating natural gaze-corrected images.
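
    A highly simplified sketch of the final step of such a system: blending two pre-warped views into a virtual view between the top and bottom cameras. The real method applies 2D warping and view morphing to the detected face first; the plain cross-dissolve below and its names are assumptions.

    ```python
    import numpy as np

    def synthesize_intermediate_view(top_warped, bottom_warped, alpha=0.5):
        """Blend two pre-warped views into a virtual view located between the
        top and bottom cameras; alpha = 0 gives the top view, 1 the bottom view."""
        return ((1.0 - alpha) * top_warped.astype(float)
                + alpha * bottom_warped.astype(float)).astype(np.uint8)

    # Toy usage on synthetic frames from the two cameras.
    rng = np.random.default_rng(2)
    top = rng.integers(0, 256, (240, 320), dtype=np.uint8)
    bottom = rng.integers(0, 256, (240, 320), dtype=np.uint8)
    virtual = synthesize_intermediate_view(top, bottom, alpha=0.5)
    print(virtual.shape, virtual.dtype)
    ```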

  8. PBF Cooling Tower and it Auxiliary Building (PER624) to left ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower and its Auxiliary Building (PER-624) to left of tower. Camera facing west and the east louvered face of the tower. Details include secondary coolant water riser piping and flow control valves (butterfly valves) to distribute water evenly to all sections of tower. Photographer: Holmes. Date: May 20, 1970. INEEL negative no. 70-2322 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  9. Examining wildlife responses to phenology and wildfire using a landscape-scale camera trap network

    USGS Publications Warehouse

    Villarreal, Miguel L.; Gass, Leila; Norman, Laura; Sankey, Joel B.; Wallace, Cynthia S.A.; McMacken, Dennis; Childs, Jack L.; Petrakis, Roy E.

    2012-01-01

    Between 2001 and 2009, the Borderlands Jaguar Detection Project deployed 174 camera traps in the mountains of southern Arizona to record jaguar activity. In addition to jaguars, the motion-activated cameras, placed along known wildlife travel routes, recorded occurrences of ~ 20 other animal species. We examined temporal relationships of white-tailed deer (Odocoileus virginianus) and javelina (Pecari tajacu) to landscape phenology (as measured by monthly Normalized Difference Vegetation Index data) and the timing of wildfire (Alambre Fire of 2007). Mixed model analyses suggest that temporal dynamics of these two species were related to vegetation phenology and natural disturbance in the Sky Island region, information important for wildlife managers faced with uncertainty regarding changing climate and disturbance regimes.

  10. A novel augmented reality simulator for skills assessment in minimal invasive surgery.

    PubMed

    Lahanas, Vasileios; Loukas, Constantinos; Smailis, Nikolaos; Georgiou, Evangelos

    2015-08-01

    Over the past decade, simulation-based training has come to the foreground as an efficient method for training and assessment of surgical skills in minimally invasive surgery. Box-trainers and virtual reality (VR) simulators have been introduced into teaching curricula and have substituted to some extent the traditional model of training based on animals or cadavers. Augmented reality (AR) is a new technology that allows blending of VR elements and real objects within a real-world scene. In this paper, we present a novel AR simulator for assessment of basic laparoscopic skills. The components of the proposed system include a box-trainer, a camera, and a set of laparoscopic tools equipped with custom-made sensors that allow interaction with VR training elements. Three AR tasks were developed, focusing on basic skills such as perception of depth of field, hand-eye coordination, and bimanual operation. The construct validity of the system was evaluated via a comparison between two experience groups: novices with no experience in laparoscopic surgery and experienced surgeons. The observed metrics included task execution time, tool path length, and two task-specific errors. The study also included a feedback questionnaire requiring participants to evaluate the face validity of the system. Between-group comparison demonstrated highly significant differences (P < .01) in all performance metrics and tasks, denoting the simulator's construct validity. Qualitative analysis of the instruments' trajectories highlighted differences between novices and experts regarding smoothness and economy of motion. Subjects' ratings on the feedback questionnaire supported the face validity of the training system. The results highlight the potential of the proposed simulator to discriminate between groups with different expertise, providing a proof of concept for the potential use of AR as a core technology for laparoscopic simulation training.

  11. The Dartmouth Database of Children’s Faces: Acquisition and Validation of a New Face Stimulus Set

    PubMed Central

    Dalrymple, Kirsten A.; Gomez, Jesse; Duchaine, Brad

    2013-01-01

    Facial identity and expression play critical roles in our social lives. Faces are therefore frequently used as stimuli in a variety of areas of scientific research. Although several extensive and well-controlled databases of adult faces exist, few databases include children’s faces. Here we present the Dartmouth Database of Children’s Faces, a set of photographs of 40 male and 40 female Caucasian children between 6 and 16 years-of-age. Models posed eight facial expressions and were photographed from five camera angles under two lighting conditions. Models wore black hats and black gowns to minimize extra-facial variables. To validate the images, independent raters identified facial expressions, rated their intensity, and provided an age estimate for each model. The Dartmouth Database of Children’s Faces is freely available for research purposes and can be downloaded by contacting the corresponding author by email. PMID:24244434

  12. Imaging Dot Patterns for Measuring Gossamer Space Structures

    NASA Technical Reports Server (NTRS)

    Dorrington, A. A.; Danehy, P. M.; Jones, T. W.; Pappa, R. S.; Connell, J. W.

    2005-01-01

    A paper describes a photogrammetric method for measuring the changing shape of a gossamer (membrane) structure deployed in outer space. Such a structure is typified by a solar sail comprising a transparent polymeric membrane aluminized on its Sun-facing side and coated black on the opposite side. Unlike some prior photogrammetric methods, this method does not require an artificial light source or the attachment of retroreflectors to the gossamer structure. In a basic version of the method, the membrane contains a fluorescent dye, and the front and back coats are removed in matching patterns of dots. The dye in the dots absorbs some sunlight and fluoresces at a longer wavelength in all directions, thereby enabling acquisition of high-contrast images from almost any viewing angle. The fluorescent dots are observed by one or more electronic camera(s) on the Sun side, the shade side, or both sides. Filters that pass the fluorescent light and suppress most of the solar spectrum are placed in front of the camera(s) to increase the contrast of the dots against the background. The dot image(s) in the camera(s) are digitized, then processed by use of commercially available photogrammetric software.

  13. Center for Automatic Target Recognition Research. Delivery Order 0005: Image Georegistration, Camera Calibration, and Dismount Categorization in Support of DEBU from Layered Sensing

    DTIC Science & Technology

    2011-07-01

    Figure 17: CAESAR Data. The leftmost image is a color polygon rendering of a subject using 316,691 polygon faces and 161,951 points. The small white dots on the surface of the subject are landmark points.

  14. Performance measurement of commercial electronic still picture cameras

    NASA Astrophysics Data System (ADS)

    Hsu, Wei-Feng; Tseng, Shinn-Yih; Chiang, Hwang-Cheng; Cheng, Jui-His; Liu, Yuan-Te

    1998-06-01

    Commercial electronic still picture cameras need a low-cost, systematic method for evaluating their performance. In this paper, we present a measurement method for evaluating dynamic range and sensitivity by constructing the opto-electronic conversion function (OECF), fixed pattern noise by the peak S/N ratio (PSNR) and the image shading function (ISF), and spatial resolution by the modulation transfer function (MTF). Evaluation results for the individual color components and the luminance signal from a PC camera using a SONY interlaced CCD array as the image sensor are then presented.
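
    One plausible reading of the PSNR-based fixed-pattern-noise measurement (a sketch, not the paper's exact procedure): average a stack of flat-field frames to suppress temporal noise, then compare the remaining spatial variation with the peak signal level. The function name and synthetic data are assumptions.

    ```python
    import numpy as np

    def fixed_pattern_noise_psnr(frames, peak=255.0):
        """Estimate fixed-pattern noise from a stack of flat-field frames:
        averaging over time suppresses temporal noise, and the remaining
        spatial standard deviation is compared with the peak signal level."""
        mean_frame = np.mean(np.stack(frames, axis=0).astype(float), axis=0)
        fpn_std = mean_frame.std()
        return 20.0 * np.log10(peak / (fpn_std + 1e-12))

    # Toy usage: 32 synthetic flat-field frames with small pixel-wise offsets.
    rng = np.random.default_rng(3)
    offsets = rng.normal(0.0, 1.5, size=(480, 640))          # fixed pattern
    frames = [np.clip(128 + offsets + rng.normal(0, 4, offsets.shape), 0, 255)
              for _ in range(32)]
    print(round(fixed_pattern_noise_psnr(frames), 1), "dB")
    ```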

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaffney, Kelly

    Movies have transformed our perception of the world. With slow motion photography, we can see a hummingbird flap its wings, and a bullet pierce an apple. The remarkably small and extremely fast molecular world that determines how your body functions cannot be captured with even the most sophisticated movie camera today. To see chemistry in real time requires a camera capable of seeing molecules that are one ten-billionth of a foot with a frame rate of 10 trillion frames per second! SLAC has embarked on the construction of just such a camera. Please join me as I discuss how this molecular movie camera will work and how it will change our perception of the molecular world.

  16. Morning view of the exterior of the gate and white ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view of the exterior of the gate and white posts to be reworked/widened; camera facing south looking into the cemetery toward the statue. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  17. INTERIOR VIEW OF FIRST FLOOR SPACE AT NORTH END, LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF FIRST FLOOR SPACE AT NORTH END, LOOKING AT WEST WALL; CAMERA FACING NORTHWEST. - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  18. Contextual view of building, with building #12 in right background ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building, with building #12 in right background and building #11 in right foreground. Camera facing east-southeast - Naval Supply Center, Broadway Complex, Administration Storehouse, 911 West Broadway, San Diego, San Diego County, CA

  19. Contextual view of building H70 showing southeast and northeast elevations; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building H70 showing southeast and northeast elevations; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  20. Detail of concrete pillars and steps leading to main entry ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of concrete pillars and steps leading to main entry at southeast elevation; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  1. Detail of first and second floor windows at east corner ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of first and second floor windows at east corner of south elevation; camera facing northwest. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  2. Interior view of office with fireplace on second floor off ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of office with fireplace on second floor off south lobby; camera facing southeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  3. 1. BUILDING L (LEFT OF CENTER) EAST END AND SOUTH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. BUILDING L (LEFT OF CENTER) EAST END AND SOUTH SIDE (BUILDING K IS ON RIGHT, BUILDING M IS ON LEFT), CAMERA FACING NORTHWEST - Buffalo Ranch, Office Building, 2418 MacArthur Boulevard, Irvine, Orange County, CA

  4. Interior detail view showing worn threshold in doorway between kitchen ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail view showing worn threshold in doorway between kitchen and west room in north addition. Camera facing west. - Warner Ranch, Ranch House, San Felipe Road (State Highway S2), Warner Springs, San Diego County, CA

  5. Estimation of vibration frequency of loudspeaker diaphragm by parallel phase-shifting digital holography

    NASA Astrophysics Data System (ADS)

    Kakue, T.; Endo, Y.; Shimobaba, T.; Ito, T.

    2014-11-01

    We report frequency estimation of a loudspeaker diaphragm vibrating at high speed by parallel phase-shifting digital holography, a technique of single-shot phase-shifting interferometry. This technique records the multiple phase-shifted holograms required for phase-shifting interferometry by space-division multiplexing. We constructed a parallel phase-shifting digital holography system based on a high-speed polarization-imaging camera, which has a micro-polarizer array that selects four linear polarization axes within each 2 × 2 pixel block. We set a loudspeaker as the object and recorded the vibration of its diaphragm with the constructed system, demonstrating observation of the vibration displacement of the diaphragm. In this paper, we aim to estimate the vibration frequency of the loudspeaker diaphragm by applying frequency analysis to the experimental results. Holograms of 128 × 128 pixels were recorded at a frame rate of 262,500 frames per second. A sinusoidal wave was input to the loudspeaker via a phone connector. We observed the displacement of the vibrating diaphragm and succeeded in estimating its vibration frequency by applying frequency analysis to the experimental results.
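
    A minimal sketch of the frequency-analysis step: take the displacement time series recovered from the holograms (here a synthetic sinusoid), remove the DC component, and read the dominant frequency off the amplitude spectrum. The function name and test signal are assumptions; with 4096 samples at 262,500 frames per second the frequency resolution is about 64 Hz.

    ```python
    import numpy as np

    def dominant_frequency(displacement, frame_rate):
        """Estimate the dominant vibration frequency of a displacement time
        series via the peak of its one-sided amplitude spectrum."""
        x = np.asarray(displacement, dtype=float)
        x = x - x.mean()                       # drop the DC component
        spectrum = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(x.size, d=1.0 / frame_rate)
        return freqs[np.argmax(spectrum)]

    # Toy usage: a 2 kHz sinusoid sampled at the camera's 262,500 fps.
    frame_rate = 262_500.0
    t = np.arange(4096) / frame_rate
    displacement = 0.3 * np.sin(2 * np.pi * 2000.0 * t)     # arbitrary amplitude units
    print(round(dominant_frequency(displacement, frame_rate), 1), "Hz")  # ~2 kHz, to within one bin
    ```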

  6. Printed products for digital cameras and mobile devices

    NASA Astrophysics Data System (ADS)

    Fageth, Reiner; Schmidt-Sacht, Wulf

    2005-01-01

    Digital photography is no longer simply a successor to film. The digital market is now driven by additional devices such as mobile phones with camera and video functions (camphones) as well as innovative products derived from digital files. A large number of consumers do not print their images and non-printing has become the major enemy of wholesale printers, home printing suppliers and retailers. This paper addresses the challenge facing our industry, namely how to encourage the consumer to print images easily and conveniently from all types of digital media.

  7. PROCESS WATER BUILDING, TRA605, INTERIOR. FIRST FLOOR. CAMERA IS IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605, INTERIOR. FIRST FLOOR. CAMERA IS IN SOUTHEAST CORNER AND FACES NORTHWEST. CONTROL ROOM AT RIGHT. CRANE MONORAIL IS OVER FLOOR HATCHES AND FLOOR OPENINGS. SIX VALVE HANDWHEELS ALONG FAR WALL IN LEFT CENTER VIEW. SEAL TANK IS ON OTHER SIDE OF WALL; PROCESS WATER PIPES ARE BELOW VALVE WHEELS. NOTE CURBS AROUND FLOOR OPENINGS. INL NEGATIVE NO. HD46-26-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  8. ETR CRITICAL FACILITY, TRA654. CONTEXTUAL VIEW. CAMERA ON ROOF OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR CRITICAL FACILITY, TRA-654. CONTEXTUAL VIEW. CAMERA ON ROOF OF MTR BUILDING AND FACING SOUTH. ETR AND ITS COOLANT BUILDING AT UPPER PART OF VIEW. ETR COOLING TOWER NEAR TOP EDGE OF VIEW. EXCAVATION AT CENTER IS FOR ETR CF. CENTER OF WHICH WILL CONTAIN POOL FOR REACTOR. NOTE CHOPPER TUBE PROCEEDING FROM MTR IN LOWER LEFT OF VIEW, DIAGONAL TOWARD LEFT. INL NEGATIVE NO. 56-4227. Jack L. Anderson, Photographer, 12/18/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. Comparison of parameters of modern cooled and uncooled thermal cameras

    NASA Astrophysics Data System (ADS)

    Bareła, Jarosław; Kastek, Mariusz; Firmanty, Krzysztof; Krupiński, Michał

    2017-10-01

    When designing a system employing thermal cameras, one always faces the problem of choosing the camera type best suited to the task. In many cases the choice is far from optimal, and there are several reasons for this. System designers often favor tried and tested solutions they are used to; they do not follow the latest developments in the field of infrared technology, and sometimes their choices are based on prejudice rather than facts. The paper presents the results of measurements of basic parameters of MWIR and LWIR thermal cameras, carried out in a specialized testing laboratory. The measured parameters are decisive for the image quality generated by thermal cameras. All measurements were conducted according to current procedures and standards; however, the camera settings were not optimized for specific test conditions or parameter measurements. Instead, the real settings used in normal camera operation were applied in order to obtain realistic camera performance figures. For example, there were significant differences between measured values of noise parameters and the catalogue data provided by manufacturers, due to the application of edge-detection filters used to increase detection and recognition ranges. The purpose of this paper is to help in choosing the optimal thermal camera for a particular application, answering the question of whether to opt for a cheaper microbolometer device or for a slightly better (in terms of specifications) yet more expensive cooled unit. Measurements and analysis were performed by qualified personnel with several dozen years of experience in both designing and testing thermal camera systems with cooled and uncooled focal plane arrays. Cameras of similar array sizes and optics were compared, and for each tested group the best performing devices were selected.

  10. A high-speed digital camera system for the observation of rapid H-alpha fluctuations in solar flares

    NASA Technical Reports Server (NTRS)

    Kiplinger, Alan L.; Dennis, Brian R.; Orwig, Larry E.

    1989-01-01

    Researchers developed a prototype digital camera system for obtaining H-alpha images of solar flares with 0.1 s time resolution. They intend to operate this system in conjunction with SMM's Hard X Ray Burst Spectrometer, with x ray instruments which will be available on the Gamma Ray Observatory and eventually with the Gamma Ray Imaging Device (GRID), and with the High Resolution Gamma-Ray and Hard X Ray Spectrometer (HIREGS) which are being developed for the Max '91 program. The digital camera has recently proven to be successful as a one camera system operating in the blue wing of H-alpha during the first Max '91 campaign. Construction and procurement of a second and possibly a third camera for simultaneous observations at other wavelengths are underway as are analyses of the campaign data.

  11. SCC500: next-generation infrared imaging camera core products with highly flexible architecture for unique camera designs

    NASA Astrophysics Data System (ADS)

    Rumbaugh, Roy N.; Grealish, Kevin; Kacir, Tom; Arsenault, Barry; Murphy, Robert H.; Miller, Scott

    2003-09-01

    A new 4th generation MicroIR architecture is introduced as the latest in the highly successful Standard Camera Core (SCC) series by BAE SYSTEMS to offer an infrared imaging engine with greatly reduced size, weight, power, and cost. The advanced SCC500 architecture provides great flexibility in configuration to include multiple resolutions, an industry standard Real Time Operating System (RTOS) for customer specific software application plug-ins, and a highly modular construction for unique physical and interface options. These microbolometer based camera cores offer outstanding and reliable performance over an extended operating temperature range to meet the demanding requirements of real-world environments. A highly integrated lens and shutter is included in the new SCC500 product enabling easy, drop-in camera designs for quick time-to-market product introductions.

  12. Speckle-learning-based object recognition through scattering media.

    PubMed

    Ando, Takamasa; Horisaki, Ryoichi; Tanida, Jun

    2015-12-28

    We experimentally demonstrated object recognition through scattering media based on direct machine learning of a number of speckle intensity images. In the experiments, speckle intensity images of amplitude or phase objects on a spatial light modulator between scattering plates were captured by a camera. We used the support vector machine for binary classification of the captured speckle intensity images of face and non-face data. The experimental results showed that speckles are sufficient for machine learning.
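
    A minimal sketch of the classification stage described above (a linear SVM on flattened speckle intensity images using scikit-learn; the data here are synthetic stand-ins, not the captured speckle patterns):

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Synthetic stand-ins: 200 "face" and 200 "non-face" speckle images of 32x32 pixels
    rng = np.random.default_rng(1)
    face = rng.normal(1.0, 0.5, (200, 32 * 32))
    nonface = rng.normal(0.8, 0.5, (200, 32 * 32))
    X = np.vstack([face, nonface])
    y = np.array([1] * 200 + [0] * 200)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = SVC(kernel="linear").fit(X_train, y_train)      # binary face / non-face classifier
    print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
    ```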

  13. Validation of the facial assessment by computer evaluation (FACE) program for software-aided eyelid measurements.

    PubMed

    Choi, Catherine J; Lefebvre, Daniel R; Yoon, Michael K

    2016-06-01

    The aim of this article is to validate the accuracy of the Facial Assessment by Computer Evaluation (FACE) program in eyelid measurements. Sixteen subjects between the ages of 27 and 65 were included with IRB approval. Clinical measurements of upper eyelid margin reflex distance (MRD1) and inter-palpebral fissure (IPF) were obtained. Photographs were then taken with a digital single lens reflex camera with built-in pop-up flash (dSLR-pop) and a dSLR with lens-mounted ring flash (dSLR-ring), with the cameras upright and rotated 90, 180, and 270 degrees. The images were analyzed using both the FACE and ImageJ software to measure MRD1 and IPF. Thirty-two eyes of sixteen subjects were included. Comparison of clinical measurements of MRD1 and IPF with FACE measurements of photos in the upright position showed no statistically significant differences for dSLR-pop (MRD1: p = 0.0912, IPF: p = 0.334) or for dSLR-ring (MRD1: p = 0.105, IPF: p = 0.538). One-to-one comparison of MRD1 and IPF measurements in the four positions obtained with FACE versus ImageJ for dSLR-pop showed moderate to substantial agreement for MRD1 (intraclass correlation coefficient = 0.534 upright, 0.731 in 90 degree rotation, 0.627 in 180 degree rotation, 0.477 in 270 degree rotation) and substantial to excellent agreement for IPF (ICC = 0.740, 0.859, 0.849, 0.805). In photos taken with dSLR-ring, there was excellent agreement for all MRD1 (ICC = 0.916, 0.932, 0.845, 0.812) and IPF (ICC = 0.937, 0.938, 0.917, 0.888) values. The FACE program is a valid method for measuring margin reflex distance and inter-palpebral fissure.
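
    For readers wanting to reproduce this kind of agreement analysis, below is a minimal sketch (not the FACE or ImageJ code) of the two-way random-effects, absolute-agreement, single-rater intraclass correlation, ICC(2,1), computed from an n-subjects-by-k-raters matrix of measurements; the example data are made up.

    ```python
    import numpy as np

    def icc_2_1(ratings):
        """ICC(2,1): two-way random effects, absolute agreement, single measurement.

        ratings : array of shape (n_subjects, k_raters)
        """
        x = np.asarray(ratings, dtype=float)
        n, k = x.shape
        grand = x.mean()
        ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
        ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters/methods
        ss_err = ((x - x.mean(axis=1, keepdims=True)
                     - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    # Hypothetical MRD1 measurements (mm) by two methods for five eyes
    mrd1 = np.array([[3.1, 3.0], [4.2, 4.4], [2.8, 2.9], [3.9, 3.7], [4.5, 4.6]])
    print(round(icc_2_1(mrd1), 3))
    ```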

  14. Human fatigue expression recognition through image-based dynamic multi-information and bimodal deep learning

    NASA Astrophysics Data System (ADS)

    Zhao, Lei; Wang, Zengcai; Wang, Xiaojin; Qi, Yazhou; Liu, Qing; Zhang, Guoxin

    2016-09-01

    Human fatigue is an important cause of traffic accidents. To improve the safety of transportation, we propose, in this paper, a framework for fatigue expression recognition using image-based facial dynamic multi-information and a bimodal deep neural network. First, the landmark of face region and the texture of eye region, which complement each other in fatigue expression recognition, are extracted from facial image sequences captured by a single camera. Then, two stacked autoencoder neural networks are trained for landmark and texture, respectively. Finally, the two trained neural networks are combined by learning a joint layer on top of them to construct a bimodal deep neural network. The model can be used to extract a unified representation that fuses landmark and texture modalities together and classify fatigue expressions accurately. The proposed system is tested on a human fatigue dataset obtained from an actual driving environment. The experimental results demonstrate that the proposed method performs stably and robustly, and that the average accuracy achieves 96.2%.
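
    A compact sketch of the fusion architecture described above, written in PyTorch (the dimensions are assumptions for illustration; the paper additionally pretrains each branch as a stacked autoencoder, which is omitted here):

    ```python
    import torch
    import torch.nn as nn

    class BimodalFatigueNet(nn.Module):
        """Two modality encoders (facial landmarks, eye-region texture) joined by a shared layer."""
        def __init__(self, landmark_dim=136, texture_dim=1024, hidden=128, joint=64, classes=2):
            super().__init__()
            self.landmark_enc = nn.Sequential(
                nn.Linear(landmark_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU())
            self.texture_enc = nn.Sequential(
                nn.Linear(texture_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU())
            self.joint = nn.Sequential(
                nn.Linear(2 * hidden, joint), nn.ReLU(),
                nn.Linear(joint, classes))

        def forward(self, landmark, texture):
            fused = torch.cat([self.landmark_enc(landmark), self.texture_enc(texture)], dim=1)
            return self.joint(fused)                   # logits for fatigued / not fatigued

    net = BimodalFatigueNet()
    logits = net(torch.randn(8, 136), torch.randn(8, 1024))   # batch of 8 random samples
    print(logits.shape)                                        # torch.Size([8, 2])
    ```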

  15. Validation of Underwater Sensor Package Using Feature Based SLAM

    PubMed Central

    Cain, Christopher; Leonessa, Alexander

    2016-01-01

    Robotic vehicles working in new, unexplored environments must be able to locate themselves while constructing a picture of the objects that could act as obstacles preventing them from completing their desired tasks. In enclosed environments, underwater range sensors based on acoustics suffer performance issues due to reflections. Additionally, their relatively high cost makes them less than ideal for use on low-cost underwater vehicles. In this paper we propose a sensor package composed of a downward-facing camera, which is used to perform feature-tracking-based visual odometry, and a custom vision-based two-dimensional rangefinder that can be used on low-cost underwater unmanned vehicles. In order to examine the performance of this sensor package in a SLAM framework, experimental tests are performed using an unmanned ground vehicle and two feature-based SLAM algorithms, the extended Kalman filter based approach and the Rao-Blackwellized particle filter based approach, to validate the sensor package. PMID:26999142
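
    As an illustration of the feature-tracking visual-odometry idea (a generic OpenCV sketch, not the authors' implementation; the synthetic frames stand in for downward-facing camera images):

    ```python
    import cv2
    import numpy as np

    def planar_motion(prev_gray, curr_gray):
        """Estimate 2-D translation and rotation between two grayscale frames."""
        pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                           qualityLevel=0.01, minDistance=7)
        pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts_prev, None)
        good_prev = pts_prev[status.ravel() == 1]
        good_curr = pts_curr[status.ravel() == 1]
        # Partial affine (rotation + translation + scale) fitted robustly to the tracks
        M, _ = cv2.estimateAffinePartial2D(good_prev, good_curr, method=cv2.RANSAC)
        return M[0, 2], M[1, 2], np.degrees(np.arctan2(M[1, 0], M[0, 0]))

    # Synthetic seafloor-like texture and a copy shifted by 5 px right, 3 px down
    rng = np.random.default_rng(0)
    prev = cv2.GaussianBlur((rng.random((240, 320)) * 255).astype(np.uint8), (7, 7), 0)
    curr = np.roll(prev, shift=(3, 5), axis=(0, 1))
    print(planar_motion(prev, curr))   # expected roughly (5, 3, 0)
    ```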

  16. Fast Resistive Bolometry

    NASA Astrophysics Data System (ADS)

    Graham, Jeffrey

    2005-10-01

    A bolometer with microsecond scale response time is under construction for the Caltech spheromak experiment to measure radiation from a ~20 μs duration plasma discharge emitting ~10^2-10^3 kW/m^2. A gold film several micrometers thick absorbs the radiation, heats up, and the consequent change in resistance can be measured. The film itself is vacuum deposited upon a glass slide. Several geometries for the film are under consideration to optimize the amount of radiation absorbed, the response time and the signal-to-noise ratio. We measure the change in voltage across the film for a known current driven through it; a square pulse (3-30 A, ~20 μs) is used to avoid Joule heating. Results from prototypes tested with a UV flashlamp will be presented. After optimizing the bolometer design, the final vacuum-compatible diagnostic would consist of a plasma-facing bolometer and a reference in a camera obscura. This device could provide a design for fast resistive bolometry.
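
    A rough back-of-the-envelope sketch of the sensing principle described above (illustrative only; the material constants are nominal handbook values for gold, and the flux, pulse length, film thickness, resistance and bias current are assumptions, not values from the experiment):

    ```python
    # Resistive bolometry estimate: radiant flux -> temperature rise -> resistance change -> voltage
    q      = 1.0e5      # absorbed flux, W/m^2 (10^2 kW/m^2, lower end quoted above)
    t      = 20e-6      # pulse duration, s
    d      = 3e-6       # gold film thickness, m (assumed)
    rho    = 19_300.0   # density of gold, kg/m^3 (nominal)
    c      = 129.0      # specific heat of gold, J/(kg K) (nominal)
    alpha  = 3.4e-3     # temperature coefficient of resistance of gold, 1/K (nominal)
    R0     = 1.0        # film resistance, ohm (assumed)
    I_bias = 10.0       # bias current, A (within the 3-30 A pulse range)

    dT = q * t / (rho * c * d)        # adiabatic temperature rise of the film
    dR = alpha * R0 * dT              # resistance change
    dV = I_bias * dR                  # measurable voltage change
    print(f"dT = {dT*1e3:.1f} mK, dR/R = {dR/R0:.2e}, dV = {dV*1e3:.2f} mV")
    ```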

  17. Anti-Stokes effect CCD camera and SLD based optical coherence tomography for full-field imaging in the 1550nm region

    NASA Astrophysics Data System (ADS)

    Kredzinski, Lukasz; Connelly, Michael J.

    2012-06-01

    Full-field optical coherence tomography is an en-face interferometric imaging technology capable of carrying out high-resolution cross-sectional imaging of the internal microstructure of an examined specimen in a non-invasive manner. The presented system is based on competitively priced optical components available at the main optical communications band located in the 1550 nm region. It consists of a superluminescent diode (SLD) and an anti-Stokes imaging device. The single-mode-fibre-coupled SLD was connected to a multi-mode fibre inserted into a mode scrambler to obtain spatially incoherent illumination, suitable for the wide-field OCT modality in terms of crosstalk suppression and image enhancement. This relatively inexpensive system, with moderate resolution of approximately 24 μm x 12 μm (axial x lateral), was constructed to perform 3D cross-sectional imaging of a human tooth. To our knowledge this is the first 1550 nm full-field OCT system reported.

  18. 41. ARAIII Prototype assembly and evaluation building ARA630. West end ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    41. ARA-III Prototype assembly and evaluation building ARA-630. West end and south side of building. Camera facing northeast. Ineel photo no. 3-22. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  19. 4. INTERIOR VIEW OF CLUB HOUSE REFRIGERATION UNIT, SHOWING COOLING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. INTERIOR VIEW OF CLUB HOUSE REFRIGERATION UNIT, SHOWING COOLING COILS AND CORK-LINED ROOM. CAMERA IS BETWEEN SEVEN AND EIGHT FEET ABOVE FLOOR LEVEL, FACING SOUTHEAST. - Swan Falls Village, Clubhouse 011, Snake River, Kuna, Ada County, ID

  20. Contextual view showing H1 on left and H270 in background; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing H1 on left and H270 in background; camera facing north. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  1. View looking across to building H1 from third floor porch ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View looking across to building H1 from third floor porch over entrance; camera facing south. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  2. Interior detail of scrolled brackets on post, west side of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of scrolled brackets on post, west side of first floor by rear entrance; camera facing north. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  3. Contextual view showing building 926 north wing at left and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing building 926 north wing at left and hospital historic district at right; camera facing north. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  4. Contextual view looking down clubhouse drive. Showing west elevation of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view looking down clubhouse drive. Showing west elevation of H1 on right; camera facing east. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  5. INTERIOR OF BOILER BUILDING, FIRST LEVEL, EAST SIDE, SHOWING STEAMDRIVEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR OF BOILER BUILDING, FIRST LEVEL, EAST SIDE, SHOWING STEAM-DRIVEN PISTON PUMPS FOR FUEL OIL, CAMERA FACING EAST. - New Haven Rail Yard, Central Steam Plant and Oil Storage, Vicinity of Union Avenue, New Haven, New Haven County, CT

  6. Impact Flash Monitoring Facility on the Deep Space Gateway

    NASA Astrophysics Data System (ADS)

    Needham, D. H.; Moser, D. E.; Suggs, R. M.; Cooke, W. J.; Kring, D. A.; Neal, C. R.; Fassett, C. I.

    2018-02-01

    Cameras mounted to the Deep Space Gateway exterior will detect flashes caused by impacts on the lunar surface. Observed flashes will help constrain the current lunar impact flux and assess hazards faced by crews living and working in cislunar space.

  7. 104. ARAIII. Interior view of room 110 in ARA607 used ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    104. ARA-III. Interior view of room 110 in ARA-607 used as data acquisition control room. Camera facing northeast. Ineel photo no. 81-103. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  8. On the performances of computer vision algorithms on mobile platforms

    NASA Astrophysics Data System (ADS)

    Battiato, S.; Farinella, G. M.; Messina, E.; Puglisi, G.; Ravì, D.; Capra, A.; Tomaselli, V.

    2012-01-01

    Computer Vision enables mobile devices to extract the meaning of the observed scene from the information acquired with the onboard sensor cameras. Nowadays, there is a growing interest in Computer Vision algorithms able to work on mobile platforms (e.g., phone cameras, point-and-shoot cameras, etc.). Indeed, bringing Computer Vision capabilities to mobile devices opens up new opportunities in different application contexts. The implementation of vision algorithms on mobile devices is still a challenging task, since these devices have poor image sensors and optics as well as limited processing power. In this paper we have considered different algorithms covering classic Computer Vision tasks: keypoint extraction, face detection, and image segmentation. Several tests have been done to compare the performances of the involved mobile platforms: Nokia N900, LG Optimus One, Samsung Galaxy SII.

  9. The potential of low-cost RPAS for multi-view reconstruction of rock cliffs

    NASA Astrophysics Data System (ADS)

    Ettore Guccione, Davide; Thoeni, Klaus; Santise, Marina; Giacomini, Anna; Roncella, Riccardo; Forlani, Gianfranco

    2016-04-01

    RPAS, also known as drones or UAVs, have been used in military applications for many years. Nevertheless, the technology has become accessible to everyone only in recent years (Westoby et al., 2012; Nex and Remondino, 2014). Electric multirotor helicopters or multicopters have become one of the most exciting developments, and several off-the-shelf platforms (including cameras) are now available. In particular, RPAS can provide 3D models of sub-vertical rock faces, which are needed, for instance, for rockfall hazard assessments along road cuts and very steep mountains. The current work investigates the potential of two low-cost off-the-shelf quadcopters equipped with digital cameras for multi-view reconstruction of sub-vertical rock cliffs. The two platforms used are a DJI Phantom 1 (P1) equipped with a GoPro Hero 3+ (12 MP) and a DJI Phantom 3 Professional (P3). The latter comes with an integrated 12 MP camera mounted on a 3-axis gimbal. Both platforms cost less than €1,500 including the camera. The study area is a small rock cliff near the Callaghan Campus of the University of Newcastle (Thoeni et al., 2014). The wall is partly smooth with some evident geological features such as non-persistent joints and sharp edges. Several flights were performed with both cameras set in time-lapse mode. Hence, images were taken automatically, but the flights were performed manually and very close to the cliff face; because the investigated rock face is very irregular, the yaw and roll had to be adjusted continuously for optimal coverage. The digital images were processed with a commercial SfM software package, and several processing options and camera networks were investigated in order to define the most accurate configuration. Firstly, the difference between the use of coded ground control targets and natural features was studied. Coded targets generally provide the best accuracy, but they need to be placed on the surface, which is not always possible as rock cliffs are not easily accessible. Nevertheless, natural features can provide a good alternative if chosen wisely. Secondly, the influence of using fixed interior orientation parameters versus self-calibration was investigated. The results show that, for the sensors and camera networks used, self-calibration provides better results. This can mainly be attributed to the fact that the object distance is not constant and rather small (less than 10 m), and that neither camera provides an option for fixing the interior orientation parameters. Finally, the results of both platforms are also compared to a point cloud obtained with a terrestrial laser scanner, where generally a very good agreement is observed. References: Nex, F., Remondino, F. (2014) UAV for 3D mapping applications: a review. Applied Geomatics 6(1), 1-15. Thoeni, K., Giacomini, A., Murtagh, R., Kniest, E. (2014) A comparison of multi-view 3D reconstruction of a rock wall using several cameras and a laser scanner. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XL-5, 573-580. Westoby, M.J., Brasington, J., Glasser, N.F., Hambrey, M.J., Reynolds, J.M. (2012) 'Structure-from-Motion' photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology 179, 300-314.

  10. Financial Awareness Education with Apprentices in the Australian Construction Industry: Program Evaluation

    ERIC Educational Resources Information Center

    Du Plessis, Karin; Green, Emma

    2013-01-01

    A financial awareness education program was implemented with construction industry apprentices in Victoria, Australia. The program included face-to-face delivery of education around a range of financial management issues that apprentices face as they begin their apprenticeship. The paper reports on an evaluation of the program, which included…

  11. Geological Prediction Ahead of Tunnel Face in the Limestone Formation Tunnel using Multi-Modal Geophysical Surveys

    NASA Astrophysics Data System (ADS)

    Zaki, N. F. M.; Ismail, M. A. M.; Hazreek Zainal Abidin, Mohd; Madun, Aziman

    2018-04-01

    Tunnel construction in typical karst topography faces risks from unknown geological conditions such as abundant rainwater, groundwater and cavities. Construction of tunnels in karst limestone frequently leads to potential over-break of the rock formation and causes failure in the affected area. The physical character of limestone, which can contain large cavities, makes it prone to sudden failure, and the situation becomes worse when rock quality is misinterpreted by engineers and geologists during the analysis stage or when improper methods are adopted during the construction stage. The execution of laboratory and field testing of the limestone rock should therefore be well planned and arranged in a tunnel construction project. Several tests, including Ground Penetration Radar (GPR) and geological face mapping, were studied in this research to investigate the performance of the limestone rock in tunnel construction, measured in terms of the rock mass quality used for risk assessment. The objective of this study is to predict the geological condition ahead of the tunnel face using a short-range method (GPR), verified against the geological face mapping method to determine its consistency with the actual geological condition on site. The Q-value, the main indicator for rock mass classification, was obtained from the geological face mapping. The scope of this study covers 756 meters of tunnel construction in a karst limestone area located at the Timah Tasoh Tunnel, Bukit Tebing Tinggi, Perlis. For this case study, 15% of the GPR results were identified as inaccurate for rock mass classification: for certain chainages along the tunnel, 34 out of 224 GPR data points were found to be incompatible with the actual face mapping.

  12. Recovering faces from memory: the distracting influence of external facial features.

    PubMed

    Frowd, Charlie D; Skelton, Faye; Atherton, Chris; Pitchford, Melanie; Hepton, Gemma; Holden, Laura; McIntyre, Alex H; Hancock, Peter J B

    2012-06-01

    Recognition memory for unfamiliar faces is facilitated when contextual cues (e.g., head pose, background environment, hair and clothing) are consistent between study and test. By contrast, inconsistencies in external features, especially hair, promote errors in unfamiliar face-matching tasks. For the construction of facial composites, as carried out by witnesses and victims of crime, the role of external features (hair, ears, and neck) is less clear, although research does suggest their involvement. Here, over three experiments, we investigate the impact of external features for recovering facial memories using a modern, recognition-based composite system, EvoFIT. Participant-constructors inspected an unfamiliar target face and, one day later, repeatedly selected items from arrays of whole faces, with "breeding," to "evolve" a composite with EvoFIT; further participants (evaluators) named the resulting composites. In Experiment 1, the important internal-features (eyes, brows, nose, and mouth) were constructed more identifiably when the visual presence of external features was decreased by Gaussian blur during construction: higher blur yielded more identifiable internal-features. In Experiment 2, increasing the visible extent of external features (to match the target's) in the presented face-arrays also improved internal-features quality, although less so than when external features were masked throughout construction. Experiment 3 demonstrated that masking external-features promoted substantially more identifiable images than using the previous method of blurring external-features. Overall, the research indicates that external features are a distractive rather than a beneficial cue for face construction; the results also provide a much better method to construct composites, one that should dramatically increase identification of offenders.

  13. Use Hardwoods for Building Components

    Treesearch

    Glenn A. Cooper; William W. Rice

    1968-01-01

    Describes a system for prefabricating structural units from hardwoods for use in floors, roofs, and walls of A-frame or post-and-beam type construction. The interior face of the unit is decorative paneling; the exterior face is sheathing. Use of the system could reduce prefabricated house construction costs compared to conventional construction costs.

  14. Development of a safe ultraviolet camera system to enhance awareness by showing effects of UV radiation and UV protection of the skin (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Verdaasdonk, Rudolf M.; Wedzinga, Rosaline; van Montfrans, Bibi; Stok, Mirte; Klaessens, John; van der Veen, Albert

    2016-03-01

    The significant increase of skin cancer occurring in the western world is attributed to longer sun exposure during leisure time. For prevention, people should become aware of the risks of UV light exposure by being shown skin damage and the protective effect of sunscreen with a UV camera. A UV awareness imaging system optimized for 365 nm (UV-A) was developed using consumer components, designed to be interactive, safe and mobile. A Sony NEX-5T camera was adapted to the full spectral range. In addition, UV-transparent lenses and filters were selected based on measured spectral characteristics (Schott S8612 and Hoya U-340 filters) to obtain the highest contrast for, e.g., melanin spots and wrinkles on the skin. For uniform UV illumination, two facial tanner units were adapted with UV 365 nm black-light fluorescent tubes. The safety of the UV illumination was determined relative to the sun and with absolute irradiance measurements at the working distance. A maximum exposure time of over 15 minutes was calculated according to the international safety standards. The UV camera was successfully demonstrated during the Dutch National Skin Cancer day and was well received by dermatologists and the participating public. In particular, the 'black paint' effect of putting sunscreen on the face was dramatic and contributed to awareness of the regions of the face that are likely to be missed when applying sunscreen. The UV imaging system shows promise for diagnostics and clinical studies in dermatology and potentially in other areas (dentistry and ophthalmology).

  15. Face verification system for Android mobile devices using histogram based features

    NASA Astrophysics Data System (ADS)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

    This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and face detection is then implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated as a binary Vector Quantization (VQ) histogram using DCT coefficients in low-frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.
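
    A minimal sketch of the spatial-domain half of such a pipeline (generic uniform-LBP histograms compared by histogram intersection, not the authors' Improved LBP or the VQ/DCT branch; the images and the decision threshold are synthetic stand-ins):

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(gray, P=8, R=1):
        """Uniform LBP histogram of a grayscale face image (2-D uint8 array)."""
        lbp = local_binary_pattern(gray, P, R, method="uniform")
        hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
        return hist

    def similarity(h1, h2):
        """Histogram intersection in [0, 1]; larger means more similar."""
        return np.minimum(h1, h2).sum()

    # Synthetic stand-ins for an enrolled face and a probe face (real use: cropped face images)
    rng = np.random.default_rng(2)
    enrolled = rng.integers(0, 256, (64, 64), dtype=np.uint8)
    probe = np.clip(enrolled.astype(int) + rng.integers(-5, 6, (64, 64)), 0, 255).astype(np.uint8)

    score = similarity(lbp_histogram(enrolled), lbp_histogram(probe))
    print("accept" if score > 0.85 else "reject")     # threshold is an arbitrary choice
    ```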

  16. User-assisted visual search and tracking across distributed multi-camera networks

    NASA Astrophysics Data System (ADS)

    Raja, Yogesh; Gong, Shaogang; Xiang, Tao

    2011-11-01

    Human CCTV operators face several challenges in their task which can lead to missed events, people or associations, including: (a) data overload in large distributed multi-camera environments; (b) short attention span; (c) limited knowledge of what to look for; and (d) lack of access to non-visual contextual intelligence to aid search. Developing a system to aid human operators and alleviate such burdens requires addressing the problem of automatic re-identification of people across disjoint camera views, a matching task made difficult by factors such as lighting, viewpoint and pose changes and for which absolute scoring approaches are not best suited. Accordingly, we describe a distributed multi-camera tracking (MCT) system to visually aid human operators in associating people and objects effectively over multiple disjoint camera views in a large public space. The system comprises three key novel components: (1) relative measures of ranking rather than absolute scoring to learn the best features for matching; (2) multi-camera behaviour profiling as higher-level knowledge to reduce the search space and increase the chance of finding correct matches; and (3) human-assisted data mining to interactively guide search and in the process recover missing detections and discover previously unknown associations. We provide an extensive evaluation of the greater effectiveness of the system as compared to existing approaches on industry-standard i-LIDS multi-camera data.

  17. Late afternoon view of the interior of the eastcentral wall ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Late afternoon view of the interior of the east-central wall section to be removed; camera facing north. Stubby crape myrtle in front of wall. Metal Quonset hut in background. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  18. Morning view, contextual view of the exterior west side of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view of the exterior west side of the north wall along the unpaved road; camera facing west, positioned in road approximately 8 posts west of the gate. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  19. Contextual view of Treasure Island showing Palace of Fine and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Treasure Island showing Palace of Fine and Decorative Arts (building 3) at right, and Port of the Trade Winds is in foreground, camera facing north - Golden Gate International Exposition, Treasure Island, San Francisco, San Francisco County, CA

  20. 80. ARAIII. Forming of the mechanical equipment pit in reactor ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    80. ARA-III. Forming of the mechanical equipment pit in reactor building (ARA-608). Camera facing northwest. September 22, 1958. Ineel photo no. 58-4675. Photographer: Jack L. Anderson. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  1. Contextual view showing west elevations of building H81 on right ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing west elevations of building H81 on right and H1 in middle; camera facing northeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  2. OVERVIEW OF CENTRAL HEATING PLANT, WITH OIL STORAGE ON LEFT, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    OVERVIEW OF CENTRAL HEATING PLANT, WITH OIL STORAGE ON LEFT, BOILER BUILDING ON RIGHT, SOUTH AND EAST ELEVATIONS, CAMERA FACING NORTH. - New Haven Rail Yard, Central Steam Plant and Oil Storage, Vicinity of Union Avenue, New Haven, New Haven County, CT

  3. Concave Surround Optics for Rapid Multi-View Imaging

    DTIC Science & Technology

    2006-11-01

    ...large camera arrays are typically expensive and require significant effort to calibrate temporally, geometrically and chromatically... hard to assemble and calibrate. In this paper we present an optical system capable of rapidly moving the viewpoint around a scene. Our system... thus is amenable to capturing dynamic events, avoiding the need to construct and calibrate an array of cameras. We demonstrate the system with a high...

  4. The NASA Fireball Network All-Sky Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Rob M.

    2011-01-01

    The construction of small, inexpensive all-sky cameras designed specifically for the NASA Fireball Network is described. The use of off-the-shelf electronics, optics, and plumbing materials results in a robust and easy to duplicate design. Engineering challenges such as weather-proofing and thermal control and their mitigation are described. Field-of-view and gain adjustments to assure uniformity across the network will also be detailed.

  5. STS-98 U.S. Lab Destiny rests in Atlantis' payload bay

    NASA Technical Reports Server (NTRS)

    2001-01-01

    KENNEDY SPACE CENTER, Fla. -- This closeup reveals the tight clearance between an elbow camera on the robotic arm (left) and the U.S. Lab Destiny when the payload bay doors are closed. Measurements of the elbow camera revealed only a one-inch clearance from the U.S. Lab payload, which is under review. A key element in the construction of the International Space Station, Destiny is 28 feet long and weighs 16 tons. Destiny will be attached to the Unity node on the ISS using the Shuttle's robot arm, with the help of the camera. This research and command-and-control center is the most sophisticated and versatile space laboratory ever built. It will ultimately house a total of 23 experiment racks for crew support and scientific research. Destiny will fly on STS-98, the seventh construction flight to the ISS. Launch of STS-98 is scheduled for Jan. 19 at 2:11 a.m. EST.

  6. CCD TV focal plane guider development and comparison to SIRTF applications

    NASA Technical Reports Server (NTRS)

    Rank, David M.

    1989-01-01

    It is expected that the SIRTF payload will use a CCD TV focal plane fine guidance sensor to provide acquisition of sources and tracking stability of the telescope. Work to develop CCD TV cameras and guiders has been carried out at Lick Observatory for several years and has produced state-of-the-art CCD TV systems for internal use. NASA decided to provide additional support so that the limits of this technology could be established and a comparison between SIRTF requirements and practical systems could be put on a more quantitative basis. The results of work carried out at Lick Observatory, designed to characterize present CCD autoguiding technology and relate it to SIRTF applications, are presented. Two different design types of CCD cameras were constructed using virtual phase and buried channel CCD sensors. A simple autoguider was built and used on the KAO, Mt. Lemmon and Mt. Hamilton telescopes. A video image processing system was also constructed in order to characterize the performance of the autoguider and CCD cameras.

  7. Use of a microscope-mounted wide-angle point of view camera to record optimal hand position in ocular surgery.

    PubMed

    Gooi, Patrick; Ahmed, Yusuf; Ahmed, Iqbal Ike K

    2014-07-01

    We describe the use of a microscope-mounted wide-angle point-of-view camera to record optimal hand positions in ocular surgery. The camera is mounted close to the objective lens beneath the surgeon's oculars and faces the same direction as the surgeon, providing a surgeon's view. A wide-angle lens enables viewing of both hands simultaneously and does not require repositioning the camera during the case. Proper hand positioning and instrument placement through microincisions are critical for effective and atraumatic handling of tissue within the eye. Our technique has potential in the assessment and training of optimal hand position for surgeons performing intraocular surgery. It is an innovative way to routinely record instrument and operating hand positions in ophthalmic surgery and has minimal requirements in terms of cost, personnel, and operating-room space. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2014 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  8. An electrically tunable plenoptic camera using a liquid crystal microlens array.

    PubMed

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng

    2015-05-01

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.
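
    As a quick illustration of why an electrically tunable microlens focal length shifts the in-focus plane (a thin-lens sketch with assumed, made-up focal lengths and spacing, not parameters from the paper):

    ```python
    def focused_object_distance(f_mm, d_image_mm):
        """Thin-lens relation 1/f = 1/d_o + 1/d_i, solved for the object distance d_o."""
        assert d_image_mm > f_mm, "image plane must lie beyond the focal length"
        return 1.0 / (1.0 / f_mm - 1.0 / d_image_mm)

    d_i = 2.5   # fixed lenslet-to-sensor spacing in mm (assumed)
    for f in (2.0, 2.1, 2.2):   # electrically tuned focal lengths in mm (assumed)
        print(f"f = {f:.1f} mm -> in-focus object plane at "
              f"{focused_object_distance(f, d_i):.1f} mm")
    ```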

  9. An electrically tunable plenoptic camera using a liquid crystal microlens array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Yu; School of Automation, Huazhong University of Science and Technology, Wuhan 430074; Wuhan National Laboratory for Optoelectronics, Huazhong University of Science and Technology, Wuhan 430074

    2015-05-15

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  10. An electrically tunable plenoptic camera using a liquid crystal microlens array

    NASA Astrophysics Data System (ADS)

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng

    2015-05-01

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  11. From a Million Miles Away, NASA Camera Shows Moon Crossing Face of Earth

    NASA Image and Video Library

    2015-08-05

    This animation shows images of the far side of the moon, illuminated by the sun, as it crosses between the DSCOVR spacecraft's Earth Polychromatic Imaging Camera (EPIC) and telescope and the Earth, one million miles away. Credits: NASA/NOAA. A NASA camera aboard the Deep Space Climate Observatory (DSCOVR) satellite captured a unique view of the moon as it moved in front of the sunlit side of Earth last month. The series of test images shows the fully illuminated “dark side” of the moon that is never visible from Earth. The images were captured by NASA's Earth Polychromatic Imaging Camera (EPIC), a four-megapixel CCD camera and telescope on the DSCOVR satellite orbiting 1 million miles from Earth. From its position between the sun and Earth, DSCOVR conducts its primary mission of real-time solar wind monitoring for the National Oceanic and Atmospheric Administration (NOAA). Read more: www.nasa.gov/feature/goddard/from-a-million-miles-away-na...

  12. From a Million Miles Away, NASA Camera Shows Moon Crossing Face of Earth

    NASA Image and Video Library

    2017-12-08

    This animation still image shows the far side of the moon, illuminated by the sun, as it crosses between the DSCOVR spacecraft's Earth Polychromatic Imaging Camera (EPIC) and telescope and the Earth, one million miles away. Credits: NASA/NOAA. A NASA camera aboard the Deep Space Climate Observatory (DSCOVR) satellite captured a unique view of the moon as it moved in front of the sunlit side of Earth last month. The series of test images shows the fully illuminated “dark side” of the moon that is never visible from Earth. The images were captured by NASA's Earth Polychromatic Imaging Camera (EPIC), a four-megapixel CCD camera and telescope on the DSCOVR satellite orbiting 1 million miles from Earth. From its position between the sun and Earth, DSCOVR conducts its primary mission of real-time solar wind monitoring for the National Oceanic and Atmospheric Administration (NOAA). Read more: www.nasa.gov/feature/goddard/from-a-million-miles-away-na...

  13. Fast Fiber-Coupled Imaging Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brockington, Samuel; Case, Andrew; Witherspoon, Franklin Douglas

    HyperV Technologies Corp. has successfully designed, built and experimentally demonstrated a full scale 1024 pixel 100 MegaFrames/s fiber coupled camera with 12 or 14 bits, and record lengths of 32K frames, exceeding our original performance objectives. This high-pixel-count, fiber optically-coupled, imaging diagnostic can be used for investigating fast, bright plasma events. In Phase 1 of this effort, a 100 pixel fiber-coupled fast streak camera for imaging plasma jet profiles was constructed and successfully demonstrated. The resulting response from outside plasma physics researchers emphasized development of increased pixel performance as a higher priority over increasing pixel count. In this Phase 2 effort, HyperV therefore focused on increasing the sample rate and bit-depth of the photodiode pixel designed in Phase 1, while still maintaining a long record length and holding the cost per channel to levels which allowed up to 1024 pixels to be constructed. Cost per channel was 53.31 dollars, very close to our original target of $50 per channel. The system consists of an imaging "camera head" coupled to a photodiode bank with an array of optical fibers. The output of these fast photodiodes is then digitized at 100 Megaframes per second and stored in record lengths of 32,768 samples with bit depths of 12 to 14 bits per pixel. Longer record lengths are possible with additional memory. A prototype imaging system with up to 1024 pixels was designed and constructed and used to successfully take movies of very fast moving plasma jets as a demonstration of the camera performance capabilities. Some faulty electrical components on the 64 circuit boards resulted in only 1008 functional channels out of 1024 on this first generation prototype system. We experimentally observed backlit high speed fan blades in initial camera testing and then followed that with full movies and streak images of free flowing high speed plasma jets (at 30-50 km/s). Jet structure and jet collisions onto metal pillars in the path of the plasma jets were recorded in a single shot. This new fast imaging system is an attractive alternative to conventional fast framing cameras for applications and experiments where imaging events using existing techniques are inefficient or impossible. The development of HyperV's new diagnostic was split into two tracks: a next generation camera track, in which HyperV built, tested, and demonstrated a prototype 1024 channel camera at its own facility, and a second plasma community beta test track, where selected plasma physics programs received small systems of a few test pixels to evaluate the expected performance of a full scale camera on their experiments. These evaluations were performed as part of an unfunded collaboration with researchers at Los Alamos National Laboratory and the University of California at Davis. Results from the prototype 1024-pixel camera are discussed, as well as results from the collaborations with test pixel system deployment sites.

  14. A TV Camera System Which Extracts Feature Points For Non-Contact Eye Movement Detection

    NASA Astrophysics Data System (ADS)

    Tomono, Akira; Iida, Muneo; Kobayashi, Yukio

    1990-04-01

    This paper proposes a highly efficient camera system which extracts, irrespective of background, feature points such as the pupil, the corneal reflection image and dot-marks pasted on a human face, in order to detect human eye movement by image processing. Two eye movement detection methods are suggested: one utilizing face orientation as well as pupil position, the other utilizing the pupil and corneal reflection images. A method of extracting these feature points using LEDs as illumination devices and a new TV camera system designed to record eye movement are proposed. Two kinds of infra-red LEDs are used. These LEDs are set up a short distance apart and emit polarized light of different wavelengths. One light source beams from near the optical axis of the lens and the other is some distance from the optical axis. The LEDs are operated in synchronization with the camera. The camera includes 3 CCD image pick-up sensors and a prism system with 2 boundary layers. Incident rays are separated into 2 wavelengths by the first boundary layer of the prism. One set of rays forms an image on CCD-3. The other set is split by the half-mirror layer of the prism and forms an image including the regularly reflected component by placing a polarizing filter in front of CCD-1, or another image not including that component by not placing a polarizing filter in front of CCD-2. Thus, three images with different reflection characteristics are obtained by the three CCDs. Through experiment, it is shown that two kinds of subtraction operations between the three images output from the CCDs accentuate three kinds of feature points: the pupil and corneal reflection images and the dot-marks. Since the S/N ratio of the subtracted image is extremely high, the thresholding process is simple and allows reducing the intensity of the infra-red illumination. A high-speed image processing apparatus using this camera system is described. Realtime processing of the subtraction, thresholding and gravity position calculation of the feature points is possible.
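
    A generic sketch of the subtraction, thresholding and centroid ("gravity position") step described above, using NumPy and SciPy on two co-registered frames; the synthetic frames and the threshold value are assumptions for illustration, not the paper's actual processing:

    ```python
    import numpy as np
    from scipy import ndimage

    def feature_centroids(frame_a, frame_b, threshold=40):
        """Subtract two co-registered frames, threshold the difference, and return
        the centroid (row, col) of each bright connected blob."""
        diff = frame_a.astype(np.int16) - frame_b.astype(np.int16)
        mask = diff > threshold                       # feature points stand out in the difference
        labels, n = ndimage.label(mask)               # connected components
        return ndimage.center_of_mass(mask, labels, range(1, n + 1))

    # Synthetic example: one bright spot present only in frame_a
    a = np.zeros((120, 160), dtype=np.uint8)
    b = np.zeros_like(a)
    a[60:64, 80:84] = 200
    print(feature_centroids(a, b))                    # ~[(61.5, 81.5)]
    ```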

  15. "He Said What?!" Constructed Dialogue in Various Interface Modes

    ERIC Educational Resources Information Center

    Young, Lesa; Morris, Carla; Langdon, Clifton

    2012-01-01

    This study analyzes the manifestation of constructed dialogue in ASL narratives as dependent on the interface mode (i.e., face-to-face conversation, electronic conversation over videophone, and vlog monologues). Comparisons of eye gaze over three interface modes shows how aspects of constructed dialogue are altered to fit the communication mode.…

  16. New method for analysis of facial growth in a pediatric reconstructed mandible.

    PubMed

    Kau, Chung How; Kamel, Sherif Galal; Wilson, Jim; Wong, Mark E

    2011-04-01

    The aim of this article was to present a new method of analysis for the assessment of facial growth and morphology after surgical resection of the mandible in a growing patient. This was a 2-year longitudinal study of facial growth in a child who had undergone segmental resection of the mandible with immediate reconstruction as a treatment for juvenile aggressive fibromatosis. Three-dimensional digital stereo-photogrammetric cameras were used for image acquisition at several follow-up intervals: immediately, 6 months, and 2 years post-resection. After processing and superimposition, shell-to-shell deviation maps were used for the analysis of the facial growth pattern and its deviation from normal growth. The changes were seen as mean surface changes and color maps. An average constructed female face from a previous study was used as a reference for a normal growth pattern. The patient showed significant growth during this period. Positive changes took place around the nose, lateral brow area, and lower lip and chin, whereas negative changes were evident in the lower lip and cheek area. An increase in the vertical dimension of the face at the chin region was also seen prominently. Three-dimensional digital stereo-photogrammetry can be used as an objective, noninvasive method for quantifying and monitoring facial growth and its abnormalities. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  17. 3D FaceCam: a fast and accurate 3D facial imaging device for biometrics applications

    NASA Astrophysics Data System (ADS)

    Geng, Jason; Zhuang, Ping; May, Patrick; Yi, Steven; Tunnell, David

    2004-08-01

    Human faces are fundamentally three-dimensional (3D) objects, and each face has its unique 3D geometric profile. The 3D geometric features of a human face can be used, together with its 2D texture, for rapid and accurate face recognition purposes. Due to the lack of low-cost and robust 3D sensors and effective 3D facial recognition (FR) algorithms, almost all existing FR systems use 2D face images. Genex has developed 3D solutions that overcome the inherent problems in 2D while also addressing limitations in other 3D alternatives. One important aspect of our solution is a unique 3D camera (the 3D FaceCam) that combines multiple imaging sensors within a single compact device to provide instantaneous, ear-to-ear coverage of a human face. This 3D camera uses three high-resolution CCD sensors and a color encoded pattern projection system. The RGB color information from each pixel is used to compute the range data and generate an accurate 3D surface map. The imaging system uses no moving parts and combines multiple 3D views to provide detailed and complete 3D coverage of the entire face. Images are captured within a fraction of a second and full-frame 3D data is produced within a few seconds. This described method provides much better data coverage and accuracy in feature areas with sharp features or details (such as the nose and eyes). Using this 3D data, we have been able to demonstrate that a 3D approach can significantly improve the performance of facial recognition. We have conducted tests in which we have varied the lighting conditions and angle of image acquisition in the "field." These tests have shown that the matching results are significantly improved when enrolling a 3D image rather than a single 2D image. With its 3D solutions, Genex is working toward unlocking the promise of powerful 3D FR and transferring FR from a lab technology into a real-world biometric solution.

  18. Energy conservation using face detection

    NASA Astrophysics Data System (ADS)

    Deotale, Nilesh T.; Kalbande, Dhananjay R.; Mishra, Akassh A.

    2011-10-01

    Computerized face detection is concerned with the difficult task of automatically locating faces in a video signal of a person. It has several applications, such as face recognition, simultaneous multiple-face processing, biometrics, security, video surveillance, human-computer interfaces, image database management, autofocus in digital cameras, and selecting regions of interest in pan-and-scale photo slideshows; the present paper deals with energy conservation using face detection. Automating the process on a computer requires the use of various image processing techniques. There are various methods that can be used for face detection, such as contour tracking, template matching, controlled background, model-based, motion-based and color-based methods. Basically, the video of the subject is converted into images, which are further selected manually for processing. However, several factors, such as poor illumination, movement of the face, viewpoint-dependent physical appearance, acquisition geometry, imaging conditions and compression artifacts, make face detection difficult. This paper reports an algorithm for conservation of energy using face detection for various devices. The present paper suggests that energy conservation can be achieved by detecting the face, reducing the brightness of the complete image, and then adjusting the brightness of the particular area of the image where the face is located using histogram equalization.
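
    A rough sketch of the idea described in this record (a generic OpenCV Haar-cascade face detector plus naive dimming of the non-face area; this is not the authors' algorithm, the input file name is a placeholder, and the dimming factor is an arbitrary choice):

    ```python
    import cv2
    import numpy as np

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def dim_outside_faces(frame_bgr, dim_factor=0.4):
        """Darken everything except detected face regions to save display energy."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        out = (frame_bgr * dim_factor).astype(np.uint8)          # dimmed copy of the whole frame
        for (x, y, w, h) in faces:
            out[y:y + h, x:x + w] = frame_bgr[y:y + h, x:x + w]  # keep faces at full brightness
        return out

    frame = cv2.imread("webcam_frame.png")        # placeholder input image
    if frame is not None:
        cv2.imwrite("dimmed.png", dim_outside_faces(frame))
    ```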

  19. Trench Reveals Two Faces of Soils

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This approximate true-color image mosaic from the panoramic camera on the Mars Exploration Rover Opportunity shows a trench dug by the rover in the vicinity of the 'Anatolia' region. Two imprints from the rover's Mossbauer spectrometer instrument were left in the exposed soils. Detailed comparisons between soils exposed at the surface and those found at depth reveal that surface soils have higher levels of hematite while subsurface soils show fine particles derived from basalt. The trench is approximately 11 centimeters deep. This image was taken on sol 81 with the panoramic camera's 430-, 530- and 750-nanometer filters.

  20. 'El Capitan's' Scientific Gems

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This mosaic of images taken by the panoramic camera onboard the Mars Exploration Rover Opportunity shows the rock region dubbed 'El Capitan,' which lies within the larger outcrop near the rover's landing site. 'El Capitan' is being studied in great detail using the scientific instruments on the rover's arm; images from the panoramic camera help scientists choose the locations for this compositional work. The millimeter-scale detail of the lamination covering these rocks can be seen. The face of the rock to the right of the mosaic may be a future target for grinding with the rover's rock abrasion tool.

  1. Effectiveness of Taxicab Security Equipment in Reducing Driver Homicide Rates

    PubMed Central

    Menéndez, Cammie K.C.; Amandus, Harlan E.; Damadi, Parisa; Wu, Nan; Konda, Srinivas; Hendricks, Scott A.

    2015-01-01

    Background Taxicab drivers historically have had one of the highest work-related homicide rates of any occupation. In 2010 the taxicab driver homicide rate was 7.4 per 100,000 drivers, compared to the overall rate of 0.37 per 100,000 workers. Purpose Evaluate the effectiveness of taxicab security cameras and partitions on citywide taxicab driver homicide rates. Methods Taxicab driver homicide rates were compared in 26 major cities in the U.S. licensing taxicabs with security cameras (n=8); bullet-resistant partitions (n=7); and cities where taxicabs were not equipped with either security cameras or partitions (n=11). News clippings of taxicab driver homicides and the number of licensed taxicabs by city were used to construct taxicab driver homicide rates spanning 15 years (1996–2010). Generalized estimating equations were constructed to model the Poisson-distributed homicide rates on city-specific safety equipment installation status, controlling for city homicide rate and the concurrent decline of homicide rates over time. Data were analyzed in 2012. Results Cities with cameras experienced a threefold reduction in taxicab driver homicides compared with control cities (RR=0.27; 95% CI=0.12, 0.61; p=0.002). There was no difference in homicide rates for cities with partitions compared with control cities (RR=1.15; 95% CI=0.80, 1.64; p=0.575). Conclusions Municipal ordinances and company policies mandating security cameras appear to be highly effective in reducing taxicab driver deaths due to workplace violence. PMID:23790983
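
    For readers who want to see the shape of the analysis, the sketch below is a hedged reconstruction of a Poisson GEE on city-year data with an exposure offset, using statsmodels; the file name and column names (`homicides`, `licensed_taxicabs`, `camera`, `partition`, `city_homicide_rate`, `year`, `city`) are assumptions, and the covariates and correlation structure of the published model may differ.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # One row per city-year: homicide count, number of licensed taxicabs,
    # equipment indicators, and controls (hypothetical file and columns).
    df = pd.read_csv("taxicab_city_years.csv")

    model = smf.gee(
        "homicides ~ camera + partition + city_homicide_rate + year",
        groups="city",                                  # repeated measures within city
        data=df,
        family=sm.families.Poisson(),
        cov_struct=sm.cov_struct.Exchangeable(),
        offset=np.log(df["licensed_taxicabs"]),         # models the homicide *rate*
    )
    result = model.fit()
    print(np.exp(result.params))                        # exponentiated coefficients = rate ratios
    ```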

  2. LPT. Low power test control building (TAN641) east facade. Sign ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Low power test control building (TAN-641) east facade. Sign says "Energy and Systems Technology Laboratory, INEL" (Post-ANP-use). Camera facing west. INEEL negative no. HD-40-3-2 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  3. 20. VIEW OF TEST FACILITY IN 1967 WHEN EQUIPPED FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. VIEW OF TEST FACILITY IN 1967 WHEN EQUIPPED FOR DOSIMETER TEST BY HEALTH PHYSICISTS. CAMERA FACING EAST. INEL PHOTO NUMBER 76-2853, TAKEN MAY 16, 1967. PHOTOGRAPHER: CAPEK. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  4. Contextual view of Fyffe Avenue and Boone Drive. Dispensary (Naval ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Fyffe Avenue and Boone Drive. Dispensary (Naval Medical Center Oakland and Dental Clinic San Francisco Branch Clinics, Building no. 417) is shown at left. Camera facing northwest. - Naval Supply Annex Stockton, Rough & Ready Island, Stockton, San Joaquin County, CA

  5. Contextual view of Fyffe Avenue and Boone Drive. Dispensary (Naval ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Fyffe Avenue and Boone Drive. Dispensary (Naval Medical Center Oakland and Dental Clinic San Francisco Branch Clinics, building no. 417) is shown at the center. Camera facing northeast. - Naval Supply Annex Stockton, Rough & Ready Island, Stockton, San Joaquin County, CA

  6. SPERTI, Instrument Cell Building (PER606). Oblique view of north and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SPERT-I, Instrument Cell Building (PER-606). Oblique view of north and east facades. Camera facing southwest. Date: August 2003. INEEL negative no. HD-35-4-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  7. Contextual view showing building H70 at left with building H81 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view showing building H70 at left with building H81 at right in background; camera facing northeast. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  8. 3. VIEW OF ARVFS BUNKER TAKEN FROM APPROXIMATELY 150 FEET ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VIEW OF ARVFS BUNKER TAKEN FROM APPROXIMATELY 150 FEET EAST OF BUNKER DOOR. CAMERA FACING WEST. VIEW SHOWS EARTH MOUND COVERING CONTROL BUNKER AND REMAINS OF CABLE CHASE. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  9. FAST CHOPPER BUILDING, TRA665. DETAIL OF STEEL DOOR ENTRY TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FAST CHOPPER BUILDING, TRA-665. DETAIL OF STEEL DOOR ENTRY TO LOWER LEVEL. CAMERA FACING NORTH. INL NEGATIVE NO. HD42-1. Mike Crane, Photographer, 3/2004 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  10. KENNEDY SPACE CENTER, FLA. - NASA Vehicle Manager Scott Thurston (facing camera) talks to the media in the Orbiter Processing Facility. The media was invited to see the orbiter Atlantis as it is being prepared for Return to Flight. Both local and national reporters representing print and TV networks were able to see work in progress on Atlantis, including the reinstallation of the Reinforced Carbon-Carbon panels on the orbiter’s wing leading edge; wiring inspections; and checks of the engines in the Orbital Maneuvering System.

    NASA Image and Video Library

    2003-09-26

    KENNEDY SPACE CENTER, FLA. - NASA Vehicle Manager Scott Thurston (facing camera) talks to the media in the Orbiter Processing Facility. The media was invited to see the orbiter Atlantis as it is being prepared for Return to Flight. Both local and national reporters representing print and TV networks were able to see work in progress on Atlantis, including the reinstallation of the Reinforced Carbon-Carbon panels on the orbiter’s wing leading edge; wiring inspections; and checks of the engines in the Orbital Maneuvering System.

  11. Hubble Space Telescope: Wide field and planetary camera instrument handbook. Version 2.1

    NASA Technical Reports Server (NTRS)

    Griffiths, Richard (Editor)

    1990-01-01

    An overview is presented of the development and construction of the Wide Field and Planetary Camera (WF/PC). The WF/PC is a dual two-dimensional spectrophotometer with rudimentary polarimetric and transmission-grating capabilities. The instrument operates from 1150 to 11000 A with a resolution of 0.1 arcsec per pixel or 0.043 arcsec per pixel. Data products and standard calibration methods are briefly summarized.

  12. Strategic options towards an affordable high-performance infrared camera

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.

    2016-05-01

    Despite well-documented advantages, the promise of infrared (IR) imaging reaching the low costs achieved by CMOS sensors has been hampered by the inability to attain the cost advantages necessary for crossover from military and industrial applications into the consumer and mass-scale commercial realm. Banpil Photonics is developing affordable IR cameras by adopting new strategies to accelerate the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: a compact, uncooled 640x512-pixel InGaAs system with high sensitivity and low noise (<50 e-), high dynamic range (100 dB), high frame rates (>500 frames per second at full resolution), and low power consumption (<1 W). This camera paves the way toward mass-market adoption by demonstrating the high-performance IR imaging demanded by military and industrial applications while also illuminating a path toward price points justifiable for consumer-facing industries such as automotive, medical, and security imaging. The strategic options presented include new sensor manufacturing technologies that scale favorably toward automation, readout electronics compatible with multiple focal-plane arrays, and dense, ultra-small pixel-pitch devices.

  13. Predicting high risk of exacerbations in bronchiectasis: the E-FACED score.

    PubMed

    Martinez-Garcia, M A; Athanazio, R A; Girón, R; Máiz-Carro, L; de la Rosa, D; Olveira, C; de Gracia, J; Vendrell, M; Prados-Sánchez, C; Gramblicka, G; Corso Pereira, M; Lundgren, F L; Fernandes De Figueiredo, M; Arancibia, F; Rached, S Z

    2017-01-01

    Although the FACED score has demonstrated a great prognostic capacity in bronchiectasis, it does not include the number or severity of exacerbations as a separate variable, which is important in the natural history of these patients. Construction and external validation of a new index, the E-FACED, to evaluate the predictive capacity of exacerbations and mortality. The new score was constructed on the basis of the complete cohort for the construction of the original FACED score, while the external validation was undertaken with six cohorts from three countries (Brazil, Argentina, and Chile). The main outcome was the number of annual exacerbations/hospitalizations, with all-cause and respiratory-related deaths as the secondary outcomes. A statistical evaluation comprised the relative weight and ideal cut-off point for the number or severity of the exacerbations and was incorporated into the FACED score (E-FACED). The results obtained after the application of FACED and E-FACED were compared in both the cohorts. A total of 1,470 patients with bronchiectasis (819 from the construction cohorts and 651 from the external validation cohorts) were followed up for 5 years after diagnosis. The best cut-off point was at least two exacerbations in the previous year (two additional points), meaning that the E-FACED has nine points of growing severity. E-FACED presented an excellent prognostic capacity for exacerbations (areas under the receiver operating characteristic curve: 0.82 for at least two exacerbations in 1 year and 0.87 for at least one hospitalization in 1 year) that was statistically better than that of the FACED score (0.72 and 0.78, P <0.05, respectively). The predictive capacities for all-cause and respiratory mortality were 0.87 and 0.86, respectively, with both being similar to those of the FACED. E-FACED score significantly increases the FACED capacity to predict future yearly exacerbations while maintaining the score's simplicity and prognostic capacity for death.
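
    As a worked illustration of how the exacerbation item extends the score, the sketch below adds two points when a patient had at least two exacerbations in the previous year, the cut-off reported above; treating the base FACED score as a 0-7 point value is recalled from the bronchiectasis literature rather than stated in this abstract, so the range check is an assumption.

    ```python
    def e_faced(faced_points: int, exacerbations_last_year: int) -> int:
        """Extend a base FACED score (assumed 0-7) to the 0-9 point E-FACED."""
        if not 0 <= faced_points <= 7:
            raise ValueError("FACED score is assumed to range from 0 to 7 points")
        exacerbation_points = 2 if exacerbations_last_year >= 2 else 0
        return faced_points + exacerbation_points
    ```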

  14. Predicting high risk of exacerbations in bronchiectasis: the E-FACED score

    PubMed Central

    Martinez-Garcia, MA; Athanazio, RA; Girón, R; Máiz-Carro, L; de la Rosa, D; Olveira, C; de Gracia, J; Vendrell, M; Prados-Sánchez, C; Gramblicka, G; Corso Pereira, M; Lundgren, FL; Fernandes De Figueiredo, M; Arancibia, F; Rached, SZ

    2017-01-01

    Background Although the FACED score has demonstrated a great prognostic capacity in bronchiectasis, it does not include the number or severity of exacerbations as a separate variable, which is important in the natural history of these patients. Objective Construction and external validation of a new index, the E-FACED, to evaluate the predictive capacity of exacerbations and mortality. Methods The new score was constructed on the basis of the complete cohort for the construction of the original FACED score, while the external validation was undertaken with six cohorts from three countries (Brazil, Argentina, and Chile). The main outcome was the number of annual exacerbations/hospitalizations, with all-cause and respiratory-related deaths as the secondary outcomes. A statistical evaluation comprised the relative weight and ideal cut-off point for the number or severity of the exacerbations and was incorporated into the FACED score (E-FACED). The results obtained after the application of FACED and E-FACED were compared in both the cohorts. Results A total of 1,470 patients with bronchiectasis (819 from the construction cohorts and 651 from the external validation cohorts) were followed up for 5 years after diagnosis. The best cut-off point was at least two exacerbations in the previous year (two additional points), meaning that the E-FACED has nine points of growing severity. E-FACED presented an excellent prognostic capacity for exacerbations (areas under the receiver operating characteristic curve: 0.82 for at least two exacerbations in 1 year and 0.87 for at least one hospitalization in 1 year) that was statistically better than that of the FACED score (0.72 and 0.78, P<0.05, respectively). The predictive capacities for all-cause and respiratory mortality were 0.87 and 0.86, respectively, with both being similar to those of the FACED. Conclusion E-FACED score significantly increases the FACED capacity to predict future yearly exacerbations while maintaining the score’s simplicity and prognostic capacity for death. PMID:28182132

  15. Beam measurements using visible synchrotron light at NSLS2 storage ring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Weixing, E-mail: chengwx@bnl.gov; Bacha, Bel; Singh, Om

    2016-07-27

    A visible Synchrotron Light Monitor (SLM) diagnostic beamline has been designed and constructed at the NSLS2 storage ring to characterize the electron beam profile under various machine conditions. Thanks to excellent alignment, the SLM beamline was able to see the first visible light while the beam was circulating the ring on its first turn. The beamline has been commissioned over the past year. Besides a normal CCD camera for monitoring the beam profile, a streak camera and a gated camera are used to measure the longitudinal and transverse profiles to understand the beam dynamics. Measurement results from these cameras are presented in this paper. A time-correlated single photon counting (TCSPC) system has also been set up to measure the single-bunch purity.

  16. Integration and verification testing of the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.

    2016-08-01

    We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and now under construction, comprises a 3.2 Giga-pixel imager and a three-element corrector with a 3.5-degree-diameter field of view. LSST Camera Integration and Test will take place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the anticipated technical challenges.

  17. Standardized rendering from IR surveillance motion imagery

    NASA Astrophysics Data System (ADS)

    Prokoski, F. J.

    2014-06-01

    Government agencies, including defense and law enforcement, increasingly make use of video from surveillance systems and camera phones owned by non-government entities. Making advanced and standardized motion imaging technology available to private and commercial users at cost-effective prices would benefit all parties. In particular, incorporating thermal infrared into commercial surveillance systems offers substantial benefits beyond night vision capability. Face rendering is a process to facilitate exploitation of thermal infrared surveillance imagery from the general area of a crime scene, to assist investigations with and without cooperating eyewitnesses. Face rendering automatically generates greyscale representations similar to police artist sketches for faces in surveillance imagery collected at locations and times proximate to a crime under investigation. Near-real-time generation of face renderings can provide law enforcement with an investigation tool to assess witness memory and credibility, and to integrate reports from multiple eyewitnesses. Renderings can be quickly disseminated through social media to warn of a person who may pose an immediate threat, and to solicit the public's help in identifying possible suspects and witnesses. Renderings are pose-standardized so as not to divulge the presence and location of eyewitnesses and surveillance cameras. Incorporation of thermal infrared imaging into commercial surveillance systems will significantly improve system performance and reduce manual review times, at an incremental cost that will continue to decrease. Benefits to criminal justice would include improved reliability of eyewitness testimony and improved accuracy in distinguishing among minority groups in eyewitness and surveillance identifications.

  18. Reverse alignment "mirror image" visualization as a laparoscopic training tool improves task performance.

    PubMed

    Dunnican, Ward J; Singh, T Paul; Ata, Ashar; Bendana, Emma E; Conlee, Thomas D; Dolce, Charles J; Ramakrishnan, Rakesh

    2010-06-01

    Reverse alignment (mirror image) visualization is a disconcerting situation occasionally faced during laparoscopic operations. This occurs when the camera faces back at the surgeon in the opposite direction from which the surgeon's body and instruments are facing. Most surgeons will attempt to optimize trocar and camera placement to avoid this situation. The authors' objective was to determine whether the intentional use of reverse alignment visualization during laparoscopic training would improve performance. A standard box trainer was configured for reverse alignment, and 34 medical students and junior surgical residents were randomized to train with either forward alignment (DIRECT) or reverse alignment (MIRROR) visualization. Enrollees were tested on both modalities before and after a 4-week structured training program specific to their modality. Student's t test was used to determine differences in task performance between the 2 groups. Twenty-one participants completed the study (10 DIRECT, 11 MIRROR). There were no significant differences in performance time between DIRECT or MIRROR participants during forward or reverse alignment initial testing. At final testing, DIRECT participants had improved times only in forward alignment performance; they demonstrated no significant improvement in reverse alignment performance. MIRROR participants had significant time improvement in both forward and reverse alignment performance at final testing. Reverse alignment imaging for laparoscopic training improves task performance for both reverse alignment and forward alignment tasks. This may be translated into improved performance in the operating room when faced with reverse alignment situations. Minimal lab training can account for drastic adaptation to this environment.

  19. 8. VIEW NORTH DURING CONSTRUCTION, DECEMBER 1995, FACE OF ORIGINAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VIEW NORTH DURING CONSTRUCTION, DECEMBER 1995, FACE OF ORIGINAL 1882 MASONRY DAM WITH CAPSTONES - Norwich Water Power Company, Dam, West bank of Shetucket River opposite Fourteenth Street, Greenville section, Norwich, New London County, CT

  20. Image quality assessment for selfies with and without super resolution

    NASA Astrophysics Data System (ADS)

    Kubota, Aya; Gohshi, Seiichi

    2018-04-01

    With the advent of cellphone cameras, particularly on smartphones, many people now take photos of themselves, alone or with others in the frame; such photos are popularly known as "selfies". Most smartphones are equipped with two cameras: a front-facing and a rear camera. The camera on the back of the smartphone is referred to as the "out-camera," whereas the one on the front is called the "in-camera." In-cameras are mainly used for selfies. Some smartphones feature high-resolution cameras; however, the full image quality cannot be obtained because smartphone cameras often have low-performance lenses. Super resolution (SR) is a recent technological advancement that increases image resolution. We developed a new SR technology that can be processed on smartphones, and smartphones with this technology are already on the market. However, the effective use of the new SR technology has not yet been verified, so comparing image quality with and without SR on the smartphone display is necessary to confirm its usefulness. Both objective and subjective assessment methods are required to quantitatively measure image quality. It is known that typical objective assessment values, such as the peak signal-to-noise ratio (PSNR), do not correlate well with how we perceive images and video. When digital broadcasting started, the standard was determined using subjective assessment. Although subjective assessment usually comes at high cost because of personnel expenses for observers, the results are highly reproducible when the tests are conducted under proper conditions and analyzed statistically. In this study, the subjective assessment results for selfie images are reported.
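
    To make the objective-versus-subjective contrast concrete, the snippet below computes the PSNR mentioned above for a pair of 8-bit images; it is a generic textbook formulation, not part of the study's protocol.

    ```python
    import numpy as np

    def psnr(reference: np.ndarray, test: np.ndarray, peak: float = 255.0) -> float:
        """Peak signal-to-noise ratio in dB between two equally sized 8-bit images."""
        mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
        return float("inf") if mse == 0.0 else 10.0 * np.log10(peak ** 2 / mse)
    ```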

  1. Hydraulics of embankment-dam breaching

    NASA Astrophysics Data System (ADS)

    Walder, J. S.; Iverson, R. M.; Logan, M.; Godt, J. W.; Solovitz, S.

    2012-12-01

    Constructed or natural earthen dams can pose hazards to downstream communities. Experiments to date on earthen-dam breaching have focused on dam geometries relevant to engineering practice. We have begun experiments with dam geometries more like those of natural dams. Water was impounded behind dams constructed at the downstream end of the USGS debris-flow flume. Dams were made of compacted, well-sorted, moist beach sand (D50=0.21 mm), 3.5 m from toe to toe, but varying in height from 0.5 to 1 m; the lower the dam, the smaller the reservoir volume and the broader the initially flat crest. Breaching was started by cutting a slot 30-40 mm wide and deep in the dam crest after filling the reservoir. Water level and pore pressure within the dam were monitored. Experiments were also recorded by an array of still- and video cameras above the flume and a submerged video camera pointed at the upstream dam face. Photogrammetric software was used to create DEMs from stereo pairs, and particle-image velocimetry was used to compute the surface-velocity field from the motion of tracers scattered on the water surface. As noted by others, breaching involves formation and migration of a knickpoint (or several). Once the knickpoint reaches the upstream dam face, it takes on an arcuate form whose continued migration we determined by measuring the onset of motion of colored markers on the dam face. The arcuate feature, which can be considered the head of the "breach channel", is nearly coincident with the transition from subcritical to supercritical flow; that is, it acts as a weir that hydraulically controls reservoir emptying. Photogenic slope failures farther downstream, although the morphologically dominant process at work, play no role at all in hydraulic control aside from rare instances in which they extend upstream so far as to perturb the weir, where the flow cross section is nearly self-similar through time. The domain downstream of the critical-flow section does influence the hydrograph in another way: the broader the initial dam crest, the longer the time before critical flow control is established. Flood duration is thus increased but peak discharge is decreased. Visual inspection and overhead videography reveal little turbidity in water pouring over the weir, implying that sediment there moves dominantly as bedload. Furthermore, underwater videography gives the overall impression that along the upstream dam face, erosion occurs without redeposition. Thus it would be a mistake to use empiricisms for equilibrium bedload transport to model erosion of the embankment. In mathematical terms, erosion rate cannot be backed out by calculating the divergence of transport rate; rather, transport rate should be regarded as the spatial integral of erosion rate. We use photogrammetry and motion of the colored markers to determine the erosion rate of the weir, and then infer shear stress at the weir by applying the van Rijn sediment-pickup function. Shear stress determined in this fashion is much less than what one calculates from the gradient of the energy head (an approach appropriate to steady flow). Shear stress inferred from the pickup-function calculation can serve as a constraint on computational fluid-dynamics models. Another constraint on such models, revealed by the underwater videography, is the upstream limit of sand movement, where bed shear stress equals the critical value for sand entrainment.
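
    The inversion step described above can be sketched as follows, using the commonly cited form of the van Rijn (1984) pickup function; the coefficient, exponents, critical Shields value, and all numerical inputs here are assumptions for illustration, not values taken from the experiments.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    g, nu = 9.81, 1.0e-6            # gravity (m/s^2), kinematic viscosity of water (m^2/s)
    rho, rho_s = 1000.0, 2650.0     # water and quartz-sand densities (kg/m^3)
    d50 = 0.21e-3                   # median grain size of the beach sand (m)
    s = rho_s / rho
    d_star = d50 * ((s - 1.0) * g / nu**2) ** (1.0 / 3.0)    # dimensionless grain size
    theta_cr = 0.047                # assumed critical Shields parameter

    def pickup_rate(tau):
        """Mass pickup (erosion) rate E in kg/m^2/s predicted for bed shear stress tau (Pa)."""
        theta = tau / ((rho_s - rho) * g * d50)
        T = max(theta / theta_cr - 1.0, 0.0)                 # excess-shear (transport-stage) parameter
        return 0.00033 * rho_s * np.sqrt((s - 1.0) * g * d50) * d_star**0.3 * T**1.5

    def shear_from_erosion(E_measured, tau_max=50.0):
        """Solve pickup_rate(tau) = E_measured for tau; widen tau_max if no root is bracketed."""
        return brentq(lambda tau: pickup_rate(tau) - E_measured, 1e-6, tau_max)
    ```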

  2. 2. EXTERIOR VIEW OF DOWNSTREAM SIDE OF COTTAGE 191 TAKEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. EXTERIOR VIEW OF DOWNSTREAM SIDE OF COTTAGE 191 TAKEN FROM ROOF OF GARAGE 393. CAMERA FACING SOUTHEAST. COTTAGE 181 AND CHILDREN'S PLAY AREA VISIBLE ON EITHER SIDE OF ROOF. GRAPE ARBOR IN FOREGROUND. - Swan Falls Village, Cottage 191, Snake River, Kuna, Ada County, ID

  3. Contextual view of the Hall of Transportation from Yerba Buena ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of the Hall of Transportation from Yerba Buena Island, showing Palace of Fine and Decorative Arts (Building 3) at far right, camera facing northwest - Golden Gate International Exposition, Hall of Transportation, 440 California Avenue, Treasure Island, San Francisco, San Francisco County, CA

  4. FAN HOUSE INTERIOR. THREE MOTOR DRIVES FOR POSITIVE DISPLACEMENT BLOWERS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FAN HOUSE INTERIOR. THREE MOTOR DRIVES FOR POSITIVE DISPLACEMENT BLOWERS LINE UP ON NORTH WALL. CONCRETE PEDESTALS. CAMERA FACES NORTHEAST. INL NEGATIVE NO. 4291. Unknown Photographer, 2/26/1952 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  5. Interior view of west main room in original tworoom portion. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of west main room in original two-room portion. Note muslin ceiling temporarily tacked up by the HABS team to afford clearer view. Camera facing west. - Warner Ranch, Ranch House, San Felipe Road (State Highway S2), Warner Springs, San Diego County, CA

  6. Methodology of the determination of the uncertainties by using the biometric device the broadway 3D

    NASA Astrophysics Data System (ADS)

    Jasek, Roman; Talandova, Hana; Adamek, Milan

    2016-06-01

    Biometric identification by face is among the most widely used methods of biometric identification. Because it provides faster and more accurate identification, it has been adopted in the security field. A 3D face reader from the manufacturer Broadway was used for the measurements. It is equipped with a 3D camera system that uses structured-light scanning and stores the template as a 3D model of the face. The acquired data were evaluated with the Turnstile Enrolment Application (TEA) software. The measurements used the Broadway 3D face reader. First, each person was scanned and stored in the database. Thereafter, the person was compared with the template stored in the database for each method. Finally, a measure of reliability was evaluated for the Broadway 3D face reader.

  7. An exploratory study into the effect of time-restricted internet access on face-validity, construct validity and reliability of postgraduate knowledge progress testing

    PubMed Central

    2013-01-01

    Background Yearly formative knowledge testing (also known as progress testing) was shown to have a limited construct-validity and reliability in postgraduate medical education. One way to improve construct-validity and reliability is to improve the authenticity of a test. As easily accessible internet has become inseparably linked to daily clinical practice, we hypothesized that allowing internet access for a limited amount of time during the progress test would improve the perception of authenticity (face-validity) of the test, which would in turn improve the construct-validity and reliability of postgraduate progress testing. Methods Postgraduate trainees taking the yearly knowledge progress test were asked to participate in a study where they could access the internet for 30 minutes at the end of a traditional pen and paper test. Before and after the test they were asked to complete a short questionnaire regarding the face-validity of the test. Results Mean test scores increased significantly for all training years. Trainees indicated that the face-validity of the test improved with internet access and that they would like to continue to have internet access during future testing. Internet access did not improve the construct-validity or reliability of the test. Conclusion Improving the face-validity of postgraduate progress testing, by adding the possibility to search the internet for a limited amount of time, positively influences test performance and face-validity. However, it did not change the reliability or the construct-validity of the test. PMID:24195696

  8. An Orientation Sensor-Based Head Tracking System for Driver Behaviour Monitoring.

    PubMed

    Zhao, Yifan; Görne, Lorenz; Yuen, Iek-Man; Cao, Dongpu; Sullman, Mark; Auger, Daniel; Lv, Chen; Wang, Huaji; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros

    2017-11-22

    Although at present legislation does not allow drivers in a Level 3 autonomous vehicle to engage in a secondary task, there may come a time when it does. Monitoring the behaviour of drivers engaging in various non-driving activities (NDAs) is crucial to decide how well the driver will be able to take over control of the vehicle. One limitation of the commonly used face-based head tracking system, using cameras, is that sufficient features of the face must be visible, which limits the detectable angle of head movement and thereby the measurable NDAs, unless multiple cameras are used. This paper proposes a novel orientation sensor-based head tracking system that includes twin devices, one of which measures the movement of the vehicle while the other measures the absolute movement of the head. Measurement errors in the shaking and nodding axes were less than 0.4°, while error in the rolling axis was less than 2°. Comparison with a camera-based system, through in-house tests and on-road tests, showed that the main advantage of the proposed system is the ability to detect angles larger than 20° in the shaking and nodding axes. Finally, a case study demonstrated that the measurement of the shaking and nodding angles, produced from the proposed system, can effectively characterise the drivers' behaviour while engaged in the NDAs of chatting to a passenger and playing on a smartphone.
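
    The core geometric idea of the twin-device arrangement, recovering head motion relative to the vehicle by removing the vehicle's own rotation, can be sketched as below; the quaternion convention, axis naming, and any sensor-fusion details are assumptions, since the paper's processing pipeline is not reproduced here.

    ```python
    from scipy.spatial.transform import Rotation as R

    def head_relative_to_vehicle(q_head_world, q_vehicle_world):
        """Both inputs are [x, y, z, w] quaternions measured in a common world frame."""
        r_head = R.from_quat(q_head_world)
        r_vehicle = R.from_quat(q_vehicle_world)
        r_rel = r_vehicle.inv() * r_head                 # head orientation expressed in the vehicle frame
        yaw, pitch, roll = r_rel.as_euler("zyx", degrees=True)
        return {"shake_deg": yaw, "nod_deg": pitch, "roll_deg": roll}
    ```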

  9. Nonintrusive iris image acquisition system based on a pan-tilt-zoom camera and light stripe projection

    NASA Astrophysics Data System (ADS)

    Yoon, Soweon; Jung, Ho Gi; Park, Kang Ryoung; Kim, Jaihie

    2009-03-01

    Although iris recognition is one of the most accurate biometric technologies, it has not yet been widely used in practical applications. This is mainly due to user inconvenience during the image acquisition phase. Specifically, users must try to adjust their eye position within a small capture volume at a close distance from the system. To overcome these problems, we propose a novel iris image acquisition system that provides users with unconstrained environments: a large operating range, movement from a standing posture, and the capture of good-quality iris images in an acceptable time. The proposed system makes the following three contributions compared with previous works: (1) the capture volume is significantly increased by using a pan-tilt-zoom (PTZ) camera guided by a light stripe projection, (2) the iris location in the large capture volume is found quickly through 1-D vertical face searching from the user's horizontal position obtained by the light stripe projection, and (3) zooming and focusing on the user's irises at a distance are accurate and fast using the 3-D face position estimated from the light stripe projection and the PTZ camera. Experimental results show that the proposed system can capture good-quality iris images in 2.479 s on average at a distance of 1.5 to 3 m, while allowing a limited amount of movement by the user.

  10. Automatic 2.5-D Facial Landmarking and Emotion Annotation for Social Interaction Assistance.

    PubMed

    Zhao, Xi; Zou, Jianhua; Li, Huibin; Dellandrea, Emmanuel; Kakadiaris, Ioannis A; Chen, Liming

    2016-09-01

    People with low vision, Alzheimer's disease, and autism spectrum disorder experience difficulties in perceiving or interpreting facial expression of emotion in their social lives. Though automatic facial expression recognition (FER) methods on 2-D videos have been extensively investigated, their performance is constrained by challenges in head pose and lighting conditions. The shape information in 3-D facial data can reduce or even overcome these challenges. However, the high expense of 3-D cameras prevents their widespread use. Fortunately, 2.5-D facial data from emerging portable RGB-D cameras provide a good balance for this dilemma. In this paper, we propose an automatic emotion annotation solution on 2.5-D facial data collected from RGB-D cameras. The solution consists of a facial landmarking method and a FER method. Specifically, we propose building a deformable partial face model and fitting the model to a 2.5-D face for localizing facial landmarks automatically. In FER, a novel action unit (AU) space-based FER method has been proposed. Facial features are extracted using landmarks and further represented as coordinates in the AU space, which are classified into facial expressions. Evaluated on three publicly accessible facial databases, namely the EURECOM, FRGC, and Bosphorus databases, the proposed facial landmarking and expression recognition methods have achieved satisfactory results. Possible real-world applications using our algorithms are also discussed.

  11. Examining the Challenging Hindrances facing in the Construction Projects: South India’s Perspective

    NASA Astrophysics Data System (ADS)

    Subramanyam, K.; Haridharan, M. K.

    2017-07-01

    Developing countries like India require huge infrastructure to meet the needs of their people, and the construction industry provides many opportunities to individuals. A construction manager's job is to supervise and organize the activities in construction projects, and construction managers now face many challenges. This paper studies the challenges faced by construction managers as perceived by construction professionals. Thirty-nine variables found in the literature to have a severe impact on construction managers' performance were considered. Construction managers, project managers, and site engineers were the respondents for this survey. Regression analysis in SPSS identified the significant challenges, which were classified into five domains. Among management challenges, resource availability and allocation, risks and uncertainties existing onsite, top management support, and cost constraints are the most significant variables. Among the skills required of a construction manager, the technical skills needed to learn and adopt new technology in the project, and decision making and planning according to the situation on site, are the most significant. Among performance challenges, implementation of tasks according to the plan is the important variable, whereas among onsite challenges, managing project risks and developing project policies and procedures are the most important.

  12. Light field rendering with omni-directional camera

    NASA Astrophysics Data System (ADS)

    Todoroki, Hiroshi; Saito, Hideo

    2003-06-01

    This paper presents an approach to capturing the visual appearance of a real environment, such as the interior of a room. We propose a method for generating arbitrary-viewpoint images by building a light field with an omni-directional camera, which can capture wide surroundings. The omni-directional camera used in this technique is a special camera with a hyperbolic mirror mounted above it, so that the luminosity of the environment over a full 360 degrees can be captured in a single image. We apply the light field method, a technique of image-based rendering (IBR), to generate the arbitrary-viewpoint images. The light field is a kind of database that records the luminosity information in the object space. Because we use the omni-directional camera to construct the light field, we can collect images spanning many view directions. Our method thus allows the user to explore a wide scene and achieves a realistic representation of the virtual environment. To demonstrate the proposed method, we captured an image sequence of our lab's interior with an omni-directional camera and successfully generated arbitrary-viewpoint images for a virtual tour of the environment.

  13. Triton Mosaic

    NASA Image and Video Library

    1999-08-25

    Mosaic of Triton constructed from 16 individual images. After the camera pointing errors were globally minimized, the frames were reprocessed by map projection, photometric function removal, and placement in the mosaic.

  14. High-accuracy and robust face recognition system based on optical parallel correlator using a temporal image sequence

    NASA Astrophysics Data System (ADS)

    Watanabe, Eriko; Ishikawa, Mami; Ohta, Maiko; Kodate, Kashiko

    2005-09-01

    Face recognition is used in a wide range of security systems, such as monitoring credit card use, searching for individuals with street cameras via the Internet, and immigration control. Many technical subjects are still under study; for instance, the number of images that can be stored is limited under current systems, and the recognition rate must be improved to account for photos taken at different angles under various conditions. We implemented a fully automatic Fast Face Recognition Optical Correlator (FARCO) system using a 1000 frame/s optical parallel correlator that we designed and assembled. The operational speed for a 1:N identification experiment (matching one image against N, where N is the number of images in the database; here 4000 face images) is less than 1.5 seconds, including pre- and post-processing. Trial 1:N identification experiments using FARCO yielded low error rates: a 2.6% false reject rate and a 1.3% false accept rate. By exploiting the high-speed data-processing capability of this system, much greater robustness can be achieved across recognition conditions when many images are registered for a single person. We propose a face recognition algorithm for the FARCO that employs a temporal sequence of moving images. Applied to natural postures, this algorithm scored a recognition rate twice that of our conventional system. The system has high potential for a variety of future uses, such as searching for criminal suspects with street and airport video cameras, registering babies at hospitals, or handling very large image databases.
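
    The sketch below is a digital analogue of the matching the FARCO performs optically: an FFT-based circular cross-correlation between a probe face image and a stored reference, with the normalized correlation peak used as the match score. It is only an illustration of the correlation principle, not the system's actual optical filter design.

    ```python
    import numpy as np

    def correlation_score(probe: np.ndarray, reference: np.ndarray) -> float:
        """Peak of the circular cross-correlation between two equally sized images."""
        p = (probe - probe.mean()) / (probe.std() + 1e-12)   # zero-mean, unit-variance
        r = (reference - reference.mean()) / (reference.std() + 1e-12)
        corr = np.fft.ifft2(np.fft.fft2(p) * np.conj(np.fft.fft2(r))).real
        return float(corr.max() / p.size)                    # ~1.0 for a well-aligned match
    ```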

  15. I spy with my little eye: typical, daily exposure to faces documented from a first-person infant perspective.

    PubMed

    Sugden, Nicole A; Mohamed-Ali, Marwan I; Moulson, Margaret C

    2014-02-01

    Exposure to faces is known to shape and change the face processing system; however, no study has yet documented infants' natural daily first-hand exposure to faces. One- and three-month-old infants' visual experience was recorded through head-mounted cameras. The video recordings were coded for faces to determine: (1) How often are infants exposed to faces? (2) To what type of faces are they exposed? and (3) Do frequently encountered face types reflect infants' typical pattern of perceptual narrowing? As hypothesized, infants spent a large proportion of their time (25%) exposed to faces; these faces were primarily female (70%), own-race (96%), and adult-age (81%). Infants were exposed to more individual exemplars of female, own-race, and adult-age faces than to male, other-race, and child- or older-adult-age faces. Each exposure to own-race faces was longer than to other-race faces. There were no differences in exposure duration related to the gender or age of the face. Previous research has found that the face types frequently experienced by our participants are preferred over and more successfully recognized than other face types. The patterns of face exposure revealed in the current study coincide with the known trajectory of perceptual narrowing seen later in infancy. © 2013 The Authors. Developmental Psychobiology Published by Wiley Periodicals, Inc.

  16. 1. GENERAL VIEW OF SLC3W SHOWING SOUTH FACE AND EAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. GENERAL VIEW OF SLC-3W SHOWING SOUTH FACE AND EAST SIDE OF A-FRAME MOBILE SERVICE TOWER (MST). MST IN SERVICE POSITION OVER LAUNCHER AND FLAME BUCKET. CABLE TRAYS BETWEEN LAUNCH OPERATIONS BUILDING (BLDG. 763) AND SLC-3W IN FOREGROUND. LIQUID OXYGEN APRON VISIBLE IMMEDIATELY EAST (RIGHT) OF MST; FUEL APRON VISIBLE IMMEDIATELY WEST (LEFT) OF MST. A PORTION OF THE FLAME BUCKET VISIBLE BELOW THE SOUTH FACE OF THE MST. CAMERA TOWERS VISIBLE EAST OF MST BETWEEN ROAD AND CABLE TRAY, AND SOUTH OF MST NEAR LEFT MARGIN OF PHOTOGRAPH. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 West, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  17. HOT CELL BUILDING, TRA632. CONTEXTUAL AERIAL VIEW OF HOT CELL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT CELL BUILDING, TRA-632. CONTEXTUAL AERIAL VIEW OF HOT CELL BUILDING, IN VIEW AT LEFT, AS YET WITHOUT ROOF. PLUG STORAGE BUILDING LIES BETWEEN IT AND THE SOUTH SIDE OF THE MTR BUILDING AND ITS WING. NOTE CONCRETE DRIVE BETWEEN ROLL-UP DOOR IN MTR BUILDING AND CHARGING FACE OF PLUG STORAGE. REACTOR SERVICES BUILDING (TRA-635) WILL COVER THIS DRIVE AND BUTT UP TO CHARGING FACE. DOTTED LINE IS ON ORIGINAL NEGATIVE. TRA PARKING LOT IN LEFT CORNER OF THE VIEW. CAMERA FACING NORTHWESTERLY. INL NEGATIVE NO. 8274. Unknown Photographer, 7/2/1953 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  18. View of MISSE taken during Expedition Six

    NASA Image and Video Library

    2003-01-01

    ISS006-348-019 (January 2003) ---- Materials International Space Station Experiment (MISSE), a suitcase-sized experiment attached to the outside of the space station to expose hundreds of potential space construction materials to the environment, leading to stronger, more durable spacecraft construction. Photographed by one of the Expedition 6 crew members with a 35mm camera.

  19. Instruments for Imaging from Far to Near

    NASA Technical Reports Server (NTRS)

    Mungas, Greg; Boynton, John; Sepulveda, Cesar

    2009-01-01

    The acronym CHAMP (signifying camera, hand lens, and microscope ) denotes any of several proposed optoelectronic instruments that would be capable of color imaging at working distances that could be varied continuously through a range from infinity down to several millimeters. As in any optical instrument, the magnification, depth of field, and spatial resolution would vary with the working distance. For example, in one CHAMP version, at a working distance of 2.5 m, the instrument would function as an electronic camera with a magnification of 1/100, whereas at a working distance of 7 mm, the instrument would function as a microscope/electronic camera with a magnification of 4.4. Moreover, as described below, when operating at or near the shortest-working-distance/highest-magnification combination, a CHAMP could be made to perform one or more spectral imaging functions. CHAMPs were originally intended to be used in robotic geological exploration of the Moon and Mars. The CHAMP concept also has potential for diverse terrestrial applications that could include remotely controlled or robotic geological exploration, prospecting, field microbiology, environmental surveying, and assembly- line inspection. A CHAMP (see figure) would include two lens cells: (1) a distal cell corresponding to the objective lens assembly of a conventional telescope or microscope and (2) a proximal cell that would contain the focusing camera lens assembly and the camera electronic image-detector chip, which would be of the active-pixel-sensor (APS) type. The distal lens cell would face outward from a housing, while the proximal lens cell would lie in a clean environment inside the housing. The proximal lens cell would contain a beam splitter that would enable simultaneous use of the imaging optics (that is, proximal and distal lens assemblies) for imaging and illumination of the field of view. The APS chip would be mounted on a focal plane on a side face of the beam splitter, while light for illuminating the field of view would enter the imaging optics via the end face of the beam splitter. The proximal lens cell would be mounted on a sled that could be translated along the optical axis for focus adjustment. The position of the CHAMP would initially be chosen at the desired working distance of the distal lens from (corresponding to an approximate desired magnification of) an object to be examined. During subsequent operation, the working distance would ordinarily remain fixed at the chosen value and the position of the proximal lens cell within the instrument would be adjusted for focus as needed.

  20. Technology survey on video face tracking

    NASA Astrophysics Data System (ADS)

    Zhang, Tong; Gomes, Herman Martins

    2014-03-01

    With the pervasiveness of monitoring cameras installed in public areas, schools, hospitals, workplaces and homes, video analytics technologies for interpreting these video contents are becoming increasingly relevant to people's lives. Among such technologies, human face detection and tracking (and face identification in many cases) are particularly useful in various application scenarios. While plenty of research has been conducted on face tracking and many promising approaches have been proposed, there are still significant challenges in recognizing and tracking people in videos with uncontrolled capturing conditions, largely due to pose and illumination variations, as well as occlusions and cluttered background. It is especially complex to track and identify multiple people simultaneously in real time due to the large amount of computation involved. In this paper, we present a survey of the literature and software published or developed in recent years on the face tracking topic. The survey covers the following topics: 1) mainstream and state-of-the-art face tracking methods, including features used to model the targets and metrics used for tracking; 2) face identification and face clustering from face sequences; and 3) software packages or demonstrations that are available for algorithm development or trial. A number of publicly available databases for face tracking are also introduced.
