Monitoring Hitting Load in Tennis Using Inertial Sensors and Machine Learning.
Whiteside, David; Cant, Olivia; Connolly, Molly; Reid, Machar
2017-10-01
Quantifying external workload is fundamental to training prescription in sport. In tennis, global positioning data are imprecise and fail to capture hitting loads, while the current gold standard (manual notation) is time intensive and often not possible given players' heavy travel schedules. The aim of this study was to develop an automated stroke-classification system to help quantify hitting load in tennis. Nineteen athletes wore an inertial measurement unit (IMU) on the wrist during 66 video-recorded training sessions. Video footage was manually notated so that the known shot type (serve, rally forehand, slice forehand, forehand volley, rally backhand, slice backhand, backhand volley, smash, or false positive) was associated with the corresponding IMU data for 28,582 shots. Six types of machine-learning models were then constructed to classify true shot type from the IMU signals. Across 10-fold cross-validation, a cubic-kernel support vector machine classified binned shots (overhead, forehand, or backhand) with an accuracy of 97.4%. A second cubic-kernel support vector machine achieved 93.2% accuracy when classifying all 9 shot types. With a view to monitoring external load, the combination of miniature inertial sensors and machine learning offers a practical and automated method of quantifying shot counts and discriminating shot types in elite tennis players.
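The classification step described above can be sketched with scikit-learn. The features, class means, and sample counts below are synthetic stand-ins (the paper's IMU features are not public), and "cubic kernel" is read as a degree-3 polynomial kernel:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_per_class = 60
# Synthetic 6-D "IMU summary features" per shot; the three binned classes
# (overhead, forehand, backhand) are separated by their feature means.
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(n_per_class, 6))
               for m in (0.0, 3.0, 6.0)])
y = np.repeat(["overhead", "forehand", "backhand"], n_per_class)

# A degree-3 polynomial kernel corresponds to the "cubic kernel" naming.
clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3))
scores = cross_val_score(clf, X, y, cv=10)
print(f"10-fold accuracy: {scores.mean():.2f}")
```

On this easily separable synthetic data the accuracy is near-perfect; the paper's reported figures refer to real IMU signals, which are far harder.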
Object detection in cinematographic video sequences for automatic indexing
NASA Astrophysics Data System (ADS)
Stauder, Jurgen; Chupeau, Bertrand; Oisel, Lionel
2003-06-01
This paper presents an object detection framework applied to cinematographic post-processing of video sequences. Post-processing is done after production and before editing. At the beginning of each shot of a video, a slate (also called a clapperboard) is shown. Notably, the slate contains an electronic audio timecode that is necessary for audio-visual synchronization. This paper presents an object detection framework to detect slates in video sequences for automatic indexing and post-processing. It is based on five steps. The first two steps aim to drastically reduce the video data to be analyzed. They ensure a high recall rate but have low precision. The first step detects images at the beginning of a shot that may show a slate, while the second step searches these images for candidate regions with a color distribution similar to slates. The objective is to miss no slates while eliminating long parts of video without slate appearances. The third and fourth steps are statistical classification and pattern matching to detect and precisely locate slates in candidate regions. These steps ensure a high recall rate and high precision. The objective is to detect slates with very few false alarms, to minimize interactive corrections. In a last step, electronic timecodes are read from the slates to automate audio-visual synchronization. The presented slate detector has a recall rate of 89% and a precision of 97.5%. By temporal integration, much more than 89% of shots in dailies are detected. By timecode coherence analysis, the precision can be raised further. Issues for future work are to accelerate the system beyond real-time and to extend the framework to several slate types.
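The second pipeline step (color-based candidate screening) can be illustrated as follows; the histogram-intersection measure, bin count, and threshold are assumptions for the sketch, not details from the paper:

```python
import numpy as np

def color_hist(region, bins=8):
    """Concatenated, normalized per-channel histogram of an RGB region."""
    h = [np.histogram(region[..., c], bins=bins, range=(0, 256))[0]
         for c in range(3)]
    h = np.concatenate(h).astype(float)
    return h / h.sum()

def hist_intersection(h1, h2):
    """1.0 for identical color distributions, 0.0 for disjoint ones."""
    return float(np.minimum(h1, h2).sum())

# Toy slate template: dark body with a bright timecode strip.
template = np.zeros((32, 32, 3), np.uint8)
template[:8] = 255
dark_region = np.full((32, 32, 3), 10, np.uint8)  # candidate region from a frame

sim = hist_intersection(color_hist(template), color_hist(dark_region))
is_candidate = sim > 0.5  # recall-oriented threshold: keep anything plausible
print(sim, is_candidate)
```

Note that a merely dark region already passes the screen; this mirrors the paper's design, where the early steps trade precision for recall and later classification steps do the precise rejection.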
Sprengers, Andre M J; Caan, Matthan W A; Moerman, Kevin M; Nederveen, Aart J; Lamerichs, Rolf M; Stoker, Jaap
2013-04-01
This study proposes a scale-space-based algorithm for automated segmentation of single-shot tagged images of modest SNR. The algorithm was also designed for the analysis of discontinuous or shearing types of motion, i.e., segmentation of broken tag patterns. The proposed algorithm utilises non-linear scale space for automatic segmentation of single-shot tagged images. Its ability to automatically segment tagged shearing motion was evaluated in a numerical simulation and in vivo. A typical shearing deformation was simulated in a Shepp-Logan phantom, allowing quantitative evaluation of the algorithm's success rate as a function of both SNR and the amount of deformation. For a qualitative in vivo evaluation, tagged images showing deformations in the calf muscles and eye movement of a healthy volunteer were acquired. Both the numerical simulation and the in vivo tagged data demonstrated the algorithm's ability to automatically segment single-shot tagged MR images, provided that the SNR is above 10 and the amount of deformation does not exceed the tag spacing. The latter constraint can be met by adjusting the tag delay or the tag spacing. The scale-space-based algorithm for automatic segmentation of single-shot tagged MR enables the application of tagged MR to complex (shearing) deformations and the processing of datasets with relatively low SNR.
NASA Astrophysics Data System (ADS)
Li, Wei; Chen, Ting; Zhang, Wenjun; Shi, Yunyu; Li, Jun
2012-04-01
In recent years, music video data has been increasing at an astonishing speed. Shot segmentation and keyframe extraction are fundamental steps in organizing, indexing, and retrieving video content. In this paper, a unified framework is proposed to detect shot boundaries and extract the keyframe of each shot. A music video is first segmented into shots using an illumination-invariant chromaticity histogram in independent component (IC) analysis feature space. We then present a new metric, image complexity, computed from the ICs, to extract the keyframe of a shot. Experimental results show that the framework is effective and performs well.
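The illumination-invariant chromaticity idea can be sketched as follows. Normalised-rgb chromaticity is a standard invariant used here for illustration, not necessarily the authors' exact IC-based feature:

```python
import numpy as np

def chromaticity_hist(img, bins=8):
    """2-D histogram over (r, g) chromaticities, r = R/(R+G+B), g = G/(R+G+B)."""
    rgb = img.astype(float) + 1e-9
    chrom = rgb / rgb.sum(axis=-1, keepdims=True)
    h, _, _ = np.histogram2d(chrom[..., 0].ravel(), chrom[..., 1].ravel(),
                             bins=bins, range=[[0, 1], [0, 1]])
    return h / h.sum()

# A two-region toy frame, and the same frame at half the illumination.
frame = np.zeros((32, 32, 3), np.uint8)
frame[:, :16] = (200, 40, 40)   # reddish left half
frame[:, 16:] = (40, 40, 200)   # bluish right half
dimmed = (frame * 0.5).astype(np.uint8)

diff = float(np.abs(chromaticity_hist(frame) - chromaticity_hist(dimmed)).sum())
print(diff)  # 0.0: the chromaticity histogram ignores the brightness change
```

Because the histogram is stable under lighting changes, frame-to-frame histogram distance can be thresholded to mark shot boundaries without false cuts at flashes or fades.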
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, N; Knutson, N; Schmidt, M
Purpose: To verify a method used to automatically acquire jaw, MLC, collimator and couch star shots for a Varian TrueBeam linear accelerator utilizing Developer Mode and an Electronic Portal Imaging Device (EPID). Methods: An XML script was written to automate motion of the jaws, MLC, collimator and couch in TrueBeam Developer Mode (TBDM) to acquire star-shot measurements. The XML script also dictates MV imaging parameters to facilitate automatic acquisition and recording of integrated EPID images. Since couch star-shot measurements cannot be acquired using a combination of EPID and jaw/MLC collimation alone due to the fixed imager geometry, a method utilizing a 5 mm wide steel ruler placed on the table and centered within a 15×15 cm² open field to produce a surrogate of the narrow field aperture was investigated. Four individual star-shot measurements (X jaw, Y jaw, MLC and couch) were obtained using our proposed as well as the traditional film-based method. Integrated EPID images and scanned measurement films were analyzed and compared. Results: Star-shot (X jaw, Y jaw, MLC and couch) measurements were obtained in a single 5-minute delivery using the TBDM XML script method, compared to 60 minutes for equivalent traditional film measurements. Analysis of the images and films demonstrated comparable isocentricity results, agreeing within 0.3 mm of each other. Conclusion: The presented automatic approach of acquiring star-shot measurements using TBDM and EPID has proven to be more efficient than the traditional film approach, with equivalent results.
NASA Astrophysics Data System (ADS)
Orenstein, E. C.; Morgado, P. M.; Peacock, E.; Sosik, H. M.; Jaffe, J. S.
2016-02-01
Technological advances in instrumentation and computing have allowed oceanographers to develop imaging systems capable of collecting extremely large data sets. With the advent of in situ plankton imaging systems, scientists must now commonly deal with "big data" sets containing tens of millions of samples spanning hundreds of classes, making manual classification untenable. Automated annotation methods are now considered to be the bottleneck between collection and interpretation. Typically, such classifiers learn to approximate a function that predicts a predefined set of classes for which a considerable amount of labeled training data is available. The requirement that the training data span all the classes of concern is problematic for plankton imaging systems since they sample such diverse, rapidly changing populations. These data sets may contain relatively rare, sparsely distributed, taxa that will not have associated training data; a classifier trained on a limited set of classes will miss these samples. The computer vision community, leveraging advances in Convolutional Neural Networks (CNNs), has recently attempted to tackle such problems using "zero-shot" object categorization methods. Under a zero-shot framework, a classifier is trained to map samples onto a set of attributes rather than a class label. These attributes can include visual and non-visual information such as what an organism is made out of, where it is distributed globally, or how it reproduces. A second stage classifier is then used to extrapolate a class. In this work, we demonstrate a zero-shot classifier, implemented with a CNN, to retrieve out-of-training-set labels from images. This method is applied to data from two continuously imaging, moored instruments: the Scripps Plankton Camera System (SPCS) and the Imaging FlowCytobot (IFCB). Results from simulated deployment scenarios indicate zero-shot classifiers could be successful at recovering samples of rare taxa in image sets. 
This capability will allow ecologists to identify trends in the distribution of difficult-to-sample organisms in their data.
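The two-stage zero-shot scheme described above can be sketched as follows; the attribute table and taxon signatures are invented for illustration, and a nearest-signature lookup stands in for the second-stage classifier:

```python
import numpy as np

# Binary attribute signatures per class:
# (has_shell, forms_chains, is_motile) -- invented for illustration.
signatures = {
    "diatom":    np.array([1.0, 1.0, 0.0]),
    "copepod":   np.array([1.0, 0.0, 1.0]),
    "larvacean": np.array([0.0, 0.0, 1.0]),  # assume no training images exist
}

def predict_class(attr_scores, sigs):
    """Second stage: the nearest attribute signature wins."""
    return min(sigs, key=lambda c: float(np.linalg.norm(attr_scores - sigs[c])))

# Pretend the first-stage CNN produced these attribute probabilities for an
# image of a taxon absent from the training set.
cnn_attributes = np.array([0.1, 0.2, 0.9])
label = predict_class(cnn_attributes, signatures)
print(label)  # larvacean
```

The point is that "larvacean" needs no training images: as long as its attribute signature is known, the attribute predictor trained on other taxa suffices to recover it.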
Automated divertor target design by adjoint shape sensitivity analysis and a one-shot method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dekeyser, W., E-mail: Wouter.Dekeyser@kuleuven.be; Reiter, D.; Baelmans, M.
As magnetic confinement fusion progresses towards the development of first reactor-scale devices, computational tokamak divertor design is a topic of high priority. Presently, edge plasma codes are used in a forward approach, where magnetic field and divertor geometry are manually adjusted to meet design requirements. Due to the complex edge plasma flows and the large number of design variables, this method is computationally very demanding. On the other hand, efficient optimization-based design strategies have been developed in computational aerodynamics and fluid mechanics. Such an optimization approach to divertor target shape design is elaborated in the present paper. A general formulation of the design problem is given, and conditions characterizing the optimal designs are formulated. Using a continuous adjoint framework, design sensitivities can be computed at a cost of only two edge plasma simulations, independent of the number of design variables. Furthermore, by using a one-shot method the entire optimization problem can be solved at an equivalent cost of only a few forward simulations. The methodology is applied to target shape design for uniform power load, in simplified edge plasma geometry.
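The key cost argument above, that one adjoint solve yields the gradient with respect to all design variables at once, can be demonstrated on a toy problem; the 1-D diffusion matrix below is a stand-in for an edge plasma code, not the authors' model:

```python
import numpy as np

n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1-D "diffusion" operator
u_target = np.linspace(0.0, 1.0, n)                    # desired state profile

def solve_state(q):
    # State equation: A u = q, with q the n-dimensional design vector.
    return np.linalg.solve(A, q)

def objective(q):
    r = solve_state(q) - u_target
    return 0.5 * float(r @ r)

def adjoint_gradient(q):
    # One adjoint solve gives dJ/dq for all n design variables at once:
    # A^T lam = (u - u_target)  =>  dJ/dq = lam   (since d(A u)/dq = I here).
    u = solve_state(q)
    return np.linalg.solve(A.T, u - u_target)

q0 = np.ones(n)
g = adjoint_gradient(q0)

# Verify one component against a central finite difference.
eps = 1e-6
e = np.zeros(n); e[7] = eps
fd = (objective(q0 + e) - objective(q0 - e)) / (2 * eps)
rel_err = abs(fd - g[7]) / abs(g[7])
print(rel_err < 1e-6)  # True
```

A finite-difference gradient would need n extra state solves; the adjoint needs one, which is what makes shape optimization with many design variables tractable.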
National Ignition Facility Control and Information System Operational Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, C D; Beeler, R G; Bowers, G A
The National Ignition Facility (NIF) in Livermore, California, is the world's highest-energy laser fusion system and one of the premier large-scale scientific projects in the United States. The system is designed to set up and fire a laser shot to a fusion ignition or high-energy-density target at rates up to a shot every 4 hours. NIF has 192 laser beams delivering up to 1.8 MJ of energy to a ~2 mm target that is planned to produce >100 billion atm of pressure and temperatures of >100 million degrees centigrade. NIF is housed in a ten-story building with a footprint the size of three football fields, as shown in Fig. 1. Commissioning was recently completed, and NIF will be formally dedicated at Lawrence Livermore National Laboratory on May 29, 2009. The control system has 60,000 hardware control points and employs 2 million lines of control system code. The control room has highly automated equipment setup prior to firing laser system shots. This automation has a data-driven implementation that is conducive to dynamic modification and optimization depending on the shot goals defined by the end-user experimenters. NIF has extensive facility machine history and infrastructure maintenance workflow tools, both under development and deployed. An extensive operational tools suite has been developed to support facility operations, including experimental shot setup, machine readiness, machine health and safety, and machine history. The following paragraphs discuss the current state and future upgrades of these four categories of operational tools.
Automating Nearshore Bathymetry Extraction from Wave Motion in Satellite Optical Imagery
2012-03-01
[Extraction fragments from the report's front matter: figure captions (e.g., "Figure 9. STK snap shot of WorldView-2 collection pass"; "...positions and overlap in the electromagnetic spectrum (From DigitalGlobe, 2011b)") and acronym-list entries (STK: Satellite Tool Kit; UTM: Universal Transverse Mercator; WKB: Wave Kinematics Bathymetry). The imagery was collected at about 2200Z.]
2015-07-08
Compressive surface residual stresses can be applied via multiple techniques (shot/gravity peening, low plasticity burnishing, laser shock peening...eigenstrains are used include internal stresses due to inclusions/particles or fibers [48], differences in coefficient of thermal expansion of different phases...residual stresses induced by shot peening [44,52-54], laser shock peening [55-61], and welding [62-66]. For shot peening analysis, the amount of residual
Automated videography for residential communications
NASA Astrophysics Data System (ADS)
Kurtz, Andrew F.; Neustaedter, Carman; Blose, Andrew C.
2010-02-01
The current widespread use of webcams for personal video communication over the Internet suggests that opportunities exist to develop video communications systems optimized for domestic use. We discuss both prior and existing technologies, and the results of user studies that indicate potential needs and expectations for people relative to personal video communications. In particular, users anticipate an easily used, high image quality video system, which enables multitasking communications during the course of real-world activities and provides appropriate privacy controls. To address these needs, we propose a potential approach premised on automated capture of user activity. We then describe a method that adapts cinematography principles, with a dual-camera videography system, to automatically control image capture relative to user activity, using semantic or activity-based cues to determine user position and motion. In particular, we discuss an approach to automatically manage shot framing, shot selection, and shot transitions, with respect to one or more local users engaged in real-time, unscripted events, while transmitting the resulting video to a remote viewer. The goal is to tightly frame subjects (to provide more detail), while minimizing subject loss and repeated abrupt shot framing changes in the images as perceived by a remote viewer. We also discuss some aspects of the system and related technologies that we have experimented with thus far. In summary, the method enables users to participate in interactive video-mediated communications while engaged in other activities.
Control and automation of the Pegasus multi-point Thomson scattering system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bodner, G. M., E-mail: gbodner@wisc.edu; Bongard, M. W.; Fonck, R. J.
A new control system for the Pegasus Thomson scattering diagnostic has recently been deployed to automate the laser operation, data collection process, and interface with the system-wide Pegasus control code. Automation has been extended to areas outside of data collection, such as manipulation of beamline cameras and remotely controlled turning mirror actuators to enable intra-shot beam alignment. Additionally, the system has been upgraded with a set of fast (∼1 ms) mechanical shutters to mitigate contamination from background light. Modification and automation of the Thomson system have improved both data quality and diagnostic reliability.
Control and automation of the Pegasus multi-point Thomson scattering system
Bodner, Grant M.; Bongard, Michael W.; Fonck, Raymond J.; ...
2016-08-12
A new control system for the Pegasus Thomson scattering diagnostic has recently been deployed to automate the laser operation, data collection process, and interface with the system-wide Pegasus control code. Automation has been extended to areas outside of data collection, such as manipulation of beamline cameras and remotely controlled turning mirror actuators to enable intra-shot beam alignment. In addition, the system has been upgraded with a set of fast (~1 ms) mechanical shutters to mitigate contamination from background light. Modification and automation of the Thomson system have improved both data quality and diagnostic reliability.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-08
... a permit to use nonpermissible explosives and/or shot-firing units in the blasting of rock while... appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of...
Using deep neural networks to augment NIF post-shot analysis
NASA Astrophysics Data System (ADS)
Humbird, Kelli; Peterson, Luc; McClarren, Ryan; Field, John; Gaffney, Jim; Kruse, Michael; Nora, Ryan; Spears, Brian
2017-10-01
Post-shot analysis of National Ignition Facility (NIF) experiments is the process of determining which simulation inputs yield results consistent with experimental observations. This analysis is typically accomplished by running suites of manually adjusted simulations, or Monte Carlo sampling surrogate models that approximate the response surfaces of the physics code. These approaches are expensive and often find simulations that match only a small subset of observables simultaneously. We demonstrate an alternative method for performing post-shot analysis using inverse models, which map directly from experimental observables to simulation inputs with quantified uncertainties. The models are created using a novel machine learning algorithm which automates the construction and initialization of deep neural networks to optimize predictive accuracy. We show how these neural networks, trained on large databases of post-shot simulations, can rigorously quantify the agreement between simulation and experiment. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
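The inverse-model idea can be sketched with a small network trained on synthetic (input, observable) pairs from a toy forward model. The function, network size, and tolerances are illustrative assumptions, and the sketch omits the uncertainty quantification the authors describe:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Toy "simulator": observable = f(input), monotonic on the sampled range.
def forward(x):
    return np.sin(3.0 * x) + 0.5 * x

x_train = rng.uniform(0.0, 0.5, size=(2000, 1))   # simulation inputs
y_train = forward(x_train)                         # simulated observables

# Inverse model: observable -> input, trained on the simulation database.
inverse = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                       random_state=0).fit(y_train, x_train.ravel())

measured = forward(np.array([[0.4]]))              # pretend experimental datum
recovered = float(inverse.predict(measured)[0])
print(round(recovered, 2))                         # close to 0.4
```

Like the post-shot workflow above, the expensive step (building the simulation database) is done once; each new experiment then maps to consistent simulation inputs with a single cheap forward pass of the inverse network.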
Semantic Shot Classification in Sports Video
NASA Astrophysics Data System (ADS)
Duan, Ling-Yu; Xu, Min; Tian, Qi
2003-01-01
In this paper, we present a unified framework for semantic shot classification in sports videos. Unlike previous approaches, which focus on clustering by aggregating shots with similar low-level features, the proposed scheme makes use of domain knowledge of a specific sport to perform top-down video shot classification, including identification of video shot classes for each sport, and supervised learning and classification of the given sports video with low-level and middle-level features extracted from the video. It is observed that for each sport we can predefine a small number of semantic shot classes, about 5-10, which cover 90-95% of sports broadcast video. With supervised learning, we map the low-level features to middle-level semantic video shot attributes such as dominant object motion (a player), camera motion patterns, and court shape. On the basis of an appropriate fusion of those middle-level attributes, we classify video shots into the predefined shot classes, each of which has a clear semantic meaning. The proposed method has been tested on 4 types of sports video: tennis, basketball, volleyball and soccer. Good classification accuracy of 85-95% has been achieved. With correctly classified sports video shots, further structural and temporal analysis, such as event detection, video skimming, and table-of-contents generation, will be greatly facilitated.
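The mapping from middle-level attributes to semantic shot classes can be sketched as follows; the attribute encoding, class names, and the use of a decision tree in place of the authors' fusion scheme are all illustrative assumptions:

```python
from sklearn.tree import DecisionTreeClassifier

# Middle-level attributes per shot (all invented):
# [camera_motion: 0=static, 1=pan, 2=zoom; court_visible: 0/1; close_up: 0/1]
X = [[0, 1, 0], [1, 1, 0], [2, 0, 1], [0, 0, 1], [1, 1, 0], [2, 0, 1]]
y = ["court-wide", "follow-play", "close-up", "close-up",
     "follow-play", "close-up"]

clf = DecisionTreeClassifier(random_state=0).fit(X, y)
shot_class = str(clf.predict([[0, 1, 0]])[0])
print(shot_class)  # court-wide
```

Because the attribute space is small and semantically meaningful, even a shallow supervised model can separate the handful of predefined shot classes per sport.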
A fully automated digitally controlled 30-inch telescope
NASA Technical Reports Server (NTRS)
Colgate, S. A.; Moore, E. P.; Carlson, R.
1975-01-01
A fully automated 30-inch (75-cm) telescope has been successfully designed and constructed from a military surplus Nike-Ajax radar mount. Novel features include: closed-loop operation between mountain telescope and campus computer 30 km apart via microwave link, a TV-type sensor which is photon shot-noise limited, a special lightweight primary mirror, and a stepping motor drive capable of slewing and settling one degree in one second or a radian in fifteen seconds.
Control and Information Systems for the National Ignition Facility
Brunton, Gordon; Casey, Allan; Christensen, Marvin; ...
2017-03-23
Orchestration of every National Ignition Facility (NIF) shot cycle is managed by the Integrated Computer Control System (ICCS), which uses a scalable software architecture running code on more than 1950 front-end processors, embedded controllers, and supervisory servers. The ICCS operates laser and industrial control hardware containing 66,000 control and monitor points to ensure that all of NIF’s laser beams arrive at the target within 30 ps of each other and are aligned to a pointing accuracy of less than 50 μm root-mean-square, while ensuring that a host of diagnostic instruments record data in a few billionths of a second. NIF’s automated control subsystems are built from a common object-oriented software framework that distributes the software across the computer network and achieves interoperation between different software languages and target architectures. A large suite of business and scientific software tools supports experimental planning, experimental setup, facility configuration, and post-shot analysis. Standard business services using open-source software, commercial workflow tools, and database and messaging technologies have been developed. An information technology infrastructure consisting of servers, network devices, and storage provides the foundation for these systems. This work is an overview of the control and information systems used to support a wide variety of experiments during the National Ignition Campaign.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartley, R.; Kartz, M.; Behrendt, W.
1996-10-01
The laser wavefront of the NIF Beamlet demonstration system is corrected for static aberrations with a wavefront control system. The system operates closed loop with a probe beam prior to a shot and has a loop bandwidth of about 3 Hz. However, until recently the wavefront control system was disabled several minutes prior to the shot to allow time to manually reconfigure its attenuators and probe-beam insertion mechanism to shot mode. Thermally induced dynamic variations in gas density in the Beamlet main beam line produce significant wavefront error. After about 5-8 seconds, the wavefront error has increased to a new, higher level because turbulence-induced aberrations are no longer being corrected. This implies that the turbulence-induced aberration noise bandwidth is less than one hertz, and that the wavefront controller could correct for the majority of the turbulence-induced aberration (about one-third wave) by automating its reconfiguration to occur within one second of the shot. This modification was recently implemented on Beamlet; we call it the t₀-1 system.
A procedure to determine the radiation isocenter size in a linear accelerator.
González, A; Castro, I; Martínez, J A
2004-06-01
Measurement of the radiation isocenter is a fundamental part of commissioning and quality assurance (QA) for a linear accelerator (linac). In this work we present an automated procedure for the analysis of the star shots employed in radiation isocenter determination. Once the star-shot film has been developed and digitized, the resulting image is analyzed by scanning concentric circles centered on the intersection of the lasers previously marked on the film. The center and the radius of the minimum circle intersecting the central rays are determined with an accuracy and precision better than 1% of the pixel size. The procedure is applied to determining the position and size of the radiation isocenter by analyzing star shots placed in different planes with respect to the gantry, couch and collimator rotation axes.
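The geometric core of the analysis, finding the minimum circle intersecting the central rays, can be posed as a minimax problem: choose the centre that minimises the largest point-to-line distance. The grid search and toy ray geometry below are illustrative, not the authors' film-scanning procedure:

```python
import numpy as np

def point_line_dist(p, a, d):
    """Distance from point p to the line through a with unit direction d."""
    v = p - a
    return abs(v[0] * d[1] - v[1] * d[0])

# Three toy "beam" central rays that nearly, but not exactly, intersect.
lines = [(np.array([0.0, 0.1]), np.array([1.0, 0.0])),               # y = 0.1
         (np.array([0.1, 0.0]), np.array([0.0, 1.0])),               # x = 0.1
         (np.array([0.0, -0.1]), np.array([1.0, 1.0]) / np.sqrt(2))]  # y = x - 0.1

def min_intersecting_circle(lines, span=1.0, n=201):
    """Grid search for the centre minimising the largest ray distance."""
    xs = np.linspace(-span, span, n)
    best_r, best_p = np.inf, None
    for px in xs:
        for py in xs:
            p = np.array([px, py])
            r = max(point_line_dist(p, a, d) for a, d in lines)
            if r < best_r:
                best_r, best_p = r, p
    return best_r, best_p

radius, centre = min_intersecting_circle(lines)
print(round(radius, 3))  # radius of the minimum circle touching all three rays
```

The returned radius is the isocenter size and the minimising point its position; a production implementation would refine the coarse grid (or solve the minimax problem directly) to reach sub-pixel accuracy.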
NASA Astrophysics Data System (ADS)
Lopez, Alejandro; Noe, Miquel; Fernandez, Gabriel
2004-10-01
The GMF4iTV project (Generic Media Framework for Interactive Television) is an IST European project consisting of an end-to-end broadcasting platform that provides interactivity on heterogeneous multimedia devices such as set-top boxes and PCs according to the Multimedia Home Platform (MHP) standard from DVB. The platform allows content providers to create enhanced audiovisual content with interactivity at the level of moving objects or shot changes in a video. The end user can then interact with moving objects or individual shots in the video to access additional content associated with them (MHP applications, HTML pages, JPEG, MPEG4 files, etc.). This paper focuses on the issues related to metadata and content transmission, synchronization, signaling and bitrate allocation in the GMF4iTV project.
The Conversational Framework and the ISE "Basketball Shot" Video Analysis Activity
ERIC Educational Resources Information Center
English, Vincent; Crotty, Yvonne; Farren, Margaret
2015-01-01
Inspiring Science Education (ISE) (http://www.inspiringscience.eu/) is an EU funded initiative that seeks to further the use of inquiry-based science learning (IBSL) through the medium of ICT in the classroom. The Basketball Shot is a scenario (lesson plan) that involves the use of video capture to help the student investigate the concepts of…
ERIC Educational Resources Information Center
Scott, Rachel Elizabeth
2016-01-01
Librarians are frequently asked to teach several databases in a one-shot session, despite findings suggesting that such database demonstrations do not lead to optimal student outcomes. The "ACRL Framework for Information Literacy for Higher Education" highlights the concepts of metaliteracy and metacognition. This paper investigates ways…
The killing efficiency of soft iron shot
Andrews, R.; Longcore, J.R.
1969-01-01
A cooperative research effort between the ammunition industry and the Bureau of Sport Fisheries and Wildlife is aimed at finding a suitable non-toxic substitute for lead shot. A contract study by an independent research organization evaluated ways of coating or detoxifying lead shot or replacing it with another metal. As a result of that study, the only promising candidate is soft iron. Previous tests of hard iron shot had suggested that its killing effectiveness was poor at longer ranges due to the lower density. In addition, its hardness caused excessive damage to shotgun barrels. A unique, automated shooting facility was constructed at the Patuxent Wildlife Research Center to test the killing effectiveness of soft iron shot under controlled conditions. Tethered game-farm mallards were transported across a shooting point in a manner simulating free flight. A microswitch triggered a mounted shotgun so that each shot was 'perfect.' A soft iron shot, in Number 4 size, was produced by the ammunition industry and loaded in 12-gauge shells to give optimum ballistic performance. Commercial loads of lead shot in both Number 4 and Number 6 size were used for comparison. A total of 2,010 ducks were shot at ranges of 30 to 65 yards and at broadside and head-on angles in a statistically designed procedure. The following data were recorded for each duck: time until death, broken wing or leg bones, and number of embedded shot. Those ducks not killed outright were held for 10 days. From these data, ducks were categorized as 'probably bagged,' 'probably lost cripples,' or survivors. The test revealed that the killing effectiveness of this soft iron shot was superior to its anticipated performance and close to that obtained with commercial lead loads containing an equal number of pellets. Bagging a duck, in terms of rapid death or broken wing, was primarily dependent on the probability of a shot striking that vital area, and therefore a function of range. 
There was no indication that iron shot would result in greater crippling loss. Despite the apparent effectiveness of this iron shot, transition to its use in waterfowl hunting is not now possible. The sample used for this test was produced by a laboratory procedure that is unsuitable for manufacture. There is no process for producing soft iron shot in the quantities needed. Industry is doing its best to resolve this problem.
A novel sub-shot segmentation method for user-generated video
NASA Astrophysics Data System (ADS)
Lei, Zhuo; Zhang, Qian; Zheng, Chi; Qiu, Guoping
2018-04-01
With the proliferation of user-generated videos, temporal segmentation is becoming a challenging problem. Traditional video temporal segmentation methods such as shot detection do not work on unedited user-generated videos, since such videos often consist of a single long shot. We propose a novel temporal segmentation framework for user-generated video. It finds similar frames with a tree-partitioning min-Hash technique, constructs sparse temporally constrained affinity sub-graphs, and finally divides the video into sub-shot-level segments with a dense-neighbor-based clustering method. Experimental results show that our approach outperforms related methods. Furthermore, the results indicate that the proposed approach can segment user-generated videos at an average human level.
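The min-Hash step above finds visually similar frames cheaply. A plain MinHash estimate of Jaccard similarity between two frames' sets of (hypothetical) visual-word ids can be sketched as follows; the tree-partitioning refinement from the paper is omitted:

```python
import random

def minhash_signature(word_ids, n_hashes=64, seed=7):
    """One MinHash value per salted hash; integer ids keep hash() deterministic."""
    rnd = random.Random(seed)
    salts = [rnd.getrandbits(32) for _ in range(n_hashes)]
    return [min(hash((s, w)) for w in word_ids) for s in salts]

def estimated_jaccard(sig_a, sig_b):
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

frame_a = {101, 102, 103, 104}   # visual-word ids of frame A (hypothetical)
frame_b = {101, 102, 103, 205}   # shares 3 of the 5 distinct words with A
est = estimated_jaccard(minhash_signature(frame_a), minhash_signature(frame_b))
print(est)  # an estimate of the true Jaccard similarity 3/5 = 0.6
```

Two frames whose signatures agree on many rows are probably near-duplicates; comparing fixed-length signatures instead of full feature sets is what keeps the similarity search tractable over a whole video.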
Designing effective human-automation-plant interfaces: a control-theoretic perspective.
Jamieson, Greg A; Vicente, Kim J
2005-01-01
In this article, we propose the application of a control-theoretic framework to human-automation interaction. The framework consists of a set of conceptual distinctions that should be respected in automation research and design. We demonstrate how existing automation interface designs in some nuclear plants fail to recognize these distinctions. We further show the value of the approach by applying it to modes of automation. The design guidelines that have been proposed in the automation literature are evaluated from the perspective of the framework. This comparison shows that the framework reveals insights that are frequently overlooked in this literature. A new set of design guidelines is introduced that builds upon the contributions of previous research and draws complementary insights from the control-theoretic framework. The result is a coherent and systematic approach to the design of human-automation-plant interfaces that will yield more concrete design criteria and a broader set of design tools. Applications of this research include improving the effectiveness of human-automation interaction design and the relevance of human-automation interaction research.
NASA Astrophysics Data System (ADS)
Hu, Dianyin; Gao, Ye; Meng, Fanchao; Song, Jun; Wang, Rongqiao
2018-04-01
Combining experiments and finite element analysis (FEA), a systematic study was performed to analyze the microstructural evolution and stress states of shot-peened GH4169 superalloy over a variety of peening intensities and coverages. A dislocation density evolution model was integrated into the representative volume FEA model to quantitatively predict microstructural evolution in the surface layers, and the predictions were compared with experimental results. It was found that surface roughness and the through-depth residual stress profile are more sensitive to shot-peening intensity than to coverage, owing to the high kinetic energy involved. Moreover, a surface nanocrystallization layer was observed in the top surface region of GH4169 under all shot-peening conditions. However, grain refinement was more pronounced under high shot-peening coverage, which allows sufficient time for refinement to occur. The grain size gradient predicted by the numerical framework showed good agreement with experimental observations.
Audio-based performance evaluation of squash players
Hajdú-Szücs, Katalin; Fenyvesi, Nóra; Vattay, Gábor
2018-01-01
In competitive sports it is often very hard to quantify performance. Whether a player scores or overtakes an opponent may come down to milliseconds or millimeters. In racquet sports like tennis, table tennis, and squash, many events occur within a short time span, and recording and analyzing them can help reveal differences in performance. In this paper we show that it is possible to build a framework that exploits characteristic sound patterns to classify the types and localize the positions of these events. From this basic information, shot types and ball speed along the trajectories can be estimated. Comparing these estimates with the optimal speed and target, the precision of a shot can be quantified. Detailed shot statistics and precision information significantly enrich and improve the data available today. Feeding them back to players and coaches makes it possible to describe playing performance objectively and to improve strategic skills. The framework has been implemented, and its hardware and software components have been installed and tested in a squash court. PMID:29579067
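As a rough illustration of classifying events from sound patterns, the sketch below computes simple spectral features and applies a nearest-centroid classifier to synthetic clips. It is not the paper's framework: the feature set, class names ("hit", "bounce"), and signal parameters are invented for the example.

```python
import numpy as np

def spectral_features(clip, sr=8000):
    """Crude descriptor: spectral centroid (Hz) and RMS energy of a clip."""
    spec = np.abs(np.fft.rfft(clip))
    freqs = np.fft.rfftfreq(len(clip), d=1.0 / sr)
    centroid = np.sum(freqs * spec) / (np.sum(spec) + 1e-12)
    rms = np.sqrt(np.mean(clip ** 2))
    return np.array([centroid, rms])

class NearestCentroid:
    """Assign a sound event to the class with the closest feature centroid."""
    def fit(self, X, y):
        self.classes_ = sorted(set(y))
        self.centroids_ = {c: np.mean([x for x, lbl in zip(X, y) if lbl == c], axis=0)
                           for c in self.classes_}
        return self
    def predict(self, X):
        return [min(self.classes_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]

# Synthetic training clips: "hit" = high-frequency burst, "bounce" = low thud.
rng = np.random.default_rng(1)
t = np.arange(512) / 8000
hits = [np.sin(2 * np.pi * 3000 * t) + 0.1 * rng.standard_normal(512) for _ in range(5)]
bounces = [np.sin(2 * np.pi * 300 * t) + 0.1 * rng.standard_normal(512) for _ in range(5)]
X = [spectral_features(c) for c in hits + bounces]
y = ["hit"] * 5 + ["bounce"] * 5
clf = NearestCentroid().fit(X, y)
print(clf.predict([spectral_features(np.sin(2 * np.pi * 2800 * t))]))  # → ['hit']
```

A real system would add event localization (e.g. from microphone arrival-time differences) on top of such a classifier; this sketch only shows the classification half.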
A Framework for Evaluation and Use of Automated Scoring
ERIC Educational Resources Information Center
Williamson, David M.; Xi, Xiaoming; Breyer, F. Jay
2012-01-01
A framework for evaluation and use of automated scoring of constructed-response tasks is provided that entails both evaluation of automated scoring as well as guidelines for implementation and maintenance in the context of constantly evolving technologies. Consideration of validity issues and challenges associated with automated scoring are…
NASA Technical Reports Server (NTRS)
Axdahl, Erik L.
2015-01-01
Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high-fidelity numerical analysis tools into an automated framework and to apply that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework, with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high-performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.
Apparent and Actual Use of Observational Frameworks by Experienced Teachers.
ERIC Educational Resources Information Center
Satern, Miriam N.
This study investigated observational strategies that were used by six experienced physical education teachers when viewing a videotape of motor skills (standing vertical jump, overarm throw, tennis serve, basketball jump shot and dance sequence). Four observational frameworks were proposed as being representative of subdisciplinary knowledge…
Layton, Kelvin J; Gallichan, Daniel; Testud, Frederik; Cocosco, Chris A; Welz, Anna M; Barmet, Christoph; Pruessmann, Klaas P; Hennig, Jürgen; Zaitsev, Maxim
2013-09-01
It has recently been demonstrated that nonlinear encoding fields result in a spatially varying resolution. This work develops an automated procedure to design single-shot trajectories that create a local resolution improvement in a region of interest. The technique is based on the design of optimized local k-space trajectories and can be applied to arbitrary hardware configurations that employ any number of linear and nonlinear encoding fields. The trajectories designed in this work are tested with the currently available hardware setup consisting of three standard linear gradients and two quadrupolar encoding fields generated from a custom-built gradient insert. A field camera is used to measure the actual encoding trajectories up to third-order terms, enabling accurate reconstructions of these demanding single-shot trajectories, although the eddy current and concomitant field terms of the gradient insert have not been completely characterized. The local resolution improvement is demonstrated in phantom and in vivo experiments. Copyright © 2012 Wiley Periodicals, Inc.
Towards cooperative guidance and control of highly automated vehicles: H-Mode and Conduct-by-Wire.
Flemisch, Frank Ole; Bengler, Klaus; Bubb, Heiner; Winner, Hermann; Bruder, Ralph
2014-01-01
This article provides a general ergonomic framework of cooperative guidance and control for vehicles, with an emphasis on the cooperation between a human and a highly automated vehicle. In the twenty-first century, mobility and automation technologies are increasingly fused. In the sky, highly automated aircraft are flying with a high safety record. On the ground, a variety of driver assistance systems are being developed, and highly automated vehicles with increasingly autonomous capabilities are becoming possible. Human-centred automation has paved the way for better cooperation between automation and humans. How can these highly automated systems be structured so that they can be easily understood, and how will they cooperate with the human? The presented research was conducted using the methods of iterative build-up and refinement of the framework by triangulation, i.e. by instantiating and testing the framework with at least two derived concepts and prototypes. This article sketches a general, conceptual ergonomic framework of cooperative guidance and control of highly automated vehicles, two concepts derived from the framework, prototypes and pilot data. Cooperation is exemplified in a list of aspects and related to levels of the driving task. With the concept 'Conduct-by-Wire', cooperation happens mainly on the guidance level, where the driver can delegate manoeuvres to the automation with a specialised manoeuvre interface. With H-Mode, a haptic-multimodal interaction with highly automated vehicles based on the H(orse)-Metaphor, cooperation occurs mainly at the guidance and control levels through a haptically active interface. Cooperativeness should be a key aspect for future human-automation systems. Especially for highly automated vehicles, cooperative guidance and control is a research direction with already promising concepts and prototypes that should be further explored.
The presented approach applies to any human-machine system that moves and includes high levels of assistance/automation.
Physics of automated driving in framework of three-phase traffic theory.
Kerner, Boris S
2018-04-01
We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
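The contrast between the two control philosophies can be caricatured in a few lines. The sketch below is not Kerner's model: the gains, headway bounds, and update rules are illustrative stand-ins for "fixed desired time headway" versus "no preferred gap within a range of time headways".

```python
def acc_classical(gap, v, v_lead, tau_d=1.5, k1=0.3, k2=0.5):
    """Classical ACC: always regulate toward a fixed desired time
    headway tau_d (illustrative gains)."""
    return k1 * (gap - v * tau_d) + k2 * (v_lead - v)

def acc_three_phase(gap, v, v_lead, tau_safe=1.0, tau_max=3.0, k1=0.3, k_dv=0.5):
    """Three-phase-style controller: inside the 'indifference zone'
    [tau_safe, tau_max] there is no preferred gap, and the vehicle only
    adapts to the speed difference (illustrative sketch)."""
    headway = gap / max(v, 0.1)
    if headway < tau_safe:                      # too close: open the gap
        return k1 * (gap - v * tau_safe) + k_dv * (v_lead - v)
    if headway > tau_max:                       # too far: close in
        return k1 * (gap - v * tau_max) + k_dv * (v_lead - v)
    return k_dv * (v_lead - v)                  # indifference zone

# Same situation: 60 m gap at 20 m/s behind a leader also at 20 m/s.
# The classical controller accelerates to reach its 30 m target gap;
# the three-phase controller is content anywhere inside its zone.
print(acc_classical(60.0, 20.0, 20.0))     # → 9.0 (chases the fixed headway)
print(acc_three_phase(60.0, 20.0, 20.0))   # → 0.0 (no fixed headway to chase)
```

The zero response inside the indifference zone is the mechanism behind the claimed smaller disturbances: the controller does not inject accelerations merely to restore a particular gap.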
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moeglein, W. A.; Griswold, R.; Mehdi, B. L.
In-situ (scanning) transmission electron microscopy (S/TEM) is being developed for numerous applications in the study of nucleation and growth under electrochemical driving forces. For this type of experiment, one of the key parameters is to identify when nucleation initiates. Typically the process of identifying the moment that crystals begin to form is a manual process requiring the user to perform an observation and respond accordingly (adjust focus, magnification, translate the stage etc.). However, as the speed of the cameras being used to perform these observations increases, the ability of a user to "catch" the important initial stage of nucleation decreases (there is more information available in the first few milliseconds of the process). Here we show that video shot boundary detection (SBD) can automatically detect frames where a change in the image occurs. We show that this method can be applied to quickly and accurately identify points of change during crystal growth. This technique allows for automated segmentation of a digital stream for further analysis and the assignment of arbitrary time stamps for the initiation of processes that are independent of the user's ability to observe and react.
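A minimal version of such a shot-boundary detector can be written as a histogram-difference outlier test on consecutive frames. The function below is a generic sketch, not the authors' implementation; the bin count and z-score threshold are arbitrary choices.

```python
import numpy as np

def shot_boundaries(frames, bins=32, z_thresh=3.0):
    """Flag frame indices whose intensity-histogram change is an outlier
    relative to the stream's typical frame-to-frame variation."""
    hists = [np.histogram(f, bins=bins, range=(0.0, 1.0))[0] / f.size
             for f in frames]
    diffs = np.array([np.abs(hists[i + 1] - hists[i]).sum()
                      for i in range(len(hists) - 1)])
    mu, sigma = diffs.mean(), diffs.std() + 1e-12
    return [i + 1 for i, d in enumerate(diffs) if (d - mu) / sigma > z_thresh]

# Synthetic micrograph stream: uniform noise, then brighter "crystal" frames.
rng = np.random.default_rng(0)
blank = [rng.uniform(0.0, 0.5, (64, 64)) for _ in range(20)]
grown = [np.clip(rng.uniform(0.0, 0.5, (64, 64)) + 0.4, 0, 1) for _ in range(20)]
print(shot_boundaries(blank + grown))  # → [20]
```

Because the threshold adapts to the observed noise level, the same detector can run unattended on a fast camera stream and time-stamp the first frame where growth becomes visible.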
Wesnes, Keith A; Barrett, Marilyn L; Udani, Jay K
2013-08-01
Energy drinks are widely available and mostly contain glucose, and several have been demonstrated to improve alertness and cognitive function, with these effects generally being identified 30-60 min after administration. The present study assessed whether an energy shot without carbohydrates would affect major aspects of cognitive function and also mood in volunteers over a 6 h period. This randomized, double-blind, placebo-controlled, crossover study compared the acute effects of the energy shot with a matching placebo in 94 healthy volunteers. Cognitive function was assessed with a widely used set of automated tests of attention and memory. Mood was assessed with the Bond-Lader, Beck Anxiety Index, Beck Depression Index, Chalder Fatigue Scales (CFS), and the POMS. The volunteers were requested to limit their sleep to between 3 and 6 h the night before each testing day. Compared to the placebo, the energy shot significantly improved 6 validated composite cognitive function measures from the CDR System as well as self-rated alertness, with the benefits on 4 of the cognitive measures still present at 6 h. The overall effect sizes of the performance improvements were in the small to medium range and thus notable in this field. In conclusion, an energy shot can significantly improve important aspects of cognitive function for up to 6 h compared to placebo in partially sleep-deprived healthy volunteers. Copyright © 2013 Elsevier Ltd. All rights reserved.
Policy Brief: What is the Legal Framework for Automated Vehicles in Texas?
DOT National Transportation Integrated Search
2017-11-01
During the 85th Texas Legislature in 2017, Texas enacted a law related to automated vehicles. The bill, SB 2205, creates the legal framework for automated vehicle operation and testing in Texas. Although this law addresses a number of issues that can...
Divertor target shape optimization in realistic edge plasma geometry
NASA Astrophysics Data System (ADS)
Dekeyser, W.; Reiter, D.; Baelmans, M.
2014-07-01
Tokamak divertor design for next-step fusion reactors heavily relies on numerical simulations of the plasma edge. Currently, the design process is mainly done in a forward approach, where the designer is strongly guided by his experience and physical intuition in proposing divertor shapes, which are then thoroughly assessed by numerical computations. On the other hand, automated design methods based on optimization have proven very successful in the related field of aerodynamic design. By recasting design objectives and constraints into the framework of a mathematical optimization problem, efficient forward-adjoint based algorithms can be used to automatically compute the divertor shape which performs best with respect to the selected edge plasma model and design criteria. In the past years, we have extended these methods to automated divertor target shape design, using somewhat simplified edge plasma models and geometries. In this paper, we build on and extend previous work to apply these shape optimization methods for the first time in a more realistic, single-null edge plasma and divertor geometry, as commonly used in current divertor design studies. In a case study with JET-like parameters, we show that the so-called one-shot method is very effective in solving divertor target design problems. Furthermore, by detailed shape sensitivity analysis we demonstrate that the method, already at its present state, provides physically plausible trends, allowing us to achieve a divertor design with an almost perfectly uniform power load for our particular choice of edge plasma model and design criteria.
Matching forensic sketches to mug shot photos.
Klare, Brendan F; Li, Zhifeng; Jain, Anil K
2011-03-01
The problem of matching a forensic sketch to a gallery of mug shot images is addressed in this paper. Previous research in sketch matching only offered solutions to matching highly accurate sketches that were drawn while looking at the subject (viewed sketches). Forensic sketches differ from viewed sketches in that they are drawn by a police sketch artist using the description of the subject provided by an eyewitness. To identify forensic sketches, we present a framework called local feature-based discriminant analysis (LFDA). In LFDA, we individually represent both sketches and photos using SIFT feature descriptors and multiscale local binary patterns (MLBP). Multiple discriminant projections are then used on partitioned vectors of the feature-based representation for minimum distance matching. We apply this method to match a data set of 159 forensic sketches against a mug shot gallery containing 10,159 images. Compared to a leading commercial face recognition system, LFDA offers substantial improvements in matching forensic sketches to the corresponding face images. We were able to further improve the matching performance using race and gender information to reduce the target gallery size. Additional experiments demonstrate that the proposed framework leads to state-of-the-art accuracies when matching viewed sketches.
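The project-then-rank structure of minimum-distance matching can be sketched generically. The code below is not LFDA: it substitutes an unsupervised PCA-whitening projection for the supervised discriminant projections and random vectors for SIFT/MLBP descriptors, purely to show the shape of the pipeline.

```python
import numpy as np

def fit_whitening_projection(X, dim):
    """Toy stand-in for a learned projection: PCA whitening to `dim`
    components (the real LFDA projections are supervised)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:dim] / (S[:dim, None] / np.sqrt(len(X)) + 1e-12), X.mean(axis=0)

def match_sketch(probe, gallery, proj, mean):
    """Return gallery indices ranked by distance in the projected space."""
    p = proj @ (probe - mean)
    dists = [np.linalg.norm(p - proj @ (g - mean)) for g in gallery]
    return np.argsort(dists)

rng = np.random.default_rng(0)
gallery = rng.standard_normal((10, 64))              # stand-in mug-shot descriptors
probe = gallery[3] + 0.05 * rng.standard_normal(64)  # noisy "sketch" of identity 3
proj, mean = fit_whitening_projection(gallery, dim=5)
print(match_sketch(probe, gallery, proj, mean)[0])   # index of the best match
```

In the real system the probe and gallery live in different modalities (sketch vs. photo), which is exactly why a learned discriminant projection, rather than plain whitening, is needed before distances become meaningful.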
Automated and Accurate Estimation of Gene Family Abundance from Shotgun Metagenomes
Nayfach, Stephen; Bradley, Patrick H.; Wyman, Stacia K.; Laurent, Timothy J.; Williams, Alex; Eisen, Jonathan A.; Pollard, Katherine S.; Sharpton, Thomas J.
2015-01-01
Shotgun metagenomic DNA sequencing is a widely applicable tool for characterizing the functions that are encoded by microbial communities. Several bioinformatic tools can be used to functionally annotate metagenomes, allowing researchers to draw inferences about the functional potential of the community and to identify putative functional biomarkers. However, little is known about how decisions made during annotation affect the reliability of the results. Here, we use statistical simulations to rigorously assess how to optimize annotation accuracy and speed, given parameters of the input data like read length and library size. We identify best practices in metagenome annotation and use them to guide the development of the Shotgun Metagenome Annotation Pipeline (ShotMAP). ShotMAP is an analytically flexible, end-to-end annotation pipeline that can be implemented either on a local computer or a cloud compute cluster. We use ShotMAP to assess how different annotation databases impact the interpretation of how marine metagenome and metatranscriptome functional capacity changes across seasons. We also apply ShotMAP to data obtained from a clinical microbiome investigation of inflammatory bowel disease. This analysis finds that gut microbiota collected from Crohn’s disease patients are functionally distinct from gut microbiota collected from either ulcerative colitis patients or healthy controls, with differential abundance of metabolic pathways related to host-microbiome interactions that may serve as putative biomarkers of disease. PMID:26565399
Erwin, Paul Campbell; Sheeler, Lorinda; Lott, John M
2009-01-01
An outbreak of foodborne hepatitis A infection compelled two regional health departments in eastern Tennessee to implement an emergency mass clinic for providing hepatitis immune serum globulin (ISG) to several thousand potentially exposed people. For the mass clinic framework, we utilized the smallpox post-event clinic plans of the Centers for Disease Control and Prevention (CDC), although the plans had only been exercised for smallpox. Following CDC's guidelines for staffing and organizing the mass clinic, we provided 5,038 doses of ISG during a total of 24 hours of clinic operation, using 3,467 person-hours, or 1.45 ISG doses per person-hour, very close to the 1.58 doses per person-hour targeted in CDC's smallpox post-event clinic plans. The mass clinic showed that CDC's smallpox post-event clinic guidelines were feasible, practical, and adaptable to other mass clinic situations.
van den Beukel, Arie P; van der Voort, Mascha C
2017-03-01
The introduction of partially automated driving systems changes the driving task into supervising the automation with an occasional need to intervene. To develop interface solutions that adequately support drivers in this new role, this study proposes and evaluates an assessment framework that allows designers to evaluate driver-support within relevant real-world scenarios. Aspects identified as requiring assessment in terms of driver-support within the proposed framework are Accident Avoidance, gained Situation Awareness (SA) and Concept Acceptance. Measurement techniques selected to operationalise these aspects and the associated framework are pilot-tested with twenty-four participants in a driving simulator experiment. The objective of the test is to determine the reliability of the applied measurements for the assessment of the framework and whether the proposed framework is effective in predicting the level of support offered by the concepts. Based on the congruency between measurement scores produced in the test and scores with predefined differences in concept-support, this study demonstrates the framework's reliability. A remaining concern is the framework's weak sensitivity to small differences in offered support. The article concludes that applying the framework is especially advantageous for evaluating early design phases and can successfully contribute to the efficient development of safe means of operating partially automated vehicles that keep drivers in control. Copyright © 2016 Elsevier Ltd. All rights reserved.
Design Methodology for Automated Construction Machines
1987-12-11
Demsetz, Laura A.; Levy, David H.; Schena, Bruce
…are discussed along with the design of a pair of machines which automate framework installation. Preliminary analysis and testing indicate that these…
Preliminary Framework for Human-Automation Collaboration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna Helene; Le Blanc, Katya Lee; Spielman, Zachary Alexander
The Department of Energy's Advanced Reactor Technologies Program sponsors research, development and deployment activities through its Next Generation Nuclear Plant, Advanced Reactor Concepts, and Advanced Small Modular Reactor (aSMR) Programs to promote safety, technical, economical, and environmental advancements of innovative Generation IV nuclear energy technologies. The Human Automation Collaboration (HAC) Research Project is located under the aSMR Program, which identifies developing advanced instrumentation and controls and human-machine interfaces as one of four key research areas. It is expected that the new nuclear power plant designs will employ technology significantly more advanced than the analog systems in the existing reactor fleet, as well as utilizing automation to a greater extent. Moving towards more advanced technology and more automation does not necessarily imply more efficient and safer operation of the plant. Instead, a number of concerns about how these technologies will affect human performance and the overall safety of the plant need to be addressed. More specifically, it is important to investigate how the operator and the automation work as a team to ensure effective and safe plant operation, also known as the human-automation collaboration (HAC). The focus of the HAC research is to understand how various characteristics of automation (such as its reliability, processes, and modes) affect an operator's use and awareness of plant conditions. In other words, the research team investigates how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. This report addresses the Department of Energy milestone M4AT-15IN2302054, Complete Preliminary Framework for Human-Automation Collaboration, by discussing the two-phased development of a preliminary HAC framework.
The framework developed in the first phase was used as the basis for selecting topics to be investigated in more detail. The results and insights gained from the in-depth studies conducted during the second phase were used to revise the framework. This report describes the basis for the framework developed in phase 1, the changes made to the framework in phase 2, and the basis for the changes. Additional research needs are identified and presented in the last section of the report.
Augmenting the one-shot framework by additional constraints
Bosse, Torsten
2016-05-12
The (multistep) one-shot method for design optimization problems has been successfully implemented for various applications. To this end, a slowly convergent primal fixed-point iteration of the state equation is augmented by an adjoint iteration and a corresponding preconditioned design update. In this paper we present a modification of the method that allows for additional equality constraints besides the usual state equation. Finally, a retardation analysis and the local convergence of the method in terms of necessary and sufficient conditions are given, which depend on key characteristics of the underlying problem and the quality of the utilized preconditioner.
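The interleaved primal/adjoint/design structure of the one-shot method can be seen on a toy scalar problem. The sketch below assumes a contractive state fixed point and an identity preconditioner; the objective, state equation, and step size are invented for illustration.

```python
# One-shot iteration on a toy problem: minimize J(u, d) = 0.5*(u - 1)^2
# subject to the fixed-point state equation u = G(u, d) = 0.5*u + d.
# Instead of fully converging the state solve before each design step,
# one primal sweep, one adjoint sweep, and one design update are interleaved.

def one_shot(alpha=0.1, iters=300):
    u = lam = d = 0.0
    for _ in range(iters):
        u = 0.5 * u + d                 # one sweep of the primal fixed point
        lam = (u - 1.0) + 0.5 * lam     # one adjoint sweep: dJ/du + (dG/du) * lam
        d -= alpha * lam                # (identity-)preconditioned update; dG/dd = 1
    return u, d

u, d = one_shot()
print(round(u, 4), round(d, 4))  # converges to u = 1, d = 0.5 (since u* = 2d)
```

At the coupled fixed point the adjoint residual vanishes, so the design update is driven by the true reduced gradient; the retardation analysis in the paper concerns how much slower this coupled iteration is than the primal iteration alone.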
Transductive multi-view zero-shot learning.
Fu, Yanwei; Hospedales, Timothy M; Xiang, Tao; Gong, Shaogang
2015-11-01
Most existing zero-shot learning approaches exploit transfer learning via an intermediate semantic representation shared between an annotated auxiliary dataset and a target dataset with different classes and no annotation. A projection from a low-level feature space to the semantic representation space is learned from the auxiliary dataset and applied without adaptation to the target dataset. In this paper we identify two inherent limitations with these approaches. First, due to having disjoint and potentially unrelated classes, the projection functions learned from the auxiliary dataset/domain are biased when applied directly to the target dataset/domain. We call this problem the projection domain shift problem and propose a novel framework, transductive multi-view embedding, to solve it. The second limitation is the prototype sparsity problem which refers to the fact that for each target class, only a single prototype is available for zero-shot learning given a semantic representation. To overcome this problem, a novel heterogeneous multi-view hypergraph label propagation method is formulated for zero-shot learning in the transductive embedding space. It effectively exploits the complementary information offered by different semantic representations and takes advantage of the manifold structures of multiple representation spaces in a coherent manner. We demonstrate through extensive experiments that the proposed approach (1) rectifies the projection shift between the auxiliary and target domains, (2) exploits the complementarity of multiple semantic representations, (3) significantly outperforms existing methods for both zero-shot and N-shot recognition on three image and video benchmark datasets, and (4) enables novel cross-view annotation tasks.
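The baseline these approaches build on (and whose projection domain shift the paper analyzes) is a direct feature-to-attribute projection. The sketch below is that simple baseline on synthetic data, not the transductive multi-view method; the class prototypes, mixing matrix, and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_MIX = rng.standard_normal((3, 5))   # hidden attributes -> features mixing

def make_samples(proto, n):
    """Synthesize feature vectors for a class with attribute vector `proto`."""
    return proto @ TRUE_MIX + 0.05 * rng.standard_normal((n, 5))

# Auxiliary (seen) classes 0..2 with one-hot attribute prototypes.
aux_protos = np.eye(3)
X_aux = np.vstack([make_samples(p, 50) for p in aux_protos])
A_aux = np.repeat(aux_protos, 50, axis=0)

# Learn the feature -> attribute-space projection by least squares.
W, *_ = np.linalg.lstsq(X_aux, A_aux, rcond=None)

# Class 3 was never seen in training; only its attribute prototype is known.
protos = {0: aux_protos[0], 1: aux_protos[1], 2: aux_protos[2],
          3: np.array([0.7, 0.0, 0.7])}
x_new = make_samples(protos[3], 1)[0]
pred = min(protos, key=lambda c: np.linalg.norm(x_new @ W - protos[c]))
print(pred)  # → 3
```

Here the unseen class happens to lie in the span of the seen classes' attributes, so the direct projection succeeds; when auxiliary and target classes are unrelated, the learned projection is biased on the target domain, which is exactly the projection domain shift problem the transductive embedding is designed to correct.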
Part-task training in the context of automation: current and future directions.
Gutzwiller, Robert S; Clegg, Benjamin A; Blitch, John G
2013-01-01
Automation often elicits a divide-and-conquer outlook. By definition, automation has been suggested to assume control over a part or whole task that was previously performed by a human (Parasuraman & Riley, 1997). When such notions of automation are taken as grounds for training, they readily invoke a part-task training (PTT) approach. This article outlines broad functions of automation as a source of PTT and reviews the PTT literature, focusing on the potential benefits and costs related to using automation as a mechanism for PTT. The article reviews some past work in this area and suggests a path to move beyond the type of work captured by the "automation as PTT" framework. An illustrative experiment shows how automation in training and PTT are actually separable issues. PTT with automation has some utility but ultimately remains an unsatisfactory framework for the future broad potential of automation during training, and we suggest that a new conceptualization is needed.
TARDIS: An Automation Framework for JPL Mission Design and Navigation
NASA Technical Reports Server (NTRS)
Roundhill, Ian M.; Kelly, Richard M.
2014-01-01
Mission Design and Navigation at the Jet Propulsion Laboratory has implemented an automation framework tool to assist in orbit determination and maneuver design analysis. This paper describes the lessons learned from previous automation tools and how they have been implemented in this tool. In addition this tool has revealed challenges in software implementation, testing, and user education. This paper describes some of these challenges and invites others to share their experiences.
Spatial Expansion and Automation of the Pegasus Thomson Scattering Diagnostic System
NASA Astrophysics Data System (ADS)
Bodner, G. M.; Bongard, M. W.; Fonck, R. J.; Reusch, J. A.; Schlossberg, D. J.; Winz, G. R.
2015-11-01
The Pegasus Thomson scattering diagnostic system has recently undergone modifications to increase the spatial range of the diagnostic and automate the Thomson data collection process. Two multichannel spectrometers have been added to the original configuration, providing a total of 24 data channels to view the plasma volume. The new system configuration allows for observation of three distinct regions of the plasma: the local helicity injection (LHI) source (R ~ 67-73.8 cm), the plasma edge (R ~ 51.5-57.6 cm), and the plasma core (R ~ 35-41.1 cm). Each spectrometer utilizes a volume-phase holographic (VPH) grating and a gated-intensified CCD camera. The edge and the LHI spectrometers have been fitted with low-temperature VPH gratings to cover Te = 10-100 eV, while the core spectrometer has been fitted with a high-temperature VPH grating to cover Te = 0.1-1.0 keV. The additional spectrometers have been calibrated to account for detector flatness, detector linearity, and vignetting. Operation of the Thomson system has been overhauled to utilize LabVIEW software to synchronize the major components of the Thomson system with the Pegasus shot cycle and to provide intra-shot beam alignment. Multi-point Thomson scattering measurements will be obtained in the aforementioned regions of LHI and Ohmic discharges and will be compared to Langmuir probe measurements. Work supported by US DOE grant DE-FG02-96ER54375.
Augmenting SCA project management and automation framework
NASA Astrophysics Data System (ADS)
Iyapparaja, M.; Sharma, Bhanupriya
2017-11-01
In our daily life we need to keep records of things in order to manage them more efficiently and properly. Our company manufactures semiconductor chips and sells them to buyers. Sometimes it manufactures the entire product, sometimes only part of it, and sometimes it sells intermediary products obtained during manufacturing; for better management of the entire process there is a need to keep track of every entity involved. Materials and Methods: To address this problem, the need arose to develop a framework for project maintenance and for automation testing. The project management framework provides an architecture which supports managing the project by maintaining records of all requirements, the test cases created for testing each unit of the software, and defects raised in past years. Through this, the quality of the project can be maintained. Results: The automation framework provides an architecture which supports the development and implementation of automated test scripts for the software testing process. Conclusion: To implement the project management framework, HP's Application Lifecycle Management product is used, which provides a central repository to maintain the project.
A framework for the automated data-driven constitutive characterization of composites
J.G. Michopoulos; John Hermanson; T. Furukawa; A. Iliopoulos
2010-01-01
We present advances on the development of a mechatronically and algorithmically automated framework for the data-driven identification of constitutive material models based on energy density considerations. These models can capture both the linear and nonlinear constitutive response of multiaxially loaded composite materials in a manner that accounts for progressive...
Kawata, Yasuo; Arimura, Hidetaka; Ikushima, Koujirou; Jin, Ze; Morita, Kento; Tokunaga, Chiaki; Yabu-Uchi, Hidetake; Shioyama, Yoshiyuki; Sasaki, Tomonari; Honda, Hiroshi; Sasaki, Masayuki
2017-10-01
The aim of this study was to investigate the impact of pixel-based machine learning (ML) techniques, i.e., the fuzzy c-means clustering method (FCM), artificial neural network (ANN), and support vector machine (SVM), on an automated framework for delineation of gross tumor volume (GTV) regions of lung cancer for stereotactic body radiation therapy. The morphological and metabolic features for GTV regions, which were determined based on the knowledge of radiation oncologists, were fed on a pixel-by-pixel basis into the respective FCM, ANN, and SVM ML techniques. Then, the ML techniques were incorporated into the automated delineation framework of GTVs followed by an optimum contour selection (OCS) method, which we proposed in a previous study. The three ML-based frameworks were evaluated for 16 lung cancer cases (six solid, four ground glass opacity (GGO), six part-solid GGO) with the datasets of planning computed tomography (CT) and 18F-fluorodeoxyglucose (FDG) positron emission tomography (PET)/CT images using the three-dimensional Dice similarity coefficient (DSC). DSC denotes the degree of region similarity between the GTVs contoured by radiation oncologists and those estimated using the automated framework. The FCM-based framework achieved the highest DSCs of 0.79±0.06, whereas DSCs of the ANN-based and SVM-based frameworks were 0.76±0.14 and 0.73±0.14, respectively. The FCM-based framework provided the highest segmentation accuracy and precision without a learning process (lowest calculation cost). Therefore, the FCM-based framework can be useful for delineation of tumor regions in practical treatment planning. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
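The Dice similarity coefficient used for this evaluation is straightforward to compute; the sketch below is a minimal illustration (not the authors' code), with an invented toy geometry:

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient between two boolean masks:
    DSC = 2*|A intersect B| / (|A| + |B|)."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy 3D example: two overlapping cubic "GTV" masks on a 10x10x10 grid
ref = np.zeros((10, 10, 10), dtype=bool)
est = np.zeros((10, 10, 10), dtype=bool)
ref[2:6, 2:6, 2:6] = True   # 64 voxels
est[3:7, 3:7, 3:7] = True   # 64 voxels, 27 of them shared
print(round(dice_coefficient(ref, est), 3))  # 2*27/128 = 0.422
```

With 27 shared voxels out of two 64-voxel masks the DSC is about 0.42; the 0.79 reported above therefore indicates substantially better overlap than this toy case.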
A framework for automatic information quality ranking of diabetes websites.
Belen Sağlam, Rahime; Taskaya Temizel, Tugba
2015-01-01
Objective: When searching for particular medical information on the internet, the challenge lies in distinguishing the websites that are relevant to the topic and contain accurate information. In this article, we propose a framework that automatically identifies and ranks diabetes websites according to their relevance and information quality based on the website content. Design: The proposed framework ranks diabetes websites according to their content quality, relevance and evidence based medicine. The framework combines information retrieval techniques with a lexical resource based on SentiWordNet, making it possible to work with biased and untrusted websites while, at the same time, ensuring content relevance. Measurement: The evaluation measurements used were Pearson correlation, true positives, false positives, and accuracy. We tested the framework with a benchmark data set consisting of 55 websites with varying degrees of information quality problems. Results: The proposed framework gives good results that are comparable with the non-automated information quality measuring approaches in the literature. The correlation between the results of the proposed automated framework and the ground truth is 0.68 on average (p < 0.001), which is greater than that of the other automated methods proposed in the literature (average r = 0.33).
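The Pearson-correlation evaluation against ground truth can be sketched as follows; the score vectors below are invented for illustration:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc**2).sum() * (yc**2).sum()))

# Hypothetical quality scores: framework ranking vs. expert ground truth
framework_scores = [0.9, 0.7, 0.8, 0.3, 0.5]
expert_scores    = [0.85, 0.6, 0.75, 0.4, 0.45]
print(round(pearson_r(framework_scores, expert_scores), 2))  # ≈ 0.96
```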
Automated measurement of vocal fold vibratory asymmetry from high-speed videoendoscopy recordings.
Mehta, Daryush D; Deliyski, Dimitar D; Quatieri, Thomas F; Hillman, Robert E
2011-02-01
In prior work, a manually derived measure of vocal fold vibratory phase asymmetry correlated to varying degrees with visual judgments made from laryngeal high-speed videoendoscopy (HSV) recordings. This investigation extended this work by establishing an automated HSV-based framework to quantify 3 categories of vocal fold vibratory asymmetry. HSV-based analysis provided for cycle-to-cycle estimates of left-right phase asymmetry, left-right amplitude asymmetry, and axis shift during glottal closure for 52 speakers with no vocal pathology producing comfortable and pressed phonation. An initial cross-validation of the automated left-right phase asymmetry measure was performed by correlating the measure with other objective and subjective assessments of phase asymmetry. Vocal fold vibratory asymmetry was exhibited to a similar extent in both comfortable and pressed phonations. The automated measure of left-right phase asymmetry strongly correlated with manually derived measures and moderately correlated with visual-perceptual ratings. Correlations with the visual-perceptual ratings remained relatively consistent as the automated measure was derived from kymograms taken at different glottal locations. An automated HSV-based framework for the quantification of vocal fold vibratory asymmetry was developed and initially validated. This framework serves as a platform for investigating relationships between vocal fold tissue motion and acoustic measures of voice function.
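A common normalized form of a left-right amplitude asymmetry measure can be sketched as below; this is a generic definition for illustration and may differ from the exact measure used in the study, and the per-cycle amplitudes are invented:

```python
def amplitude_asymmetry(left_amp: float, right_amp: float) -> float:
    """Normalized left-right amplitude asymmetry:
    0 for symmetric vibration, approaching +/-1 when one fold barely moves."""
    total = left_amp + right_amp
    if total == 0:
        raise ValueError("both amplitudes are zero")
    return (left_amp - right_amp) / total

# Per-cycle vibratory amplitudes (arbitrary units); values are invented
cycles = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1)]
per_cycle = [round(amplitude_asymmetry(l, r), 2) for l, r in cycles]
print(per_cycle)  # [0.0, 0.2, -0.1]
```

Cycle-to-cycle estimates like these can then be aggregated or correlated against visual-perceptual ratings.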
Interleaved EPI diffusion imaging using SPIRiT-based reconstruction with virtual coil compression.
Dong, Zijing; Wang, Fuyixue; Ma, Xiaodong; Zhang, Zhe; Dai, Erpeng; Yuan, Chun; Guo, Hua
2018-03-01
To develop a novel diffusion imaging reconstruction framework based on iterative self-consistent parallel imaging reconstruction (SPIRiT) for multishot interleaved echo planar imaging (iEPI), with computation acceleration by virtual coil compression. As a general approach for autocalibrating parallel imaging, SPIRiT improves the performance of traditional generalized autocalibrating partially parallel acquisitions (GRAPPA) methods in that the formulation with self-consistency is better conditioned, suggesting SPIRiT to be a better candidate in k-space-based reconstruction. In this study, a general SPIRiT framework is adopted to incorporate both coil sensitivity and phase variation information as virtual coils and then is applied to 2D navigated iEPI diffusion imaging. To reduce the reconstruction time when using a large number of coils and shots, a novel shot-coil compression method is proposed for computation acceleration in Cartesian sampling. Simulations and in vivo experiments were conducted to evaluate the performance of the proposed method. Compared with the conventional coil compression, the shot-coil compression achieved higher compression rates with reduced errors. The simulation and in vivo experiments demonstrate that the SPIRiT-based reconstruction outperformed the existing method, realigned GRAPPA, and provided superior images with reduced artifacts. The SPIRiT-based reconstruction with virtual coil compression is a reliable method for high-resolution iEPI diffusion imaging. Magn Reson Med 79:1525-1531, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
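The idea behind coil compression can be illustrated with standard SVD (PCA-style) channel compression; the shot-coil scheme above extends this to the combined shot-coil dimension. The sizes below are invented, and in practice the compression weights would come from calibration data rather than the full dataset:

```python
import numpy as np

def compress_coils(kspace: np.ndarray, n_virtual: int) -> np.ndarray:
    """Compress a [n_samples, n_coils] k-space matrix to n_virtual
    channels using the SVD of the data (PCA-style coil compression)."""
    # Right-singular vectors give the coil-combination weights
    _, _, vh = np.linalg.svd(kspace, full_matrices=False)
    return kspace @ vh.conj().T[:, :n_virtual]

rng = np.random.default_rng(0)
n_samples, n_coils = 256, 32
# Synthetic correlated coil data: a few underlying sources mixed into 32 coils
sources = rng.standard_normal((n_samples, 4))
mixing = rng.standard_normal((4, n_coils))
data = sources @ mixing + 0.01 * rng.standard_normal((n_samples, n_coils))

compressed = compress_coils(data, n_virtual=8)
print(compressed.shape)  # (256, 8)
# Fraction of signal energy retained by the 8 virtual coils
retained = np.linalg.norm(compressed) / np.linalg.norm(data)
print(retained > 0.99)
```

Because the coil channels are highly correlated, a handful of virtual channels retains nearly all of the signal energy, which is what makes the reconstruction speedup possible.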
DoD Application Store: Enabling C2 Agility?
2014-06-01
The envisioned DoD Marketplace within the Ozone Widget Framework will include automated delivery of software patches, web applications, widgets, and mobile application packages. ...current needs. DoD has started to make inroads within this environment, with several Programs of Record (PoR) embracing widgets and other mobile...
Rand, David G.; Kraft-Todd, Gordon; Gruber, June
2015-01-01
Cooperation is central to human existence, forming the bedrock of everyday social relationships and larger societal structures. Thus, understanding the psychological underpinnings of cooperation is of both scientific and practical importance. Recent work using a dual-process framework suggests that intuitive processing can promote cooperation while deliberative processing can undermine it. Here we add to this line of research by more specifically identifying deliberative and intuitive processes that affect cooperation. To do so, we applied automated text analysis using the Linguistic Inquiry and Word Count (LIWC) software to investigate the association between behavior in one-shot anonymous economic cooperation games and the presence inhibition (a deliberative process) and positive emotion (an intuitive process) in free-response narratives written after (Study 1, N = 4,218) or during (Study 2, N = 236) the decision-making process. Consistent with previous results, across both studies inhibition predicted reduced cooperation while positive emotion predicted increased cooperation (even when controlling for negative emotion). Importantly, there was a significant interaction between positive emotion and inhibition, such that the most cooperative individuals had high positive emotion and low inhibition. This suggests that inhibition (i.e., reflective or deliberative processing) may undermine cooperative behavior by suppressing the prosocial effects of positive emotion. PMID:25625722
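The reported interaction can be illustrated with an ordinary least-squares fit including an interaction term; the data below are simulated, not the study's, and the coefficients are chosen only to mimic the qualitative pattern (positive emotion helps, inhibition hurts, and inhibition suppresses the benefit of positive emotion):

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares via the pseudoinverse (numerically stable)."""
    return np.linalg.pinv(X) @ y

rng = np.random.default_rng(1)
n = 4000
pos_emotion = rng.standard_normal(n)  # LIWC positive-emotion score (z-scored)
inhibition = rng.standard_normal(n)   # LIWC inhibition score (z-scored)

# Simulated cooperation with a negative interaction term
coop = (0.3 * pos_emotion - 0.2 * inhibition
        - 0.15 * pos_emotion * inhibition
        + 0.1 * rng.standard_normal(n))

X = np.column_stack([np.ones(n), pos_emotion, inhibition,
                     pos_emotion * inhibition])
beta = ols_fit(X, coop)
print(np.round(beta, 2))  # ≈ [0, 0.3, -0.2, -0.15]
```

The fitted interaction coefficient recovers the simulated suppression effect; the study's actual analyses control for additional covariates such as negative emotion.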
A Computational Framework for Automation of Point Defect Calculations
NASA Astrophysics Data System (ADS)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration
A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials by design community to assess the impact of point defects on materials performance. National Renewable Energy Laboratory, Golden, Colorado 80401.
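Such correction schemes enter the standard charged-defect formation-energy expression, which can be sketched as follows; all energies and chemical potentials below are invented for illustration, not results from the package:

```python
def defect_formation_energy(e_defect, e_host, removed, added, mu,
                            charge, e_fermi, e_vbm, e_corr):
    """Formation energy of a charged point defect:
    E_f = E_defect - E_host - sum(n_i * mu_i) + q*(E_F + E_VBM) + E_corr,
    with n_i > 0 for atoms added and n_i < 0 for atoms removed."""
    chem = sum(mu[s] for s in added) - sum(mu[s] for s in removed)
    return e_defect - e_host - chem + charge * (e_fermi + e_vbm) + e_corr

# Hypothetical numbers for a +2 oxygen vacancy (all values in eV)
mu = {"O": -4.5}                      # oxygen chemical potential
e_f = defect_formation_energy(
    e_defect=-850.0, e_host=-856.0,   # supercell total energies
    removed=["O"], added=[], mu=mu,
    charge=+2, e_fermi=0.5, e_vbm=1.0,
    e_corr=0.3)                       # image-charge + alignment correction
print(e_f)  # 4.8
```

The Fermi level here is referenced to the valence-band maximum, so sweeping `e_fermi` across the band gap traces out the usual formation-energy-vs-Fermi-level lines.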
Automation of fluorescent differential display with digital readout.
Meade, Jonathan D; Cho, Yong-Jig; Fisher, Jeffrey S; Walden, Jamie C; Guo, Zhen; Liang, Peng
2006-01-01
Since its invention in 1992, differential display (DD) has become the most commonly used technique for identifying differentially expressed genes because of its many advantages over competing technologies such as DNA microarray, serial analysis of gene expression (SAGE), and subtractive hybridization. Despite the great impact of the method on biomedical research, there has been a lack of automation of DD technology to increase its throughput and accuracy for systematic gene expression analysis. Most previous DD work has taken a "shot-gun" approach of identifying one gene at a time, with a limited number of polymerase chain reaction (PCR) reactions set up manually, giving DD a low-tech and low-throughput image. We have optimized the DD process with a new platform that incorporates fluorescent digital readout, automated liquid handling, and large-format gels capable of running entire 96-well plates. The resulting streamlined fluorescent DD (FDD) technology offers an unprecedented accuracy, sensitivity, and throughput in comprehensive and quantitative analysis of gene expression. These major improvements will allow researchers to find differentially expressed genes of interest, both known and novel, quickly and easily.
Human factors evaluation of level 2 and level 3 automated driving concepts : concepts of operation.
DOT National Transportation Integrated Search
2014-07-01
The Concepts of Operation document evaluates the functional framework of operations for Level 2 and Level 3 automated vehicle systems. This is done by defining the varying levels of automation, the operator vehicle interactions, and system components...
A Framework for Modeling Human-Machine Interactions
NASA Technical Reports Server (NTRS)
Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)
1996-01-01
Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.
Xu, Yang; Liu, Yuan-Zhi; Boppart, Stephen A; Carney, P Scott
2016-03-10
In this paper, we introduce an algorithm framework for the automation of interferometric synthetic aperture microscopy (ISAM). Under this framework, common processing steps such as dispersion correction, Fourier domain resampling, and computational adaptive optics aberration correction are carried out as metrics-assisted parameter search problems. We further present the results of this algorithm applied to phantom and biological tissue samples and compare with manually adjusted results. With the automated algorithm, near-optimal ISAM reconstruction can be achieved without manual adjustment. At the same time, the technical barrier for the nonexpert using ISAM imaging is also significantly lowered.
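The metrics-assisted parameter search can be sketched generically: scan a candidate correction parameter and keep the value that maximizes an image-sharpness metric. The 1-D phantom and quadratic-phase model below are illustrative assumptions, not the actual ISAM processing chain:

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Normalized intensity-squared sharpness metric (higher = sharper)."""
    p = np.abs(img) ** 2
    return float((p**2).sum() / (p.sum() ** 2))

def correct(spectrum: np.ndarray, alpha: float) -> np.ndarray:
    """Apply a candidate quadratic spectral phase and transform back."""
    k = np.fft.fftfreq(spectrum.size)
    return np.fft.ifft(spectrum * np.exp(-1j * alpha * k**2))

# Synthetic 1-D "A-line": a sharp peak blurred by a known quadratic phase
n, true_alpha = 512, 40.0
line = np.zeros(n)
line[200] = 1.0
k = np.fft.fftfreq(n)
blurred_spec = np.fft.fft(line) * np.exp(1j * true_alpha * k**2)

# Grid search: the sharpness metric peaks at the true coefficient
alphas = np.linspace(0.0, 80.0, 81)
scores = [sharpness(correct(blurred_spec, a)) for a in alphas]
best = alphas[int(np.argmax(scores))]
print(best)  # 40.0
```

In practice the search would cover several coupled parameters (dispersion orders, aberration coefficients) and use a smarter optimizer than a grid, but the metric-driven principle is the same.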
Exploring the Use of a Test Automation Framework
NASA Technical Reports Server (NTRS)
Cervantes, Alex
2009-01-01
It is known that software testers, more often than not, lack the time needed to fully test the delivered software product within the time period allotted to them. When problems in the implementation phase of a development project occur, it normally causes the software delivery date to slide. As a result, testers either need to work longer hours, or supplementary resources need to be added to the test team in order to meet aggressive test deadlines. One solution to this problem is to provide testers with a test automation framework to facilitate the development of automated test solutions.
A Videography Analysis Framework for Video Retrieval and Summarization (Open Access)
2012-09-07
DataForge: Modular platform for data storage and analysis
NASA Astrophysics Data System (ADS)
Nozik, Alexander
2018-04-01
DataForge is a framework for automated data acquisition, storage and analysis based on modern applied-programming practices. The aim of DataForge is to automate standard tasks such as parallel data processing, logging, output sorting and distributed computing. The framework also makes extensive use of declarative programming principles via a metadata concept, which allows a certain degree of metaprogramming and improves the reproducibility of results.
Automated Fluid Feature Extraction from Transient Simulations
NASA Technical Reports Server (NTRS)
Haimes, Robert
2000-01-01
In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense.
Intuition, deliberation, and the evolution of cooperation
Bear, Adam; Rand, David G.
2016-01-01
Humans often cooperate with strangers, despite the costs involved. A long tradition of theoretical modeling has sought ultimate evolutionary explanations for this seemingly altruistic behavior. More recently, an entirely separate body of experimental work has begun to investigate cooperation’s proximate cognitive underpinnings using a dual-process framework: Is deliberative self-control necessary to rein in selfish impulses, or does self-interested deliberation restrain an intuitive desire to cooperate? Integrating these ultimate and proximate approaches, we introduce dual-process cognition into a formal game-theoretic model of the evolution of cooperation. Agents play prisoner’s dilemma games, some of which are one-shot and others of which involve reciprocity. They can either respond by using a generalized intuition, which is not sensitive to whether the game is one-shot or reciprocal, or pay a (stochastically varying) cost to deliberate and tailor their strategy to the type of game they are facing. We find that, depending on the level of reciprocity and assortment, selection favors one of two strategies: intuitive defectors who never deliberate, or dual-process agents who intuitively cooperate but sometimes use deliberation to defect in one-shot games. Critically, selection never favors agents who use deliberation to override selfish impulses: Deliberation only serves to undermine cooperation with strangers. Thus, by introducing a formal theoretical framework for exploring cooperation through a dual-process lens, we provide a clear answer regarding the role of deliberation in cooperation based on evolutionary modeling, help to organize a growing body of sometimes-conflicting empirical results, and shed light on the nature of human cognition and social decision making. PMID:26755603
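The core trade-off can be illustrated with a back-of-the-envelope payoff comparison between an intuitive defector and a dual-process agent; this toy calculation, with invented parameters and a baseline payoff of zero for mutual defection, is only a caricature of the full evolutionary model:

```python
def intuitive_defector_payoff(p_repeated, b, c):
    """Never cooperates: forfeits reciprocity benefits in every game type."""
    return 0.0

def dual_process_payoff(p_repeated, b, c, d_cost):
    """Intuitively cooperates; pays d_cost to deliberate and defect
    when the game turns out to be one-shot."""
    repeated = p_repeated * (b - c)          # mutual cooperation sustained
    one_shot = (1 - p_repeated) * (-d_cost)  # deliberate, then defect
    return repeated + one_shot

# b = benefit of cooperation, c = cost, d_cost = deliberation cost (invented)
b, c, d_cost = 4.0, 1.0, 0.2
for p in (0.02, 0.5, 0.9):
    dp = dual_process_payoff(p, b, c, d_cost)
    winner = ("dual-process" if dp > intuitive_defector_payoff(p, b, c)
              else "intuitive defector")
    print(p, round(dp, 2), winner)
```

When repeated interactions are rare, deliberation costs outweigh reciprocity gains and the intuitive defector prevails, mirroring the paper's qualitative finding that assortment and reciprocity levels select between the two strategies.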
A computational framework for automation of point defect calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei
We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
Process development for automated solar cell and module production. Task 4: Automated array assembly
NASA Technical Reports Server (NTRS)
Hagerty, J. J.
1981-01-01
The cell preparation station was installed in its new enclosure. Operation verification tests were performed. The detailed layout drawings of the automated lamination station were produced and construction began. All major and most minor components were delivered by vendors. The station framework was built and assembly of components begun.
Developing a General Framework for Human Autonomy Teaming
NASA Technical Reports Server (NTRS)
Lachter, Joel; Brandt, Summer; Shively, Jay
2017-01-01
Automation has entered nearly every aspect of our lives, but it often remains hard to understand. Why is this? Automation is often brittle, requiring constant human oversight to assure it operates as intended. This oversight has become harder as automation has become more complicated. To resolve this problem, Human-Autonomy Teaming (HAT) has been proposed. HAT looks to make automation act as more of a teammate, by having it communicate with human operators in a more human, goal-directed, manner which provides transparency into the reasoning behind automated recommendations and actions. This, in turn, permits more trust in the automation when it is appropriate, and less when it is not, allowing a more targeted supervision of automated functions. This paper proposes a framework for HAT, incorporating two key tenets: bi-directional communication, and operator directed authority. We have successfully applied these tenets to integrating the autonomous constrained flight planner (an aide for planning diverts) into a dispatch station. We propose the development of general design patterns that may allow these results to be generalized to domains such as photography and automotive navigation. While these domains are very different, we find application of our HAT tenets provides a number of opportunities for improving interaction between human operators and automation.
NASA Technical Reports Server (NTRS)
Ancel, Ersin; Shih, Ann T.
2014-01-01
This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
Improvement of Binary Analysis Components in Automated Malware Analysis Framework
2017-02-21
...analyze malicious software (malware) with minimum human interaction. The system autonomously analyzes malware samples by analyzing malware binary programs... (AFRL-AFOSR-JP-TR-2017-0018, Keiji Takeda, Keio University, final report)
Automated Analysis of Stateflow Models
NASA Technical Reports Server (NTRS)
Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier
2017-01-01
Stateflow is a widely used modeling framework for embedded and cyber physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification technique for Stateflow models. Our approach is two-folded: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing Mathworks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial scale Stateflow models.
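The essence of safety verification can be shown in miniature as reachability analysis over an explicit state machine; this toy breadth-first search is a stand-in for the logic-based engine described above, and the controller states are invented:

```python
from collections import deque

def reachable(transitions, start):
    """Breadth-first reachability over an explicit transition relation."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def safe(transitions, start, bad_states):
    """A safety property holds iff no bad state is reachable."""
    return reachable(transitions, start).isdisjoint(bad_states)

# Toy controller: 'error' exists in the model but is never entered
transitions = {
    "idle":    ["running"],
    "running": ["paused", "idle"],
    "paused":  ["running"],
    "error":   [],
}
print(safe(transitions, "idle", {"error"}))  # True
```

Industrial tools work symbolically (with SMT solvers) rather than enumerating states, since hierarchical Stateflow models have state spaces far too large for explicit search.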
Benefits estimation framework for automated vehicle operations.
DOT National Transportation Integrated Search
2015-08-01
Automated vehicles have the potential to bring about transformative safety, mobility, energy, and environmental benefits to the surface transportation system. They are also being introduced into a complex transportation system, where second-order imp...
Augmenting team cognition in human-automation teams performing in complex operational environments.
Cuevas, Haydee M; Fiore, Stephen M; Caldwell, Barrett S; Strater, Laura
2007-05-01
There is a growing reliance on automation (e.g., intelligent agents, semi-autonomous robotic systems) to effectively execute increasingly cognitively complex tasks. Successful team performance for such tasks has become even more dependent on team cognition, addressing both human-human and human-automation teams. Team cognition can be viewed as the binding mechanism that produces coordinated behavior within experienced teams, emerging from the interplay between each team member's individual cognition and team process behaviors (e.g., coordination, communication). In order to better understand team cognition in human-automation teams, team performance models need to address issues surrounding the effect of human-agent and human-robot interaction on critical team processes such as coordination and communication. Toward this end, we present a preliminary theoretical framework illustrating how the design and implementation of automation technology may influence team cognition and team coordination in complex operational environments. Integrating constructs from organizational and cognitive science, our proposed framework outlines how information exchange and updating between humans and automation technology may affect lower-level (e.g., working memory) and higher-level (e.g., sense making) cognitive processes as well as teams' higher-order "metacognitive" processes (e.g., performance monitoring). Issues surrounding human-automation interaction are discussed and implications are presented within the context of designing automation technology to improve task performance in human-automation teams.
A framework for simultaneous aerodynamic design optimization in the presence of chaos
DOE Office of Scientific and Technical Information (OSTI.GOV)
Günther, Stefanie, E-mail: stefanie.guenther@scicomp.uni-kl.de; Gauger, Nicolas R.; Wang, Qiqi
Integrating existing solvers for unsteady partial differential equations into a simultaneous optimization method is challenging due to the forward-in-time information propagation of classical time-stepping methods. This paper applies the simultaneous single-step one-shot optimization method to a reformulated unsteady constraint that allows for both forward- and backward-in-time information propagation. Especially in the presence of chaotic and turbulent flow, solving the initial value problem simultaneously with the optimization problem often scales poorly with the time domain length. The new formulation relaxes the initial condition and instead solves a least squares problem for the discrete partial differential equations. This enables efficient one-shot optimization that is independent of the time domain length, even in the presence of chaos.
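The relaxed-initial-condition idea can be illustrated on a toy linear time-stepping problem: instead of marching u_{n+1} = a·u_n forward in time, all time levels are treated as unknowns and the stacked residuals are solved as one least squares problem, with the initial condition appearing as just another residual. This is only a schematic sketch of the reformulation described above; the scalar dynamics, `a`, and `N` are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Toy linear dynamics u_{n+1} = a * u_n. A classical solver would time-step
# forward; the one-shot reformulation instead solves for all time levels at
# once, stacking one residual row per equation.
a, N, u0 = 0.9, 20, 1.0

A = np.zeros((N, N))
A[0, 0] = 1.0                  # (relaxable) initial-condition residual u_0 - u0
for n in range(1, N):
    A[n, n] = 1.0              # dynamics residual u_n - a * u_{n-1}
    A[n, n - 1] = -a
b = np.zeros(N)
b[0] = u0

# Least squares over the whole time horizon in a single solve.
u, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(u, u0 * a ** np.arange(N)))  # agrees with forward time-stepping
```

For this linear toy the least squares solution is exact, so it reproduces the time-stepped trajectory; the point of the reformulation is that the same all-at-once structure admits backward-in-time information flow during optimization.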
Karampinos, Dimitrios C.; Banerjee, Suchandrima; King, Kevin F.; Link, Thomas M.; Majumdar, Sharmila
2011-01-01
Previous studies have shown that skeletal muscle diffusion tensor imaging (DTI) can non-invasively probe changes in the muscle fiber architecture and microstructure in diseased and damaged muscles. However, DTI fiber reconstruction in small muscles and in muscle regions close to aponeuroses and tendons remains challenging because of partial volume effects. Increasing the spatial resolution of skeletal muscle single-shot diffusion weighted (DW)-EPI can be hindered by the inherently low SNR of muscle DW-EPI due to the short muscle T2 and the high sensitivity of single-shot EPI to off-resonance effects and T2* blurring. In the present work, eddy-current compensated diffusion-weighted stimulated echo preparation is combined with sensitivity encoding (SENSE) to maintain good SNR properties and reduce the sensitivity to distortions and T2* blurring in high resolution skeletal muscle single-shot DW-EPI. An analytical framework is developed for optimizing the reduction factor and diffusion weighting time to achieve maximum SNR. Arguments for the selection of the experimental parameters are then presented considering the compromise between SNR, B0-induced distortions, T2* blurring effects and tissue incoherent motion effects. Based on the selected parameters in a high resolution skeletal muscle single-shot DW-EPI protocol, imaging protocols at lower acquisition matrix sizes are defined with matched bandwidth in the phase-encoding direction and SNR. In vivo results show that high resolution skeletal muscle DTI with minimized sensitivity to geometric distortions and T2* blurring is feasible using the proposed methodology. In particular, a significant benefit is demonstrated from reducing partial volume effects on resolving multi-pennate muscles and muscles with small cross sections in calf muscle DTI. PMID:22081519
ActionMap: A web-based software that automates loci assignments to framework maps.
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-07-01
Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).
ActionMap: a web-based software that automates loci assignments to framework maps
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-01-01
Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/). PMID:12824426
Kukhareva, Polina V; Kawamoto, Kensaku; Shields, David E; Barfuss, Darryl T; Halley, Anne M; Tippetts, Tyler J; Warner, Phillip B; Bray, Bruce E; Staes, Catherine J
2014-01-01
Electronic quality measurement (QM) and clinical decision support (CDS) are closely related but are typically implemented independently, resulting in significant duplication of effort. While it seems intuitive that technical approaches could be re-used across these two related use cases, such reuse is seldom reported in the literature, especially for standards-based approaches. Therefore, we evaluated the feasibility of using a standards-based CDS framework aligned with anticipated EHR certification criteria to implement electronic QM. The CDS-QM framework was used to automate a complex national quality measure (SCIP-VTE-2) at an academic healthcare system which had previously relied on time-consuming manual chart abstractions. Compared with 305 manually-reviewed reference cases, the recall of automated measurement was 100%. The precision was 96.3% (CI:92.6%-98.5%) for ascertaining the denominator and 96.2% (CI:92.3%-98.4%) for the numerator. We therefore validated that a standards-based CDS-QM framework can successfully enable automated QM, and we identified benefits and challenges with this approach. PMID:25954389
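For readers unfamiliar with how such validation figures are derived, the sketch below computes recall, precision, and a binomial confidence interval from confusion counts. The counts are invented for illustration (the abstract reports aggregate percentages, not raw counts), and the Wilson score interval is one common choice; the paper does not state which interval method was used.

```python
import math

def recall(tp, fn):
    # Fraction of true cases the automated measure found.
    return tp / (tp + fn)

def precision(tp, fp):
    # Fraction of flagged cases that were actually true.
    return tp / (tp + fp)

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# Illustrative counts (not from the paper): 290 true positives,
# 11 false positives, 0 false negatives among the reference cases.
tp, fp, fn = 290, 11, 0
lo, hi = wilson_ci(tp, tp + fp)
print(f"recall={recall(tp, fn):.3f}")
print(f"precision={precision(tp, fp):.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
```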
Toward a Framework for Dynamic Service Binding in E-Procurement
NASA Astrophysics Data System (ADS)
Ashoori, Maryam; Eze, Benjamin; Benyoucef, Morad; Peyton, Liam
In an online environment, an E-Procurement process should be able to react and adapt in near real-time to changes in suppliers, requirements, and regulations. WS-BPEL is an emerging standard for process automation but is oriented toward design-time binding of services. This gap can be addressed by designing an extension to WS-BPEL that supports the automation of flexible e-Procurement processes. Our proposed framework supports dynamic acquisition of procurement services from different suppliers in the face of changing procurement requirements. The framework is illustrated by applying it to health care, where different health insurance providers could be involved in procuring medication for patients.
Cyber Security Research Frameworks For Coevolutionary Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rush, George D.; Tauritz, Daniel Remy
Several architectures have been created for developing and testing systems used in network security, but most are meant to provide a platform for running cyber security experiments as opposed to automating experiment processes. In the first paper, we propose a framework termed Distributed Cyber Security Automation Framework for Experiments (DCAFE) that enables experiment automation and control in a distributed environment. Predictive analysis of adversaries is another thorny issue in cyber security. Game theory can be used to mathematically analyze adversary models, but its scalability limitations restrict its use. Computational game theory allows us to scale classical game theory to larger, more complex systems. In the second paper, we propose a framework termed Coevolutionary Agent-based Network Defense Lightweight Event System (CANDLES) that can coevolve attacker and defender agent strategies and capabilities and evaluate potential solutions with a custom network defense simulation. The third paper is a continuation of the CANDLES project in which we rewrote key parts of the framework. Attackers and defenders have been redesigned to evolve pure strategy, and a new network security simulation is devised which specifies network architecture and adds a temporal aspect. We also add a hill climber algorithm to evaluate the search space and justify the use of a coevolutionary algorithm.
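As a flavor of what coevolving attacker and defender strategies means in a system like CANDLES, the following toy loop coevolves two bit-vector pure strategies by alternating hill climbing against the current opponent. Everything here (the payoff function, `N_SERVICES`, the mutation scheme) is a deliberately minimal stand-in, not the CANDLES algorithm itself.

```python
import random

random.seed(0)
N_SERVICES = 8  # hypothetical network services that can be attacked/defended

def attacker_fitness(attack, defend):
    # Attacker scores one point per service attacked but left undefended.
    return sum(a and not d for a, d in zip(attack, defend))

def mutate(strategy, rate=0.2):
    # Flip each bit independently with probability `rate`.
    return [bit ^ (random.random() < rate) for bit in strategy]

def coevolve(generations=50):
    attack = [random.randint(0, 1) for _ in range(N_SERVICES)]
    defend = [random.randint(0, 1) for _ in range(N_SERVICES)]
    for _ in range(generations):
        # Each side proposes a mutant and keeps it if it does no worse
        # against the *current* opponent (coevolutionary hill climbing).
        cand = mutate(attack)
        if attacker_fitness(cand, defend) >= attacker_fitness(attack, defend):
            attack = cand
        cand = mutate(defend)
        if attacker_fitness(attack, cand) <= attacker_fitness(attack, defend):
            defend = cand
    return attack, defend

attack, defend = coevolve()
print("undefended attacked services:", attacker_fitness(attack, defend))
```

The key design point, as in the abstract, is that each population's fitness landscape is defined by the other population, so the evaluation is relative rather than fixed.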
Modeling treatment of ischemic heart disease with partially observable Markov decision processes.
Hauskrecht, M; Fraser, H
1998-01-01
Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead they are very often dependent and interleaved over time, mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment and varying cost of different diagnostic (investigative) and treatment procedures. The framework of Partially observable Markov decision processes (POMDPs) developed and used in operations research, control theory and artificial intelligence communities is particularly suitable for modeling such a complex decision process. In the paper, we show how the POMDP framework could be used to model and solve the problem of the management of patients with ischemic heart disease, and point out modeling advantages of the framework over standard decision formalisms.
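The core POMDP machinery the authors build on is the Bayesian belief update over hidden disease states after each action and observation. A minimal sketch follows, with a hypothetical two-state "ischemia" model and made-up test accuracies; none of the numbers are clinical figures from the paper.

```python
def belief_update(belief, action, observation, T, Z):
    """Bayesian belief update: b'(s') ∝ Z[a][s'][o] * Σ_s T[a][s][s'] * b(s)."""
    n = len(belief)
    new = [Z[action][s2][observation] *
           sum(T[action][s][s2] * belief[s] for s in range(n))
           for s2 in range(n)]
    total = sum(new)
    return [x / total for x in new]

# Toy two-state illustration (hypothetical numbers):
# states: 0 = no ischemia, 1 = ischemia; action 0 = perform a stress test;
# observations: 0 = negative result, 1 = positive result.
T = {0: [[1.0, 0.0], [0.0, 1.0]]}   # testing does not change the disease state
Z = {0: [[0.9, 0.1], [0.2, 0.8]]}   # made-up test specificity/sensitivity
b = belief_update([0.5, 0.5], action=0, observation=1, T=T, Z=Z)
print(b)  # posterior probability of ischemia rises after a positive test
```

Treatment planning in the POMDP then chooses the action maximizing expected value over this belief, which is what lets diagnosis and treatment interleave rather than being one-shot.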
ERIC Educational Resources Information Center
Bodily, Robert; Nyland, Rob; Wiley, David
2017-01-01
The RISE (Resource Inspection, Selection, and Enhancement) Framework supports the continuous improvement of open educational resources (OER). The framework is an automated process that identifies learning resources that should be evaluated and either eliminated or improved. This is particularly useful in OER contexts where the…
Joint modality fusion and temporal context exploitation for semantic video analysis
NASA Astrophysics Data System (ADS)
Papadopoulos, Georgios Th; Mezaris, Vasileios; Kompatsiaris, Ioannis; Strintzis, Michael G.
2011-12-01
In this paper, a multi-modal context-aware approach to semantic video analysis is presented. Overall, the examined video sequence is initially segmented into shots and for every resulting shot appropriate color, motion and audio features are extracted. Then, Hidden Markov Models (HMMs) are employed for performing an initial association of each shot with the semantic classes that are of interest separately for each modality. Subsequently, a graphical modeling-based approach is proposed for jointly performing modality fusion and temporal context exploitation. Novelties of this work include the combined use of contextual information and multi-modal fusion, and the development of a new representation for providing motion distribution information to HMMs. Specifically, an integrated Bayesian Network is introduced for simultaneously performing information fusion of the individual modality analysis results and exploitation of temporal context, contrary to the usual practice of performing each task separately. Contextual information is in the form of temporal relations among the supported classes. Additionally, a new computationally efficient method for providing motion energy distribution-related information to HMMs, which supports the incorporation of motion characteristics from previous frames to the currently examined one, is presented. The final outcome of this overall video analysis framework is the association of a semantic class with every shot. Experimental results as well as comparative evaluation from the application of the proposed approach to four datasets belonging to the domains of tennis, news and volleyball broadcast video are presented.
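The per-modality HMM step described above, associating each shot with a semantic class, typically amounts to evaluating the observation likelihood of each class's HMM and picking the larger. A toy sketch with the forward algorithm follows; the states, emissions, and two hypothetical class models are invented for illustration and do not correspond to the paper's actual features.

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: likelihood of an observation sequence under an HMM."""
    alpha = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    for o in obs[1:]:
        alpha.append({s2: emit_p[s2][o] *
                      sum(alpha[-1][s1] * trans_p[s1][s2] for s1 in states)
                      for s2 in states})
    return sum(alpha[-1].values())

# Hypothetical two-class example: does a shot's feature sequence fit
# the "rally" class model better than the "break" class model?
states = ["low_motion", "high_motion"]
start = {"low_motion": 0.5, "high_motion": 0.5}
rally_trans = {"low_motion": {"low_motion": 0.3, "high_motion": 0.7},
               "high_motion": {"low_motion": 0.2, "high_motion": 0.8}}
break_trans = {"low_motion": {"low_motion": 0.8, "high_motion": 0.2},
               "high_motion": {"low_motion": 0.7, "high_motion": 0.3}}
emit = {"low_motion": {"still": 0.8, "moving": 0.2},
        "high_motion": {"still": 0.1, "moving": 0.9}}
obs = ["moving", "moving", "still", "moving"]
p_rally = forward(obs, states, start, rally_trans, emit)
p_break = forward(obs, states, start, break_trans, emit)
print("classified as:", "rally" if p_rally > p_break else "break")
```

The paper's contribution sits on top of this step: rather than fusing such per-modality decisions ad hoc, the Bayesian Network combines them jointly with temporal context.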
Developing and testing a decision model for predicting influenza vaccination compliance.
Carter, W B; Beach, L R; Inui, T S; Kirscht, J P; Prodzinski, J C
1986-01-01
Influenza vaccination has long been recommended for elderly high-risk patients, yet national surveys indicate that vaccination compliance rates are remarkably low (20 percent). We conducted a study to model prospectively the flu shot decisions and subsequent behavior of an elderly and/or chronically diseased (at high risk for complications of influenza) ambulatory care population at the Seattle VA Medical Center. Prior to the 1980-81 flu shot season, a random (stratified by disease) sample of 63 patients, drawn from the total population of high-risk patients in the general medicine clinic, was interviewed to identify patient-defined concerns regarding flu shots. Six potential consequences of influenza and nine of vaccination were emphasized by patients and provided the content for a weighted hierarchical utility model questionnaire. The utility model provides an operational framework for (1) obtaining subjective value and relative importance judgments from patients; (2) combining these judgments to obtain a prediction of behavioral intention and behavior for each patient; and, if the model is valid (predictive of behavior), (3) identifying those factors which are most salient to patient's decisions and subsequent behavior. Prior to the 1981-82 flu season, the decision model questionnaire was administered to 350 other high-risk patients from the same general medicine clinic population. The decision model correctly predicted behavioral intention for 87 percent and vaccination behavior for 82 percent of this population and, more importantly, differentiated shot "takers" and "nontakers" along several attitudinal dimensions that suggest specific content areas for clinical compliance intervention strategies. PMID:3949541
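At the level of a single patient, the weighted hierarchical utility model reduces to a weighted sum of subjective utilities whose sign predicts intention. A minimal sketch with hypothetical attributes and judgment values; the paper's actual model used six influenza consequences and nine vaccination consequences, not the made-up attributes below.

```python
def behavioral_intention(weights, utilities):
    """Weighted additive utility: a positive score predicts a shot 'taker'."""
    assert weights.keys() == utilities.keys()
    return sum(weights[k] * utilities[k] for k in weights)

# Hypothetical attributes and patient judgments (illustrative only):
# utilities on a -1..+1 scale, importance weights summing to 1.
weights = {"avoid_flu": 0.4, "sore_arm": 0.1, "side_effects": 0.2,
           "doctor_advice": 0.3}
utilities = {"avoid_flu": 0.9, "sore_arm": -0.5, "side_effects": -0.4,
             "doctor_advice": 0.8}
score = behavioral_intention(weights, utilities)
prediction = "taker" if score > 0 else "nontaker"
print(score, prediction)
```

Because each attribute's contribution is explicit, the same computation identifies which factors dominate a patient's decision, which is what the abstract proposes targeting with compliance interventions.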
Surface Signatures of an Underground Explosion as Captured by Photogrammetry
NASA Astrophysics Data System (ADS)
Schultz-Fellenz, E. S.; Sussman, A. J.; Swanson, E.; Coppersmith, R.; Cooley, J.; Rougier, E.; Larmat, C. S.; Norskog, K.
2016-12-01
This study employed high-resolution photogrammetric modeling to quantify cm-scale surface topographic changes resulting from a 5,000 kg underground chemical explosion. The test occurred in April 2016 at a depth of 76 m within a quartz monzonite intrusion in southern Nevada. The field area was a 210 m x 150 m polygon broadly centered on the explosion's emplacement hole. A grid of ground control points (GCPs) installed in the field area established control within the collection boundaries and ensured high-resolution digital model parameterization. Using RTK GPS techniques, GCP targets were surveyed in the days before and then again immediately after the underground explosion. A quadcopter UAS with a 12 MP camera payload captured overlapping imagery at two flight altitudes (10 m and 30 m AGL) along automated flight courses for consistency and repeatability. The overlapping imagery was used to generate two digital elevation models, pre-shot and post-shot, for each of the flight altitudes. Spatial analyses of the DEMs and orthoimagery show uplift on the order of 1 to 18 cm in the immediate area near ground zero. Other features such as alluvial fracturing appear in the photogrammetric and topographic datasets. Portions of the nearby granite outcrop experienced rock fall and rock rotation. The study detected erosional and depositional features on the test bed and adjacent to it. In addition to vertical change, pre-shot and post-shot surveys of the GCPs suggest evidence for lateral motion on the test bed surface, with movement away from surface ground zero on the order of 1 to 3 cm. Results demonstrate that the UAS photogrammetry method provides an efficient, high-fidelity, non-invasive means of quantifying surface deformation. The photogrammetry data allow quantification of permanent surface deformation and of the spatial extent of damage. These constraints are necessary to develop hydrodynamic and seismic models of explosions that can be verified against recorded seismic data.
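The DEM differencing at the heart of this kind of analysis is straightforward to sketch: subtract the pre-shot surface from the post-shot surface and inspect the per-cell change. The synthetic 5x5 grids below, with an invented 10 cm uplift dome and ~1 cm survey noise, stand in for the real photogrammetric DEMs.

```python
import numpy as np

def surface_change(dem_pre, dem_post):
    """Per-cell elevation change in metres; positive values indicate uplift."""
    return dem_post - dem_pre

# Synthetic 5x5 DEMs (metres): flat terrain near 1200 m with survey noise,
# plus 10 cm of uplift at the grid centre standing in for "ground zero".
rng = np.random.default_rng(0)
dem_pre = rng.normal(1200.0, 0.01, size=(5, 5))
uplift = np.zeros((5, 5))
uplift[2, 2] = 0.10
dem_post = dem_pre + uplift

dz = surface_change(dem_pre, dem_post)
peak = np.unravel_index(dz.argmax(), dz.shape)
print(f"max uplift: {dz.max() * 100:.1f} cm at cell {peak}")
```

In practice the hard part is what the abstract emphasizes: establishing cm-level geodetic control (the GCP grid) so that differences of this size are signal rather than registration error.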
An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring)
NASA Astrophysics Data System (ADS)
van den Heever, Lize; Marais, Neilen; Slabber, Martin
2016-08-01
This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal qualification testing of the Control And Monitoring (CAM) subsystem of the 64-dish MeerKAT radio telescope currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers, and automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT system engineers are very satisfied with the AQF results, and especially with the approach and process it enforces.
Integrated approach to multimodal media content analysis
NASA Astrophysics Data System (ADS)
Zhang, Tong; Kuo, C.-C. Jay
1999-12-01
In this work, we present a system for the automatic segmentation, indexing and retrieval of audiovisual data based on the combination of audio, visual and textual content analysis. The video stream is demultiplexed into audio, image and caption components. Then, a semantic segmentation of the audio signal based on audio content analysis is conducted, and each segment is indexed as one of the basic audio types. The image sequence is segmented into shots based on visual information analysis, and keyframes are extracted from each shot. Meanwhile, keywords are detected from the closed caption. Index tables are designed for both linear and non-linear access to the video. It is shown by experiments that the proposed methods for multimodal media content analysis are effective and that the integrated framework achieves satisfactory results for video information filtering and retrieval.
Embodied Interactions in Human-Machine Decision Making for Situation Awareness Enhancement Systems
2016-06-09
characterize differences in spatial navigation strategies in a complex task, the Traveling Salesman Problem (TSP). For the second year, we developed... visual processing, leading to better solutions for spatial optimization problems. I will develop a framework to determine which body expressions best... methods include systematic characterization of gestures during complex problem solving. Subject terms: embodied interaction, gestures, one-shot
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data
2017-01-01
Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects. PMID:28984823
Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data.
Falque, Raphael; Vidal-Calleja, Teresa; Miro, Jaime Valls
2017-10-06
Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects.
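To make the segmentation step concrete, the sketch below thresholds a synthetic one-dimensional wall-thickness trace and returns contiguous below-threshold runs as defect extents. This is a simplified stand-in: the paper's framework operates on deconvolved RFEC signals with a purpose-built detection feature, neither of which is reproduced here.

```python
import numpy as np

def segment_defects(signal, threshold):
    """Return (start, end) index pairs of contiguous samples below threshold."""
    below = signal < threshold
    edges = np.diff(below.astype(int))
    starts = list(np.where(edges == 1)[0] + 1)
    ends = list(np.where(edges == -1)[0] + 1)
    if below[0]:
        starts.insert(0, 0)
    if below[-1]:
        ends.append(len(signal))
    return list(zip(starts, ends))

# Synthetic wall-thickness trace (mm): two Gaussian dips standing in for
# corrosion pits on a nominal 6 mm pipe wall.
x = np.linspace(0, 10, 200)
signal = np.full_like(x, 6.0)
signal -= 2.0 * np.exp(-((x - 3) / 0.3) ** 2)   # defect 1
signal -= 1.5 * np.exp(-((x - 7) / 0.5) ** 2)   # defect 2

defects = segment_defects(signal, threshold=5.0)
print(len(defects), "defects:", defects)
```

Each (start, end) pair localizes a defect; in the real framework, the depth and shape within each segment are then extracted to quantify corrosion.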
Modeling Multiple Human-Automation Distributed Systems using Network-form Games
NASA Technical Reports Server (NTRS)
Brat, Guillaume
2012-01-01
The paper describes, at a high level, the network-form game framework (based on Bayesian networks and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.
DOT National Transportation Integrated Search
2018-01-07
Connected and automated vehicles (CAV) are poised to transform surface transportation systems in the United States. Near-term CAV technologies like cooperative adaptive cruise control (CACC) have the potential to deliver energy efficiency and air qua...
Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction
NASA Technical Reports Server (NTRS)
Ancel, Ersin; Shih, Ann T.
2015-01-01
The usage of automated systems in airliners has increased fuel efficiency, added extra capabilities, enhanced safety and reliability, and provided improved passenger comfort since their introduction in the late 1980s. However, the original automation benefits, including reduced flight crew workload, human errors, and training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework composed of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand, and training requirements, along with their interactions.
Besides flight crew deficiencies, automation system failures and anomalies of avionic systems are also incorporated. The resultant model helps simulate the emergence of automation-related issues in today's modern airliners from a top-down, generalized approach, which serves as a platform to evaluate NASA developed technologies
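The BBN formulation can be illustrated with a miniature two-parent network evaluated by exact enumeration. The structure and all probabilities below are invented for illustration; they are not taken from the FLAP model or the Hugin implementation.

```python
from itertools import product

# A toy two-parent Bayesian network (hypothetical probabilities):
# ModeConfusion -> Anomaly <- SkillDegradation
p_mode_confusion = 0.1
p_skill_degradation = 0.2
p_anomaly = {  # P(anomaly | mode_confusion, skill_degradation)
    (True, True): 0.60, (True, False): 0.30,
    (False, True): 0.25, (False, False): 0.02,
}

def marginal_anomaly():
    """Marginalize out the parents by summing over all joint configurations."""
    total = 0.0
    for mc, sd in product([True, False], repeat=2):
        p = (p_mode_confusion if mc else 1 - p_mode_confusion) * \
            (p_skill_degradation if sd else 1 - p_skill_degradation)
        total += p * p_anomaly[(mc, sd)]
    return total

print(f"P(automation-related anomaly) = {marginal_anomaly():.4f}")
```

A full BBN like FLAP scales this same computation to many interacting causal factors, which is why dedicated inference software such as Hugin is used rather than brute-force enumeration.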
A continuous dry 300 mK cooler for THz sensing applications.
Klemencic, G M; Ade, P A R; Chase, S; Sudiwala, R; Woodcraft, A L
2016-04-01
We describe and demonstrate the automated operation of a novel cryostat design that is capable of maintaining an unloaded base temperature of less than 300 mK continuously, without the need to recycle the gases within the final cold head, as is the case for conventional single-shot sorption-pumped ³He cooling systems. This closed dry system uses only 5 L of ³He gas, making it an economical alternative to traditional systems where a long hold time is required. During testing, a temperature of 365 mK was maintained under a constant 20 μW load, simulating the cooling requirement of a far-infrared camera.
We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...
Matsuda, Fumio; Nakabayashi, Ryo; Sawada, Yuji; Suzuki, Makoto; Hirai, Masami Y.; Kanaya, Shigehiko; Saito, Kazuki
2011-01-01
A novel framework for automated elucidation of metabolite structures in liquid chromatography–mass spectrometry metabolome data was constructed by integrating databases. High-resolution tandem mass spectra data automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied to the metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method. PMID:22645535
Planning treatment of ischemic heart disease with partially observable Markov decision processes.
Hauskrecht, M; Fraser, H
2000-03-01
Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead, they are very often dependent and interleaved over time. This is mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment and varying cost of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs) developed and used in the operations research, control theory and artificial intelligence communities is particularly suitable for modeling such a complex decision process. In this paper, we show how the POMDP framework can be used to model and solve the problem of the management of patients with ischemic heart disease (IHD), and demonstrate the modeling advantages of the framework over standard decision formalisms.
The Future of Library Automation in Schools.
ERIC Educational Resources Information Center
Anderson, Elaine
2000-01-01
Addresses the future of library automation programs for schools. Discusses requirements of emerging OPACs and circulation systems; the Schools Interoperability Framework (SIF), an industry initiatives to develop an open specification for ensuring that K-12 instructional and administrative software applications work together more effectively; home…
Initial Assessment and Modeling Framework Development for Automated Mobility Districts: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hou, Yi; Young, Stanley E; Garikapati, Venu
Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displaces private automobiles for day-to-day travel in dense activity districts. This paper examines a concept to displace privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). This paper reviews several such districts, including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet the mobility needs apart from private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed within the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of mobility, energy, and emissions impact anticipated with AMDs.
NASA Astrophysics Data System (ADS)
Zhang, Pengpeng
The Leksell Gamma Knife® (LGK) is a tool for providing accurate stereotactic radiosurgical treatment of brain lesions, especially tumors. Currently, the treatment planning team "forward" plans radiation treatment parameters while viewing a series of 2D MR scans. This primarily manual process is cumbersome and time-consuming because of the difficulty of visualizing the large search space for the radiation parameters (i.e., shot overlap, number, location, size, and weight). I hypothesize that a computer-aided "inverse" planning procedure that utilizes tumor geometry and treatment goals could significantly improve the planning process and therapeutic outcome of LGK radiosurgery. My basic observation is that the treatment team is best at identifying the location of the lesion and prescribing a lethal, yet safe, radiation dose. The treatment planning computer is best at determining both the 3D tumor geometry and the optimal LGK shot parameters necessary to deliver a desirable dose pattern to the tumor while sparing adjacent normal tissue. My treatment planning procedure asks the neurosurgeon to identify the tumor and critical structures in MR images and the oncologist to prescribe a tumoricidal radiation dose. Computer assistance begins with geometric modeling of the 3D tumor's medial axis properties, starting with a new algorithm, a Gradient-Phase Plot (G-P Plot) decomposition of the tumor object's medial axis. I have found that medial axis seeding, while insufficient in most cases to produce an acceptable treatment plan, greatly reduces the solution space for Guided Evolutionary Simulated Annealing (GESA) treatment plan optimization by specifying an initial estimate for shot number, size, and location, but not weight. These estimates are used to generate multiple initial plans, which become seed plans for GESA. The shot location and weight parameters evolve and compete in the GESA procedure.
The GESA objective function optimizes tumor irradiation (i.e., as close to the prescribed dose as possible) and minimizes normal tissue and critical structure damage. In tests of five patient data sets (4 acoustic neuromas and 1 meningioma), the G-P Plot/GESA-generated treatment plans improved conformality of the lethal dose to the tumor, required no human interaction, improved dose homogeneity, suggested use of fewer shots, and reduced treatment administration time.
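The inverse-planning idea above (seed shot parameters from geometry, then let an evolutionary/annealing search refine them against a dose objective) can be illustrated with a deliberately tiny 1D sketch. Everything here, including the Gaussian shot profiles, the cost terms, and the annealing schedule, is invented for illustration and is not the dissertation's G-P Plot/GESA implementation:

```python
import random
import math

def dose(xs, shots):
    """Total dose at points xs from Gaussian 'shots' given as (center, size, weight)."""
    return [sum(w * math.exp(-((x - c) / s) ** 2) for c, s, w in shots) for x in xs]

def objective(shots, tumor, normal, prescribed=1.0):
    """Penalize deviation from the prescribed dose inside the tumor plus any
    dose delivered to normal tissue (a toy stand-in for the GESA cost)."""
    return (sum((d - prescribed) ** 2 for d in dose(tumor, shots))
            + sum(d ** 2 for d in dose(normal, shots)))

def anneal(shots, tumor, normal, steps=2000, temp=1.0):
    """Crude simulated annealing over shot weights only; shot number, size, and
    location stay fixed, mirroring the medial-axis seeding described above."""
    random.seed(0)
    best = [list(s) for s in shots]
    best_cost = objective(best, tumor, normal)
    cur, cur_cost = [list(s) for s in best], best_cost
    for i in range(steps):
        t = temp * (1 - i / steps) + 1e-6          # cooling schedule
        cand = [list(s) for s in cur]
        j = random.randrange(len(cand))
        cand[j][2] = max(0.0, cand[j][2] + random.gauss(0, 0.1))  # perturb one weight
        cost = objective(cand, tumor, normal)
        if cost < cur_cost or random.random() < math.exp((cur_cost - cost) / t):
            cur, cur_cost = cand, cost
        if cur_cost < best_cost:
            best, best_cost = [list(s) for s in cur], cur_cost
    return best, best_cost

tumor = [x / 10 for x in range(-10, 11)]        # tumor spans [-1, 1]
normal = [x / 10 for x in range(12, 30)]        # normal tissue beyond a margin
seed_shots = [[-0.5, 0.5, 0.5], [0.5, 0.5, 0.5]]  # (center, size, weight) seeds
plan, cost = anneal(seed_shots, tumor, normal)
print(round(cost, 3))
```

The annealed cost can never exceed that of the seed plan, since the best plan found is retained throughout the search.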
Seismic refraction survey of the ANS preferred site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, R.K.; Hopkins, R.A.; Doll, W.E.
1992-02-01
Between September 19, 1991, and October 8, 1991, personnel from Martin Marietta Energy Systems, Inc. (Energy Systems), Automated Sciences Group, Inc., and Marrich, Inc. performed a seismic refraction survey at the Advanced Neutron Source (ANS) preferred site. The purpose of this survey was to provide estimates of top-of-rock topography, based on seismic velocities, and to delineate variations in rock and soil velocities. Forty-four seismic refraction spreads were shot to determine top-of-rock depths at 42 locations. Nine of the seismic spreads were shot with long offsets to provide 216 top-of-rock depths for 4 seismic refraction profiles. The refraction spread locations were based on the grid for the ANS Phase I drilling program. Interpretation of the seismic refraction data supports the assumption that the top-of-rock surface generally follows the local topography. The shallow top-of-rock interface interpreted from the seismic refraction data is also supported by limited drill information at the site. Some zones of anomalous data are present that could be the result of locally variable weathering, a localized variation in shale content, or depth to top-of-rock greater than the site norm.
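For readers unfamiliar with refraction surveying, the standard two-layer depth estimate behind such top-of-rock profiles can be sketched as follows; the velocities and intercept time below are illustrative placeholders, not values from the ANS survey:

```python
import math

def depth_from_intercept(t_i, v1, v2):
    """Two-layer intercept-time method: depth to the refractor from the
    intercept time t_i (s) and layer velocities v1 < v2 (m/s)."""
    return t_i * v1 * v2 / (2.0 * math.sqrt(v2 ** 2 - v1 ** 2))

def depth_from_crossover(x_cross, v1, v2):
    """Equivalent estimate from the crossover distance x_cross (m), where the
    refracted arrival overtakes the direct wave on the travel-time plot."""
    return 0.5 * x_cross * math.sqrt((v2 - v1) / (v2 + v1))

# Illustrative values only: ~600 m/s soil over ~3000 m/s rock,
# 20 ms intercept time from a fitted travel-time curve.
z = depth_from_intercept(0.02, 600.0, 3000.0)
print(round(z, 2))   # depth to top-of-rock in meters
```

Both formulas are algebraic rearrangements of the same two-layer model, so they agree when the intercept time and crossover distance come from the same fitted travel-time curve.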
Centroid stabilization in alignment of FOA corner cube: designing of a matched filter
NASA Astrophysics Data System (ADS)
Awwal, Abdul; Wilhelmsen, Karl; Roberts, Randy; Leach, Richard; Miller Kamm, Victoria; Ngo, Tony; Lowe-Webb, Roger
2015-02-01
The current automation of image-based alignment of NIF high-energy laser beams is providing the capability of executing multiple target shots per day. An important aspect of performing multiple shots in a day is reducing the additional time spent aligning specific beams due to perturbations in those beam images. One such alignment is beam centration through the second and third harmonic generating crystals in the final optics assembly (FOA), which employs two retro-reflecting corner cubes to represent the beam center. The FOA houses the frequency conversion crystals for third harmonic generation as the beams enter the target chamber. Beam-to-beam variations and systematic beam changes over time in the FOA corner-cube images can lead to a reduction in accuracy as well as increased convergence durations for the template-based centroid detector. This work presents a systematic approach to maintaining FOA corner-cube centroid templates so that stable position estimation is achieved, leading to fast convergence of the alignment control loops. In the matched-filtering approach, a template is designed from the most recent images taken over the last 60 days. The results show that the new filter reduces the divergence of the position estimates for FOA images.
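A template-based centroid estimator of the kind described can be sketched with plain normalized cross-correlation; the template shape and noise level below are invented for illustration, and a production system would use an optimized FFT-based correlator rather than this brute-force loop:

```python
import numpy as np

def centroid_by_template(image, template):
    """Locate a corner-cube-like feature by normalized cross-correlation,
    a minimal stand-in for a matched-filter position estimator."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * tn
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r + th // 2, c + tw // 2)
    return best_pos, best

rng = np.random.default_rng(1)
template = np.zeros((5, 5))
template[2, 2] = 1.0
template[1:4, 1:4] += 0.5                 # bright spot with a halo (invented shape)
image = rng.normal(0, 0.05, (32, 32))     # noisy background
image[10:15, 20:25] += template           # embed the feature; center lands at (12, 22)
pos, score = centroid_by_template(image, template)
print(pos)
```

Updating the template from recent imagery, as the abstract describes, amounts to refreshing `template` so the correlation peak stays sharp as the beam images drift.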
Phase reconstruction using compressive two-step parallel phase-shifting digital holography
NASA Astrophysics Data System (ADS)
Ramachandran, Prakash; Alex, Zachariah C.; Nelleri, Anith
2018-04-01
The linear relationship between the sample complex object wave and its approximated complex Fresnel field, obtained using single-shot parallel phase-shifting digital holography (PPSDH), is used in a compressive sensing (CS) framework, and accurate phase reconstruction is demonstrated. The accuracy of phase reconstruction with this method is shown to be better than that of the CS-adapted single-exposure on-line holography (SEOL) method. It is derived that the measurement model of the PPSDH method retains both the real and imaginary parts of the Fresnel field, albeit with approximation noise, whereas the measurement model of SEOL retains only the real part of the complex Fresnel field and its imaginary part is not available at all. Numerical simulations of CS-adapted PPSDH and CS-adapted SEOL demonstrate that phase reconstruction is accurate for CS-adapted PPSDH, which can therefore be used for single-shot digital holographic reconstruction.
Low Data Drug Discovery with One-Shot Learning.
Altae-Tran, Han; Ramsundar, Bharath; Pappu, Aneesh S; Pande, Vijay
2017-04-26
Recent advances in machine learning have made significant contributions to drug discovery. Deep neural networks in particular have been demonstrated to provide significant boosts in predictive power when inferring the properties and activities of small-molecule compounds (Ma, J. et al. J. Chem. Inf. 2015, 55, 263-274). However, the applicability of these techniques has been limited by the requirement for large amounts of training data. In this work, we demonstrate how one-shot learning can be used to significantly lower the amounts of data required to make meaningful predictions in drug discovery applications. We introduce a new architecture, the iterative refinement long short-term memory, that, when combined with graph convolutional neural networks, significantly improves learning of meaningful distance metrics over small-molecules. We open source all models introduced in this work as part of DeepChem, an open-source framework for deep-learning in drug discovery (Ramsundar, B. deepchem.io. https://github.com/deepchem/deepchem, 2016).
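The learned-distance-metric idea behind one-shot prediction can be sketched without any deep network: attend over a labeled support set with a softmax of similarities. The toy "fingerprints" below are random vectors, and the cosine metric stands in for the embeddings the paper learns with graph convolutions and its iterative refinement LSTM:

```python
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def one_shot_predict(query, support_x, support_y):
    """Matching-network-style prediction: softmax attention over a tiny
    labeled support set, then sum attention weights per class."""
    sims = np.array([cosine(query, x) for x in support_x])
    att = np.exp(sims) / np.exp(sims).sum()          # softmax attention
    classes = sorted(set(support_y))
    scores = {c: att[[i for i, y in enumerate(support_y) if y == c]].sum()
              for c in classes}
    return max(scores, key=scores.get)

# Toy "molecular fingerprints": one labeled example per class (the one-shot setting).
rng = np.random.default_rng(0)
active = rng.normal(1.0, 0.3, 16)
inactive = rng.normal(-1.0, 0.3, 16)
support_x, support_y = [active, inactive], ["active", "inactive"]
query = active + rng.normal(0, 0.2, 16)   # a noisy near-copy of the active example
print(one_shot_predict(query, support_x, support_y))
```

The paper's contribution is precisely in replacing the fixed cosine metric here with a metric learned end-to-end over molecular graphs.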
Chang, Hing-Chiu; Hui, Edward S; Chiu, Pui-Wai; Liu, Xiaoxi; Chen, Nan-Kuei
2018-05-01
A three-dimensional (3D) multiplexed sensitivity encoding and reconstruction (3D-MUSER) algorithm is proposed to reduce the aliasing artifacts and signal corruption caused by inter-shot 3D phase variations in 3D diffusion-weighted echo planar imaging (DW-EPI). 3D-MUSER extends the original framework of multiplexed sensitivity encoding (MUSE) to a hybrid k-space-based reconstruction, thereby enabling the correction of inter-shot 3D phase variations. A 3D single-shot EPI navigator echo was used to measure the inter-shot 3D phase variations. The performance of 3D-MUSER was evaluated by analyses of the point-spread function (PSF), signal-to-noise ratio (SNR), and artifact levels. The efficacy of phase correction using 3D-MUSER for different slab thicknesses and b-values was investigated. Simulations showed that 3D-MUSER could eliminate artifacts due to through-slab phase variation and reduce noise amplification due to SENSE reconstruction. All aliasing artifacts and signal corruption in 3D interleaved DW-EPI acquired with different slab thicknesses and b-values were reduced by the new algorithm. Near-whole-brain single-slab 3D DTI with 1.3-mm isotropic voxels acquired at 1.5 T was successfully demonstrated. 3D phase correction for 3D interleaved DW-EPI data is made possible by 3D-MUSER, thereby improving the feasible slab thickness and maximum feasible b-value. Magn Reson Med 79:2702-2712, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
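The core SENSE unfolding step that MUSE-style reconstructions build on can be sketched as a small least-squares solve; the coil sensitivities below are invented numbers, and 3D-MUSER's actual contribution (navigator-measured inter-shot 3D phase terms folded into a hybrid k-space model) is omitted:

```python
import numpy as np

def sense_unfold(aliased, sens):
    """Unfold SENSE-aliased pixels: aliased[c] = sum_k sens[c, k] * x[k].
    'aliased' has shape (ncoils,), 'sens' has shape (ncoils, R); returns the R
    unaliased pixel values by least squares. This is the per-pixel core that
    MUSE-style methods extend with shot-dependent phase terms."""
    x, *_ = np.linalg.lstsq(sens, aliased, rcond=None)
    return x

# Two coils, reduction factor R = 2: two true pixels fold onto one location.
sens = np.array([[1.0, 0.4],
                 [0.3, 1.0]])     # invented coil sensitivities at the two positions
truth = np.array([2.0, 5.0])      # the two underlying pixel values
aliased = sens @ truth            # what each coil measures at the folded location
print(np.allclose(sense_unfold(aliased, sens), truth))
```

The conditioning of `sens` governs the noise amplification (g-factor) that the abstract notes 3D-MUSER helps reduce.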
Understanding human management of automation errors
McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.
2013-01-01
Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042
Continuous integration for concurrent MOOSE framework and application development on GitHub
Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.; ...
2015-11-20
For the past several years, Idaho National Laboratory’s MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project’s development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process, are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.
Model and Interoperability using Meta Data Annotations
NASA Astrophysics Data System (ADS)
David, O.
2011-12-01
Software frameworks and architectures are in need of meta data to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information that is usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing meta data, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to meta data representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code.
Since models and modeling components are not directly bound to the framework by specific APIs and/or data types, they can more easily be reused both within the framework and outside of it, while a significant reduction in the size of the model source code was also achieved. To assess the benefit of annotations for a modeler, studies were conducted to compare an annotation-based framework approach with other modeling frameworks and libraries; a framework-invasiveness study evaluated the effects of framework design on model code quality. A typical hydrological model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. The use of annotations appears to positively impact several software quality measures. Experience to date has demonstrated the multi-purpose value of using annotations: they are a feasible and practical method to enable interoperability among models and modeling frameworks.
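The annotation concept can be illustrated in Python with decorators standing in for language-level annotations (OMS itself is Java-based; the names `describe`, `ins`, `outs`, and the runoff component here are all invented for illustration, not the OMS API):

```python
# Metadata lives on the component itself, and a tiny "framework" reads it
# instead of requiring the component to call framework APIs.
def describe(ins=(), outs=()):
    def wrap(cls):
        cls._ins, cls._outs = tuple(ins), tuple(outs)
        return cls
    return wrap

@describe(ins=("precip_mm", "temp_c"), outs=("runoff_mm",))
class RunoffComponent:
    def execute(self, state):
        # toy process: a fixed runoff coefficient applied to precipitation
        state["runoff_mm"] = 0.3 * state["precip_mm"]

def assemble(components):
    """Framework-side wiring check: every declared input must be produced
    upstream or supplied initially, derived purely from the annotations."""
    produced = {"precip_mm", "temp_c"}          # initially supplied data
    for comp in components:
        missing = set(comp._ins) - produced
        if missing:
            raise ValueError(f"unsatisfied inputs: {missing}")
        produced |= set(comp._outs)
    return produced

state = {"precip_mm": 10.0, "temp_c": 15.0}
comp = RunoffComponent()
assemble([comp])        # wiring validated from metadata alone
comp.execute(state)
print(state["runoff_mm"])
```

Note the component has no import of, or call into, the "framework": all integration knowledge sits in declarative metadata, which is the non-invasiveness the abstract argues for.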
Karampinos, Dimitrios C; Banerjee, Suchandrima; King, Kevin F; Link, Thomas M; Majumdar, Sharmila
2012-05-01
Previous studies have shown that skeletal muscle diffusion tensor imaging (DTI) can noninvasively probe changes in the muscle fiber architecture and microstructure in diseased and damaged muscles. However, DTI fiber reconstruction in small muscles and in muscle regions close to aponeuroses and tendons remains challenging because of partial volume effects. Increasing the spatial resolution of skeletal muscle single-shot diffusion-weighted echo planar imaging (DW-EPI) can be hindered by the inherently low signal-to-noise ratio (SNR) of muscle DW-EPI, because of the short muscle T2 and the high sensitivity of single-shot EPI to off-resonance effects and T2* blurring. In this article, eddy-current-compensated diffusion-weighted stimulated-echo preparation is combined with sensitivity encoding (SENSE) to maintain good SNR properties and to reduce the sensitivity to distortions and T2* blurring in high-resolution skeletal muscle single-shot DW-EPI. An analytical framework is developed to optimize the reduction factor and diffusion weighting time to achieve maximum SNR. Arguments for the selection of the experimental parameters are then presented considering the compromise between SNR, B0-induced distortions, T2* blurring effects and tissue incoherent motion effects. On the basis of the selected parameters in a high-resolution skeletal muscle single-shot DW-EPI protocol, imaging protocols at lower acquisition matrix sizes are defined with matched bandwidth in the phase-encoding direction and SNR. In vivo results show that high-resolution skeletal muscle DTI with minimized sensitivity to geometric distortions and T2* blurring is feasible using the proposed methodology. In particular, a significant benefit is demonstrated from a reduction in partial volume effects for resolving multi-pennate muscles and muscles with small cross-sections in calf muscle DTI. Copyright © 2011 John Wiley & Sons, Ltd.
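The diffusion-weighting quantities being traded off in such an optimization follow the Stejskal-Tanner relation; the gradient strength, timings, and diffusivity below are illustrative values, not the protocol parameters of the article:

```python
import math

GAMMA = 2 * math.pi * 42.577e6   # proton gyromagnetic ratio, rad/s/T

def b_value(G, delta, Delta):
    """Stejskal-Tanner diffusion weighting b = gamma^2 G^2 delta^2 (Delta - delta/3),
    returned in s/mm^2 when G is in T/m and times are in seconds."""
    return (GAMMA ** 2) * (G ** 2) * (delta ** 2) * (Delta - delta / 3) * 1e-6

def signal_attenuation(b, D):
    """Mono-exponential diffusion attenuation exp(-b D). With a stimulated-echo
    preparation, lengthening the diffusion (mixing) time trades T2 decay for
    slower T1 decay, which is the SNR lever the optimization exploits."""
    return math.exp(-b * D)

# Illustrative protocol: 40 mT/m gradients, 20 ms pulses, 50 ms separation.
b = b_value(G=0.04, delta=0.02, Delta=0.05)
print(round(b), round(signal_attenuation(b, 1.5e-3), 3))
```

With these invented numbers the b-value lands near the ~2000 s/mm^2 regime, where the exponential attenuation makes the baseline SNR of short-T2 muscle the limiting factor.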
Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.
Otero-Muras, Irene; Banga, Julio R
2017-07-21
In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaption, and fold-change detection, respectively.
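The central object in such a workflow, the Pareto (non-dominated) set, is easy to state concretely. The sketch below filters a toy list of candidate designs scored on two objectives to be minimized; the design points are invented and do not come from the paper's case studies:

```python
def pareto_front(points):
    """Return the non-dominated subset for minimization in every objective:
    a point is dropped if some other point is <= in all objectives
    (and is a different point, hence strictly better in at least one)."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical circuit designs scored as (protein cost, response time).
designs = [(3, 9), (5, 4), (7, 2), (6, 6), (8, 8)]
print(pareto_front(designs))
```

Here (6, 6) and (8, 8) are dominated by (5, 4), so the front is the set of best trade-offs the text describes: cheap-but-slow, balanced, and fast-but-costly designs.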
Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques
2016-10-01
The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. In contrast to other tools, GeNeDA is open-source online software based on existing tools used in microelectronics that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be achieved with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinatorial circuits. The results highlight the importance of the part library. This framework is then used for the design of a sequential circuit including a biological state machine.
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of the increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans control safety-critical functions. A new systems accident model is developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
SPECTIX, a PETAL+ X-ray spectrometer: design, calibration and preliminary tests
NASA Astrophysics Data System (ADS)
Reverdin, C.; Bastiani, S.; Batani, D.; Brambrink, E.; Boutoux, G.; Duval, A.; Hulin, S.; Jakubowska, K.; Koenig, M.; Lantuéjoul-Thfoin, I.; Lecherbourg, L.; Szabo, C. I.; Vauzour, B.
2018-01-01
The present article describes the design, calibration and preliminary tests of the X-ray transmission crystal spectrometer SPECTIX (Spectromètre PEtal à Cristaux en Transmission X), built in the framework of the PETAL (PETawatt Aquitaine Laser) project and located in the Laser MégaJoule (LMJ) facility [1,2]. SPECTIX aims at characterizing the hard X-ray Kα emission generated by the interaction of the PETAL ps ultra-high-energy laser with a target. The broad spectral range covered by this spectrometer (7 to 150 keV) is achieved by using two measurement channels composed of two distinct crystals. Owing to the harsh environment experienced by the spectrometer during a LMJ-PETAL shot, passive detection with image plates is used. Shielding has been dimensioned to protect the detector against PETAL shot products. It includes a magnetic dipole to remove electrons entering the spectrometer, a 20 mm thick tungsten frontal collimation and a 6 mm thick lead housing. The SPECTIX performance, including the shielding efficiency, has been tested during an experimental campaign performed at the PICO 2000 laser facility at LULI. Improvements inferred from these tests are currently being implemented. Full commissioning of SPECTIX is planned on PETAL shots at the end of 2017.
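The energy selection of a transmission crystal spectrometer follows the Bragg condition; the lattice spacing and angle below are illustrative placeholders, not the actual SPECTIX crystal parameters:

```python
import math

def bragg_energy_kev(d_angstrom, theta_deg, order=1):
    """Bragg condition n*lambda = 2 d sin(theta), expressed as a photon
    energy via E[keV] = 12.398 / lambda[Angstrom]."""
    lam = 2.0 * d_angstrom * math.sin(math.radians(theta_deg)) / order
    return 12.398 / lam

# Illustrative only: a 2 Angstrom lattice spacing diffracting at a ~2.3 degree
# grazing angle selects photons near the hard-X-ray end of a 7-150 keV band.
print(round(bragg_energy_kev(2.0, 2.3), 1))
```

The small-angle behavior is why hard X-rays (tens of keV) require shallow diffraction angles, and why two crystals with different spacings are a natural way to tile a broad band like 7 to 150 keV.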
NASA Astrophysics Data System (ADS)
Huang, T.; Alarcon, C.; Quach, N. T.
2014-12-01
Capture, curation, and analysis are the typical activities performed at any given Earth Science data center. Modern data management systems must be adaptable to heterogeneous science data formats, scalable to meet a mission's quality-of-service requirements, and able to manage the life cycle of any given science data product. Designing a scalable data management system doesn't happen overnight; it takes countless hours of refining, refactoring, retesting, and re-architecting. The Horizon data management and workflow framework, developed at the Jet Propulsion Laboratory, is a portable, scalable, and reusable framework for developing high-performance data management and product generation workflow systems to automate data capture, data curation, and data analysis activities. NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC)'s Data Management and Archive System (DMAS) is its core data infrastructure, handling the capture and distribution of hundreds of thousands of satellite observations each day around the clock. DMAS is an application of the Horizon framework. The NASA Global Imagery Browse Services (GIBS) is NASA's Earth Observing System Data and Information System (EOSDIS) solution for making high-resolution global imagery available to the science communities. The Imagery Exchange (TIE), another application of the Horizon framework, is a core subsystem of GIBS responsible for data capture and imagery generation automation in support of EOSDIS' 12 distributed active archive centers and 17 Science Investigator-led Processing Systems (SIPS). This presentation discusses our ongoing effort in refining, refactoring, retesting, and re-architecting the Horizon framework to enable data-intensive science and its applications.
Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach
Chen, Yuche; Gonder, Jeffrey; Young, Stanley; ...
2017-11-06
Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) an historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of the conventional and advanced vehicle fleets. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging, with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.
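The accounting at the heart of such a framework reduces to fuel = VMT / efficiency, with automation scaling both terms. The sketch below shows why the sign of the net impact depends on which multiplier wins; all numbers are invented placeholders, not the study's inputs or results:

```python
def fleet_fuel_gal(vmt_by_road, mpg_base, eff_multiplier, vmt_multiplier):
    """Toy fuel accounting: fuel = VMT / MPG per road type, with automation
    scaling efficiency (eff_multiplier > 1 helps) and travel demand
    (vmt_multiplier > 1 hurts, capturing induced demand)."""
    return sum(
        vmt * vmt_multiplier / (mpg_base[road] * eff_multiplier[road])
        for road, vmt in vmt_by_road.items()
    )

vmt = {"urban": 2.0e12, "highway": 1.0e12}      # miles/year (illustrative)
mpg = {"urban": 22.0, "highway": 30.0}
baseline = fleet_fuel_gal(vmt, mpg, {"urban": 1.0, "highway": 1.0}, 1.0)
# An "optimistic" case: big urban efficiency gains, modest induced demand.
optimistic = fleet_fuel_gal(vmt, mpg, {"urban": 1.8, "highway": 1.3}, 1.05)
print(round(100 * (1 - optimistic / baseline)))  # percent fuel saved
```

Because the urban fleet here both burns more fuel per mile and gets the larger efficiency multiplier, most of the savings come from urban roads, echoing the paper's second finding.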
A Framework for Automated Marmoset Vocalization Detection And Classification
2016-09-08
... a recent push to automate vocalization monitoring in a range of mammals. Such efforts have been used to classify bird songs [11], African elephant (Loxodonta africana) vocalizations [12], and killer whale calls [13].
Adaptive Automation Design and Implementation
2015-09-17
... Case Study: Space Navigator. This section demonstrates the player modeling paradigm, focusing specifically on the response generation section of the player ... human-machine system, a real-time player modeling framework for imitating a specific person's task performance, and the Adaptive Automation System ... Clustering-Based Real-Time Player Modeling ...
In-camera automation of photographic composition rules.
Banerjee, Serene; Evans, Brian L
2007-07-01
At the time of image acquisition, professional photographers apply many rules of thumb to improve the composition of their photographs. This paper develops a joint optical-digital processing framework for automating composition rules during image acquisition, for photographs with one main subject. Within the framework, we automate three photographic composition rules: repositioning the main subject, making the main subject more prominent, and making objects that merge with the main subject less prominent. The idea is to provide to the user, in addition to the original picture, alternate pictures obtained by applying photographic composition rules. The proposed algorithms do not depend on prior knowledge of the indoor/outdoor setting or scene content, and they are designed to be amenable to software implementation on the fixed-point programmable digital signal processors available in digital still cameras.
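The first of the three rules, repositioning the main subject, can be sketched as choosing a crop that places the subject on a rule-of-thirds power point. The clamping and distance heuristic below are invented simplifications, not the paper's algorithm:

```python
def thirds_crop(img_w, img_h, subj_x, subj_y, crop_w, crop_h):
    """Choose a crop window that moves the main subject onto the nearest
    rule-of-thirds power point; the window is clamped to stay in the frame."""
    best, best_dist = None, float("inf")
    # candidate power points at 1/3 and 2/3 of the crop, both axes
    for fx in (1 / 3, 2 / 3):
        for fy in (1 / 3, 2 / 3):
            left = subj_x - fx * crop_w
            top = subj_y - fy * crop_h
            left = min(max(left, 0), img_w - crop_w)   # clamp inside the frame
            top = min(max(top, 0), img_h - crop_h)
            # how close does the clamped window put the subject to the point?
            dist = (abs(subj_x - (left + fx * crop_w))
                    + abs(subj_y - (top + fy * crop_h)))
            if dist < best_dist:
                best, best_dist = (round(left), round(top)), dist
    return best

# 600x400 frame, subject dead-center, 450x300 crop: the subject moves off-center
# onto a thirds intersection of the cropped picture.
print(thirds_crop(600, 400, 300, 200, 450, 300))
```

A real system would combine this with subject detection and re-prominence rules, but the geometry of the repositioning rule is just this small search.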
The architecture of a modern military health information system.
Mukherji, Raj J; Egyhazy, Csaba J
2004-06-01
This article describes a melding of a government-sponsored architecture for complex systems with the open systems engineering architecture developed by the Institute of Electrical and Electronics Engineers (IEEE). Our experience using these two architectures to build a complex healthcare system is described. The work shows that it is possible to combine the two architectural frameworks in describing the systems, operational, and technical views of a complex automation system. The advantage of combining them lies in the simplicity of implementation and the ease with which medical professionals can understand the automation system's architectural elements.
Parallels in Computer-Aided Design Framework and Software Development Environment Efforts.
1992-05-01
... design kits, and tool and design management frameworks. Also, books about software engineering environments [Long 91] and electronic design ... tool integration [Zarrella 90], and agreement upon a universal design automation framework, such as the CAD Framework Initiative (CFI) [Malasky 91] ... ments: identification, control, status accounting, and audit and review. The paper by Dart extracts 15 CM concepts from existing SDEs and tools
Centroid stabilization for laser alignment to corner cubes: designing a matched filter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Awwal, Abdul A. S.; Bliss, Erlan; Brunton, Gordon
2016-11-08
Automation of image-based alignment of National Ignition Facility high energy laser beams is providing the capability of executing multiple target shots per day. One important alignment is beam centration through the second and third harmonic generating crystals in the final optics assembly (FOA), which employs two retroreflecting corner cubes as centering references for each beam. Beam-to-beam variations and systematic beam changes over time in the FOA corner cube images can lead to a reduction in accuracy as well as increased convergence durations for the template-based position detector. A systematic approach is described that maintains FOA corner cube templates and guarantees stable position estimation.
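The template-based position detector referenced above correlates a stored corner-cube template against live imagery. A minimal sketch of that idea, using plain normalized cross-correlation on a synthetic image (the data and function names here are illustrative, not NIF code):

```python
import numpy as np

def template_position(image, template):
    """Locate a template in an image via normalized cross-correlation.

    Returns the (row, col) of the best-matching top-left corner.
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum()) * t_norm
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Embed a bright 3x3 blob at (5, 7) in a noisy background.
rng = np.random.default_rng(0)
image = rng.normal(0.0, 0.1, (16, 16))
blob = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
image[5:8, 7:10] += blob
print(template_position(image, blob))  # best match at (5, 7)
```

Maintaining templates over time, as the abstract describes, would amount to periodically re-extracting the patch at the detected position and blending it into the stored template.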
A continuous dry 300 mK cooler for THz sensing applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klemencic, G. M., E-mail: Georgina.Klemencic@astro.cf.ac.uk; Ade, P. A. R.; Sudiwala, R.
We describe and demonstrate the automated operation of a novel cryostat design that is capable of maintaining an unloaded base temperature of less than 300 mK continuously, without the need to recycle the gases within the final cold head, as is the case for conventional single shot sorption pumped ³He cooling systems. This closed dry system uses only 5 l of ³He gas, making this an economical alternative to traditional systems where a long hold time is required. During testing, a temperature of 365 mK was maintained with a constant 20 μW load, simulating the cooling requirement of a far infrared camera.
Misty picture weather-watch and microbarograph project: Experiments 9412-14-18
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, J.W.; Church, H.W.; Huck, T.W.
1987-01-01
Special meteorological observations and predictions for MISTY PICTURE are described. Ground zero measurements of winds and temperatures were used to develop predictions for needed light winds during the night for deployment of the helium bag for the precursor experiment. This also entailed correlations with the White Sands network of automated surface observation stations as well as general circulation and upper air reports from the regional synoptic weather observing and reporting network. Pilot balloon observations of upper winds and Tethersonde observations were made during bag deployment to further document local circulation developments. During the test countdown, radiosonde balloon observations of upper air temperatures and winds were made to allow prediction of atmospheric effects on airblast propagation that could break windows to nearly 200 km range from the MISTY PICTURE explosion yield. These data indicated that there would be no strong off-site propagations on shot day, but at shot time the weak convergence zone in the shot area disturbed the wind pattern and generated a northwestward sound duct. Some banded airblast focusing resulted that gave relatively high overpressures just south of the Admin Park, at the Observer's Area, and in San Antonio where a number of windows were claimed broken. Relatively weak blasts, between caustics or foci, were recorded by microbarographs at Admin Park, Stallion, and Socorro. Very weak and barely detectable waves were propagated eastward to Carrizozo where MINOR SCALE had broken windows in 1985, and to the southeast toward Tularosa and Alamogordo. Five microbarograph stations were also operated around the west side of a 200 km radius circle, to document airblast waves ducted and focused by relatively high temperatures and easterly monsoon winds near 50 km altitudes. 15 refs., 39 figs., 16 tabs.
Kevlar: Transitioning Helix for Research to Practice
2016-03-01
entropy randomization techniques, automated program repairs leveraging highly-optimized virtual machine technology, and developing a novel framework...attacker from exploiting residual vulnerabilities in a wide variety of classes. Helix/Kevlar uses novel, fine-grained, high-entropy diversification...the Air Force, and IARPA). Salient features of Helix/Kevlar include developing high-entropy randomization techniques, automated program repairs
DOT National Transportation Integrated Search
2017-09-01
A number of Connected and/or Automated Vehicle (CAV) applications have recently been designed to improve the performance of our transportation system. Safety, mobility and environmental sustainability are three cornerstone performance metrics when ev...
Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram
2016-01-01
Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
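The noise-power spectrum estimation recommended by ICRU Report 87 can be sketched on synthetic uniform ROIs. This is a simplified estimator (mean subtraction instead of the 2D polynomial detrending often used in practice, and assumed normalization), not the framework's actual implementation:

```python
import numpy as np

def nps_2d(rois, pixel_spacing=1.0):
    """Estimate the 2D noise-power spectrum from uniform-region ROIs.

    Each ROI is detrended by subtracting its mean; squared DFT magnitudes
    are averaged over ROIs and scaled by pixel area over ROI size.
    """
    n = rois[0].shape[0]
    acc = np.zeros((n, n))
    for roi in rois:
        detrended = roi - roi.mean()
        acc += np.abs(np.fft.fft2(detrended)) ** 2
    return acc / len(rois) * (pixel_spacing ** 2) / (n * n)

def radial_average(nps):
    """Collapse a 2D NPS to a radially averaged 1D profile."""
    n = nps.shape[0]
    fy, fx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
    r = np.sqrt(fx ** 2 + fy ** 2)
    idx = np.round(r * n).astype(int)          # integer radial bin per frequency
    counts = np.bincount(idx.ravel())
    sums = np.bincount(idx.ravel(), weights=nps.ravel())
    return sums / np.maximum(counts, 1)

rng = np.random.default_rng(1)
rois = [rng.normal(100.0, 5.0, (64, 64)) for _ in range(16)]
nps = nps_2d(rois)
profile = radial_average(nps)
dfx = 1.0 / 64  # frequency sampling interval for 64 pixels at unit spacing
# Parseval sanity check: integrating the NPS recovers the noise variance (~5**2).
print(nps.sum() * dfx * dfx)
```

For white noise the radial profile is flat; a real CT NPS shows the shaping imposed by the reconstruction kernel.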
An Automated Design Framework for Multicellular Recombinase Logic.
Guiziou, Sarah; Ulliana, Federico; Moreau, Violaine; Leclere, Michel; Bonnet, Jerome
2018-05-18
Tools to systematically reprogram cellular behavior are crucial to address pressing challenges in manufacturing, environment, or healthcare. Recombinases can very efficiently encode Boolean and history-dependent logic in many species, yet current designs are performed on a case-by-case basis, limiting their scalability and requiring time-consuming optimization. Here we present an automated workflow for designing recombinase logic devices executing Boolean functions. Our theoretical framework uses a reduced library of computational devices distributed into different cellular subpopulations, which are then composed in various manners to implement all desired logic functions at the multicellular level. Our design platform called CALIN (Composable Asynchronous Logic using Integrase Networks) is broadly accessible via a web server, taking truth tables as inputs and providing corresponding DNA designs and sequences as outputs (available at http://synbio.cbs.cnrs.fr/calin ). We anticipate that this automated design workflow will streamline the implementation of Boolean functions in many organisms and for various applications.
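The core idea of distributing logic across cellular subpopulations can be illustrated with a toy decomposition. This is not CALIN's reduced-library algorithm, just a sketch of the OR-of-minterms principle at the multicellular level:

```python
from itertools import product

def decompose_truth_table(n_inputs, truth):
    """Split a Boolean function into single-population subfunctions.

    `truth` maps each input tuple to 0/1. Each ON minterm becomes one
    cellular subpopulation expressing the output only for that exact
    input combination; the multicellular output is the OR (union) of
    the subpopulations' outputs.
    """
    return [combo for combo in product((0, 1), repeat=n_inputs) if truth[combo]]

def multicellular_output(populations, inputs):
    """A population fires iff the inputs match its assigned minterm."""
    return int(any(inputs == p for p in populations))

# XOR of two inputs: needs two subpopulations, (0,1) and (1,0).
xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
populations = decompose_truth_table(2, xor)
print(populations)  # [(0, 1), (1, 0)]
assert all(multicellular_output(populations, i) == xor[i] for i in xor)
```

CALIN's contribution is precisely to avoid this naive one-population-per-minterm blow-up by composing a reduced library of recombinase devices.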
Man-Robot Symbiosis: A Framework For Cooperative Intelligence And Control
NASA Astrophysics Data System (ADS)
Parker, Lynne E.; Pin, Francois G.
1988-10-01
The man-robot symbiosis concept has the fundamental objective of bridging the gap between fully human-controlled and fully autonomous systems to achieve true man-robot cooperative control and intelligence. Such a system would allow improved speed, accuracy, and efficiency of task execution, while retaining the man in the loop for innovative reasoning and decision-making. The symbiont would have capabilities for supervised and unsupervised learning, allowing an increase of expertise in a wide task domain. This paper describes a robotic system architecture facilitating the symbiotic integration of teleoperative and automated modes of task execution. The architecture reflects a unique blend of many disciplines of artificial intelligence into a working system, including job or mission planning, dynamic task allocation, man-robot communication, automated monitoring, and machine learning. These disciplines are embodied in five major components of the symbiotic framework: the Job Planner, the Dynamic Task Allocator, the Presenter/Interpreter, the Automated Monitor, and the Learning System.
A Human-Autonomy Teaming Approach for a Flight-Following Task
NASA Technical Reports Server (NTRS)
Brandt, Summer L.; Lachter, Joel; Russell, Ricky; Shively, R. Jay
2017-01-01
Human involvement with increasingly autonomous systems must adjust to allow for a more dynamic relationship involving cooperation and teamwork. As part of an ongoing project to develop a framework for human autonomy teaming (HAT) in aviation, a study was conducted to evaluate proposed tenets of HAT. Participants performed a flight-following task at a ground station both with and without HAT features enabled. Overall, participants preferred the ground station with HAT features enabled over the station without the HAT features. Participants reported that the HAT displays and automation were preferred for keeping up with operationally important issues. Additionally, participants reported that the HAT displays and automation provided enough situation awareness to complete the task, reduced the necessary workload and were efficient. Overall, there was general agreement that HAT features supported teaming with the automation. These results will be used to refine and expand our proposed framework for human-autonomy teaming.
NASA Astrophysics Data System (ADS)
Chęciński, Jakub; Frankowski, Marek
2016-10-01
We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions that complement it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.
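Automated Mif generation of this kind can be sketched as simple templating over a parameter sweep. The fragment below is a minimal, hypothetical MIF 2.1 snippet with invented parameter names, not MAGE's actual output:

```python
# Hypothetical parameter names; real MAGE-generated Mifs are far more involved.
MIF_TEMPLATE = """# MIF 2.1
Specify Oxs_BoxAtlas:atlas {{
  xrange {{0 {length}}}
  yrange {{0 {width}}}
  zrange {{0 {thickness}}}
}}
Specify Oxs_RectangularMesh:mesh {{
  cellsize {{{cell} {cell} {cell}}}
  atlas :atlas
}}
"""

def write_mif(path, **params):
    """Fill the template and write one simulation configuration file."""
    with open(path, "w") as fh:
        fh.write(MIF_TEMPLATE.format(**params))

# Sweep a geometry parameter, writing one Mif per value, as in automated sweeps.
for i, length in enumerate((100e-9, 200e-9)):
    write_mif(f"sim_{i}.mif", length=length, width=50e-9,
              thickness=5e-9, cell=5e-9)
print(open("sim_0.mif").readline().strip())  # "# MIF 2.1"
```

A GUI front end like MAGE's essentially collects these parameters from form fields instead of keyword arguments.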
Machine learning of network metrics in ATLAS Distributed Data Management
NASA Astrophysics Data System (ADS)
Lassnig, Mario; Toler, Wesley; Vamosi, Ralf; Bogado, Joaquin; ATLAS Collaboration
2017-10-01
The increasing volume of physics data poses a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously, many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from one of our ongoing automation efforts that focuses on network metrics. First, we describe our machine learning framework built atop the ATLAS Analytics Platform. This framework can automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.
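Forecasting a network metric from its recent history can be sketched with a least-squares autoregression. This is a deliberately lightweight stand-in for the heavier models trained on the ATLAS Analytics Platform, on a synthetic periodic throughput trace:

```python
import numpy as np

def fit_lagged_model(series, lags=3):
    """Least-squares autoregression: predict the next value from the
    `lags` previous observations plus an intercept."""
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    y = series[lags:]
    coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
    return coef

def forecast(series, coef, lags=3):
    """One-step-ahead forecast from the last `lags` observations."""
    return float(np.r_[series[-lags:], 1.0] @ coef)

# Synthetic throughput trace (MB/s) with a daily-like periodic pattern.
t = np.arange(200)
series = 100 + 20 * np.sin(2 * np.pi * t / 24)
coef = fit_lagged_model(series)
print(forecast(series, coef))
```

A sinusoid plus offset satisfies an exact linear recurrence, so the fitted model reproduces the next value almost perfectly; real transfer metrics are noisier and motivate the richer feature sets described above.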
Automated Fluid Feature Extraction from Transient Simulations
NASA Technical Reports Server (NTRS)
Haimes, Robert
1998-01-01
In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of much interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: shocks; vortex cores; regions of recirculation; boundary layers; wakes.
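Automated flagging of areas of potential interest can be sketched with a crude vorticity threshold on a 2D velocity field. Real core-detection criteria (Q-criterion, lambda-2) are more robust; treat this as illustrative only:

```python
import numpy as np

def high_vorticity_mask(u, v, dx=1.0, threshold=1.0):
    """Flag candidate vortex regions where |dv/dx - du/dy| exceeds a
    threshold, pointing the investigator at areas worth inspecting."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dx, axis=0)
    return np.abs(dvdx - dudy) > threshold

# Solid-body rotation about the grid center: vorticity = 2*omega everywhere,
# so with omega = 1 every cell exceeds the threshold of 1.
y, x = np.mgrid[-5:6, -5:6].astype(float)
u, v = -y, x           # omega = 1 rotation
mask = high_vorticity_mask(u, v)
print(mask.all())  # True
```

The cheapness of such scalar criteria is what makes them viable in co-processing environments where the visualization must not slow down the solver.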
NASA Astrophysics Data System (ADS)
Umbu Kondi Maliwemu, Erich; Malau, Viktor; Iswanto, Priyo Tri
2018-01-01
Shot peening is a mechanical surface treatment with a beneficial effect: it generates compressive residual stress through plastic deformation on the surface of a material. This plastic deformation can improve the surface characteristics of metallic materials, such as surface morphology, surface roughness, and surface hardness. The objective of this study is to investigate the effect of shot peening at different shot distances and shot angles on the surface morphology, surface roughness, and surface hardness of 316L biomaterial. Shot distance was varied at 6, 8, 10, and 12 cm and shot angle at 30, 60, and 90°, with a working pressure of 7 kg/cm², a shot duration of 20 minutes, and S-170 steel balls with a diameter of 0.6 mm. The results show that the shot distance and shot angle of shot peening have a significant effect on the surface morphology, surface roughness, and surface hardness of 316L biomaterial. Shot peening increases the surface roughness with increasing shot distance and with decreasing shot angle. The nearest shot distance (6 cm) and the largest shot angle (90°) give the best results on grain refinement, with a surface roughness of 1.04 μm and a surface hardness of 534 kg/mm².
Study on Plastic Deformation Characteristics of Shot Peening of Ni-Based Superalloy GH4079
NASA Astrophysics Data System (ADS)
Zhong, L. Q.; Liang, Y. L.; Hu, H.
2017-09-01
In this paper, an X-ray stress diffractometer, a surface roughness tester, a field emission scanning electron microscope (SEM), and a dynamic ultra-small microhardness tester were used to measure the changes in surface residual stress, roughness, topography, and surface hardness of GH4079 superalloy processed by metallographic grinding, turning, metallographic grinding + shot peening, and turning + shot peening. The effects of shot peening parameters on shot peening plastic deformation features were analyzed, as were the effects of the surface state before shot peening on the plastic deformation characteristics. Results show that the surface residual compressive stress, surface roughness, and surface hardness of GH4079 superalloy were increased by shot peening; in addition, the increments in surface residual compressive stress, surface roughness, and surface hardness induced by shot peening increased with increasing shot peening intensity, shot peening time, shot peening pressure, and shot hardness, but the hardened layer depth was not affected considerably. The greater the degree of plastic deformation in the surface state before shot peening, the smaller the increments in surface residual compressive stress, surface roughness, and surface hardness induced by shot peening.
Araki, Tadashi; Kumar, P Krishna; Suri, Harman S; Ikeda, Nobutaka; Gupta, Ajay; Saba, Luca; Rajan, Jeny; Lavra, Francesco; Sharma, Aditya M; Shafique, Shoaib; Nicolaides, Andrew; Laird, John R; Suri, Jasjit S
2016-07-01
The degree of stenosis in the carotid artery can be predicted using automated carotid lumen diameter (LD) measured from B-mode ultrasound images. Systolic velocity-based methods for measurement of LD are subjective. With the advancement of high-resolution imaging, image-based methods have started to emerge. However, they require robust image analysis for accurate LD measurement. This paper presents two different algorithms for automated segmentation of the lumen borders in carotid ultrasound images. Both algorithms are modeled as a two-stage process. Stage one consists of a global-based model using a scale-space framework for the extraction of the region of interest. This stage is common to both algorithms. Stage two is modeled using a local-based strategy that extracts the lumen interfaces. At this stage, algorithm-1 is modeled as a region-based strategy using a classification framework, whereas algorithm-2 is modeled as a boundary-based approach that uses the level set framework. Two databases (DB), a Japan DB (JDB) (202 patients, 404 images) and a Hong Kong DB (HKDB) (50 patients, 300 images), were used in this study. Two trained neuroradiologists performed manual LD tracings. The mean automated LD was 6.35 ± 0.95 mm for JDB and 6.20 ± 1.35 mm for HKDB. The precision-of-merit was 97.4% and 98.0% w.r.t. the two manual tracings for JDB, and 99.7% and 97.9% w.r.t. the two manual tracings for HKDB. Statistical tests such as ANOVA, Chi-squared, t-test, and Mann-Whitney test were conducted to show the stability and reliability of the automated techniques.
DESCQA: Synthetic Sky Catalog Validation Framework
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph
2018-04-01
The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.
Brandes, Susanne; Mokhtari, Zeinab; Essig, Fabian; Hünniger, Kerstin; Kurzai, Oliver; Figge, Marc Thilo
2015-02-01
Time-lapse microscopy is an important technique to study the dynamics of various biological processes. The labor-intensive manual analysis of microscopy videos is increasingly replaced by automated segmentation and tracking methods. These methods are often limited to certain cell morphologies and/or cell stainings. In this paper, we present an automated segmentation and tracking framework that does not have these restrictions. In particular, our framework handles highly variable cell shapes and does not rely on any cell stainings. Our segmentation approach is based on a combination of spatial and temporal image variations to detect moving cells in microscopy videos. This method yields a sensitivity of 99% and a precision of 95% in object detection. The tracking of cells consists of different steps, starting from single-cell tracking based on a nearest-neighbor approach, through detection of cell-cell interactions and splitting of cell clusters, to finally combining tracklets using methods from graph theory. The segmentation and tracking framework was applied to synthetic as well as experimental datasets with varying cell densities implying different numbers of cell-cell interactions. We established a validation framework to measure the performance of our tracking technique. The cell tracking accuracy was found to be >99% for all datasets indicating a high accuracy for connecting the detected cells between different time points. Copyright © 2014 Elsevier B.V. All rights reserved.
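The nearest-neighbor tracking step can be sketched as greedy matching of cell centroids between consecutive frames; the cluster-splitting and graph-based tracklet joining described in the abstract are omitted here:

```python
import math

def link_frames(prev, curr, max_dist=10.0):
    """Greedy nearest-neighbor linking of cell centroids between two frames.

    Returns a list of (prev_index, curr_index) matches, closest pairs
    first; cells in `curr` left unmatched would start new tracklets.
    """
    pairs = sorted(
        (math.dist(p, c), i, j)
        for i, p in enumerate(prev)
        for j, c in enumerate(curr)
    )
    used_p, used_c, links = set(), set(), []
    for d, i, j in pairs:
        if d <= max_dist and i not in used_p and j not in used_c:
            links.append((i, j))
            used_p.add(i)
            used_c.add(j)
    return links

prev = [(0.0, 0.0), (20.0, 20.0)]
curr = [(21.0, 19.0), (1.0, 1.0), (50.0, 50.0)]  # third cell is new
print(link_frames(prev, curr))  # [(0, 1), (1, 0)]
```

The `max_dist` gate is what prevents a disappearing cell from being spuriously linked to a distant newcomer.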
Modeling of prepregs during automated draping sequences
NASA Astrophysics Data System (ADS)
Krogh, Christian; Glud, Jens A.; Jakobsen, Johnny
2017-10-01
The behavior of woven prepreg fabric during automated draping sequences is investigated. A drape tool under development with an arrangement of grippers facilitates the placement of a woven prepreg fabric in a mold. It is essential that the draped configuration is free from wrinkles and other defects. The present study aims at setting up a virtual draping framework capable of modeling the draping process from the initial flat fabric to the final double-curved shape, and at assisting the development of an automated drape tool. The virtual draping framework consists of a kinematic mapping algorithm used to generate target points on the mold which are used as input to a draping sequence planner. The draping sequence planner prescribes the displacement history for each gripper in the drape tool and these displacements are then applied to each gripper in a transient model of the draping sequence. The model is based on a transient finite element analysis with the material's constitutive behavior currently being approximated as linear elastic orthotropic. In-plane tensile and bias-extension tests as well as bending tests are conducted and used as input for the model. The virtual draping framework shows good potential for obtaining a better understanding of the drape process and guiding the development of the drape tool. However, results obtained from using the framework on a simple test case indicate that the generation of draping sequences is non-trivial.
Launch Control System Software Development System Automation Testing
NASA Technical Reports Server (NTRS)
Hwang, Andrew
2017-01-01
The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and Space Launch System, the next generation manned rocket currently in development. This system requires high quality testing that will measure and test the capabilities of the system. For the past two years, the Exploration and Operations Division at Kennedy Space Center (KSC) has assigned a group including interns and full-time engineers to develop automated tests to save the project time and money. The team worked on automating the testing process for the SCCS GUI that would use streamed simulated data from the testing servers to produce data, plots, statuses, etc. to the GUI. The software used to develop automated tests included an automated testing framework and an automation library. The automated testing framework has a tabular-style syntax, which means the functionality of a line of code must have the appropriate number of tabs for the line to function as intended. The header section contains either paths to custom resources or the names of libraries being used. The automation library contains functionality to automate anything that appears on a desired screen with the use of image recognition software to detect and control GUI components. The data section contains any data values strictly created for the current testing file. The body section holds the tests that are being run. The function section can include any number of functions that may be used by the current testing file or any other file that resources it. The resources and body section are required for all test files; the data and function sections can be left empty if the data values and functions being used are from a resourced library or another file. 
To help equip the automation team with better tools, the Project Lead of the Automated Testing Team, Jason Kapusta, assigned the task of installing and training an optical character recognition (OCR) tool to Brandon Echols, a fellow intern, and me. The purpose of the OCR tool is to analyze an image and find the coordinates of any group of text. Some issues that arose while installing the OCR tool included the absence of certain libraries needed to train the tool and an outdated software version. We eventually resolved the issues and successfully installed the OCR tool. Training the tool required many images in different fonts and sizes, but in the end the tool learned to accurately decipher the text in the images and its coordinates. The OCR tool produced a file that contained significant metadata for each section of text, but only the text and its coordinates were required for our purpose. The team made a script to parse the information we wanted from the OCR file into a different file that would be used by automation functions within the automated framework. Since a majority of development and testing for the automated test cases for the GUI in question has been done using live simulated data on the workstations at the Launch Control Center (LCC), a large amount of progress has been made. As of this writing, about 60% of all automated testing has been implemented. Additionally, the OCR tool will help make our automated tests more robust because the tool's text recognition is highly scalable to different text fonts and sizes. Soon we will have the whole test system automated, allowing more full-time engineers to work on development projects.
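The parsing script described above, which keeps only the text and coordinates from the OCR metadata file, might look like the sketch below. The record format is hypothetical, since the actual tool's output format is not given:

```python
import re

# Hypothetical OCR output format, one record per line, e.g.
#   word='ABORT' conf=96 bbox=(120,45,180,60)
# Real OCR tools emit richer metadata; we keep only text and coordinates.
RECORD = re.compile(r"word='([^']*)' conf=(\d+) bbox=\((\d+),(\d+),(\d+),(\d+)\)")

def parse_ocr(lines, min_conf=80):
    """Return {text: (x1, y1, x2, y2)} for records above a confidence cutoff."""
    out = {}
    for line in lines:
        m = RECORD.search(line)
        if m and int(m.group(2)) >= min_conf:
            out[m.group(1)] = tuple(int(g) for g in m.groups()[2:])
    return out

sample = [
    "word='LAUNCH' conf=98 bbox=(10,10,80,24)",
    "word='???' conf=40 bbox=(0,0,5,5)",        # low confidence, dropped
]
print(parse_ocr(sample))  # {'LAUNCH': (10, 10, 80, 24)}
```

The resulting text-to-coordinates map is exactly what an image-driven automation function needs to click on or verify a labeled GUI element.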
An automated and integrated framework for dust storm detection based on ogc web processing services
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.
2014-11-01
Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with the scientific computation advancement, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data together to enable the dust storm detection and tracking processes in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Services (WPS) initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detecting and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detecting and tracking component combines three earth scientific models: the SBDART model (for computing aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating the dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A severe dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework.
Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental result shows that this newly automated and integrated framework can be used to give advance near real-time warning of dust storms, for both environmental authorities and public. The methods presented in this paper might be also generalized to other types of Earth system models, leading to improved ease of use and flexibility.
Concept-oriented indexing of video databases: toward semantic sensitive retrieval and browsing.
Fan, Jianping; Luo, Hangzai; Elmagarmid, Ahmed K
2004-07-01
Digital video now plays an important role in medical education, health care, telemedicine and other medical applications. Several content-based video retrieval (CBVR) systems have been proposed in the past, but they still suffer from the following challenging problems: semantic gap, semantic video concept modeling, semantic video classification, and concept-oriented video database indexing and access. In this paper, we propose a novel framework to make some advances toward the final goal to solve these problems. Specifically, the framework includes: 1) a semantic-sensitive video content representation framework by using principal video shots to enhance the quality of features; 2) semantic video concept interpretation by using flexible mixture model to bridge the semantic gap; 3) a novel semantic video-classifier training framework by integrating feature selection, parameter estimation, and model selection seamlessly in a single algorithm; and 4) a concept-oriented video database organization technique through a certain domain-dependent concept hierarchy to enable semantic-sensitive video retrieval and browsing.
Towards Careful Practices for Automated Linguistic Analysis of Group Learning
ERIC Educational Resources Information Center
Howley, Iris; Rosé, Carolyn Penstein
2016-01-01
The multifaceted nature of collaborative learning environments necessitates theory to investigate the cognitive, motivational, and relational dimensions of collaboration. Several existing frameworks include aspects related to each of these three. This article explores the capability of multi-dimensional frameworks for analysis of collaborative…
Verifying the Modal Logic Cube Is an Easy Task (For Higher-Order Automated Reasoners)
NASA Astrophysics Data System (ADS)
Benzmüller, Christoph
Prominent logics, including quantified multimodal logics, can be elegantly embedded in simple type theory (classical higher-order logic). Furthermore, off-the-shelf reasoning systems for simple type theory exist that can be uniformly employed for reasoning within and about embedded logics. In this paper we focus on reasoning about modal logics and exploit our framework for the automated verification of inclusion and equivalence relations between them. Related work has applied first-order automated theorem provers to the task. Our solution achieves significant improvements, most notably with respect to the elegance and simplicity of the problem encodings as well as with respect to automation performance.
NASA Astrophysics Data System (ADS)
Lemoff, Brian E.; Martin, Robert B.; Sluch, Mikhail; Kafka, Kristopher M.; McCormick, William; Ice, Robert
2013-06-01
The capability to positively and covertly identify people at a safe distance, 24 hours per day, could provide a valuable advantage in protecting installations, both domestically and in an asymmetric warfare environment. This capability would enable installation security officers to identify known bad actors from a safe distance, even if they are approaching under cover of darkness. We describe an active-SWIR imaging system being developed to automatically detect, track, and identify people at long range using computer face recognition. The system illuminates the target with an eye-safe, invisible SWIR laser beam to provide consistent high-resolution imagery night and day. SWIR facial imagery produced by the system is matched against a watch-list of mug shots using computer face recognition algorithms. The current system relies on an operator to point the camera and to review and interpret the face recognition results. Automation software is being developed that will allow the system to be cued to a location by an external system, automatically detect a person, track the person as they move, zoom in on the face, select good facial images, and process the face recognition results, producing alarms and sharing data with other systems when people are detected and identified. Progress on the automation of this system is presented along with experimental night-time face recognition results at distance.
Wehage, Kristopher; Chenhansa, Panan; Schoenung, Julie M
2017-01-01
GreenScreen® for Safer Chemicals is a framework for comparative chemical hazard assessment. It is the first transparent, open, and publicly accessible framework of its kind, allowing manufacturers and governmental agencies to make informed decisions about the chemicals and substances used in consumer products and buildings. In the GreenScreen® benchmarking process, chemical hazards are assessed and classified based on 18 hazard endpoints from up to 30 different sources. The result is a simple numerical benchmark score and an accompanying assessment report that allows users to flag chemicals of concern and identify safer alternatives. Although the screening process is straightforward, aggregating and sorting hazard data is tedious, time-consuming, and prone to human error. In light of these challenges, the present work demonstrates the use of automation to cull chemical hazard data from publicly available internet resources, assign metadata, and perform a GreenScreen® hazard assessment using the GreenScreen® "List Translator." The automated technique, written as a module in the Python programming language, generates GreenScreen® List Translation data for over 3000 chemicals in approximately 30 s. Discussion of the potential benefits and limitations of automated techniques is provided. By embedding the library into a web-based graphical user interface, the extensibility of the library is demonstrated. The accompanying source code is made available to the hazard assessment community. Integr Environ Assess Manag 2017;13:167-176. © 2016 SETAC.
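The "List Translator" step the abstract describes can be sketched in a few lines of Python. The hazard-level codes, endpoints, and flagging rule below are illustrative stand-ins, not the official GreenScreen® criteria or the paper's actual module:

```python
# Hedged sketch of a GreenScreen-style "List Translator": memberships pulled
# from authoritative hazard lists are reduced to a worst-case level per
# endpoint, then a chemical is flagged when a critical endpoint scores High.
# Level codes, endpoint names, and the flagging rule are illustrative only.

HAZARD_LEVELS = {"L": 1, "M": 2, "H": 3, "vH": 4}

def list_translate(memberships):
    """memberships: list of (endpoint, level) pairs gathered from hazard lists.
    Returns the worst (highest) level seen for each endpoint."""
    worst = {}
    for endpoint, level in memberships:
        score = HAZARD_LEVELS[level]
        if score > worst.get(endpoint, 0):
            worst[endpoint] = score
    return worst

def benchmark_flag(worst, critical=("carcinogenicity", "mutagenicity")):
    """Flag the chemical if any critical endpoint reaches High or very High."""
    return any(worst.get(ep, 0) >= HAZARD_LEVELS["H"] for ep in critical)

# Illustrative membership data aggregated from two hypothetical lists:
worst = list_translate([("carcinogenicity", "H"),
                        ("skin_irritation", "M"),
                        ("carcinogenicity", "vH")])
print(worst["carcinogenicity"])  # 4: the worst score across lists wins
print(benchmark_flag(worst))     # True: flagged as a candidate of concern
```

The worst-case aggregation is what makes the batch run cheap: each chemical reduces to one small dictionary regardless of how many source lists mention it.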
Local mechanical properties of LFT injection molded parts: Numerical simulations versus experiments
NASA Astrophysics Data System (ADS)
Desplentere, F.; Soete, K.; Bonte, H.; Debrabandere, E.
2014-05-01
In predictive engineering for polymer processes, proper prediction of material microstructure from known processing conditions and constituent material properties is a critical step toward properly predicting bulk properties in the finished composite. Operating within the context of long-fiber thermoplastics (LFT, fiber length < 15 mm), this investigation concentrates on the prediction of the local mechanical properties of an injection molded part. To realize this, the Autodesk Simulation Moldflow Insight 2014 software has been used. This software provides a fiber breakage algorithm for the polymer flow inside the mold. Well-known micromechanical formulas then allow the local fiber length and the local orientation to be combined into local mechanical properties. Different experiments were performed using a commercially available glass-fiber-filled compound to compare the measured data with the numerical simulation results. In this investigation, tensile tests and 3-point bending tests are considered. To characterize the fiber length distribution of the polymer melt entering the mold (necessary for the numerical simulations), air shots were performed. For those air shots, homogenization conditions similar to those of the injection molding tests were used. The fiber length distribution is characterized using an automated optical method on samples for which the matrix material has been burned away. Using the appropriate settings for the different experiments, good predictions of the local mechanical properties are obtained.
Virtual targeting in three-dimensional space with sound and light interference
NASA Astrophysics Data System (ADS)
Chua, Florence B.; DeMarco, Robert M.; Bergen, Michael T.; Short, Kenneth R.; Servatius, Richard J.
2006-05-01
Law enforcement and the military are critically concerned with the targeting and firing accuracy of opponents. Stimuli which impede opponent targeting and firing accuracy can be incorporated into defense systems. An automated virtual firing range was developed to assess human targeting accuracy under conditions of sound and light interference, while avoiding dangers associated with live fire. This system has the ability to quantify sound and light interference effects on targeting and firing accuracy in three dimensions. This was achieved by development of a hardware and software system that presents the subject with a sound or light target, preceded by sound or light interference. Sony Xplod™ 4-way speakers present sound interference and sound targeting. The Martin® MiniMAC™ Profile operates as a source of light interference, while a red laser light serves as a target. A tracking system was created to monitor toy gun movement and firing in three-dimensional space. Data are collected via the Ascension® Flock of Birds™ tracking system and a custom National Instruments® LabVIEW™ 7.0 program to monitor gun movement and firing. A test protocol examined system parameters. Results confirm that the system enables tracking of virtual shots from a fired simulation gun to determine shot accuracy and location in three dimensions.
Low Data Drug Discovery with One-Shot Learning
2017-01-01
Recent advances in machine learning have made significant contributions to drug discovery. Deep neural networks in particular have been demonstrated to provide significant boosts in predictive power when inferring the properties and activities of small-molecule compounds (Ma, J. et al. J. Chem. Inf. Model. 2015, 55, 263–274). However, the applicability of these techniques has been limited by the requirement for large amounts of training data. In this work, we demonstrate how one-shot learning can be used to significantly lower the amounts of data required to make meaningful predictions in drug discovery applications. We introduce a new architecture, the iterative refinement long short-term memory, that, when combined with graph convolutional neural networks, significantly improves learning of meaningful distance metrics over small molecules. We open source all models introduced in this work as part of DeepChem, an open-source framework for deep learning in drug discovery (Ramsundar, B. deepchem.io. https://github.com/deepchem/deepchem, 2016). PMID:28470045
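The one-shot setup the abstract addresses can be illustrated with a toy sketch: given a single labeled "support" example per class, a query compound is labeled by its nearest support under some distance. Here the embedding is a made-up bit-vector fingerprint and the metric is plain cosine similarity; the paper instead learns the metric with an iterative-refinement LSTM over graph-convolutional embeddings:

```python
# Toy one-shot classification: nearest support example under cosine
# similarity. Fingerprints and labels are invented for illustration.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def one_shot_classify(query, support):
    """support: {label: embedding}; returns the label of the nearest example."""
    return max(support, key=lambda label: cosine(query, support[label]))

support = {"active":   [1, 1, 0, 1, 0, 0],
           "inactive": [0, 0, 1, 0, 1, 1]}
query = [1, 1, 0, 0, 0, 1]  # toy fingerprint of an unseen compound
print(one_shot_classify(query, support))  # "active"
```

The learned-metric methods in the paper keep exactly this decision rule but replace both the embedding and the similarity with trained components, which is where the data efficiency comes from.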
ERIC Educational Resources Information Center
Hooshyar, Danial; Yousefi, Moslem; Lim, Heuiseok
2018-01-01
Automated content generation for educational games has become an emerging research problem, as manual authoring is often time consuming and costly. In this article, we present a procedural content generation framework that intends to produce educational game content from the viewpoint of both designer and user. This framework generates content by…
Designing automation for human use: empirical studies and quantitative models.
Parasuraman, R
2000-07-01
An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated, and to what extent, in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute the primary evaluative criteria for automation design when using the model. Four human performance areas are considered: mental workload, situation awareness, complacency, and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences, and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les
1991-01-01
The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.
2016-09-01
Based on a RED projected size of 22.16 m, a sample calculation for the unadjusted single-shot probability of kill for HELLFIRE missiles is... a simulation framework based on intelligent objects (SIMIO) environment to model a fast attack craft/fast inshore attack craft anti-surface warfare expanded kill chain... concept of operation efficiency. Based on the operational environment, low-cost and less capable unmanned aircraft provide an alternative to the...
FY14 LLNL OMEGA Experimental Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heeter, R. F.; Fournier, K. B.; Baker, K.
In FY14, LLNL's High-Energy-Density Physics (HED) and Indirect Drive Inertial Confinement Fusion (ICF-ID) programs conducted several campaigns on the OMEGA laser system and on the EP laser system, as well as campaigns that used the OMEGA and EP beams jointly. Overall these LLNL programs led 324 target shots in FY14, with 246 shots using just the OMEGA laser system, 62 shots using just the EP laser system, and 16 joint shots using OMEGA and EP together. Approximately 31% of the total number of shots (62 OMEGA shots, 42 EP shots) supported the Indirect Drive Inertial Confinement Fusion Campaign (ICF-ID). The remaining 69% (200 OMEGA shots and 36 EP shots, including the 16 joint shots) were dedicated to experiments for High-Energy-Density Physics (HED). Highlights of the various HED and ICF campaigns are summarized in the following reports.
Automation Framework for Flight Dynamics Products Generation
NASA Technical Reports Server (NTRS)
Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla
2010-01-01
XFDS provides an easily adaptable automation platform. To date it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite Tool Kit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, and can be edited either directly or through the GUI.
Automation for nondestructive inspection of aircraft
NASA Technical Reports Server (NTRS)
Siegel, M. W.
1994-01-01
We discuss the motivation and an architectural framework for using small mobile robots as automated aids to operators of nondestructive inspection (NDI) equipment. We review the need for aircraft skin inspection, and identify the constraints in commercial airlines operations that make small mobile robots the most attractive alternative for automated aids for NDI procedures. We describe the design and performance of the robot (ANDI) that we designed, built, and are testing for deployment of eddy current probes in prescribed commercial aircraft inspections. We discuss recent work aimed at also providing robotic aids for visual inspection.
A First-Order Estimate of Automated Mobility District Fuel Consumption and GHG Emission Impacts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yuche; Young, Stanley E; Gonder, Jeffrey D
The first of its kind, this study develops a framework to quantify the fuel consumption and greenhouse gas emission impacts of an Automated Small Vehicle Transit system on a campus area. The results show that the automated mobility district system has the potential to reduce transportation system fuel consumption and greenhouse gas emissions, but the benefits are largely dependent on the operation and ridership of the personal rapid transit system. Our study calls for more research to understand the energy and environmental benefits of such systems.
Tractable policy management framework for IoT
NASA Astrophysics Data System (ADS)
Goynugur, Emre; de Mel, Geeth; Sensoy, Murat; Calo, Seraphin
2017-05-01
Thanks to advances in technology, connected devices (henceforth referred to as IoT) that automate the functionality of many domains, be it intelligent manufacturing or smart homes, have become a reality. However, with the proliferation of such connected and interconnected devices, manually managing networks efficiently and effectively becomes an impractical, if not impossible, task. This is because devices have their own obligations and prohibitions in context, and humans are not equipped to maintain a bird's-eye view of the state. Traditionally, policies are used to address this issue, but the IoT arena requires a policy framework whose language provides sufficient expressiveness along with efficient reasoning procedures to automate management. In this work we present our initial work toward a scalable knowledge-based policy framework for IoT and demonstrate its applicability through a smart home application.
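The obligation/prohibition idea at the core of policy-driven device management can be sketched as a lookup before each action runs. The rule format, device names, and context flags below are invented for illustration; the article's framework uses an expressive knowledge-based language with efficient reasoning, not a dictionary lookup:

```python
# Minimal policy check for smart-home devices: deny actions whose prohibition
# condition holds, and list actions whose obligation condition is triggered.
# All rules and names here are hypothetical examples.

PROHIBITIONS = {("vacuum", "run"): "quiet_hours",
                ("oven", "preheat"): "nobody_home"}
OBLIGATIONS = {("door_lock", "lock"): "nobody_home"}

def allowed(device, action, context):
    """Deny when a matching prohibition's condition holds in this context."""
    cond = PROHIBITIONS.get((device, action))
    return not (cond and context.get(cond, False))

def required(context):
    """List obligated actions whose trigger condition currently holds."""
    return [(dev, act) for (dev, act), cond in OBLIGATIONS.items()
            if context.get(cond, False)]

ctx = {"quiet_hours": True, "nobody_home": False}
print(allowed("vacuum", "run", ctx))   # False: prohibited during quiet hours
print(allowed("oven", "bake", ctx))    # True: no rule matches this action
print(required(ctx))                   # []: no obligation triggered yet
```

Scaling this from a lookup to thousands of interacting devices is exactly where the expressiveness-versus-tractable-reasoning trade-off described in the abstract appears.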
Semi-automated software service integration in virtual organisations
NASA Astrophysics Data System (ADS)
Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh
2015-08-01
To enhance their business opportunities, organisations involved in many service industries are increasingly active in pursuit of both online provision of their business services (BSs) and collaboration with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to sharing and integrating their collections of provided BSs and the corresponding software services. The topic of service interoperability, for which this article introduces a framework, is therefore gaining momentum in research on supporting CNs. The framework contributes to the generation of formal, machine-readable specifications of business processes, aimed at providing the unambiguous definitions needed for developing their equivalent software services. It provides a model and implementation architecture for discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of services' behaviour and the use of desired service behaviour, specified by users, for automated matchmaking against existing services. Furthermore, to support service integration, mechanisms are developed for automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to service discovery and service integration.
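The two aspects the abstract separates, discovery (match a request against registered services) and selection (rank matches by quality), can be sketched as follows. The article matches formal behavioural specifications; here a service's "behaviour" is reduced to the set of operations it supports, and the registry, QoS attributes, and weights are invented:

```python
# Hypothetical service registry: each service offers a set of operations and
# advertises quality-of-service attributes in [0, 1].

def discover(request_ops, registry):
    """Return services whose operations cover everything the user asked for."""
    return [name for name, svc in registry.items()
            if request_ops <= svc["ops"]]

def select_best(candidates, registry,
                weights=(("availability", 0.6), ("speed", 0.4))):
    """Rank matching services by a weighted sum of QoS attributes."""
    def quality(name):
        qos = registry[name]["qos"]
        return sum(w * qos[attr] for attr, w in weights)
    return max(candidates, key=quality)

registry = {
    "ShipFast":  {"ops": {"quote", "book", "track"},
                  "qos": {"availability": 0.99, "speed": 0.95}},
    "ShipCheap": {"ops": {"quote", "book"},
                  "qos": {"availability": 0.90, "speed": 0.90}},
}
matches = discover({"quote", "book"}, registry)
print(sorted(matches))                 # both services cover the request
print(select_best(matches, registry))  # "ShipFast": higher weighted QoS
```

Behavioural matchmaking replaces the subset test with a check that the requested interaction protocol is realisable by the service, but the discover-then-select pipeline is the same.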
50 CFR 20.134 - Approval of nontoxic shot types and shot coatings.
Code of Federal Regulations, 2014 CFR
2014-10-01
... erosion and absorption of one shot or coated shot in a 24-hour period. Define the nature of the toxic...) with one U.S. No. 4 pellet of lead shot. Dose each bird in one group of 8 males and 8 females with 8 U... males and 8 females with one U.S. No. 4 pellet of the candidate shot type or shot with the proposed...
Complacency and bias in human use of automation: an attentional integration.
Parasuraman, Raja; Manzey, Dietrich H
2010-06-01
Our aim was to review empirical studies of complacency and bias in human interaction with automated and decision support systems and provide an integrated theoretical model for their explanation. Automation-related complacency and automation bias have typically been considered separately and independently. Studies on complacency and automation bias were analyzed with respect to the cognitive processes involved. Automation complacency occurs under conditions of multiple-task load, when manual tasks compete with the automated task for the operator's attention. Automation complacency is found in both naive and expert participants and cannot be overcome with simple practice. Automation bias results in making both omission and commission errors when decision aids are imperfect. Automation bias occurs in both naive and expert participants, cannot be prevented by training or instructions, and can affect decision making in individuals as well as in teams. While automation bias has been conceived of as a special case of decision bias, our analysis suggests that it also depends on attentional processes similar to those involved in automation-related complacency. Complacency and automation bias represent different manifestations of overlapping automation-induced phenomena, with attention playing a central role. An integrated model of complacency and automation bias shows that they result from the dynamic interaction of personal, situational, and automation-related characteristics. The integrated model and attentional synthesis provides a heuristic framework for further research on complacency and automation bias and design options for mitigating such effects in automated and decision support systems.
The variable and chaotic nature of professional golf performance.
Stöckl, Michael; Lamb, Peter F
2018-05-01
In golf, unlike most other sports, individual performance is not the result of direct interactions between players. Instead, decision-making and performance are influenced by numerous constraining factors affecting each shot. This study looked at the performance of PGA TOUR golfers in 2011 in terms of stability and variability on a shot-by-shot basis. Stability and variability were assessed using Recurrence Quantification Analysis (RQA) and standard deviation, respectively. About 10% of all shots comprised short stable phases of performance (3.7 ± 1.1 shots per stable phase). Stable phases tended to consist of shots of typical performance, rather than poor or exceptional shots; this finding was consistent for all shot categories. Overall, stability measures were not correlated with tournament performance. Variability across all shots was not related to tournament performance; however, variability in tee shots and short approach shots was higher than for other shot categories. Furthermore, tee shot variability was related to tournament standing: decreased variability was associated with better tournament ranking. The findings in this study show that PGA TOUR golf performance is chaotic. Further research on amateur golf performance is required to determine whether this structure of golf performance is universal.
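The "stable phase" idea can be made concrete with a small sketch: in recurrence analysis, a run of consecutive shots whose performance values recur within a tolerance of each other appears as a diagonal structure in the recurrence plot. The helper below finds such runs directly; the tolerance, minimum run length, and shot scores are simplified stand-ins for the study's actual RQA settings:

```python
# Find maximal runs of consecutive shots whose scores stay within `tol`
# of the run's first shot; runs of at least `min_len` count as stable phases.
def stable_phases(shots, tol=0.5, min_len=3):
    """Return (start_index, length) for each stable phase."""
    phases, i = [], 0
    while i < len(shots):
        j = i + 1
        while j < len(shots) and abs(shots[j] - shots[i]) <= tol:
            j += 1
        if j - i >= min_len:
            phases.append((i, j - i))
        i = j
    return phases

# Illustrative shot-quality scores: a 4-shot stable stretch, then an
# erratic tail.
scores = [0.1, 0.2, -0.1, 0.3, 2.0, -1.5, 0.9, 2.4, -0.8, 1.7]
print(stable_phases(scores))  # [(0, 4)]
```

With phases extracted this way, statistics such as the paper's 3.7 ± 1.1 shots per stable phase are just the mean and spread of the run lengths.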
Shot model parameters for Cygnus X-1 through phase portrait fitting
NASA Technical Reports Server (NTRS)
Lochner, James C.; Swank, J. H.; Szymkowiak, A. E.
1991-01-01
Shot models for systems having about 1/f power density spectrum are developed by utilizing a distribution of shot durations. Parameters of the distribution are determined by fitting the power spectrum either with analytic forms for the spectrum of a shot model with a given shot profile, or with the spectrum derived from numerical realizations of trial shot models. The shot fraction is specified by fitting the phase portrait, which is a plot of intensity at a given time versus intensity at a delayed time and in principle is sensitive to different shot profiles. These techniques have been extensively applied to the X-ray variability of Cygnus X-1, using HEAO 1 A-2 and an Exosat ME observation. The power spectra suggest models having characteristic shot durations lasting from milliseconds to a few seconds, while the phase portrait fits give shot fractions of about 50 percent. Best fits to the portraits are obtained if the amplitude of the shot is a power-law function of the duration of the shot. These fits prefer shots having a symmetric exponential rise and decay. Results are interpreted in terms of a distribution of magnetic flares in the accretion disk.
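The numerical-realization approach mentioned above can be sketched by superposing exponentially decaying shots whose durations are drawn from a broad distribution; summing shots spanning decades of duration is what flattens the power spectrum toward 1/f. All parameters below (rate, duration set, amplitude law) are illustrative, not the fitted Cygnus X-1 values:

```python
# Generate a toy shot-noise light curve: shots start at random time steps,
# each with a duration drawn from a broad distribution and an amplitude tied
# to its duration, then decay exponentially.
import math
import random

def shot_light_curve(n_steps, shot_rate, durations, seed=0):
    """Sum of one-sided exponential-decay shots at random start times."""
    rng = random.Random(seed)
    curve = [0.0] * n_steps
    for t0 in range(n_steps):
        if rng.random() < shot_rate:         # a shot starts at this step
            tau = rng.choice(durations)      # draw a duration
            amp = math.sqrt(tau)             # amplitude law (illustrative)
            for t in range(t0, min(n_steps, t0 + 8 * tau)):
                curve[t] += amp * math.exp(-(t - t0) / tau)
    return curve

curve = shot_light_curve(2000, shot_rate=0.05, durations=[1, 4, 16, 64])
mean = sum(curve) / len(curve)
print(len(curve), max(curve) > mean > 0)  # a fluctuating, non-negative curve
```

A realization like this is what gets Fourier-transformed and compared against the observed power spectrum when no closed-form spectrum is available for the trial shot profile.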
Abbas, Ahmed; Guo, Xianrong; Jing, Bing-Yi; Gao, Xin
2014-06-01
Despite significant advances in automated nuclear magnetic resonance-based protein structure determination, the high numbers of false positives and false negatives among the peaks selected by fully automated methods remain a problem. These false positives and negatives impair the performance of resonance assignment methods. One of the main reasons for this problem is that the computational research community often considers peak picking and resonance assignment to be two separate problems, whereas spectroscopists use expert knowledge to pick peaks and assign their resonances at the same time. We propose a novel framework that simultaneously conducts slice picking and spin system forming, an essential step in resonance assignment. Our framework then employs a genetic algorithm, directed by both connectivity information and amino acid typing information from the spin systems, to assign the spin systems to residues. The inputs to our framework can be as few as two commonly used spectra, i.e., CBCA(CO)NH and HNCACB. Unlike existing peak picking and resonance assignment methods that treat peaks as the basic units, our method is based on 'slices': one-dimensional vectors in three-dimensional spectra that correspond to certain ([Formula: see text]) values. Experimental results on both benchmark simulated data sets and four real protein data sets demonstrate that our method significantly outperforms the state-of-the-art methods while using fewer spectra than those methods require. Our method is freely available at http://sfb.kaust.edu.sa/Pages/Software.aspx.
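The assignment step can be illustrated with a stripped-down sketch: spin systems are placed onto the residue sequence so that (a) each system's amino-acid typing matches its residue and (b) consecutive systems are connected (system i's CA shift matches system i+1's "previous-residue" CA shift). The article solves this with a genetic algorithm; the exhaustive search below only works at toy sizes, and the shift values are invented:

```python
# Score an ordering of spin systems against a residue sequence using typing
# evidence (+1) and sequential-connectivity evidence (+2), then pick the
# best ordering by brute force.
from itertools import permutations

def score(order, systems, sequence):
    s = 0
    for pos, idx in enumerate(order):
        if sequence[pos] in systems[idx]["types"]:               # typing match
            s += 1
    for a, b in zip(order, order[1:]):
        if abs(systems[a]["ca"] - systems[b]["prev_ca"]) < 0.2:  # connectivity
            s += 2
    return s

def assign(systems, sequence):
    best = max(permutations(range(len(systems))),
               key=lambda order: score(order, systems, sequence))
    return list(best)

systems = [  # toy spin systems with CA chemical shifts (ppm)
    {"types": {"G"}, "ca": 45.1, "prev_ca": 58.3},
    {"types": {"A"}, "ca": 52.4, "prev_ca": 45.1},
    {"types": {"S"}, "ca": 58.3, "prev_ca": 0.0},
]
print(assign(systems, "SGA"))  # [2, 0, 1]: the S -> G -> A chain
```

A genetic algorithm replaces the factorial search with crossover and mutation over candidate orderings scored by exactly this kind of fitness function.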
Shot Group Statistics for Small Arms Applications
2017-06-01
standard deviation. Analysis is presented as applied to one n-round shot group and then is extended to treat multiple n-round shot groups. A dispersion measure for multiple n-round shot groups can be constructed by selecting one of the dispersion measures listed above and measuring the dispersion of...
Status of the calibration and alignment framework at the Belle II experiment
NASA Astrophysics Data System (ADS)
Dossett, D.; Sevior, M.; Ritter, M.; Kuhr, T.; Bilka, T.; Yaschenko, S.;
2017-10-01
The Belle II detector at the SuperKEKB e+e- collider plans to take first collision data in 2018. The monetary and CPU time costs associated with storing and processing the data mean that it is crucial for the detector components at Belle II to be calibrated quickly and accurately. A fast and accurate calibration system would allow the high level trigger to increase the efficiency of event selection, and can give users analysis-quality reconstruction promptly. A flexible framework to automate the fast production of calibration constants is being developed in the Belle II Analysis Software Framework (basf2). Detector experts only need to create two components from C++ base classes in order to use the automation system. The first collects data from Belle II event data files and outputs much smaller files to pass to the second component, which runs the main calibration algorithm to produce calibration constants ready for upload into the conditions database. A Python framework coordinates the input files, the order of processing, and the submission of jobs. Splitting the operation into collection and algorithm processing stages allows the framework to optionally parallelize the collection stage on a batch system.
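The two-stage pattern described above can be sketched schematically: a collector reduces bulky event data to small summaries, and an algorithm combines the summaries into calibration constants. The class names and the toy timing-offset calibration mirror the description only; they are not the actual basf2 C++/Python API:

```python
# Schematic collector/algorithm split for calibration-constant production.
class Collector:
    """Stage 1: scan event data, emit a compact summary (small output file)."""
    def collect(self, hit_times):
        total, n = 0.0, 0
        for t in hit_times:
            total += t
            n += 1
        return {"sum": total, "n": n}

class Algorithm:
    """Stage 2: merge summaries from many (possibly parallel) collector jobs
    into a constant ready for upload to the conditions database."""
    def calibrate(self, summaries):
        total = sum(s["sum"] for s in summaries)
        count = sum(s["n"] for s in summaries)
        return {"t0_offset": total / count}  # e.g. a mean timing offset

# Two collector jobs run (on a batch system) over separate input files:
job1 = Collector().collect([1.0, 1.5, 0.5])
job2 = Collector().collect([1.25, 0.75])
constants = Algorithm().calibrate([job1, job2])
print(constants)  # {'t0_offset': 1.0}
```

Because the summaries are small and additive, the expensive collection stage parallelizes trivially across input files, which is the point of the split.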
The Effect of Multiple Shot Peening on the Corrosion Behavior of Duplex Stainless Steel
NASA Astrophysics Data System (ADS)
Feng, Qiang; She, Jia; Wu, Xueyan; Wang, Chengxi; Jiang, Chuanhai
2018-03-01
Various types of shot peening treatments were applied to duplex stainless steel. The effects of shot peening intensity and working procedures on the microstructure were investigated. The domain size and microstrain evolution in the surface layer were characterized utilizing the Rietveld method. As the shot peening intensity increased, the surface roughness increased in the surface layer; however, it decreased after multiple (dual and triple) shot peening. The mole fraction of strain-induced martensite as a function of the intensity of shot peening was evaluated by XRD measurements. Both potentiodynamic polarization curves and salt spray tests of shot-peened samples in NaCl solution were investigated. The results indicate that traditional shot peening has negative effects on corrosion resistance with increasing shot peening intensity; however, the corrosion rate can be reduced by means of multiple shot peening.
Huck, Nathaniel R; Ballard, Bart M; Fedynich, Alan M; Kraai, Kevin J; Castro, Mauro E
2016-01-01
Historically, lead poisoning through lead shot ingestion was one of the largest health issues affecting waterfowl in North America. Lead shot was banned for use in waterfowl hunting in the US in 1991 and was banned in Canada in 1997. However, biologists need to understand how, and if, lead shot remaining in the environment will continue to impact waterfowl. Our goal was to estimate lead and nontoxic shot consumption by female Northern Pintails (Anas acuta) wintering along the Texas coast. We found shot or metal fragments (or both) in the gizzards of 39 (17%) of 227 female Northern Pintails collected along the Texas coast. Of these, lead shot was found in seven gizzards, steel shot was found in 24 gizzards, and other metal and fragments were found in 20 gizzards. Some females consumed multiple shot types. Overall, shot (lead and nontoxic combined) ingestion rates were similar to those found prior to the lead shot ban in Texas (14%) and Louisiana (17%); however, lead shot ingestion rates were considerably lower, suggesting that it is becoming less available over time. All Northern Pintails that had lead shot in their gizzards were collected from coastal habitats. While it seems that lead shot ingestion by Northern Pintails has decreased since the ban was put in place, monitoring lead shot ingestion rates from different regions will provide insight into its availability in different habitats and under various environmental conditions.
NASA Astrophysics Data System (ADS)
Klemt, Christian; Modat, Marc; Pichat, Jonas; Cardoso, M. J.; Henckel, Johann; Hart, Alister; Ourselin, Sebastien
2015-03-01
Metal-on-metal (MoM) hip arthroplasties have been utilised over the last 15 years to restore hip function for 1.5 million patients worldwide. Although widely used, this hip arthroplasty releases metal wear debris which leads to muscle atrophy. The degree of muscle wastage differs across patients, ranging from mild to severe. The long-term outcomes for patients with MoM hip arthroplasty are reduced for increasing degrees of muscle atrophy, highlighting the need to automatically segment pathological muscles. The automated segmentation of pathological soft tissues is challenging as these lack distinct boundaries and morphologically differ across subjects. As a result, no method reported in the literature has been successfully applied to automatically segment pathological muscles. We propose the first automated framework to delineate severely atrophied muscles by applying a novel automated segmentation propagation framework to patients with MoM hip arthroplasty. The proposed algorithm was used to automatically quantify muscle wastage in these patients.
Lead shot toxicity to passerines
Vyas, N.B.; Spann, J.W.; Heinz, G.H.
2001-01-01
This study evaluated the toxicity of a single size 7.5 lead shot to passerines. No mortalities or signs of plumbism were observed in dosed cowbirds (Molothrus ater) fed a commercial diet, but when given a more natural diet, three of 10 dosed birds died within 1 day. For all survivors from which shot were recovered, all but one excreted the shot within 24 h of dosing, whereas, the dead birds retained their shot. Shot erosion was significantly greater (P < 0.05) when weathered shot were ingested compared to new shot, and the greatest erosion was observed in those birds that died (2.2-9.7%). Blood lead concentrations of birds dosed with new shot were not significantly different (P=0.14) from those of birds exposed to weathered shot. Liver lead concentrations of birds that died ranged from 71 to 137 ppm, dry weight. Despite the short amount of time the shot was retained, songbirds may absorb sufficient lead to compromise their survival.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
...; Approval Procedures for Nontoxic Shot and Shot Coatings AGENCY: Fish and Wildlife Service, Interior. ACTION... Number: 1018-0067. Title: Approval Procedures for Nontoxic Shot and Shot Coatings, 50 CFR 20.134. Service... Respondents: Businesses that produce and/or market approved nontoxic shot types or nontoxic shot coatings...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-01
... supported approval of the shot and the coatings, and one contained no useful information. Therefore, as... Hunting; Application for Approval of Copper-Clad Iron Shot and Fluoropolymer Shot Coatings as Nontoxic for... environmental assessments. SUMMARY: We, the U.S. Fish and Wildlife Service, approve copper-clad iron shot and...
Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.
2018-01-01
The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both the occurrence and the abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and found that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunities and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
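Why false positives bias naive occupancy estimates, and how validating a small subset of detections helps, can be shown with a toy simulation. The numbers are illustrative and the correction below is a simple moment-style rescaling that assumes the true detection probability is known; the article's actual estimator is a hierarchical occupancy model:

```python
# Simulate automated detections at many sites, then compare a naive occupancy
# estimate (every detection is real) with one corrected via a post-validated
# subsample. psi, p_true, p_false are illustrative values.
import random

rng = random.Random(42)
n_sites, psi = 5000, 0.4         # number of sites; true occupancy probability
p_true, p_false = 0.8, 0.15      # detection prob. at occupied / empty sites

occupied = [rng.random() < psi for _ in range(n_sites)]
detected = [rng.random() < (p_true if occ else p_false) for occ in occupied]

# Naive estimate: treat every automated detection as a real presence.
naive = sum(detected) / n_sites

# Post-validate a subsample of detections against known truth to estimate
# the true-positive fraction, then rescale (p_true assumed known here).
sample = [i for i in range(n_sites) if detected[i]][:300]
tp_frac = sum(occupied[i] for i in sample) / len(sample)
corrected = naive * tp_frac / p_true

# naive sits near psi*p_true + (1-psi)*p_false; corrected lands near psi.
print(round(naive, 2), round(corrected, 2))
```

The hierarchical model in the article achieves the same de-biasing jointly with estimation of the detection parameters, rather than assuming them known as this sketch does.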
Xu, Wei
2007-12-01
This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
Rolling contact fatigue strengths of shot-peened and crack-healed ceramics
NASA Astrophysics Data System (ADS)
Takahashi, K.; Oki, T.
2018-06-01
The effects of shot-peening (SP) and crack-healing on the rolling contact fatigue (RCF) strengths of Al2O3/SiC composite ceramics were investigated. Non-shot-peened, shot-peened, and shot-peened + crack-healed specimens were prepared. SP was performed using ZrO2 beads. The shot-peened + crack-healed specimen was crack-healed after SP. X-ray diffraction clearly showed that SP induced a compressive residual stress of up to 300 MPa at the specimen surfaces. Furthermore, the shot-peened + crack-healed specimen retained a compressive residual stress of 200 MPa. The apparent surface fracture toughness of the shot-peened specimens increased owing to the positive effects of the compressive residual stress. RCF tests were performed using a thrust load-bearing test device. The RCF lives of the shot-peened specimens did not improve compared to that of the non-shot-peened specimen, because the numerous SP-introduced surface cracks could act as crack initiation sites during the RCF tests. However, the RCF life of the shot-peened + crack-healed specimen did improve compared to those of the non-shot-peened and shot-peened specimens, implying that combining SP and crack-healing is an effective strategy for improving the RCF lives of Al2O3/SiC composite ceramics.
FY16 LLNL Omega Experimental Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heeter, R. F.; Ali, S. J.; Benstead, J.
In FY16, LLNL’s High-Energy-Density Physics (HED) and Indirect Drive Inertial Confinement Fusion (ICF-ID) programs conducted several campaigns on the OMEGA laser system and on the EP laser system, as well as campaigns that used the OMEGA and EP beams jointly. Overall, these LLNL programs led 430 target shots in FY16, with 304 shots using just the OMEGA laser system and 126 shots using just the EP laser system. Approximately 21% of the total number of shots (77 OMEGA shots and 14 EP shots) supported the Indirect Drive Inertial Confinement Fusion Campaign (ICF-ID). The remaining 79% (227 OMEGA shots and 112 EP shots) were dedicated to experiments for High-Energy-Density Physics (HED). Highlights of the various HED and ICF campaigns are summarized in the following reports. In addition to these experiments, LLNL Principal Investigators led a variety of Laboratory Basic Science campaigns using OMEGA and EP, including 81 target shots using just OMEGA and 42 shots using just EP. The highlights of these are also summarized, following the ICF and HED campaigns. Overall, LLNL PIs led a total of 553 shots at LLE in FY 2016. In addition, LLNL PIs also supported 57 NLUF shots on OMEGA and 31 NLUF shots on EP, in collaboration with the academic community.
FY15 LLNL OMEGA Experimental Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heeter, R. F.; Baker, K. L.; Barrios, M. A.
In FY15, LLNL’s High-Energy-Density Physics (HED) and Indirect Drive Inertial Confinement Fusion (ICF-ID) programs conducted several campaigns on the OMEGA laser system and on the EP laser system, as well as campaigns that used the OMEGA and EP beams jointly. Overall, these LLNL programs led 468 target shots in FY15, with 315 shots using just the OMEGA laser system, 145 shots using just the EP laser system, and 8 joint shots using OMEGA and EP together. Approximately 25% of the total number of shots (56 OMEGA shots and 67 EP shots, including the 8 joint shots) supported the Indirect Drive Inertial Confinement Fusion Campaign (ICF-ID). The remaining 75% (267 OMEGA shots and 86 EP shots) were dedicated to experiments for High-Energy-Density Physics (HED). Highlights of the various HED and ICF campaigns are summarized in the following reports.
Analyzing Educators' Online Interactions: A Framework of Online Learning Support Roles
ERIC Educational Resources Information Center
Nacu, Denise C.; Martin, Caitlin K.; Pinkard, Nichole; Gray, Tené
2016-01-01
While the potential benefits of participating in online learning communities are documented, so too are inequities in terms of how different populations access and use them. We present the online learning support roles (OLSR) framework, an approach using both automated analytics and qualitative interpretation to identify and explore online…
NASA Astrophysics Data System (ADS)
Hess, M. R.; Petrovic, V.; Kuester, F.
2017-08-01
Digital documentation of cultural heritage structures is increasingly common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of three-dimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback-driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials with the goal of building more accurate as-built information models of historical structures. User-defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector or local surface geometry. Multiple case studies are presented here to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.
Reasoning and Knowledge Acquisition Framework for 5G Network Analytics
2017-01-01
Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim to understand the network status and to predict potential situations that might disrupt the network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy on the inference of anomalous traffic volumes based on a simple configuration. PMID:29065473
Reasoning and Knowledge Acquisition Framework for 5G Network Analytics.
Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier
2017-10-21
Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim to understand the network status and to predict potential situations that might disrupt the network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy on the inference of anomalous traffic volumes based on a simple configuration.
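The pattern-recognition and prediction steps can be caricatured with a simple rolling-statistics rule: flag a traffic observation when it deviates from a trailing-window mean by more than k standard deviations. This sketch is illustrative only and is not the framework's actual inference engine; the window size and threshold are assumptions.

```python
import statistics

def flag_anomalies(volumes, window=5, k=3.0):
    """Return indices whose volume deviates > k sigma from the trailing window."""
    anomalies = []
    for i in range(window, len(volumes)):
        hist = volumes[i - window:i]
        mu = statistics.mean(hist)
        sigma = statistics.pstdev(hist) or 1e-9  # avoid zero division on flat history
        if abs(volumes[i] - mu) > k * sigma:
            anomalies.append(i)
    return anomalies

# Usage: steady traffic around 100 MB with one sudden spike.
traffic = [100, 102, 98, 101, 99, 100, 103, 400, 101, 99]
print(flag_anomalies(traffic))  # → [7]: only the spike is flagged
```

Note that once the spike enters the trailing window it inflates sigma, so the following normal points are not mis-flagged; a production rule would use a robust estimator instead.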
Development of Metrics for Trust in Automation
2010-06-01
Systems Literature Review. Defence Research and Development Canada Toronto, No. CR-2003-096. Ajzen, I., & Fishbein, M. (1980). Understanding attitudes... theory and research (pp. 261–287). Thousand Oaks, CA: Sage. Moray, N., Inagaki, T., Itoh, M. (2000). Adaptive automation, trust, and self-confidence... Assurance Technical Framework document (2000), the term 'trust' is used 352 times, ranging from reference to the trustworthiness of technology, to
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-26
... June 20, 2012 (77 FR 36980), and one for the fluoropolymer shot coatings on July 6, 2012 (77 FR 39983... Bird Hunting; Application for Approval of Copper-Clad Iron Shot and Fluoropolymer Shot Coatings as... approve copper-clad iron shot and fluoropolymer coatings for hunting waterfowl and coots. We published a...
Copy Hiding Application Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Holger; Poliakoff, David; Robinson, Peter
2016-10-06
CHAI is a lightweight framework which abstracts the automated movement of data (e.g., to/from host/device) via RAJA-like performance-portability programming-model constructs. It can be viewed as a utility framework and an adjunct to RAJA (a performance portability framework). Performance portability is a technique that abstracts the complexities of modern heterogeneous architectures while allowing the original program to undergo incremental, minimally invasive code changes in order to adapt to newer architectures.
A systematic examination of the bone destruction pattern of the two-shot technique
Stoetzer, Marcus; Stoetzer, Carsten; Rana, Majeed; Zeller, Alexander; Hanke, Alexander; Gellrich, Nils-Claudius; von See, Constantin
2014-01-01
Introduction: The two-shot technique is an effective stopping-power method. The precise mechanisms of action on the bone and soft-tissue structures of the skull, however, remain largely unclear. The aim of this study is to compare the terminal ballistics of the two-shot and single-shot techniques. Materials and Methods: 40 fresh pigs’ heads were randomly divided into 4 groups (n = 10). Either a single shot or two shots were fired at each head with a full metal jacket or a semi-jacketed bullet. Using thin-layer computed tomography and photography, the diameter of the destruction pattern and the fractures along the bullet path were then imaged and assessed. Results: A single shot fired with a full metal jacket bullet causes minor lateral destruction along the bullet path. With two shots fired with a full metal jacket bullet, however, the maximum diameter of the bullet path is significantly greater (P < 0.05) than it is with a single shot fired with a full metal jacket bullet. In contrast, the maximum diameter with a semi-jacketed bullet is similar with the single-shot and two-shot techniques. Conclusion: With the two-shot technique, a full metal jacket bullet causes a destruction pattern that is comparable to that of a single shot fired with a semi-jacketed bullet. PMID:24812454
Dialog detection in narrative video by shot and face analysis
NASA Astrophysics Data System (ADS)
Kroon, B.; Nesvadba, J.; Hanjalic, A.
2007-01-01
The proliferation of captured personal and broadcast content in personal consumer archives necessitates comfortable access to stored audiovisual content. Intuitive retrieval and navigation solutions require, however, a semantic level that cannot be reached by generic multimedia content analysis alone. A fusion with film grammar rules can help to boost the reliability significantly. The current paper describes the fusion of low-level content analysis cues, including face parameters and inter-shot similarities, to segment commercial content into film-grammar-rule-based entities and subsequently classify those sequences into so-called shot reverse shots, i.e., dialog sequences. Moreover, shot-reverse-shot-specific mid-level cues are analyzed, augmenting the shot reverse shot information with dialog-specific descriptions.
Relative toxicity of lead and selected substitute shot types to game farm mallards
Irby, H.D.; Locke, L.N.; Bagley, George E.
1967-01-01
The acute toxicity of lead, three types of plastic-coated lead, two lead-magnesium alloys, iron, copper, zinc-coated iron, and molybdenum-coated iron shot was tested in year-old male game farm mallards. Mallards (Anas platyrhynchos) were fed eight number 6 shot of each type and observed for a period of 60 days. Ducks used totaled 230, and most shot types were tested in three replicates of 8 ducks each. Mortality and loss of body weight were the criteria used for judging toxicity. Three types of plastic-coated lead shot were as toxic (93 percent) as the commercial lead shot (96 percent). The average mortality in mallards fed lead-magnesium alloy shot was less (58 percent) than that occurring in birds fed commercial lead shot. Mortality among mallards fed iron, copper, zinc-coated iron, or molybdenum-coated iron shot was significantly less than in birds fed lead shot, and was not significantly greater than in the controls.
NASA Astrophysics Data System (ADS)
White, Joshua S.; Matthews, Jeanna N.; Stacy, John L.
2012-06-01
Phishing website analysis is largely still a time-consuming manual process of discovering potential phishing sites, verifying if suspicious sites truly are malicious spoofs and, if so, distributing their URLs to the appropriate blacklisting services. Attackers increasingly use sophisticated systems for bringing phishing sites up and down rapidly at new locations, making automated response essential. In this paper, we present a method for rapid, automated detection and analysis of phishing websites. Our method relies on near real-time gathering and analysis of URLs posted on social media sites. We fetch the pages pointed to by each URL and characterize each page with a set of easily computed values such as number of images and links. We also capture a screen-shot of the rendered page image, compute a hash of the image and use the Hamming distance between these image hashes as a form of visual comparison. We provide initial results demonstrating the feasibility of our techniques by comparing legitimate sites to known fraudulent versions from Phishtank.com, by actively introducing a series of minor changes to a phishing toolkit captured in a local honeypot, and by performing some initial analysis on a set of over 2.8 million URLs posted to Twitter over 4 days in August 2011. We discuss the issues encountered during our testing, such as the resolvability and legitimacy of URLs posted on Twitter, the data sets used, the characteristics of the phishing sites we discovered, and our plans for future work.
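The image-hash comparison can be sketched as follows. The paper does not name its hash function, so average hashing (aHash) is used here purely as an assumed stand-in, and the "thumbnails" are toy 2x2 grayscale matrices; real use would first downscale a rendered-page screenshot to, say, 8x8.

```python
def average_hash(pixels):
    """Bit string: 1 where a pixel is >= the mean intensity of the image."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p >= mean else '0' for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

page_a = [[200, 10], [10, 200]]   # hypothetical rendered-page thumbnail
page_b = [[210, 20], [15, 190]]   # near-duplicate of page_a (small edits)
page_c = [[10, 200], [200, 10]]   # visually inverted layout

ha, hb, hc = map(average_hash, (page_a, page_b, page_c))
print(hamming(ha, hb), hamming(ha, hc))  # → 0 4: small vs. large distance
```

Small Hamming distances survive minor page edits (the toolkit-variation experiment in the paper), while structurally different pages land far apart.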
The Problem of Shot Selection in Basketball
Skinner, Brian
2012-01-01
In basketball, every time the offense produces a shot opportunity the player with the ball must decide whether the shot is worth taking. In this article, I explore the question of when a team should shoot and when they should pass up the shot by considering a simple theoretical model of the shot selection process, in which the quality of shot opportunities generated by the offense is assumed to fall randomly within a uniform distribution. Within this model I derive an answer to the question “how likely must the shot be to go in before the player should take it?” and I show that this lower cutoff for shot quality f depends crucially on the number n of shot opportunities remaining (say, before the shot clock expires), with larger n demanding that only higher-quality shots should be taken. The function f(n) is also derived in the presence of a finite turnover rate and used to predict the shooting rate of an optimal-shooting team as a function of time. The theoretical prediction for the optimal shooting rate is compared to data from the National Basketball Association (NBA). The comparison highlights some limitations of the theoretical model, while also suggesting that NBA teams may be overly reluctant to shoot the ball early in the shot clock. PMID:22295109
The problem of shot selection in basketball.
Skinner, Brian
2012-01-01
In basketball, every time the offense produces a shot opportunity the player with the ball must decide whether the shot is worth taking. In this article, I explore the question of when a team should shoot and when they should pass up the shot by considering a simple theoretical model of the shot selection process, in which the quality of shot opportunities generated by the offense is assumed to fall randomly within a uniform distribution. Within this model I derive an answer to the question "how likely must the shot be to go in before the player should take it?" and I show that this lower cutoff for shot quality f depends crucially on the number n of shot opportunities remaining (say, before the shot clock expires), with larger n demanding that only higher-quality shots should be taken. The function f(n) is also derived in the presence of a finite turnover rate and used to predict the shooting rate of an optimal-shooting team as a function of time. The theoretical prediction for the optimal shooting rate is compared to data from the National Basketball Association (NBA). The comparison highlights some limitations of the theoretical model, while also suggesting that NBA teams may be overly reluctant to shoot the ball early in the shot clock.
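In the turnover-free special case of this model, the cutoff has a compact dynamic-programming form: with shot quality q ~ Uniform(0,1), the expected value of n remaining opportunities satisfies V_n = E[max(q, V_{n-1})] = (1 + V_{n-1}^2)/2, and the player should shoot exactly when q exceeds V_{n-1}, i.e. f(n) = V_{n-1}. A minimal sketch of this simplified, zero-turnover variant (not the paper's full derivation with finite turnover rate):

```python
def shot_cutoffs(n_max):
    """Return f(1..n_max): the minimum shot quality worth taking,
    assuming q ~ Uniform(0,1) and no turnovers."""
    v = 0.0                      # value of zero remaining opportunities
    cutoffs = []
    for _ in range(n_max):
        cutoffs.append(v)        # f(n) = V_{n-1}: shoot iff q > value of waiting
        v = (1 + v * v) / 2      # V_n = E[max(q, V_{n-1})] for q ~ U(0,1)
    return cutoffs

f = shot_cutoffs(5)
print([round(c, 3) for c in f])  # → [0.0, 0.5, 0.625, 0.695, 0.742]
```

The cutoff rises monotonically with opportunities remaining, matching the abstract's claim that larger n demands higher-quality shots.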
Spent shot availability and ingestion on areas managed for mourning doves
Schulz, J.H.; Millspaugh, J.J.; Washburn, B.E.; Wester, G.R.; Lanigan, J. T.; Franson, J.C.
2002-01-01
Mourning dove (Zenaida macroura) hunting is becoming increasingly popular, especially in managed shooting fields. Given the possible increase in the availability of lead (Pb) shot on these areas, our objective was to estimate availability and ingestion of spent shot at the Eagle Bluffs Conservation Area (EBCA, hunted with nontoxic shot) and the James A. Reed Memorial Wildlife Area (JARWA, hunted with Pb shot) in Missouri. During 1998, we collected soil samples 1 or 2 weeks prior to the hunting season (prehunt) and after 4 days of dove hunting (posthunt). We also collected information on number of doves harvested, number of shots fired, shotgun gauge, and shotshell size used. Dove carcasses were collected on both areas during 1998-99. At EBCA, 60 hunters deposited an estimated 64,775 pellets/ha of nontoxic shot on or around the managed field. At JARWA, approximately 1,086,275 pellets/ha of Pb shot were deposited by 728 hunters. Our posthunt estimates of spent-shot availability from soil sampling were 0 pellets/ha for EBCA and 6,342 pellets/ha for JARWA. Our findings suggest that existing soil sampling protocols may not provide accurate estimates of spent-shot availability in managed dove shooting fields. During 1998-99, 15 of 310 (4.8%) mourning doves collected from EBCA had ingested nontoxic shot. Of those doves, 6 (40.0%) contained ≥7 shot pellets. In comparison, only 2 of 574 (0.3%) doves collected from JARWA had ingested Pb shot. Because a greater proportion of doves ingested multiple steel pellets compared to Pb pellets, we suggest that doves feeding in fields hunted with Pb shot may succumb to acute Pb toxicosis and thus become unavailable to harvest, resulting in an underestimate of ingestion rates. Although further research is needed to test this hypothesis, our findings may partially explain why previous studies have shown few doves with ingested Pb shot despite their feeding on areas with high Pb shot availability.
Implementation of a Parameterization Framework for Cybersecurity Laboratories
2017-03-01
... designer of laboratory exercises with tools to parameterize labs for each student, and automate some aspects of the grading of laboratory exercises. ... support might assist the designer of laboratory exercises to achieve the following? 1. Verify that students performed lab exercises, with some
Automating Software Design Metrics.
1984-02-01
INTRODUCTION ... HISTORICAL PERSPECTIVE: High quality software is of interest to both the software engineering community and its users. ... contributions of many other software engineering efforts, most notably [MCC 77] and [Boe 83b], which have defined and refined a framework for quantifying... AUTOMATION OF DESIGN METRICS: Software metrics can be useful within the context of an integrated software engineering environment. The purpose of this
Sea Ice Freeboard and Thickness from the 2013 IceBridge ATM and DMS Data in Ross Sea, Antarctica
NASA Astrophysics Data System (ADS)
Xie, H.; Tian, L.; Tang, J.; Ackley, S. F.
2016-12-01
On November 20, 21, 27, and 28, 2013, NASA's IceBridge mission flew over the Ross Sea, Antarctica, and collected important sea ice data with the ATM and DMS for the first time. We will present our methods to derive the local sea level and total freeboard for ice thickness retrieval from these two datasets. The methods include (1) leads classification from DMS data using an automated lead detection method, (2) potential leads from a reflectance of less than 0.25 from the ATM laser shots of L1B data, (3) local sea level retrieval based on these qualified ATM laser shots (L1B) within the DMS-derived leads (after outlier removal beyond the mean ± 2 standard deviations of these ATM elevations), (4) establishment of an empirical equation of local sea level as a function of distance from the starting point of each IceBridge flight, (5) total freeboard retrieval from the ATM L2 elevations by subtracting the local sea level derived from the empirical equation, and (6) ice thickness retrieval. The ice thickness derived from this method will be analyzed and compared with ICESat data (2003-2009) and other available data for the same region at a similar time period. Possible changes and potential reasons will be identified and discussed.
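Steps (2)-(5) above can be sketched as follows. The 0.25 reflectance threshold and the mean ± 2 standard deviation outlier filter come from the abstract; the linear form of the empirical equation, the toy shot data, and the helper names are all assumptions for illustration.

```python
import statistics

def local_sea_level(shots):
    """shots: (distance_m, elevation_m, reflectance, in_lead) tuples.
    Returns a function giving local sea level vs. along-track distance."""
    # Keep low-reflectance laser shots inside DMS-derived leads.
    cand = [(d, e) for d, e, refl, lead in shots if lead and refl < 0.25]
    # Drop outliers beyond mean +/- 2 standard deviations.
    elev = [e for _, e in cand]
    mu, sd = statistics.mean(elev), statistics.pstdev(elev)
    kept = [(d, e) for d, e in cand if abs(e - mu) <= 2 * sd]
    # Least-squares line: sea_level(d) = a * d + b (assumed linear form).
    n = len(kept)
    sx = sum(d for d, _ in kept); sy = sum(e for _, e in kept)
    sxx = sum(d * d for d, _ in kept); sxy = sum(d * e for d, e in kept)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda d: a * d + b

# Toy data: six lead shots near sea level plus one ice return that slips
# past the reflectance filter and is removed by the sigma filter.
shots = [(0, -0.02, 0.10, True), (500, -0.01, 0.12, True),
         (1000, 0.00, 0.20, True), (1500, 0.01, 0.15, True),
         (2000, 0.02, 0.10, True), (2500, 0.03, 0.18, True),
         (1200, 5.00, 0.20, True)]
sl = local_sea_level(shots)
freeboard = 1.50 - sl(1500)   # total freeboard = ATM L2 elevation - sea level
```

On this toy track the fit recovers sea level 0.01 m at 1500 m, so a 1.50 m ATM elevation gives a 1.49 m total freeboard.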
NASA Technical Reports Server (NTRS)
Johnson, Eric N.
2012-01-01
Function allocation assigns work functions to all agents in a team, both human and automation. Efforts to guide function allocation systematically have been studied in many fields, such as engineering, human factors, team and organization design, management science, and cognitive systems engineering. Each field focuses on certain aspects of function allocation, but not all; thus, an independent discussion of each does not address all necessary aspects of function allocation. Four distinctive perspectives have emerged from this comprehensive review of literature on those fields: the technology-centered, human-centered, team-oriented, and work-oriented perspectives. Each perspective focuses on different aspects of function allocation: capabilities and characteristics of agents (automation or human), structure and strategy of a team, and work structure and environment. This report offers eight issues with function allocation that can be used to assess the extent to which each of these issues exists in a given function allocation. A modeling framework using formal models and simulation was developed to model work as described by the environment, agents, their inherent dynamics, and relationships among them. Finally, to validate the framework and metrics, a case study modeled four different function allocations between a pilot and flight deck automation during the arrival and approach phases of flight.
Peirone, Laura S; Pereyra Irujo, Gustavo A; Bolton, Alejandro; Erreguerena, Ignacio; Aguirrezábal, Luis A N
2018-01-01
Conventional field phenotyping for drought tolerance, the most important factor limiting yield at a global scale, is labor-intensive and time-consuming. Automated greenhouse platforms can increase the precision and throughput of plant phenotyping and contribute to a faster release of drought tolerant varieties. The aim of this work was to establish a framework of analysis to identify early traits which could be efficiently measured in a greenhouse automated phenotyping platform, for predicting the drought tolerance of field grown soybean genotypes. A group of genotypes was evaluated, which showed variation in their drought susceptibility index (DSI) for final biomass and leaf area. A large number of traits were measured before and after the onset of a water deficit treatment, which were analyzed under several criteria: the significance of the regression with the DSI, phenotyping cost, earliness, and repeatability. The most efficient trait was found to be transpiration efficiency measured at 13 days after emergence. This trait was further tested in a second experiment with different water deficit intensities, and validated using a different set of genotypes against field data from a trial network in a third experiment. The framework applied in this work for assessing traits under different criteria could be helpful for selecting those most efficient for automated phenotyping.
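The abstract does not spell out how its drought susceptibility index is computed; the sketch below assumes the standard Fischer-Maurer form, DSI = (1 - Y_d/Y_p) / D, where Y_d and Y_p are a genotype's yields (or biomass) under drought and well-watered conditions and D is the trial-wide stress intensity.

```python
def dsi(y_drought, y_irrigated):
    """Per-genotype drought susceptibility index (assumed Fischer-Maurer form).
    Inputs are paired per-genotype yields under drought and irrigation."""
    d = 1 - sum(y_drought) / sum(y_irrigated)   # trial-wide stress intensity D
    return [(1 - yd / yp) / d for yd, yp in zip(y_drought, y_irrigated)]

# Hypothetical genotypes: DSI < 1 = more tolerant than trial average, > 1 = less.
print([round(x, 2) for x in dsi([6.0, 4.0, 5.0], [8.0, 8.0, 8.0])])  # → [0.67, 1.33, 1.0]
```

An early greenhouse trait such as transpiration efficiency would then be regressed against these per-genotype DSI values to test its predictive value, as in the paper's framework.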
... checkup. Make a plan with your doctor or nurse to get the shots you need. You may also be able to get shots at your local pharmacy. Use this vaccine clinic locator to find out where you can get important shots. Get a seasonal flu shot every year. Remember, everyone age 6 months ...
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process so that costly mistakes during the development phase can be eliminated.
NASA Astrophysics Data System (ADS)
Yang, R.; Zhang, X.; Mallipeddi, D.; Angelou, N.; Toftegaard, H. L.; Li, Y.; Ahlström, J.; Lorentzen, L.; Wu, G.; Huang, X.
2017-07-01
A martensitic gear steel (18CrNiMo7-6) was annealed at 180 °C for 2 h and at ~750 °C for 1 h to design two different starting microstructures for shot peening. One maintains the original as-transformed martensite while the other contains irregularly shaped sorbite together with ferrite. These two materials were shot peened using two different peening conditions. The softer sorbite + ferrite microstructure was shot peened using 0.6 mm conditioned cut steel shots at an average speed of 25 m/s in a conventional shot peening machine, while the harder tempered martensite steel was shot peened using 1.5 mm steel shots at a speed of 50 m/s in an in-house developed shot peening machine. The shot speeds in the conventional shot peening machine were measured using an in-house lidar set-up. The microstructure of each sample was characterized by optical and scanning electron microscopy, and the mechanical properties were examined by microhardness and tensile testing. The residual stresses were measured using an Xstress 3000 G2R diffractometer equipped with a Cr Kα X-ray source. The correspondence between the residual stress profile and the gradient structure produced by shot peening, and the relationship between the microstructure and strength, are analyzed and discussed.
Basketball Shot Types and Shot Success in Different Levels of Competitive Basketball
2015-01-01
The purpose of our research was to investigate the relative frequencies of different types of basketball shots (above head, hook shot, layup, dunk, tip-in), some details about their technical execution (one-legged, two-legged, drive, cut, …), and shot success in different levels of basketball competitions. We analysed video footage and categorized 5024 basketball shots from 40 basketball games and 5 different levels of competitive basketball (National Basketball Association (NBA), Euroleague, Slovenian 1st Division, and two Youth basketball competitions). Statistical analysis with hierarchical multinomial logistic regression models reveals that there are substantial differences between competitions. However, most differences decrease or disappear entirely after we adjust for differences in situations that arise in different competitions (shot location, player type, and attacks in transition). Differences after adjustment are mostly between the Senior and Youth competitions: more shots executed jumping or standing on one leg, more uncategorised shot types, and more dribbling or cutting to the basket in the Youth competitions, which can all be attributed to lesser technical and physical ability of developing basketball players. The two discernible differences within the Senior competitions are that, in the NBA, dunks are more frequent and hook shots are less frequent compared to European basketball, which can be attributed to better athleticism of NBA players. The effect situational variables have on shot types and shot success are found to be very similar for all competitions. PMID:26038836
Basketball shot types and shot success in different levels of competitive basketball.
Erčulj, Frane; Štrumbelj, Erik
2015-01-01
The purpose of our research was to investigate the relative frequencies of different types of basketball shots (above head, hook shot, layup, dunk, tip-in), some details about their technical execution (one-legged, two-legged, drive, cut, …), and shot success in different levels of basketball competitions. We analysed video footage and categorized 5024 basketball shots from 40 basketball games and 5 different levels of competitive basketball (National Basketball Association (NBA), Euroleague, Slovenian 1st Division, and two Youth basketball competitions). Statistical analysis with hierarchical multinomial logistic regression models reveals that there are substantial differences between competitions. However, most differences decrease or disappear entirely after we adjust for differences in situations that arise in different competitions (shot location, player type, and attacks in transition). Differences after adjustment are mostly between the Senior and Youth competitions: more shots executed jumping or standing on one leg, more uncategorised shot types, and more dribbling or cutting to the basket in the Youth competitions, which can all be attributed to lesser technical and physical ability of developing basketball players. The two discernible differences within the Senior competitions are that, in the NBA, dunks are more frequent and hook shots are less frequent compared to European basketball, which can be attributed to better athleticism of NBA players. The effect situational variables have on shot types and shot success are found to be very similar for all competitions.
E-Services quality assessment framework for collaborative networks
NASA Astrophysics Data System (ADS)
Stegaru, Georgiana; Danila, Cristian; Sacala, Ioan Stefan; Moisescu, Mihnea; Mihai Stanescu, Aurelian
2015-08-01
In a globalised networked economy, collaborative networks (CNs) are formed to take advantage of new business opportunities. Collaboration involves shared resources and capabilities, such as e-Services that can be dynamically composed to automate CN participants' business processes. Quality is essential for the success of business process automation. Current approaches mostly focus on quality of service (QoS)-based service selection and ranking algorithms, overlooking the process of service composition, which requires interoperable, adaptable and secure e-Services to ensure seamless collaboration, data confidentiality and integrity. Lack of assessment of these quality attributes can result in e-Service composition failure. The quality of e-Service composition relies on the quality of each e-Service and on the quality of the composition process. Therefore, there is a need for a framework that addresses quality from both views: product and process. We propose a quality of e-Service composition (QoESC) framework for quality assessment of e-Service composition for CNs, which comprises a quality model for e-Service evaluation and guidelines for the quality of the e-Service composition process. We implemented a prototype considering a simplified telemedicine use case which involves a CN in the e-Healthcare domain. To validate the proposed quality-driven framework, we analysed service composition reliability with and without using the proposed framework.
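As a rough illustration of product-view quality scoring, the sketch below aggregates per-attribute scores for each e-Service and multiplies them along a sequential composition. The attribute names, weights, and multiplicative aggregation are assumptions for illustration, not the QoESC framework's actual model:

```python
# Illustrative only: weighted per-service quality (attributes scored 0..1),
# with a sequential composition scored as the product of service scores.
WEIGHTS = {"interoperability": 0.4, "adaptability": 0.3, "security": 0.3}

def service_quality(scores):
    """Weighted sum of a service's attribute scores."""
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

def composition_quality(services):
    """Product aggregation: one weak service degrades the whole chain."""
    q = 1.0
    for s in services:
        q *= service_quality(s)
    return q

# Invented scores for two e-Services in a telemedicine-style chain.
ehr = {"interoperability": 0.9, "adaptability": 0.8, "security": 1.0}
tele = {"interoperability": 1.0, "adaptability": 0.9, "security": 0.9}
print(round(composition_quality([ehr, tele]), 3))
```

The product form reflects the abstract's point that unassessed attributes in any one e-Service can cause the whole composition to fail.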
Building an automated SOAP classifier for emergency department reports.
Mowery, Danielle; Wiebe, Janyce; Visweswaran, Shyam; Harkema, Henk; Chapman, Wendy W
2012-02-01
Information extraction applications that extract structured event and entity information from unstructured text can leverage knowledge of clinical report structure to improve performance. The Subjective, Objective, Assessment, Plan (SOAP) framework, used to structure progress notes to facilitate problem-specific, clinical decision making by physicians, is one example of a well-known, canonical structure in the medical domain. Although its applicability to structuring data is understood, its contribution to information extraction tasks has not yet been determined. The first step to evaluating the SOAP framework's usefulness for clinical information extraction is to apply the model to clinical narratives and develop an automated SOAP classifier that classifies sentences from clinical reports. In this quantitative study, we applied the SOAP framework to sentences from emergency department reports, and trained and evaluated SOAP classifiers built with various linguistic features. We found the SOAP framework can be applied manually to emergency department reports with high agreement (Cohen's kappa coefficients over 0.70). Using a variety of features, we found classifiers for each SOAP class can be created with moderate to outstanding performance with F(1) scores of 93.9 (subjective), 94.5 (objective), 75.7 (assessment), and 77.0 (plan). We look forward to expanding the framework and applying the SOAP classification to clinical information extraction tasks. Copyright © 2011. Published by Elsevier Inc.
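Sentence-level SOAP classification can be approximated, purely for illustration, by a bag-of-words learner. The sketch below uses a tiny multinomial Naive Bayes with invented training sentences; the paper's classifiers use richer linguistic features and report far stronger performance:

```python
import math
from collections import Counter, defaultdict

# Invented training sentences for two of the four SOAP classes.
train = [
    ("patient reports chest pain", "subjective"),
    ("patient complains of nausea", "subjective"),
    ("blood pressure 120 over 80", "objective"),
    ("heart rate 90 on exam", "objective"),
]

class NaiveBayes:
    def fit(self, data):
        self.word_counts = defaultdict(Counter)
        self.class_counts = Counter()
        for text, label in data:
            self.class_counts[label] += 1
            self.word_counts[label].update(text.split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, text):
        def log_prob(label):
            counts = self.word_counts[label]
            total = sum(counts.values())
            lp = math.log(self.class_counts[label] / sum(self.class_counts.values()))
            for w in text.split():
                # Laplace smoothing over the shared vocabulary
                lp += math.log((counts[w] + 1) / (total + len(self.vocab)))
            return lp
        return max(self.class_counts, key=log_prob)

clf = NaiveBayes().fit(train)
print(clf.predict("patient reports nausea"))  # subjective
```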
50 CFR 20.134 - Nontoxic shot.
Code of Federal Regulations, 2010 CFR
2010-10-01
... relevant data, predicting the toxic effect in waterfowl of complete erosion and absorption of one shot or... pellet of lead shot. Dose one group (8 males and 8 females) with eight size No. 4 pellets of steel shot...) and provide commercial breeder mash. Dosing of the 3 groups with one pellet of No. 4 lead shot...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-06
... of Fluoropolymeric Shot Coatings as Nontoxic for Waterfowl Hunting AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of application for nontoxic shot approval. SUMMARY: We, the U.S. Fish and Wildlife Service, announce that Spectra Shot, LLC, of Lafayette, Louisiana, has applied for approval of steel shot...
50 CFR 20.134 - Nontoxic shot.
Code of Federal Regulations, 2013 CFR
2013-10-01
... relevant data, predicting the toxic effect in waterfowl of complete erosion and absorption of one shot or... pellet of lead shot. Dose one group (8 males and 8 females) with eight size No. 4 pellets of steel shot...) and provide commercial breeder mash. Dosing of the 3 groups with one pellet of No. 4 lead shot...
50 CFR 20.134 - Nontoxic shot.
Code of Federal Regulations, 2011 CFR
2011-10-01
... relevant data, predicting the toxic effect in waterfowl of complete erosion and absorption of one shot or... pellet of lead shot. Dose one group (8 males and 8 females) with eight size No. 4 pellets of steel shot...) and provide commercial breeder mash. Dosing of the 3 groups with one pellet of No. 4 lead shot...
50 CFR 20.134 - Nontoxic shot.
Code of Federal Regulations, 2012 CFR
2012-10-01
... relevant data, predicting the toxic effect in waterfowl of complete erosion and absorption of one shot or... pellet of lead shot. Dose one group (8 males and 8 females) with eight size No. 4 pellets of steel shot...) and provide commercial breeder mash. Dosing of the 3 groups with one pellet of No. 4 lead shot...
Continuous video coherence computing model for detecting scene boundaries
NASA Astrophysics Data System (ADS)
Kang, Hang-Bong
2001-07-01
Scene boundary detection is important in the semantic understanding of video data; boundaries are usually determined by the coherence between shots. To measure coherence, two approaches have been proposed: one discrete and one continuous. In this paper, we adopt the continuous approach and propose some modifications to the causal First-In-First-Out (FIFO) short-term-memory-based model. One modification is that we allow a dynamic memory size so that coherence is computed reliably regardless of the size of each shot. Another modification is that some shots can be removed from the memory buffer by criteria other than the FIFO rule; these removed shots contain no, or only small, foreground objects. Using this model, we detect scene boundaries by computing shot coherence. In computing coherence, we add a new term, the number of intermediate shots between the two shots being compared, because the effect of intermediate shots is important in computing shot recall. In addition, we also consider shot activity, which is important for reflecting human perception. We evaluated our computing model on videos of different genres and obtained reasonable results.
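The coherence computation described above can be sketched as follows. Histogram intersection as the shot-similarity measure and a linear decay with the number of intermediate shots are simplifying assumptions for illustration, not the paper's exact model:

```python
# Sketch of shot-coherence scene segmentation. Each shot is a tiny
# 4-bin colour histogram; similarity is histogram intersection,
# attenuated linearly by the number of intermediate shots (assumed).
def hist_intersection(h1, h2):
    return sum(min(a, b) for a, b in zip(h1, h2)) / sum(h1)

def coherence(shots, i, memory=3, decay=0.2):
    """Best attenuated similarity between shot i and shots in the buffer."""
    best = 0.0
    for j in range(max(0, i - memory), i):
        gap = i - j - 1  # number of intermediate shots
        best = max(best, hist_intersection(shots[j], shots[i]) * (1 - decay * gap))
    return best

# Three visually similar shots, then a new scene with different colours.
shots = [[8, 2, 0, 0], [7, 3, 0, 0], [8, 1, 1, 0], [0, 0, 9, 1]]
boundaries = [i for i in range(1, len(shots)) if coherence(shots, i) < 0.5]
print(boundaries)  # [3]: a scene boundary before the last shot
```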
Java Tool Framework for Automation of Hardware Commissioning and Maintenance Procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, J C; Fisher, J M; Gordon, J B
2007-10-02
The National Ignition Facility (NIF) is a 192-beam laser system designed to study high energy density physics. Each beam line contains a variety of line replaceable units (LRUs) that contain optics, stepping motors, sensors and other devices to control and diagnose the laser. During commissioning and subsequent maintenance of the laser, LRUs undergo a qualification process using the Integrated Computer Control System (ICCS) to verify and calibrate the equipment. The commissioning processes are both repetitive and tedious when we use remote manual computer controls, making them ideal candidates for software automation. Maintenance and Commissioning Tool (MCT) software was developed to improve the efficiency of the qualification process. The tools are implemented in Java, leveraging ICCS services and CORBA to communicate with the control devices. The framework provides easy-to-use mechanisms for handling configuration data, task execution, task progress reporting, and generation of commissioning test reports. The tool framework design and application examples will be discussed.
An Aspect-Oriented Framework for Business Process Improvement
NASA Astrophysics Data System (ADS)
Pourshahid, Alireza; Mussbacher, Gunter; Amyot, Daniel; Weiss, Michael
Recently, many organizations invested in Business Process Management Systems (BPMSs) in order to automate and monitor their processes. Business Activity Monitoring is one of the essential modules of a BPMS as it provides the core monitoring capabilities. Although the natural step after process monitoring is process improvement, most of the existing systems do not provide the means to help users with the improvement step. In this paper, we address this issue by proposing an aspect-oriented framework that allows the impact of changes to business processes to be explored with what-if scenarios based on the most appropriate process redesign patterns among several possibilities. As the four cornerstones of a BPMS are process, goal, performance and validation views, these views need to be aligned automatically by any approach that intends to support automated improvement of business processes. Our framework therefore provides means to reflect process changes also in the other views of the business process. A health care case study presented as a proof of concept suggests that this novel approach is feasible.
Transport phenomena in helical edge state interferometers: A Green's function approach
NASA Astrophysics Data System (ADS)
Rizzo, Bruno; Arrachea, Liliana; Moskalets, Michael
2013-10-01
We analyze the current and the shot noise of an electron interferometer made of the helical edge states of a two-dimensional topological insulator within the framework of nonequilibrium Green's functions formalism. We study, in detail, setups with a single and with two quantum point contacts inducing scattering between the different edge states. We consider processes preserving the spin as well as the effect of spin-flip scattering. In the case of a single quantum point contact, a simple test based on the shot-noise measurement is proposed to quantify the strength of the spin-flip scattering. In the case of two quantum point contacts with the additional ingredient of gate voltages applied within a finite-size region at the top and bottom edges of the sample, we identify two types of interference processes in the behavior of the currents and the noise. One such process is analogous to that taking place in a Fabry-Pérot interferometer, while the second one corresponds to a configuration similar to a Mach-Zehnder interferometer. In the helical interferometer, these two processes compete.
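The single-contact shot-noise test builds on the standard scattering-theory result that partition noise in a two-terminal conductor scales as T(1 - T) in the transmission probability T. A minimal numeric sketch of that baseline (the paper's spin-flip contributions add structure beyond this):

```python
# Standard partition-noise baseline for one quantum point contact:
# zero-temperature shot noise per channel scales as T * (1 - T).
def partition_noise(T):
    """Dimensionless partition factor in the transmission T (0..1)."""
    return T * (1 - T)

def fano_factor(T):
    """Noise suppression relative to the Poissonian value S = 2eI."""
    return 1 - T

print(partition_noise(0.5))  # 0.25, the maximal partition noise
print(fano_factor(1.0))      # 0.0: a fully open channel is noiseless
```

Deviations of measured noise from this T(1 - T) baseline are the kind of signal the proposed spin-flip test exploits.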
Crowdsourcing-based evaluation of privacy in HDR images
NASA Astrophysics Data System (ADS)
Korshunov, Pavel; Nemoto, Hiromi; Skodras, Athanassios; Ebrahimi, Touradj
2014-05-01
The ability of High Dynamic Range imaging (HDRi) to capture details in high-contrast environments, making both dark and bright regions clearly visible, has a strong implication on privacy. However, the extent to which HDRi affects privacy when it is used instead of typical Standard Dynamic Range imaging (SDRi) is not yet clear. In this paper, we investigate the effect of HDRi on privacy via crowdsourcing evaluation using the Microworkers platform. Due to the lack of a standard HDRi privacy-evaluation dataset, we have created such a dataset containing people of varying gender, race, and age, shot indoors and outdoors under a large range of lighting conditions. We evaluate the tone-mapped versions of these images, obtained with several representative tone-mapping algorithms, using a subjective privacy evaluation methodology. Evaluation was performed using a crowdsourcing-based framework, because it is a popular and effective alternative to traditional lab-based assessment. The results of the experiments demonstrate a significant loss of privacy when even tone-mapped versions of HDR images are used compared to typical SDR images shot with a standard exposure.
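Tone mapping compresses HDR luminance into a displayable range while keeping dark regions visible, which is why privacy-relevant detail can survive the conversion. A minimal sketch using a Reinhard-style global operator (one common operator, not necessarily among those evaluated in the paper):

```python
# Reinhard-style global tone mapping: L -> L / (1 + L).
# Dark values pass nearly unchanged; bright values are compressed
# toward 1.0, so both ends of the range stay visible.
def tone_map(luminances):
    return [l / (1.0 + l) for l in luminances]

hdr = [0.01, 1.0, 100.0]  # dark region, midtone, bright highlight
ldr = tone_map(hdr)
print([round(v, 3) for v in ldr])  # [0.01, 0.5, 0.99]
```

Note how the four-decade input range collapses to [0, 1) while ordering and dark-region detail are preserved.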
Vučković, Goran; James, Nic; Hughes, Mike; Murray, Stafford; Sporiš, Goran; Perš, Janez
2013-01-01
No previous research in squash has considered the time between shots or the proximity of the ball to a wall, which are two important variables that influence shot outcomes. The aim of this paper was to analyse shot types to determine the extent to which they are played in different court areas and a more detailed analysis to determine whether the time available had an influence on the shot selected. Ten elite matches, contested by fifteen of the world's top right-handed squash players (age 27 ± 3.2, height 1.81 ± 0.06 m, weight 76.3 ± 3.7 kg), at the men's World Team Championships were processed using the SAGIT/Squash tracking system with shot information manually added to the system. Results suggested that shot responses were dependent upon court location and the time between shots. When these factors were considered, repeatable performance existed to the extent that one of two shots was typically played when there was limited time to play the shot (< 1.20s). For example, it was clear that when players did not have a lot of time to hit the ball (low time, i.e. < 1.06s, and mid time, i.e. 1.06 - 1.20s) in the front left corner close to the side wall, the crosscourt lob was used frequently (44.30% and 36.31% respectively) whereas when there was more time this shot was seldom used (13.64%). Consequently variant and invariant behaviour were shown to exist in elite squash although for the first time it was suggested that the availability of time to play a shot contributed to which of these behaviours was evident. This analysis could be extended by adopting a case study approach to see how individual differences in strategy and tactics affect shot selections.
Key points: Previous research has suggested that a playing strategy, elements decided in advance of the match, may be evident for elite players by examining court location and preceding shot type; however, these parameters alone are unlikely to be sufficient predictors. At present there is no known analysis in squash, or indeed in any of the racket sports, that has quantified the time available to respond to different shot types. An understanding of the time interval between shots and the movement characteristics of the player responding to different shots according to the court positions might facilitate a better understanding of the dynamics that determine shot selection. Some elements of a general playing strategy were evident, e.g. predominantly hitting to the back left of the court, but tactical differences in shot selection were also evident on the basis of court location and time available to play a shot. PMID:24149727
Lead in tissues of mallard ducks dosed with two types of lead shot
Finley, M.T.; Dieter, M.P.; Locke, L.N.
1976-01-01
Mallard ducks (Anas platyrhynchos) were sacrificed one month after ingesting one number 4 all-lead shot or one number 4 lead-iron shot. Livers, kidneys, blood, wingbones, and eggs were analyzed for lead by atomic absorption. Necropsy of sacrificed ducks failed to reveal any of the tissue lesions usually associated with lead poisoning in waterfowl. Lead levels in ducks given all-lead shot averaged about twice those in ducks given lead-iron shot, reflecting the amount of lead in the two types of shot. Lead in the blood of ducks dosed with all-lead shot averaged 0.64 ppm, and 0.28 ppm in ducks given lead-iron shot. Lead residues in livers and kidneys of females given all-lead shot were significantly higher than in males. In both dosed groups, lead levels in wingbones of females were about 10 times those in males, and were significantly correlated with the number of eggs laid after dosage. Lead levels in contents and shells of eggs laid by hens dosed with all-lead shot were about twice those in eggs laid by hens dosed with lead-iron shot. Eggshells were found to best reflect levels of lead in the blood. Our results indicate that mallards maintained on a balanced diet and dosed with one lead shot may not accumulate extremely high lead levels in the liver and kidney. However, extremely high lead deposition may result in the bone of laying hens after ingesting sublethal amounts of lead shot as a result of mobilization of calcium from the bone during eggshell formation.
EVALUATION OF PROMPT DOSE ENVIRONMENT IN THE NATIONAL IGNITION FACILITY DURING D-D AND THD SHOTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khater, H; Dauffy, L; Sitaraman, S
2009-04-28
Evaluation of the prompt dose environment expected in the National Ignition Facility (NIF) during Deuterium-Deuterium (D-D) and Tritium-Hydrogen-Deuterium (THD) shots has been completed. D-D shots resulting in the production of an annual fusion yield of up to 2.4 kJ (200 shots with 10^13 neutrons per shot) are considered. During the THD shot campaign, shots generating a total of 2 x 10^14 neutrons per shot are also planned. Monte Carlo simulations have been performed to estimate prompt dose values inside the facility as well as at different locations outside the facility shield walls. The Target Chamber shielding, along with Target Bay and Switchyard walls, roofs, and shield doors (when needed), will reduce dose levels in occupied areas to acceptable values during these shot campaigns. The calculated dose values inside occupied areas are small, estimated at 25 and 85 µrem per shot during the D-D and THD shots, respectively. Dose values outside the facility are insignificant. The nearest building to the NIF facility where co-located workers may reside is at a distance of about 100 m from the Target Chamber Center (TCC). The dose in such a building is estimated at a fraction of a µrem during a D-D or a THD shot. Dose at the nearest site boundary location (350 m from TCC) is caused by skyshine and, to a lesser extent, by direct radiation. The maximum off-site dose during any of the shots considered is less than 10 nanorem.
Lead shot incidence on a New Mexico public hunting area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schranck, B.W.; Dollahon, G.R.
The incidence of lead shot was investigated on a 20.2-ha (50-acre) seasonal marsh used for waterfowl hunting at the Bitter Lake National Wildlife Refuge, Roswell, New Mexico. Of the 162 soil samples taken randomly, 59% contained 1 to 5 lead shot. A minimum lead shot incidence of 98,985 shot per ha (40,075 per acre) was calculated. More shot was found in firm than in mucky soils. Management recommendations to limit waterfowl exposure to lead ingestion are offered.
Rousanoglou, Elissavet N.; Noutsos, Konstantinos S.; Bayios, Ioannis A.; Boudolos, Konstantinos D.
2015-01-01
The fixed duration of a team-handball game and its continuously changing situations incorporate an inherent temporal pressure. Also, the target’s position is not foreknown but online determined by the player’s interceptive processing of visual information. These ecological limitations do not favour throwing performance, particularly in novice players, and are not reflected in previous experimental settings of self-paced throws with foreknowledge of target position. The study investigated the self-paced and temporally constrained throwing performance without foreknowledge of target position, in team-handball experts and novices in three shot types (Standing Shot, 3Step Shot, Jump Shot). The target position was randomly illuminated on a tabloid surface before (self-paced condition) and after (temporally constrained condition) shot initiation. Response time, throwing velocity and throwing accuracy were measured. A mixed 2 (experience) X 2 (temporal constraint condition) ANOVA was applied. The novices performed with significantly lower throwing velocity and worse throwing accuracy in all shot types (p = 0.000) and, longer response time only in the 3Step Shot (p = 0.013). The temporal constraint (significantly shorter response times in all shot types at p = 0.000) had a shot specific effect with lower throwing velocity only in the 3Step Shot (p = 0.001) and an unexpected greater throwing accuracy only in the Standing Shot (p = 0.002). The significant interaction between experience and temporal constraint condition in throwing accuracy (p = 0.003) revealed a significant temporal constraint effect in the novices (p = 0.002) but not in the experts (p = 0.798). The main findings of the study are the shot specificity of the temporal constraint effect, as well as that, depending on the shot, the novices’ throwing accuracy may benefit rather than worsen under temporal pressure. 
Key points The temporal constraint induced a shot specific significant difference in throwing velocity in both the experts and the novices. The temporal constraint induced a shot specific significant difference in throwing accuracy only in the novices. Depending on the shot demands, the throwing accuracy of the novices may benefit under temporally constrained situations. PMID:25729288
Use of Annotations for Component and Framework Interoperability
NASA Astrophysics Data System (ADS)
David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.
2009-12-01
The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0 framework/model integration previously accomplished using API calls is now achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively such as implicit multithreading, and auto-documenting capabilities while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to framework by the use of specific APIs and/or data types they can more easily be reused both within the framework as well as outside of it. To study the effectiveness of an annotation based framework approach with other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. 
In a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
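OMS 3.0's annotations are a Java feature. As a language-neutral illustration of the same idea, the Python sketch below uses a decorator to attach declarative metadata that a framework can read for auto-documentation; all names here are invented and stand in for, rather than reproduce, the OMS annotation standard:

```python
# Decorator-based metadata: the component declares what it is,
# and the "framework" derives documentation non-invasively,
# without the component calling any framework API.
def component(description):
    def wrap(cls):
        cls._meta = {"description": description}
        return cls
    return wrap

def document(cls):
    """Framework-side auto-documentation driven purely by metadata."""
    return f"{cls.__name__}: {cls._meta['description']}"

@component("Monthly water balance")
class WaterBalance:
    def execute(self, precip, evap):
        # Toy balance: storage change = precipitation - evaporation
        return precip - evap

print(document(WaterBalance))          # WaterBalance: Monthly water balance
print(WaterBalance().execute(80, 55))  # 25
```

The component stays free of framework imports, which mirrors the non-invasive design the abstract credits for easier reuse inside and outside the framework.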
NASA Technical Reports Server (NTRS)
Bates, H. E.; Hill, D. M.; Jewett, D. N.
1983-01-01
Drop length necessary to convert molten silicon to shot reduced by proposed new process. Conversion of silicon from powder or chunks to shot often simplifies processing. Shot is more easily handled in most processing equipment. Drops of liquid silicon fall through protective cloud of argon, then through rapidly cooling bath of methanol, where they quickly turn into solid shot.
... 12 hours after birth, followed by a second shot 1 month later, and the third shot 5 months ... the hepatitis B virus should get the first shot within 1 to 2 months after birth, the second shot ...
NASA Astrophysics Data System (ADS)
Asgari, Ali; Dehestani, Pouya; Poruraminaie, Iman
2018-02-01
Shot peening is a well-known process for inducing residual stress on the surface of industrial parts. The induced residual stress improves fatigue life. In this study, the effects of shot peening parameters such as shot diameter, shot speed, friction coefficient, and the number of impacts on the induced residual stress are evaluated. To assess these effects, the shot peening process was first simulated by the finite element method. Then, the effects of the process parameters on the residual stress were evaluated by the response surface method as a statistical approach. Finally, a model is presented to predict the maximum residual stress induced by the shot peening process in AISI 4340 steel, and the optimum parameters for the maximum residual stress are obtained. The results indicate that the effect of shot diameter on the induced residual stress increases with increasing shot speed. Also, increasing the friction coefficient does not always increase the residual stress.
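Response surface methodology fits a low-order polynomial to the simulated responses and reads the optimum off the fitted surface. The one-factor sketch below fits an exact quadratic through three invented (shot speed, residual stress) points and locates its stationary point; the paper fits a multi-factor surface over diameter, speed, friction, and impact count:

```python
# Fit the unique quadratic y = a + b*x + c*x^2 through three points
# via Newton divided differences, then take the vertex as the optimum.
def quadratic_through(p0, p1, p2):
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    c = ((y2 - y1) / (x2 - x1) - (y1 - y0) / (x1 - x0)) / (x2 - x0)
    b = (y1 - y0) / (x1 - x0) - c * (x0 + x1)
    a = y0 - b * x0 - c * x0 * x0
    return a, b, c

# Invented compressive residual stress (MPa) vs shot speed (m/s).
a, b, c = quadratic_through((40, 700), (60, 900), (80, 860))
optimum_speed = -b / (2 * c)  # vertex of the fitted parabola
print(round(optimum_speed, 1))  # 66.7
```

In the multi-factor case the same idea generalizes: fit a second-order polynomial by least squares and solve for the stationary point of the surface.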
100-kHz shot-to-shot broadband data acquisition for high-repetition-rate pump-probe spectroscopy.
Kanal, Florian; Keiber, Sabine; Eck, Reiner; Brixner, Tobias
2014-07-14
Shot-to-shot broadband detection is common in ultrafast pump-probe spectroscopy. Taking advantage of the intensity correlation of subsequent laser pulses improves the signal-to-noise ratio. Finite data readout times of CCD chips in the employed spectrometer and the maximum available speed of mechanical pump-beam choppers typically limit this approach to lasers with repetition rates of a few kHz. For high-repetition-rate (≥ 100 kHz) systems, one typically averages over a larger number of laser shots, leading to inferior signal-to-noise ratios or longer measurement times. Here we demonstrate broadband shot-to-shot detection in transient absorption spectroscopy with a 100-kHz femtosecond laser system. This is made possible using a home-built high-speed chopper with external laser synchronization and a fast CCD line camera. Shot-to-shot detection can reduce the data acquisition time by two orders of magnitude compared to few-kHz lasers while keeping the same signal-to-noise ratio.
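In shot-to-shot detection the chopper blocks the pump on alternate laser shots, so consecutive probe readings pair up as (pumped, unpumped) and each pair yields one transient-absorption value. A single-wavelength sketch with invented intensities (real measurements are broadband, one spectrum per shot):

```python
import math

# One Delta-A value per (pumped, unpumped) shot pair:
# Delta-A = -log10(I_pump_on / I_pump_off).
def delta_A(pumped, unpumped):
    return -math.log10(pumped / unpumped)

# Invented alternating probe intensities: pumped, unpumped, pumped, ...
probe_shots = [95.0, 100.0, 94.0, 99.0, 96.0, 101.0]
pairs = zip(probe_shots[0::2], probe_shots[1::2])
dA = [delta_A(p, u) for p, u in pairs]
mean_dA = sum(dA) / len(dA)
print(round(mean_dA, 4))  # approximately 0.022
```

Because each pair consists of adjacent, intensity-correlated shots, slow laser drift largely cancels within the pair, which is the source of the signal-to-noise advantage the abstract describes.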
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
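A PerfExplorer-style analysis script might, for example, correlate two per-process metrics. The stdlib sketch below computes a Pearson correlation on invented data; the real framework's scripts operate on stored profile databases and much larger dimensions:

```python
import math

# Pearson correlation between two per-process performance metrics.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented metrics: runtime grows linearly with cache misses.
cache_misses = [1.0, 2.0, 3.0, 4.0]
runtime_s = [10.0, 12.0, 14.0, 16.0]
print(round(pearson(cache_misses, runtime_s), 6))  # 1.0
```

Capturing steps like this as scripts is exactly the automation the abstract describes: the exploratory procedure becomes repeatable instead of manual.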
Marenco, Luis N.; Wang, Rixin; Bandrowski, Anita E.; Grethe, Jeffrey S.; Shepherd, Gordon M.; Miller, Perry L.
2014-01-01
This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF’s data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO’s current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation. PMID:25018728
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Automated Discovery of Simulation Between Programs
2014-10-18
relation. These relations enable the refinement step of SimAbs. We implemented SimAbs and AE-VAL on top of the UFO framework [1, 15] and the Z3 SMT solver [8], respectively, and evaluated SimAbs on the Software Verification Competition (SVCOMP'14) benchmarks.
2008-06-01
... executes the avionics test) can run on the new ATS, thus creating the common ATS framework. The system will also enable numerous new functional... Enterprise-level architecture that reflects corporate DoD priorities and requirements for business systems, and provides a common framework to ensure that... entire Business Mission Area (BMA) of the DoD. The BEA also contains a set of integrated Department of Defense Architecture Framework (DoDAF) ...
Tackling the x-ray cargo inspection challenge using machine learning
NASA Astrophysics Data System (ADS)
Jaccard, Nicolas; Rogers, Thomas W.; Morton, Edward J.; Griffin, Lewis D.
2016-05-01
The current infrastructure for non-intrusive inspection of cargo containers cannot accommodate exploding commerce volumes and increasingly stringent regulations. There is a pressing need to develop methods to automate parts of the inspection workflow, enabling expert operators to focus on a manageable number of high-risk images. To tackle this challenge, we developed a modular framework for automated X-ray cargo image inspection. Employing state-of-the-art machine learning approaches, including deep learning, we demonstrate high performance for empty container verification and specific threat detection. This work constitutes a significant step towards the partial automation of X-ray cargo image inspection.
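Empty-container verification is, at its core, a binary image classification task. The sketch below is a deliberately simplified stand-in for the paper's deep models (synthetic data, hand-crafted features, plain logistic regression), illustrating the classification framing rather than the authors' actual pipeline.

```python
import numpy as np

def features(image):
    """Simple hand-crafted features for an X-ray patch: mean and spread of attenuation."""
    return np.array([image.mean(), image.std()])

def train_logistic(X, y, lr=0.5, epochs=500):
    """Plain logistic regression via gradient descent (toy stand-in for deep learning)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)

# Synthetic patches: "empty" containers attenuate little, "loaded" ones more.
rng = np.random.default_rng(0)
empty = rng.normal(0.1, 0.05, (50, 16, 16))
loaded = rng.normal(0.6, 0.15, (50, 16, 16))
X = np.array([features(im) for im in np.vstack([empty, loaded])])
y = np.array([0] * 50 + [1] * 50)
w = train_logistic(X, y)
acc = (predict(w, X) == y).mean()
```

On real cargo imagery the separability is of course far weaker, which is why the paper turns to learned deep features instead of two summary statistics.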
Binakonsky, Jane; Giga, Noreen; Ross, Craig; Siegel, Michael
2011-01-01
We investigated the extent of jello shot consumption among underage youths. We conducted a pilot study among a nonrandom national sample of 108 drinkers, aged 16-20 years, recruited from the Knowledge Networks Internet panel in 2010 by using consecutive sampling. The prevalence of past 30-day jello shot consumption among the 108 drinkers, aged 16-20 years, in our sample was 21.4%, and among those who consumed jello shots, the percentage of alcohol consumption attributable to jello shots averaged 14.5%. We concluded that jello shot use is prevalent among youths, representing a substantial proportion of their alcohol intake. Surveillance of youth alcohol use should include jello shot consumption.
NASA Technical Reports Server (NTRS)
Cabrall, C.; Gomez, A.; Homola, J.; Hunt, S.; Martin, L.; Mercer, J.; Prevot, T.
2013-01-01
As part of an ongoing research effort on separation assurance and functional allocation in NextGen, a controller-in-the-loop study with ground-based automation was conducted at NASA Ames' Airspace Operations Laboratory in August 2012 to investigate the potential impact of introducing self-separating aircraft in progressively advanced NextGen timeframes. From this larger study, the current exploratory analysis of controller-automation interaction styles focuses on the last and most far-term time frame. Measurements were recorded that firstly verified the continued operational validity of this iteration of the ground-based functional allocation automation concept in forecast traffic densities up to 2x that of current day high altitude en-route sectors. Additionally, with greater levels of fully automated conflict detection and resolution as well as the introduction of intervention functionality, objective and subjective analyses showed a range of passive to active controller-automation interaction styles between the participants. Not only did the controllers work with the automation to meet their safety and capacity goals in the simulated future NextGen timeframe, they did so in different ways and with different attitudes of trust/use of the automation. Taken as a whole, the results showed that the prototyped controller-automation functional allocation framework was very flexible and successful overall.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-26
... product from one small source (e.g., ``HOT SHOT'' device from Stream Systems) when metal detection... of complete erosion and absorption of one shot or coated shot in a 24-hour period. Define the nature... Approval of Nontoxic Shot for Use in Waterfowl Hunting AGENCY: Fish and Wildlife Service, Interior. ACTION...
Hepatitis A and the Vaccine (Shot) to Prevent It
... the vaccine. Why should my child get the hepatitis A shot? The hepatitis A shot: Protects your ...
Li, Beiwen; Liu, Ziping; Zhang, Song
2016-10-03
We propose a hybrid computational framework to reduce motion-induced measurement error by combining the Fourier transform profilometry (FTP) and phase-shifting profilometry (PSP). The proposed method is composed of three major steps: Step 1 is to extract continuous relative phase maps for each isolated object with single-shot FTP method and spatial phase unwrapping; Step 2 is to obtain an absolute phase map of the entire scene using PSP method, albeit motion-induced errors exist on the extracted absolute phase map; and Step 3 is to shift the continuous relative phase maps from Step 1 to generate final absolute phase maps for each isolated object by referring to the absolute phase map with error from Step 2. Experiments demonstrate the success of the proposed computational framework for measuring multiple isolated rapidly moving objects.
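The phase-shifting computation in Step 2 has a standard closed form. The sketch below implements a generic N-step PSP wrapped-phase extraction (the textbook formula, not necessarily the authors' exact pipeline) and verifies it on a synthetic fringe scene with a known phase ramp.

```python
import numpy as np

def psp_wrapped_phase(images):
    """Recover the wrapped phase from N equally phase-shifted fringe images
    I_n = A + B*cos(phi + 2*pi*n/N). Returns phi wrapped to (-pi, pi]."""
    N = len(images)
    deltas = 2 * np.pi * np.arange(N) / N
    num = sum(I * np.sin(d) for I, d in zip(images, deltas))
    den = sum(I * np.cos(d) for I, d in zip(images, deltas))
    # Sum(I_n sin d_n) = -(N*B/2) sin(phi), Sum(I_n cos d_n) = (N*B/2) cos(phi)
    return np.arctan2(-num, den)

# Synthetic scene: a known phase ramp imaged with three shifted fringe patterns.
x = np.linspace(0, 4 * np.pi, 200)
true_phase = np.angle(np.exp(1j * x))                     # wrapped ground truth
images = [5 + 2 * np.cos(x + 2 * np.pi * n / 3) for n in range(3)]
recovered = psp_wrapped_phase(images)
err = np.angle(np.exp(1j * (recovered - true_phase)))     # wrapped phase error
```

Step 1's single-shot FTP path would instead isolate the fringe carrier in the Fourier domain; the PSP form above is what motion corrupts when the object moves between the shifted exposures.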
Monna, T; Kanno, T; Marumo, T; Harihara, S; Kuroki, T; Yamamoto, S; Kobayashi, N; Sato, M; Nakamura, K; Nakatsuka, H; Onoyama, Y; Yamada, R
1982-12-01
It has been confirmed gradually that transcatheter arterial embolization is the most effective conservative therapy for the treatment of unresectable hepatocellular carcinoma (hepatoma). Embolization or one-shot therapy was carried out in a clinical trial involving 41 patients with unresectable hepatoma visiting our department. Embolization group (emboli G): 19 cases, 1 to 6 embolizations in each case. One-shot group (one shot G): 22 cases. Medications: Mitomycin C 10-40 mg and others. The disappearance rate of icterus after treatment was 50% (emboli G) and 25% (one shot G). A decrease in the size of hepatomegaly or tumor was seen in 84% (emboli G) and 32% (one shot G), which was statistically significant (less than 1%). Serum AFP titer decreased after embolization in all cases, but in only 5 of 12 cases (ca 41%) after one shot (less than 1%). Effective cases measured by Karnofsky's method were 18 out of 19 (95%) in emboli G, but only 10 out of 22 (ca 45%) in one shot G (less than 0.1%). Survival rates were 67% (emboli G) and 38% (one shot G) after 6 months, and 59% (emboli G) and 19% (one shot G) at 1 year, respectively. This study showed that transcatheter arterial embolization therapy was much more effective than one-shot therapy.
Effect of shot peening on the microstructure of laser hardened 17-4PH
NASA Astrophysics Data System (ADS)
Wang, Zhou; Jiang, Chuanhai; Gan, Xiaoyan; Chen, Yanhua
2010-12-01
To investigate the influence of shot peening on the microstructure of laser-hardened steel, and to clarify how strongly the initial microstructure induced by laser hardening affects the final microstructure after shot peening, measurements of retained austenite, measurements of microhardness, and microstructural analysis were carried out on three typical areas of laser-hardened 17-4PH steel: the laser-hardened area, the transitional area, and the matrix area. The results showed that shot peening is an efficient cold-working method for eliminating the retained austenite on the surface of laser-hardened samples. The surface hardness increased dramatically when shot peening treatments were carried out. Microstructural analyses of laser-hardened 17-4PH after shot peening were carried out in the matrix area and the laser-hardened area via the Voigt method. With increasing peening intensity, the depth to which shot peening influenced hardness and microstructure increased, but the surface hardness and microstructure no longer changed once a certain peening intensity was reached. The influence depth of shot peening on hardness was larger than its influence depth on microstructure, due to the kinetic-energy loss along the depth during shot peening. The microstructural results show that shot peening can influence both the domain size and the microstrain of treated samples, whereas laser hardening influences only the microstrain.
Systematic evaluation of deep learning based detection frameworks for aerial imagery
NASA Astrophysics Data System (ADS)
Sommer, Lars; Steinmann, Lucas; Schumann, Arne; Beyerer, Jürgen
2018-04-01
Object detection in aerial imagery is crucial for many applications in the civil and military domain. In recent years, deep learning based object detection frameworks significantly outperformed conventional approaches based on hand-crafted features on several datasets. However, these detection frameworks are generally designed and optimized for common benchmark datasets, which differ considerably from aerial imagery, especially in object sizes. As already demonstrated for Faster R-CNN, several adaptations are necessary to account for these differences. In this work, we adapt several state-of-the-art detection frameworks, including Faster R-CNN, R-FCN, and Single Shot MultiBox Detector (SSD), to aerial imagery. We discuss in detail the adaptations that mainly improve the detection accuracy of all frameworks. As the output of deeper convolutional layers comprises more semantic information, these layers are generally used in detection frameworks as feature maps to locate and classify objects. However, the resolution of these feature maps is insufficient for handling small object instances, which results in inaccurate localization or incorrect classification of small objects. Furthermore, state-of-the-art detection frameworks perform bounding box regression to predict the exact object location, using so-called anchor or default boxes as references. We demonstrate how an appropriate choice of anchor box sizes can considerably improve detection performance. Furthermore, we evaluate the impact of the performed adaptations on two publicly available datasets to account for various ground sampling distances and differing backgrounds. The presented adaptations can be used as a guideline for further datasets or detection frameworks.
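Why anchor sizes matter can be made concrete with an intersection-over-union (IoU) computation. The sketch below uses illustrative sizes, not the paper's tuned values: a small aerial object is matched (IoU above the usual 0.5 threshold) only when suitably small anchors exist.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda box: (box[2] - box[0]) * (box[3] - box[1])
    return inter / (area(a) + area(b) - inter)

# A 16x16-pixel vehicle, a typical object size in aerial imagery.
gt = (100, 100, 116, 116)

def centered_anchor(size, cx=108, cy=108):
    return (cx - size / 2, cy - size / 2, cx + size / 2, cy + size / 2)

default_anchors = [centered_anchor(s) for s in (64, 128, 256)]  # benchmark-style sizes
adapted_anchors = [centered_anchor(s) for s in (8, 16, 32)]     # aerial-adapted sizes
best_default = max(iou(gt, a) for a in default_anchors)
best_adapted = max(iou(gt, a) for a in adapted_anchors)
```

Even with a perfectly centered 64-pixel anchor, the IoU with a 16-pixel object is only about 0.06, so the object would never be assigned a positive anchor during training; shrinking the anchor set fixes this.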
Model-based metrics of human-automation function allocation in complex work environments
NASA Astrophysics Data System (ADS)
Kim, So Young
Function allocation is the design decision which assigns work functions to all agents in a team, both human and automated. Efforts to guide function allocation systematically have been studied in many fields such as engineering, human factors, team and organization design, management science, and cognitive systems engineering. Each field focuses on certain aspects of function allocation, but not all; thus, an independent discussion of each does not address all necessary issues with function allocation. Four distinctive perspectives emerged from a review of these fields: technology-centered, human-centered, team-oriented, and work-oriented. Each perspective focuses on different aspects of function allocation: capabilities and characteristics of agents (automation or human), team structure and processes, and work structure and the work environment. Together, these perspectives identify the following eight issues with function allocation: 1) Workload, 2) Incoherency in function allocations, 3) Mismatches between responsibility and authority, 4) Interruptive automation, 5) Automation boundary conditions, 6) Function allocation preventing human adaptation to context, 7) Function allocation destabilizing the humans' work environment, and 8) Mission Performance. Addressing these issues systematically requires formal models and simulations that include all necessary aspects of human-automation function allocation: the work environment, the dynamics inherent to the work, agents, and relationships among them. Also, addressing these issues requires not only a (static) model, but also a (dynamic) simulation that captures temporal aspects of work such as the timing of actions and their impact on the agent's work.
Therefore, with properly modeled work as described by the work environment, the dynamics inherent to the work, agents, and relationships among them, a modeling framework developed by this thesis, which includes static work models and dynamic simulation, can capture the issues with function allocation. Then, based on the eight issues, eight types of metrics are established. The purpose of these metrics is to assess the extent to which each issue exists with a given function allocation. Specifically, the eight types of metrics assess workload, coherency of a function allocation, mismatches between responsibility and authority, interruptive automation, automation boundary conditions, human adaptation to context, stability of the human's work environment, and mission performance. Finally, to validate the modeling framework and the metrics, a case study was conducted modeling four different function allocations between a pilot and flight deck automation during the arrival and approach phases of flight. A range of pilot cognitive control modes and maximum human taskload limits were also included in the model. The metrics were assessed for these four function allocations and analyzed to validate capability of the metrics to identify important issues in given function allocations. In addition, the design insights provided by the metrics are highlighted. This thesis concludes with a discussion of mechanisms for further validating the modeling framework and function allocation metrics developed here, and highlights where these developments can be applied in research and in the design of function allocations in complex work environments such as aviation operations.
Comparison of optimization algorithms for the slow shot phase in HPDC
NASA Astrophysics Data System (ADS)
Frings, Markus; Berkels, Benjamin; Behr, Marek; Elgeti, Stefanie
2018-05-01
High-pressure die casting (HPDC) is a popular manufacturing process for aluminum processing. The slow shot phase in HPDC is the first phase of this process. During this phase, the molten metal is pushed towards the cavity under moderate plunger movement. The so-called shot curve describes this plunger movement. A good design of the shot curve is important to produce high-quality cast parts. Three partially competing process goals characterize the slow shot phase: (1) reducing air entrapment, (2) avoiding temperature loss, and (3) minimizing oxide caused by the air-aluminum contact. Due to the harsh process conditions, with high pressure and temperature, it is hard to design the shot curve experimentally. A few design rules exist that are based on theoretical considerations. Nevertheless, the quality of the shot curve design still depends on the experience of the machine operator. To improve the shot curve, it seems natural to use numerical optimization. This work compares different optimization strategies for the slow shot phase optimization. The aim is to find the best optimization approach on a simple test problem.
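The kind of comparison the abstract describes can be illustrated on a toy problem. The objective below is purely hypothetical (a convex trade-off between air entrapment at fast plunger speeds and temperature loss at slow speeds, not a physical model), and two simple optimizers, coarse grid search and golden-section search, are compared on it.

```python
import math

def objective(v):
    """Toy slow-shot trade-off (illustrative only): fast plunger speeds entrap air,
    slow speeds lose melt temperature."""
    air_entrapment = (v - 0.3) ** 2 * 4.0    # grows away from a wave-friendly speed
    temperature_loss = 0.2 / (v + 0.05)      # grows as the shot slows down
    return air_entrapment + temperature_loss

def grid_search(f, lo, hi, n=200):
    """Evaluate f on a uniform grid and return the best point."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return min(xs, key=f)

def golden_section(f, lo, hi, tol=1e-6):
    """Golden-section search for the minimum of a unimodal f on [lo, hi]."""
    g = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - g * (b - a), a + g * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return (a + b) / 2

v_grid = grid_search(objective, 0.05, 1.0)
v_golden = golden_section(objective, 0.05, 1.0)
```

Both methods agree to within the grid spacing here; the interesting cases in the paper are noisier, simulation-backed objectives, where the choice of optimizer matters far more.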
Takács, Zsanett; Vilmos, Péter; Lénárt, Péter; Röper, Katja; Erdélyi, Miklós
2017-01-01
Dorsal closure of the Drosophila embryonic epithelium provides an excellent model system for the in vivo analysis of molecular mechanisms regulating cytoskeletal rearrangements. In this study, we investigated the function of the Drosophila spectraplakin Short stop (Shot), a conserved cytoskeletal structural protein, during closure of the dorsal embryonic epithelium. We show that Shot is essential for the efficient final zippering of the opposing epithelial margins. By using isoform-specific mutant alleles and genetic rescue experiments with truncated Shot variants, we demonstrate that Shot functions as an actin–microtubule cross-linker in mediating zippering. At the leading edge of epithelial cells, Shot regulates protrusion dynamics by promoting filopodia formation. Fluorescence recovery after photobleaching (FRAP) analysis and in vivo imaging of microtubule growth revealed that Shot stabilizes dynamic microtubules. The actin- and microtubule-binding activities of Shot are simultaneously required in the same molecule, indicating that Shot is engaged as a physical crosslinker in this process. We propose that Shot-mediated interactions between microtubules and actin filaments facilitate filopodia formation, which promotes zippering by initiating contact between opposing epithelial cells. PMID:28062848
Birth Control Shot (KidsHealth / For Teens). What Is It? The birth control shot is a long-acting form of progesterone, ...
Histopathology of mallards dosed with lead and selected substitute shot
Locke, L.N.; Irby, H.D.; Bagley, George E.
1967-01-01
The histopathological response of male game farm mallards fed lead, three types of plastic-coated lead, two lead-magnesium alloys, iron, copper, zinc-coated iron, and molybdenum-coated iron shot was studied. Mallards fed lead, plastic-coated lead, or lead-magnesium alloy shot developed a similar pathological response, including the formation of acid-fast intranuclear inclusion bodies in the kidneys. Birds fed iron or molybdenum-coated iron shot developed hemosiderosis of the liver. Two of four mallards fed zinc-coated iron shot also developed hemosiderosis of the liver. No lesions were found in mallards fed copper shot.
Duration of mentally simulated movement before and after a golf shot.
Koyama, Satoshi; Tsuruhara, Kiyoshi; Yamamoto, Yuji
2009-02-01
This report examined the temporal consistency of preshot routines and the temporal similarity and variability between simulated movements before and after a shot. 12 male amateur golfers ages 32 to 69 years (M=53.4, SD=10.5) were assigned to two groups according to their handicaps: skilled (M=4.0 handicap, SD=3.1) and less-skilled (M=16.0 handicap, SD=6.5). They performed their shots mentally from their preshot routines to the points when the balls came to rest, then performed the same shots physically and again recalled the shots mentally. For each of four par-three holes, participants' performances were filmed, and the durations of mental and actual shots were timed. Analysis showed that the skilled golfers had more consistent preshot routines in actual movement, and they also had longer durations for the ball flight phase than the less-skilled golfers in simulated movement. The present findings support the importance of consistent preshot routines for high performance in golf; however, the duration of simulated movements was underestimated both before and after the shots. This also suggests that skilled golfers attend to performance goals both before and after shots to execute their shots under proceduralized control and to correct their movements for their next shot.
A Communication Framework for Collaborative Defense
2009-02-28
We have been able to provide sufficient automation to be able to build up the most extensive application signature database in the world with a fraction of... techniques that are well understood in the context of databases. These techniques allow users to quickly scan for the existence of a key in a database.
Peirone, Laura S.; Pereyra Irujo, Gustavo A.; Bolton, Alejandro; Erreguerena, Ignacio; Aguirrezábal, Luis A. N.
2018-01-01
Conventional field phenotyping for drought tolerance, the most important factor limiting yield at a global scale, is labor-intensive and time-consuming. Automated greenhouse platforms can increase the precision and throughput of plant phenotyping and contribute to a faster release of drought tolerant varieties. The aim of this work was to establish a framework of analysis to identify early traits which could be efficiently measured in a greenhouse automated phenotyping platform, for predicting the drought tolerance of field grown soybean genotypes. A group of genotypes was evaluated, which showed variation in their drought susceptibility index (DSI) for final biomass and leaf area. A large number of traits were measured before and after the onset of a water deficit treatment, which were analyzed under several criteria: the significance of the regression with the DSI, phenotyping cost, earliness, and repeatability. The most efficient trait was found to be transpiration efficiency measured at 13 days after emergence. This trait was further tested in a second experiment with different water deficit intensities, and validated using a different set of genotypes against field data from a trial network in a third experiment. The framework applied in this work for assessing traits under different criteria could be helpful for selecting those most efficient for automated phenotyping. PMID:29774042
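The drought susceptibility index (DSI) used to rank genotypes is commonly computed with Fischer and Maurer's formulation, DSI = (1 − Y_d/Y_p) / D, where D = 1 − mean(Y_d)/mean(Y_p) is the drought intensity of the trial. The sketch below uses this common definition with hypothetical yields; the authors' exact variant may differ.

```python
def drought_susceptibility_index(yield_stress, yield_potential):
    """DSI per genotype (Fischer & Maurer form): values < 1 suggest relative tolerance."""
    mean_d = sum(yield_stress.values()) / len(yield_stress)
    mean_p = sum(yield_potential.values()) / len(yield_potential)
    intensity = 1 - mean_d / mean_p   # drought intensity D of the whole trial
    return {g: (1 - yield_stress[g] / yield_potential[g]) / intensity
            for g in yield_stress}

# Hypothetical genotype yields (t/ha) under water deficit vs. well-watered control.
stressed = {"G1": 2.0, "G2": 1.2, "G3": 1.6}
control  = {"G1": 2.5, "G2": 2.4, "G3": 2.2}
dsi = drought_susceptibility_index(stressed, control)
```

Here G1 (DSI < 1) would be called relatively tolerant and G2 (DSI > 1) relatively susceptible; the paper's framework then asks which early greenhouse traits regress best against such field-derived DSI values.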
Software framework for the upcoming MMT Observatory primary mirror re-aluminization
NASA Astrophysics Data System (ADS)
Gibson, J. Duane; Clark, Dusty; Porter, Dallan
2014-07-01
Details of the software framework for the upcoming in-situ re-aluminization of the 6.5m MMT Observatory (MMTO) primary mirror are presented. This framework includes: 1) a centralized key-value store and data structure server for data exchange between software modules, 2) a newly developed hardware-software interface for faster data sampling and better hardware control, 3) automated control algorithms that are based upon empirical testing, modeling, and simulation of the aluminization process, 4) re-engineered graphical user interfaces (GUIs) that use state-of-the-art web technologies, and 5) redundant relational databases for data logging. Redesign of the software framework has several objectives: 1) automated process control to provide more consistent and uniform mirror coatings, 2) optional manual control of the aluminization process, 3) modular design to allow flexibility in process control and software implementation, 4) faster data sampling and logging rates to better characterize the approximately 100-second aluminization event, and 5) synchronized "real-time" web application GUIs to provide all users with exactly the same data. The framework has been implemented as four modules interconnected by a data store/server. The four modules are integrated into two Linux system services that start automatically at boot-time and remain running at all times. Performance of the software framework is assessed through extensive testing within 2.0 meter and smaller coating chambers at the Sunnyside Test Facility. The redesigned software framework helps ensure that a better performing and longer lasting coating will be achieved during the re-aluminization of the MMTO primary mirror.
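The centralized key-value/data-structure server at the heart of this design (the description suggests a Redis-style store) can be sketched as a minimal in-process stand-in. The class below is hypothetical, not the MMTO code: it shows how modules can exchange values and react to updates via publish/subscribe.

```python
from collections import defaultdict

class DataStore:
    """Minimal in-process key-value store with publish/subscribe semantics."""
    def __init__(self):
        self._data = {}
        self._subscribers = defaultdict(list)

    def set(self, key, value):
        self._data[key] = value
        for callback in self._subscribers[key]:
            callback(key, value)   # push the update to interested modules

    def get(self, key, default=None):
        return self._data.get(key, default)

    def subscribe(self, key, callback):
        self._subscribers[key].append(callback)

# Example: a GUI module mirrors the chamber-pressure reading that the
# hardware-interface module publishes (key name is illustrative).
store = DataStore()
seen = []
store.subscribe("chamber_pressure", lambda k, v: seen.append(v))
store.set("chamber_pressure", 2.4e-6)
```

A networked store such as Redis adds persistence and cross-process access, but the module-decoupling idea is the same: producers publish by key, and consumers never call each other directly.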
Investigation of kinematics of knuckling shot in soccer
NASA Astrophysics Data System (ADS)
Asai, T.; Hong, S.
2017-02-01
In this study, we use four high-speed video cameras to investigate the swing characteristics of the kicking leg while delivering the knuckling shot in soccer. We attempt to elucidate the impact process of the kicking foot at the instant of its impact with the ball and the technical mechanisms of the knuckling shot via comparison of its curved motion with that of the straight and curved shots. Two high-speed cameras (Fastcam, Photron Inc., Tokyo, Japan; 1000 fps, 1024 × 1024 pixels) are set up 2 m away from the site of impact with a line of sight perpendicular to the kicking-leg side. In addition, two semi-high-speed cameras (EX-F1, Casio Computer Co., Ltd., Tokyo, Japan; 300 fps; 720 × 480 pixels) are positioned, one at the rear and the other on the kicking-leg side, to capture the kicking motion. We observe that the ankle joint at impact in the knuckling shot flexes in an approximate L-shape in a manner similar to the joint flexing for the curve shot. The hip's external rotation torque in the knuckling shot is greater than those of other shots, which suggests the tendency of the kicker to push the heel forward and impact with the inside of the foot. The angle of attack in the knuckling shot is smaller than that in other shots, and we speculate that this small attack angle is a factor in soccer kicks which generate shots with smaller rotational frequencies of the ball.
NASA Astrophysics Data System (ADS)
Feygels, Viktor I.; Park, Joong Yong; Wozencraft, Jennifer; Aitken, Jennifer; Macon, Christopher; Mathur, Abhinav; Payment, Andy; Ramnath, Vinod
2013-06-01
CZMIL is an integrated lidar-imagery system and software suite designed for highly automated generation of physical and environmental information products for coastal zone mapping in the framework of the US Army Corps of Engineers (USACE) National Coastal Mapping Program (NCMP). This paper presents the results of CZMIL system validation in turbid water conditions along the Gulf Coast of Mississippi and in relatively clear water conditions in Florida in late spring 2012. Results of the USACE May-October 2012 mission in Green Bay, WI and Lake Erie are presented. The system performance tests show that CZMIL successfully achieved 7-8m depth in Mississippi with Kd=0.46m-1 (Kd is the diffuse attenuation coefficient) and up to 41m in Florida when Kd=0.11m-1. Bathymetric accuracy of CZMIL was measured by comparing CZMIL depths with multi-beam sonar data from Cat Island, MS, and from off the coast of Fort Lauderdale, FL. Validation demonstrated that CZMIL meets USACE specifications (two standard deviations, 2σ, ~30 cm). To measure topographic accuracy we made direct comparisons of CZMIL elevations to GPS-surveyed ground control points and vehicle-based lidar scans of topographic surfaces. Results confirmed that CZMIL meets the USACE topographic requirements (2σ, ~15 cm). Upon completion of the Green Bay and Lake Erie mission there were 89 flights with 2231 flightlines. Total aircraft engine time (which does not include transit/ferry flights) was 441 hours, with 173 hours on survey flightlines. The 4.8 billion (!) laser shots and 38.6 billion digitized waveforms covered over 1025 miles of shoreline.
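The 2σ accuracy criterion used in the validation is simply twice the standard deviation of the lidar-minus-reference differences at co-located points. A sketch with made-up numbers (not the paper's survey data):

```python
import statistics

def two_sigma(lidar, reference):
    """2-sigma spread of lidar-minus-reference differences, in the input units."""
    diffs = [a - b for a, b in zip(lidar, reference)]
    return 2 * statistics.stdev(diffs)

# Hypothetical co-located depth samples (meters): lidar vs. multibeam sonar.
czmil = [10.02, 12.11, 8.95, 15.08, 9.97, 11.88, 14.03, 13.10]
sonar = [10.00, 12.00, 9.00, 15.00, 10.00, 12.00, 14.00, 13.00]
spread = two_sigma(czmil, sonar)
meets_spec = spread <= 0.30   # USACE bathymetric spec cited above: 2-sigma ~30 cm
```

Note the spec is a spread criterion, so a constant bias between the two sensors would not show up here; a full validation would report the mean difference alongside the 2σ value.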
Acute effects of countermovement jumping and sprinting on shot put performance.
Terzis, Gerasimos; Karampatsos, Giorgos; Kyriazis, Thomas; Kavouras, Stavros A; Georgiadis, Giorgos
2012-03-01
The purpose of this study was to investigate the acute effects of countermovement jumping and sprinting on shot put performance in experienced shot putters. Ten shot putters (best performance 13.16-20.36 m) participated in the study. After a standard warm-up including jogging, stretching, and 4-6 submaximal puts, they performed 3 shot put attempts with maximum effort, separated by a 1.5-minute interval. Three minutes later, they performed 3 maximal consecutive countermovement jumps (CMJs). Immediately after jumping, they performed 3 shot put attempts with maximum effort, separated by a 1.5-minute interval. One week later, they carried out a similar protocol, at similar external conditions, but they performed a bout of 20-m sprinting instead of the CMJs, to potentiate shot put performance. Muscular strength (1 repetition maximum in squat, snatch, bench press, incline bench press) and body composition (dual x-ray absorptiometry) were measured during the same training period (±10 days from the jumping and sprinting protocols). Shot put performance was significantly increased after the CMJs (15.45 ± 2.36 vs. 15.85 ± 2.41 m, p = 0.0003). Similarly, shot put performance was significantly increased after sprinting (15.34 ± 2.41 vs. 15.90 ± 2.46 m, p = 0.0007). The increase in performance after sprinting was significantly higher compared with the increase after jumping (2.64 ± 1.59 vs. 3.74 ± 1.88%, p = 0.02). In conclusion, the results of this study indicate that a standard warm-up protocol followed by 3 maximal bouts of shot put and either 3 consecutive countermovement jumps or a bout of 20-m sprinting induces an acute increase in shot put performance in experienced shot putters.
A new method for assessing squash tactics using 15 court areas for ball locations.
Vučković, Goran; James, Nic; Hughes, Mike; Murray, Stafford; Milanović, Zoran; Perš, Janez; Sporiš, Goran
2014-04-01
Tactics in squash have typically been assessed using the frequency of different shot types played at different locations on the court, either without reference to other relevant information or only on the basis of the preceding shot. This paper presents a new squash-specific method for categorizing the court locations in which the ball was played, a novel technique for assessing the reliability of this method, and typical shot responses in these new areas controlled for the preceding shot as well as the time between shots and the handedness of the players. Twelve games were viewed using the SAGIT/Squash software, and 2,907 shots were viewed a second time from a video image taken from behind the court, with an overall agreement of 88.90% for the court location data and 99.52% for shot type. A further 3,192 shots from 9 matches at the 2003 World Team Championships were analyzed in SAGIT/Squash. In the court areas analyzed, between 2 and 7 shot responses were predominant, suggesting that tactical patterns were evident. This was supported by differences between shot responses played from the two back corners, where the backhand side was characterized by a predominance of straight drives whereas both straight and crosscourt drives were played on the forehand side. These results tended to confirm that tactics, i.e., consistent shot types, are played, although these are only apparent when factors that determine shot selection are accounted for. This paper has controlled for some of these factors, but others need to be considered, e.g., if individual player profiles are to be ascertained. Copyright © 2014 Elsevier B.V. All rights reserved.
The future of automation for high-volume wafer fabrication and ASIC manufacturing
NASA Astrophysics Data System (ADS)
Hughes, Randall A.; Shott, John D.
1986-12-01
A framework is given to analyze the future trends in semiconductor manufacturing automation systems, focusing specifically on the needs of ASIC (application-specific integrated circuit) or custom integrated circuit manufacturing. Advances in technologies such as gate arrays and standard cells now make it significantly easier to obtain system cost and performance advantages by integrating nonstandard functions on silicon. ASICs are attractive to U.S. manufacturers because they place a premium on sophisticated design tools, familiarity with customer needs and applications, and fast turn-around fabrication. These are areas where U.S. manufacturers believe they have an advantage and, consequently, will not suffer from the severe price/manufacturing competition encountered in conventional high-volume semiconductor products. Previously, automation was often considered viable only for high-volume manufacturing, but automation becomes a necessity in the new ASIC environment.
Social aspects of automation: Some critical insights
NASA Astrophysics Data System (ADS)
Nouzil, Ibrahim; Raza, Ali; Pervaiz, Salman
2017-09-01
Sustainable development has been recognized globally as one of the major driving forces behind current technological innovations. To achieve sustainable development and attain its associated goals, it is very important to properly address its concerns in different aspects of technological innovation. Several industrial sectors have enjoyed productivity and economic gains due to the advent of automation technology, so it is important to characterize sustainability for automation technology. Sustainability is a key factor that will determine the future of our neighbours in time, and it must be tightly wrapped around the double-edged sword of technology. In this study, different impacts of automation are addressed using the ‘Circles of Sustainability’ approach as a framework, covering economic, political, cultural, and ecological aspects and their implications. A systematic literature review of automation technology from its inception is outlined and plotted against its many outcomes, covering a broad spectrum. The study focuses particularly on the social aspects of automation technology. It also reviews the literature on employment displacement as one end of the social impact spectrum; on the other end of the spectrum, benefits to society through technological advancements, such as the Internet of Things (IoT) coupled with automation, are presented.
A Transparent and Transferable Framework for Tracking Quality Information in Large Datasets
Smith, Derek E.; Metzger, Stefan; Taylor, Jeffrey R.
2014-01-01
The ability to evaluate the validity of data is essential to any investigation, and manual “eyes on” assessments of data quality have dominated in the past. Yet, as the size of collected data continues to increase, so does the effort required to assess their quality. This challenge is of particular concern for networks that automate their data collection, and has resulted in the automation of many quality assurance and quality control analyses. Unfortunately, the interpretation of the resulting data quality flags can become quite challenging with large data sets. We have developed a framework to summarize data quality information and facilitate interpretation by the user. Our framework consists of first compiling data quality information and then presenting it through two separate mechanisms: a quality report and a quality summary. The quality report presents the results of specific quality analyses as they relate to individual observations, while the quality summary takes a spatial or temporal aggregate of each quality analysis and provides a summary of the results. Included in the quality summary is a final quality flag, which further condenses data quality information to assess whether a data product is valid or not. This framework has the added flexibility to allow “eyes on” information on data quality to be incorporated for many data types. Furthermore, this framework can aid problem tracking and resolution, should sensor or system malfunctions arise. PMID:25379884
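A minimal sketch of the report/summary split the abstract describes: per-observation flags form the quality report, and aggregating them over a window yields the quality summary and a single final flag. The function name and the 10% failure threshold are our assumptions, not values from the paper.

```python
# Hypothetical sketch: "report" maps each quality test to a list of
# per-observation flags (0 = pass, 1 = fail). The summary aggregates these
# into failure rates, and the final flag marks the product invalid if any
# test's failure rate exceeds an assumed threshold.
def quality_summary(report, fail_threshold=0.10):
    summary = {}
    for test, flags in report.items():
        summary[test] = sum(flags) / len(flags)  # failure rate per test
    final_flag = int(any(rate > fail_threshold for rate in summary.values()))
    return summary, final_flag

report = {"range": [0, 0, 0, 1], "spike": [0, 0, 0, 0]}
summary, final_flag = quality_summary(report)
print(summary, final_flag)
```

Here the range test fails on 25% of observations, so the aggregated product is flagged even though most individual observations pass.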
Movement Analysis Applied to the Basketball Jump Shot--Part II.
ERIC Educational Resources Information Center
Martin, Thomas P.
1981-01-01
The jump shot is one of the most important shots in the game of basketball. The movement analysis of the jump shot designates four phases: (1) preparatory position; (2) movement phase I (crouch); (3) movement phase II (jump); and (4) follow-through. (JN)
Surface roughness formation during shot peen forming
NASA Astrophysics Data System (ADS)
Koltsov, V. P.; Vinh, Le Tri; Starodubtseva, D. A.
2018-03-01
Shot peen forming (SPF) is used for forming panels and skins, and for hardening. As a rule, shot peen forming is performed after milling. The resulting surface roughness is a complex structure: a combination of the original microrelief and shot peen forming indentations of varying depth, distributed chaotically along the surface. Because shot peen forming is a random process, the surface roughness resulting from milling followed by shot peen forming is random too. During roughness monitoring, it is difficult to determine the basic surface area which would ensure accurate results. It can be assumed that the basic area depends on the random roughness, which is characterized by the degree of shot peen forming coverage. Analysis of the depth and distribution of shot peen forming indentations along the surface made it possible to identify the shift of the original center profile plane and to create a mathematical model for the arithmetic mean deviation of the profile. Experimental testing proved model validity and determined an inversely proportional dependency of the basic area on the degree of coverage.
Shot Peening Numerical Simulation of Aircraft Aluminum Alloy Structure
NASA Astrophysics Data System (ADS)
Liu, Yong; Lv, Sheng-Li; Zhang, Wei
2018-03-01
After shot peening, the 7050 aluminum alloy has good anti-fatigue and anti-stress-corrosion properties. In the shot peening process, pellets collide with the target material randomly, generating a residual stress distribution on the target surface that is of great significance for improving material properties. In this paper, a simplified numerical simulation model of shot peening was established. The influence of pellet collision velocity, pellet collision position, and pellet collision time interval on the residual stress of shot peening was studied, simulated with ANSYS/LS-DYNA software. The analysis results show that different velocities, positions, and time intervals have great influence on the residual stress after shot peening. Comparison with numerical simulation results based on a Kriging model verified the accuracy of the simulation results in this paper. This study provides a reference for the optimization of the shot peening process and an effective exploration of precise shot peening numerical simulation.
Re-presentations of space in Hollywood movies: an event-indexing analysis.
Cutting, James; Iricinschi, Catalina
2015-03-01
Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.
Implementation of continuous-variable quantum key distribution with discrete modulation
NASA Astrophysics Data System (ADS)
Hirano, Takuya; Ichikawa, Tsubasa; Matsubara, Takuto; Ono, Motoharu; Oguri, Yusuke; Namiki, Ryo; Kasai, Kenta; Matsumoto, Ryutaroh; Tsurumaru, Toyohiro
2017-06-01
We have developed a continuous-variable quantum key distribution (CV-QKD) system that employs discrete quadrature-amplitude modulation and homodyne detection of coherent states of light. We experimentally demonstrated automated secure key generation at a rate of 50 kbps over a quantum channel of 10 km of optical fibre. The CV-QKD system uses a four-state, post-selection protocol and generates a key secure against the entangling cloner attack. We used a pulsed light source of 1550 nm wavelength with a repetition rate of 10 MHz. A commercially available balanced receiver is used to realise shot-noise-limited pulsed homodyne detection. We used a non-binary LDPC code for error correction (reverse reconciliation) and Toeplitz matrix multiplication for privacy amplification. A graphical processing unit card is used to accelerate the software-based post-processing.
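The privacy-amplification step mentioned above (Toeplitz matrix multiplication) amounts to multiplying the reconciled key, viewed as a bit vector, by a random Toeplitz matrix over GF(2) to produce a shorter secret key. The sketch below is a generic illustration with arbitrary key/seed sizes, not the authors' GPU implementation.

```python
import random

def toeplitz_hash(key_bits, out_len, seed_bits):
    """Multiply key_bits by a random Toeplitz matrix over GF(2).

    seed_bits defines the matrix and must have length out_len + n - 1,
    where n = len(key_bits); entry (i, j) is seed_bits[i + n - 1 - j].
    """
    n = len(key_bits)
    assert len(seed_bits) == out_len + n - 1
    out = []
    for i in range(out_len):
        acc = 0
        for j, bit in enumerate(key_bits):
            acc ^= bit & seed_bits[i + n - 1 - j]  # dot product mod 2
        out.append(acc)
    return out

random.seed(0)
key = [random.randint(0, 1) for _ in range(16)]       # reconciled key
seed = [random.randint(0, 1) for _ in range(8 + 15)]  # random Toeplitz seed
secret = toeplitz_hash(key, 8, seed)                  # compressed secret key
print(secret)
```

Because the Toeplitz family is universal, the output length (here 8 from 16) is chosen in practice from the estimated information leaked to an eavesdropper.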
Investigations of shot reproducibility for the SMP diode at 4.5 MV.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Nichelle; Crain, Marlon D.; Droemer, Darryl W.
In experiments conducted on the RITS-6 accelerator, the SMP diode exhibits significant shot-to-shot variability. Specifically, for identical hardware operated at the same voltage, some shots exhibit a catastrophic drop in diode impedance. A study is underway to identify sources of shot-to-shot variations which correlate with diode impedance collapse. To remove knob emission as a source, only data from a shot series conducted with a 4.5-MV peak voltage are considered. The scope of this report is limited to sources of variability which occur away from the diode, such as power flow emission and trajectory changes, variations in pulsed power, dustbin and transmission line alignment, and different knob shapes. We find no changes in the transmission line hardware, alignment, or hardware preparation methods which correlate with impedance collapse. However, in classifying good versus poor shots, we find that there is not a continuous spectrum of diode impedance behavior but that the good and poor shots can be grouped into two distinct impedance profiles. This result forms the basis of a follow-on study focusing on the variability resulting from diode physics.
Chen, Nan-kuei; Guidon, Arnaud; Chang, Hing-Chiu; Song, Allen W.
2013-01-01
Diffusion weighted magnetic resonance imaging (DWI) data have been mostly acquired with single-shot echo-planar imaging (EPI) to minimize motion induced artifacts. The spatial resolution, however, is inherently limited in single-shot EPI, even when the parallel imaging (usually at an acceleration factor of 2) is incorporated. Multi-shot acquisition strategies could potentially achieve higher spatial resolution and fidelity, but they are generally susceptible to motion-induced phase errors among excitations that are exacerbated by diffusion sensitizing gradients, rendering the reconstructed images unusable. It has been shown that shot-to-shot phase variations may be corrected using navigator echoes, but at the cost of imaging throughput. To address these challenges, a novel and robust multi-shot DWI technique, termed multiplexed sensitivity-encoding (MUSE), is developed here to reliably and inherently correct nonlinear shot-to-shot phase variations without the use of navigator echoes. The performance of the MUSE technique is confirmed experimentally in healthy adult volunteers on 3 Tesla MRI systems. This newly developed technique should prove highly valuable for mapping brain structures and connectivities at high spatial resolution for neuroscience studies. PMID:23370063
Vehicle automation: a remedy for driver stress?
Funke, G; Matthews, G; Warm, J S; Emo, A K
2007-08-01
The present study addressed the effects of stress, vehicle automation and subjective state on driver performance and mood in a simulated driving task. A total of 168 college students participated. Participants in the stress-induction condition completed a 'winter' drive, which included periodic loss of control episodes. Participants in the no-stress-induction condition were not exposed to loss of control. An additional, independent manipulation of vehicle speed was also conducted, consisting of two control conditions requiring manual speed regulation and a third in which vehicle speed was automatically regulated by the simulation. Stress and automation both influenced subjective distress, but the two factors did not interact. Driver performance data indicated that vehicle automation impacted performance similarly in the stress and no-stress conditions. Individual differences in subjective stress response and performance were also investigated. Resource theory provides a framework that partially but not completely explains the relationship between vehicle automation and driver stress. Implications for driver workload, safety and training are discussed.
Harnessing Vehicle Automation for Public Mobility -- An Overview of Ongoing Efforts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, Stanley E.
This presentation takes a look at efforts to harness automated vehicle technology for public transport. The European CityMobil2 project is the leading demonstration effort, in which automated shuttles were, or are planned to be, demonstrated in several cities and regions. The presentation provides a brief overview of the demonstrations at Oristano, Italy (July 2014), La Rochelle, France (Dec 2014), Lausanne, Switzerland (Apr 2015), Vantaa, Finland (July 2015), and Trikala, Greece (Sept 2015). In addition to technology exposition, the objectives included generating a legal framework for operation in each location and gauging the reaction of the public to unmanned shuttles, both of which were successfully achieved. Several such demonstrations are planned throughout the world, including efforts in North America in conjunction with the GoMentum Station in California. These early demonstrations with low-speed automated shuttles provide a glimpse of what is possible with a fully automated fleet of driverless vehicles providing a public transit service.
Bayesian ISOLA: new tool for automated centroid moment tensor inversion
NASA Astrophysics Data System (ADS)
Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John
2017-08-01
We have developed a new, fully automated tool for the centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection where station components with various instrumental disturbances are rejected and full-waveform inversion in a space-time grid around a provided hypocentre. A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequency ranges. The method is tested on synthetic and observed data. It is applied on a data set from the Swiss seismic network and the results are compared with the existing high-quality MT catalogue. The software package programmed in Python is designed to be as versatile as possible in order to be applicable in various networks ranging from local to regional. The method can be applied either to the everyday network data flow, or to process large pre-existing earthquake catalogues and data sets.
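The noise-based station weighting described above can be illustrated with a toy inverse-variance scheme. This is a sketch of the idea only: the actual method builds a full data covariance matrix from the pre-event noise, not the scalar per-station weights shown here.

```python
# Toy inverse-variance weights from pre-event noise windows: stations whose
# pre-event recordings are noisier receive smaller weights in the inversion.
# Illustrative only; Bayesian ISOLA uses a full noise covariance matrix.
def noise_weights(pre_event_windows):
    weights = []
    for window in pre_event_windows:
        mean = sum(window) / len(window)
        variance = sum((x - mean) ** 2 for x in window) / len(window)
        weights.append(1.0 / variance)
    return weights

quiet = [1.0, -1.0, 1.0, -1.0]   # pre-event noise, variance 1.0
noisy = [2.0, -2.0, 2.0, -2.0]   # pre-event noise, variance 4.0
print(noise_weights([quiet, noisy]))
```

The full covariance treatment generalizes this by also capturing the frequency content of the noise, which is what lets it act as an automated frequency filter.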
Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications
NASA Technical Reports Server (NTRS)
Chaki, Sagar; Gurfinkel, Arie
2010-01-01
We develop a learning-based automated Assume-Guarantee (AG) reasoning framework for verifying omega-regular properties of concurrent systems. We study the applicability of the non-circular (AG-NC) and circular (AG-C) AG proof rules in the context of systems with infinite behaviors. In particular, we show that AG-NC is incomplete when assumptions are restricted to strictly infinite behaviors, while AG-C remains complete. We present a general formalization, called LAG, of the learning-based automated AG paradigm and show how existing approaches for automated AG reasoning are special instances of LAG. We develop two learning algorithms for a class of systems, called infinite regular systems, that combine finite and infinite behaviors. We show that for infinite regular systems, both AG-NC and AG-C are sound and complete. Finally, we show how to instantiate LAG to do automated AG reasoning for infinite regular, and omega-regular, systems using both AG-NC and AG-C as proof rules.
Automating Access Control Logics in Simple Type Theory with LEO-II
NASA Astrophysics Data System (ADS)
Benzmüller, Christoph
Garg and Abadi recently proved that prominent access control logics can be translated in a sound and complete way into modal logic S4. We have previously outlined how normal multimodal logics, including monomodal logics K and S4, can be embedded in simple type theory, and we have demonstrated that the higher-order theorem prover LEO-II can automate reasoning in and about them. In this paper we combine these results and describe a sound (and complete) embedding of different access control logics in simple type theory. Employing this framework, we show that the off-the-shelf theorem prover LEO-II can be applied to automate reasoning in and about prominent access control logics.
Application-level regression testing framework using Jenkins
Budiardja, Reuben; Bouvet, Timothy; Arnold, Galen
2017-09-26
Monitoring and testing for regression of large-scale systems such as NCSA's Blue Waters supercomputer are challenging tasks. In this paper, we describe the solution we came up with to perform those tasks. The goal was to find an automated solution for running user-level regression tests to evaluate system usability and performance. Jenkins, an automation server software, was chosen for its versatility, large user base, and multitude of plugins, including collecting data and plotting test results over time. We also describe our Jenkins deployment to launch and monitor jobs on a remote HPC system, perform authentication with one-time passwords, and integrate with our LDAP server for authorization. We show some use cases and describe our best practices for successfully using Jenkins as a user-level, system-wide regression testing and monitoring framework for large supercomputer systems.
NASA Technical Reports Server (NTRS)
Bensalem, Saddek; Ganesh, Vijay; Lakhnech, Yassine; Munoz, Cesar; Owre, Sam; Ruess, Harald; Rushby, John; Rusu, Vlad; Saiedi, Hassen; Shankar, N.
2000-01-01
To become practical for assurance, automated formal methods must be made more scalable, automatic, and cost-effective. Such an increase in scope, scale, automation, and utility can be derived from an emphasis on a systematic separation of concerns during verification. SAL (Symbolic Analysis Laboratory) attempts to address these issues. It is a framework for combining different tools to calculate properties of concurrent systems. The heart of SAL is a language, developed in collaboration with Stanford, Berkeley, and Verimag, for specifying concurrent systems in a compositional way. Our instantiation of the SAL framework augments PVS with tools for abstraction, invariant generation, program analysis (such as slicing), theorem proving, and model checking to separate concerns as well as calculate properties (i.e., perform symbolic analysis) of concurrent systems. We describe the motivation, the language, the tools, their integration in SAL/PAS, and some preliminary experience of their use.
Gray, Sara; Borgundvaag, Bjug; Sirvastava, Anita; Randall, Ian; Kahan, Meldon
2010-10-01
Use of a symptom-triggered scale to measure the severity of alcohol withdrawal could reduce the rate of seizures and other complications. The current standard scale, the Clinical Institute of Withdrawal Assessment (CIWA), takes a mean (±SD) of 5 minutes to complete, requiring 30 minutes of nursing time per patient when multiple measures are required. The objective was to assess the feasibility and reliability of a brief scale of alcohol withdrawal severity. The SHOT is a brief scale designed to assess alcohol withdrawal in the emergency department (ED). It includes four items: sweating, hallucinations, orientation, and tremor (SHOT). It was developed based on a literature review and a consensus process by emergency and addiction physicians. The SHOT was first piloted in one ED, and then a prospective observational study was conducted at a different ED to measure its feasibility and reliability. Subjects included patients who were in alcohol withdrawal. One nurse administered the SHOT and CIWA, and the physician repeated the SHOT independently. The SHOT was done only at baseline, before treatment was administered. In the pilot study (12 patients), the SHOT took 1 minute to complete on average, and the CIWA took 5 minutes. Sixty-one patients participated in the prospective study. For the SHOT and the CIWA done by the same nurse, the kappa was 0.88 (95% confidence interval [CI] = 0.52 to 1.0; p < 0.0001), and the Pearson's r was 0.71 (p < 0.001). The kappa for the nurse's CIWA score and the physician's SHOT score was 0.61 (95% CI = 0.25 to 0.97; p < 0.0006), and the Pearson's r was 0.48 (p = 0.002). The SHOTs performed by the nurse and physician agreed on the need for benzodiazepine treatment in 30 of 37 cases (82% agreement, kappa = 0.35, 95% CI = 0.03 to 0.67; p < 0.02). The mean (±SD) time taken by nurses and physicians to complete the SHOT was 1 (± 0.52) minute (median = 0.6 minutes). 
Seventeen percent of patients scored positive on the SHOT for hallucinations or disorientation. The SHOT has potential as a feasible and acceptable tool for measuring pretreatment alcohol withdrawal severity in the ED. Further research is needed to validate the SHOT, to assess the utility of serial measurements of the SHOT, and to demonstrate that its use reduces length of stay and improves clinical outcomes. © 2010 by the Society for Academic Emergency Medicine.
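The agreement statistics reported above are Cohen's kappa; for readers unfamiliar with it, a minimal pure-Python implementation for two raters follows. The ratings in the example are invented for illustration, not the study's data.

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical judgements."""
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    # Expected chance agreement from each rater's marginal frequencies.
    p_chance = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical yes/no "needs benzodiazepine treatment" judgements:
nurse = [1, 1, 0, 0]
physician = [1, 0, 0, 0]
print(cohens_kappa(nurse, physician))  # 0.5
```

This is why the 82% raw agreement on treatment decisions above corresponds to a much lower kappa (0.35): kappa discounts the agreement expected by chance alone.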
NASA Technical Reports Server (NTRS)
Franck, Bruno M.
1990-01-01
The research is focused on automating the evaluation of complex structural systems, whether for the design of a new system or the analysis of an existing one, by developing new structural analysis techniques based on qualitative reasoning. The problem is to identify and better understand: (1) the requirements for the automation of design, and (2) the qualitative reasoning associated with the conceptual development of a complex system. The long-term objective is to develop an integrated design-risk assessment environment for the evaluation of complex structural systems. The scope of this short presentation is to describe the design and cognition components of the research. Design has received special attention in cognitive science because it is now identified as a problem solving activity that is different from other information processing tasks (1). Before an attempt can be made to automate design, a thorough understanding of the underlying design theory and methodology is needed, since the design process is, in many cases, multi-disciplinary, complex in size and motivation, and uses various reasoning processes involving different kinds of knowledge in ways which vary from one context to another. The objective is to unify all the various types of knowledge under one framework of cognition. This presentation focuses on the cognitive science framework that we are using to represent the knowledge aspects associated with the human mind's abstraction abilities and how we apply it to the engineering knowledge and engineering reasoning in design.
Kyriazis, Thomas A; Terzis, Gerasimos; Boudolos, Konstantinos; Georgiadis, Georgios
2009-09-01
The aim of this study was to investigate changes in shot put performance, muscular power, and neuromuscular activation of the lower extremities between the preseason and the competition period in skilled shot put athletes using the rotational technique. Shot put performance was assessed at the start of the preseason period as well as after 12 weeks, during the competition period, in nine shot putters. Electromyographic (EMG) activity of the right vastus lateralis muscle was recorded during all shot put trials. Maximum squat strength (1RM) and mechanical parameters during the countermovement jump (CMJ) on a force platform were also determined at preseason and at the competition period. Shot put performance increased 4.7% (p < 0.05), while 1RM squat strength increased 6.5% (p < 0.025). EMG activity during the delivery phase was increased significantly (p < 0.025) after the training period. Shot put performance was significantly related to muscular power and takeoff velocity during the CMJ at the competition period (r = 0.66, p < 0.05 and r = 0.70, p < 0.05), but not to maximum vertical force. The 1RM squat was not significantly related to shot put performance. These results suggest that muscular power of the lower extremities is a better predictor of rotational shot put performance than absolute muscular strength in skilled athletes, at least during the competition period.
Indirect monitoring shot-to-shot shock waves strength reproducibility during pump-probe experiments
NASA Astrophysics Data System (ADS)
Pikuz, T. A.; Faenov, A. Ya.; Ozaki, N.; Hartley, N. J.; Albertazzi, B.; Matsuoka, T.; Takahashi, K.; Habara, H.; Tange, Y.; Matsuyama, S.; Yamauchi, K.; Ochante, R.; Sueda, K.; Sakata, O.; Sekine, T.; Sato, T.; Umeda, Y.; Inubushi, Y.; Yabuuchi, T.; Togashi, T.; Katayama, T.; Yabashi, M.; Harmand, M.; Morard, G.; Koenig, M.; Zhakhovsky, V.; Inogamov, N.; Safronova, A. S.; Stafford, A.; Skobelev, I. Yu.; Pikuz, S. A.; Okuchi, T.; Seto, Y.; Tanaka, K. A.; Ishikawa, T.; Kodama, R.
2016-07-01
We present an indirect method of estimating the strength of a shock wave, allowing on-line monitoring of its reproducibility on each laser shot. The method is based on shot-to-shot measurement of the X-ray emission from the ablated plasma by a high-resolution, spatially resolved focusing spectrometer. An optical pump laser with an energy of 1.0 J and a pulse duration of ~660 ps was used to irradiate solid targets or foils of various thicknesses containing oxygen, aluminum, iron, and tantalum. The high sensitivity and resolving power of the X-ray spectrometer allowed spectra to be obtained on each laser shot and fluctuations of the spectral intensity emitted by different plasmas to be controlled with an accuracy of ~2%, implying an accuracy in the derived electron plasma temperature of 5%-10% in pump-probe high energy density science experiments. At nano- and sub-nanosecond laser pulse durations with relatively low laser intensities and a ratio Z/A ~ 0.5, the electron temperature follows Te ∝ Ilas^(2/3). Thus, measurements of the electron plasma temperature allow indirect estimation of the laser flux on the target and control of its shot-to-shot fluctuation. Knowing the laser flux intensity and its fluctuation gives us the possibility of monitoring the shot-to-shot reproducibility of shock wave strength generation with high accuracy.
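The Te ∝ I^(2/3) scaling implies simple error propagation: a fractional change in measured temperature corresponds to a 3/2-times-larger fractional change in laser flux. A quick sketch of that relation (our illustration of the scaling, not the authors' analysis code):

```python
# From Te ∝ I**(2/3): dTe/Te = (2/3) * dI/I, hence dI/I = 1.5 * dTe/Te.
def flux_fluctuation(te_fluctuation):
    """Fractional laser-flux fluctuation implied by a fractional Te fluctuation."""
    return 1.5 * te_fluctuation

# The 5%-10% temperature accuracy quoted above maps to 7.5%-15% in flux.
print(flux_fluctuation(0.05), flux_fluctuation(0.10))
```

This is the sense in which the spectrometer's temperature measurement yields an indirect, per-shot monitor of the laser flux and hence of shock strength reproducibility.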
The Use of Match Statistics that Discriminate Between Successful and Unsuccessful Soccer Teams
Castellano, Julen; Casamichana, David; Lago, Carlos
2012-01-01
Three soccer World Cups were analysed with the aim of identifying the match statistics which best discriminated between winning, drawing and losing teams. The analysis was based on 177 matches played during the three most recent World Cup tournaments: Korea/Japan 2002 (59), Germany 2006 (59) and South Africa 2010 (59). Two categories of variables were studied: 1) those related to attacking play: goals scored, total shots, shots on target, shots off target, ball possession, number of off-sides committed, fouls received and corners; and 2) those related to defence: total shots received, shots on target received, shots off target received, off-sides received, fouls committed, corners against, yellow cards and red cards. Discriminant analysis of these matches revealed the following: (a) the variables related to attacking play that best differentiated between winning, drawing and losing teams were total shots, shots on target and ball possession; and (b) the most discriminating variables related to defence were total shots received and shots on target received. These results suggest that winning, drawing and losing national teams may be discriminated from one another on the basis of variables such as ball possession and the effectiveness of their attacking play. This information may be of benefit to both coaches and players, adding to their knowledge about soccer performance indicators and helping to guide the training process. PMID:23487020
Waterfowl exposure to lead and steel shot on selected hunting areas
White, D.H.; Stendell, R.C.
1977-01-01
Gizzards and wingbones from immature mallards (Anas platyrhynchos), pintails (Anas acuta), black ducks (A. rubripes), and Canada geese (Branta canadensis) were collected from 12 national and state hunting areas during the hunting season of 1974-75. The gizzards were examined for the occurrence of lead and steel shot and the wingbones were analyzed for lead residues. Incidence of lead shot in gizzards ranged from 1.3 percent in mallards from Monte Vista National Wildlife Refuge to 29 percent in pintails from Sauvie Island Wildlife Management Area. Lead in wingbones ranged from trace residues (<0.5 ppm) to 345 ppm. The incidence of steel shot in gizzards surpassed lead shot on some refuges that have had mandatory steel shot programs. There was a significant correlation between frequency of lead shot in gizzards and lead residues in wingbones.
GENERALITY OF THE MATCHING LAW AS A DESCRIPTOR OF SHOT SELECTION IN BASKETBALL
Alferink, Larry A; Critchfield, Thomas S; Hitt, Jennifer L; Higgins, William J
2009-01-01
Based on a small sample of highly successful teams, past studies suggested that shot selection (two- vs. three-point field goals) in basketball corresponds to predictions of the generalized matching law. We examined the generality of this finding by evaluating shot selection of college (Study 1) and professional (Study 3) players. The matching law accounted for the majority of variance in shot selection, with undermatching and a bias for taking three-point shots. Shot-selection matching varied systematically for players who (a) were members of successful versus unsuccessful teams, (b) competed at different levels of collegiate play, and (c) served as regulars versus substitutes (Study 2). These findings suggest that the matching law is a robust descriptor of basketball shot selection, although the mechanism that produces matching is unknown. PMID:20190921
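The generalized matching law referred to here is usually fit in log-ratio form, log(B1/B2) = a·log(R1/R2) + log b, where a < 1 is undermatching and log b ≠ 0 is response bias. A minimal least-squares sketch (not the authors' code; ratio inputs are assumed precomputed):

```python
import math

def fit_matching_law(behavior_ratios, reinforcer_ratios):
    """Fit log(B1/B2) = a*log(R1/R2) + log(b) by ordinary least squares.

    Returns (a, log_b): a < 1 indicates undermatching; log_b != 0
    indicates response bias (e.g. toward three-point shots).
    """
    xs = [math.log10(r) for r in reinforcer_ratios]
    ys = [math.log10(b) for b in behavior_ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx
```

Applied to per-player or per-team shot and point ratios, the fitted slope and intercept correspond directly to the sensitivity and bias parameters the study reports.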
Boom Minimization Framework for Supersonic Aircraft Using CFD Analysis
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Rallabhandi, Sriram K.
2010-01-01
A new framework is presented for shape optimization using analytical shape functions and high-fidelity computational fluid dynamics (CFD) via Cart3D. The focus of the paper is the system-level integration of several key enabling analysis tools and automation methods to perform shape optimization and reduce sonic boom footprint. A boom mitigation case study subject to performance, stability and geometrical requirements is presented to demonstrate a subset of the capabilities of the framework. Lastly, a design space exploration is carried out to assess the key parameters and constraints driving the design.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Criteria for Claims Relating to Leukemia § 79.11 Definitions. (a) Affected area means one of the following... Island, Christmas Island, the test site for the shot during Operation Wigwam, the test site for Shot Yucca during Operation Hardtack I, and the test sites for Shot Frigate Bird and Shot Swordfish during...
Code of Federal Regulations, 2014 CFR
2014-07-01
... Criteria for Claims Relating to Leukemia § 79.11 Definitions. (a) Affected area means one of the following... Island, Christmas Island, the test site for the shot during Operation Wigwam, the test site for Shot Yucca during Operation Hardtack I, and the test sites for Shot Frigate Bird and Shot Swordfish during...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Criteria for Claims Relating to Leukemia § 79.11 Definitions. (a) Affected area means one of the following... Island, Christmas Island, the test site for the shot during Operation Wigwam, the test site for Shot Yucca during Operation Hardtack I, and the test sites for Shot Frigate Bird and Shot Swordfish during...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Criteria for Claims Relating to Leukemia § 79.11 Definitions. (a) Affected area means one of the following... Island, Christmas Island, the test site for the shot during Operation Wigwam, the test site for Shot Yucca during Operation Hardtack I, and the test sites for Shot Frigate Bird and Shot Swordfish during...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Criteria for Claims Relating to Leukemia § 79.11 Definitions. (a) Affected area means one of the following... Island, Christmas Island, the test site for the shot during Operation Wigwam, the test site for Shot Yucca during Operation Hardtack I, and the test sites for Shot Frigate Bird and Shot Swordfish during...
NASA Astrophysics Data System (ADS)
Wollman, Adam J. M.; Miller, Helen; Foster, Simon; Leake, Mark C.
2016-10-01
Staphylococcus aureus is an important pathogen, with antimicrobial-resistant strains such as Methicillin-Resistant S. aureus (MRSA). Here we report an image analysis framework for automated detection and image segmentation of cells in S. aureus cell clusters, and explicit identification of their cell division planes. We use a new combination of several existing analytical tools of image analysis to detect cellular and subcellular morphological features relevant to cell division from millisecond time scale sampled images of live pathogens at a detection precision of single molecules. We demonstrate this approach using a fluorescent reporter GFP fused to the protein EzrA that localises to a mid-cell plane during division and is involved in regulation of cell size and division. This image analysis framework presents a valuable platform from which to study candidate new antimicrobials which target the cell division machinery, but may also have more general application in detecting morphologically complex structures of fluorescently labelled proteins present in clusters of other types of cells.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.
1992-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).
The impact of automation on pharmacy staff experience of workplace stressors.
James, K Lynette; Barlow, Dave; Bithell, Anne; Hiom, Sarah; Lord, Sue; Oakley, Pat; Pollard, Mike; Roberts, Dave; Way, Cheryl; Whittlesea, Cate
2013-04-01
Determine the effect of installing an original pack automated dispensing system (ADS) on staff experience of occupational stressors. Pharmacy staff in a National Health Service hospital in Wales, UK, were administered an anonymous occupational stressor questionnaire pre- (n = 45) and post-automation (n = 32). Survey responses pre- and post-automation were compared using the Mann-Whitney U test. Statistical significance was P ≤ 0.05. Four focus groups were conducted post-automation (two groups of accredited checking technicians (ACTs): group 1, n = 4; group 2, n = 6; one group of pharmacists, n = 17; and one group of technicians, n = 4) to explore staff experiences of occupational stressors. Focus group transcripts were analysed according to framework analysis. The survey response rate was 78% (n = 35) pre-automation and 49% (n = 16) post-automation. Automation had a positive impact on staff experience of stress (P = 0.023), illogical workload allocation (P = 0.004) and work-life balance (P = 0.05). All focus-group participants reported that automation had created a spacious working environment. Pharmacists and ACTs reported that automation had enabled the expansion of their roles. Technicians felt like 'production-line workers.' Robot malfunction was a source of stress. The findings suggest that automation had a positive impact on staff experience of stressors, improving working conditions and workload. Technicians reported that ADS devalued their skills. When installing ADS, pharmacy managers must consider the impact of automation on staff. Strategies to reduce stressors associated with automation include rotating staff activities and role expansions. © 2012 The Authors. IJPP © 2012 Royal Pharmaceutical Society.
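The pre/post comparison above rests on the Mann-Whitney U statistic; a minimal pure-Python sketch of that statistic (ties counted as one half), not the study's actual analysis:

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for sample a against sample b.

    Counts, over all pairs, how often a value from a exceeds one
    from b; ties contribute 0.5. U ranges from 0 to len(a)*len(b).
    """
    u = 0.0
    for x in a:
        for y in b:
            u += 1.0 if x > y else (0.5 if x == y else 0.0)
    return u
```

A U near 0 or near len(a)·len(b) indicates the two survey waves' stressor ratings are well separated; significance would then be read from U's null distribution (or a normal approximation for larger samples).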
Comparison of two drug safety signals in a pharmacovigilance data mining framework.
Tubert-Bitter, Pascale; Bégaud, Bernard; Ahmed, Ismaïl
2016-04-01
Since adverse drug reactions are a major public health concern, early detection of drug safety signals has become a top priority for regulatory agencies and the pharmaceutical industry. Quantitative methods for analyzing spontaneous reporting material recorded in pharmacovigilance databases through data mining have been proposed in the last decades and are increasingly used to flag potential safety problems. While automated data mining is motivated by the usually huge size of pharmacovigilance databases, it does not systematically produce relevant alerts. Moreover, each detected signal requires appropriate assessment that may involve investigation of the whole therapeutic class. The goal of this article is to provide a methodology for comparing two detected signals. It is nested within the automated surveillance framework as (1) no extra information is required and (2) no simple inference on the actual risks can be extrapolated from spontaneous reporting data. We designed our methodology on the basis of two classical methods used for automated signal detection: the Bayesian Gamma Poisson Shrinker and the frequentist Proportional Reporting Ratio. A simulation study was conducted to assess the performances of both proposed methods. The latter were used to compare cardiovascular signals for two HIV treatments from the French pharmacovigilance database. © The Author(s) 2012.
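The frequentist Proportional Reporting Ratio named above is computed from a 2x2 table of spontaneous reports; a minimal sketch (the counts in the usage line are illustrative, not from the French database):

```python
def prr(a, b, c, d):
    """Proportional Reporting Ratio from a 2x2 spontaneous-report table.

    a: reports with the drug and the event    b: drug, other events
    c: other drugs, the event                 d: other drugs, other events
    """
    return (a / (a + b)) / (c / (c + d))

# Illustrative counts: the event appears in 10% of the drug's reports
# but only 1% of all other reports, giving PRR = 10.
signal = prr(10, 90, 100, 9900)
```

A PRR well above 1 (combined, in practice, with count and chi-square thresholds) flags a potential signal; the article's contribution is comparing two such detected signals, which requires modeling the joint uncertainty of two ratios rather than one.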
Framework for Automated GD&T Inspection Using 3D Scanner
NASA Astrophysics Data System (ADS)
Pathak, Vimal Kumar; Singh, Amit Kumar; Sivadasan, M.; Singh, N. K.
2018-04-01
Geometric Dimensioning and Tolerancing (GD&T) is a typical dialect that helps designers, production personnel and quality monitors convey design specifications in an effective and efficient manner. GD&T has been practiced since the start of machine component assembly, but without overtly naming it. However, in recent times industries have started placing increasing emphasis on it. One prominent area where most industries struggle is quality inspection. The complete inspection process is mostly human-intensive. Also, the use of conventional gauges and templates for inspection purposes depends highly on the skill of workers and quality inspectors. In industry, the concept of 3D scanning is not new, but it is used only for creating 3D drawings or modelling of physical parts. However, the potential of 3D scanning as a powerful inspection tool is hardly explored. This study is centred on designing a procedure for automated inspection using a 3D scanner. Linear, geometric and dimensional inspection of the most popular test bar, the stepped bar, was also carried out as a simple example under the new framework. The new generation of engineering industries would welcome this automated inspection procedure, being quick and reliable with reduced human intervention.
OPPL-Galaxy, a Galaxy tool for enhancing ontology exploitation as part of bioinformatics workflows
2013-01-01
Background Biomedical ontologies are key elements for building up the Life Sciences Semantic Web. Reusing and building biomedical ontologies requires flexible and versatile tools to manipulate them efficiently, in particular for enriching their axiomatic content. The Ontology Pre Processor Language (OPPL) is an OWL-based language for automating the changes to be performed in an ontology. OPPL augments the ontologists’ toolbox by providing a more efficient, and less error-prone, mechanism for enriching a biomedical ontology than that obtained by a manual treatment. Results We present OPPL-Galaxy, a wrapper for using OPPL within Galaxy. The functionality delivered by OPPL (i.e. automated ontology manipulation) can be combined with the tools and workflows devised within the Galaxy framework, resulting in an enhancement of OPPL. Use cases are provided in order to demonstrate OPPL-Galaxy’s capability for enriching, modifying and querying biomedical ontologies. Conclusions Coupling OPPL-Galaxy with other bioinformatics tools of the Galaxy framework results in a system that is more than the sum of its parts. OPPL-Galaxy opens a new dimension of analyses and exploitation of biomedical ontologies, including automated reasoning, paving the way towards advanced biological data analyses. PMID:23286517
NASA Astrophysics Data System (ADS)
Dolly, Steven R.; Anastasio, Mark A.; Yu, Lifeng; Li, Hua
2017-03-01
In current radiation therapy practice, image quality is still assessed subjectively or by utilizing physically-based metrics. Recently, a methodology for objective task-based image quality (IQ) assessment in radiation therapy was proposed by Barrett et al. [1]. In this work, we present a comprehensive implementation and evaluation of this new IQ assessment methodology. A modular simulation framework was designed to perform an automated, computer-simulated end-to-end radiation therapy treatment. A fully simulated framework was created that utilizes new learning-based stochastic object models (SOM) to obtain known organ boundaries, generates a set of images directly from the numerical phantoms created with the SOM, and automates the image segmentation and treatment planning steps of a radiation therapy workflow. By use of this computational framework, therapeutic operating characteristic (TOC) curves can be computed and the area under the TOC curve (AUTOC) can be employed as a figure-of-merit to guide optimization of different components of the treatment planning process. The developed computational framework is employed to optimize X-ray CT pre-treatment imaging. We demonstrate that use of the radiation therapy-based IQ measures leads to different imaging parameters than obtained by use of physically-based measures.
Video Shot Boundary Detection Using QR-Decomposition and Gaussian Transition Detection
NASA Astrophysics Data System (ADS)
Amiri, Ali; Fathy, Mahmood
2010-12-01
This article explores the problem of video shot boundary detection and examines a novel shot boundary detection algorithm by using QR-decomposition and modeling of gradual transitions by Gaussian functions. Specifically, the authors attend to the challenges of detecting gradual shots and extracting appropriate spatiotemporal features that affect the ability of algorithms to efficiently detect shot boundaries. The algorithm utilizes the properties of QR-decomposition and extracts a block-wise probability function that illustrates the probability of video frames to be in shot transitions. The probability function has abrupt changes in hard cut transitions, and semi-Gaussian behavior in gradual transitions. The algorithm detects these transitions by analyzing the probability function. Finally, we will report the results of the experiments using large-scale test sets provided by the TRECVID 2006, which has assessments for hard cut and gradual shot boundary detection. These results confirm the high performance of the proposed algorithm.
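The QR-decomposition idea can be illustrated with an incremental Gram-Schmidt step: the residual of a new frame's feature vector against the span of recent frames is exactly the new diagonal entry of R when the frame is appended to the frame matrix, and a large residual hints at new visual content, i.e. a shot boundary. A sketch under that assumption (not the authors' block-wise probability function):

```python
def residual_norm(frame, basis):
    """Norm of the component of `frame` orthogonal to span(basis).

    Equals the new diagonal entry of R in an incremental QR update;
    near zero within a shot, large at a boundary (sketch only).
    """
    r = list(frame)
    for q in basis:  # basis vectors are orthonormal
        dot = sum(x * y for x, y in zip(r, q))
        r = [x - dot * y for x, y in zip(r, q)]
    return sum(x * x for x in r) ** 0.5

def gram_schmidt_append(frame, basis):
    """Extend the orthonormal basis with the normalized residual of `frame`."""
    norm = residual_norm(frame, basis)
    if norm > 1e-12:
        r = list(frame)
        for q in basis:
            dot = sum(x * y for x, y in zip(r, q))
            r = [x - dot * y for x, y in zip(r, q)]
        basis.append([x / norm for x in r])
    return basis
```

Thresholding the residual sequence would give hard cuts as abrupt spikes, while gradual transitions would appear as the smoother, semi-Gaussian ramps the article models explicitly.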
Influence of Running on Pistol Shot Hit Patterns.
Kerkhoff, Wim; Bolck, Annabel; Mattijssen, Erwin J A T
2016-01-01
In shooting scene reconstructions, risk assessment of the situation can be important for the legal system. Shooting accuracy and precision, and thus risk assessment, might be correlated with the shooter's physical movement and experience. The hit patterns of inexperienced and experienced shooters, while shooting stationary (10 shots) and in running motion (10 shots) with a semi-automatic pistol, were compared visually (with confidence ellipses) and statistically. The results show a significant difference in precision (circumference of the hit patterns) between stationary shots and shots fired in motion for both inexperienced and experienced shooters. The decrease in precision for all shooters was significantly larger in the y-direction than in the x-direction. The precision of the experienced shooters is overall better than that of the inexperienced shooters. No significant change in accuracy (shift in the hit pattern center) between stationary shots and shots fired in motion can be seen for all shooters. © 2015 American Academy of Forensic Sciences.
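The confidence ellipses used for the visual comparison can be derived from the eigenvalues of the 2-D covariance of the hit pattern; a minimal sketch (95% level via the hard-coded chi-square quantile; the function name and coordinates are illustrative):

```python
import math

def ellipse_axes(xs, ys, chi2=5.991):
    """Semi-axes of the 95% covariance ellipse of a 2-D hit pattern.

    chi2 = 5.991 is the 0.95 quantile of chi-square with 2 dof; the
    semi-axes are sqrt(chi2 * eigenvalue) for each covariance eigenvalue.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Closed-form eigenvalues of the symmetric 2x2 covariance matrix
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return math.sqrt(chi2 * (tr / 2 + disc)), math.sqrt(chi2 * (tr / 2 - disc))
```

The paper's finding that precision degrades mostly in the y-direction when running would show up here as the major axis growing and aligning vertically.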
30 CFR 75.1320 - Multiple-shot blasting.
Code of Federal Regulations, 2013 CFR
2013-07-01
... in a round shall be initiated in sequence from the opener hole or holes. (e) Arrangement of detonator... blasting coal off the solid— (i) Each shot in the round shall be initiated in sequence from the opener hole or holes; and (ii) After the first shot or shots, the interval between the designated delay periods...
30 CFR 75.1320 - Multiple-shot blasting.
Code of Federal Regulations, 2014 CFR
2014-07-01
... in a round shall be initiated in sequence from the opener hole or holes. (e) Arrangement of detonator... blasting coal off the solid— (i) Each shot in the round shall be initiated in sequence from the opener hole or holes; and (ii) After the first shot or shots, the interval between the designated delay periods...
30 CFR 75.1320 - Multiple-shot blasting.
Code of Federal Regulations, 2012 CFR
2012-07-01
... in a round shall be initiated in sequence from the opener hole or holes. (e) Arrangement of detonator... blasting coal off the solid— (i) Each shot in the round shall be initiated in sequence from the opener hole or holes; and (ii) After the first shot or shots, the interval between the designated delay periods...
30 CFR 75.1320 - Multiple-shot blasting.
Code of Federal Regulations, 2011 CFR
2011-07-01
... in a round shall be initiated in sequence from the opener hole or holes. (e) Arrangement of detonator... blasting coal off the solid— (i) Each shot in the round shall be initiated in sequence from the opener hole or holes; and (ii) After the first shot or shots, the interval between the designated delay periods...
30 CFR 75.1320 - Multiple-shot blasting.
Code of Federal Regulations, 2010 CFR
2010-07-01
... in a round shall be initiated in sequence from the opener hole or holes. (e) Arrangement of detonator... blasting coal off the solid— (i) Each shot in the round shall be initiated in sequence from the opener hole or holes; and (ii) After the first shot or shots, the interval between the designated delay periods...
Code of Federal Regulations, 2013 CFR
2013-10-01
... approved nontoxic shot (see § 32.2(k)). 2. We allow upland game hunting on the 131-acre mainland unit of... Statewide seasons using archery methods or shotguns using shot no larger than BB. C. Big Game Hunting. We... with shot no larger than BB. 6. You may possess only approved nontoxic shot while hunting on the refuge...
Code of Federal Regulations, 2014 CFR
2014-10-01
... approved nontoxic shot (see § 32.2(k)). 2. We allow upland game hunting on the 131-acre mainland unit of... Statewide seasons using archery methods or shotguns using shot no larger than BB. C. Big Game Hunting. We... with shot no larger than BB. 6. You may possess only approved nontoxic shot while hunting on the refuge...
There is increasing concern that birds in terrestrial ecosystems may be exposed to spent lead shot. Evidence exists that upland birds, particularly mourning doves (Zenaida macroura), ingest spent lead shot and that raptors ingest lead shot by consuming wounded game. Mortality, ne...
Code of Federal Regulations, 2012 CFR
2012-10-01
... approved nontoxic shot (see § 32.2(k)). 2. We allow upland game hunting on the 131-acre mainland unit of... Statewide seasons using archery methods or shotguns using shot no larger than BB. C. Big Game Hunting. We... with shot no larger than BB. 6. You may possess only approved nontoxic shot while hunting on the refuge...
Using persuasive messages to encourage hunters to support regulation of lead shot
Schroeder, Susan A.; Fulton, David C.; Penning, William; Doncarlos, Kathy
2012-01-01
Lead shot from hunting adds the toxic metal to environments worldwide. The United States banned lead shot for hunting waterfowl in 1991 and 26 states have lead shot restrictions beyond those mandated for waterfowl hunting. The Minnesota Department of Natural Resources (MDNR) was interested in studying hunter attitudes about expanded restrictions on the use of lead shot for hunting small game to understand what communication strategies might increase public support for potential restrictions on lead shot. We mailed messages about lead shot, including 1,200 control messages and 400 of each of 9 treatment messages, and surveys to 4,800 resident small game hunters. We compared attitudes and intentions related to a possible ban among control and treatment groups. Compared to the control message, all treatment messages elicited more positive attitudes and intentions to support a ban. A basic factual message, messages with references to Ducks Unlimited, and a first-person narrative message generated the strongest support for a ban. Results also demonstrated a substantial relationship between the use of lead shot and response to persuasive messages supporting a ban.
Shot-by-shot Spectrum Model for Rod-pinch, Pulsed Radiography Machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, William Monford
A simplified model of bremsstrahlung production is developed for determining the x-ray spectrum output of a rod-pinch radiography machine, on a shot-by-shot basis, using the measured voltage, V(t), and current, I(t). The motivation for this model is the need for an agile means of providing shot-by-shot spectrum prediction, from a laptop or desktop computer, for quantitative radiographic analysis. Simplifying assumptions are discussed, and the model is applied to the Cygnus rod-pinch machine. Output is compared to wedge transmission data for a series of radiographs from shots with identical target objects. The resulting model enables variation of parameters in real time, thus allowing for rapid optimization of the model across many shots. “Goodness of fit” is compared with output from the LSP Particle-In-Cell code, as well as the Monte Carlo Neutron Propagation with Xrays (“MCNPX”) model codes, and is shown to provide an excellent predictive representation of the spectral output of the Cygnus machine. In conclusion, improvements to the model, specifically for application to other geometries, are discussed.
Shot-by-shot Spectrum Model for Rod-pinch, Pulsed Radiography Machines
Wood, William Monford
2018-02-07
A simplified model of bremsstrahlung production is developed for determining the x-ray spectrum output of a rod-pinch radiography machine, on a shot-by-shot basis, using the measured voltage, V(t), and current, I(t). The motivation for this model is the need for an agile means of providing shot-by-shot spectrum prediction, from a laptop or desktop computer, for quantitative radiographic analysis. Simplifying assumptions are discussed, and the model is applied to the Cygnus rod-pinch machine. Output is compared to wedge transmission data for a series of radiographs from shots with identical target objects. The resulting model enables variation of parameters in real time, thus allowing for rapid optimization of the model across many shots. “Goodness of fit” is compared with output from the LSP Particle-In-Cell code, as well as the Monte Carlo Neutron Propagation with Xrays (“MCNPX”) model codes, and is shown to provide an excellent predictive representation of the spectral output of the Cygnus machine. In conclusion, improvements to the model, specifically for application to other geometries, are discussed.
Effect of shot peening on surface fatigue life of carburized and hardened AISI 9310 spur gears
NASA Technical Reports Server (NTRS)
Townsend, D. P.; Zaretsky, E. V.
1982-01-01
Surface fatigue tests were conducted on two groups of AISI 9310 spur gears. Both groups were manufactured with standard ground tooth surfaces, with the second group subjected to an additional shot peening process on the gear tooth flanks. The gear pitch diameter was 8.89 cm (3.5 in.). Test conditions were a gear temperature of 350 K (170 F), a maximum Hertz stress of 1.71 billion N/sq m (248,000 psi), and a speed of 10,000 rpm. The shot peened gears exhibited pitting fatigue lives 1.6 times the life of standard gears without shot peening. Residual stress measurements and analysis indicate that the longer fatigue life is the result of the higher compressive stress produced by the shot peening. The life for the shot peened gear was calculated to be 1.5 times that for the plain gear by using the measured residual stress difference for the standard and shot peened gears. The measured residual stress for the shot peened gears was much higher than that for the standard gears.
Shot-by-shot spectrum model for rod-pinch, pulsed radiography machines
NASA Astrophysics Data System (ADS)
Wood, Wm M.
2018-02-01
A simplified model of bremsstrahlung production is developed for determining the x-ray spectrum output of a rod-pinch radiography machine, on a shot-by-shot basis, using the measured voltage, V(t), and current, I(t). The motivation for this model is the need for an agile means of providing shot-by-shot spectrum prediction, from a laptop or desktop computer, for quantitative radiographic analysis. Simplifying assumptions are discussed, and the model is applied to the Cygnus rod-pinch machine. Output is compared to wedge transmission data for a series of radiographs from shots with identical target objects. Resulting model enables variation of parameters in real time, thus allowing for rapid optimization of the model across many shots. "Goodness of fit" is compared with output from LSP Particle-In-Cell code, as well as the Monte Carlo Neutron Propagation with Xrays ("MCNPX") model codes, and is shown to provide an excellent predictive representation of the spectral output of the Cygnus machine. Improvements to the model, specifically for application to other geometries, are discussed.
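The abstract does not give the model's equations; as a hedged illustration of the general approach, a thick-target Kramers-law estimate sums I(t)*Z*(V(t) - E) over the pulse samples for each photon energy E below the instantaneous endpoint V(t). Units are arbitrary, and the tungsten Z = 74 is our assumption, not stated in the paper:

```python
def kramers_spectrum(volts, amps, energies, z=74):
    """Time-integrated thick-target bremsstrahlung spectrum (Kramers sketch).

    volts, amps: sampled V(t) and I(t) (e.g. MV and kA, paired samples).
    energies:    photon energies at which to evaluate the spectrum.
    z:           anode atomic number (74 = tungsten, an assumption).
    Returns relative photon counts per energy bin, arbitrary units.
    """
    spectrum = []
    for e in energies:
        s = 0.0
        for v, i in zip(volts, amps):
            if v > e:  # only samples whose endpoint exceeds E contribute
                s += i * z * (v - e)
        spectrum.append(s)
    return spectrum
```

The characteristic result is a spectrum falling linearly to zero at the peak voltage; the paper's shot-by-shot use would re-evaluate this with each shot's measured V(t) and I(t) traces.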
Comparative toxicity of lead shot in black ducks and mallards
Rattner, B.A.; Fleming, W.J.
1988-01-01
An extreme sensitivity of pen-reared black ducks (BDs) to lead shot was observed incidental to development of an enzyme assay (Pain & Rattner, 1988). Intubation of pen-reared BDs with one no. 4 lead shot resulted in 60% mortality in 6 days. It was concluded that BDs were more sensitive to lead shot than expected, or that lead toxicity may be exacerbated by stressful conditions (elevated temperature, confinement in small pens). We reexamined lead shot toxicity in BDs and mallards (MLs). In winter 1986 (Ta = 1.7-14.6 °C), pen-reared and wild BDs, and game-farm and wild MLs were sham-dosed or given one no. 4 shot. After 14 days, dosed birds were redosed with two or four additional shot. Since the original observation of enhanced shot toxicity to BDs occurred during summer, the study was also repeated in summer 1987 (Ta = 17.6-30.9 °C), with pen-reared BDs and game-farm MLs. Mortality, overt intoxication, weight change, aminolevulinic acid dehydratase activity, and protoporphyrin concentration were used to compare sensitivity among groups. Sensitivity to lead shot was similar between BDs and MLs. However, the wild ducks appeared more vulnerable than their domesticated counterparts, and signs of intoxication were more pronounced in winter than in summer.
Finding the gap: An empirical study of the most effective shots in elite goalball.
Link, Daniel; Weber, Christoph
2018-01-01
This research identifies which shot types in goalball are most likely to lead to a goal and thereby provides background information for improving training and competition. We observed 117 elite-level matches including 20,541 shots played in the regular situation (3 vs. 3) using notational analysis. We characterized the shots by their target sector (A-E), technique (traditional, rotation), trajectory (flat, bounce), angle (straight, diagonal) and outcome (goal, violation, out, blocked). In our data, a χ2-test showed a significantly higher goal rate for men (3.9%) compared to women (3.0%). For men, we found a significantly higher goal rate in the intersection sectors between players C (5.6%) and D (4.9%), and in the outer sector A. In sector A, goal rate was higher only for straight shots (6.6%). Technique and trajectory did not affect goal rate for men, but flat shots showed a higher violation rate (3.2%) compared to bounce shots (2.0%). In women's goalball, goal rate was higher only in sector D (4.4%). Bounce-rotation shots were the most successful (5.5%). We conclude that men should focus on shots to sectors C and D (called the pocket) and straight shots to sector A, as long as there are no other tactical considerations. Women should shoot primarily towards the pocket. It might also be worth playing more bounce-rotation shots and practicing them in training.
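The men-versus-women comparison of goal rates is a 2x2 chi-square test on goal/no-goal counts; a minimal sketch without continuity correction (the counts in the usage line are illustrative, not the paper's):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]].

    Closed form: n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)),
    no continuity correction. Compare against 3.84 for p < 0.05.
    """
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Illustrative: goals vs. non-goals for two groups of shots
stat = chi2_2x2(20, 80, 10, 90)
```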
Motor and Gaze Behaviors of Youth Basketball Players Taking Contested and Uncontested Jump Shots
van Maarseveen, Mariëtte J. J.; Oudejans, Raôul R. D.
2018-01-01
In this study, we examined the effects of a defender contesting jump shots on performance and gaze behaviors of basketball players taking jump shots. Thirteen skilled youth basketball players performed 48 shots from about 5 m from the basket; 24 uncontested and 24 contested. The participants wore mobile eye tracking glasses to measure their gaze behavior. As expected, an approaching defender trying to contest the shot led to significant changes in movement execution and gaze behavior including shorter shot execution time, longer jump time, longer ball flight time, later final fixation onset, and longer fixation on the defender. Overall, no effects were found for shooting accuracy. However, the effects on shot accuracy were not similar for all participants: six participants showed worse performance and six participants showed better performance in the contested compared to the uncontested condition. These changes in performance were accompanied by differences in gaze behavior. The participants with worse performance showed shorter absolute and relative final fixation duration and a tendency for an earlier final fixation offset in the contested condition compared to the uncontested condition, whereas gaze behavior of the participants with better performance for contested shots was relatively unaffected. The results confirm that a defender contesting the shot is a relevant constraint for basketball shooting suggesting that representative training designs should also include contested shots, and more generally other constraints that are representative of the actual performance setting such as time or mental pressure. PMID:29867671
Indirect monitoring shot-to-shot shock waves strength reproducibility during pump–probe experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pikuz, T. A., E-mail: tatiana.pikuz@eie.eng.osaka-u.ac.jp; Photon Pioneers Center, Osaka University, Suita, Osaka 565-0871 Japan; Joint Institute for High Temperatures, Russian Academy of Sciences, Moscow 125412
We present an indirect method of estimating the strength of a shock wave, allowing on-line monitoring of its reproducibility in each laser shot. This method is based on a shot-to-shot measurement of the X-ray emission from the ablated plasma by a high-resolution, spatially resolved focusing spectrometer. An optical pump laser with energy of 1.0 J and pulse duration of ∼660 ps was used to irradiate solid targets or foils with various thicknesses containing oxygen, aluminum, iron, and tantalum. The high sensitivity and resolving power of the X-ray spectrometer allowed spectra to be obtained on each laser shot and fluctuations of the spectral intensity emitted by different plasmas to be controlled with an accuracy of ∼2%, implying an accuracy in the derived electron plasma temperature of 5%-10% in pump-probe high energy density science experiments. At nano- and sub-nanosecond laser pulse durations with relatively low laser intensities and ratio Z/A ∼ 0.5, the electron temperature follows T_e ∼ I_las^(2/3). Thus, measurements of the electron plasma temperature allow indirect estimation of the laser flux on the target and control of its shot-to-shot fluctuation. Knowing the laser flux intensity and its fluctuation gives us the possibility of monitoring shot-to-shot reproducibility of shock wave strength generation with high accuracy.
Finding the gap: An empirical study of the most effective shots in elite goalball
Weber, Christoph
2018-01-01
This research identifies which shot types in goalball are most likely to lead to a goal and thereby provides background information for improving training and competition. We observed 117 elite-level matches including 20,541 shots played in the regular situation (3 vs. 3) using notational analysis. We characterized the shots by their target sector (A-E), technique (traditional, rotation), trajectory (flat, bounce), angle (straight, diagonal), and outcome (goal, violation, out, blocked). In our data, a χ2-test showed a significantly higher goal rate for men (3.9%) compared to women (3.0%). For men, we found a significantly higher goal rate in the intersection sectors between players C (5.6%) and D (4.9%), and in the outer sector A. In sector A, the goal rate was higher only for straight shots (6.6%). Technique and trajectory did not affect goal rate for men, but flat shots showed a higher violation rate (3.2%) compared to bounce shots (2.0%). In women's goalball, goal rate was higher only in sector D (4.4%). Bounce-rotation shots were the most successful (5.5%). We conclude that men should focus on shots to sectors C and D (called the pocket) and straight shots to sector A, as long as there are no other tactical considerations. Women should shoot primarily towards the pocket. It might also be worth playing more bounce-rotation shots and practicing them in training. PMID:29698479
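The men-vs-women goal-rate comparison above is a standard 2x2 chi-square test. A minimal sketch with hypothetical shot totals chosen to match the reported 3.9% and 3.0% rates (the actual per-group counts are not given in the abstract):

```python
# Sketch: Pearson chi-square statistic (2x2 table, no continuity correction)
# comparing goal rates between two groups, as in the notational analysis above.
# The counts are hypothetical, chosen only to reproduce the reported rates.

def chi2_2x2(a, b, c, d):
    """a,b = goals/non-goals for group 1; c,d = goals/non-goals for group 2."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

men_goals, men_misses = 390, 9610      # ~3.9% goal rate (hypothetical totals)
women_goals, women_misses = 300, 9700  # ~3.0% goal rate (hypothetical totals)

stat = chi2_2x2(men_goals, men_misses, women_goals, women_misses)
print(stat > 3.841)  # exceeds the 5% critical value for 1 degree of freedom
```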
Collider shot setup for Run 2 observations and suggestions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Annala, J.; Joshel, B.
1996-01-31
This note is intended to provoke discussion on Collider Run II shot setup. We hope this is a start of activities that will converge on a functional description of what is needed for shot setups in Collider Run II. We will draw on observations of the present shot setup to raise questions and make suggestions for the next Collider run. It is assumed that the reader has some familiarity with Collider operational issues. Shot setup is defined to be the time between the end of a store and the time the Main Control Room declares colliding beams. This is the time between Tevatron clock events SCE and SCB. This definition does not consider the time experiments use to turn on their detectors. This analysis was suggested by David Finley. The operational scenarios for Run II will require higher levels of reliability and speed for shot setup. See Appendix I and II. For example, we estimate that a loss of 3 pb^-1/week (with 8-hour stores) will occur if shot setups take 90 minutes instead of 30 minutes. In other words: if you do 12 shots in one week and accept an added delay of one minute in each shot, you will lose more than 60 nb^-1 for that week alone (based on a normal shot setup of 30 minutes). These demands should lead us to be much more pedantic about all the factors that affect shot setups. Shot setup will be viewed as a distinct process that is composed of several inter-dependent 'components': procedures, hardware, controls, and sociology. These components don't directly align with the different Accelerator Division departments, but are topical groupings of the needed accelerator functions. Defining these components, and categorizing our suggestions within them, are part of the goal of this document. Of course, some suggestions span several of these components.
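The luminosity arithmetic in the note can be checked with a simple linear sketch: spreading the quoted 3 pb^-1/week loss over 12 shots x 60 extra minutes gives a per-minute cost, from which a one-minute-per-shot delay comes out at roughly 50 nb^-1/week, the same order as the quoted 60 nb^-1 (the quoted figure is larger because store luminosity decays, so early store time is worth more than this linear model assumes):

```python
# Sketch: order-of-magnitude check of the integrated-luminosity cost of longer
# shot setups, using the note's figures and an assumed linear store luminosity.
shots_per_week = 12
extra_minutes_per_shot = 60          # 90-minute vs 30-minute setup
weekly_loss_pb = 3.0                 # quoted loss in pb^-1/week

loss_per_minute_nb = weekly_loss_pb * 1000 / (shots_per_week * extra_minutes_per_shot)
one_minute_delay_loss_nb = loss_per_minute_nb * shots_per_week
print(round(one_minute_delay_loss_nb, 1))  # ~50 nb^-1/week
```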
Weinberg, Meghan; Dietz, Stephanie; Potter, Rachel; Swanson, Robert; Miller, Corinne; McFadden, Jevon
2017-02-15
Concerns regarding vaccine safety and pain have prompted certain parents to limit the number of shots their child receives per visit. We estimated the prevalence of shot-limited children in Michigan, described their characteristics, assessed whether shot-limited children were up-to-date on recommended vaccinations, and investigated possible intervention points for vaccination education. We analyzed vaccination registry and birth record data of children born in Michigan during 2012 who had ⩾2 vaccination visits, with ⩾1 visits after age 5months. Shot-limited was defined as receiving ⩽2 shots at all visits through age 24months. Nonlimited children received >2 shots at ⩾1 visits. Up-to-date vaccination was based on receipt of a seven-vaccine series and was determined at ages 24months and 35months. Risk ratios (RR) were calculated using risk regression. Of 101,443 children, a total of 2,967 (3%) children were shot-limited. Mothers of shot-limited children were more likely to be white (RR: 1.2; 95% confidence interval [CI]: 1.2-1.2), college graduate (RR: 1.9; 95% CI: 1.9-2.0), and married (RR: 1.5; 95% CI: 1.5-1.5). Compared with nonlimited children, shot-limited children were more likely to be born in a nonhospital setting (RR: 11.7; 95% CI: 9.4-14.6) and have a midwife attendant (RR: 1.9; 95% CI: 1.7-2.1). Shot-limited children were less likely to be up-to-date on recommended vaccinations (RR: 0.2; 95% CI: 0.2-0.3); this association was stronger for those with a midwife birth attendant (RR: 0.1; 95% CI: 0.1-0.2) rather than a medical doctor (RR: 0.3; 95% CI: 0.2-0.3). Shot-limited children are less likely to be up-to-date on vaccinations, possibly leading to increased risk for vaccine-preventable diseases. This association was stronger for those with a midwife birth attendant. This analysis should prompt targeted education, such as to midwives, concerning risks associated with shot-limiting behavior. Published by Elsevier Ltd.
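The risk ratios reported throughout the study come from 2x2 comparisons of event rates between exposed and unexposed groups. A minimal sketch of a risk ratio with a 95% confidence interval using the standard log-RR normal approximation; the counts here are hypothetical, not the Michigan registry data:

```python
import math

# Sketch: risk ratio (RR) with a 95% CI from a 2x2 table, via the log-RR
# normal approximation. Counts below are hypothetical.

def risk_ratio(a, n1, c, n0):
    """a events out of n1 exposed; c events out of n0 unexposed."""
    rr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo, hi = rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)
    return rr, lo, hi

rr, lo, hi = risk_ratio(30, 100, 10, 100)  # 30% vs 10% event rate
print(round(rr, 2))  # 3.0
```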
Synthesis of many different types of organic small molecules using one automated process.
Li, Junqi; Ballmer, Steven G; Gillis, Eric P; Fujii, Seiko; Schmidt, Michael J; Palazzolo, Andrea M E; Lehmann, Jonathan W; Morehouse, Greg F; Burke, Martin D
2015-03-13
Small-molecule synthesis usually relies on procedures that are highly customized for each target. A broadly applicable automated process could greatly increase the accessibility of this class of compounds to enable investigations of their practical potential. Here we report the synthesis of 14 distinct classes of small molecules using the same fully automated process. This was achieved by strategically expanding the scope of a building block-based synthesis platform to include even C(sp3)-rich polycyclic natural product frameworks and discovering a catch-and-release chromatographic purification protocol applicable to all of the corresponding intermediates. With thousands of compatible building blocks already commercially available, many small molecules are now accessible with this platform. More broadly, these findings illuminate an actionable roadmap to a more general and automated approach for small-molecule synthesis. Copyright © 2015, American Association for the Advancement of Science.
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
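The core step described above is turning a formalized indicator into a computable SPARQL query. A minimal sketch of that idea; the actual CLIF/ArchMS query structure is not shown in the abstract, and the class and property URIs below are hypothetical:

```python
# Sketch: generating a SPARQL counting query from a simple formalized quality
# indicator (illustrative only -- not the actual CLIF/ArchMS output format).

def indicator_to_sparql(clazz: str, prop: str, value: str) -> str:
    return (
        "SELECT (COUNT(?p) AS ?n) WHERE {\n"
        f"  ?p a <{clazz}> .\n"
        f"  ?p <{prop}> {value} .\n"
        "}"
    )

query = indicator_to_sparql(
    "http://example.org/Patient",          # hypothetical ontology class
    "http://example.org/hasHbA1cBelow7",   # hypothetical indicator property
    "true",
)
print(query.startswith("SELECT"))
```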
Energy Assessment of Automated Mobility Districts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yuche
Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displaces private automobiles for day-to-day travel in dense activity districts. This project examines such a concept to displace privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). The project reviews several such districts including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet mobility needs apart from private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed within the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of mobility, energy, and emissions impact anticipated with AMDs.
Modeling the Energy Use of a Connected and Automated Transportation System (Poster)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonder, J.; Brown, A.
Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.
A hybrid human and machine resource curation pipeline for the Neuroscience Information Framework.
Bandrowski, A E; Cachat, J; Li, Y; Müller, H M; Sternberg, P W; Ciccarese, P; Clark, T; Marenco, L; Wang, R; Astakhov, V; Grethe, J S; Martone, M E
2012-01-01
The breadth of information resources available to researchers on the Internet continues to expand, particularly in light of recently implemented data-sharing policies required by funding agencies. However, the nature of dense, multifaceted neuroscience data and the design of contemporary search engine systems makes efficient, reliable and relevant discovery of such information a significant challenge. This challenge is specifically pertinent for online databases, whose dynamic content is 'hidden' from search engines. The Neuroscience Information Framework (NIF; http://www.neuinfo.org) was funded by the NIH Blueprint for Neuroscience Research to address the problem of finding and utilizing neuroscience-relevant resources such as software tools, data sets, experimental animals and antibodies across the Internet. From the outset, NIF sought to provide an accounting of available resources while developing technical solutions to finding, accessing and utilizing them. The curators, therefore, are tasked with identifying and registering resources, examining data, writing configuration files to index and display data and keeping the contents current. In the initial phases of the project, all aspects of the registration and curation processes were manual. However, as the number of resources grew, manual curation became impractical. This report describes our experiences and successes with developing automated resource discovery and semiautomated type characterization with text-mining scripts that facilitate curation team efforts to discover, integrate and display new content. We also describe the DISCO framework, a suite of automated web services that significantly reduce manual curation efforts to periodically check for resource updates. Lastly, we discuss DOMEO, a semi-automated annotation tool that improves the discovery and curation of resources that are not necessarily website-based (i.e. reagents, software tools).
Although the ultimate goal of automation was to reduce the workload of the curators, it has resulted in valuable analytic by-products that address accessibility, use and citation of resources that can now be shared with resource owners and the larger scientific community. DATABASE URL: http://neuinfo.org.
Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang
2015-04-01
Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. 
Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
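The process flow above is a multi-tiered cascade: a cheap coarse classifier prunes candidates, and a finer classifier identifies the target among survivors. A minimal sketch of that control flow, with trivial threshold rules standing in for trained SVMs on image features; the feature names are hypothetical:

```python
# Sketch: the multi-tiered classification flow described above, with trivial
# stand-in rules where a real pipeline would apply trained SVMs to image
# features. Feature names below are hypothetical.

def tier1_is_head(features):
    """Coarse pass: keep only candidate regions likely to contain the head."""
    return features["brightness_contrast"] > 0.5   # hypothetical feature

def tier2_cell_id(features):
    """Fine pass: identify a specific cell within accepted regions."""
    return "ASI" if features["gfp_intensity"] > 0.8 else "other"  # hypothetical

def classify(region):
    if not tier1_is_head(region):
        return "rejected"        # pruned by the cheap first tier
    return tier2_cell_id(region) # refined by the second tier

print(classify({"brightness_contrast": 0.9, "gfp_intensity": 0.95}))  # ASI
```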
Gettings, M.E.
1982-01-01
The heat-flow profile across the Arabian Shield from Ar Riyad to Ad Darb and across the Red Sea is examined for compatibility with the lithospheric structure of the area as deduced from geologic and other geophysical data. Broad continental uplift associated with Red Sea rifting is symmetric about the Red Sea axis, and geologic and geochronologic evidence indicate that uplift has occurred mainly in the interval 25-13 Ma (mega-annum) ago. Thermal-profile changes in the upper mantle resulting from an influx of hot material associated with rifting yield the correct order of magnitude of uplift, and this mechanism is suggested as the explanation for the regional doming. A lithospheric section, constructed from seismic refraction, gravity, and regional geologic data, provides the framework for construction of thermal models. Thermal gradient measurements were made in drill holes at five shot points. Geotherms for the Shield, which assume a radiogenic heat-source distribution that decreases exponentially with depth, yield temperatures of about 450°C at a depth of 40 km (base of the crust) for shot points 2 (Sabhah) and 3. The geotherm for shot point 4 (near Bishah) yields a distinctly higher temperature (about 580°C) for the same depth. Static models used to model the heat flow in the oceanic crust of the Red Sea shelf and coastal plain either yield too small a heat flow to match the observed heat flow or give lithosphere thicknesses that are so thin as to be improbable. Dynamic (solid-state accretion) models, which account for mantle flow at the base of the lithosphere, adequately match the observed heat-flow values. In the deep-water trough of the Red Sea, which is presently undergoing active sea-floor spreading, classical models of heat flow for a moving slab with accretion at the spreading center are adequate to explain the average heat-flow level.
At shot point 5 (Ad Darb), the anomalous heat flow of 2 HFU (heat-flow units) can be explained in terms of a Shield component (0.8-1.0 HFU) and a component related to heating by the abutting oceanic crust a few kilometers away for periods exceeding 10 Ma. Analytical results are included for: 1) the cooling of a static sheet with an initial temperature distribution characteristic of a moving slab in a sea-floor spreading environment, and 2) the heating of a homogeneous quarter-space at its vertical boundary.
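The Shield geotherms above come from the standard steady-state conduction model with radiogenic heat production decaying exponentially with depth: T(z) = T0 + (q_m/k)z + ((q_s - q_m)D/k)(1 - e^(-z/D)). A minimal sketch with illustrative parameter values (not the paper's), chosen only to show that temperatures of the quoted ~450°C at 40 km fall out naturally:

```python
import math

# Sketch: steady-state continental geotherm with radiogenic heat production
# decreasing exponentially with depth -- the model family used for the Shield
# geotherms above. Parameter values are illustrative, not from the paper.

def geotherm(z, t0=0.0, q_s=0.040, q_m=0.025, k=2.5, d=10e3):
    """T [°C] at depth z [m]; q_s/q_m = surface/mantle heat flow [W/m^2],
    k = conductivity [W/m/K], d = e-folding depth of heat production [m]."""
    return t0 + (q_m / k) * z + ((q_s - q_m) * d / k) * (1 - math.exp(-z / d))

t_moho = geotherm(40e3)
print(400 < t_moho < 500)  # comparable to the ~450°C quoted for 40 km depth
```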
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... of Copper-Clad Iron Shot as Nontoxic for Waterfowl Hunting AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of application for nontoxic shot approval. SUMMARY: We, the U.S. Fish and Wildlife Service, announce that Environ-Metal, Inc., of Sweet Home, Oregon, has applied for our approval of shot...
[Contact shot from infantry weapons with a flash-suppressor].
Perdekamp, Markus Grosse; Braunwarth, Roland; Schmidt, Ulrike; Schmidt, Wolfgang; Pollak, Stefan
2003-01-01
The number of reports on contact shots from firearms with a flash suppressor attached to the muzzle is small. On the basis of a case report (suicidal shot to the forehead with a Kalashnikov AKMS 47 assault rifle), the morphological peculiarities (characteristic soot pattern, relatively small powder cavity, and only minor skin tears in the presence of a bony support) are presented, and the conclusions to be drawn from the findings regarding the flash suppressor, the shot distance, the angle of the shot, and the way of holding the weapon are discussed.
Characterization of Bird Impacts on a Rigid Plate: Part 1
1975-01-01
a breech block at the breech end and a sabot stopper at the muzzle. Four longitudinal slits 46 cm long, 0.318 cm wide, terminating 36 cm from the...series, the two PCB 118 transducers performed satisfactorily for 71 shots and displayed no indications of imminent failure. Shot No. 4951; velocity 215 m/s. Shot No. 4965; velocity 201 m/s. Shot No. 2986; velocity 71 m/s. Shot No. 4987; velocity 105 m/s
Dysart, J E; Lindsay, R C; Hammond, R; Dupuis, P
2001-12-01
The effects of viewing mug shots on subsequent identification performance are as yet unclear. Two experiments used a live staged-crime paradigm to determine if interpolated eyewitness exposure to mug shots caused interference, unconscious transference, or commitment effects influencing subsequent lineup accuracy. Experiment 1 (N = 104) tested interference effects. Similar correct decision rates were obtained for the mug shot and no mug shot groups from both perpetrator-present and absent lineups. Experiment 2 (N = 132) tested for commitment and transference effects. Results showed that the commitment group made significantly more incorrect identifications than either the control or the transference group, which had similar false-identification rates. Commitment effects present a serious threat to identification accuracy from lineups following mug shot searches.
Billion shot flashlamp for spaceborne lasers
NASA Technical Reports Server (NTRS)
Richter, Linda; Schuda, Felix; Degnan, John
1990-01-01
A billion-shot flashlamp developed under a NASA contract for spaceborne laser missions is presented. Lifetime-limiting mechanisms are identified and addressed. Two energy loadings of 15 and 44 Joules were selected for the initial accelerated life testing. A fluorescence-efficiency test station was used for measuring the useful-light output degradation of the lamps. The design characteristics meeting NASA specifications are outlined. Attention is focused on the physical properties of tungsten-matrix cathodes, the chemistry of dispenser cathodes, and anode degradation. It is reported that out of the total 83 lamps tested in the program, 4 lamps reached a billion shots and one lamp is beyond 1.7 billion shots, while at 44 Joules, 4 lamps went beyond 100 million shots and one lamp reached 500 million shots.
An upper limit on ultraviolet shot noise from Cygnus X-1
NASA Technical Reports Server (NTRS)
Duthie, J. G.; Mcmillan, R. S.
1979-01-01
Rapid photometry of Cygnus X-1 through an ultraviolet filter centered on 0.35 micron has been obtained at 100-ms sampling intervals. The autocorrelation function of these data has been examined for shot noise analogous to the behavior of the X-ray light curve. The ultraviolet data are entirely consistent with white noise. Considering randomly occurring ultraviolet shots with the same duration (0.5 s) and average rate (1 per sec) as the X-ray shots, a 3-sigma upper limit on the ratio of optical to X-ray energies per shot is estimated to be 0.13, before the ultraviolet light is attenuated by interstellar dust. This limit is then generalized for shots of arbitrary duration and rate.
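The analysis above hinges on the fact that shots of finite duration leave positive autocorrelation at lags shorter than the shot length, while white noise does not. A minimal simulation sketch, using rectangular 0.5-s shots at ~1 per second sampled at 100 ms (as in the observations); the series lengths and amplitudes are illustrative:

```python
import random

# Sketch: why the autocorrelation function separates shot noise from white
# noise. Rectangular 0.5-s "shots" at ~1/s leave positive correlation at lags
# shorter than the shot duration; white noise is uncorrelated at all lags.

def autocorr(x, lag):
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag))
    den = sum((v - m) ** 2 for v in x)
    return num / den

random.seed(1)
dt = 0.1                      # 100-ms sampling, as in the observations
n = 20000                     # 2000 s of illustrative data
white = [random.gauss(0, 1) for _ in range(n)]

shots = [0.0] * n
for _ in range(int(n * dt)):  # ~1 shot per second on average
    start = random.randrange(n)
    for i in range(start, min(start + 5, n)):  # 0.5 s = 5 samples
        shots[i] += 1.0

# At a 0.3-s lag the shot series is clearly correlated; white noise is not.
print(autocorr(shots, 3) > 0.2 > abs(autocorr(white, 3)))
```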
Managing complexity in simulations of land surface and near-surface processes
Coon, Ethan T.; Moulton, J. David; Painter, Scott L.
2016-01-12
Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
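The dependency-graph idea above can be sketched in a few lines: each variable declares its dependencies, and evaluation recurses through them with memoization, so every model is evaluated once per cycle and always after its inputs. The variable names and toy relations below are illustrative, not Arcos's actual models:

```python
# Sketch: dependency-graph-driven evaluation, as described above. Each
# variable lists its dependencies; evaluate() resolves them recursively with
# a cache (memoization). Variable names and toy models are illustrative.

deps = {
    "saturation": [],
    "pressure": [],
    "rel_permeability": ["saturation"],
    "water_flux": ["pressure", "rel_permeability"],
}

evaluators = {
    "saturation": lambda v: 0.8,
    "pressure": lambda v: 101325.0,
    "rel_permeability": lambda v: v["saturation"] ** 3,          # toy model
    "water_flux": lambda v: v["pressure"] * v["rel_permeability"] * 1e-9,
}

def evaluate(name, cache):
    if name not in cache:
        for d in deps[name]:        # dependencies are evaluated first
            evaluate(d, cache)
        cache[name] = evaluators[name](cache)
    return cache[name]

print(evaluate("water_flux", {}))
```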
Ruohonen, Toni; Ennejmy, Mohammed
2013-01-01
Making reliable and justified operational and strategic decisions is a challenging task in the health care domain. So far, decisions have been made based on the experience of managers and staff, or they have been evaluated with traditional methods using inadequate data. As a result of this kind of decision-making process, attempts to improve operations have usually failed or led to only local improvements. Health care organizations have a lot of operational data, in addition to clinical data, which is the key element for making reliable and justified decisions. However, it is increasingly difficult to access and make use of it. In this paper we discuss how to exploit operational data most efficiently in the decision-making process. We share our vision for the future and propose a conceptual framework for automating the decision-making process.
SBROME: a scalable optimization and module matching framework for automated biosystems design.
Huynh, Linh; Tsoukalas, Athanasios; Köppe, Matthias; Tagkopoulos, Ilias
2013-05-17
The development of a scalable framework for biodesign automation is a formidable challenge given the expected increase in part availability and the ever-growing complexity of synthetic circuits. To allow for (a) the use of previously constructed and characterized circuits or modules and (b) the implementation of designs that can scale up to hundreds of nodes, we here propose a divide-and-conquer Synthetic Biology Reusable Optimization Methodology (SBROME). An abstract user-defined circuit is first transformed and matched against a module database that incorporates circuits that have previously been experimentally characterized. Then the resulting circuit is decomposed to subcircuits that are populated with the set of parts that best approximate the desired function. Finally, all subcircuits are subsequently characterized and deposited back to the module database for future reuse. We successfully applied SBROME toward two alternative designs of a modular 3-input multiplexer that utilize pre-existing logic gates and characterized biological parts.
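The divide-and-conquer loop above has two moves: match a target function against a database of previously characterized modules, and if no match exists, decompose it into subfunctions and recurse. A minimal sketch of that control flow; the module names and decomposition table are illustrative, not SBROME's actual database:

```python
# Sketch: the match-then-decompose idea in SBROME. A target is first looked up
# in a database of characterized modules; otherwise it is decomposed into
# subfunctions matched recursively. All names below are illustrative.

module_db = {"AND": "mod_and_v1", "NOT": "mod_not_v2"}   # characterized parts
decompositions = {"NAND": ["AND", "NOT"]}                # design rules

def design(target):
    if target in module_db:
        return [module_db[target]]        # reuse a characterized module as-is
    parts = []
    for sub in decompositions.get(target, []):
        parts.extend(design(sub))         # divide and conquer
    return parts

print(design("NAND"))  # ['mod_and_v1', 'mod_not_v2']
```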
NASA Technical Reports Server (NTRS)
Westmoreland, Sally; Stow, Douglas A.
1992-01-01
A framework is proposed for analyzing ancillary data and developing procedures for incorporating ancillary data to aid interactive identification of land-use categories in land-use updates. The procedures were developed for use within an integrated image processing/geographic information system (GIS) that permits simultaneous display of digital image data with the vector land-use data to be updated. With such systems and procedures, automated techniques are integrated with visual-based manual interpretation to exploit the capabilities of both. The procedural framework developed was applied as part of a case study to update a portion of the land-use layer in a regional-scale GIS. About 75 percent of the area in the study site that experienced a change in land use was correctly labeled into 19 categories using the combination of automated and visual interpretation procedures developed in the study.
Web-based automation of green building rating index and life cycle cost analysis
NASA Astrophysics Data System (ADS)
Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul
2018-04-01
Sudden declines in financial markets and economic downturns have slowed adoption and lowered investors' interest in green-certified buildings because of their higher initial costs. It is therefore essential to attract investors toward further development of green buildings through automated tools for construction projects. However, the historical dearth of automation in green building rating tools reveals an essential gap: the development of an automated, computerized assessment tool. This paper presents proposed research aiming to develop an integrated, web-based automated tool that applies a green building rating assessment, green technology, and life cycle cost (LCC) analysis. It also identifies variables of MyCrest and LCC to be integrated and developed in a framework, then transformed into an automated tool. A mixed qualitative and quantitative survey methodology is planned to carry the MyCrest-LCC integration to an automated level. In this study, a preliminary literature review enriches understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute toward green buildings and future agendas.
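The LCC half of the proposed tool reduces to a present-value calculation: initial cost plus discounted recurring costs over the building's life, which is what lets a higher green first cost be compared fairly against lower running costs. A minimal sketch with hypothetical cost figures and discount rate:

```python
# Sketch: the life cycle cost (LCC) calculation such a tool would automate --
# present value of initial plus recurring costs. All figures are hypothetical.

def life_cycle_cost(initial, annual_cost, years, rate):
    """Initial cost plus the present value of annual costs over `years`."""
    pv_recurring = sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))
    return initial + pv_recurring

# Green option: higher first cost, lower running cost (hypothetical numbers).
conventional = life_cycle_cost(1_000_000, 80_000, 30, 0.05)
green = life_cycle_cost(1_150_000, 55_000, 30, 0.05)
print(green < conventional)  # the green premium can pay back over the life cycle
```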
Alves-Silva, Juliana; Sánchez-Soriano, Natalia; Beaven, Robin; Klein, Melanie; Parkin, Jill; Millard, Thomas H; Bellen, Hugo J; Venken, Koen J T; Ballestrem, Christoph; Kammerer, Richard A; Prokop, Andreas
2012-07-04
The correct outgrowth of axons is essential for the development and regeneration of nervous systems. Axon growth is primarily driven by microtubules. Key regulators of microtubules in this context are the spectraplakins, a family of evolutionarily conserved actin-microtubule linkers. Loss of function of the mouse spectraplakin ACF7 or of its close Drosophila homolog Short stop/Shot similarly cause severe axon shortening and microtubule disorganization. How spectraplakins perform these functions is not known. Here we show that axonal growth-promoting roles of Shot require interaction with EB1 (End binding protein) at polymerizing plus ends of microtubules. We show that binding of Shot to EB1 requires SxIP motifs in Shot's C-terminal tail (Ctail), mutations of these motifs abolish Shot functions in axonal growth, loss of EB1 function phenocopies Shot loss, and genetic interaction studies reveal strong functional links between Shot and EB1 in axonal growth and microtubule organization. In addition, we report that Shot localizes along microtubule shafts and stabilizes them against pharmacologically induced depolymerization. This function is EB1-independent but requires net positive charges within Ctail which essentially contribute to the microtubule shaft association of Shot. Therefore, spectraplakins are true members of two important classes of neuronal microtubule regulating proteins: +TIPs (tip interacting proteins; plus end regulators) and structural MAPs (microtubule-associated proteins). From our data we deduce a model that relates the different features of the spectraplakin C terminus to the two functions of Shot during axonal growth.
The kinematics of table tennis racquet: differences between topspin strokes.
Bańkosz, Ziemowit; Winiarski, Sławomir
2017-03-01
Shot kinematics in table tennis have not been sufficiently described in the literature. Assessing the racquet trajectory and its speed and time characteristics makes it possible to emphasize certain technical elements in the training process in order, for example, to increase strength, speed of rotation, or speed of the shot while maintaining its accuracy. The aim of this work was to measure selected kinematic parameters of the table tennis racquet during forehand and backhand topspin shots, while considering the differences between these strokes. The measurements took place in a certified biomechanical laboratory using a motion analysis system. The study involved 12 female table tennis players in high-level sports training and performance. Each subject had to complete a series of six tasks, presenting different varieties of topspin shots. The longest racquet trajectory was associated with forehand shots, shots played against a ball with backspin, and winner shots. The maximum racquet velocity occurred precisely at the moment of impact with the ball. The velocity and distance values were larger in the direction of the acting force, depending on the individual shot. Changing the type of topspin shot requires changes in time, velocity, and, primarily, distance parameters, as well as in the direction of racquet motion. The maximum speed of the racquet occurring at the moment of impact is probably the most important principle in playing technique. The results can be directly used in improving the training of table tennis techniques, especially in the application and use of topspin shots.
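Racquet speed profiles of this kind are typically obtained by differentiating sampled marker positions from the motion-capture system. A minimal sketch; the swing trajectory below is synthetic, invented purely for illustration:

```python
import numpy as np

def racquet_speed(positions, dt):
    """Speed profile from sampled 3-D racquet positions (N x 3 array)
    via finite differences along the time axis."""
    vel = np.gradient(positions, dt, axis=0)   # per-axis velocity
    return np.linalg.norm(vel, axis=1)         # scalar speed per sample

# Synthetic forward swing: the racquet accelerates toward impact at t = 1 s
t = np.linspace(0.0, 1.0, 101)
dt = t[1] - t[0]
pos = np.stack([t**2, 0.3 * t, np.zeros_like(t)], axis=1)

speed = racquet_speed(pos, dt)
peak_index = int(np.argmax(speed))   # expect the peak at the impact end
```

With real capture data, the sample where `speed` peaks relative to the ball-contact frame is exactly the "maximum velocity at impact" observation the study reports.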
Single-shot spiral imaging at 7 T.
Engel, Maria; Kasper, Lars; Barmet, Christoph; Schmid, Thomas; Vionnet, Laetitia; Wilm, Bertram; Pruessmann, Klaas P
2018-03-25
The purpose of this work is to explore the feasibility and performance of single-shot spiral MRI at 7 T, using an expanded signal model for reconstruction. Gradient-echo brain imaging is performed on a 7 T system using high-resolution single-shot spiral readouts and half-shot spirals that perform dual-image acquisition after a single excitation. Image reconstruction is based on an expanded signal model including the encoding effects of coil sensitivity, static off-resonance, and magnetic field dynamics. The latter are recorded concurrently with image acquisition, using NMR field probes. The resulting image resolution is assessed by point spread function analysis. Single-shot spiral imaging is achieved at a nominal resolution of 0.8 mm, using spiral-out readouts of 53-ms duration. High depiction fidelity is achieved without conspicuous blurring or distortion. Effective resolutions are assessed as 0.8, 0.94, and 0.98 mm in CSF, gray matter and white matter, respectively. High image quality is also achieved with half-shot acquisition yielding image pairs at 1.5-mm resolution. Use of an expanded signal model enables single-shot spiral imaging at 7 T with unprecedented image quality. Single-shot and half-shot spiral readouts deploy the sensitivity benefit of high field for rapid high-resolution imaging, particularly for functional MRI and arterial spin labeling. © 2018 International Society for Magnetic Resonance in Medicine.
Toxicity of Lead and Proposed Substitute Shot to Mallards
Longcore, J.R.; Andrews, R.; Locke, L.N.; Bagley, George E.; Young, L.T.
1974-01-01
Poisoning of North American waterfowl resulting from the ingestion of lead shot by ducks, geese, and swans causes an estimated annual mortality of 2 to 3% of the population (Bellrose 1959). To alleviate this problem the search for a suitable substitute for lead has been underway since the early 1950's. Proposed substitutes for lead shot were evaluated in a series of acute toxicity tests with pen-reared mallards (Anas platyrhynchos). Most candidate materials were as toxic to ducks as commercial lead shot. Coating or alloying lead with other metals only delayed mortality among dosed ducks. The reputedly 'disintegrable' lead shot with the water-soluble binder and the lead containing biochemical additives were also as toxic to mallards as the commercial lead shot. Mortality was not significantly different among lead-dosed adult or first-year hen and drake pen-reared mallards; lead-dosed adult, wild mallards of both sexes; and lead-dosed adult, male black ducks (Anas rubripes). The ingestion of one lead shot, size 4, by each of 80 pen-reared mallards caused an average 19% mortality. The presence and type of grit in the gizzard had a measurable effect on erosion of ingested shot and on shot retention among dosed mallards. Significantly fewer lead-dosed ducks died when fed crushed oystershell grit than when fed either quartz grit or no grit.
A focal-spot diagnostic for on-shot characterization of high-energy petawatt lasers.
Bromage, J; Bahk, S-W; Irwin, D; Kwiatkowski, J; Pruyne, A; Millecchia, M; Moore, M; Zuegel, J D
2008-10-13
An on-shot focal-spot diagnostic for characterizing high-energy, petawatt-class laser systems is presented. Accurate measurements at full energy are demonstrated using high-resolution wavefront sensing in combination with techniques to calibrate on-shot measurements with low-power sample beams. Results are shown for full-energy activation shots of the OMEGA EP Laser System.
Banks, Victoria A; Stanton, Neville A
2015-01-01
Automated assistance in driving emergencies aims to improve the safety of our roads by avoiding or mitigating the effects of accidents. However, the behavioural implications of such systems remain unknown. This paper introduces the driver decision-making in emergencies (DDMiE) framework to investigate how the level and type of automation may affect driver decision-making and subsequent responses to critical braking events, using network analysis to interrogate retrospective verbalisations. Four DDMiE models were constructed to represent different levels of automation within the driving task and its effects on driver decision-making. Findings suggest that whilst automation does not alter the decision-making pathway (e.g. the processes between hazard detection and response remain similar), it does appear to significantly weaken the links between information-processing nodes. This reflects an unintended yet emergent property within the task network that could mean that we may not be improving safety in the way we expect. This paper contrasts models of driver decision-making in emergencies at varying levels of automation using the Southampton University Driving Simulator. Network analysis of retrospective verbalisations indicates that increasing the level of automation in driving emergencies weakens the links between information-processing nodes essential for effective decision-making.
Space power subsystem automation technology
NASA Technical Reports Server (NTRS)
Graves, J. R. (Compiler)
1982-01-01
The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.
NASA Astrophysics Data System (ADS)
Zhan, Ke; Wu, Yihao; Li, Jiongli; Zhao, Bin; Yan, Ya; Xie, Lechun; Wang, Lianbo; Ji, V.
2018-03-01
Graphene-reinforced Al composites with high mechanical properties have been reported, but studies of the effect of shot peening on this new type of material remain quite limited. Here, a 1.0 wt% graphene reinforced Al composite was produced by powder metallurgy and treated by shot peening. The surface-layer characteristics of the shot-peened composite were investigated by X-ray diffraction line-profile analysis. The microstructure, including domain size, micro-strain, dislocation density, and crystalline texture, was analyzed. The results showed that after shot peening the domain size was refined and the dislocation density of the composite increased sharply, to 9.0 × 10^11/cm^2 at the top surface. The originally strong texture was diminished after shot peening. Based on the calculated results, the microstructural change in the composite was more severe than that in Al without graphene reinforcement. In addition, the micro-hardness of the composite at the top surface increased to 75 HV, roughly twice that of the matrix. It is concluded that shot peening can be considered an effective process for improving the surface properties of graphene-reinforced Al composites.
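The three line-profile quantities mentioned (domain size, micro-strain, dislocation density) are commonly linked by the relation ρ = 2√3·ε/(D·b). A sketch of that relation follows; the input values are assumed for illustration and are not the paper's measurements:

```python
import math

def dislocation_density(domain_size_m, micro_strain, burgers_m):
    """Dislocation density from XRD line-profile results using the
    widely used relation rho = 2*sqrt(3)*epsilon / (D * b)."""
    return 2.0 * math.sqrt(3.0) * micro_strain / (domain_size_m * burgers_m)

# Assumed illustrative inputs: 40 nm domains, 0.3% micro-strain,
# and the Burgers vector of Al, b = 0.286 nm
rho_per_m2 = dislocation_density(40e-9, 3e-3, 0.286e-9)
rho_per_cm2 = rho_per_m2 * 1e-4   # convert 1/m^2 to 1/cm^2
```

These assumed inputs land in the 10^10-10^11/cm^2 range, showing how sharply refined domains and elevated micro-strain translate into the high post-peening dislocation densities the paper reports.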
Seismic waveform tomography with shot-encoding using a restarted L-BFGS algorithm.
Rao, Ying; Wang, Yanghua
2017-08-17
In seismic waveform tomography, or full-waveform inversion (FWI), one effective strategy used to reduce the computational cost is shot-encoding, which encodes all shots randomly and sums them into one super shot to significantly reduce the number of wavefield simulations in the inversion. However, this process will induce instability in the iterative inversion regardless of whether it uses a robust limited-memory BFGS (L-BFGS) algorithm. The restarted L-BFGS algorithm proposed here is both stable and efficient. This breakthrough ensures, for the first time, the applicability of advanced FWI methods to three-dimensional seismic field data. In a standard L-BFGS algorithm, if the shot-encoding remains unchanged, it will generate a crosstalk effect between different shots. This crosstalk effect can only be suppressed by employing sufficient randomness in the shot-encoding. Therefore, the implementation of the L-BFGS algorithm is restarted at every segment. Each segment consists of a number of iterations; the first few iterations use an invariant encoding, while the remainder use random re-coding. This restarted L-BFGS algorithm balances the computational efficiency of shot-encoding, the convergence stability of the L-BFGS algorithm, and the inversion quality characteristic of random encoding in FWI.
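The encoding-and-restart scheme can be sketched in a few lines. This is a toy illustration only: the ±1 polarity encoding, array sizes, and segment structure are assumptions standing in for the full wavefield machinery:

```python
import numpy as np

def encode_shots(shots, rng):
    """Random +/-1 source encoding: weight every shot and sum them into
    one super shot, so a single wavefield simulation stands in for
    simulations of all shots."""
    codes = rng.choice([-1.0, 1.0], size=len(shots))
    super_shot = sum(c * s for c, s in zip(codes, shots))
    return super_shot, codes

# Toy 'shots': three source wavefields sampled at 5 points each
shots = [np.ones(5), 2.0 * np.ones(5), 3.0 * np.ones(5)]
rng = np.random.default_rng(0)

# Restarted scheme: the encoding stays fixed for the iterations inside
# one segment (so L-BFGS curvature pairs stay consistent), then is
# re-drawn at the next restart to randomize away crosstalk between shots
super_a, codes_a = encode_shots(shots, rng)   # segment 1 encoding
super_b, codes_b = encode_shots(shots, rng)   # segment 2: fresh re-coding
```

Re-drawing the codes only at segment boundaries is what lets the scheme keep both the stability of L-BFGS within a segment and the crosstalk suppression of random re-coding across segments.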
Zhao, Lei; Lim Choi Keung, Sarah N; Taweel, Adel; Tyler, Edward; Ogunsina, Ire; Rossiter, James; Delaney, Brendan C; Peterson, Kevin A; Hobbs, F D Richard; Arvanitis, Theodoros N
2012-01-01
Heterogeneous data models and coding schemes for electronic health records present challenges for automated search across distributed data sources. This paper describes a loosely coupled software framework based on the terminology controlled approach to enable the interoperation between the search interface and heterogeneous data sources. Software components interoperate via common terminology service and abstract criteria model so as to promote component reuse and incremental system evolution.
2008-05-01
communicate with other weapon models in a mission-level simulation; (3) introduces the four configuration levels of the M&S framework; and (4) presents a cost-effective M&S laboratory plan.
Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems
NASA Technical Reports Server (NTRS)
Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael
2013-01-01
The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and will most likely cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human-factors studies and simulation-based techniques will fall short given the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique on the Überlingen accident, since it exemplifies authority problems that arise when conflicting advice is received from human and automated systems.
Shots by STFM: value of immunization software to family medicine residency directors: a CERA study.
Nowalk, Mary Patricia; Clinch, C Randall; Tarn, Derjung M; Chang, Tammy; Ncube, Collette N; Troy, Judith A; Zimmerman, Richard K
2012-01-01
The Group on Immunization Education (GIE) of the Society of Teachers of Family Medicine (STFM) has developed Shots by STFM immunization software, which is available free of charge for a variety of platforms. It is routinely updated with the Center for Disease Control and Prevention's (CDC's) most recent immunization schedules. Successful development and marketing of teaching resources requires periodic evaluation of their use and value to their target audience. This study was undertaken to evaluate the 2011 version of Shots by STFM. Family medicine residency directors were surveyed about their use of Shots by STFM for teaching residents and their ratings of its features. The response rate for the survey was 38% (172/452). While awareness of Shots by STFM among responding residency directors was low (57%), ratings by those using the resource were excellent. Thirty percent of respondents recommend or require their residents to use Shots by STFM. Better marketing of Shots by STFM to family medicine residency directors seems to be indicated.
Jello Shot Consumption among Underage Youths in the United States
SIEGEL, MICHAEL; GALLOWAY, ASHLEY; ROSS, CRAIG S.; BINAKONSKY, JANE; JERNIGAN, DAVID H.
2015-01-01
We sought, for the first time, to identify the extent of jello shot consumption among underage youth. We conducted a study among a national sample of 1,031 youth, aged 13 to 20, using a pre-recruited internet panel maintained by GfK Knowledge Networks to assess past 30-day consumption of jello shots. Nearly one-fifth of underage youth have consumed jello shots in the past 30 days and jello shots make up an average of nearly 20% of their overall alcohol intake. Jello shot users in our sample were approximately 1.5 times more likely to binge drink, consumed approximately 1.6 times as many drinks per month, and were 1.7 times more likely to have been in a physical fight related to their alcohol use as drinkers in general. Ascertainment of jello shot use should become a standard part of youth alcohol surveillance and states should consider banning the sale of these products. PMID:27087771
NASA Astrophysics Data System (ADS)
Kozawa, Takahiro; Santillan, Julius Joseph; Itani, Toshiro
2017-06-01
In lithography using high-energy photons such as an extreme ultraviolet (EUV) radiation, the shot noise of photons is a critical issue. The shot noise is a cause of line edge/width roughness (LER/LWR) and stochastic defect generation and limits the resist performance. In this study, the effects of photodecomposable quenchers were investigated from the viewpoint of the shot noise limit. The latent images of line-and-space patterns with 11 nm half-pitch were calculated using a Monte Carlo method. In the simulation, the effect of secondary electron blur was eliminated to clarify the shot noise limits regarding stochastic phenomena such as LER. The shot noise limit for chemically amplified resists with acid generators and photodecomposable quenchers was approximately the same as that for chemically amplified resists with acid generators and conventional quenchers when the total sensitizer concentration was the same. The effect of photodecomposable quenchers on the shot noise limit was essentially the same as that of acid generators.
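The photon shot-noise mechanism underlying the simulation can be illustrated with a minimal Monte Carlo sketch: the number of photons absorbed in each resist voxel is Poisson-distributed, so relative dose fluctuation shrinks as 1/√dose. The dose numbers here are illustrative, not the paper's 11 nm half-pitch conditions:

```python
import numpy as np

rng = np.random.default_rng(42)

def absorbed_photons(mean_dose, shape, rng):
    """Photon shot noise: the count absorbed in each resist voxel is a
    Poisson draw around the mean local dose."""
    return rng.poisson(mean_dose, size=shape)

# Aerial image across one line edge: mean dose ramps from 5 to 50
# photons per voxel over 64 positions; simulate 1000 exposures
mean_dose = np.linspace(5.0, 50.0, 64)
trials = absorbed_photons(mean_dose, (1000, 64), rng)

# Relative fluctuation at the dim vs bright ends of the edge profile
rel_noise_low = trials[:, 0].std() / trials[:, 0].mean()
rel_noise_high = trials[:, -1].std() / trials[:, -1].mean()
```

The larger relative fluctuation at the low-dose edge is exactly where stochastic edge placement errors, and hence LER and stochastic defects, originate.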
Seasonal ingestion of toxic and nontoxic shot by Canada geese
DeStefano, S.; Brand, C.J.; Samuel, M.D.
1995-01-01
We used rates of ingested shot and elevated blood-lead levels (≥0.18 ppm) to estimate the proportion of Canada geese (Branta canadensis) exposed to lead on 3 study areas in Manitoba, Minnesota, and Missouri. Lead exposure was prevalent on all areas and was common after the hunting season closed, when up to 15% of geese could have been exposed to lead shot. However, the proportion of steel shot ingested by geese has increased during the past 2 decades. We suggest that lead exposure is still a source of indirect hunting mortality in Canada geese but project that the prevalence of lead exposure in the Eastern Prairie Population and other waterfowl populations will decrease as nontoxic shot regulations persist and hunters use steel or other nontoxic shot.
NASA Astrophysics Data System (ADS)
Les, A.; Klemperer, S. L.; Keranen, K.; Khan, A.; Maguire, P.
2003-12-01
In January 2003, as part of the Ethiopia-Afar Geoscientific Lithospheric Experiment (EAGLE) we conducted a refraction and wide-angle reflection survey of the Main Ethiopian Rift. 757 RefTek "Texan" seismographs with vertical geophones were deployed in 400 km-long axial and cross-rift lines, with another 231 in a central 3D array 100 km in diameter. An 80-instrument passive array of intermediate and broadband sensors was active during our experiment. We recorded 19 borehole shots loaded in nominal 50-meter boreholes, 2 quarry shots, and 2 lake shots. The shots ranged in size from 50-5750 kg, with the most common shot size being 1 tonne. Prior to loading each shot-hole, we measured distances between shots and the nearest structure, typically un-reinforced mud-and-wood houses, occasionally concrete irrigation ditches and aqueducts. We then used semi-empirical formulae derived by Oriard (Hendron and Oriard, 1972) to calculate expected maximum and minimum bounds on ground velocity at these structures, and selected an appropriate shot size to keep the predicted velocity below the "threshold for cosmetic damage", or 2 inches per second, at the most vulnerable structure. The Oriard formulae are derived from measurements associated with blasting for mining and civil engineering purposes and may not accurately predict the ground velocity from the source depths and explosive type used in the EAGLE and other controlled-source experiments. A detailed, trace-by-trace analysis of maximum ground velocities at our closest seismographs can provide data that will be useful in planning future large-scale seismic experiments. Preliminary results from traces within 20 km of our borehole shots suggest that maximum recorded ground velocities were within or below the maximum-minimum range predicted by Oriard, and hence that larger shot sizes could have been used with acceptable risks. 
A lake shot fired at the optimum depth (84 m for a 1 tonne shot) produced ground velocities that exceeded the predicted maximum at a few recorders. However, optimum-depth shots are typically a significant distance offshore (c. 2.3 km for our shot) because of the required depth, so they are unlikely to present a hazard to onshore structures. A lake shot fired in a shallower lake at half the optimum depth did not produce ground velocities that exceeded the Oriard maximum. Although we fired shots within 100 m of an unreinforced concrete aqueduct, and within 200 m of poorly engineered native buildings in poor structural condition, no damage was recorded. Our "Texan" seismometers recorded only vertical-component velocity, using 4.5 Hz geophones. After removal of the geophone response, the peak vertical velocity is typically measured at about 3 Hz and occurs shortly after the first arrival, presumably due to surface waves (ground roll). We are currently extending our analysis to include data from broadband, three-component recorders.
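Scaled-distance vibration estimates of the kind used here take the general form PPV = K·(D/√W)^(-1.6). A sketch of the shot-sizing logic follows; the intercept K and the exponent are generic illustrative values, not the calibrated Oriard bounds used in the survey:

```python
def peak_particle_velocity(distance_ft, charge_lb, k=242.0, exponent=-1.6):
    """Scaled-distance ground-vibration estimate (in inches per second),
    PPV = K * (D / sqrt(W))**exponent. K and the exponent here are
    assumed illustrative values, not the survey's calibrated bounds."""
    scaled_distance = distance_ft / charge_lb ** 0.5
    return k * scaled_distance ** exponent

def max_safe_charge(distance_ft, limit_ips=2.0):
    """Largest charge (lb) keeping the predicted PPV below the 2 in/s
    'threshold for cosmetic damage' at the nearest structure."""
    # Invert PPV(W) = limit: find the scaled distance at the limit,
    # then solve D / sqrt(W) = sd_limit for W
    sd_limit = (limit_ips / 242.0) ** (1.0 / -1.6)
    return (distance_ft / sd_limit) ** 2
```

Given the distance to the most vulnerable structure, `max_safe_charge` returns the shot size that just meets the damage threshold, mirroring the pre-loading calculation described in the abstract.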
A Genetic Algorithm and Fuzzy Logic Approach for Video Shot Boundary Detection
Thounaojam, Dalton Meitei; Khelchandra, Thongam; Singh, Kh. Manglem; Roy, Sudipta
2016-01-01
This paper proposes a shot boundary detection approach using a Genetic Algorithm (GA) and Fuzzy Logic. The membership functions of the fuzzy system are calculated using the GA, taking pre-observed actual values for shot boundaries. The classification of the types of shot transitions is done by the fuzzy system. Experimental results show that the accuracy of shot boundary detection increases with the number of iterations or generations of the GA optimization process. The proposed system is compared with recent techniques and yields better results in terms of the F1-score. PMID:27127500
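A minimal sketch of the fuzzy classification stage follows, using triangular membership functions whose corner parameters are what a GA would tune against annotated boundaries. All parameter values, thresholds, and labels below are assumptions for illustration:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership with feet a, c and peak b;
    the triple (a, b, c) is what the GA would optimize."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def classify_transition(dissimilarity, cut_params=(0.5, 0.8, 1.0),
                        gradual_params=(0.2, 0.45, 0.7)):
    """Fuzzy decision on an inter-frame dissimilarity score: hard cut,
    gradual transition, or no boundary (all parameters assumed)."""
    mu_cut = triangular(dissimilarity, *cut_params)
    mu_gradual = triangular(dissimilarity, *gradual_params)
    if max(mu_cut, mu_gradual) < 0.1:   # neither class fires strongly
        return "no-boundary"
    return "cut" if mu_cut >= mu_gradual else "gradual"
```

A GA's fitness function would score candidate `(a, b, c)` triples by how well `classify_transition` reproduces the pre-observed ground-truth boundaries.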
Multilevel Contextual 3-D CNNs for False Positive Reduction in Pulmonary Nodule Detection.
Dou, Qi; Chen, Hao; Yu, Lequan; Qin, Jing; Heng, Pheng-Ann
2017-07-01
False positive reduction is one of the most crucial components in an automated pulmonary nodule detection system, which plays an important role in lung cancer diagnosis and early treatment. The objective of this paper is to effectively address the challenges in this task and therefore to accurately discriminate the true nodules from a large number of candidates. We propose a novel method employing three-dimensional (3-D) convolutional neural networks (CNNs) for false positive reduction in automated pulmonary nodule detection from volumetric computed tomography (CT) scans. Compared with its 2-D counterparts, the 3-D CNNs can encode richer spatial information and extract more representative features via their hierarchical architecture trained with 3-D samples. More importantly, we further propose a simple yet effective strategy to encode multilevel contextual information to meet the challenges coming with the large variations and hard mimics of pulmonary nodules. The proposed framework has been extensively validated in the LUNA16 challenge held in conjunction with ISBI 2016, where we achieved the highest competition performance metric (CPM) score in the false positive reduction track. Experimental results demonstrated the importance and effectiveness of integrating multilevel contextual information into 3-D CNN framework for automated pulmonary nodule detection in volumetric CT data. While our method is tailored for pulmonary nodule detection, the proposed framework is general and can be easily extended to many other 3-D object detection tasks from volumetric medical images, where the targeting objects have large variations and are accompanied by a number of hard mimics.
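The multilevel contextual strategy amounts to cropping nested 3-D receptive fields around each nodule candidate and feeding each scale to its own CNN input branch. A data-preparation sketch; the patch sizes are assumed for illustration, not the paper's receptive fields:

```python
import numpy as np

def context_patches(volume, center, sizes=(20, 40)):
    """Crop nested cubes of increasing size around one candidate voxel;
    each scale would feed a separate 3-D CNN input branch."""
    patches = []
    z, y, x = center
    for s in sizes:
        half = s // 2
        patches.append(volume[z - half:z + half,
                              y - half:y + half,
                              x - half:x + half])
    return patches

# Toy CT volume with a candidate at its center
ct = np.zeros((100, 100, 100), dtype=np.float32)
small, large = context_patches(ct, center=(50, 50, 50))
```

The small cube captures the nodule's local texture while the large one supplies surrounding anatomy, which is what helps separate true nodules from hard mimics.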
Coates, D
2001-05-01
Microbiological tests were carried out to evaluate a new chlorine dioxide sterilant: Tristel One-Shot. Preliminary in vitro suspension tests showed that solutions containing around 140 ppm chlorine dioxide achieved a reduction factor exceeding 10^6 of Staphylococcus aureus in 1 min and of Bacillus subtilis spores in 2.5 min in the presence of 3 g/L bovine albumin. Subsequent tests evaluated the effectiveness of Tristel One-Shot in a Medivator washer/disinfector fitted with a Tristel Generator for processing flexible endoscopes. Each test run involved three stages. In the first, the instrument and air-water channels of a gastroscope were inoculated with a suspension of Pseudomonas aeruginosa (10^8 cfu/ml) in 10% sodium glutamate and serum (0, 5 or 10%) and then drained, partially dried, and saline flushed through for total viable counts (TVCs). In the second stage, the channels were re-inoculated with test organisms; detergent was flushed through the channels, which were then brushed; and saline was flushed through for TVCs. In the third stage, the channels were re-inoculated; detergent was flushed through the channels, which were then brushed; the endoscope was processed in the Medivator; and saline was flushed through for TVCs. Carrying out all three stages enabled determination of (1) the contribution made by manual cleaning of channels prior to processing in the Medivator, and (2) the combined effect of manual cleaning followed by processing. Two series of test runs were done. In the first, the Tristel Generator was set to generate 230 ppm chlorine dioxide, and in the second 150 ppm. In the first, cleaning followed by processing in the Medivator consistently achieved a ≥10^6-fold reduction of test organisms, and in the second a ≥10^5-fold reduction. Pre-cleaning of channels was very important: when done, the initial concentration of serum in the inoculum (0-10%) had no effect on the results obtained after processing.
Copyright 2001 The Hospital Infection Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pachuilo, Andrew R; Ragan, Eric; Goodall, John R
Visualization tools can take advantage of multiple coordinated views to support analysis of large, multidimensional data sets. Effective design of such views and layouts can be challenging, but understanding users' analysis strategies can inform design improvements. We outline an approach for intelligent design configuration of visualization tools with multiple coordinated views, and we discuss a proposed software framework to support the approach. The proposed software framework could capture and learn from user interaction data to automate new compositions of views and widgets. Such a framework could reduce the time needed for meta-analysis of visualization use and lead to more effective visualization design.
Doukas, Charalampos; Goudas, Theodosis; Fischer, Simon; Mierswa, Ingo; Chatziioannou, Aristotle; Maglogiannis, Ilias
2010-01-01
This paper presents an open image-mining framework that provides access to tools and methods for the characterization of medical images. Several image processing and feature extraction operators have been implemented and exposed through Web Services. Rapid-Miner, an open source data mining system has been utilized for applying classification operators and creating the essential processing workflows. The proposed framework has been applied for the detection of salient objects in Obstructive Nephropathy microscopy images. Initial classification results are quite promising demonstrating the feasibility of automated characterization of kidney biopsy images.
Multisample conversion of water to hydrogen by zinc for stable isotope determination
Kendall, C.; Coplen, T.B.
1985-01-01
Two techniques for the conversion of water to hydrogen for stable isotope ratio determination have been developed that are especially suited for automated multisample analysis. Both procedures involve reaction of zinc shot with a water sample at 450 °C. In one method, designed for water samples in bottles, the water is put in capillaries and is reduced by zinc in reaction vessels; overall savings in sample-preparation labor of 75% have been realized over the standard uranium reduction technique. The second technique is for waters evolved under vacuum and is a sealed-tube method employing 9 mm o.d. quartz tubing. Problems inherent with zinc reduction include surface inhomogeneity of the zinc and exchange of hydrogen both with the zinc and with the glass walls of the vessels. For best results, water/zinc and water/glass surface-area ratios of vessels should be kept as large as possible.
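The conversion chemistry common to both techniques, and the isotope quantity ultimately measured on the evolved hydrogen gas, can be written as:

```latex
\mathrm{Zn} + \mathrm{H_2O}
  \;\xrightarrow{\;450\,^{\circ}\mathrm{C}\;}\;
  \mathrm{ZnO} + \mathrm{H_2}
\qquad
\delta \mathrm{D} =
  \left( \frac{(\mathrm{D/H})_{\text{sample}}}
              {(\mathrm{D/H})_{\text{standard}}} - 1 \right) \times 1000\ \text{\textperthousand}
```

The δD definition is the standard delta notation relative to a reference water (e.g., SMOW); the hydrogen-exchange problems noted above matter precisely because any exchange with the zinc or glass shifts the D/H ratio of the H2 actually analyzed.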
ERIC Educational Resources Information Center
Lipscombe, Trevor C.; Mungan, Carl E.
2012-01-01
In the late 18th and throughout the 19th century, lead shot for muskets was prepared by use of a shot tower. Molten lead was poured from the top of a tower and, during its fall, the drops became spherical under the action of surface tension. In this article, we ask and answer the question: "How does the size of the lead shot depend on the height…
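As a rough orientation for the tower-height question, drag-free kinematics gives the time a falling drop has to cool and solidify before landing. This is a deliberate simplification (real shot approaches terminal velocity partway down, and drop size is governed by surface tension at the sieve), so treat it only as an order-of-magnitude sketch:

```python
import math

def fall_time(height_m, g=9.81):
    """Drag-free free-fall time from a given height; a simplification,
    since real shot approaches terminal velocity during the fall."""
    return math.sqrt(2.0 * height_m / g)

# An illustrative 50 m shot tower: roughly 3 s of flight for a molten
# drop to round up under surface tension and solidify before it lands
t_50m = fall_time(50.0)
```

Larger shot needs more cooling time, hence taller towers, which is the height-size dependence the article analyzes in detail.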
Oura, Masaki; Wagai, Tatsuya; Chainani, Ashish; Miyawaki, Jun; Sato, Hiromi; Matsunami, Masaharu; Eguchi, Ritsuko; Kiss, Takayuki; Yamaguchi, Takashi; Nakatani, Yasuhiro; Togashi, Tadashi; Katayama, Tetsuo; Ogawa, Kanade; Yabashi, Makina; Tanaka, Yoshihito; Kohmura, Yoshiki; Tamasaku, Kenji; Shin, Shik; Ishikawa, Tetsuya
2014-01-01
In order to utilize high-brilliance photon sources, such as X-ray free-electron lasers (XFELs), for advanced time-resolved photoelectron spectroscopy (TR-PES), a single-shot CCD-based data acquisition system combined with a high-resolution hemispherical electron energy analyzer has been developed. The system’s design enables it to be controlled by an external trigger signal for single-shot pump–probe-type TR-PES. The basic performance of the system is demonstrated with an offline test, followed by online core-level photoelectron and Auger electron spectroscopy in ‘single-shot image’, ‘shot-to-shot image (image-to-image storage or block storage)’ and ‘shot-to-shot sweep’ modes at soft X-ray undulator beamline BL17SU of SPring-8. In the offline test the typical repetition rate for image-to-image storage mode has been confirmed to be about 15 Hz using a conventional pulse-generator. The function for correcting the shot-to-shot intensity fluctuations of the exciting photon beam, an important requirement for the TR-PES experiments at FEL sources, has been successfully tested at BL17SU by measuring Au 4f photoelectrons with intentionally controlled photon flux. The system has also been applied to hard X-ray PES (HAXPES) in ‘ordinary sweep’ mode as well as shot-to-shot image mode at the 27 m-long undulator beamline BL19LXU of SPring-8 and also at the SACLA XFEL facility. The XFEL-induced Ti 1s core-level spectrum of La-doped SrTiO3 is reported as a function of incident power density. The Ti 1s core-level spectrum obtained at low power density is consistent with the spectrum obtained using the synchrotron source. At high power densities the Ti 1s core-level spectra show space-charge effects which are analysed using a known mean-field model for ultrafast electron packet propagation. The results successfully confirm the capability of the present data acquisition system for carrying out the core-level HAXPES studies of condensed matter induced by the XFEL. PMID:24365935
NASA's Solar System Treks Image Mosaic Pipeline
NASA Astrophysics Data System (ADS)
Trautman, M. R.; Malhotra, S.; Nainan, C.; Kim, R. M.; Bui, B. X.; Sadaqathullah, S.; Sharma, P.; Gallegos, N.; Law, E. S.; Day, B. H.
2018-06-01
This study details the efforts of the NASA Solar System Treks project to design a framework for automated systems capable of producing quality mosaics from high resolution orbital imagery. The primary focus is on NAC, CTX, and HiRISE imagery.
Computer vision for microscopy diagnosis of malaria.
Tek, F Boray; Dempster, Andrew G; Kale, Izzet
2009-07-13
This paper reviews computer vision and image analysis studies aiming at automated diagnosis or screening of malaria infection in microscope images of thin blood film smears. Existing works interpret the diagnosis problem differently or propose partial solutions to the problem. A critique of these works is furnished. In addition, a general pattern recognition framework to perform diagnosis, which includes image acquisition, pre-processing, segmentation, and pattern classification components, is described. The open problems are addressed and a perspective of the future work for realization of automated microscopy diagnosis of malaria is provided.
2010-10-01
Requirements. Application server: BEA WebLogic Express 9.2 or higher; Java v5; Apache Struts v2; Hibernate v2; C3P0; SQL*Net client / JDBC. Database Server... designed for the desktop; an HTML and JavaScript browser-based front end designed for mobile smartphones; a Java-based framework utilizing Apache... Technology requirements. The recommended technologies are as follows (technology, use, requirements): Java application: provides the backend application
Shot-to-shot reproducibility of a self-magnetically insulated ion diode.
Pushkarev, A I; Isakova, Yu I; Khailov, I P
2012-07-01
In this paper we present an analysis of the shot-to-shot reproducibility of the ion beam formed by a self-magnetically insulated ion diode with an explosive-emission graphite cathode. The experiments were carried out with the TEMP-4M accelerator operating in double-pulse mode: the first pulse is of negative polarity (300-500 ns, 100-150 kV) and is followed by a second pulse of positive polarity (150 ns, 250-300 kV). The ion current density was 10-70 A/cm², depending on the diode geometry. The beam was composed of carbon ions (80%-85%) and protons. It was found that the shot-to-shot variation in ion current density was about 35%-40%, whilst the diode voltage and current were comparatively stable, with variation limited to no more than 10%. It was shown that focusing the ion beam improves the stability of ion current generation and reduces the variation to 18%-20%. To find the reason for the shot-to-shot variation in ion current density, we examined the statistical correlation between the current density of the accelerated beam and other measured characteristics of the diode, such as the accelerating voltage, total current, and first-pulse duration. The correlation between the ion current density measured simultaneously at different positions within the cross-section of the beam was also investigated. It was shown that the shot-to-shot variation in ion current density is mainly attributed to the variation in the density of electrons diffusing from the drift region into the A-K gap.
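The statistical-correlation step described in this abstract can be sketched as a plain Pearson correlation between per-shot quantities. The arrays below are illustrative stand-ins, not data from the TEMP-4M experiments.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-shot records (illustrative only): ion current density (A/cm^2)
# against diode voltage (kV) for six consecutive shots.
current_density = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0]
diode_voltage = [262.0, 281.0, 255.0, 290.0, 270.0, 277.0]

r = pearson_r(current_density, diode_voltage)  # close to +1 for these toy values
```

In the study's setting, a weak correlation between current density and the diode voltage or total current is what pointed the authors toward electron diffusion as the dominant source of variation.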
Mateo, Rafael; Vallverdú-Coll, Núria; López-Antia, Ana; Taggart, Mark A; Martínez-Haro, Monica; Guitart, Raimon; Ortiz-Santaliestra, Manuel E
2014-02-01
The use of lead (Pb) ammunition in the form of shot pellets has been identified as a Pb exposure risk for wildlife and their human consumers. We explore the hypothesis that enforcement of a Pb shot ban reduces the risk of avian Pb poisoning as well as Pb exposure in game meat consumers. We assessed compliance with a partial ban on Pb shot, commencing in 2003, by examination of 937 waterbirds harvested by hunters between 2007 and 2012 in the Ebro delta (Spain). Prevalence of Pb shot ingestion was determined, as were Pb concentrations in liver and muscle tissue to evaluate the potential for Pb exposure in game meat consumers. Hunted birds with only embedded Pb shot (no steel) declined from 26.9% in 2007-08 to <2% over the following three hunting seasons after the ban was reinforced. Pb shot ingestion in mallards decreased from a pre-ban value of 30.2% to 15.5% in the post-ban period. Liver Pb levels were predominantly defined by the presence of ingested shot, whereas muscle levels were defined by the presence of both ingested and embedded shot. Only 2.5% of mallard muscle tissue had Pb levels above European Union regulations for meat (0.1 μg/g wet weight) in the 2008-09 season, when Pb shot ingestion prevalence was also at a minimum (5.1%). Effective restrictions on Pb ammunition use have a dual benefit, since they reduce Pb exposure for game meat consumers due to embedded ammunition as well as Pb poisoning in waterbirds. Copyright © 2013 Elsevier Ltd. All rights reserved.
Johnson, Perry B; Monterroso, Maria I; Yang, Fei; Mellon, Eric
2017-11-25
This work explores how the choice of prescription isodose line (IDL) affects the dose gradient, target coverage, and treatment time for Gamma Knife radiosurgery when a smaller shot is encompassed within a larger shot at the same stereotactic coordinates (shot-within-shot technique). Beam profiles for the 4, 8, and 16 mm collimator settings were extracted from the treatment planning system and characterized using Gaussian fits. The characterized data were used to create over 10,000 shot-within-shot configurations by systematically changing collimator weighting and choice of prescription IDL. Each configuration was quantified in terms of the dose gradient, target coverage, and beam-on time. By analyzing these configurations, it was found that there are regions of overlap in target size where a higher prescription IDL provides dose fall-off equivalent to a plan prescribed at the 50% IDL. Furthermore, the data indicate that treatment times within these regions can be reduced by up to 40%. An optimization strategy was devised to realize these gains. The strategy was tested for seven patients treated for 1-4 brain metastases (20 lesions total). For a single collimator setting, the gradient in the axial plane was steepest when prescribed to the 56-63% (4 mm), 62-70% (8 mm), and 77-84% (16 mm) IDL, respectively. Through use of the optimization technique, beam-on time was reduced by more than 15% in 16/20 lesions. The volume of normal brain receiving 12 Gy or above also decreased in many cases, and in only one instance increased by more than 0.5 cm³. This work demonstrates that IDL optimization using the shot-within-shot technique can reduce treatment times without degrading treatment plan quality.
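The shot-within-shot idea can be sketched as summing two Gaussian-characterized radial profiles centred on the same coordinates and scanning for the radius at which the dose falls to a given isodose fraction. The sigma values and weights below are illustrative assumptions, not the beam data extracted in the study.

```python
import math

def gaussian_profile(r, sigma):
    """Radial dose profile modelled as a Gaussian, normalised to 1 at r = 0."""
    return math.exp(-r * r / (2.0 * sigma * sigma))

def combined_dose(r, w_small, sigma_small, w_large, sigma_large):
    """Weighted sum of a small and a large shot at the same stereotactic centre."""
    return (w_small * gaussian_profile(r, sigma_small)
            + w_large * gaussian_profile(r, sigma_large))

def radius_at_fraction(frac, w_s, s_s, w_l, s_l, r_max=30.0, step=0.001):
    """Smallest radius (mm) at which the dose falls to `frac` of the central dose."""
    d0 = combined_dose(0.0, w_s, s_s, w_l, s_l)
    r = 0.0
    while r <= r_max:
        if combined_dose(r, w_s, s_s, w_l, s_l) <= frac * d0:
            return r
        r += step
    return r_max

# Illustrative sigmas standing in for the 4 mm and 16 mm collimators (not vendor data).
r50 = radius_at_fraction(0.50, 0.5, 2.0, 0.5, 8.0)
r25 = radius_at_fraction(0.25, 0.5, 2.0, 0.5, 8.0)
gradient_span = r25 - r50  # fall-off distance from the 50% to the 25% line
```

Sweeping the collimator weights and the prescription fraction over such a model is one way to reproduce the kind of configuration scan the abstract describes.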
... In the lungs ( pneumonia ) You need at least one shot. A second shot may be needed if you ... given to healthy people 6 months or older. One type of shot is injected into a muscle (often the upper ...
Cases of death caused by gas or warning firearms.
Rothschild, M A; Maxeiner, H; Schneider, V
1994-01-01
Five cases of lethal injuries caused by gas or warning firearms are discussed. In one suicide case a modified weapon (elongated barrel) and steel bullets were used to fire a shot into the head; the bullets lodged in the skull and lethal bleeding resulted. In the other cases conventional gas weapons without evidence of alteration were used for contact shots; injuries were caused by the effect of the propelling powder gases. Two of these cases were suicides (a temporal contact shot and a contact shot to the back of the neck), one was an accident (an inguinal contact shot with lethal bleeding), and one was an attack by another person with a contact shot against the neck causing bilateral tears of the hypopharynx. After successful surgery, a delayed death occurred 12 days later, caused by bleeding into the airways from the ruptured external carotid artery.
Sex hormone-dependent tRNA halves enhance cell proliferation in breast and prostate cancers.
Honda, Shozo; Loher, Phillipe; Shigematsu, Megumi; Palazzo, Juan P; Suzuki, Ryusuke; Imoto, Issei; Rigoutsos, Isidore; Kirino, Yohei
2015-07-21
Sex hormones and their receptors play critical roles in the development and progression of the breast and prostate cancers. Here we report that a novel type of transfer RNA (tRNA)-derived small RNA, termed Sex HOrmone-dependent TRNA-derived RNAs (SHOT-RNAs), are specifically and abundantly expressed in estrogen receptor (ER)-positive breast cancer and androgen receptor (AR)-positive prostate cancer cell lines. SHOT-RNAs are not abundantly present in ER(-) breast cancer, AR(-) prostate cancer, or other examined cancer cell lines from other tissues. ER-dependent accumulation of SHOT-RNAs is not limited to a cell culture system, but it also occurs in luminal-type breast cancer patient tissues. SHOT-RNAs are produced from aminoacylated mature tRNAs by angiogenin-mediated anticodon cleavage, which is promoted by sex hormones and their receptors. Resultant 5'- and 3'-SHOT-RNAs, corresponding to 5'- and 3'-tRNA halves, bear a cyclic phosphate (cP) and an amino acid at the 3'-end, respectively. By devising a "cP-RNA-seq" method that is able to exclusively amplify and sequence cP-containing RNAs, we identified the complete repertoire of 5'-SHOT-RNAs. Furthermore, 5'-SHOT-RNA, but not 3'-SHOT-RNA, has significant functional involvement in cell proliferation. These results have unveiled a novel tRNA-engaged pathway in tumorigenesis of hormone-dependent cancers and implicate SHOT-RNAs as potential candidates for biomarkers and therapeutic targets.
Time-resolved single-shot terahertz time-domain spectroscopy for ultrafast irreversible processes
NASA Astrophysics Data System (ADS)
Zhai, Zhao-Hui; Zhong, Sen-Cheng; Li, Jun; Zhu, Li-Guo; Meng, Kun; Li, Jiang; Liu, Qiao; Peng, Qi-Xian; Li, Ze-Ren; Zhao, Jian-Heng
2016-09-01
Pulsed terahertz spectroscopy is well suited to spectroscopic diagnostics of ultrafast events. However, the study of irreversible or single-shot ultrafast events requires the ability to record transient properties at multiple time delays, i.e., time-resolved measurement at the single-shot level, which has not previously been available. Here, by angular multiplexing of femtosecond laser pulses, we developed and demonstrated a time-resolved transient terahertz time-domain spectroscopy technique in which burst-mode THz pulses are generated and then detected in a single-shot manner. The burst-mode train contains 2 sub-THz pulses, and the time gap between them is adjustable up to 1 ns with picosecond accuracy, so it can be used to probe a single-shot event at two different time delays. The system detects the sub-THz pulses over the 0.1 THz-2.5 THz range with a signal-to-noise ratio (SNR) of ~400 and a spectral resolution of 0.05 THz. The system design is described, optimizations of the single-shot measurement of THz pulses are discussed in detail, and methods to improve the SNR are also presented. In a demonstration application, pulsed THz signals at different time delays of an ultrafast process were successfully acquired within a single-shot measurement. This technique provides a new diagnostic tool for irreversible or single-shot ultrafast events in which dynamic information can be extracted in the terahertz range within a one-shot experiment.
Alves-Silva, J.; Sánchez-Soriano, N.; Beaven, R.; Klein, M.; Parkin, J.; Millard, T.H.; Bellen, H. J; Venken, K. J.T.; Ballestrem, C.; Kammerer, R.A.; Prokop, A.
2013-01-01
The correct outgrowth of axons is essential for the development and regeneration of nervous systems. Axon growth is primarily driven by microtubules. Key regulators of microtubules in this context are the spectraplakins, a family of evolutionarily conserved actin-microtubule linkers. Loss of function of the mouse spectraplakin ACF7 or of its close Drosophila homologue Short stop/Shot similarly cause severe axon shortening and microtubule disorganisation. How spectraplakins perform these functions is not known. Here we show that axonal growth promoting roles of Shot require interaction with EB1 (End binding protein) at polymerising plus ends of microtubules. We show that binding of Shot to EB1 requires SxIP motifs in Shot’s carboxyterminal tail (Ctail), mutations of these motifs abolish Shot functions in axonal growth, loss of EB1 function phenocopies Shot loss, and genetic interaction studies reveal strong functional links between Shot and EB1 in axonal growth and microtubule organisation. In addition, we report that Shot localises along microtubule shafts and stabilises them against pharmacologically induced depolymerisation. This function is EB1-independent but requires net positive charges within Ctail which essentially contribute to the microtubule shaft association of Shot. Therefore, spectraplakins are true members of two important classes of neuronal microtubule regulating proteins: +TIPs (plus end regulators) and structural MAPs (microtubule associated proteins). From our data we deduce a model that relates the different features of the spectraplakin carboxy-terminus to the two functions of Shot during axonal growth. PMID:22764224
2014-01-01
Background: The ability of science to produce experimental data has outpaced the ability to effectively visualize and integrate those data into a conceptual framework that can further higher-order understanding. Multidimensional and shape-based observational data in regenerative biology present a particularly daunting challenge in this regard. Large amounts of data are available in regenerative biology, but little progress has been made in understanding how organisms such as planaria robustly achieve and maintain body form. An example of this kind of data can be found in a new repository (PlanformDB) that encodes descriptions of planaria experiments and morphological outcomes using a graph formalism. Results: We are developing a model discovery framework that uses a cell-based modeling platform combined with evolutionary search to automatically search for and identify plausible mechanisms for the biological behavior described in PlanformDB. To automate the evolutionary search we developed a way to compare the output of the modeling platform to the morphological descriptions stored in PlanformDB. We used a flexible connected-component algorithm to create a graph representation of the virtual worm from the robust, cell-based simulation data. These graphs can then be validated and compared with target data from PlanformDB using the well-known graph edit distance calculation, which provides a quantitative metric of similarity between graphs. The graph edit distance calculation was integrated into a fitness function that was able to guide automated searches for unbiased models of planarian regeneration. We present a cell-based model of a planarian that can regenerate anatomical regions following bisection of the organism, and show that the automated model discovery framework is capable of searching for and finding models of planarian regeneration that match experimental data stored in PlanformDB.
Conclusion: The work presented here, including our algorithm for converting cell-based models into graphs for comparison with data stored in an external data repository, has made feasible the automated development, training, and validation of computational models using morphology-based data. This work is part of an ongoing project to automate the search process, which will greatly expand our ability to identify, consider, and test biological mechanisms in the field of regenerative biology. PMID:24917489
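The pipeline in the planaria abstract, i.e. simulation cells to connected components to a graph to a distance-based fitness, might be sketched as follows. The connected-component labelling is standard; the node/edge-count distance is a deliberately cheap stand-in for true graph edit distance (which is NP-hard in general), and all names are hypothetical.

```python
from collections import deque

def connected_components(cells):
    """Label 4-connected components in a set of occupied (x, y) cell coordinates."""
    cells = set(cells)
    comps, seen = [], set()
    for start in sorted(cells):
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        seen.add(start)
        while queue:
            x, y = queue.popleft()
            comp.add((x, y))
            for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nb in cells and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        comps.append(comp)
    return comps

def region_graph(comps, near=2):
    """One node per component; an edge where two components lie within `near` cells."""
    nodes = list(range(len(comps)))
    edges = set()
    for i in nodes:
        for j in nodes:
            if i < j and min(abs(ax - bx) + abs(ay - by)
                             for ax, ay in comps[i]
                             for bx, by in comps[j]) <= near:
                edges.add((i, j))
    return nodes, edges

def simple_graph_distance(g1, g2):
    """Node/edge-count difference: a cheap stand-in for graph edit distance."""
    (n1, e1), (n2, e2) = g1, g2
    return abs(len(n1) - len(n2)) + abs(len(e1) - len(e2))

def fitness(g_model, g_target):
    """Fitness guiding an evolutionary search: 1.0 for a perfect match, lower otherwise."""
    return 1.0 / (1.0 + simple_graph_distance(g_model, g_target))
```

A full implementation would label graph nodes with anatomical region types and use a proper graph-edit-distance routine, but the shape of the fitness function is the same.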
Using a situation awareness approach to determine decision-making behaviour in squash.
Murray, Stafford; James, Nic; Perš, Janez; Mandeljc, Rok; Vučković, Goran
2018-06-01
Situation awareness (SA) refers to the awareness of all relevant sources of information, the ability to synthesise this information using domain knowledge gained from past experience, and the ability to respond physically to a situation. Expert-novice differences in decision-making in complex situations have been widely reported, although the small differences in expert behaviour are more elusive to determine. This study considered how expert squash players use SA to decide which shot to play. Matches at the 2010 (n = 14) and 2011 (n = 27) Rowe British Grand Prix were recorded and processed using Tracker software. Shot type, ball location, players' positions on court and movement parameters, between the time an opponent played the shot prior to the player's shot and the time of the opponent's following shot, were captured 25 times per second. Six SA clusters were named to relate to the outcome of a shot, ranging from a defensive shot played under pressure to create time, to an attempted winner played under no pressure with the opponent out of position. This new methodology found fine-grained SA differences in expert behaviour, even for the same shot type played from the same court area, beyond the usual expert-novice differences.
NASA Astrophysics Data System (ADS)
Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram
2016-04-01
Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
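A minimal sketch of the combination idea: a logistic model trained on records that pair a CAD score with clinical features, so that both sources contribute to one risk estimate. Logistic regression is a stand-in here (the abstract does not name the combiner), and the toy features below are invented, not the study's 12 clinical features.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Stochastic gradient descent on the logistic loss.
    Each row of X pairs a CAD score with (invented) clinical features."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for row, label in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, row)) + b)
            g = p - label  # gradient of the logistic loss w.r.t. the pre-activation
            w = [wj - lr * g * xj for wj, xj in zip(w, row)]
            b -= lr * g
    return w, b

def predict(w, b, row):
    """Estimated probability of active disease for one patient record."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, row)) + b)

# Toy records (hypothetical): [CAD score 0..1, scaled cough duration, HIV status 0/1].
X = [[0.9, 0.8, 1], [0.8, 0.6, 0], [0.7, 0.9, 1],
     [0.2, 0.1, 0], [0.3, 0.2, 0], [0.1, 0.3, 1]]
y = [1, 1, 1, 0, 0, 0]
w, b = train_logistic(X, y)
```

Thresholding the predicted probability then gives the screening decision, and sweeping that threshold traces out the ROC curve the abstract reports.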
A visual analytic framework for data fusion in investigative intelligence
NASA Astrophysics Data System (ADS)
Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David
2014-05-01
Intelligence analysis depends on data fusion systems to provide capabilities for detecting and tracking important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and the loss of transparency in the data fusion process, analysts are left out of the data fusion cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework for developing an alternative data fusion architecture. The idea is inspired by recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, in which the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goetz, J.; Klemm, J.; Ortlieb, E.
The radiation dose is reconstructed for 3d MCPAEB personnel participating in exercises involving helicopter-lifted assaults in conjunction with Shot Bee of Operation Teapot, Exercise Desert Rock VI. Brigade personnel were exposed to initial radiation while in trenches at the time of the Shot Bee detonation. They were also exposed to residual radiation from an earlier test shot (Shot Turk) during their subsequent maneuvers and to residual radiation from Shot Bee during an inspection of equipment displays. The calculated total gamma doses to the bulk of the participating troops range from about 0.57 to 0.85 rem.
What You Can Expect with a Cortisone Shot
... should avoid before your cortisone shot. What you can expect During the cortisone shot Your doctor might ... ll then be positioned so that your doctor can easily insert the needle. The area around the ...
DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin; Uram, Thomas D.; Benson, Andrew J.; Campbell, Duncan; Cora, Sofía A.; DeRose, Joseph; Di Matteo, Tiziana; Habib, Salman; Hearin, Andrew P.; Bryce Kalmbach, J.; Krughoff, K. Simon; Lanusse, François; Lukić, Zarija; Mandelbaum, Rachel; Newman, Jeffrey A.; Padilla, Nelson; Paillas, Enrique; Pope, Adrian; Ricker, Paul M.; Ruiz, Andrés N.; Tenneti, Ananth; Vega-Martínez, Cristian A.; Wechsler, Risa H.; Zhou, Rongpu; Zu, Ying; The LSST Dark Energy Science Collaboration
2018-02-01
The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
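A validation framework of this shape, i.e. a common test interface run automatically across an inhomogeneous set of synthetic catalogs, can be sketched as below. The class names, test, and catalog fields are hypothetical and do not reflect the real DESCQA API.

```python
class ValidationTest:
    """Common interface every validation test implements (hypothetical, not the DESCQA API)."""
    name = "base"

    def run(self, catalog):
        """Return (passed: bool, score: float) for one synthetic catalog."""
        raise NotImplementedError

class StellarMassRangeTest(ValidationTest):
    """Example test: check that catalog stellar masses fall in a physical range."""
    name = "stellar_mass_range"

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def run(self, catalog):
        masses = catalog["stellar_mass"]
        frac_ok = sum(self.lo <= m <= self.hi for m in masses) / len(masses)
        return frac_ok >= 0.99, frac_ok

def run_suite(tests, catalogs):
    """Run every test on every catalog; return {(catalog_name, test_name): (passed, score)}."""
    return {(cname, t.name): t.run(cat)
            for cname, cat in catalogs.items() for t in tests}
```

The pay-off of this structure is exactly what the abstract emphasises: catalog providers and validation-metric authors only agree on the `run` interface, so new catalogs and new tests can be added independently and compared in one summary matrix.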
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palutke, S., E-mail: steffen.palutke@desy.de; Wurth, W.; Deutsches Elekronen Synchrotron
The setup and first results from commissioning of a fast online photon energy spectrometer for the vacuum-ultraviolet free-electron laser in Hamburg (FLASH) at DESY are presented. Using the latest advances in detector development, the spectrometer reaches readout frequencies of up to 1 MHz. In this paper, we demonstrate the ability to record online photon energy spectra on a shot-to-shot basis in the multi-bunch mode of FLASH. Clearly resolved shifts in the mean wavelength over the pulse train, as well as shot-to-shot wavelength fluctuations arising from the statistical nature of the photon-generating self-amplified spontaneous emission process, have been observed. In addition to serving as an online tool for beam calibration and photon diagnostics, the spectrometer enables spectral data taken with a transparent experiment up front to be selected according to the photon energy of every shot. This leads to higher spectral resolution without the loss of efficiency or photon flux incurred by using single-bunch mode or monochromators.
An Automation Framework for Neural Nets that Learn
ERIC Educational Resources Information Center
Kilmer, W. L.; Arbib, M. A.
1973-01-01
A discussion of several types of formal neurons, many of whose functions are modifiable by their own input stimuli. The language of finite automata is used to mathematicize the problem of adaptation sufficiently to remove some ambiguities of Brindley's approach. (Author)
Ghani, Milad; Font Picó, Maria Francesca; Salehinia, Shima; Palomino Cabello, Carlos; Maya, Fernando; Berlier, Gloria; Saraji, Mohammad; Cerdà, Víctor; Turnes Palomino, Gemma
2017-03-10
We present for the first time the application of metal-organic framework (MOF) mixed-matrix disks (MMD) for the automated flow-through solid-phase extraction (SPE) of environmental pollutants. Zirconium terephthalate UiO-66 and UiO-66-NH2 MOFs of different crystal sizes (90, 200 and 300 nm) have been incorporated into mechanically stable polyvinylidene difluoride (PVDF) disks. The performance of the MOF-MMDs for automated SPE of seven substituted phenols prior to HPLC analysis has been evaluated using the sequential injection analysis technique. MOF-MMDs enabled the simultaneous extraction of phenols with the concomitant size exclusion of molecules of larger size. The best extraction performance was obtained using a MOF-MMD containing 90 nm UiO-66-NH2 crystals. Using the selected MOF-MMD, detection limits ranging from 0.1 to 0.2 μg L⁻¹ were obtained. Relative standard deviations ranged from 3.9 to 5.3% intra-day, and 4.7 to 5.7% inter-day. Membrane batch-to-batch reproducibility was from 5.2 to 6.4%. Three different groundwater samples were analyzed with the proposed method using MOF-MMDs, obtaining recoveries ranging from 90 to 98% for all tested analytes. Copyright © 2017 Elsevier B.V. All rights reserved.
Facts about Vitamin K Deficiency Bleeding
... K shot into a muscle in the thigh. One shot given just after birth will protect your baby ... easily preventable with just a single vitamin K shot at birth. References 1. Zipursky A. Prevention of vitamin K deficiency bleeding ...
Protect Your Child against Hib Disease
... 4 months old. With both vaccines, children need one booster shot when they are 12 through 15 months old. ... series of Hib shots as babies and need one booster shot when they are 12 through 15 months old. ...
... Ask your provider for a chart. Keep your shots 1 inch (2.5 centimeters) away from scars and 2 inches (5 centimeters) away from your navel. DO NOT put a shot in a spot that is bruised, swollen, or ...
Removal of Retained Lead Shot Through Laparoscopic Appendectomy
Lloyd, D. M.
2003-01-01
We describe a patient presenting with lead shot in his appendix. A plain radiograph of his lumbar spine was performed for back pain, and an incidental finding of lead shot retained within the appendix was seen. Lead shot in the appendix is associated with appendicitis, and 2 cases of lead intoxication have been reported. We suggest that an elective laparoscopic appendectomy should be offered to patients as a possible management option. PMID:12856854
A compact and versatile tender X-ray single-shot spectrometer for online XFEL diagnostics.
Rehanek, Jens; Milne, Christopher J; Szlachetko, Jakub; Czapla-Masztafiak, Joanna; Schneider, Jörg; Huthwelker, Thomas; Borca, Camelia N; Wetter, Reto; Patthey, Luc; Juranić, Pavle
2018-01-01
One of the remaining challenges for accurate photon diagnostics at X-ray free-electron lasers (FELs) is the shot-to-shot, non-destructive, high-resolution characterization of the FEL pulse spectrum at photon energies between 2 keV and 4 keV, the so-called tender X-ray range. Here, a spectrometer setup is reported, based on the von Hamos geometry and using elastic scattering as a fingerprint of the FEL-generated spectrum. It is capable of pulse-to-pulse measurement of the spectrum with an energy resolution (ΔE/E) of 10⁻⁴ within a bandwidth of 2%. The Tender X-ray Single-Shot Spectrometer (TXS) will grant experimental scientists the freedom to measure the spectrum in a single shot while keeping the transmitted beam undisturbed. It will enable single-shot reconstructions for easier and faster data analysis.
NASA Astrophysics Data System (ADS)
Tamaru, S.; Kubota, H.; Yakushiji, K.; Fukushima, A.; Yuasa, S.
2017-11-01
This work presents a technique to calibrate the spin torque oscillator (STO) measurement system by utilizing the whiteness of shot noise. The raw shot noise spectrum in a magnetic tunnel junction based STO in the microwave frequency range is obtained by first subtracting the baseline noise, and then excluding the field dependent mag-noise components reflecting the thermally excited spin wave resonances. As the shot noise is guaranteed to be completely white, the total gain of the signal path should be proportional to the shot noise spectrum obtained by the above procedure, which allows for an accurate gain calibration of the system and a quantitative determination of each noise power. The power spectral density of the shot noise as a function of bias voltage obtained by this technique was compared with a theoretical calculation, which showed excellent agreement when the Fano factor was assumed to be 0.99.
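The calibration procedure described above can be sketched in a few lines: subtract the baseline noise, mask the mag-noise bins, and divide the remaining (white) spectrum by the theoretical shot-noise level 2eIF to obtain the gain at each frequency. The function below is a minimal illustration under these assumptions; the array and parameter names are ours, not the paper's.

```python
import numpy as np

E_CHARGE = 1.602176634e-19  # elementary charge [C]

def gain_from_shot_noise(raw_psd, baseline_psd, mag_noise_mask,
                         bias_current, fano=1.0):
    """Estimate the signal-path gain at each frequency bin.

    Because shot noise is white, the baseline-subtracted spectrum
    (with thermally excited mag-noise bins masked out) should be a
    flat line scaled by the total gain; dividing by the theoretical
    shot-noise level 2*e*I*F recovers that gain bin by bin.
    All names here are illustrative, not from the paper.
    """
    net = np.asarray(raw_psd, float) - np.asarray(baseline_psd, float)
    net = np.where(mag_noise_mask, np.nan, net)    # exclude mag-noise peaks
    s_shot = 2.0 * E_CHARGE * bias_current * fano  # white reference level
    gain = net / s_shot
    # interpolate the gain across the masked bins
    bins = np.arange(gain.size)
    ok = ~np.isnan(gain)
    return np.interp(bins, bins[ok], gain[ok])
```

In practice the recovered gain curve is then used to convert any measured spectrum back to absolute power spectral density.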
High precision automated face localization in thermal images: oral cancer dataset as test case
NASA Astrophysics Data System (ADS)
Chakraborty, M.; Raman, S. K.; Mukhopadhyay, S.; Patsa, S.; Anjum, N.; Ray, J. G.
2017-02-01
Automated face detection is the pivotal step in computer vision aided facial medical diagnosis and biometrics. This paper presents an automatic, subject-adaptive framework for accurate face detection in the long infrared spectrum on our database for oral cancer detection, consisting of malignant, precancerous and normal subjects of varied age groups. Previous work on oral cancer detection using Digital Infrared Thermal Imaging (DITI) reveals that patients and normal subjects differ significantly in their facial thermal distribution. It is therefore a challenging task to formulate a completely adaptive framework to veraciously localize the face in such a subject-specific modality. Our model first extracts the most probable facial regions by minimum error thresholding, followed by adaptive methods that leverage the horizontal and vertical projections of the segmented thermal image. Additionally, the model incorporates our domain knowledge of exploiting temperature differences between strategic locations of the face. To the best of our knowledge, this is the pioneering work on detecting faces in thermal facial images comprising both patients and normal subjects. Previous work on face detection has not specifically targeted automated medical diagnosis; the face bounding boxes returned by those algorithms are thus loose and not apt for further medical automation. Our algorithm significantly outperforms contemporary face detection algorithms in terms of commonly used metrics for evaluating face detection accuracy. Since our method has been tested on a challenging dataset consisting of both patients and normal subjects of diverse age groups, it can be seamlessly adapted to any DITI-guided facial healthcare or biometric application.
... inside the cheek or under the tongue. Injections (shots): There are two different kinds of shots: • Under the skin: Medicine is placed just under ... Subcutaneous (sub-kyu-TAY-nee-yus) injection: A shot under the skin. Sublingual: Under the tongue. Supplements: ...
Your Guide to Medicare's Preventive Services
... often is it covered? Most people only need one shot once in their lifetime. A different, second shot, ... Welcome to Medicare” preventive visit Medicare covers a one-time ... on important screenings and shots and to talk with your doctor about your ...
35. INTERIOR VIEW, WHEELBRATOR-FRYE SHOT PEENER FOR REMOVAL OF RUST AND SCALE; NOTE TOOLS ARE TUMBLED AND BLASTED WITH LEAD SHOT TO CLEAN SURFACES - Warwood Tool Company, Foot of Nineteenth Street, Wheeling, Ohio County, WV
An information extraction framework for cohort identification using electronic health records.
Liu, Hongfang; Bielinski, Suzette J; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B; Jonnalagadda, Siddhartha R; Ravikumar, K E; Wu, Stephen T; Kullo, Iftikhar J; Chute, Christopher G
2013-01-01
Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at the point of care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high-performance IE system can be very challenging to construct owing to the complexity and dynamic nature of human language. In this paper, we report a knowledge-driven IE framework for cohort identification using EHRs, developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework.
Experimental research control software system
NASA Astrophysics Data System (ADS)
Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.
2014-05-01
A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required, and the scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data-acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data-acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for quickly implementing new software and hardware interfaces. Although the software is under continuous development, with new features being added, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
Hermes III endpoint energy calculation from photonuclear activation of 197Au and 58Ni foils
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parzyck, Christopher Thomas
2014-09-01
A new process has been developed to characterize the endpoint energy of HERMES III on a shot-to-shot basis using standard dosimetry tools from the Sandia Radiation Measurements Laboratory. Photonuclear activation readings from nickel and gold foils are used in conjunction with calcium fluoride thermoluminescent dosimeters to derive estimated electron endpoint energies for a series of HERMES shots. The results are reasonably consistent with the expected endpoint voltages on those shots.
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.
2017-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily relying on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open source, including an R Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
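The traditional point-based automated quality flagging mentioned above typically combines a handful of plausibility tests. A minimal sketch, assuming standard range, step (spike), and persistence (stuck-sensor) tests; the thresholds and names are illustrative, not NEON's actual configuration:

```python
import numpy as np

def quality_flags(x, valid_range=(-50.0, 60.0), max_step=10.0,
                  persist_window=5, persist_min_var=1e-6):
    """Point-based QA flags in the spirit of automated plausibility
    tests: a range test, a step (spike) test, and a persistence
    (stuck-sensor) test. Thresholds here are illustrative assumptions."""
    x = np.asarray(x, float)
    n = x.size
    flags = {
        "range": (x < valid_range[0]) | (x > valid_range[1]),
        "step": np.zeros(n, bool),
        "persistence": np.zeros(n, bool),
    }
    # step test: flag a point whose jump from its predecessor is too large
    flags["step"][1:] = np.abs(np.diff(x)) > max_step
    # persistence test: flag windows with essentially zero variance
    for i in range(n - persist_window + 1):
        if np.var(x[i:i + persist_window]) < persist_min_var:
            flags["persistence"][i:i + persist_window] = True
    flags["any"] = flags["range"] | flags["step"] | flags["persistence"]
    return flags
```

Each per-test flag vector can then be propagated downstream, preserving the traceability of quality information that the framework emphasizes.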
Hsieh, Ru-Lan; Lee, Wen-Chung
2002-11-01
To investigate the therapeutic effects of one shot of low-frequency percutaneous electrical nerve stimulation versus one shot of transcutaneous electrical nerve stimulation in patients with low back pain. In total, 133 low back pain patients were recruited for this randomized, controlled study. Group 1 patients received medication only. Group 2 patients received medication plus one shot of percutaneous electrical nerve stimulation. Group 3 patients received medication plus one shot of transcutaneous electrical nerve stimulation. Therapeutic effects were measured using a visual analog scale, body surface score, pain pressure threshold, and the Quebec Back Pain Disability Scale. Immediately after one-shot treatment, the visual analog scale improved 1.53 units and the body surface score improved 3.06 units in the percutaneous electrical nerve stimulation group. In the transcutaneous electrical nerve stimulation group, the visual analog scale improved 1.50 units and the body surface score improved 3.98 units. The improvements did not differ between the two groups. There were no differences in improvement at 3 days or 1 wk after treatment among the three groups. Simple one-shot treatment with percutaneous or transcutaneous electrical nerve stimulation provided immediate pain relief for low back pain patients. One-shot transcutaneous electrical nerve stimulation is recommended because of the rarity of side effects and its convenient application.
High resolution human diffusion tensor imaging using 2-D navigated multi-shot SENSE EPI at 7 Tesla
Jeong, Ha-Kyu; Gore, John C.; Anderson, Adam W.
2012-01-01
The combination of parallel imaging with partial Fourier acquisition has greatly improved the performance of diffusion-weighted single-shot EPI and is the preferred method for acquisitions at low to medium magnetic field strength such as 1.5 or 3 Tesla. Increased off-resonance effects and reduced transverse relaxation times at 7 Tesla, however, generate more significant artifacts than at lower magnetic field strength and limit data acquisition. Additional acceleration of k-space traversal using a multi-shot approach, which acquires a subset of k-space data after each excitation, reduces these artifacts relative to conventional single-shot acquisitions. However, corrections for motion-induced phase errors are not straightforward in accelerated, diffusion-weighted multi-shot EPI because of phase aliasing. In this study, we introduce a simple acquisition and corresponding reconstruction method for diffusion-weighted multi-shot EPI with parallel imaging suitable for use at high field. The reconstruction uses a simple modification of the standard SENSE algorithm to account for shot-to-shot phase errors; the method is called Image Reconstruction using Image-space Sampling functions (IRIS). Using this approach, reconstruction from highly aliased in vivo image data using 2-D navigator phase information is demonstrated for human diffusion-weighted imaging studies at 7 Tesla. The final reconstructed images show submillimeter in-plane resolution with no ghosts and much reduced blurring and off-resonance artifacts. PMID:22592941
Rousanoglou, Elissavet; Noutsos, Konstantinos; Bayios, Ioannis; Boudolos, Konstantinos
2014-01-01
The purpose of this study was to examine the differences in the ground reaction force (GRF) patterns between elite and novice players during two types of handball shots, as well as the relationships between throwing performance and the GRF variables. Ball velocity and throwing accuracy were measured during jump shots and 3-step shots performed by 15 elite and 15 novice players. The GRF pattern was recorded for the vertical and the anterior-posterior GRF components (Kistler forceplate type-9281, 750Hz). One-way ANOVA was used for the group differences and the Pearson coefficient for the correlation between throwing performance and GRF variables (SPSS 21.0, p ≤ 0.05). The elite players performed better in both types of shot. Both groups developed consistent and similar GRF patterns, except for the novices’ inconsistent Fz pattern in the 3-step shot. The GRF variables differed significantly between groups in the 3-step shot (p ≤ 0.05). Significant correlations were found only for ball velocity and predominantly for the novice players during the 3-step shot (p ≤ 0.05). The results possibly highlight a shortage in the novice ability to effectively reduce their forward momentum so as to provide a stable base of support for the momentum transfer up the kinetic chain, a situation that may predispose athletes to injury. PMID:25031672
DOT National Transportation Integrated Search
2010-01-01
The current project, funded by MIOH-UTC for the period 1/1/2009-4/30/2010, is concerned with the development of the framework for a transportation facility inspection system using advanced image processing techniques. The focus of this study is ...
Linguistics and Information Science
ERIC Educational Resources Information Center
Montgomery, Christine A.
1972-01-01
This paper defines the relationship between linguistics and information science in terms of a common interest in natural language. The concept of a natural language information system is introduced as a framework for reviewing automated language processing efforts by computational linguists and information scientists. (96 references) (Author)
Get Your Flu Shot!| NIH MedlinePlus the Magazine
Feature: Flu Shot Get Your Flu Shot! Past Issues / Winter 2011 Table of Contents ... failure, or lung disease "For the 2010–2011 flu season, the flu vaccine provides protection against the ...
Allergy Shots: Could They Help Your Allergies?
... do I have to get? Most people get 1 or 2 shots each week at first. After about 6 months ... Teens, Procedures & Devices, Your Health ResourcesTags: allergy, allergy shots April 1, 1998 Copyright © American Academy of Family Physicians This ...
What Vaccinations Do You Need?
... antibodies. This is why some vaccines might need one shot, while others need more than one shot. In some cases, a blood test is used ... Depending on the vaccine, you may need only one shot to protect you for life. Other vaccines may ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yao; Balaprakash, Prasanna; Meng, Jiayuan
We present Raexplore, a performance modeling framework for architecture exploration. Raexplore enables rapid, automated, and systematic search of architecture design space by combining hardware counter-based performance characterization and analytical performance modeling. We demonstrate Raexplore for two recent manycore processors, the IBM BlueGene/Q compute chip and the Intel Xeon Phi, targeting a set of scientific applications. Our framework is able to capture complex interactions between architectural components including instruction pipeline, cache, and memory, and to achieve a 3–22% error for same-architecture and cross-architecture performance predictions. Furthermore, we apply our framework to assess the two processors, and discover and evaluate a list of architectural scaling options for future processor designs.
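As a rough illustration of how counter-derived application characteristics can feed an analytical model, a roofline-style bound on execution time can be computed from measured flop and byte counts plus a machine's peak rates. This toy model is our simplification for illustration, not Raexplore's actual model:

```python
def predict_time(flops, bytes_moved, peak_flops, peak_bw):
    """Minimal roofline-style analytical estimate: execution time is
    bounded by the slower of compute and memory traffic. A toy stand-in
    for the counter-based + analytical modeling the framework combines;
    all parameter names and values here are illustrative assumptions.

    flops:       floating-point operations (from hardware counters)
    bytes_moved: bytes of memory traffic (from hardware counters)
    peak_flops:  machine peak [flop/s]; peak_bw: peak bandwidth [B/s]
    """
    return max(flops / peak_flops, bytes_moved / peak_bw)
```

Cross-architecture prediction then amounts to re-evaluating the same application counts against a different machine's peak rates.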
Formal Compiler Implementation in a Logical Framework
2003-04-29
variable set [], we omit the brackets and use the simpler notation v. MetaPRL is a tactic-based prover that uses OCaml [20] as its meta-language. When a...rewrite is defined in MetaPRL, the framework creates an OCaml expression that can be used to apply the rewrite. Code to guide the application of...rewrites is written in OCaml , using a rich set of primitives provided by MetaPRL. MetaPRL automates the construction of most guidance code; we describe
AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsai, Yingssu; Stanford University, 333 Campus Drive, Mudd Building, Stanford, CA 94305-5080; McPhillips, Scott E.
New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually; the workflow cycled successfully through all of the samples, collected data from the same samples that were selected manually and located the same peaks of unmodeled density in the resulting difference Fourier maps.
Pain, Deborah J.; Cromie, Ruth L.; Newth, Julia; Brown, Martin J.; Crutcher, Eric; Hardman, Pippa; Hurst, Louise; Mateo, Rafael; Meharg, Andrew A.; Moran, Annette C.; Raab, Andrea; Taggart, Mark A.; Green, Rhys E.
2010-01-01
Background Lead is highly toxic to animals. Humans eating game killed using lead ammunition generally avoid swallowing shot or bullets and dietary lead exposure from this source has been considered low. Recent evidence illustrates that lead bullets fragment on impact, leaving small lead particles widely distributed in game tissues. Our paper asks whether lead gunshot pellets also fragment upon impact, and whether lead derived from spent gunshot and bullets in the tissues of game animals could pose a threat to human health. Methodology/Principal Findings Wild-shot gamebirds (6 species) obtained in the UK were X-rayed to determine the number of shot and shot fragments present, and cooked using typical methods. Shot were then removed to simulate realistic practice before consumption, and lead concentrations determined. Data from the Veterinary Medicines Directorate Statutory Surveillance Programme documenting lead levels in raw tissues of wild gamebirds and deer, without shot being removed, are also presented. Gamebirds containing ≥5 shot had high tissue lead concentrations, but some with fewer or no shot also had high lead concentrations, confirming X-ray results indicating that small lead fragments remain in the flesh of birds even when the shot exits the body. A high proportion of samples from both surveys had lead concentrations exceeding the European Union Maximum Level of 100 ppb w.w. (0.1 mg kg−1 w.w.) for meat from bovine animals, sheep, pigs and poultry (no level is set for game meat), some by several orders of magnitude. High, but feasible, levels of consumption of some species could result in the current FAO/WHO Provisional Weekly Tolerable Intake of lead being exceeded. Conclusions/Significance The potential health hazard from lead ingested in the meat of game animals may be larger than previous risk assessments indicated, especially for vulnerable groups, such as children, and those consuming large amounts of game. PMID:20436670
Real-time inverse planning for Gamma Knife radiosurgery.
Wu, Q Jackie; Chankong, Vira; Jitprapaikulsarn, Suradet; Wessels, Barry W; Einstein, Douglas B; Mathayomchan, Boonyanit; Kinsella, Timothy J
2003-11-01
The challenges of real-time Gamma Knife inverse planning are the large number of variables involved and the unknown search space a priori. With limited collimator sizes, shots have to be heavily overlapped to form a smooth prescription isodose line that conforms to the irregular target shape. Such overlaps greatly influence the total number of shots per plan, making pre-determination of the total number of shots impractical. However, this total number of shots usually defines the search space, a pre-requisite for most of the optimization methods. Since each shot only covers part of the target, a collection of shots in different locations and various collimator sizes selected makes up the global dose distribution that conforms to the target. Hence, planning or placing these shots is a combinatorial optimization process that is computationally expensive by nature. We have previously developed a theory of shot placement and optimization based on skeletonization. The real-time inverse planning process, reported in this paper, is an expansion and the clinical implementation of this theory. The complete planning process consists of two steps. The first step is to determine an optimal number of shots including locations and sizes and to assign initial collimator size to each of the shots. The second step is to fine-tune the weights using a linear-programming technique. The objective function is to minimize the total dose to the target boundary (i.e., maximize the dose conformity). Results of an ellipsoid test target and ten clinical cases are presented. The clinical cases are also compared with physician's manual plans. The target coverage is more than 99% for manual plans and 97% for all the inverse plans. The RTOG PITV conformity indices for the manual plans are between 1.16 and 3.46, compared to 1.36 to 2.4 for the inverse plans. All the inverse plans are generated in less than 2 min, making real-time inverse planning a reality.
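The second step (weight fine-tuning) can be posed as a small linear program: minimize the total dose delivered to the target boundary subject to every target voxel reaching the prescription dose. A sketch using scipy.optimize.linprog, with illustrative matrix and variable names; the paper's actual formulation may include additional constraints:

```python
import numpy as np
from scipy.optimize import linprog

def tune_shot_weights(dose_target, dose_boundary, prescription=1.0):
    """Fine-tune per-shot weights by linear programming, in the spirit
    of the planning method's second step: minimize total boundary dose
    while every target voxel still reaches the prescription dose.
    Matrix/variable names are illustrative assumptions.

    dose_target:   (n_target_voxels, n_shots) dose per unit shot weight
    dose_boundary: (n_boundary_voxels, n_shots) dose per unit shot weight
    """
    n_shots = dose_target.shape[1]
    c = dose_boundary.sum(axis=0)  # total boundary-dose coefficients
    # linprog uses A_ub @ w <= b_ub, so negate for >= constraints
    res = linprog(c,
                  A_ub=-dose_target,
                  b_ub=-np.full(dose_target.shape[0], prescription),
                  bounds=[(0, None)] * n_shots,
                  method="highs")
    return res.x
```

With two non-overlapping shots each covering one target voxel, the optimum is simply the smallest weights that meet the prescription, which makes the formulation easy to sanity-check.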
Query by example video based on fuzzy c-means initialized by fixed clustering center
NASA Astrophysics Data System (ADS)
Hou, Sujuan; Zhou, Shangbo; Siddique, Muhammad Abubakar
2012-04-01
Currently, the high complexity of video content poses two major challenges for fast retrieval: (1) efficient similarity measurement and (2) efficient indexing of compact representations. A video-retrieval strategy based on fuzzy c-means (FCM) clustering is presented for querying by example. The query video is first segmented into a set of shots, each shot is represented by a key frame, and video-processing techniques are then used to extract visual cues representing each key frame. Because the FCM algorithm is sensitive to initialization, the cluster centers are initialized with the shots of the query video to achieve appropriate convergence. After the FCM cluster is initialized by the query video, each shot of the query video serves as a benchmark point in its cluster, and each shot in the database carries a class label. The similarity between a database shot and the benchmark point with the same class label can then be expressed as the distance between them. Finally, the similarity between the query video and a database video is given by the number of similar shots. Our experimental results demonstrate the performance of the proposed approach.
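The final matching step can be sketched as follows: the query video's shot features act as fixed cluster centers, each database shot is assigned to its nearest center, and the video-level similarity is the count of database shots lying close to their benchmark. This is a simplified, numpy-only illustration; the full fuzzy-membership update and the feature extraction are omitted, and the distance threshold is an assumption:

```python
import numpy as np

def count_similar_shots(query_shots, db_shots, threshold=0.5):
    """Simplified sketch of the retrieval step: query-shot features are
    fixed cluster centers (the FCM initialization described above); each
    database shot is labeled by its nearest center, and the video-level
    similarity is the number of database shots within `threshold` of
    their benchmark center. Names and the threshold are illustrative."""
    q = np.asarray(query_shots, float)   # (n_query, d) key-frame features
    d = np.asarray(db_shots, float)      # (n_db, d) key-frame features
    # pairwise distances from every database shot to every center
    dists = np.linalg.norm(d[:, None, :] - q[None, :, :], axis=2)
    nearest = dists.min(axis=1)          # distance to benchmark center
    return int((nearest <= threshold).sum())
```

Ranking candidate videos by this count then yields the query-by-example result list.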
Detection of hidden shot balls in a gas-cooled turbine blade with an NRT gadolinium tagging method
NASA Astrophysics Data System (ADS)
Sim, Cheul Muu; Kim, Yi Kyung; Kim, TaeJoo; Lee, Kye Hong; Kim, Jeong Uk
2009-06-01
This report provides a preliminary insight into the benefits and effectiveness of neutron radiography in identifying alien materials, namely shot balls hidden in a turbine blade that otherwise go undetected by other methods. The detection of 0.2-mm-diameter shot balls in gas-cooled turbine blades is possible with thermal neutron radiography. A tagging process yields a more distinctive image of newer turbine blades. Areas of concern for the tagging process include the solution concentration and the possibility of a slight washing of the blades. The location of the shot balls within the turbine blades tagged with Gd ((2%, 5%) + water) was shown. Shot balls were placed externally on the turbine blade (F100-700, F100-200) surface in order to check for a dead zone in the surface examination. The image is produced by neutron radiography after a 5 min exposure time. When the blade is tagged with 2% or 5% Gd and slightly washed, the shot can also be seen effectively on SR-45 film. Shot balls are more obvious on a neutron image recorded on SR-45 film than on an image plate or a DR film.
Deep learning based beat event detection in action movie franchises
NASA Astrophysics Data System (ADS)
Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.
2018-04-01
Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises, with ground-truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat events on this dataset. A training dataset for each of the eleven beat categories is constructed and a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels of the key frames in a particular shot are then used to assign a unique label to the shot. A simple sliding-window method then groups adjacent shots with the same label in order to find a particular beat event. The results of beat-event classification are presented in terms of precision, recall, and F-measure. The results are compared with an existing technique, and significant improvements are recorded.
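The final grouping step described above, merging adjacent shots that share a label into one beat event, can be sketched directly; the `min_run` noise filter is our assumption, not a detail from the paper:

```python
def group_shots_into_events(shot_labels, min_run=2):
    """Group adjacent shots sharing a label into beat events, mirroring
    the simple sliding/merging step described above. Runs shorter than
    `min_run` shots are discarded as noise (`min_run` is an assumption).
    Returns (label, start_index, end_index_inclusive) tuples."""
    events, start = [], 0
    for i in range(1, len(shot_labels) + 1):
        # a run ends at the sequence end or when the label changes
        if i == len(shot_labels) or shot_labels[i] != shot_labels[start]:
            if i - start >= min_run:
                events.append((shot_labels[start], start, i - 1))
            start = i
    return events
```

Feeding the per-shot labels produced by the CNN stage into this function yields the beat-event segmentation that is then scored against the ground-truth annotations.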
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ancora, Stefania; Bianchi, Nicola; Leonzio, Claudio
2008-06-15
Waterbirds are exposed to many contaminants, including lead from ingestion of shot and fishing sinkers. Lead poisoning had never been reported in flamingos wintering in Italian wetlands. Our investigation stems from a case of four flamingos found dead in Tuscany in 2002 with numerous lead shot in their gizzards. We therefore examined other specimens found dead in different Italian wetlands. Numerous lead shot found in gizzards, together with lead tissue concentrations, confirmed the hypothesis of lead poisoning in two of the seven specimens analysed: concentrations in liver, kidney, and bone were 361.3, 265.09, and 43.31 µg/g d.w., respectively. Lead organotropism was typical of acute poisoning. Cadmium and mercury were also determined, and found to be in line with what little data are available on this species in the literature. Although Italy recently endorsed the African-Eurasian Waterbird Agreement (AEWA) prohibiting the use of lead shot for hunting in wetlands, our results reveal a first case of lead shot poisoning in flamingos wintering in Italian wetlands. This evidence sounds a further warning of the problem of spent lead shot in countries where hunting in wetlands is not strictly regulated.
Kinematic and kinetic analysis of overhand, sidearm and underhand lacrosse shot techniques.
Macaulay, Charles A J; Katz, Larry; Stergiou, Pro; Stefanyshyn, Darren; Tomaghelli, Luciano
2017-12-01
Lacrosse requires the coordinated performance of many complex skills. One of these skills is shooting on the opponents' net using one of three techniques: overhand, sidearm or underhand. The purpose of this study was to (i) determine which technique generated the highest ball velocity and greatest shot accuracy and (ii) identify kinematic and kinetic variables that contribute to a high velocity and high accuracy shot. Twelve elite male lacrosse players participated in this study. Kinematic data were sampled at 250 Hz, while two-dimensional force plates collected ground reaction force data (1000 Hz). Statistical analysis showed significantly greater ball velocity for the sidearm technique than overhand (P < 0.001) and underhand (P < 0.001) techniques. No statistical difference was found for shot accuracy (P > 0.05). Kinematic and kinetic variables were not significantly correlated to shot accuracy or velocity across all shot types; however, when analysed independently, the lead foot horizontal impulse showed a negative correlation with underhand ball velocity (P = 0.042). This study identifies the technique with the highest ball velocity, defines kinematic and kinetic predictors related to ball velocity and provides information to coaches and athletes concerned with improving lacrosse shot performance.
Single-shot spectroscopy of broadband Yb fiber laser
NASA Astrophysics Data System (ADS)
Suzuki, Masayuki; Yoneya, Shin; Kuroda, Hiroto
2017-02-01
We report real-time single-shot spectroscopy of a broadband Yb-doped fiber (YDF) laser based on nonlinear polarization evolution, using a time-stretched dispersive Fourier transformation technique. We measured 8000 consecutive single-shot spectra of both mode locking and noise-like pulses (NLP), since our broadband YDF oscillator can operate in either regime by adjusting the pump LD power and the waveplate angles. Shot-to-shot spectral fluctuation was observed in the NLP regime. To investigate pulse-formation dynamics, we measured, for the first time, the spectral evolution during the initial fluctuations of the mode-locked broadband YDF laser at intracavity dispersions of 1500 and 6200 fs². In both cases the build-up time between cw operation and steady-state mode locking was estimated to be 50 µs, but the dynamics of the spectral evolution between cw operation and mode locking were completely different. Strong shot-to-shot spectral fluctuation, as seen in the NLP spectra, was observed in the initial 20 µs at an intracavity dispersion of 1500 fs². These findings should help in understanding the birth of broadband spectral formation in fiber laser oscillators.
A design automation framework for computational bioenergetics in biological networks.
Angione, Claudio; Costanza, Jole; Carapezza, Giovanni; Lió, Pietro; Nicosia, Giuseppe
2013-10-01
The bioenergetic activity of mitochondria can be thoroughly investigated by using computational methods. In particular, in our work we focus on ATP and NADH, namely the metabolites representing the production of energy in the cell. We develop a computational framework to perform an exhaustive investigation at the level of species, reactions, genes and metabolic pathways. The framework integrates several methods implementing the state-of-the-art algorithms for many-objective optimization, sensitivity, and identifiability analysis applied to biological systems. We use this computational framework to analyze three case studies related to the human mitochondria and the algal metabolism of Chlamydomonas reinhardtii, formally described with algebraic differential equations or flux balance analysis. Integrating the results of our framework applied to interacting organelles would provide a general-purpose method for assessing the production of energy in a biological network.
NASA Astrophysics Data System (ADS)
Jansson, Samuel; Brydegaard, Mikkel; Papayannis, Alexandros; Tsaknakis, Georgios; Åkesson, Susanne
2017-07-01
The migration of aerofauna is a seasonal phenomenon of global scale, engaging billions of individuals in long-distance movements every year. Multiband lidar systems are commonly employed for the monitoring of aerosols and atmospheric gases, and a number of systems are operated regularly across Europe in the framework of the European Aerosol Lidar Network (EARLINET). This work examines the feasibility of utilizing EARLINET for the monitoring and classification of migratory fauna based on their pigmentation. An EARLINET Raman lidar system in Athens transmits laser pulses in three bands. By installing a four-channel digital oscilloscope on the system, the backscattered light from single-laser shots is measured. Roughly 100 h of data were gathered in the summer of 2013. The data were examined for aerofauna observations, and a total of 1735 observations interpreted as airborne organisms intercepting the laser beam were found during the study period in July to August 2013. The properties of the observations were analyzed spectrally and intercompared. A spectral multimodality that could be related to different observed species is shown. The system used in this pilot study is located in Athens, Greece. It is concluded that monitoring aerial migration using it and other similar systems is feasible with minor modifications, and that in-flight species classification could be possible.
Hofmann, Kerstin M; Masood, Umar; Pawelke, Joerg; Wilkens, Jan J
2015-09-01
Laser-driven proton acceleration is suggested as a cost- and space-efficient alternative for future radiation therapy centers, although the properties of these beams are fairly different compared to conventionally accelerated proton beams. The laser-driven proton beam is extremely pulsed, containing a very high proton number within ultrashort bunches at low bunch repetition rates of a few Hz, and the energy spectrum of the protons per bunch is very broad. Moreover, these laser-accelerated bunches are subject to shot-to-shot fluctuations. Therefore, the aim of this study was to investigate the feasibility of a compact gantry design for laser-driven proton therapy and to determine limitations to comply with. Based on a published gantry beam line design which can filter parabolic spectra from an exponentially decaying broad initial spectrum, a treatment planning study was performed on real patient data sets. All potential parabolic spectra were fed into a treatment planning system and numerous spot scanning proton plans were calculated. To investigate limitations in the fluence per bunch, the proton number of the initial spectrum and the beam width at patient entrance were varied. A scenario where only integer shots are delivered as well as an intensity modulation from shot to shot was studied. The resulting plans were evaluated depending on their dosimetric quality and in terms of required treatment time. In addition, the influence of random shot-to-shot fluctuations on the plan quality was analyzed. The study showed that clinically relevant dose distributions can be produced with the system under investigation even with integer shots. For small target volumes receiving high doses per fraction, the initial proton number per bunch must remain between 1.4 × 10^8 and 8.3 × 10^9 to achieve acceptable delivery times as well as plan qualities.
For larger target volumes and standard doses per fraction, the initial proton number is even more restricted to stay between 1.4 × 10^9 and 2.9 × 10^9. The lowest delivery time that could be reached for such a case was 16 min for a 10 Hz system. When modulating the intensity from shot to shot, the delivery time can be reduced to 6 min for this scenario. Since the shot-to-shot fluctuations are of random nature, a compensation effect can be observed, especially for higher laser shot numbers. Therefore, a fluctuation of ±30% within the proton number does not translate into a dosimetric deviation of the same size. However, for plans with short delivery times these fluctuations cannot cancel out sufficiently, even for ±10% fluctuations. Under the analyzed terms, it is feasible to achieve clinically relevant dose distributions with laser-driven proton beams. However, to keep the delivery times of the proton plans comparable to conventional proton plans for typical target volumes, a device is required which can modulate the bunch intensity from shot to shot. From the laser acceleration point of view, the proton number per bunch must be kept under control as well as the reproducibility of the bunches.
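The compensation effect of random shot-to-shot fluctuations can be illustrated with a quick Monte Carlo sketch (our own toy model, not the paper's planning system): with each bunch's proton number drawn uniformly within ±30% of nominal, the relative error of the summed delivery shrinks roughly as 1/√N with the number of shots N.

```python
import random

random.seed(1)

def relative_dose_error(n_shots, fluct=0.30, trials=2000):
    """Mean relative deviation of the delivered total from nominal when
    each shot's proton number fluctuates uniformly within +/- fluct."""
    errors = []
    for _ in range(trials):
        total = sum(1.0 + random.uniform(-fluct, fluct) for _ in range(n_shots))
        errors.append(abs(total / n_shots - 1.0))
    return sum(errors) / trials

for n in (1, 10, 100):
    print(n, round(relative_dose_error(n), 4))  # error shrinks roughly as 1/sqrt(n)
```

For a single shot the mean relative error sits near 15% (the mean of |U(−0.3, 0.3)|), while 100 shots average it down to just over 1%, matching the paper's observation that short deliveries cannot benefit from this cancellation.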
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurricane, O. A.; Clark, D. S.
The work is summarized from several perspectives. 1D simulation perspective: post-shot models agree with yield data to within a factor of ~2 at low implosion velocities, but the models diverge from the data as the velocity and convergence ratio increase. 2D simulation perspective: integrated hohlraum-capsule post-shot models agree with primary data for most implosions, but overpredict yield and DSR for a few of the highest-velocity implosions. 3D simulation perspective: high-resolution post-shot capsule-only modeling captures much of the delivered performance of the one shot simulated so far.
Doubled full shot noise in quantum coherent superconductor-semiconductor junctions.
Lefloch, F; Hoffmann, C; Sanquer, M; Quirion, D
2003-02-14
We performed low-temperature shot noise measurements in weakly transparent junctions between a superconductor (TiN) and a strongly disordered normal metal (heavily doped Si). We show that the conductance has a maximum at low energy due to coherent multiple Andreev reflections, and that the shot noise there is twice the Poisson noise (S = 4eI). When the subgap conductance reaches its minimum at finite voltage, the shot noise recovers the normal value (S = 2eI) due to a large quasiparticle contribution.
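Both noise levels quoted here are instances of the single formula S = 2·F·e·I with different Fano factors F; a minimal numeric sketch (the Fano-factor phrasing is ours, not the abstract's):

```python
E_CHARGE = 1.602176634e-19  # elementary charge, C

def shot_noise_density(current, fano=1.0):
    """Current noise spectral density S = 2 * F * e * I in A^2/Hz.
    fano=1.0 gives the Poisson value S = 2eI; fano=2.0 gives the doubled
    full shot noise S = 4eI seen when coherent multiple Andreev
    reflections transfer an effective charge of 2e per event."""
    return 2.0 * fano * E_CHARGE * current

bias = 1e-9  # 1 nA bias current
print(shot_noise_density(bias), shot_noise_density(bias, fano=2.0))
```

The doubled noise is thus a direct signature of the effective transferred charge, which is what the crossover between the two regimes in the experiment probes.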
Shot-to-shot reproducibility of a self-magnetically insulated ion diode
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pushkarev, A. I.; Isakova, Yu. I.; Khailov, I. P.
In this paper we present an analysis of the shot-to-shot reproducibility of the ion beam formed by a self-magnetically insulated ion diode with an explosive-emission graphite cathode. The experiments were carried out on the TEMP-4M accelerator operating in double-pulse mode: a first pulse of negative polarity (300-500 ns, 100-150 kV) followed by a second pulse of positive polarity (150 ns, 250-300 kV). The ion current density was 10-70 A/cm², depending on the diode geometry. The beam was composed of carbon ions (80%-85%) and protons. It was found that the shot-to-shot variation in ion current density was about 35%-40%, whilst the diode voltage and current were comparatively stable, with variation limited to no more than 10%. It was shown that focusing the ion beam can improve the stability of ion current generation, reducing the variation to 18%-20%. To find the reason for the shot-to-shot variation in ion current density, we examined the statistical correlation between the current density of the accelerated beam and other measured characteristics of the diode, such as the accelerating voltage, total current, and first-pulse duration. The correlation between ion current densities measured simultaneously at different positions within the cross-section of the beam was also investigated. It was shown that the shot-to-shot variation in ion current density is mainly attributable to variation in the density of electrons diffusing from the drift region into the A-K gap.
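The two quantities this kind of analysis rests on are the coefficient of variation (shot-to-shot spread) and the Pearson correlation between channels; a small sketch with made-up shot data, not the paper's measurements:

```python
import statistics as st

def cv_percent(samples):
    """Shot-to-shot variation as a coefficient of variation, in percent."""
    return 100.0 * st.pstdev(samples) / st.mean(samples)

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = st.mean(x), st.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical per-shot readings: ion current density (A/cm^2), diode voltage (kV)
j = [42.0, 60.0, 35.0, 55.0, 48.0, 30.0, 52.0, 45.0]
v = [265.0, 270.0, 262.0, 268.0, 266.0, 260.0, 267.0, 264.0]
print(f"CV(j) = {cv_percent(j):.1f}%, r(j, v) = {pearson(j, v):.2f}")
```

A large CV in the current density alongside a small CV in the voltage, plus a weak j-v correlation, is exactly the pattern that points to an external cause such as the electron diffusion the paper identifies.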
Use and Perceptions of Caffeinated Energy Drinks and Energy Shots in Canada.
Wiggers, Danielle; Reid, Jessica L; White, Christine M; Hammond, David
2017-12-01
In Canada, energy drinks and energy shots are currently classified and regulated differently (food and drugs versus natural health products, respectively), on the assumption that they are used and perceived differently. The current study examined potential differences in use and perceptions of energy drinks and shots. An online survey was conducted in 2015 using a national commercial online panel of youth and young adults aged 12-24 years (n=2,040 retained for analysis in 2016). Participants were randomized to view an image of an energy shot or drink, and were asked about 14 potential reasons for using the product. Past consumption of each product was also assessed. Chi-square and t-tests were conducted to examine differences in use and perceptions between products. Overall, 15.6% of respondents reported using both energy shots and drinks. Of all respondents, <1% had tried only energy shots, whereas 58.0% had tried only energy drinks. For each product, the most commonly reported reasons for use were "to stay awake" and "to increase concentration or alertness." Out of 14 potential reasons for use, respondents were significantly more likely to endorse seven of the reasons for energy drinks rather than shots; however, the magnitude of these differences was modest and the ordering of the reasons for use of each product was comparable. Despite differences in prevalence of ever-use of energy shots and drinks, consumption patterns and perceived reasons for using the products are similar. The findings provide little support for regulating energy shots differently than energy drinks. Copyright © 2017 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Proof Rules for Automated Compositional Verification through Learning
NASA Technical Reports Server (NTRS)
Barringer, Howard; Giannakopoulou, Dimitra; Pasareanu, Corina S.
2003-01-01
Compositional proof systems not only enable the stepwise development of concurrent processes but also provide a basis to alleviate the state explosion problem associated with model checking. An assume-guarantee style of specification and reasoning has long been advocated to achieve compositionality. However, this style of reasoning is often non-trivial, typically requiring human input to determine appropriate assumptions. In this paper, we present novel assume-guarantee rules in the setting of finite labelled transition systems with blocking communication. We show how these rules can be applied in an iterative and fully automated fashion within a framework based on learning.
Experimental Study on Fatigue Behaviour of Shot-Peened Open-Hole Steel Plates
Wang, Zhi-Yu; Wang, Qing-Yuan; Cao, Mengqin
2017-01-01
This paper presents an experimental study on the fatigue behaviour of shot-peened open-hole plates made of Q345 steel. The beneficial effects of shot peening on fatigue life improvement are highlighted. The characteristic fatigue crack initiation and propagation modes of open-hole details under fatigue loading are revealed. The surface hardening effect brought by the shot peening is analyzed in terms of in-depth micro-hardness and compressive residual stress. The fatigue life results are evaluated against codified detail categories, and related design suggestions are made. In particular, a fracture-mechanics-based method is proposed and shown to be valid for predicting the fatigue life of the studied shot-peened open-hole details. PMID:28841160
ECONOMICS OF SAMPLE COMPOSITING AS A SCREENING TOOL IN GROUND WATER QUALITY MONITORING
Recent advances in high-throughput/automated compositing with robotics/field-screening methods offer seldom-tapped opportunities for cost reduction in ground water quality monitoring programs. An economic framework is presented in this paper for the evaluation of sample ...
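The cost logic behind sample compositing can be sketched with the classic Dorfman group-testing formula; the contamination rate and composite sizes below are illustrative assumptions, not figures from the (truncated) abstract:

```python
def expected_tests_per_sample(k, p):
    """Expected analyses per sample under Dorfman-style compositing:
    one analysis per composite of k samples, plus k individual follow-up
    analyses whenever the composite screens positive. p is the per-sample
    contamination probability, assumed independent across samples."""
    p_composite_positive = 1.0 - (1.0 - p) ** k
    return 1.0 / k + p_composite_positive

# With 2% contaminated wells, the optimal composite size is 8, needing
# roughly 0.27 analyses per sample instead of 1.
best_k = min(range(2, 21), key=lambda k: expected_tests_per_sample(k, 0.02))
print(best_k, round(expected_tests_per_sample(best_k, 0.02), 3))
```

The savings evaporate as the contamination rate grows (compositing stops paying off near p ≈ 0.3), which is why such screening is attractive mainly for low-incidence monitoring programs.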
DOT National Transportation Integrated Search
2017-04-01
This report presents a high-level test and evaluation framework for cooperative driving automation systems that have the potential to significantly improve mobility and enhance traffic flow stability with better safety. It focuses on the test and eva...
Integration and Evaluation of Automated Pavement Distress Data in INDOT’s Pavement Management System
DOT National Transportation Integrated Search
2017-05-01
This study was in two parts. The first part established and demonstrated a framework for pavement data integration. This is critical for fulfilling QC/QA needs of INDOT's pavement management system, because the precision of the physical location re...
Shot trajectory parameters in gold medal stationary shot-putters during world-class competition.
Frossard, Laurent; Smeathers, James; O'Riordan, Alison; Goodman, Scott
2007-10-01
The parameters of the shot's trajectory were reported for male and female gold medalists (classes F52, F53, F54, and F55) who competed at the 2000 Paralympic Games and the 2002 International Paralympic Committee (IPC) World Championships. The specific objective was to determine the magnitude of differences in these parameters across classes and genders. The release velocity of the shot increased with the performance and the classification for both males (8.30 m/s - 9.96 m/s) and females (4.58 m/s - 8.50 m/s). The measured angle of the shot's trajectory at release also increased with the performance and the classification for both males (27.54 degrees - 32.47 degrees) and females (9.02 degrees - 34.52 degrees). The position of the shot from a fixed reference point at release revealed a similar trend for both males (2.01 m - 2.68 m) and females (1.16 m - 1.98 m), although it was weaker.
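Given release velocity, angle, and height, the resulting distance follows from standard projectile motion (neglecting air resistance). In the sketch below, the 1.7 m release height is an assumed value, since the study reports release position relative to a fixed reference rather than height above ground:

```python
import math

def shot_distance(speed, angle_deg, release_height, g=9.81):
    """Horizontal distance travelled by a shot released at `speed` (m/s),
    `angle_deg` above horizontal, from `release_height` (m) above ground,
    with air resistance neglected."""
    a = math.radians(angle_deg)
    vx, vy = speed * math.cos(a), speed * math.sin(a)
    # flight time solves h + vy*t - g*t^2/2 = 0 (positive root)
    t = (vy + math.sqrt(vy ** 2 + 2.0 * g * release_height)) / g
    return vx * t

# Male gold-medal release values from the fastest class reported above:
print(round(shot_distance(9.96, 32.47, 1.7), 2))  # ≈ 11.3 m
```

This also shows why release velocity dominates performance across the classes: distance grows roughly with the square of speed, while the angle term varies slowly near its optimum.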
Atlas-based liver segmentation and hepatic fat-fraction assessment for clinical trials.
Yan, Zhennan; Zhang, Shaoting; Tan, Chaowei; Qin, Hongxing; Belaroussi, Boubakeur; Yu, Hui Jing; Miller, Colin; Metaxas, Dimitris N
2015-04-01
Automated assessment of hepatic fat-fraction is clinically important. A robust and precise segmentation would enable accurate, objective and consistent measurement of hepatic fat-fraction for disease quantification, therapy monitoring and drug development. However, segmenting the liver in clinical trials is a challenging task due to the variability of liver anatomy as well as the diverse sources the images were acquired from. In this paper, we propose an automated and robust framework for liver segmentation and assessment. It uses single statistical atlas registration to initialize a robust deformable model to obtain fine segmentation. Fat-fraction map is computed by using chemical shift based method in the delineated region of liver. This proposed method is validated on 14 abdominal magnetic resonance (MR) volumetric scans. The qualitative and quantitative comparisons show that our proposed method can achieve better segmentation accuracy with less variance comparing with two other atlas-based methods. Experimental results demonstrate the promises of our assessment framework. Copyright © 2014 Elsevier Ltd. All rights reserved.
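A common chemical-shift-based fat-fraction estimate is the two-point Dixon calculation from in-phase and opposed-phase images; the sketch below shows that standard variant as an assumption, since the paper does not state which chemical-shift method it uses:

```python
import numpy as np

def fat_fraction_dixon(in_phase, opposed_phase):
    """Two-point Dixon estimate: water = (IP + OP)/2, fat = (IP - OP)/2,
    fat fraction = fat / (water + fat). Returns 0 where there is no signal."""
    ip = np.asarray(in_phase, dtype=float)
    op = np.asarray(opposed_phase, dtype=float)
    fat = (ip - op) / 2.0
    water = (ip + op) / 2.0
    denom = water + fat  # equals the in-phase signal
    return np.divide(fat, denom, out=np.zeros_like(fat), where=denom > 0)

ip = np.array([[100.0, 80.0], [60.0, 0.0]])
op = np.array([[60.0, 80.0], [20.0, 0.0]])
print(fat_fraction_dixon(ip, op))  # per-voxel fat fractions in [0, 1]
```

In a pipeline like the paper's, this map would be computed only inside the delineated liver region, so segmentation accuracy directly bounds the accuracy of the reported fat fraction.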
Smart Buildings and Demand Response
NASA Astrophysics Data System (ADS)
Kiliccote, Sila; Piette, Mary Ann; Ghatikar, Girish
2011-11-01
Advances in communications and control technology, the strengthening of the Internet, and the growing appreciation of the urgency to reduce demand side energy use are motivating the development of improvements in both energy efficiency and demand response (DR) systems in buildings. This paper provides a framework linking continuous energy management and continuous communications for automated demand response (Auto-DR) in various times scales. We provide a set of concepts for monitoring and controls linked to standards and procedures such as Open Automation Demand Response Communication Standards (OpenADR). Basic building energy science and control issues in this approach begin with key building components, systems, end-uses and whole building energy performance metrics. The paper presents a framework about when energy is used, levels of services by energy using systems, granularity of control, and speed of telemetry. DR, when defined as a discrete event, requires a different set of building service levels than daily operations. We provide examples of lessons from DR case studies and links to energy efficiency.
Xu, Yupeng; Yan, Ke; Kim, Jinman; Wang, Xiuying; Li, Changyang; Su, Li; Yu, Suqin; Xu, Xun; Feng, Dagan David
2017-01-01
Worldwide, polypoidal choroidal vasculopathy (PCV) is a common vision-threatening exudative maculopathy, and pigment epithelium detachment (PED) is an important clinical characteristic. Thus, precise and efficient PED segmentation is necessary for PCV clinical diagnosis and treatment. We propose a dual-stage learning framework via deep neural networks (DNN) for automated PED segmentation in PCV patients to avoid issues associated with manual PED segmentation (subjectivity, manual segmentation errors, and high time consumption). The optical coherence tomography scans of fifty patients were quantitatively evaluated with different algorithms and clinicians. Dual-stage DNN outperformed existing PED segmentation methods for all segmentation accuracy parameters, including true positive volume fraction (85.74 ± 8.69%), dice similarity coefficient (85.69 ± 8.08%), positive predictive value (86.02 ± 8.99%) and false positive volume fraction (0.38 ± 0.18%). Dual-stage DNN achieves accurate PED quantitative information, works with multiple types of PEDs and agrees well with manual delineation, suggesting that it is a potential automated assistant for PCV management. PMID:28966847
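The dice similarity coefficient reported above is a standard overlap measure between a predicted mask and a reference delineation; a minimal sketch:

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Dice similarity coefficient DSC = 2|A ∩ B| / (|A| + |B|), in [0, 1]."""
    seg = np.asarray(seg, dtype=bool)
    ref = np.asarray(ref, dtype=bool)
    denom = seg.sum() + ref.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(seg, ref).sum() / denom

a = np.zeros((8, 8), dtype=bool); a[2:6, 2:6] = True  # 16 pixels
b = np.zeros((8, 8), dtype=bool); b[3:7, 3:7] = True  # 16 pixels, 9 shared
print(dice_coefficient(a, b))  # → 0.5625
```

The same formula extends unchanged to 3D voxel masks such as the OCT volumes evaluated in the study.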
Automation life-cycle cost model
NASA Technical Reports Server (NTRS)
Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne
1992-01-01
The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; Life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and Uncertainty is a reality for increasingly complex problems and few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: Assess desirable capabilities (structure into near- and far-term); Identify useful existing models/data; Identify parameters for utility analysis; Define tool framework; Encode scenario thread for model validation; and Provide transition path for tool development. This report contains all relevant, technical progress made on this contractual effort.
NASA Astrophysics Data System (ADS)
Ibragimov, Bulat; Toesca, Diego; Chang, Daniel; Koong, Albert; Xing, Lei
2017-12-01
Automated segmentation of the portal vein (PV) for liver radiotherapy planning is a challenging task due to potentially low vasculature contrast, complex PV anatomy and image artifacts originating from fiducial markers and vasculature stents. In this paper, we propose a novel framework for automated segmentation of the PV from computed tomography (CT) images. We apply convolutional neural networks (CNNs) to learn the consistent appearance patterns of the PV using a training set of CT images with reference annotations, and then enhance the PV in previously unseen CT images. Markov random fields (MRFs) were further used to smooth the CNN enhancement results and remove isolated mis-segmented regions. Finally, the CNN-MRF-based enhancement was augmented with PV centerline detection relying on PV anatomical properties such as tubularity and branch composition. The framework was validated on a clinical database of 72 CT images of patients scheduled for liver stereotactic body radiation therapy. The obtained segmentation accuracy was DSC = 0.83 and ...
A Prototype Publishing Registry for the Virtual Observatory
NASA Astrophysics Data System (ADS)
Williamson, R.; Plante, R.
2004-07-01
In the Virtual Observatory (VO), a registry helps users locate resources, such as data and services, in a distributed environment. A general framework for VO registries is now under development within the International Virtual Observatory Alliance (IVOA) Registry Working Group. We present a prototype of one component of this framework: the publishing registry. The publishing registry allows data providers to expose metadata descriptions of their resources to the VO environment. Searchable registries can harvest the metadata from many publishing registries and make them searchable by users. We have developed a prototype publishing registry that data providers can install at their sites to publish their resources. The descriptions are exposed using the Open Archive Initiative (OAI) Protocol for Metadata Harvesting. Automating the input of metadata into registries is critical when a provider wishes to describe many resources. We illustrate various strategies for such automation, both currently in use and planned for the future. We also describe how future versions of the registry can adapt automatically to evolving metadata schemas for describing resources.
Automated software system for checking the structure and format of ACM SIG documents
NASA Astrophysics Data System (ADS)
Mirza, Arsalan Rahman; Sah, Melike
2017-04-01
Microsoft (MS) Office Word is one of the most commonly used software tools for creating documents. MS Word 2007 and above uses XML to represent the structure of MS Word documents. Metadata about the documents are automatically created using Office Open XML (OOXML) syntax. We develop a new framework, called ADFCS (Automated Document Format Checking System), that takes advantage of the OOXML metadata in order to extract semantic information from MS Office Word documents. In particular, we develop a new ontology for Association for Computing Machinery (ACM) Special Interest Group (SIG) documents, representing the structure and format of these documents in OWL (Web Ontology Language). The metadata is then extracted automatically in RDF (Resource Description Framework) according to this ontology using the developed software. Finally, we generate extensive rules in order to infer whether the documents are formatted according to ACM SIG standards. This paper introduces the ACM SIG ontology, the metadata extraction process, the inference engine, the ADFCS online user interface, the system evaluation and user study evaluations.
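The OOXML structure that such a checker relies on can be read with nothing but the standard library, since a .docx file is a ZIP archive whose body lives in word/document.xml. The sketch below only lists paragraph styles; matching them against an ACM SIG ontology, as ADFCS does, is omitted, and the function name is our own:

```python
import zipfile
import xml.etree.ElementTree as ET

W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def paragraph_styles(docx_file):
    """Return (style, text) for each paragraph in a .docx file or file object.

    Style names are read from w:pPr/w:pStyle; paragraphs without an
    explicit style are reported as "Normal"."""
    with zipfile.ZipFile(docx_file) as archive:
        root = ET.fromstring(archive.read("word/document.xml"))
    result = []
    for para in root.iter(f"{W}p"):
        style_el = para.find(f"{W}pPr/{W}pStyle")
        style = style_el.get(f"{W}val") if style_el is not None else "Normal"
        text = "".join(t.text or "" for t in para.iter(f"{W}t"))
        result.append((style, text))
    return result
```

Running this over a submission yields pairs such as ("Heading1", "Introduction"), which a rule engine can then check against the expected section structure.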
Data fusion for automated non-destructive inspection
Brierley, N.; Tippetts, T.; Cawley, P.
2014-01-01
In industrial non-destructive evaluation (NDE), it is increasingly common for data acquisition to be automated, driving a recent substantial increase in the availability of data. The collected data need to be analysed, typically necessitating the painstaking manual labour of a skilled operator. Moreover, in automated NDE a region of an inspected component is typically interrogated several times, be it within a single data channel due to multiple probe passes, across several channels acquired simultaneously or over the course of repeated inspections. The systematic combination of these diverse readings is recognized to offer an opportunity to improve the reliability of the inspection, but is not achievable in a manual analysis. This paper describes a data-fusion-based software framework providing a partial automation capability, allowing component regions to be declared defect-free to a very high probability while readily identifying defect indications, thereby optimizing the use of the operator's time. The system is designed to be applicable to a wide range of automated NDE scenarios, but the processing is exemplified using the industrial ultrasonic immersion inspection of aerospace turbine discs. Results obtained for industrial datasets demonstrate an orders-of-magnitude reduction in false-call rates, for a given probability of detection, achievable using the developed software system. PMID:25002828
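The reliability gain from combining repeated interrogations can be illustrated with a simple k-of-n voting fusion; this is a toy model of the idea, not the paper's (spatially registered, probabilistic) fusion scheme:

```python
from math import comb

def fused_rates(pod, pfa, n, k):
    """Probability that at least k of n independent passes alarm, evaluated
    for a real defect (per-pass probability of detection `pod`) and for
    clean material (per-pass false-call rate `pfa`)."""
    def at_least_k(p):
        return sum(comb(n, i) * p ** i * (1.0 - p) ** (n - i)
                   for i in range(k, n + 1))
    return at_least_k(pod), at_least_k(pfa)

# Three passes at 90% POD and a 5% false-call rate, fused by 2-of-3 vote:
pod3, pfa3 = fused_rates(0.90, 0.05, 3, 2)
print(round(pod3, 4), round(pfa3, 5))  # detection improves, false calls drop
```

Even this crude vote pushes detection above 97% while cutting false calls nearly sevenfold, hinting at how a principled fusion of registered readings achieves the orders-of-magnitude false-call reduction reported.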
Automating expert role to determine design concept in Kansei Engineering
NASA Astrophysics Data System (ADS)
Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd
2016-02-01
Affect has become imperative in product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables discovery of consumers' emotions and the formulation of guidelines for designing products that win consumers in the competitive market. Albeit a powerful technology, there is no rule of thumb in its analysis and interpretation process: KE expertise is required to determine sets of related Kansei and the significant concept of emotion. Many research endeavors are handicapped by the limited number of available and accessible KE experts. This work simulates the role of experts with the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility in KE. The algorithm is designed to learn the process from training datasets taken from previous KE research works. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype for automating the process. The results show that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE research advocates will benefit from this system, which automatically determines significant design concepts.
A Human-Autonomy Teaming Approach for a Flight-Following Task
NASA Technical Reports Server (NTRS)
Brandt, Summer L.; Russell, Ricky; Lachter, Joel; Shively, Robert
2017-01-01
Managing aircraft is becoming more complex with increasingly sophisticated automation responsible for more flight tasks. With this increased complexity, it is becoming more difficult for operators to understand what the automation is doing and why. Human involvement with increasingly autonomous systems must adjust to allow for a more dynamic relationship involving cooperation and teamwork. As part of an ongoing project to develop a framework for human-autonomy teaming (HAT) in aviation, a part-task study was conducted to demonstrate, evaluate and refine proposed critical aspects of HAT. These features were built into an automated recommender system on a ground station available from previous studies. Participants performed a flight-following task once with the original ground station (i.e., No HAT condition) and once with the HAT features enabled (i.e., HAT condition). Behavioral and subjective measures were collected; subjective measures are presented here. Overall, participants preferred the ground station with HAT features enabled compared to the station without the HAT features. Participants reported that the HAT displays and automation were preferred for keeping up with operationally important issues. Additionally, participants reported that the HAT displays and automation provided enough situation awareness to complete the task and reduced workload relative to the No HAT baseline.
Cañal-Bruland, Rouwen; Balch, Lars; Niesert, Loet
2015-07-01
Skilled basketball players are supposed to hit more often from the free throw distance than would be predicted by their shooting performances at adjacent distances. This is dubbed an especial skill. In the current study, we examined whether especial skills in free throw performance in basketball map onto especial skills in visually judging the success of basketball free throws. In addition, we tested whether this effect would be present in those who predict their own shots but absent in those who judge shots performed by another person. Eight skilled basketball players were coupled with eight equally skilled players, and performed 150 set shots from five different distances (including the free throw distance) while the yoked partner observed the shots. At the moment of ball release, the performers' and the observers' vision were synchronously occluded using liquid-crystal occlusion goggles, and both independently judged whether the shot was successful or not. Results did not replicate an especial skill effect in shooting performance. Based on signal detection theory (SDT) measures (d' and criterion c), results also revealed no especial skill for visually discriminating successful from unsuccessful shots at the foul line when compared to other distances. However, players showed an especial skill judgement bias towards judging balls 'in' at the foul line, but not at other distances. Importantly, this bias was only present in those who judged the success of their own shots, but not in those who judged the shots performed by someone else.
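The signal detection theory measures named above, sensitivity d' and criterion c, are computed from hit and false-alarm rates via the probit transform. The sketch below uses invented response counts and a loglinear-style correction against extreme rates; it illustrates the measures, not the study's actual data.

```python
# Sketch: computing SDT sensitivity (d') and criterion (c) from counts of
# hits, misses, false alarms, and correct rejections. The counts are made up.
from statistics import NormalDist

def dprime_and_c(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf  # probit transform
    # Adding 0.5/1.0 keeps rates strictly inside (0, 1), avoiding infinite z.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # c < 0: liberal ("in") bias
    return d_prime, criterion

d, c = dprime_and_c(hits=40, misses=10, false_alarms=15, correct_rejections=35)
print(f"d' = {d:.2f}, c = {c:.2f}")
```

A negative criterion of the kind the study reports at the foul line corresponds to a bias toward responding "in" regardless of actual shot outcome.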
Lead exposure in Canada geese of the Eastern Prairie Population
DeStefano, S.; Brand, C.J.; Rusch, D.H.; Finley, Daniel L.; Gillespie, M.M.
1991-01-01
We monitored lead exposure in Eastern Prairie Population Canada geese during summer-winter, 1986-1987 and 1987-1988, at 5 areas. Blood lead concentrations in geese trapped during summer at Cape Churchill, Manitoba, were below levels indicative of recent lead exposure (0.18 ppm). The proportion of geese exposed to lead (≥0.18 ppm blood lead) increased to 7.6% at Oak Hammock Wildlife Management Area (WMA), southern Manitoba, where lead shot was still in use, and to 10.0% at Roseau River WMA, northern Minnesota, where fall-staging geese were close to a source of lead shot in Manitoba. The proportion of birds exposed to lead dropped to <2% at Lac Qui Parle WMA, Minnesota, a steel shot zone since 1980. On the wintering grounds at Swan Lake National Wildlife Refuge in Missouri, 4.9% of all geese showed exposure to lead before the hunting season. Lead exposure rose to 10.0% after hunting ended and then decreased to 5.2% in late winter. Incidence of lead shot in gizzards and concentrations of lead in livers supported the blood assay data. Soil samples indicated that lead shot continues to be available to geese at Swan Lake, even though the area was established as a non-toxic shot zone in 1978. Steel shot zones have reduced lead exposure in the Eastern Prairie Population, but lead shot persists in the environment and continues to account for lead exposure and mortality in Eastern Prairie Population Canada geese.
Subjective State, Blood Pressure, and Behavioral Control Changes Produced by an "Energy Shot"
Marczinski, Cecile A; Stamates, Amy L; Ossege, Julianne; Maloney, Sarah F; Bardgett, Mark E; Brown, Clifford J
2014-06-01
Background: Energy drinks and energy shots are popular consumer beverages that are advertised to increase feelings of alertness. Typically, these products include high levels of caffeine, a mild psychostimulant drug. The scientific evidence demonstrating the specific benefits of energy products to users in terms of subjective state and objective performance is surprisingly lacking. Moreover, there are rising health concerns associated with the use of these products. Therefore, the purpose of this study was to investigate the acute effects of a popular energy shot (5-Hour Energy®) on subjective and objective measures that were assessed hourly for 6 hours following consumption. Methods: Participants (n = 14) completed a three-session study where they received the energy shot, a placebo control, and no drink. Following dose administration, participants completed subjective Profile of Mood States ratings hourly for 6 hours. Participants also repeatedly completed a behavioral control task (the cued go/no-go task) and provided blood pressure and pulse rate readings at each hour. Results: Consumption of the energy shot did improve subjective state, as measured by increased ratings of vigor and decreased ratings of fatigue. However, the energy shot did not alter objective performance, which worsened over time. Importantly, the energy shot elevated both systolic and diastolic blood pressure. Conclusions: Consumption of one energy shot may only result in modest benefits to subjective state. Individuals with preexisting hypertension or other medical conditions should be cautious about using these new consumer products.
Advanced systems engineering and network planning support
NASA Technical Reports Server (NTRS)
Walters, David H.; Barrett, Larry K.; Boyd, Ronald; Bazaj, Suresh; Mitchell, Lionel; Brosi, Fred
1990-01-01
The objective of this task was to take a fresh look at the NASA Space Network Control (SNC) element for the Advanced Tracking and Data Relay Satellite System (ATDRSS) such that it can be made more efficient and responsive to the user by introducing new concepts and technologies appropriate for the 1997 timeframe. In particular, it was desired to investigate the technologies and concepts employed in similar systems that may be applicable to the SNC. The recommendations resulting from this study include resource partitioning, on-line access to subsets of the SN schedule, fluid scheduling, increased use of demand access on the MA service, automating Inter-System Control functions using monitor-by-exception, increased automation for distributed data management and distributed work management, viewing SN operational control in terms of the OSI Management framework, and the introduction of automated interface management.
Reaction control system/remote manipulator system automation
NASA Technical Reports Server (NTRS)
Hiers, Harry K.
1990-01-01
The objectives of this project are to evaluate the capability of the Procedural Reasoning System (PRS) in a typical real-time space shuttle application and to assess its potential for use in the Space Station Freedom. PRS, developed by SRI International, is a result of research in automating the monitoring and control of spacecraft systems. The particular application selected for the present work is the automation of malfunction-handling procedures for the Shuttle Remote Manipulator System (SRMS). The SRMS malfunction procedures will be encoded within the PRS framework, a crew interface appropriate to the RMS application will be developed, and the real-time data interface software developed. The resulting PRS will then be integrated with the high-fidelity On-orbit Simulation of the NASA Johnson Space Center's System Engineering Simulator, and tests under various SRMS fault scenarios will be conducted.
Mumps and the Vaccine (Shot) to Prevent It
... 6 months to 11 months old should have 1 dose of MMR shot before traveling abroad. Fact Sheet for Parents Color [ ... my child? To learn more about the MMR shot, talk to your child’s doctor, call 1-800-CDC-INFO, or visit www.cdc.gov/ ...
Calibration Shots Recorded for the Salton Seismic Imaging Project, Salton Trough, California
NASA Astrophysics Data System (ADS)
Murphy, J. M.; Rymer, M. J.; Fuis, G. S.; Stock, J. M.; Goldman, M.; Sickler, R. R.; Miller, S. A.; Criley, C. J.; Ricketts, J. W.; Hole, J. A.
2009-12-01
The Salton Seismic Imaging Project (SSIP) is a collaborative venture between the U.S. Geological Survey, California Institute of Technology, and Virginia Polytechnic Institute and State University to acquire seismic reflection/wide-angle refraction data, and currently is scheduled for data acquisition in 2010. The purpose of the project is to obtain a detailed subsurface 3-D image of the structure of the Salton Trough (including both the Coachella and Imperial Valleys) that can be used for earthquake hazards analysis, geothermal studies, and studies of the transition from an ocean-ocean to a continent-continent plate boundary. In June 2009, a series of calibration shots were detonated in the southern Imperial Valley with specific goals in mind. First, these shots were used to measure peak particle velocity and acceleration at various distances from the shots. Second, the shots were used to calibrate the propagation of energy through sediments of the Imperial Valley. Third, the shots were used to test the effects of seismic energy on buried clay drainage pipes, which are abundant throughout the irrigated parts of the Salton Trough. Fourth, we tested the ODEX drilling technique, which uses a down-hole casing hammer for a tight casing fit. Information obtained from the calibration shots will be used for final planning of the main project. The shots were located in an unused field adjacent to Hwy 7, about 6 km north of the U.S./Mexican border (about 18 km southeast of El Centro). Three closely spaced shot points (16 meters apart) were aligned N-S and drilled to 21-m, 23.5-m, and 27-m depth. The holes were filled with 23 kg, 68 kg, and 123 kg of ammonium-nitrate explosive, respectively.
Four instrument types were used to record the seismic energy - six RefTek RT130 6-channel recorders with a 3-component accelerometer and a 3-component 2-Hz velocity sensor, seven RefTek RT130 3-channel recorders with a 3-component 4.5-Hz velocity sensor, 35 Texans with a vertical component 4.5-Hz velocity sensor, and a 60-channel cabled array with 40-Hz sensors. Irrigation districts in both the Coachella Valley and Imperial Valley use clay drainage pipes buried beneath fields to remove irrigation water and prevent ponding. To determine the effect of seismic energy on the drain pipes, we exposed sections of pipe several meters long with a backhoe at distances of 7-15 meters from the shot holes, and, after each shot, visually inspected the pipes. Our shots produced no pipe damage.
Flu shots and the characteristics of unvaccinated elderly Medicare beneficiaries.
Lochner, Kimberly A; Wynne, Marc
2011-12-21
Data from the Medicare Current Beneficiary Survey, 2009. • Overall, 73% of Medicare beneficiaries aged 65 years and older reported receiving a flu shot for the 2008 flu season, but vaccination rates varied by socio-demographic characteristics. Flu vaccination was lowest for beneficiaries who were aged 65-74 years, non-Hispanic Black or Hispanic, not married, had less than a high school education, or were eligible for Medicaid (i.e., dual eligibles). • Healthcare utilization and personal health behavior were also related to vaccination rates, with current smokers and those with no hospitalizations or physician visits being less likely to be vaccinated. • Among those beneficiaries who reported receiving a flu shot, 59% received it in a physician's office or clinic, with the next most common setting being in the community (21%); e.g., grocery store, shopping mall, library, or church. • Among those beneficiaries who did not receive a flu shot, the most common reasons were beliefs that the shot could cause side effects or disease (20%), that the shot could not prevent the flu (17%), or that the shot wasn't needed (16%). Less than 1% reported that they didn't get the flu shot because of cost. Elderly persons (aged 65 years and older) are at increased risk of complications from influenza, with the majority of influenza-related hospitalizations and deaths occurring among the elderly (Fiore et al., 2010). Most physicians recommend their elderly patients get a flu shot each year, and many hospitals inquire about elderly patients' immunization status upon admission, providing a vaccination if requested. The importance of getting a flu shot is underscored by the Department of Health and Human Services' Healthy People initiative, which has set a vaccination goal of 90% for the Nation's elderly by the year 2020 (Department of Health and Human Services [DHHS], 2011).
Although all costs related to flu shots are covered by Medicare, requiring no co-pay on the part of the beneficiary (Centers for Medicare and Medicaid Services, 2011), for the 2008 flu season, only 73% of non-institutionalized Medicare beneficiaries, aged 65 years and older, reported receiving one. This report presents the most recent data on flu vaccination rates among non-institutionalized elderly Medicare beneficiaries and their association with socio-demographic and personal health characteristics. The report also describes the places beneficiaries received their flu shot and, for those not getting vaccinated, the reasons reported for not doing so. Public Domain.
Principles of Automation for Patient Safety in Intensive Care: Learning From Aviation.
Dominiczak, Jason; Khansa, Lara
2018-06-01
The transition away from written documentation and analog methods has opened up the possibility of leveraging data science and analytic techniques to improve health care. In the implementation of data science techniques and methodologies, high-acuity patients in the ICU can particularly benefit. The Principles of Automation for Patient Safety in Intensive Care (PAPSIC) framework draws on Billings's principles of human-centered aviation (HCA) automation and helps in identifying the advantages, pitfalls, and unintended consequences of automation in health care. Billings's HCA principles are based on the premise that human operators must remain "in command," so that they are continuously informed and actively involved in all aspects of system operations. In addition, automated systems need to be predictable; simple to train on, learn, and operate; and able to monitor the human operators; and every intelligent system element must know the intent of other intelligent system elements. In applying Billings's HCA principles to the ICU setting, PAPSIC has three key characteristics: (1) integration and better interoperability, (2) multidimensional analysis, and (3) enhanced situation awareness. PAPSIC suggests that health care professionals reduce overreliance on automation and implement "cooperative automation" and that vendors reduce mode errors and embrace interoperability. Much can be learned from the aviation industry in automating the ICU. Because it combines "smart" technology with the necessary controls to withstand unintended consequences, PAPSIC could help ensure more informed decision making in the ICU and better patient care. Copyright © 2018 The Joint Commission. Published by Elsevier Inc. All rights reserved.
DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin
The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs
Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin; ...
2018-02-08
SunShot Initiative Portfolio Book 2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solar Energy Technologies Office
2014-05-01
The 2014 SunShot Initiative Portfolio Book outlines the progress towards the goals outlined in the SunShot Vision Study. Contents include overviews of each of SunShot’s five subprogram areas, as well as a description of every active project in the SunShot’s project portfolio as of May 2014.
32 CFR 632.5 - Use of firearms.
Code of Federal Regulations, 2010 CFR
2010-07-01
... not fire if shots are likely to harm innocent bystanders. (3) Since warning shots could harm innocent bystanders, avoid firing them. However, when lesser degrees of force have failed, the law enforcement or.... If able to avoid hazards to innocent persons in these cases, fire warning shots. (4) Aim to disable...
Tdap (tetanus, diphtheria and pertussis) vaccine - what you need to know
... in 3 adults) Redness or swelling where the shot was given (about 1 person in 5) Mild fever of at least ... Pain where the shot was given (up to 1 in 5 or 6) Redness or swelling where the shot was given (up to about 1 in 16 ...
Code of Federal Regulations, 2011 CFR
2011-07-01
... designed or redesigned, made or remade, modified or remodified to automatically fire more than one shot by..., incendiary, blank, shotgun, black powder, and shot). Items shall only be considered as ammunition when loaded... smooth bore either a number of ball shot or a single projectile for each single pull of the trigger. (j...
Code of Federal Regulations, 2012 CFR
2012-07-01
... designed or redesigned, made or remade, modified or remodified to automatically fire more than one shot by..., incendiary, blank, shotgun, black powder, and shot). Items shall only be considered as ammunition when loaded... smooth bore either a number of ball shot or a single projectile for each single pull of the trigger. (j...
16 CFR 1500.17 - Banned hazardous substances.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Voluntary standard. (1) One alternative to the tip-angle requirement that the Commission considered is to... Multiple Shot requires that large multiple-tube devices not tip over (except as the result of the last shot) when shot on a 2-inch thick medium-density foam pad. The Commission cannot conclude that AFSL's...
Code of Federal Regulations, 2013 CFR
2013-07-01
... designed or redesigned, made or remade, modified or remodified to automatically fire more than one shot by..., incendiary, blank, shotgun, black powder, and shot). Items shall only be considered as ammunition when loaded... smooth bore either a number of ball shot or a single projectile for each single pull of the trigger. (j...
Code of Federal Regulations, 2014 CFR
2014-07-01
... designed or redesigned, made or remade, modified or remodified to automatically fire more than one shot by..., incendiary, blank, shotgun, black powder, and shot). Items shall only be considered as ammunition when loaded... smooth bore either a number of ball shot or a single projectile for each single pull of the trigger. (j...
Vaccines for Your Children: Protect Your Child at Every Age
... shots. 1 to 2 Months Learn about the 1-2 months shots and the importance of staying on schedule. 4 ... 18 Years Learn about the 13-18 year shots, vaccines needed for travel, and vaccines before ... Pregnancy Birth 1 to 2 months 4 months 6 months 7 ...
16 CFR 1500.17 - Banned hazardous substances.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Voluntary standard. (1) One alternative to the tip-angle requirement that the Commission considered is to... Multiple Shot requires that large multiple-tube devices not tip over (except as the result of the last shot) when shot on a 2-inch thick medium-density foam pad. The Commission cannot conclude that AFSL's...
... Your doctor will probably recommend that you get one. Flu vaccines are available as a shot. The shot contains killed flu viruses and will ... CDC). This nasal mist did not help prevent cases of flu between 2013 and ... don't have reactions to a flu shot, although a few may notice a fever, sore ...
1987-04-24
eliminated. Averaging the mass spectra from only 500 laser shots (50 seconds with this system) resulted in a detection limit of ~15 ppb. The...resolution. Fluctuations in laser pulse energy from shot to shot appear as noise in the interleaved data, but averaging of several such traces gives a good...ranging from 0 to 120 ix Wm-2. The quantity of material volatilized was proportional to the number of laser shots. A simple time-of-flight mass spectrometer was
NASA Astrophysics Data System (ADS)
Schroeder, C. B.; Fawley, W. M.; Esarey, E.
2003-07-01
We investigate the statistical properties (e.g., shot-to-shot power fluctuations) of the radiation from a high-gain free-electron laser (FEL) operating in the nonlinear regime. We consider the case of an FEL amplifier reaching saturation whose shot-to-shot fluctuations in input radiation power follow a gamma distribution. We analyze the corresponding output power fluctuations at and beyond saturation, including beam energy spread effects, and find that there are well-characterized values of undulator length for which the fluctuations reach a minimum.
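For a gamma-distributed shot-to-shot power with M statistically independent modes, the relative fluctuation is sigma/mean = 1/sqrt(M), a standard property of the gamma distribution. A quick Monte Carlo check (the values of M and the sample size below are arbitrary illustrative choices, not from the paper):

```python
# Sketch: verifying sigma/mean = 1/sqrt(M) for gamma-distributed
# shot-to-shot radiation power with M independent modes (M = 9 assumed).
import random, math

M = 9             # assumed number of independent modes
mean_power = 1.0  # normalized mean input power

rng = random.Random(42)
# gammavariate(alpha, beta): mean = alpha*beta, var = alpha*beta**2
samples = [rng.gammavariate(M, mean_power / M) for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
rel_fluct = math.sqrt(var) / mean

print(f"measured sigma/mean = {rel_fluct:.3f}, theory = {1 / math.sqrt(M):.3f}")
```

The measured ratio converges to 1/3 for M = 9, matching the analytic value.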
Effects of Laser and Shot Peening on Fatigue Crack Growth in Friction Stir Welds
NASA Technical Reports Server (NTRS)
Hatamleh, Omar; Forman, Royce; Lyons, Jed
2006-01-01
The effects of laser and shot peening on the fatigue life of Friction Stir Welds (FSW) have been investigated. The surface roughness resulting from the various peening techniques was assessed, and the fracture surface microstructure was characterized. Laser peening resulted in an increase in fatigue life of approximately 60%, while shot peening resulted in a 10% increase, when compared to the unpeened material. The surface roughness after shot peening was significantly higher compared to the base material, while specimens processed with laser peening were relatively smooth.
Equation of State for RX-08-EL and RX-08-EP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, E.L.; Walton, J.
1985-05-07
JWL Equations of State (EOSs) have been estimated for RX-08-EL and RX-08-EP. The estimated JWL EOS parameters are listed. Previously, we derived a JWL EOS for RX-08-EN based on DYNA2D hydrodynamic code cylinder computations, and comparisons with experimental cylinder test results are shown. The experimental cylinder shot results for RX-08-EL, shot K-473, were compared to the experimental cylinder shot results for RX-08-EN, shot K-463, as a reference. 10 figs., 6 tabs.
Comparative toxicity of lead shot in black ducks (Anas rubripes) and mallards (Anas platyrhynchos)
Rattner, B.A.; Fleming, W.J.; Bunck, C.M.
1989-01-01
In winter, pen-reared and wild black ducks (Anas rubripes), and game farm and wild mallards (Anas platyrhynchos), maintained on pelleted feed, were sham-dosed or given one number 4 lead shot. After 14 days, dosed birds were redosed with two or four additional lead shot. This dosing regimen also was repeated in summer using pen-reared black ducks and game farm mallards. Based upon mortality, overt intoxication, weight change, delta-aminolevulinic acid dehydratase activity and protoporphyrin concentration, black ducks and mallards were found to be equally tolerant to lead shot. However, captive wild ducks were more sensitive than their domesticated counterparts, as evidenced by greater mortality and weight loss following lead shot administration. This difference may be related to stress associated with captivity and unnatural diet.
Attention and the evolution of Hollywood film.
Cutting, James E; DeLong, Jordan E; Nothelfer, Christine E
2010-03-01
Reaction times exhibit a spectral patterning known as 1/f, and these patterns can be thought of as reflecting time-varying changes in attention. We investigated the shot structure of Hollywood films to determine if these same patterns are found. We parsed 150 films with release dates from 1935 to 2005 into their sequences of shots and then analyzed the pattern of shot lengths in each film. Autoregressive and power analyses showed that, across that span of 70 years, shots became increasingly more correlated in length with their neighbors and created power spectra approaching 1/f. We suggest, as have others, that 1/f patterns reflect world structure and mental process. Moreover, a 1/f temporal shot structure may help harness observers' attention to the narrative of a film.
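The power analysis described above amounts to fitting a slope to log power versus log frequency: a 1/f process gives a spectral exponent alpha near 1, white noise gives alpha near 0. A minimal sketch on synthetic data (the naive DFT and the uncorrelated shot-length series are illustrative assumptions, not the authors' pipeline; since the series is white noise, the fitted exponent should come out near 0):

```python
# Sketch: estimating the spectral exponent alpha, where P(f) ~ 1/f**alpha,
# for a series of shot lengths. Uses a naive O(n^2) periodogram for clarity.
import cmath, math, random

def power_spectrum(x):
    """Periodogram at frequencies k/n for k = 1 .. n/2 - 1 (mean removed)."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    spec = []
    for k in range(1, n // 2):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        spec.append(abs(s) ** 2 / n)
    return spec

def spectral_exponent(x):
    """Least-squares slope of log power vs log frequency; alpha = -slope."""
    spec = power_spectrum(x)
    pts = [(math.log(k + 1), math.log(p)) for k, p in enumerate(spec) if p > 0]
    n = len(pts)
    mx = sum(u for u, _ in pts) / n
    my = sum(v for _, v in pts) / n
    slope = sum((u - mx) * (v - my) for u, v in pts) / \
        sum((u - mx) ** 2 for u, _ in pts)
    return -slope

rng = random.Random(0)
shot_lengths = [rng.gauss(5.0, 1.0) for _ in range(512)]  # uncorrelated "shots"
alpha = spectral_exponent(shot_lengths)
print(f"alpha ~ {alpha:.2f}")
```

Running the same estimator on real shot-length series would show alpha drifting toward 1 for later films, which is the paper's central finding.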
High-dynamic-range cross-correlator for shot-to-shot measurement of temporal contrast
NASA Astrophysics Data System (ADS)
Kon, Akira; Nishiuchi, Mamiko; Kiriyama, Hiromitsu; Ogura, Koichi; Mori, Michiaki; Sakaki, Hironao; Kando, Masaki; Kondo, Kiminori
2017-01-01
The temporal contrast of an ultrahigh-intensity laser is a crucial parameter for laser plasma experiments. We have developed a multichannel cross-correlator (MCCC) for single-shot measurements of the temporal contrast in a high-power laser system. The MCCC is based on third-order cross-correlation, and has four channels and independent optical delay lines. We have experimentally demonstrated that the MCCC system achieves a high dynamic range of ~10^12 and a large temporal window of ~1 ns. Moreover, we were able to measure the shot-to-shot fluctuations of a short-prepulse intensity at -26 ps and long-pulse (amplified spontaneous emission, ASE) intensities at -30, -450, and -950 ps before the arrival of the main pulse at the interaction point.
NASA Astrophysics Data System (ADS)
Sa, Qila; Wang, Zhihui
2018-03-01
At present, content-based video retrieval (CBVR) is the most mainstream video retrieval method, using the video's own features to perform automatic identification and retrieval. This method involves a key technology, i.e. shot segmentation. In this paper, a method of automatic video shot boundary detection with K-means clustering and improved adaptive dual-threshold comparison is proposed. First, the visual features of every frame are extracted and divided into two categories using the K-means clustering algorithm, namely, one with significant change and one with no significant change. Then, on the classification results, the improved adaptive dual-threshold comparison method is used to determine the abrupt as well as gradual shot boundaries. Finally, an automatic video shot boundary detection system is achieved.
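The clustering stage of the method above can be sketched as follows, assuming scalar inter-frame feature differences in place of the paper's full visual features; the tiny 1-D 2-means below stands in for a general K-means implementation, and the synthetic frame values are invented for illustration.

```python
# Sketch: cluster inter-frame differences into "significant change" vs
# "no significant change" with 1-D K-means (K=2) to flag candidate shot
# boundaries, as in the first stage of the shot-segmentation method.
import random

def two_means_1d(values, iters=50):
    """Plain 1-D K-means, K=2; returns (labels, centers), label 1 = larger center."""
    c = [min(values), max(values)]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [0 if abs(v - c[0]) <= abs(v - c[1]) else 1 for v in values]
        for k in (0, 1):
            members = [v for v, l in zip(values, labels) if l == k]
            if members:
                c[k] = sum(members) / len(members)
    if c[0] > c[1]:  # make cluster 1 the "significant change" cluster
        c.reverse()
        labels = [1 - l for l in labels]
    return labels, c

# Synthetic per-frame scalar feature: two "shots" with a jump at frame 50.
frames = [10.0 + random.Random(i).gauss(0, 0.3) for i in range(50)] + \
         [40.0 + random.Random(i).gauss(0, 0.3) for i in range(50)]
diffs = [abs(b - a) for a, b in zip(frames, frames[1:])]

labels, _ = two_means_1d(diffs)
boundaries = [i + 1 for i, l in enumerate(labels) if l == 1]
print("candidate shot boundaries at frames:", boundaries)
```

In the full method, the frames flagged here would then pass through the adaptive dual-threshold comparison to separate abrupt cuts from gradual transitions.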
High current nonlinear transmission line based electron beam driver
NASA Astrophysics Data System (ADS)
Hoff, B. W.; French, D. M.; Simon, D. S.; Lepell, P. D.; Montoya, T.; Heidger, S. L.
2017-10-01
A gigawatt-class nonlinear transmission line based electron beam driver is experimentally demonstrated. Four experimental series, each with a different Marx bank charge voltage (15, 20, 25, and 30 kV), were completed. Within each experimental series, shots at peak frequencies ranging from 950 MHz to 1.45 GHz were performed. Peak amplitude modulations of the NLTL output voltage signal were found to range between 18% and 35% for the lowest frequency shots and between 5% and 20% for the highest frequency shots (higher modulation at higher Marx charge voltage). Peak amplitude modulations of the electron beam current were found to range between 10% and 20% for the lowest frequency shots and between 2% and 7% for the highest frequency shots (higher modulation at higher Marx charge voltage).
Hammond, David; Reid, Jessica L
2016-06-27
In 2012, Health Canada transitioned caffeinated energy drinks from Natural Health Product to Food and Drug classification and regulations, implementing temporary guidelines with requirements such as caffeine content limits, mandatory cautionary labelling, and restrictions on health claims. "Energy shots" often contain as much caffeine as energy drinks, or more, and have been associated with a similar number of adverse health events. However, the current requirements for energy drinks do not apply to energy shots, which remain classified as "natural health products" on the basis that they are "not consumed or perceived as foods" in the same way as energy drinks. An online survey was conducted with Canadian youth and young adults aged 12-24 years (N = 2040) in October 2014 to examine perceptions of energy shots. Respondents viewed an image of a popular energy shot and were asked which term best described it, choosing from six randomly ordered options. The vast majority (78.8%) perceived the energy shot as an "energy drink" (vs. "supplement", "vitamin drink", "natural health product", "soft drink" or "food product"). Given consumer perceptions and the similarity in product constituents, there is little basis for regulating energy shots differently from energy drinks; these products should be subject to similar labelling and health warning requirements.
Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing
NASA Technical Reports Server (NTRS)
Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.
2010-01-01
The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure the software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the strong dependence of the automation and the flight software on the configuration data, data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations.
A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.
Motion compensation and noise tolerance in phase-shifting digital in-line holography.
Stenner, Michael D; Neifeld, Mark A
2006-05-15
We present a technique for phase-shifting digital in-line holography that compensates for lateral object motion. By collecting two frames of interference between the object and reference fields with identical reference phase, one can estimate the lateral motion that occurred between frames using their cross-correlation. We also describe a very general linear framework for phase-shifting holographic reconstruction that minimizes additive white Gaussian noise (AWGN) for an arbitrary set of reference field amplitudes and phases. We analyze the technique's sensitivity to noise (AWGN, quantization, and shot), errors in the reference fields, errors in motion estimation, resolution, and depth of field. We also present experimental motion-compensated images achieving the expected resolution.
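The cross-correlation shift estimate mentioned in this abstract can be sketched as follows. This is a generic FFT-based, integer-pixel version under our own assumptions (mean-removed frames, circular shifts); the authors' pipeline operates on holographic interference frames and is not reproduced here.

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the lateral (integer-pixel) shift of frame_b relative to
    frame_a from the peak of their cross-correlation, computed in the
    Fourier domain."""
    A = np.fft.fft2(frame_a - frame_a.mean())
    B = np.fft.fft2(frame_b - frame_b.mean())
    # Cross-correlation of a with b; peak location gives the shift
    xcorr = np.fft.ifft2(np.conj(A) * B).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Wrap indices so shifts are signed (e.g. -1 instead of N-1)
    shifts = [p if p <= n // 2 else p - n for p, n in zip(peak, xcorr.shape)]
    return tuple(shifts)
```

Once the shift is known, the second frame can be re-registered (e.g. with `np.roll` for circular shifts) before the phase-shifting reconstruction.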
New methodology for shaft design based on life expectancy
NASA Technical Reports Server (NTRS)
Loewenthal, S. H.
1986-01-01
The design of power transmission shafting for reliability has not historically received a great deal of attention. However, weight-sensitive aerospace and vehicle applications, and those where the penalties of shaft failure are great, require greater confidence in shaft design than earlier methods provided. This report summarizes a fatigue-strength-based design method for sizing shafts under variable-amplitude loading histories for limited or nonlimited service life. Moreover, application factors such as press-fitted collars, shaft size, residual stresses from shot peening or plating, and corrosive environments can be readily accommodated within the framework of the analysis. Examples are given which illustrate the use of the method, pointing out the large life penalties due to occasional cyclic overloads.
Pervasive randomness in physics: an introduction to its modelling and spectral characterisation
NASA Astrophysics Data System (ADS)
Howard, Roy
2017-10-01
An introduction to the modelling and spectral characterisation of random phenomena is detailed at a level consistent with a first exposure to the subject at an undergraduate level. A signal framework for defining a random process is provided and this underpins an introduction to common random processes including the Poisson point process, the random walk, the random telegraph signal, shot noise, information signalling random processes, jittered pulse trains, birth-death random processes and Markov chains. An introduction to the spectral characterisation of signals and random processes, via either an energy spectral density or a power spectral density, is detailed. The important case of defining a white noise random process concludes the paper.
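One of the processes listed in this tutorial abstract, shot noise, is easy to simulate: identical pulses are superposed at the event times of a Poisson point process. The sketch below is ours (function name, parameters, and the rectangular pulse are illustrative); by Campbell's theorem the mean of the resulting signal is the event rate times the pulse area.

```python
import numpy as np

def shot_noise(rate, pulse, T, fs, rng):
    """Simulate a shot-noise waveform on [0, T): a sum of copies of `pulse`
    (sampled at fs) placed at the arrival times of a Poisson point process
    with the given event rate."""
    n_events = rng.poisson(rate * T)             # Poisson-distributed count
    times = rng.uniform(0.0, T, n_events)        # arrivals uniform given count
    x = np.zeros(int(T * fs))
    for t in times:
        i = int(t * fs)
        j = min(i + len(pulse), len(x))
        x[i:j] += pulse[: j - i]                 # superpose pulse at arrival
    return x
```

For a rectangular pulse of height 1 and duration 0.1 s at rate 50 events/s, Campbell's theorem predicts a mean level of 50 x 0.1 = 5.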
Detecting of foreign object debris on airfield pavement using convolution neural network
NASA Astrophysics Data System (ADS)
Cao, Xiaoguang; Gu, Yufeng; Bai, Xiangzhi
2017-11-01
It is of great practical significance to detect foreign object debris (FOD) on airfield pavement promptly and accurately, because FOD is a fatal threat to runway safety in airports. In this paper, a new FOD detection framework based on the Single Shot MultiBox Detector (SSD) is proposed. Two strategies are proposed to better solve the FOD detection problem: making the detection network lighter and using dilated convolution. The advantages mainly include: (i) the network structure becomes lighter, which speeds up the detection task and enhances detection accuracy; (ii) dilated convolution is applied in the network structure to handle smaller FOD. Thus, we obtain a faster and more accurate detection system.
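The dilated-convolution idea used here can be illustrated in isolation: inserting holes between kernel taps enlarges the receptive field without adding parameters, which helps with small objects such as FOD. The NumPy sketch below is illustrative only; a real SSD variant would use a deep learning framework.

```python
import numpy as np

def dilated_conv2d(image, kernel, rate=2):
    """Minimal valid-mode 2-D convolution with a dilation rate. A k x k
    kernel with dilation `rate` spans (k-1)*rate + 1 pixels per axis while
    keeping only k*k weights."""
    kh, kw = kernel.shape
    # Effective kernel span after inserting (rate - 1) holes between taps
    eh, ew = (kh - 1) * rate + 1, (kw - 1) * rate + 1
    H, W = image.shape
    out = np.zeros((H - eh + 1, W - ew + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Strided slicing picks the dilated tap positions
            patch = image[i:i + eh:rate, j:j + ew:rate]
            out[i, j] = np.sum(patch * kernel)
    return out
```

With `rate=1` this reduces to an ordinary valid convolution; with `rate=2` a 3x3 kernel covers a 5x5 neighborhood.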
An Information Extraction Framework for Cohort Identification Using Electronic Health Records
Liu, Hongfang; Bielinski, Suzette J.; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B.; Jonnalagadda, Siddhartha R.; Ravikumar, K.E.; Wu, Stephen T.; Kullo, Iftikhar J.; Chute, Christopher G
Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at point-of-care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high performance IE system can be very challenging to construct due to the complexity and dynamic nature of human language. In this paper, we report an IE framework for cohort identification using EHRs that is a knowledge-driven framework developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework. PMID:24303255
Intelligent Frameworks for Instructional Design.
ERIC Educational Resources Information Center
Spector, J. Michael; And Others
Many researchers are attempting to develop automated instructional development systems to guide subject matter experts through the lengthy and difficult process of courseware development. Because the targeted users often lack instructional design expertise, a great deal of emphasis has been placed on the use of artificial intelligence (AI) to…
Adaptive Modeling Language and Its Derivatives
NASA Technical Reports Server (NTRS)
Chemaly, Adel
2006-01-01
Adaptive Modeling Language (AML) is the underlying language of an object-oriented, multidisciplinary, knowledge-based engineering framework. AML offers an advanced modeling paradigm with an open architecture, enabling the automation of the entire product development cycle, integrating product configuration, design, analysis, visualization, production planning, inspection, and cost estimation.
Building a framework to manage trust in automation
NASA Astrophysics Data System (ADS)
Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.
2017-05-01
All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has been long believed that human trust is a primary determinant of human-automation interactions and further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that was deployed in two experiments of human interactions with driving automation that were executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.