Advanced interactive display formats for terminal area traffic control
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.
1996-01-01
This report describes the basic design considerations for perspective air traffic control displays. A software framework has been developed for manual viewing parameter setting (MVPS) in preparation for continued, ongoing developments on automated viewing parameter setting (AVPS) schemes. Two distinct modes of MVPS operations are considered, both of which utilize manipulation pointers imbedded in the three-dimensional scene: (1) direct manipulation of the viewing parameters -- in this mode the manipulation pointers act like the control-input device, through which the viewing parameter changes are made. Part of the parameters are rate controlled, and part of them position controlled. This mode is intended for making fast, iterative small changes in the parameters. (2) Indirect manipulation of the viewing parameters -- this mode is intended primarily for introducing large, predetermined changes in the parameters. Requests for changes in viewing parameter setting are entered manually by the operator by moving viewing parameter manipulation pointers on the screen. The motion of these pointers, which are an integral part of the 3-D scene, is limited to the boundaries of the screen. This arrangement has been chosen in order to preserve the correspondence between the spatial lay-outs of the new and the old viewing parameter setting, a feature which contributes to preventing spatial disorientation of the operator. For all viewing operations, e.g. rotation, translation and ranging, the actual change is executed automatically by the system, through gradual transitions with an exponentially damped, sinusoidal velocity profile, in this work referred to as 'slewing' motions. The slewing functions, which eliminate discontinuities in the viewing parameter changes, are designed primarily for enhancing the operator's impression that he, or she, is dealing with an actually existing physical system, rather than an abstract computer-generated scene. The proposed, continued research efforts will deal with the development of automated viewing parameter setting schemes. These schemes employ an optimization strategy, aimed at identifying the best possible vantage point, from which the air traffic control scene can be viewed for a given traffic situation. They determine whether a change in viewing parameter setting is required and determine the dynamic path along which the change to the new viewing parameter setting should take place.
Advanced interactive display formats for terminal area traffic control
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.
1995-01-01
The basic design considerations for perspective Air Traffic Control displays are described. A software framework has been developed for manual viewing parameter setting (MVPS) in preparation for continued, ongoing developments on automated viewing parameter setting (AVPS) schemes. The MVPS system is based on indirect manipulation of the viewing parameters. Requests for changes in viewing parameter setting are entered manually by the operator by moving viewing parameter manipulation pointers on the screen. The motion of these pointers, which are an integral part of the 3-D scene, is limited to the boundaries of screen. This arrangement has been chosen, in order to preserve the correspondence between the new and the old viewing parameter setting, a feature which contributes to preventing spatial disorientation of the operator. For all viewing operations, e.g. rotation, translation and ranging, the actual change is executed automatically by the system, through gradual transitions with an exponentially damped, sinusoidal velocity profile, in this work referred to as 'slewing' motions. The slewing functions, which eliminate discontinuities in the viewing parameter changes, are designed primarily for enhancing the operator's impression that he, or she, is dealing with an actually existing physical system, rather than an abstract computer generated scene. Current, ongoing efforts deal with the development of automated viewing parameter setting schemes. These schemes employ an optimization strategy, aimed at identifying the best possible vantage point, from which the Air Traffic Control scene can be viewed, for a given traffic situation.
Rapid production of optimal-quality reduced-resolution representations of very large databases
Sigeti, David E.; Duchaineau, Mark; Miller, Mark C.; Wolinsky, Murray; Aldrich, Charles; Mineev-Weinstein, Mark B.
2001-01-01
View space representation data is produced in real time from a world space database representing terrain features. The world space database is first preprocessed. A database is formed having one element for each spatial region corresponding to a finest selected level of detail. A multiresolution database is then formed by merging elements and a strict error metric is computed for each element at each level of detail that is independent of parameters defining the view space. The multiresolution database and associated strict error metrics are then processed in real time for real time frame representations. View parameters for a view volume comprising a view location and field of view are selected. The error metric with the view parameters is converted to a view-dependent error metric. Elements with the coarsest resolution are chosen for an initial representation. Data set first elements from the initial representation data set are selected that are at least partially within the view volume. The first elements are placed in a split queue ordered by the value of the view-dependent error metric. If the number of first elements in the queue meets or exceeds a predetermined number of elements or whether the largest error metric is less than or equal to a selected upper error metric bound, the element at the head of the queue is force split and the resulting elements are inserted into the queue. Force splitting is continued until the determination is positive to form a first multiresolution set of elements. The first multiresolution set of elements is then outputted as reduced resolution view space data representing the terrain features.
Arithmetic Data Cube as a Data Intensive Benchmark
NASA Technical Reports Server (NTRS)
Frumkin, Michael A.; Shabano, Leonid
2003-01-01
Data movement across computational grids and across memory hierarchy of individual grid machines is known to be a limiting factor for application involving large data sets. In this paper we introduce the Data Cube Operator on an Arithmetic Data Set which we call Arithmetic Data Cube (ADC). We propose to use the ADC to benchmark grid capabilities to handle large distributed data sets. The ADC stresses all levels of grid memory by producing 2d views of an Arithmetic Data Set of d-tuples described by a small number of parameters. We control data intensity of the ADC by controlling the sizes of the views through choice of the tuple parameters.
Yoon, Ki-Hyuk; Ju, Heongkyu; Kwon, Hyunkyung; Park, Inkyu; Kim, Sung-Kyu
2016-02-22
We present optical characteristics of view image provided by a high-density multi-view autostereoscopic 3D display (HD-MVA3D) with a parallax barrier (PB). Diffraction effects that become of great importance in such a display system that uses a PB, are considered in an one-dimensional model of the 3D display, in which the numerical simulation of light from display panel pixels through PB slits to viewing zone is performed. The simulation results are then compared to the corresponding experimental measurements with discussion. We demonstrate that, as a main parameter for view image quality evaluation, the Fresnel number can be used to determine the PB slit aperture for the best performance of the display system. It is revealed that a set of the display parameters, which gives the Fresnel number of ∼ 0.7 offers maximized brightness of the view images while that corresponding to the Fresnel number of 0.4 ∼ 0.5 offers minimized image crosstalk. The compromise between the brightness and crosstalk enables optimization of the relative magnitude of the brightness to the crosstalk and lead to the choice of display parameter set for the HD-MVA3D with a PB, which satisfies the condition where the Fresnel number lies between 0.4 and 0.7.
Benchmarking Memory Performance with the Data Cube Operator
NASA Technical Reports Server (NTRS)
Frumkin, Michael A.; Shabanov, Leonid V.
2004-01-01
Data movement across a computer memory hierarchy and across computational grids is known to be a limiting factor for applications processing large data sets. We use the Data Cube Operator on an Arithmetic Data Set, called ADC, to benchmark capabilities of computers and of computational grids to handle large distributed data sets. We present a prototype implementation of a parallel algorithm for computation of the operatol: The algorithm follows a known approach for computing views from the smallest parent. The ADC stresses all levels of grid memory and storage by producing some of 2d views of an Arithmetic Data Set of d-tuples described by a small number of integers. We control data intensity of the ADC by selecting the tuple parameters, the sizes of the views, and the number of realized views. Benchmarking results of memory performance of a number of computer architectures and of a small computational grid are presented.
Parameter Estimation for Geoscience Applications Using a Measure-Theoretic Approach
NASA Astrophysics Data System (ADS)
Dawson, C.; Butler, T.; Mattis, S. A.; Graham, L.; Westerink, J. J.; Vesselinov, V. V.; Estep, D.
2016-12-01
Effective modeling of complex physical systems arising in the geosciences is dependent on knowing parameters which are often difficult or impossible to measure in situ. In this talk we focus on two such problems, estimating parameters for groundwater flow and contaminant transport, and estimating parameters within a coastal ocean model. The approach we will describe, proposed by collaborators D. Estep, T. Butler and others, is based on a novel stochastic inversion technique based on measure theory. In this approach, given a probability space on certain observable quantities of interest, one searches for the sets of highest probability in parameter space which give rise to these observables. When viewed as mappings between sets, the stochastic inversion problem is well-posed in certain settings, but there are computational challenges related to the set construction. We will focus the talk on estimating scalar parameters and fields in a contaminant transport setting, and in estimating bottom friction in a complicated near-shore coastal application.
Large storms: Airglow and related measurements. VLF observations, volume 4
NASA Technical Reports Server (NTRS)
1981-01-01
The data presented show the typical values and range of ionospheric and magnetospheric characteristics, as viewed from 1400 km with the ISIS 2 instruments. The definition of each data set depends partly on geophysical parameters and partly on satellite operating mode. Preceding the data set is a description of the organizational parameters and a review of the objectives and general characteristics of the data set. The data are shown as a selection from 12 different data formats. Each data set has a different selection of formats, but uniformity of a given format selection is preserved throughout each data set. Each data set consists of a selected number of passes, each comprising a format combination that is most appropriae for the particular data set. Description of ISIS 2 instruments are provided.
Feature Extraction for Pose Estimation. A Comparison Between Synthetic and Real IR Imagery
1991-12-01
determine the orientation of the sensor relative to the target ....... ........................ 33 4. Effects of changing sensor and target parameters...Reference object is a T-62 tank facing the viewer (sensor/target parameters set equal to zero). NOTE: Changing the target parameters produces...anomalous results. For these images, the field of view (FOV) was not changed .......................... 35 5. Image anomalies from changing the target
Automation of Sensor Control in Uninhabited Aerial Vehicles
2015-07-01
were otherwise performed manually reduces workload and mitigates high workload situations. On the other hand, it has been suggested that the...that automation may help mitigate high workload (Lee, 2008) it would have been interesting if both sets of authors additionally assessed when...frames would result in the sensor view flickering between two views. In support of this, VBS2 initialisation parameters were adjusted to prevent
NASA Technical Reports Server (NTRS)
Murphree, J. S.
1980-01-01
A representative set of data from ISIS 2 covering a range of operating modes and geophysical conditions is presented. The data show the typical values and range of ionospheric and magnetospheric characteristics, as viewed from 1400 km with the ISIS 2 instruments. The definition of each data set depends partly on geophysical parameters and partly on satellite operating mode. Preceding the data set is a description of the organizational parameters and a review of the objectives and general characteristics of the data set. The data are shown as a selection from 12 different data formats. Each data has a different selection of formats, but uniformity of a given format selection is preserved throughout each data set.
NASA Technical Reports Server (NTRS)
Diner, Daniel B. (Inventor)
1994-01-01
Real-time video presentations are provided in the field of operator-supervised automation and teleoperation, particularly in control stations having movable cameras for optimal viewing of a region of interest in robotics and teleoperations for performing different types of tasks. Movable monitors to match the corresponding camera orientations (pan, tilt, and roll) are provided in order to match the coordinate systems of all the monitors to the operator internal coordinate system. Automated control of the arrangement of cameras and monitors, and of the configuration of system parameters, is provided for optimal viewing and performance of each type of task for each operator since operators have different individual characteristics. The optimal viewing arrangement and system parameter configuration is determined and stored for each operator in performing each of many types of tasks in order to aid the automation of setting up optimal arrangements and configurations for successive tasks in real time. Factors in determining what is optimal include the operator's ability to use hand-controllers for each type of task. Robot joint locations, forces and torques are used, as well as the operator's identity, to identify the current type of task being performed in order to call up a stored optimal viewing arrangement and system parameter configuration.
Data Mining Technologies Inspired from Visual Principle
NASA Astrophysics Data System (ADS)
Xu, Zongben
In this talk we review the recent work done by our group on data mining (DM) technologies deduced from simulating visual principle. Through viewing a DM problem as a cognition problems and treading a data set as an image with each light point located at a datum position, we developed a series of high efficient algorithms for clustering, classification and regression via mimicking visual principles. In pattern recognition, human eyes seem to possess a singular aptitude to group objects and find important structure in an efficient way. Thus, a DM algorithm simulating visual system may solve some basic problems in DM research. From this point of view, we proposed a new approach for data clustering by modeling the blurring effect of lateral retinal interconnections based on scale space theory. In this approach, as the data image blurs, smaller light blobs merge into large ones until the whole image becomes one light blob at a low enough level of resolution. By identifying each blob with a cluster, the blurring process then generates a family of clustering along the hierarchy. The proposed approach provides unique solutions to many long standing problems, such as the cluster validity and the sensitivity to initialization problems, in clustering. We extended such an approach to classification and regression problems, through combatively employing the Weber's law in physiology and the cell response classification facts. The resultant classification and regression algorithms are proven to be very efficient and solve the problems of model selection and applicability to huge size of data set in DM technologies. We finally applied the similar idea to the difficult parameter setting problem in support vector machine (SVM). Viewing the parameter setting problem as a recognition problem of choosing a visual scale at which the global and local structures of a data set can be preserved, and the difference between the two structures be maximized in the feature space, we derived a direct parameter setting formula for the Gaussian SVM. The simulations and applications show that the suggested formula significantly outperforms the known model selection methods in terms of efficiency and precision.
NASA Astrophysics Data System (ADS)
Cristescu, Constantin P.; Stan, Cristina; Scarlat, Eugen I.; Minea, Teofil; Cristescu, Cristina M.
2012-04-01
We present a novel method for the parameter oriented analysis of mutual correlation between independent time series or between equivalent structures such as ordered data sets. The proposed method is based on the sliding window technique, defines a new type of correlation measure and can be applied to time series from all domains of science and technology, experimental or simulated. A specific parameter that can characterize the time series is computed for each window and a cross correlation analysis is carried out on the set of values obtained for the time series under investigation. We apply this method to the study of some currency daily exchange rates from the point of view of the Hurst exponent and the intermittency parameter. Interesting correlation relationships are revealed and a tentative crisis prediction is presented.
Multi-Resolution Climate Ensemble Parameter Analysis with Nested Parallel Coordinates Plots.
Wang, Junpeng; Liu, Xiaotong; Shen, Han-Wei; Lin, Guang
2017-01-01
Due to the uncertain nature of weather prediction, climate simulations are usually performed multiple times with different spatial resolutions. The outputs of simulations are multi-resolution spatial temporal ensembles. Each simulation run uses a unique set of values for multiple convective parameters. Distinct parameter settings from different simulation runs in different resolutions constitute a multi-resolution high-dimensional parameter space. Understanding the correlation between the different convective parameters, and establishing a connection between the parameter settings and the ensemble outputs are crucial to domain scientists. The multi-resolution high-dimensional parameter space, however, presents a unique challenge to the existing correlation visualization techniques. We present Nested Parallel Coordinates Plot (NPCP), a new type of parallel coordinates plots that enables visualization of intra-resolution and inter-resolution parameter correlations. With flexible user control, NPCP integrates superimposition, juxtaposition and explicit encodings in a single view for comparative data visualization and analysis. We develop an integrated visual analytics system to help domain scientists understand the connection between multi-resolution convective parameters and the large spatial temporal ensembles. Our system presents intricate climate ensembles with a comprehensive overview and on-demand geographic details. We demonstrate NPCP, along with the climate ensemble visualization system, based on real-world use-cases from our collaborators in computational and predictive science.
Multiple angles on the sterile neutrino - a combined view of cosmological and oscillation limits
NASA Astrophysics Data System (ADS)
Guzowski, Pawel
2017-09-01
The possible existence of sterile neutrinos is an important unresolved question for both particle physics and cosmology. Data sensitive to a sterile neutrino is coming from both particle physics experiments and from astrophysical measurements of the Cosmic Microwave Background. In this study, we address the question whether these two contrasting data sets provide complementary information about sterile neutrinos. We focus on the muon disappearance oscillation channel, taking data from the MINOS, ICECUBE and Planck experiments, converting the limits into particle physics and cosmological parameter spaces, to illustrate the different regions of parameter space where the data sets have the best sensitivity. For the first time, we combine the data sets into a single analysis to illustrate how the limits on the parameters of the sterile-neutrino model are strengthened. We investigate how data from a future accelerator neutrino experiment (SBN) will be able to further constrain this picture.
NASA Astrophysics Data System (ADS)
Traxler, Christoph; Ortner, Thomas; Hesina, Gerd; Barnes, Robert; Gupta, Sanjeev; Paar, Gerhard
2017-04-01
High resolution Digital Terrain Models (DTM) and Digital Outcrop Models (DOM) are highly useful for geological analysis and mission planning in planetary rover missions. PRo3D, developed as part of the EU-FP7 PRoViDE project, is a 3D viewer in which orbital DTMs and DOMs derived from rover stereo imagery can be rendered in a virtual environment for exploration and analysis. It allows fluent navigation over planetary surface models and provides a variety of measurement and annotation tools to complete an extensive geological interpretation. A key aspect of the image collection during planetary rover missions is determining the optimal viewing positions of rover instruments from different positions ('wide baseline stereo'). For the collection of high quality panoramas and stereo imagery the visibility of regions of interest from those positions, and the amount of common features shared by each stereo-pair, or image bundle is crucial. The creation of a highly accurate and reliable 3D surface, in the form of an Ordered Point Cloud (OPC), of the planetary surface, with a low rate of error and a minimum of artefacts, is greatly enhanced by using images that share a high amount of features and a sufficient overlap for wide baseline stereo or target selection. To support users in the selection of adequate viewpoints an interactive View Planner was integrated into PRo3D. The users choose from a set of different rovers and their respective instruments. PRo3D supports for instance the PanCam instrument of ESA's ExoMars 2020 rover mission or the Mastcam-Z camera of NASA's Mars2020 mission. The View Planner uses a DTM obtained from orbiter imagery, which can also be complemented with rover-derived DOMs as the mission progresses. The selected rover is placed onto a position on the terrain - interactively or using the current rover pose as known from the mission. The rover's base polygon and its local coordinate axes, and the chosen instrument's up- and forward vectors are visualised. The parameters of the instrument's pan and tilt unit (PTU) can be altered via the user interface, or alternatively calculated by selecting a target point on the visualised DTM. In the 3D view, the visible region of the planetary surface, resulting from these settings and the camera field-of-view is visualised by a highlighted region with a red border, representing the instruments footprint. The camera view is simulated and rendered in a separate window and PTU parameters can be interactively adjusted, allowing viewpoints, directions, and the expected image to be visualised in real-time in order to allow users the fine-tuning of these settings. In this way, ideal viewpoints and PTU settings for various rover models and instruments can efficiently be defined, resulting in an optimum imagery of the regions of interest.
Unified Research on Network-Based Hard/Soft Information Fusion
2016-02-02
types). There are a number of search tree run parameters which must be set depending on the experimental setting. A pilot study was run to identify...Unlimited Final Report: Unified Research on Network-Based Hard/Soft Information Fusion The views, opinions and/or findings contained in this report...Final Report: Unified Research on Network-Based Hard/Soft Information Fusion Report Title The University at Buffalo (UB) Center for Multisource
A large-scale, long-term study of scale drift: The micro view and the macro view
NASA Astrophysics Data System (ADS)
He, W.; Li, S.; Kingsbury, G. G.
2016-11-01
The development of measurement scales for use across years and grades in educational settings provides unique challenges, as instructional approaches, instructional materials, and content standards all change periodically. This study examined the measurement stability of a set of Rasch measurement scales that have been in place for almost 40 years. In order to investigate the stability of these scales, item responses were collected from a large set of students who took operational adaptive tests using items calibrated to the measurement scales. For the four scales that were examined, item samples ranged from 2183 to 7923 items. Each item was administered to at least 500 students in each grade level, resulting in approximately 3000 responses per item. Stability was examined at the micro level analysing change in item parameter estimates that have occurred since the items were first calibrated. It was also examined at the macro level, involving groups of items and overall test scores for students. Results indicated that individual items had changes in their parameter estimates, which require further analysis and possible recalibration. At the same time, the results at the total score level indicate substantial stability in the measurement scales over the span of their use.
NASA Technical Reports Server (NTRS)
Ardanuy, Phillip E.; Hucek, Richard R.; Groveman, Brian S.; Kyle, H. Lee
1987-01-01
A deconvolution technique is employed that permits recovery of daily averaged earth radiation budget (ERB) parameters at the top of the atmosphere from a set of the Nimbus 7 ERB wide field of view (WFOV) measurements. Improvements in both the spatial resolution of the resultant fields and in the fidelity of the time averages is obtained. The algorithm is evaluated on a set of months during the period 1980-1983. The albedo, outgoing long-wave radiation, and net radiation parameters are analyzed. The amplitude and phase of the quasi-stationary patterns that appear in the spatially deconvolved fields describe the radiation budget components for 'normal' as well as the El Nino/Southern Oscillation (ENSO) episode years. They delineate the seasonal development of large-scale features inherent in the earth's radiation budget as well as the natural variability of interannual differences. These features are underscored by the powerful emergence of the 1982-1983 ENSO event in the fields displayed. The conclusion is that with this type of resolution enhancement, WFOV radiometers provide a useful tool for the observation of the contemporary climate and its variability.
Loss of stability of a railway wheel-set, subcritical or supercritical
NASA Astrophysics Data System (ADS)
Zhang, Tingting; Dai, Huanyun
2017-11-01
Most researches on railway vehicle stability analysis are focused on the codimension 1 (for short, codim 1) bifurcations like subcritical and supercritical Hopf bifurcation. The analysis of codim 1 bifurcation can be completed based on one bifurcation parameter. However, two bifurcation parameters should be considered to give a general view of the motion of the system when it undergoes a degenerate Hopf bifurcation. This kind of bifurcation named the generalised Hopf bifurcation belongs to the codimension 2 (for short, codim 2) bifurcations where two bifurcation parameters need to be taken into consideration. In this paper, we give a numerical analysis of the codim 2 bifurcations of a nonlinear railway wheel-set with the QR algorithm to calculate the eigenvalues of the linearised system incorporating the Golden Cut method and the shooting method to calculate the limit cycles around the Hopf bifurcation points. Here, we found the existence of a generalised Hopf bifurcation where a subcritical Hopf bifurcation turns into a supercritical one with the increase of the bifurcation parameters, which belong to the codim 2 bifurcations, in a nonlinear railway wheel-set model. Only the nonlinear wheel/rail interactive relationship has been taken into consideration in the lateral model that is formulated in this paper. The motion of the wheel-set has been investigated when the bifurcation parameters are perturbed in the neighbourhood of their critical parameters, and the influences of different parameters on critical values of the bifurcation parameters are also given. From the results, it can be seen that the bifurcation types of the wheel-set will change with a variation of the bifurcation parameters in the neighbourhood of their critical values.
Is Morphosyntactic Change Really Rare?
ERIC Educational Resources Information Center
Thomason, Sarah G.
2011-01-01
Jurgen Meisel argues that "grammatical variation...can be described...in terms of parametric variation", and--crucially for his arguments in this paper--that "parameter settings do not change across the lifespan". To this extent he adopts the standard generative view, but he then departs from what he calls "the literature on historical…
Visual exploration of parameter influence on phylogenetic trees.
Hess, Martin; Bremm, Sebastian; Weissgraeber, Stephanie; Hamacher, Kay; Goesele, Michael; Wiemeyer, Josef; von Landesberger, Tatiana
2014-01-01
Evolutionary relationships between organisms are frequently derived as phylogenetic trees inferred from multiple sequence alignments (MSAs). The MSA parameter space is exponentially large, so tens of thousands of potential trees can emerge for each dataset. A proposed visual-analytics approach can reveal the parameters' impact on the trees. Given input trees created with different parameter settings, it hierarchically clusters the trees according to their structural similarity. The most important clusters of similar trees are shown together with their parameters. This view offers interactive parameter exploration and automatic identification of relevant parameters. Biologists applied this approach to real data of 16S ribosomal RNA and protein sequences of ion channels. It revealed which parameters affected the tree structures. This led to a more reliable selection of the best trees.
NASA Astrophysics Data System (ADS)
Chen, D. M.; Clapp, R. G.; Biondi, B.
2006-12-01
Ricksep is a freely-available interactive viewer for multi-dimensional data sets. The viewer is very useful for simultaneous display of multiple data sets from different viewing angles, animation of movement along a path through the data space, and selection of local regions for data processing and information extraction. Several new viewing features are added to enhance the program's functionality in the following three aspects. First, two new data synthesis algorithms are created to adaptively combine information from a data set with mostly high-frequency content, such as seismic data, and another data set with mainly low-frequency content, such as velocity data. Using the algorithms, these two data sets can be synthesized into a single data set which resembles the high-frequency data set on a local scale and at the same time resembles the low- frequency data set on a larger scale. As a result, the originally separated high and low-frequency details can now be more accurately and conveniently studied together. Second, a projection algorithm is developed to display paths through the data space. Paths are geophysically important because they represent wells into the ground. Two difficulties often associated with tracking paths are that they normally cannot be seen clearly inside multi-dimensional spaces and depth information is lost along the direction of projection when ordinary projection techniques are used. The new algorithm projects samples along the path in three orthogonal directions and effectively restores important depth information by using variable projection parameters which are functions of the distance away from the path. Multiple paths in the data space can be generated using different character symbols as positional markers, and users can easily create, modify, and view paths in real time. Third, a viewing history list is implemented which enables Ricksep's users to create, edit and save a recipe for the sequence of viewing states. Then, the recipe can be loaded into an active Ricksep session, after which the user can navigate to any state in the sequence and modify the sequence from that state. Typical uses of this feature are undoing and redoing viewing commands and animating a sequence of viewing states. The theoretical discussion are carried out and several examples using real seismic data are provided to show how these new Ricksep features provide more convenient, accurate ways to manipulate multi-dimensional data sets.
Psychophysical Comparison Of A Video Display System To Film By Using Bone Fracture Images
NASA Astrophysics Data System (ADS)
Seeley, George W.; Stempski, Mark; Roehrig, Hans; Nudelman, Sol; Capp, M. P.
1982-11-01
This study investigated the possibility of using a video display system instead of film for radiological diagnosis. Also investigated were the relationships between characteristics of the system and the observer's accuracy level. Radiologists were used as observers. Thirty-six clinical bone fractures were separated into two matched sets of equal difficulty. The difficulty parameters and ratings were defined by a panel of expert bone radiologists at the Arizona Health Sciences Center, Radiology Department. These two sets of fracture images were then matched with verifiably normal images using parameters such as film type, angle of view, size, portion of anatomy, the film's density range, and the patient's age and sex. The two sets of images were then displayed, using a counterbalanced design, to each of the participating radiologists for diagnosis. Whenever a response was given to a video image, the radiologist used enhancement controls to "window in" on the grey levels of interest. During the TV phase, the radiologist was required to record the settings of the calibrated controls of the image enhancer during interpretation. At no time did any single radiologist see the same film in both modes. The study was designed so that a standard analysis of variance would show the effects of viewing mode (film vs TV), the effects due to stimulus set, and any interactions with observers. A signal detection analysis of observer performance was also performed. Results indicate that the TV display system is almost as good as the view box display; an average of only two more errors were made on the TV display. The difference between the systems has been traced to four observers who had poor accuracy on a small number of films viewed on the TV display. This information is now being correlated with the video system's signal-to-noise ratio (SNR), signal transfer function (STF), and resolution measurements, to obtain information on the basic display and enhancement requirements for a video-based radiologic system. Due to time constraints the results are not included here. The complete results of this study will be reported at the conference.
NASA Astrophysics Data System (ADS)
Ray, Shonket; Keller, Brad M.; Chen, Jinbo; Conant, Emily F.; Kontos, Despina
2016-03-01
This work details a methodology to obtain optimal parameter values for a locally-adaptive texture analysis algorithm that extracts mammographic texture features representative of breast parenchymal complexity for predicting falsepositive (FP) recalls from breast cancer screening with digital mammography. The algorithm has two components: (1) adaptive selection of localized regions of interest (ROIs) and (2) Haralick texture feature extraction via Gray- Level Co-Occurrence Matrices (GLCM). The following parameters were systematically varied: mammographic views used, upper limit of the ROI window size used for adaptive ROI selection, GLCM distance offsets, and gray levels (binning) used for feature extraction. Each iteration per parameter set had logistic regression with stepwise feature selection performed on a clinical screening cohort of 474 non-recalled women and 68 FP recalled women; FP recall prediction was evaluated using area under the curve (AUC) of the receiver operating characteristic (ROC) and associations between the extracted features and FP recall were assessed via odds ratios (OR). A default instance of mediolateral (MLO) view, upper ROI size limit of 143.36 mm (2048 pixels2), GLCM distance offset combination range of 0.07 to 0.84 mm (1 to 12 pixels) and 16 GLCM gray levels was set. The highest ROC performance value of AUC=0.77 [95% confidence intervals: 0.71-0.83] was obtained at three specific instances: the default instance, upper ROI window equal to 17.92 mm (256 pixels2), and gray levels set to 128. The texture feature of sum average was chosen as a statistically significant (p<0.05) predictor and associated with higher odds of FP recall for 12 out of 14 total instances.
The JPL Tropical Cyclone Information System: Data and Tools for Researchers
NASA Astrophysics Data System (ADS)
Knosp, B. W.; Ao, C. O.; Chao, Y.; Dang, V.; Garay, M.; Haddad, Z.; Hristova-Veleva, S.; Lambrigtsen, B.; Li, P. P.; Park, K.; Poulsen, W. L.; Rosenman, M. A.; Su, H.; Vane, D.; Vu, Q. A.; Willis, J. K.; Wu, D.
2008-12-01
The JPL Tropical Cyclone Information System (TCIS) is now open to the public. This web portal is designed to assist researchers by providing a one-stop shop for hurricane related data and analysis tools. While there are currently many places that offer storm data, plots, and other information, none offer an extensive archive of data files and images in a common space. The JPL TCIS was created to fill this gap. As currently configured, the JPL Tropical Cyclone Portal has three main features for researchers. The first feature consists of storm-scale data and plots for both observed and modeled data. As of the TCIS' first release, the entire 2005 storm season has been populated with data and plots from AIRS, MLS, AMSU-A, QuikSCAT, Argo floats, WRF models, GPS, and others. Storm data is subsetted to a 1000x1000 km window around the hurricane track for all six oceanic cyclone basins, and all the available data during the life time of any storm can be downloaded with one mouse click. Users can also view pre-generated storm-scale plots from all these data sets that are all co-located to the same temporal and spatial parameters. Work is currently underway to backfill all storm seasons to 1998 with as many relevant data sets as possible. The second offering from this web portal are large-scale data sets and associated visualization tools powered by Google Maps. On this interactive map, researchers can view a particular storm's intensity and track. Users may also overlay large-scale data such as aerosol maps from MODIS and MISR, and a blended microwave sea-surface temperature (SST) to gain an understanding of the large-scale environment of the storm. For example, by using this map, the cold sea-surface temperature wake can be tracked as a storm passes by. The third feature of this portal deals with interactive model and data analysis. A single-parameter analysis tools has recently been developed and added to this portal where users can plot maps, profiles, and histograms of any given data set on this portal and also get several statistics, such as the mean, standard deviation, and median of the data they are viewing. Also available is the ability to compare and condition data sets with each other. For example, users can choose to view sea surface temperature when wind speed is X m/s. Additional data sets continue to be added to this tool and it will eventually expand to include multi- parameter analyses. In this presentation, we will describe the current configuration of the JPL Tropical Cyclone Portal and demonstrate how it will be an asset to researchers. Future plans for the site will also be discussed.
NASA Technical Reports Server (NTRS)
Haines, R. F.
1978-01-01
Eight commercial pilots were shown 50 colored, high fidelity slides of a standard instrument panel (IP) with the needle positions of each instrument varying from slide to slide and then 50 slides of a head-up display (HUD) symbology format which contained an equivalent amount of flight-related information as the instrument panel slides. All stimuli were presented under controlled, static viewing conditions that allowed the measurement of the speed and accuracy with which one randomly selected flight parameter on each slide could be read. The subject did not know which parameter would be requested and, therefore, had to remember the total set of information in order to answer the question correctly. The results showed that from 6.6 - 8.7 sec total viewing time was required to correctly extract altitude, airspeed, heading, VSI, or ADI from the IP slides and from 6.1 to 7.4 sec for the HUD slides.
AERIAL VIEW, LOOKING NORTH. THE STRUCTURES WITHIN THE WALLED PRECINCT ...
AERIAL VIEW, LOOKING NORTH. THE STRUCTURES WITHIN THE WALLED PRECINCT ONE BLOCK TO THE NORTH OF THE PENITENTIARY (ALONG THE TOP OF THE PHOTOGRAPH) COMPRISE GIRARD COLLEGE. ESTABLISHED IN 1833 UPON THE DEATH OF STEPHEN GIRARD, THE PRECINCT WALL AND ORIGINAL BUILDINGS WERE CONSTRUCTED BETWEEN 1833 AND 1848 ON DESIGNS BY WILLIAM STRICKLAND, WHICH WERE CONCEIVED WITHIN PARAMETERS SET BY GIRARDS WILL. THE SCHOOL IS STILL USED FOR ITS INTENDED EDUCATIONAL FUNCTION. SEE HABS NO. PA-1731 FOR ADDITIONAL INFORMATION. - Eastern State Penitentiary, 2125 Fairmount Avenue, Philadelphia, Philadelphia County, PA
Development of Data Acquisition Set-up for Steady-state Experiments
NASA Astrophysics Data System (ADS)
Srivastava, Amit K.; Gupta, Arnab D.; Sunil, S.; Khan, Ziauddin
2017-04-01
For short duration experiments, generally digitized data is transferred for processing and storage after the experiment whereas in case of steady-state experiment the data is acquired, processed, displayed and stored continuously in pipelined manner. This requires acquiring data through special techniques for storage and on-the-go viewing data to display the current data trends for various physical parameters. A small data acquisition set-up is developed for continuously acquiring signals from various physical parameters at different sampling rate for long duration experiment. This includes the hardware set-up for signal digitization, Field Programmable Gate Arrays (FPGA) based timing system for clock synchronization and event/trigger distribution, time slicing of data streams for storage of data chunks to enable viewing of data during acquisition and channel profile display through down sampling etc. In order to store a long data stream of indefinite/long time duration, the data stream is divided into data slices/chunks of user defined time duration. Data chunks avoid the problem of non-access of server data until the channel data file is closed at the end of the long duration experiment. A graphical user interface has been developed in Lab VIEW application development environment for configuring the data acquisition hardware and storing data chunks on local machine as well as at remote data server through Python for further data access. The data plotting and analysis utilities have been developed with Python software, which provides tools for further data processing. This paper describes the development and implementation of data acquisition for steady-state experiment.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying of the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
Temperature based Restricted Boltzmann Machines
NASA Astrophysics Data System (ADS)
Li, Guoqi; Deng, Lei; Xu, Yi; Wen, Changyun; Wang, Wei; Pei, Jing; Shi, Luping
2016-01-01
Restricted Boltzmann machines (RBMs), which apply graphical models to learning probability distribution over a set of inputs, have attracted much attention recently since being proposed as building blocks of multi-layer learning systems called deep belief networks (DBNs). Note that temperature is a key factor of the Boltzmann distribution that RBMs originate from. However, none of existing schemes have considered the impact of temperature in the graphical model of DBNs. In this work, we propose temperature based restricted Boltzmann machines (TRBMs) which reveals that temperature is an essential parameter controlling the selectivity of the firing neurons in the hidden layers. We theoretically prove that the effect of temperature can be adjusted by setting the parameter of the sharpness of the logistic function in the proposed TRBMs. The performance of RBMs can be improved by adjusting the temperature parameter of TRBMs. This work provides a comprehensive insights into the deep belief networks and deep learning architectures from a physical point of view.
Photoelectric effect from observer's mathematics point of view
NASA Astrophysics Data System (ADS)
Khots, Boris; Khots, Dmitriy
2014-12-01
When we consider and analyze physical events with the purpose of creating corresponding models we often assume that the mathematical apparatus used in modeling is infallible. In particular, this relates to the use of infinity in various aspects and the use of Newton's definition of a limit in analysis. We believe that is where the main problem lies in contemporary study of nature. This work considers Physical aspects in a setting of arithmetic, algebra, geometry, analysis, topology provided by Observer's Mathematics (see www.mathrelativity.com). Certain results and communications pertaining to solution of these problems are provided. In particular, we prove the following Theorems, which give Observer's Mathematics point of view on Einstein photoelectric effect theory and Lamb-Scully and Hanbury-Brown-Twiss experiments: Theorem 1. There are some values of light intensity where anticorrelation parameter A ∈ [0,1). Theorem 2. There are some values of light intensity where anticorrelation parameter A = 1. Theorem 3. There are some values of light intensity where anticorrelation parameter A > 1.
Doerry, Armin W.
2004-07-20
Movement of a GMTI radar during a coherent processing interval over which a set of radar pulses are processed may cause defocusing of a range-Doppler map in the video signal. This problem may be compensated by varying waveform or sampling parameters of each pulse to compensate for distortions caused by variations in viewing angles from the radar to the target.
NASA Astrophysics Data System (ADS)
Marchand, Gabriel; Soetens, Jean-Christophe; Jacquemin, Denis; Bopp, Philippe A.
2015-12-01
We demonstrate that different sets of Lennard-Jones parameters proposed for the Na+ ion, in conjunction with the empirical combining rules routinely used in simulation packages, can lead to essentially different equilibrium structures for a deprotonated poly-L-glutamic acid molecule (poly-L-glutamate) dissolved in a 0.3M aqueous NaCl solution. It is, however, difficult to discriminate a priori between these model potentials; when investigating the structure of the Na+-solvation shell in bulk NaCl solution, all parameter sets lead to radial distribution functions and solvation numbers in broad agreement with the available experimental data. We do not find any such dependency of the equilibrium structure on the parameters associated with the Cl- ion. This work does not aim at recommending a particular set of parameters for any particular purpose. Instead, it stresses the model dependence of simulation results for complex systems such as biomolecules in solution and thus the difficulties if simulations are to be used for unbiased predictions, or to discriminate between contradictory experiments. However, this opens the possibility of validating a model specifically in view of analyzing experimental data believed to be reliable.
An overview of STRUCTURE: applications, parameter settings, and supporting software
Porras-Hurtado, Liliana; Ruiz, Yarimar; Santos, Carla; Phillips, Christopher; Carracedo, Ángel; Lareu, Maria V.
2013-01-01
Objectives: We present an up-to-date review of STRUCTURE software: one of the most widely used population analysis tools that allows researchers to assess patterns of genetic structure in a set of samples. STRUCTURE can identify subsets of the whole sample by detecting allele frequency differences within the data and can assign individuals to those sub-populations based on analysis of likelihoods. The review covers STRUCTURE's most commonly used ancestry and frequency models, plus an overview of the main applications of the software in human genetics including case-control association studies (CCAS), population genetics, and forensic analysis. The review is accompanied by supplementary material providing a step-by-step guide to running STRUCTURE. Methods: With reference to a worked example, we explore the effects of changing the principal analysis parameters on STRUCTURE results when analyzing a uniform set of human genetic data. Use of the supporting software: CLUMPP and distruct is detailed and we provide an overview and worked example of STRAT software, applicable to CCAS. Conclusion: The guide offers a simplified view of how STRUCTURE, CLUMPP, distruct, and STRAT can be applied to provide researchers with an informed choice of parameter settings and supporting software when analyzing their own genetic data. PMID:23755071
NASA Astrophysics Data System (ADS)
Grzenia, B. J.; Jones, C. E.; Tycner, C.; Sigut, T. A. A.
2016-11-01
The B-emission stars 48 Per (HD 25940, HR 1273) and ψ Per (HD 22192, HR 1087) share similar stellar parameters with their disks viewed near pole-on in the case of 48 Per, and near edge-on for ψ Per. An extensive set of high-quality interferometric observations were obtained for both stars between 2006 and 2011 with the Navy Precision Optical Interferometer (NPOI) in the Hα emitting region. Using a three-step modelling process, model visibilities are compared to observations with a view toward achieving better constraints on the disk models than were possible with previous studies.
Perceptual Calibration for Immersive Display Environments
Ponto, Kevin; Gleicher, Michael; Radwin, Robert G.; Shin, Hyun Joon
2013-01-01
The perception of objects, depth, and distance has been repeatedly shown to be divergent between virtual and physical environments. We hypothesize that many of these discrepancies stem from incorrect geometric viewing parameters, specifically that physical measurements of eye position are insufficiently precise to provide proper viewing parameters. In this paper, we introduce a perceptual calibration procedure derived from geometric models. While most research has used geometric models to predict perceptual errors, we instead use these models inversely to determine perceptually correct viewing parameters. We study the advantages of these new psychophysically determined viewing parameters compared to the commonly used measured viewing parameters in an experiment with 20 subjects. The perceptually calibrated viewing parameters for the subjects generally produced new virtual eye positions that were wider and deeper than standard practices would estimate. Our study shows that perceptually calibrated viewing parameters can significantly improve depth acuity, distance estimation, and the perception of shape. PMID:23428454
Effect of electric potential and current on mandibular linear measurements in cone beam CT.
Panmekiate, S; Apinhasmit, W; Petersson, A
2012-10-01
The purpose of this study was to compare mandibular linear distances measured from cone beam CT (CBCT) images produced by different radiographic parameter settings (peak kilovoltage and milliampere value). 20 cadaver hemimandibles with edentulous ridges posterior to the mental foramen were embedded in clear resin blocks and scanned by a CBCT machine (CB MercuRay(TM); Hitachi Medico Technology Corp., Chiba-ken, Japan). The radiographic parameters comprised four peak kilovoltage settings (60 kVp, 80 kVp, 100 kVp and 120 kVp) and two milliampere settings (10 mA and 15 mA). A 102.4 mm field of view was chosen. Each hemimandible was scanned 8 times with 8 different parameter combinations resulting in 160 CBCT data sets. On the cross-sectional images, six linear distances were measured. To assess the intraobserver variation, the 160 data sets were remeasured after 2 weeks. The measurement precision was calculated using Dahlberg's formula. With the same peak kilovoltage, the measurements yielded by different milliampere values were compared using the paired t-test. With the same milliampere value, the measurements yielded by different peak kilovoltage were compared using analysis of variance. A significant difference was considered when p < 0.05. Measurement precision varied from 0.03 mm to 0.28 mm. No significant differences in the distances were found among the different radiographic parameter combinations. Based upon the specific machine in the present study, low peak kilovoltage and milliampere value might be used for linear measurements in the posterior mandible.
ERIC Educational Resources Information Center
Wright, John C.; And Others
A conceptual model of how children process televised information was developed with the goal of identifying those parameters of the process that are both measurable and manipulable in research settings. The model presented accommodates the nature of information processing both by the child and by the presentation by the medium. Presentation is…
NASA Technical Reports Server (NTRS)
Diner, Daniel B. (Inventor)
1991-01-01
Methods for providing stereoscopic image presentation and stereoscopic configurations using stereoscopic viewing systems having converged or parallel cameras may be set up to reduce or eliminate erroneously perceived accelerations and decelerations by proper selection of parameters, such as an image magnification factor, q, and intercamera distance, 2w. For converged cameras, q is selected to be equal to Ve - qwl = 0, where V is the camera distance, e is half the interocular distance of an observer, w is half the intercamera distance, and l is the actual distance from the first nodal point of each camera to the convergence point, and for parallel cameras, q is selected to be equal to e/w. While converged cameras cannot be set up to provide fully undistorted three-dimensional views, they can be set up to provide a linear relationship between real and apparent depth and thus minimize erroneously perceived accelerations and decelerations for three sagittal planes, x = -w, x = 0, and x = +w which are indicated to the observer. Parallel cameras can be set up to provide fully undistorted three-dimensional views by controlling the location of the observer and by magnification and shifting of left and right images. In addition, the teachings of this disclosure can be used to provide methods of stereoscopic image presentation and stereoscopic camera configurations to produce a nonlinear relation between perceived and real depth, and erroneously produce or enhance perceived accelerations and decelerations in order to provide special effects for entertainment, training, or educational purposes.
Chaos control of Hastings-Powell model by combining chaotic motions.
Danca, Marius-F; Chattopadhyay, Joydev
2016-04-01
In this paper, we propose a Parameter Switching (PS) algorithm as a new chaos control method for the Hastings-Powell (HP) system. The PS algorithm is a convergent scheme that switches the control parameter within a set of values while the controlled system is numerically integrated. The attractor obtained with the PS algorithm matches the attractor obtained by integrating the system with the parameter replaced by the averaged value of the switched parameter values. The switching rule can be applied periodically or randomly over a set of given values. In this way, every stable cycle of the HP system can be approximated if its underlying parameter value equalizes the average value of the switching values. Moreover, the PS algorithm can be viewed as a generalization of Parrondo's game, which is applied for the first time to the HP system, by showing that losing strategy can win: "losing + losing = winning." If "loosing" is replaced with "chaos" and, "winning" with "order" (as the opposite to "chaos"), then by switching the parameter value in the HP system within two values, which generate chaotic motions, the PS algorithm can approximate a stable cycle so that symbolically one can write "chaos + chaos = regular." Also, by considering a different parameter control, new complex dynamics of the HP model are revealed.
Chaos control of Hastings-Powell model by combining chaotic motions
NASA Astrophysics Data System (ADS)
Danca, Marius-F.; Chattopadhyay, Joydev
2016-04-01
In this paper, we propose a Parameter Switching (PS) algorithm as a new chaos control method for the Hastings-Powell (HP) system. The PS algorithm is a convergent scheme that switches the control parameter within a set of values while the controlled system is numerically integrated. The attractor obtained with the PS algorithm matches the attractor obtained by integrating the system with the parameter replaced by the averaged value of the switched parameter values. The switching rule can be applied periodically or randomly over a set of given values. In this way, every stable cycle of the HP system can be approximated if its underlying parameter value equalizes the average value of the switching values. Moreover, the PS algorithm can be viewed as a generalization of Parrondo's game, which is applied for the first time to the HP system, by showing that losing strategy can win: "losing + losing = winning." If "loosing" is replaced with "chaos" and, "winning" with "order" (as the opposite to "chaos"), then by switching the parameter value in the HP system within two values, which generate chaotic motions, the PS algorithm can approximate a stable cycle so that symbolically one can write "chaos + chaos = regular." Also, by considering a different parameter control, new complex dynamics of the HP model are revealed.
Direct computation of orbital sunrise or sunset event parameters
NASA Technical Reports Server (NTRS)
Buglia, J. J.
1986-01-01
An analytical method is developed for determining the geometrical parameters which are needed to describe the viewing angles of the Sun relative to an orbiting spacecraft when the Sun rises or sets with respect to the spacecraft. These equations are rigorous and are frequently used for parametric studies relative to mission planning and for determining instrument parameters. The text is wholly self-contained in that no external reference to ephemerides or other astronomical tables is needed. Equations are presented which allow the computation of Greenwich sidereal time and right ascension and declination of the Sun generally to within a few seconds of arc, or a few tenths of a second in time.
Quality and Control of Water Vapor Winds
NASA Technical Reports Server (NTRS)
Jedlovec, Gary J.; Atkinson, Robert J.
1996-01-01
Water vapor imagery from the geostationary satellites such as GOES, Meteosat, and GMS provides synoptic views of dynamical events on a continual basis. Because the imagery represents a non-linear combination of mid- and upper-tropospheric thermodynamic parameters (three-dimensional variations in temperature and humidity), video loops of these image products provide enlightening views of regional flow fields, the movement of tropical and extratropical storm systems, the transfer of moisture between hemispheres and from the tropics to the mid- latitudes, and the dominance of high pressure systems over particular regions of the Earth. Despite the obvious larger scale features, the water vapor imagery contains significant image variability down to the single 8 km GOES pixel. These features can be quantitatively identified and tracked from one time to the next using various image processing techniques. Merrill et al. (1991), Hayden and Schmidt (1992), and Laurent (1993) have documented the operational procedures and capabilities of NOAA and ESOC to produce cloud and water vapor winds. These techniques employ standard correlation and template matching approaches to wind tracking and use qualitative and quantitative procedures to eliminate bad wind vectors from the wind data set. Techniques have also been developed to improve the quality of the operational winds though robust editing procedures (Hayden and Veldon 1991). These quality and control approaches have limitations, are often subjective, and constrain wind variability to be consistent with model derived wind fields. This paper describes research focused on the refinement of objective quality and control parameters for water vapor wind vector data sets. New quality and control measures are developed and employed to provide a more robust wind data set for climate analysis, data assimilation studies, as well as operational weather forecasting. The parameters are applicable to cloud-tracked winds as well with minor modifications. The improvement in winds through use of these new quality and control parameters is measured without the use of rawinsonde or modeled wind field data and compared with other approaches.
Bell's theorem and the problem of decidability between the views of Einstein and Bohr.
Hess, K; Philipp, W
2001-12-04
Einstein, Podolsky, and Rosen (EPR) have designed a gedanken experiment that suggested a theory that was more complete than quantum mechanics. The EPR design was later realized in various forms, with experimental results close to the quantum mechanical prediction. The experimental results by themselves have no bearing on the EPR claim that quantum mechanics must be incomplete nor on the existence of hidden parameters. However, the well known inequalities of Bell are based on the assumption that local hidden parameters exist and, when combined with conflicting experimental results, do appear to prove that local hidden parameters cannot exist. This fact leaves only instantaneous actions at a distance (called "spooky" by Einstein) to explain the experiments. The Bell inequalities are based on a mathematical model of the EPR experiments. They have no experimental confirmation, because they contradict the results of all EPR experiments. In addition to the assumption that hidden parameters exist, Bell tacitly makes a variety of other assumptions; for instance, he assumes that the hidden parameters are governed by a single probability measure independent of the analyzer settings. We argue that the mathematical model of Bell excludes a large set of local hidden variables and a large variety of probability densities. Our set of local hidden variables includes time-like correlated parameters and a generalized probability density. We prove that our extended space of local hidden variables does permit derivation of the quantum result and is consistent with all known experiments.
An extensive dataset of eye movements during viewing of complex images.
Wilming, Niklas; Onat, Selim; Ossandón, José P; Açık, Alper; Kietzmann, Tim C; Kaspar, Kai; Gameiro, Ricardo R; Vormberg, Alexandra; König, Peter
2017-01-31
We present a dataset of free-viewing eye-movement recordings that contains more than 2.7 million fixation locations from 949 observers on more than 1000 images from different categories. This dataset aggregates and harmonizes data from 23 different studies conducted at the Institute of Cognitive Science at Osnabrück University and the University Medical Center in Hamburg-Eppendorf. Trained personnel recorded all studies under standard conditions with homogeneous equipment and parameter settings. All studies allowed for free eye-movements, and differed in the age range of participants (~7-80 years), stimulus sizes, stimulus modifications (phase scrambled, spatial filtering, mirrored), and stimuli categories (natural and urban scenes, web sites, fractal, pink-noise, and ambiguous artistic figures). The size and variability of viewing behavior within this dataset presents a strong opportunity for evaluating and comparing computational models of overt attention, and furthermore, for thoroughly quantifying strategies of viewing behavior. This also makes the dataset a good starting point for investigating whether viewing strategies change in patient groups.
ERIC Educational Resources Information Center
Revista de Documentacao de Estudos em Linguistica Teorica e Aplicada, 2000
2000-01-01
This issue contains the following articles: "Resumption and Last Resort" (Joseph Aoun); "Existentials, A-Chains, and Reconstruction" (Norbert Hornstein); "How Long Was the Nineteenth Century" (David Lightfoot); "Formal Features and Parameter Setting: A View From Portuguese Past Participles and Romance Future…
An Integrated Framework for Parameter-based Optimization of Scientific Workflows.
Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel
2009-01-01
Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not a ect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
NASA Technical Reports Server (NTRS)
Parkinson, C. L.; Comiso, J. C.; Zwally, H. J.
1987-01-01
A summary data set for four years (mid 70's) of Arctic sea ice conditions is available on magnetic tape. The data include monthly and yearly averaged Nimbus 5 electrically scanning microwave radiometer (ESMR) brightness temperatures, an ice concentration parameter derived from the brightness temperatures, monthly climatological surface air temperatures, and monthly climatological sea level pressures. All data matrices are applied to 293 by 293 grids that cover a polar stereographic map enclosing the 50 deg N latitude circle. The grid size varies from about 32 X 32 km at the poles to about 28 X 28 km at 50 deg N. The ice concentration parameter is calculated assuming that the field of view contains only open water and first-year ice with an ice emissivity of 0.92. To account for the presence of multiyear ice, a nomogram is provided relating the ice concentration parameter, the total ice concentration, and the fraction of the ice cover which is multiyear ice.
Kosmulski, Marek
2012-01-01
The numerical values of points of zero charge (PZC, obtained by potentiometric titration) and of isoelectric points (IEP) of various materials reported in the literature have been analyzed. In sets of results reported for the same chemical compound (corresponding to certain chemical formula and crystallographic structure), the IEP are relatively consistent. In contrast, in materials other than metal oxides, the sets of PZC are inconsistent. In view of the inconsistence in the sets of PZC and of the discrepancies between PZC and IEP reported for the same material, it seems that IEP is more suitable than PZC as the unique number characterizing the pH-dependent surface charging of materials other than metal oxides. The present approach is opposite to the usual approach, in which the PZC and IEP are considered as two equally important parameters characterizing the pH-dependent surface charging of materials other than metal oxides. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan
2011-01-01
A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion, and create models of crustal deformation using a graphical interface. Users can use any geodetic measurements of ground motion and derive the 2D crustal strain interactively. This software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user s data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as available. This software is designed to facilitate the derivation of the strain fields from the GPS and strain meter data that sample it to facilitate the understanding of the strengths and weaknesses of the strain field derivation from continuous GPS (CGPS) and other geodetic data from a variety of tectonic settings, to converge on the "best practices" strain derivation strategy for the Solid Earth Science ESDR System (SESES) project given the CGPS station distribution in the western U.S., and to provide SESES users with a scientific and educational tool to explore the strain field on their own with user-defined parameters.
Energy optimization for upstream data transfer in 802.15.4 beacon-enabled star formulation
NASA Astrophysics Data System (ADS)
Liu, Hua; Krishnamachari, Bhaskar
2008-08-01
Energy saving is one of the major concerns for low rate personal area networks. This paper models energy consumption for beacon-enabled time-slotted media accessing control cooperated with sleeping scheduling in a star network formulation for IEEE 802.15.4 standard. We investigate two different upstream (data transfer from devices to a network coordinator) strategies: a) tracking strategy: the devices wake up and check status (track the beacon) in each time slot; b) non-tracking strategy: nodes only wake-up upon data arriving and stay awake till data transmitted to the coordinator. We consider the tradeoff between energy cost and average data transmission delay for both strategies. Both scenarios are formulated as optimization problems and the optimal solutions are discussed. Our results show that different data arrival rate and system parameters (such as contention access period interval, upstream speed etc.) result in different strategies in terms of energy optimization with maximum delay constraints. Hence, according to different applications and system settings, different strategies might be chosen by each node to achieve energy optimization for both self-interested view and system view. We give the relation among the tunable parameters by formulas and plots to illustrate which strategy is better under corresponding parameters. There are two main points emphasized in our results with delay constraints: on one hand, when the system setting is fixed by coordinator, nodes in the network can intelligently change their strategies according to corresponding application data arrival rate; on the other hand, when the nodes' applications are known by the coordinator, the coordinator can tune the system parameters to achieve optimal system energy consumption.
Space Particle Hazard Specification, Forecasting, and Mitigation
2007-11-30
Automated FTP scripts permitted users to automatically update their global input parameter data set directly from the National Oceanic and...of CEASE capabilities. The angular field-of-view for CEASE is relatively large and will not allow for pitch angle resolved measurements. However... angular zones spanning 120° in the plane containing the magnetic field with an approximate 4° width in the direction perpendicular to the look-plane
Chaos control of Hastings–Powell model by combining chaotic motions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danca, Marius-F., E-mail: danca@rist.ro; Chattopadhyay, Joydev, E-mail: joydev@isical.ac.in
2016-04-15
In this paper, we propose a Parameter Switching (PS) algorithm as a new chaos control method for the Hastings–Powell (HP) system. The PS algorithm is a convergent scheme that switches the control parameter within a set of values while the controlled system is numerically integrated. The attractor obtained with the PS algorithm matches the attractor obtained by integrating the system with the parameter replaced by the averaged value of the switched parameter values. The switching rule can be applied periodically or randomly over a set of given values. In this way, every stable cycle of the HP system can bemore » approximated if its underlying parameter value equalizes the average value of the switching values. Moreover, the PS algorithm can be viewed as a generalization of Parrondo's game, which is applied for the first time to the HP system, by showing that losing strategy can win: “losing + losing = winning.” If “loosing” is replaced with “chaos” and, “winning” with “order” (as the opposite to “chaos”), then by switching the parameter value in the HP system within two values, which generate chaotic motions, the PS algorithm can approximate a stable cycle so that symbolically one can write “chaos + chaos = regular.” Also, by considering a different parameter control, new complex dynamics of the HP model are revealed.« less
Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav
2015-01-01
Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290
Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav
2015-01-01
Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close "neighborhood" of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa.
Simulation verification techniques study. Subsystem simulation validation techniques
NASA Technical Reports Server (NTRS)
Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.
1974-01-01
Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters are presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions and recommendations are also given.
A theoretical study of potentially observable chirality-sensitive NMR effects in molecules.
Garbacz, Piotr; Cukras, Janusz; Jaszuński, Michał
2015-09-21
Two recently predicted nuclear magnetic resonance effects, the chirality-induced rotating electric polarization and the oscillating magnetization, are examined for several experimentally available chiral molecules. We discuss in detail the requirements for experimental detection of chirality-sensitive NMR effects of the studied molecules. These requirements are related to two parameters: the shielding polarizability and the antisymmetric part of the nuclear magnetic shielding tensor. The dominant second contribution has been computed for small molecules at the coupled cluster and density functional theory levels. It was found that DFT calculations using the KT2 functional and the aug-cc-pCVTZ basis set adequately reproduce the CCSD(T) values obtained with the same basis set. The largest values of parameters, thus most promising from the experimental point of view, were obtained for the fluorine nuclei in 1,3-difluorocyclopropene and 1,3-diphenyl-2-fluoro-3-trifluoromethylcyclopropene.
Phase transition in the parametric natural visibility graph.
Snarskii, A A; Bezsudnov, I V
2016-10-01
We investigate time series by mapping them to the complex networks using a parametric natural visibility graph (PNVG) algorithm that generates graphs depending on arbitrary continuous parameter-the angle of view. We study the behavior of the relative number of clusters in PNVG near the critical value of the angle of view. Artificial and experimental time series of different nature are used for numerical PNVG investigations to find critical exponents above and below the critical point as well as the exponent in the finite size scaling regime. Altogether, they allow us to find the critical exponent of the correlation length for PNVG. The set of calculated critical exponents satisfies the basic Widom relation. The PNVG is found to demonstrate scaling behavior. Our results reveal the similarity between the behavior of the relative number of clusters in PNVG and the order parameter in the second-order phase transitions theory. We show that the PNVG is another example of a system (in addition to magnetic, percolation, superconductivity, etc.) with observed second-order phase transition.
Controllable 3D Display System Based on Frontal Projection Lenticular Screen
NASA Astrophysics Data System (ADS)
Feng, Q.; Sang, X.; Yu, X.; Gao, X.; Wang, P.; Li, C.; Zhao, T.
2014-08-01
A novel auto-stereoscopic three-dimensional (3D) projection display system based on the frontal projection lenticular screen is demonstrated. It can provide high real 3D experiences and the freedom of interaction. In the demonstrated system, the content can be changed and the dense of viewing points can be freely adjusted according to the viewers' demand. The high dense viewing points can provide smooth motion parallax and larger image depth without blurry. The basic principle of stereoscopic display is described firstly. Then, design architectures including hardware and software are demonstrated. The system consists of a frontal projection lenticular screen, an optimally designed projector-array and a set of multi-channel image processors. The parameters of the frontal projection lenticular screen are based on the demand of viewing such as the viewing distance and the width of view zones. Each projector is arranged on an adjustable platform. The set of multi-channel image processors are made up of six PCs. One of them is used as the main controller, the other five client PCs can process 30 channel signals and transmit them to the projector-array. Then a natural 3D scene will be perceived based on the frontal projection lenticular screen with more than 1.5 m image depth in real time. The control section is presented in detail, including parallax adjustment, system synchronization, distortion correction, etc. Experimental results demonstrate the effectiveness of this novel controllable 3D display system.
On robust parameter estimation in brain-computer interfacing
NASA Astrophysics Data System (ADS)
Samek, Wojciech; Nakajima, Shinichi; Kawanabe, Motoaki; Müller, Klaus-Robert
2017-12-01
Objective. The reliable estimation of parameters such as mean or covariance matrix from noisy and high-dimensional observations is a prerequisite for successful application of signal processing and machine learning algorithms in brain-computer interfacing (BCI). This challenging task becomes significantly more difficult if the data set contains outliers, e.g. due to subject movements, eye blinks or loose electrodes, as they may heavily bias the estimation and the subsequent statistical analysis. Although various robust estimators have been developed to tackle the outlier problem, they ignore important structural information in the data and thus may not be optimal. Typical structural elements in BCI data are the trials consisting of a few hundred EEG samples and indicating the start and end of a task. Approach. This work discusses the parameter estimation problem in BCI and introduces a novel hierarchical view on robustness which naturally comprises different types of outlierness occurring in structured data. Furthermore, the class of minimum divergence estimators is reviewed and a robust mean and covariance estimator for structured data is derived and evaluated with simulations and on a benchmark data set. Main results. The results show that state-of-the-art BCI algorithms benefit from robustly estimated parameters. Significance. Since parameter estimation is an integral part of various machine learning algorithms, the presented techniques are applicable to many problems beyond BCI.
DIRT: The Dust InfraRed Toolbox
NASA Astrophysics Data System (ADS)
Pound, M. W.; Wolfire, M. G.; Mundy, L. G.; Teuben, P. J.; Lord, S.
We present DIRT, a Java applet geared toward modeling a variety of processes in envelopes of young and evolved stars. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. The computing cluster for the database is described in the accompanying paper by Teuben et al. (2000). A typical user query will return about 50-100 models, which the user can then interactively filter as a function of 8 model parameters (e.g., extinction, size, flux, luminosity). A flexible, multi-dimensional plotter (Figure 1) allows users to view the models, rotate them, tag specific parameters with color or symbol size, and probe individual model points. For any given model, auxiliary plots such as dust grain properties, radial intensity profiles, and the flux as a function of wavelength and beamsize can be viewed. The user can fit observed data to several models simultaneously and see the results of the fit; the best fit is automatically selected for plotting. The URL for this project is http://dustem.astro.umd.edu.
Parametric-Studies and Data-Plotting Modules for the SOAP
NASA Technical Reports Server (NTRS)
2008-01-01
"Parametric Studies" and "Data Table Plot View" are the names of software modules in the Satellite Orbit Analysis Program (SOAP). Parametric Studies enables parameterization of as many as three satellite or ground-station attributes across a range of values and computes the average, minimum, and maximum of a specified metric, the revisit time, or 21 other functions at each point in the parameter space. This computation produces a one-, two-, or three-dimensional table of data representing statistical results across the parameter space. Inasmuch as the output of a parametric study in three dimensions can be a very large data set, visualization is a paramount means of discovering trends in the data (see figure). Data Table Plot View enables visualization of the data table created by Parametric Studies or by another data source: this module quickly generates a display of the data in the form of a rotatable three-dimensional-appearing plot, making it unnecessary to load the SOAP output data into a separate plotting program. The rotatable three-dimensionalappearing plot makes it easy to determine which points in the parameter space are most desirable. Both modules provide intuitive user interfaces for ease of use.
Reconstruction of sparse-view X-ray computed tomography using adaptive iterative algorithms.
Liu, Li; Lin, Weikai; Jin, Mingwu
2015-01-01
In this paper, we propose two reconstruction algorithms for sparse-view X-ray computed tomography (CT). Treating the reconstruction problems as data fidelity constrained total variation (TV) minimization, both algorithms adapt the alternate two-stage strategy: projection onto convex sets (POCS) for data fidelity and non-negativity constraints and steepest descent for TV minimization. The novelty of this work is to determine iterative parameters automatically from data, thus avoiding tedious manual parameter tuning. In TV minimization, the step sizes of steepest descent are adaptively adjusted according to the difference from POCS update in either the projection domain or the image domain, while the step size of algebraic reconstruction technique (ART) in POCS is determined based on the data noise level. In addition, projection errors are used to compare with the error bound to decide whether to perform ART so as to reduce computational costs. The performance of the proposed methods is studied and evaluated using both simulated and physical phantom data. Our methods with automatic parameter tuning achieve similar, if not better, reconstruction performance compared to a representative two-stage algorithm. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Satra, P.; Carsky, J.
2018-04-01
Our research is looking at the travel behaviour from a macroscopic view, taking one municipality as a basic unit. The travel behaviour of one municipality as a whole is becoming one piece of a data in the research of travel behaviour of a larger area, perhaps a country. A data pre-processing is used to cluster the municipalities in groups, which show similarities in their travel behaviour. Such groups can be then researched for reasons of their prevailing pattern of travel behaviour without any distortion caused by municipalities with a different pattern. This paper deals with actual settings of the clustering process, which is based on Bayesian statistics, particularly the mixture model. An optimization of the settings parameters based on correlation of pointer model parameters and relative number of data in clusters is helpful, however not fully reliable method. Thus, method for graphic representation of clusters needs to be developed in order to check their quality. A training of the setting parameters in 2D has proven to be a beneficial method, because it allows visual control of the produced clusters. The clustering better be applied on separate groups of municipalities, where competition of only identical transport modes can be found.
The plant virus microscope image registration method based on mismatches removing.
Wei, Lifang; Zhou, Shucheng; Dong, Heng; Mao, Qianzhuo; Lin, Jiaxiang; Chen, Riqing
2016-01-01
The electron microscopy is one of the major means to observe the virus. The view of virus microscope images is limited by making specimen and the size of the camera's view field. To solve this problem, the virus sample is produced into multi-slice for information fusion and image registration techniques are applied to obtain large field and whole sections. Image registration techniques have been developed in the past decades for increasing the camera's field of view. Nevertheless, these approaches typically work in batch mode and rely on motorized microscopes. Alternatively, the methods are conceived just to provide visually pleasant registration for high overlap ratio image sequence. This work presents a method for virus microscope image registration acquired with detailed visual information and subpixel accuracy, even when overlap ratio of image sequence is 10% or less. The method proposed focus on the correspondence set and interimage transformation. A mismatch removal strategy is proposed by the spatial consistency and the components of keypoint to enrich the correspondence set. And the translation model parameter as well as tonal inhomogeneities is corrected by the hierarchical estimation and model select. In the experiments performed, we tested different registration approaches and virus images, confirming that the translation model is not always stationary, despite the fact that the images of the sample come from the same sequence. The mismatch removal strategy makes building registration of virus microscope images at subpixel accuracy easier and optional parameters for building registration according to the hierarchical estimation and model select strategies make the proposed method high precision and reliable for low overlap ratio image sequence. Copyright © 2015 Elsevier Ltd. All rights reserved.
Age-related changes in visual exploratory behavior in a natural scene setting
Hamel, Johanna; De Beukelaer, Sophie; Kraft, Antje; Ohl, Sven; Audebert, Heinrich J.; Brandt, Stephan A.
2013-01-01
Diverse cognitive functions decline with increasing age, including the ability to process central and peripheral visual information in a laboratory testing situation (useful visual field of view). To investigate whether and how this influences activities of daily life, we studied age-related changes in visual exploratory behavior in a natural scene setting: a driving simulator paradigm of variable complexity was tested in subjects of varying ages with simultaneous eye- and head-movement recordings via a head-mounted camera. Detection and reaction times were also measured by visual fixation and manual reaction. We considered video computer game experience as a possible influence on performance. Data of 73 participants of varying ages were analyzed, driving two different courses. We analyzed the influence of route difficulty level, age, and eccentricity of test stimuli on oculomotor and driving behavior parameters. No significant age effects were found regarding saccadic parameters. In the older subjects head-movements increasingly contributed to gaze amplitude. More demanding courses and more peripheral stimuli locations induced longer reaction times in all age groups. Deterioration of the functionally useful visual field of view with increasing age was not suggested in our study group. However, video game-experienced subjects revealed larger saccade amplitudes and a broader distribution of fixations on the screen. They reacted faster to peripheral objects suggesting the notion of a general detection task rather than perceiving driving as a central task. As the video game-experienced population consisted of younger subjects, our study indicates that effects due to video game experience can easily be misinterpreted as age effects if not accounted for. We therefore view it as essential to consider video game experience in all testing methods using virtual media. PMID:23801970
Computer Code for the Determination of Ejection Seat/Man Aerodynamic Parameters.
1980-08-28
ARMS, and LES (computer code -- .,. ,... ,, ..,.., .: . .. ... ,-." . ;.’ -- I- ta names) and Seat consisted of 4 panels SEAT, BACK, PADD , and SIDE. An... general application of Eq. (I) is for blunt bodies at hypersonic speed, because accuracy of this equation becomes better at higher Mach number. Therefore...pressure coefficient is set equal to zero on those portions of the body that are invisible to a distant observer who views the body from the direction
NASA Astrophysics Data System (ADS)
Cook, Grant O.; Sorensen, Carl D.
2013-12-01
Partial transient liquid-phase (PTLP) bonding is currently an esoteric joining process with limited applications. However, it has preferable advantages compared with typical joining techniques and is the best joining technique for certain applications. Specifically, it can bond hard-to-join materials as well as dissimilar material types, and bonding is performed at comparatively low temperatures. Part of the difficulty in applying PTLP bonding is finding suitable interlayer combinations (ICs). A novel interlayer selection procedure has been developed to facilitate the identification of ICs that will create successful PTLP bonds and is explained in a companion article. An integral part of the selection procedure is a filtering routine that identifies all possible ICs for a given application. This routine utilizes a set of customizable parameters that are based on key characteristics of PTLP bonding. These parameters include important design considerations such as bonding temperature, target remelting temperature, bond solid type, and interlayer thicknesses. The output from this routine provides a detailed view of each candidate IC along with a broad view of the entire candidate set, greatly facilitating the selection of ideal ICs. This routine provides a new perspective on the PTLP bonding process. In addition, the use of this routine, by way of the accompanying selection procedure, will expand PTLP bonding as a viable joining process.
Conceptual design of a neutron camera for MAST Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weiszflog, M., E-mail: matthias.weiszflog@physics.uu.se; Sangaroon, S.; Cecconello, M.
2014-11-15
This paper presents two different conceptual designs of neutron cameras for Mega Ampere Spherical Tokamak (MAST) Upgrade. The first one consists of two horizontal cameras, one equatorial and one vertically down-shifted by 65 cm. The second design, viewing the plasma in a poloidal section, also consists of two cameras, one radial and the other one with a diagonal view. Design parameters for the different cameras were selected on the basis of neutron transport calculations and on a set of target measurement requirements taking into account the predicted neutron emissivities in the different MAST Upgrade operating scenarios. Based on a comparisonmore » of the cameras’ profile resolving power, the horizontal cameras are suggested as the best option.« less
Cross-View Action Recognition via Transferable Dictionary Learning.
Zheng, Jingjing; Jiang, Zhuolin; Chellappa, Rama
2016-05-01
Discriminative appearance features are effective for recognizing actions in a fixed view, but may not generalize well to a new view. In this paper, we present two effective approaches to learn dictionaries for robust action recognition across views. In the first approach, we learn a set of view-specific dictionaries where each dictionary corresponds to one camera view. These dictionaries are learned simultaneously from the sets of correspondence videos taken at different views with the aim of encouraging each video in the set to have the same sparse representation. In the second approach, we additionally learn a common dictionary shared by different views to model view-shared features. This approach represents the videos in each view using a view-specific dictionary and the common dictionary. More importantly, it encourages the set of videos taken from the different views of the same action to have the similar sparse representations. The learned common dictionary not only has the capability to represent actions from unseen views, but also makes our approach effective in a semi-supervised setting where no correspondence videos exist and only a few labeled videos exist in the target view. The extensive experiments using three public datasets demonstrate that the proposed approach outperforms recently developed approaches for cross-view action recognition.
The system controlling the composition of clastic sediments
Johnsson, Mark J.
1993-01-01
The composition of clastic sediments and rocks is controlled by a complex suite of parameters operating during pedogenesis, erosion, transport, deposition, and burial. The principal first-order parameters include source rock composition, modification by chemical weathering, mechanical disaggregation and abrasion, authigenic inputs, hydrodynamic sorting, and diagenesis. Each of these first-order parameters is influenced to varying degrees by such factors as the tectonic settings of the source region, transportational system and depositional environment, climate, vegetation, relief, slope, and the nature and energy of transportational and depositional systems. These factors are not independent; rather a complicated web of interrelationships and feedback mechanisms causes many factors to be modulated by others. Accordingly, processes controlling the composition of clastic sediments are best viewed as constituting a system, and in evaluating compositional information the dynamics of the system must be considered as whole.
SCOPE: a web server for practical de novo motif discovery.
Carlson, Jonathan M; Chakravarty, Arijit; DeZiel, Charles E; Gross, Robert H
2007-07-01
SCOPE is a novel parameter-free method for the de novo identification of potential regulatory motifs in sets of coordinately regulated genes. The SCOPE algorithm combines the output of three component algorithms, each designed to identify a particular class of motifs. Using an ensemble learning approach, SCOPE identifies the best candidate motifs from its component algorithms. In tests on experimentally determined datasets, SCOPE identified motifs with a significantly higher level of accuracy than a number of other web-based motif finders run with their default parameters. Because SCOPE has no adjustable parameters, the web server has an intuitive interface, requiring only a set of gene names or FASTA sequences and a choice of species. The most significant motifs found by SCOPE are displayed graphically on the main results page with a table containing summary statistics for each motif. Detailed motif information, including the sequence logo, PWM, consensus sequence and specific matching sites can be viewed through a single click on a motif. SCOPE's efficient, parameter-free search strategy has enabled the development of a web server that is readily accessible to the practising biologist while providing results that compare favorably with those of other motif finders. The SCOPE web server is at
a Web-Based Interactive Platform for Co-Clustering Spatio-Temporal Data
NASA Astrophysics Data System (ADS)
Wu, X.; Poorthuis, A.; Zurita-Milla, R.; Kraak, M.-J.
2017-09-01
Since current studies on clustering analysis mainly focus on exploring spatial or temporal patterns separately, a co-clustering algorithm is utilized in this study to enable the concurrent analysis of spatio-temporal patterns. To allow users to adopt and adapt the algorithm for their own analysis, it is integrated within the server side of an interactive web-based platform. The client side of the platform, running within any modern browser, is a graphical user interface (GUI) with multiple linked visualizations that facilitates the understanding, exploration and interpretation of the raw dataset and co-clustering results. Users can also upload their own datasets and adjust clustering parameters within the platform. To illustrate the use of this platform, an annual temperature dataset from 28 weather stations over 20 years in the Netherlands is used. After the dataset is loaded, it is visualized in a set of linked visualizations: a geographical map, a timeline and a heatmap. This aids the user in understanding the nature of their dataset and the appropriate selection of co-clustering parameters. Once the dataset is processed by the co-clustering algorithm, the results are visualized in the small multiples, a heatmap and a timeline to provide various views for better understanding and also further interpretation. Since the visualization and analysis are integrated in a seamless platform, the user can explore different sets of co-clustering parameters and instantly view the results in order to do iterative, exploratory data analysis. As such, this interactive web-based platform allows users to analyze spatio-temporal data using the co-clustering method and also helps the understanding of the results using multiple linked visualizations.
Zhu, Manlu; Dai, Xiongfeng
2018-01-15
In nature, the maximal growth rates vary widely among different bacteria species. Fast-growing bacteria species such as Escherichia coli can have a shortest generation time of 20 min. Slow-growing bacteria species are perhaps best known for Mycobacterium tuberculosis, a human pathogen with a generation time being no less than 16 h. Despite of the significant progress made on understanding the pathogenesis of M. tuberculosis, we know little on the origin of its intriguingly slow growth. From a global view, the intrinsic constraint of the maximal growth rate of bacteria remains to be a fundamental question in microbiology. In this review, we analyze and discuss this issue from the angle of protein translation capacity, which is the major demand for cell growth. Based on quantitative analysis, we propose four parameters: rRNA chain elongation rate, abundance of RNA polymerase engaged in rRNA synthesis, polypeptide chain elongation rate, and active ribosome fraction, which potentially limit the maximal growth rate of bacteria. We further discuss the relation of these parameters with the growth rate for M. tuberculosis as well as other bacterial species. We highlight future comprehensive investigation of these parameters for different bacteria species to understand how bacteria set their own specific growth rates.
Decomposition of Fuzzy Soft Sets with Finite Value Spaces
Jun, Young Bae
2014-01-01
The notion of fuzzy soft sets is a hybrid soft computing model that integrates both gradualness and parameterization methods in harmony to deal with uncertainty. The decomposition of fuzzy soft sets is of great importance in both theory and practical applications with regard to decision making under uncertainty. This study aims to explore decomposition of fuzzy soft sets with finite value spaces. Scalar uni-product and int-product operations of fuzzy soft sets are introduced and some related properties are investigated. Using t-level soft sets, we define level equivalent relations and show that the quotient structure of the unit interval induced by level equivalent relations is isomorphic to the lattice consisting of all t-level soft sets of a given fuzzy soft set. We also introduce the concepts of crucial threshold values and complete threshold sets. Finally, some decomposition theorems for fuzzy soft sets with finite value spaces are established, illustrated by an example concerning the classification and rating of multimedia cell phones. The obtained results extend some classical decomposition theorems of fuzzy sets, since every fuzzy set can be viewed as a fuzzy soft set with a single parameter. PMID:24558342
Decomposition of fuzzy soft sets with finite value spaces.
Feng, Feng; Fujita, Hamido; Jun, Young Bae; Khan, Madad
2014-01-01
The notion of fuzzy soft sets is a hybrid soft computing model that integrates both gradualness and parameterization methods in harmony to deal with uncertainty. The decomposition of fuzzy soft sets is of great importance in both theory and practical applications with regard to decision making under uncertainty. This study aims to explore decomposition of fuzzy soft sets with finite value spaces. Scalar uni-product and int-product operations of fuzzy soft sets are introduced and some related properties are investigated. Using t-level soft sets, we define level equivalent relations and show that the quotient structure of the unit interval induced by level equivalent relations is isomorphic to the lattice consisting of all t-level soft sets of a given fuzzy soft set. We also introduce the concepts of crucial threshold values and complete threshold sets. Finally, some decomposition theorems for fuzzy soft sets with finite value spaces are established, illustrated by an example concerning the classification and rating of multimedia cell phones. The obtained results extend some classical decomposition theorems of fuzzy sets, since every fuzzy set can be viewed as a fuzzy soft set with a single parameter.
Accuracy Analysis for Automatic Orientation of a Tumbling Oblique Viewing Sensor System
NASA Astrophysics Data System (ADS)
Stebner, K.; Wieden, A.
2014-03-01
Dynamic camera systems with moving parts are difficult to handle in photogrammetric workflow, because it is not ensured that the dynamics are constant over the recording period. Minimum changes of the camera's orientation greatly influence the projection of oblique images. In this publication these effects - originating from the kinematic chain of a dynamic camera system - are analysed and validated. A member of the Modular Airborne Camera System family - MACS-TumbleCam - consisting of a vertical viewing and a tumbling oblique camera was used for this investigation. Focus is on dynamic geometric modeling and the stability of the kinematic chain. To validate the experimental findings, the determined parameters are applied to the exterior orientation of an actual aerial image acquisition campaign using MACS-TumbleCam. The quality of the parameters is sufficient for direct georeferencing of oblique image data from the orientation information of a synchronously captured vertical image dataset. Relative accuracy for the oblique data set ranges from 1.5 pixels when using all images of the image block to 0.3 pixels when using only adjacent images.
Optimal design and critical analysis of a high resolution video plenoptic demonstrator
NASA Astrophysics Data System (ADS)
Drazic, Valter; Sacré, Jean-Jacques; Bertrand, Jérôme; Schubert, Arno; Blondé, Etienne
2011-03-01
A plenoptic camera is a natural multi-view acquisition device also capable of measuring distances by correlating a set of images acquired under different parallaxes. Its single lens and single sensor architecture have two downsides: limited resolution and depth sensitivity. In a very first step and in order to circumvent those shortcomings, we have investigated how the basic design parameters of a plenoptic camera optimize both the resolution of each view and also its depth measuring capability. In a second step, we built a prototype based on a very high resolution Red One® movie camera with an external plenoptic adapter and a relay lens. The prototype delivered 5 video views of 820x410. The main limitation in our prototype is view cross talk due to optical aberrations which reduce the depth accuracy performance. We have simulated some limiting optical aberrations and predicted its impact on the performances of the camera. In addition, we developed adjustment protocols based on a simple pattern and analyzing programs which investigate the view mapping and amount of parallax crosstalk on the sensor on a pixel basis. The results of these developments enabled us to adjust the lenslet array with a sub micrometer precision and to mark the pixels of the sensor where the views do not register properly.
The Quantum Approximation Optimization Algorithm for MaxCut: A Fermionic View
NASA Technical Reports Server (NTRS)
Wang, Zhihui; Hadfield, Stuart; Jiang, Zhang; Rieffel, Eleanor G.
2017-01-01
Farhi et al. recently proposed a class of quantum algorithms, the Quantum Approximate Optimization Algorithm (QAOA), for approximately solving combinatorial optimization problems. A level-p QAOA circuit consists of steps in which a classical Hamiltonian, derived from the cost function, is applied followed by a mixing Hamiltonian. The 2p times for which these two Hamiltonians are applied are the parameters of the algorithm. As p increases, however, the parameter search space grows quickly. The success of the QAOA approach will depend, in part, on finding effective parameter-setting strategies. Here, we analytically and numerically study parameter setting for QAOA applied to MAXCUT. For level-1 QAOA, we derive an analytical expression for a general graph. In principle, expressions for higher p could be derived, but the number of terms quickly becomes prohibitive. For a special case of MAXCUT, the Ring of Disagrees, or the 1D antiferromagnetic ring, we provide an analysis for arbitrarily high level. Using a Fermionic representation, the evolution of the system under QAOA translates into quantum optimal control of an ensemble of independent spins. This treatment enables us to obtain analytical expressions for the performance of QAOA for any p. It also greatly simplifies numerical search for the optimal values of the parameters. By exploring symmetries, we identify a lower-dimensional sub-manifold of interest; the search effort can be accordingly reduced. This analysis also explains an observed symmetry in the optimal parameter values. Further, we numerically investigate the parameter landscape and show that it is a simple one in the sense of having no local optima.
Simultaneous emission and transmission scanning in PET oncology: the effect on parameter estimation
NASA Astrophysics Data System (ADS)
Meikle, S. R.; Eberl, S.; Hooper, P. K.; Fulham, M. J.
1997-02-01
The authors investigated potential sources of bias due to simultaneous emission and transmission (SET) scanning and their effect on parameter estimation in dynamic positron emission tomography (PET) oncology studies. The sources of bias considered include: i) variation in transmission spillover (into the emission window) throughout the field of view, ii) increased scatter arising from rod sources, and iii) inaccurate deadtime correction. Net bias was calculated as a function of the emission count rate and used to predict distortion in [/sup 18/F]2-fluoro-2-deoxy-D-glucose (FDG) and [/sup 11/C]thymidine tissue curves simulating the normal liver and metastatic involvement of the liver. The effect on parameter estimates was assessed by spectral analysis and compartmental modeling. The various sources of bias approximately cancel during the early part of the study when count rate is maximal. Scatter dominates in the latter part of the study, causing apparently decreased tracer clearance which is more marked for thymidine than for FDG. The irreversible disposal rate constant, K/sub i/, was overestimated by <10% for FDG and >30% for thymidine. The authors conclude that SET has a potential role in dynamic FDG PET but is not suitable for /sup 11/C-labeled compounds.
Validation of geometric models for fisheye lenses
NASA Astrophysics Data System (ADS)
Schneider, D.; Schwalbe, E.; Maas, H.-G.
The paper focuses on the photogrammetric investigation of geometric models for different types of optical fisheye constructions (equidistant, equisolid-angle, sterographic and orthographic projection). These models were implemented and thoroughly tested in a spatial resection and a self-calibrating bundle adjustment. For this purpose, fisheye images were taken with a Nikkor 8 mm fisheye lens on a Kodak DSC 14n Pro digital camera in a hemispherical calibration room. Both, the spatial resection and the bundle adjustment resulted in a standard deviation of unit weight of 1/10 pixel with a suitable set of simultaneous calibration parameters introduced into the camera model. The camera-lens combination was treated with all of the four basic models mentioned above. Using the same set of additional lens distortion parameters, the differences between the models can largely be compensated, delivering almost the same precision parameters. The relative object space precision obtained from the bundle adjustment was ca. 1:10 000 of the object dimensions. This value can be considered as a very satisfying result, as fisheye images generally have a lower geometric resolution as a consequence of their large field of view and also have a inferior imaging quality in comparison to most central perspective lenses.
NASA Technical Reports Server (NTRS)
Mather, R. S.; Lerch, F. J.; Rizos, C.; Masters, E. G.; Hirsch, B.
1978-01-01
The 1977 altimetry data bank is analyzed for the geometrical shape of the sea surface expressed as surface spherical harmonics after referral to the higher reference model defined by GEM 9. The resulting determination is expressed as quasi-stationary dynamic SST. Solutions are obtained from different sets of long arcs in the GEOS-3 altimeter data bank as well as from sub-sets related to the September 1975 and March 1976 equinoxes assembled with a view to minimizing seasonal effects. The results are compared with equivalent parameters obtained from the hydrostatic analysis of sporadic temperature, pressure and salinity measurements of the oceans and the known major steady state current systems with comparable wavelengths. The most clearly defined parameter (the zonal harmonic of degree 2) is obtained with an uncertainty of + or - 6 cm. The preferred numerical value is smaller than the oceanographic value due to the effect of the correction for the permanent earth tide. Similar precision is achieved for the zonal harmonic of degree 3. The precision obtained for the fourth degree zonal harmonic reflects more closely the accuracy expected from the level of noise in the orbital solutions.
Analysis of Seasonal Chlorophyll-a Using An Adjoint Three-Dimensional Ocean Carbon Cycle Model
NASA Astrophysics Data System (ADS)
Tjiputra, J.; Winguth, A.; Polzin, D.
2004-12-01
The misfit between numerical ocean model and observations can be reduced using data assimilation. This can be achieved by optimizing the model parameter values using adjoint model. The adjoint model minimizes the model-data misfit by estimating the sensitivity or gradient of the cost function with respect to initial condition, boundary condition, or parameters. The adjoint technique was used to assimilate seasonal chlorophyll-a data from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) satellite to a marine biogeochemical model HAMOCC5.1. An Identical Twin Experiment (ITE) was conducted to test the robustness of the model and the non-linearity level of the forward model. The ITE experiment successfully recovered most of the perturbed parameter to their initial values, and identified the most sensitive ecosystem parameters, which contribute significantly to model-data bias. The regional assimilations of SeaWiFS chlorophyll-a data into the model were able to reduce the model-data misfit (i.e. the cost function) significantly. The cost function reduction mostly occurred in the high latitudes (e.g. the model-data misfit in the northern region during summer season was reduced by 54%). On the other hand, the equatorial regions appear to be relatively stable with no strong reduction in cost function. The optimized parameter set is used to forecast the carbon fluxes between marine ecosystem compartments (e.g. Phytoplankton, Zooplankton, Nutrients, Particulate Organic Carbon, and Dissolved Organic Carbon). The a posteriori model run using the regional best-fit parameterization yields approximately 36 PgC/yr of global net primary productions in the euphotic zone.
Limited Angle Dual Modality Breast Imaging
NASA Astrophysics Data System (ADS)
More, Mitali J.; Li, Heng; Goodale, Patricia J.; Zheng, Yibin; Majewski, Stan; Popov, Vladimir; Welch, Benjamin; Williams, Mark B.
2007-06-01
We are developing a dual modality breast scanner that can obtain x-ray transmission and gamma ray emission images in succession at multiple viewing angles with the breast held under mild compression. These views are reconstructed and fused to obtain three-dimensional images that combine structural and functional information. Here, we describe the dual modality system and present results of phantom experiments designed to test the system's ability to obtain fused volumetric dual modality data sets from a limited number of projections, acquired over a limited (less than 180 degrees) angular range. We also present initial results from phantom experiments conducted to optimize the acquisition geometry for gamma imaging. The optimization parameters include the total number of views and the angular range over which these views should be spread, while keeping the total number of detected counts fixed. We have found that in general, for a fixed number of views centered around the direction perpendicular to the direction of compression, in-plane contrast and SNR are improved as the angular range of the views is decreased. The improvement in contrast and SNR with decreasing angular range is much greater for deeper lesions and for a smaller number of views. However, the z-resolution of the lesion is significantly reduced with decreasing angular range. Finally, we present results from limited angle tomography scans using a system with dual, opposing heads.
Assessment of Existing Data and Reports for System Evaluation
NASA Technical Reports Server (NTRS)
Matolak, David W.; Skidmore, Trent A.
2000-01-01
This report describes work done as part of the Weather Datalink Research project grant. We describe the work done under Task 1 of this project: the assessment of the suitability of available reports and data for use in evaluation of candidate weather datalink systems, and the development of a performance parameter set for comparative system evaluation. It was found that existing data and reports are inadequate for a complete physical layer characterization, but that these reports provide a good foundation for system comparison. In addition, these reports also contain some information useful for evaluation at higher layers. The performance parameter list compiled can be viewed as near complete-additional investigations, both analytical/simulation and experimental, will likely result in additions and improvements to this list.
Inversion of surface parameters using fast learning neural networks
NASA Technical Reports Server (NTRS)
Dawson, M. S.; Olvera, J.; Fung, A. K.; Manry, M. T.
1992-01-01
A neural network approach to the inversion of surface scattering parameters is presented. Simulated data sets based on a surface scattering model are used so that the data may be viewed as taken from a completely known randomly rough surface. The fast learning (FL) neural network and a multilayer perceptron (MLP) trained with backpropagation learning (BP network) are tested on the simulated backscattering data. The RMS error of training the FL network is found to be less than one half the error of the BP network while requiring one to two orders of magnitude less CPU time. When applied to inversion of parameters from a statistically rough surface, the FL method is successful at recovering the surface permittivity, the surface correlation length, and the RMS surface height in less time and with less error than the BP network. Further applications of the FL neural network to the inversion of parameters from backscatter measurements of an inhomogeneous layer above a half space are shown.
20. View to southeast. Aerial view of bridge in setting; ...
20. View to southeast. Aerial view of bridge in setting; downstream side. (135mm lens) - South Fork Trinity River Bridge, State Highway 299 spanning South Fork Trinity River, Salyer, Trinity County, CA
Pulkkinen, Aki; Cox, Ben T; Arridge, Simon R; Goh, Hwan; Kaipio, Jari P; Tarvainen, Tanja
2016-11-01
Estimation of optical absorption and scattering of a target is an inverse problem associated with quantitative photoacoustic tomography. Conventionally, the problem is expressed as two folded. First, images of initial pressure distribution created by absorption of a light pulse are formed based on acoustic boundary measurements. Then, the optical properties are determined based on these photoacoustic images. The optical stage of the inverse problem can thus suffer from, for example, artefacts caused by the acoustic stage. These could be caused by imperfections in the acoustic measurement setting, of which an example is a limited view acoustic measurement geometry. In this work, the forward model of quantitative photoacoustic tomography is treated as a coupled acoustic and optical model and the inverse problem is solved by using a Bayesian approach. Spatial distribution of the optical properties of the imaged target are estimated directly from the photoacoustic time series in varying acoustic detection and optical illumination configurations. It is numerically demonstrated, that estimation of optical properties of the imaged target is feasible in limited view acoustic detection setting.
Monitoring of the electrical parameters in off-grid solar power system
NASA Astrophysics Data System (ADS)
Idzkowski, Adam; Leoniuk, Katarzyna; Walendziuk, Wojciech
2016-09-01
The aim of this work was to make a monitoring dedicated to an off-grid installation. A laboratory set, which was built for that purpose, was equipped with a PV panel, a battery, a charge controller and a load. Additionally, to monitor electrical parameters from this installation there were used: LabJack module (data acquisition card), measuring module (self-built) and a computer with a program, which allows to measure and present the off-grid installation parameters. The program was made in G language using LabVIEW software. The designed system enables analyzing the currents and voltages of PV panel, battery and load. It makes also possible to visualize them on charts and to make reports from registered data. The monitoring system was also verified by a laboratory test and in real conditions. The results of this verification are also presented.
Herding, minority game, market clearing and efficient markets in a simple spin model framework
NASA Astrophysics Data System (ADS)
Kristoufek, Ladislav; Vosvrda, Miloslav
2018-01-01
We present a novel approach towards the financial Ising model. Most studies utilize the model to find settings which generate returns closely mimicking the financial stylized facts such as fat tails, volatility clustering and persistence, and others. We tackle the model utility from the other side and look for the combination of parameters which yields return dynamics of the efficient market in the view of the efficient market hypothesis. Working with the Ising model, we are able to present nicely interpretable results as the model is based on only two parameters. Apart from showing the results of our simulation study, we offer a new interpretation of the Ising model parameters via inverse temperature and entropy. We show that in fact market frictions (to a certain level) and herding behavior of the market participants do not go against market efficiency but what is more, they are needed for the markets to be efficient.
Applications of a High-Altitude Powered Platform (HAPP)
NASA Technical Reports Server (NTRS)
Kuhner, M. B.; Earhart, R. W.; Madigan, J. A.; Ruck, G. T.
1977-01-01
A list of potential uses for the (HAPP) and conceptual system designs for a small subset of the most promising applications were investigated. The method was to postulate a scenario for each application specifying a user, a set of system requirements and the most likely competitor among conventional aircraft and satellite systems. As part of the study of remote sensing applications, a parametric cost comparison was done between aircraft and HAPPS. For most remote sensing applications, aircraft can supply the same data as HAPPs at substantially lower cost. The critical parameters in determining the relative costs of the two systems are the sensor field of view and the required frequency of the observations being made. The HAPP is only competitive with an airplane when sensors having a very wide field of view are appropriate and when the phenomenon being observed must be viewed at least once per day. This eliminates the majority of remote sensing applications from any further consideration.
NASA Technical Reports Server (NTRS)
Senger, Steven O.
1998-01-01
Volumetric data sets have become common in medicine and many sciences through technologies such as computed x-ray tomography (CT), magnetic resonance (MR), positron emission tomography (PET), confocal microscopy and 3D ultrasound. When presented with 2D images humans immediately and unconsciously begin a visual analysis of the scene. The viewer surveys the scene identifying significant landmarks and building an internal mental model of presented information. The identification of features is strongly influenced by the viewers expectations based upon their expert knowledge of what the image should contain. While not a conscious activity, the viewer makes a series of choices about how to interpret the scene. These choices occur in parallel with viewing the scene and effectively change the way the viewer sees the image. It is this interaction of viewing and choice which is the basis of many familiar visual illusions. This is especially important in the interpretation of medical images where it is the expert knowledge of the radiologist which interprets the image. For 3D data sets this interaction of view and choice is frustrated because choices must precede the visualization of the data set. It is not possible to visualize the data set with out making some initial choices which determine how the volume of data is presented to the eye. These choices include, view point orientation, region identification, color and opacity assignments. Further compounding the problem is the fact that these visualization choices are defined in terms of computer graphics as opposed to language of the experts knowledge. The long term goal of this project is to develop an environment where the user can interact with volumetric data sets using tools which promote the utilization of expert knowledge by incorporating visualization and choice into a tight computational loop. The tools will support activities involving the segmentation of structures, construction of surface meshes and local filtering of the data set. To conform to this environment tools should have several key attributes. First, they should be only rely on computations over a local neighborhood of the probe position. Second, they should operate iteratively over time converging towards a limit behavior. Third, they should adapt to user input modifying they operational parameters with time.
An analytically based numerical method for computing view factors in real urban environments
NASA Astrophysics Data System (ADS)
Lee, Doo-Il; Woo, Ju-Wan; Lee, Sang-Hyun
2018-01-01
A view factor is an important morphological parameter used in parameterizing in-canyon radiative energy exchange process as well as in characterizing local climate over urban environments. For realistic representation of the in-canyon radiative processes, a complete set of view factors at the horizontal and vertical surfaces of urban facets is required. Various analytical and numerical methods have been suggested to determine the view factors for urban environments, but most of the methods provide only sky-view factor at the ground level of a specific location or assume simplified morphology of complex urban environments. In this study, a numerical method that can determine the sky-view factors ( ψ ga and ψ wa ) and wall-view factors ( ψ gw and ψ ww ) at the horizontal and vertical surfaces is presented for application to real urban morphology, which are derived from an analytical formulation of the view factor between two blackbody surfaces of arbitrary geometry. The established numerical method is validated against the analytical sky-view factor estimation for ideal street canyon geometries, showing a consolidate confidence in accuracy with errors of less than 0.2 %. Using a three-dimensional building database, the numerical method is also demonstrated to be applicable in determining the sky-view factors at the horizontal (roofs and roads) and vertical (walls) surfaces in real urban environments. The results suggest that the analytically based numerical method can be used for the radiative process parameterization of urban numerical models as well as for the characterization of local urban climate.
Double Mine Building, general view in setting; view northeast ...
Double Mine Building, general view in setting; view northeast - Fort McKinley, Double Mine Building, East side of East Side Drive, approximately 125 feet south of Weymouth Way, Great Diamond Island, Portland, Cumberland County, ME
Feng, Ssj; Sechopoulos, I
2012-06-01
To develop an objective model of the shape of the compressed breast undergoing mammographic or tomosynthesis acquisition. Automated thresholding and edge detection was performed on 984 anonymized digital mammograms (492 craniocaudal (CC) view mammograms and 492 medial lateral oblique (MLO) view mammograms), to extract the edge of each breast. Principal Component Analysis (PCA) was performed on these edge vectors to identify a limited set of parameters and eigenvectors that. These parameters and eigenvectors comprise a model that can be used to describe the breast shapes present in acquired mammograms and to generate realistic models of breasts undergoing acquisition. Sample breast shapes were then generated from this model and evaluated. The mammograms in the database were previously acquired for a separate study and authorized for use in further research. The PCA successfully identified two principal components and their corresponding eigenvectors, forming the basis for the breast shape model. The simulated breast shapes generated from the model are reasonable approximations of clinically acquired mammograms. Using PCA, we have obtained models of the compressed breast undergoing mammographic or tomosynthesis acquisition based on objective analysis of a large image database. Up to now, the breast in the CC view has been approximated as a semi-circular tube, while there has been no objectively-obtained model for the MLO view breast shape. Such models can be used for various breast imaging research applications, such as x-ray scatter estimation and correction, dosimetry estimates, and computer-aided detection and diagnosis. © 2012 American Association of Physicists in Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kydonieos, M; Folgueras, A; Florescu, L
2016-06-15
Purpose: Elekta recently developed a solution for in-vivo EPID dosimetry (iViewDose, Elekta AB, Stockholm, Sweden) in conjunction with the Netherlands Cancer Institute (NKI). This uses a simplified commissioning approach via Template Commissioning Models (TCMs), consisting of a subset of linac-independent pre-defined parameters. This work compares the performance of iViewDose using a TCM commissioning approach with that corresponding to full commissioning. Additionally, the dose reconstruction based on the simplified commissioning approach is validated via independent dose measurements. Methods: Measurements were performed at the NKI on a VersaHD™ (Elekta AB, Stockholm, Sweden). Treatment plans were generated with Pinnacle 9.8 (Philips Medical Systems,more » Eindhoven, The Netherlands). A farmer chamber dose measurement and two EPID images were used to create a linac-specific commissioning model based on a TCM. A complete set of commissioning measurements was collected and a full commissioning model was created.The performance of iViewDose based on the two commissioning approaches was compared via a series of set-to-work tests in a slab phantom. In these tests, iViewDose reconstructs and compares EPID to TPS dose for square fields, IMRT and VMAT plans via global gamma analysis and isocentre dose difference. A clinical VMAT plan was delivered to a homogeneous Octavius 4D phantom (PTW, Freiburg, Germany). Dose was measured with the Octavius 1500 array and VeriSoft software was used for 3D dose reconstruction. EPID images were acquired. TCM-based iViewDose and 3D Octavius dose distributions were compared against the TPS. Results: For both the TCM-based and the full commissioning approaches, the pass rate, mean γ and dose difference were >97%, <0.5 and <2.5%, respectively. Equivalent gamma analysis results were obtained for iViewDose (TCM approach) and Octavius for a VMAT plan. Conclusion: iViewDose produces similar results with the simplified and full commissioning approaches. Good agreement is obtained between iViewDose (simplified approach) and the independent measurement tool. This research is funded by Elekta Limited.« less
User interface for ground-water modeling: Arcview extension
Tsou, Ming‐shu; Whittemore, Donald O.
2001-01-01
Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. Arc View is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to Arc View that provides additional specialized functions. An Arc View interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The Arc View interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the Arc View interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.
Optimal design and critical analysis of a high-resolution video plenoptic demonstrator
NASA Astrophysics Data System (ADS)
Drazic, Valter; Sacré, Jean-Jacques; Schubert, Arno; Bertrand, Jérôme; Blondé, Etienne
2012-01-01
A plenoptic camera is a natural multiview acquisition device also capable of measuring distances by correlating a set of images acquired under different parallaxes. Its single lens and single sensor architecture have two downsides: limited resolution and limited depth sensitivity. As a first step and in order to circumvent those shortcomings, we investigated how the basic design parameters of a plenoptic camera optimize both the resolution of each view and its depth-measuring capability. In a second step, we built a prototype based on a very high resolution Red One® movie camera with an external plenoptic adapter and a relay lens. The prototype delivered five video views of 820 × 410. The main limitation in our prototype is view crosstalk due to optical aberrations that reduce the depth accuracy performance. We simulated some limiting optical aberrations and predicted their impact on the performance of the camera. In addition, we developed adjustment protocols based on a simple pattern and analysis of programs that investigated the view mapping and amount of parallax crosstalk on the sensor on a pixel basis. The results of these developments enabled us to adjust the lenslet array with a submicrometer precision and to mark the pixels of the sensor where the views do not register properly.
Collaborative sparse priors for multi-view ATR
NASA Astrophysics Data System (ADS)
Li, Xuelu; Monga, Vishal
2018-04-01
Recent work has seen a surge of sparse representation based classification (SRC) methods applied to automatic target recognition problems. While traditional SRC approaches used l0 or l1 norm to quantify sparsity, spike and slab priors have established themselves as the gold standard for providing general tunable sparse structures on vectors. In this work, we employ collaborative spike and slab priors that can be applied to matrices to encourage sparsity for the problem of multi-view ATR. That is, target images captured from multiple views are expanded in terms of a training dictionary multiplied with a coefficient matrix. Ideally, for a test image set comprising of multiple views of a target, coefficients corresponding to its identifying class are expected to be active, while others should be zero, i.e. the coefficient matrix is naturally sparse. We develop a new approach to solve the optimization problem that estimates the sparse coefficient matrix jointly with the sparsity inducing parameters in the collaborative prior. ATR problems are investigated on the mid-wave infrared (MWIR) database made available by the US Army Night Vision and Electronic Sensors Directorate, which has a rich collection of views. Experimental results show that the proposed joint prior and coefficient estimation method (JPCEM) can: 1.) enable improved accuracy when multiple views vs. a single one are invoked, and 2.) outperform state of the art alternatives particularly when training imagery is limited.
Role of stereoscopic imaging in the astronomical study of nearby stars and planetary systems
NASA Astrophysics Data System (ADS)
Mark, David S.; Waste, Corby
1997-05-01
The development of stereoscopic imaging as a 3D spatial mapping tool for planetary science is now beginning to find greater usefulness in the study of stellar atmospheres and planetary systems in general. For the first time, telescopes and accompanying spectrometers have demonstrated the capacity to depict the gyrating motion of nearby stars so precisely as to derive the existence of closely orbiting Jovian-type planets, which are gravitationally influencing the motion of the parent star. Also for the first time, remote space borne telescopes, unhindered by atmospheric effects, are recording and tracking the rotational characteristics of our nearby star, the sun, so accurately as to reveal and identify in great detail the heightened turbulence of the sun's corona. In order to perform new forms of stereo imaging and 3D reconstruction with such large scale objects as stars and planets, within solar systems, a set of geometrical parameters must be observed, and are illustrated here. The behavior of nearby stars can be studied over time using an astrometric approach, making use of the earth's orbital path as a semi- yearly stereo base for the viewing telescope. As is often the case in this method, the resulting stereo angle becomes too narrow to afford a beneficial stereo view, given the star's distance and the general level of detected noise in the signal. With the advent, though, of new earth based and space borne interferometers, operating within various wavelengths including IR, the capability of detecting and assembling the full 3-dimensional axes of motion of nearby gyrating stars can be achieved. In addition, the coupling of large interferometers with combined data sets can provide large stereo bases and low signal noise to produce converging 3- dimensional stereo views of nearby planetary systems. Several groups of new astronomical stereo imaging data sets are presented, including 3D views of the sun taken by the Solar and Heliospheric Observatory, coincident stereo views of the planet Jupiter during impact of comet Shoemaker-Levy 9, taken by the Galileo spacecraft and the Hubble Space Telescope, as well as views of nearby stars. Spatial ambiguities arising in singular 2-dimensional viewpoints are shown to be resolvable in twin perspective, 3-dimensional stereo views. Stereo imaging of this nature, therefore, occupies a complementary role in astronomical observing, provided the proper fields of view correspond with the path of the orbital geometry of the observing telescope.
Reichardt, J; Hess, M; Macke, A
2000-04-20
Multiple-scattering correction factors for cirrus particle extinction coefficients measured with Raman and high spectral resolution lidars are calculated with a radiative-transfer model. Cirrus particle-ensemble phase functions are computed from single-crystal phase functions derived in a geometrical-optics approximation. Seven crystal types are considered. In cirrus clouds with height-independent particle extinction coefficients the general pattern of the multiple-scattering parameters has a steep onset at cloud base with values of 0.5-0.7 followed by a gradual and monotonic decrease to 0.1-0.2 at cloud top. The larger the scattering particles are, the more gradual is the rate of decrease. Multiple-scattering parameters of complex crystals and of imperfect hexagonal columns and plates can be well approximated by those of projected-area equivalent ice spheres, whereas perfect hexagonal crystals show values as much as 70% higher than those of spheres. The dependencies of the multiple-scattering parameters on cirrus particle spectrum, base height, and geometric depth and on the lidar parameters laser wavelength and receiver field of view, are discussed, and a set of multiple-scattering parameter profiles for the correction of extinction measurements in homogeneous cirrus is provided.
MO-DE-207A-11: Sparse-View CT Reconstruction Via a Novel Non-Local Means Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Z; Qi, H; Wu, S
2016-06-15
Purpose: Sparse-view computed tomography (CT) reconstruction is an effective strategy to reduce the radiation dose delivered to patients. Due to its insufficiency of measurements, traditional non-local means (NLM) based reconstruction methods often lead to over-smoothness in image edges. To address this problem, an adaptive NLM reconstruction method based on rotational invariance (RIANLM) is proposed. Methods: The method consists of four steps: 1) Initializing parameters; 2) Algebraic reconstruction technique (ART) reconstruction using raw projection data; 3) Positivity constraint of the image reconstructed by ART; 4) Update reconstructed image by using RIANLM filtering. In RIANLM, a novel similarity metric that is rotationalmore » invariance is proposed and used to calculate the distance between two patches. In this way, any patch with similar structure but different orientation to the reference patch would win a relatively large weight to avoid over-smoothed image. Moreover, the parameter h in RIANLM which controls the decay of the weights is adaptive to avoid over-smoothness, while it in NLM is not adaptive during the whole reconstruction process. The proposed method is named as ART-RIANLM and validated on Shepp-Logan phantom and clinical projection data. Results: In our experiments, the searching neighborhood size is set to 15 by 15 and the similarity window is set to 3 by 3. For the simulated case with a resolution of 256 by 256 Shepp-Logan phantom, the ART-RIANLM produces higher SNR (35.38dB<24.00dB) and lower MAE (0.0006<0.0023) reconstructed image than ART-NLM. The visual inspection demonstrated that the proposed method could suppress artifacts or noises more effectively and preserve image edges better. Similar results were found for clinical data case. Conclusion: A novel ART-RIANLM method for sparse-view CT reconstruction is presented with superior image. Compared to the conventional ART-NLM method, the SNR and MAE from ART-RIANLM increases 47% and decreases 74%, respectively.« less
High-immersion three-dimensional display of the numerical computer model
NASA Astrophysics Data System (ADS)
Xing, Shujun; Yu, Xunbo; Zhao, Tianqi; Cai, Yuanfa; Chen, Duo; Chen, Zhidong; Sang, Xinzhu
2013-08-01
High-immersion three-dimensional (3D) displays making them valuable tools for many applications, such as designing and constructing desired building houses, industrial architecture design, aeronautics, scientific research, entertainment, media advertisement, military areas and so on. However, most technologies provide 3D display in the front of screens which are in parallel with the walls, and the sense of immersion is decreased. To get the right multi-view stereo ground image, cameras' photosensitive surface should be parallax to the public focus plane and the cameras' optical axes should be offset to the center of public focus plane both atvertical direction and horizontal direction. It is very common to use virtual cameras, which is an ideal pinhole camera to display 3D model in computer system. We can use virtual cameras to simulate the shooting method of multi-view ground based stereo image. Here, two virtual shooting methods for ground based high-immersion 3D display are presented. The position of virtual camera is determined by the people's eye position in the real world. When the observer stand in the circumcircle of 3D ground display, offset perspective projection virtual cameras is used. If the observer stands out the circumcircle of 3D ground display, offset perspective projection virtual cameras and the orthogonal projection virtual cameras are adopted. In this paper, we mainly discussed the parameter setting of virtual cameras. The Near Clip Plane parameter setting is the main point in the first method, while the rotation angle of virtual cameras is the main point in the second method. In order to validate the results, we use the D3D and OpenGL to render scenes of different viewpoints and generate a stereoscopic image. A realistic visualization system for 3D models is constructed and demonstrated for viewing horizontally, which provides high-immersion 3D visualization. The displayed 3D scenes are compared with the real objects in the real world.
Simulation and Correction of Triana-Viewed Earth Radiation Budget with ERBE/ISCCP Data
NASA Technical Reports Server (NTRS)
Huang, Jian-Ping; Minnis, Patrick; Doelling, David R.; Valero, Francisco P. J.
2002-01-01
This paper describes the simulation of the earth radiation budget (ERB) as viewed by Triana and the development of correction models for converting Trianaviewed radiances into a complete ERB. A full range of Triana views and global radiation fields are simulated using a combination of datasets from ERBE (Earth Radiation Budget Experiment) and ISCCP (International Satellite Cloud Climatology Project) and analyzed with a set of empirical correction factors specific to the Triana views. The results show that the accuracy of global correction factors to estimate ERB from Triana radiances is a function of the Triana position relative to the Lagrange-1 (L1) or the Sun location. Spectral analysis of the global correction factor indicates that both shortwave (SW; 0.2 - 5.0 microns) and longwave (LW; 5 -50 microns) parameters undergo seasonal and diurnal cycles that dominate the periodic fluctuations. The diurnal cycle, especially its amplitude, is also strongly dependent on the seasonal cycle. Based on these results, models are developed to correct the radiances for unviewed areas and anisotropic emission and reflection. A preliminary assessment indicates that these correction models can be applied to Triana radiances to produce the most accurate global ERB to date.
Ask the experts: past, present and future of the rule of five.
Baell, Jonathan; Congreve, Miles; Leeson, Paul; Abad-Zapatero, Celerino
2013-05-01
Coined in 1997, by Christopher Lipinki et al., the rule of five (Ro5) comprises a set of parameters that determine drug-likeness for oral delivery. The parameters are as follows: no more than five hydrogen bond donors (nitrogen or oxygen atoms with one or more hydrogen atoms); no more than ten hydrogen bond acceptors (nitrogen or oxygen atoms); a molecular mass less than 500 Da; and an octanol-water partition coefficient log P no greater than 5. Future Medicinal Chemistry invited a selection of leading researchers to express their views on Lipinski's Ro5, which has influenced drug design for over a decade. Their enlightening responses provide an insight into the current and future role of Ro5, and other rules of thumb, in the evolving world of medicinal chemistry.
The compression–error trade-off for large gridded data sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silver, Jeremy D.; Zender, Charles S.
The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the non-lossy deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scalemore » and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming) and scalar linear packing in terms of compression ratio, accuracy and speed. When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. Relative performance, in terms of compression and errors, of bit-groomed and layer-packed data were strongly predicted by the entropy of the exponent array, and lossless compression was well predicted by entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. The compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.« less
The compression–error trade-off for large gridded data sets
Silver, Jeremy D.; Zender, Charles S.
2017-01-27
The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the non-lossy deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scalemore » and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming) and scalar linear packing in terms of compression ratio, accuracy and speed. When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. Relative performance, in terms of compression and errors, of bit-groomed and layer-packed data were strongly predicted by the entropy of the exponent array, and lossless compression was well predicted by entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. The compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.« less
Stereoscopic 3D reconstruction using motorized zoom lenses within an embedded system
NASA Astrophysics Data System (ADS)
Liu, Pengcheng; Willis, Andrew; Sui, Yunfeng
2009-02-01
This paper describes a novel embedded system capable of estimating 3D positions of surfaces viewed by a stereoscopic rig consisting of a pair of calibrated cameras. Novel theoretical and technical aspects of the system are tied to two aspects of the design that deviate from typical stereoscopic reconstruction systems: (1) incorporation of an 10x zoom lens (Rainbow- H10x8.5) and (2) implementation of the system on an embedded system. The system components include a DSP running μClinux, an embedded version of the Linux operating system, and an FPGA. The DSP orchestrates data flow within the system and performs complex computational tasks and the FPGA provides an interface to the system devices which consist of a CMOS camera pair and a pair of servo motors which rotate (pan) each camera. Calibration of the camera pair is accomplished using a collection of stereo images that view a common chess board calibration pattern for a set of pre-defined zoom positions. Calibration settings for an arbitrary zoom setting are estimated by interpolation of the camera parameters. A low-computational cost method for dense stereo matching is used to compute depth disparities for the stereo image pairs. Surface reconstruction is accomplished by classical triangulation of the matched points from the depth disparities. This article includes our methods and results for the following problems: (1) automatic computation of the focus and exposure settings for the lens and camera sensor, (2) calibration of the system for various zoom settings and (3) stereo reconstruction results for several free form objects.
Optical tests for using smartphones inside medical devices
NASA Astrophysics Data System (ADS)
Bernat, Amir S.; Acobas, Jennifer K.; Phang, Ye Shang; Hassan, David; Bolton, Frank J.; Levitz, David
2018-02-01
Smartphones are currently used in many medical applications and are more frequently being integrated into medical imaging devices. The regulatory requirements in existence today however, particularly the standardization of smartphone imaging through validation and verification testing, only partially cover imaging characteristics with a smartphone. Specifically, it has been shown that smartphone camera specifications are of sufficient quality for medical imaging, and there are devices which comply with the FDA's regulatory requirements for a medical device such as a device's field of view, direction of viewing and optical resolution and optical distortion. However, these regulatory requirements do not call specifically for color testing. Images of the same object using automatic settings or different light sources can show different color composition. Experimental results showing such differences are presented. Under some circumstances, such differences in color composition could potentially lead to incorrect diagnoses. It is therefore critical to control the smartphone camera and illumination parameters properly. This paper examines different smartphone camera settings that affect image quality and color composition. To test and select the correct settings, a test methodology is proposed. It aims at evaluating and testing image color correctness and white balance settings for mobile phones and LED light sources. Emphasis is placed on color consistency and deviation from gray values, specifically by evaluating the ΔC values based on the CIEL*a*b* color space. Results show that such standardization minimizes differences in color composition and thus could reduce the risk of a wrong diagnosis.
Feature selection and classification of multiparametric medical images using bagging and SVM
NASA Astrophysics Data System (ADS)
Fan, Yong; Resnick, Susan M.; Davatzikos, Christos
2008-03-01
This paper presents a framework for brain classification based on multi-parametric medical images. This method takes advantage of multi-parametric imaging to provide a set of discriminative features for classifier construction by using a regional feature extraction method which takes into account joint correlations among different image parameters; in the experiments herein, MRI and PET images of the brain are used. Support vector machine classifiers are then trained based on the most discriminative features selected from the feature set. To facilitate robust classification and optimal selection of parameters involved in classification, in view of the well-known "curse of dimensionality", base classifiers are constructed in a bagging (bootstrap aggregating) framework for building an ensemble classifier and the classification parameters of these base classifiers are optimized by means of maximizing the area under the ROC (receiver operating characteristic) curve estimated from their prediction performance on left-out samples of bootstrap sampling. This classification system is tested on a sex classification problem, where it yields over 90% classification rates for unseen subjects. The proposed classification method is also compared with other commonly used classification algorithms, with favorable results. These results illustrate that the methods built upon information jointly extracted from multi-parametric images have the potential to perform individual classification with high sensitivity and specificity.
Bayesian LASSO, scale space and decision making in association genetics.
Pasanen, Leena; Holmström, Lasse; Sillanpää, Mikko J
2015-01-01
LASSO is a penalized regression method that facilitates model fitting in situations where there are as many, or even more explanatory variables than observations, and only a few variables are relevant in explaining the data. We focus on the Bayesian version of LASSO and consider four problems that need special attention: (i) controlling false positives, (ii) multiple comparisons, (iii) collinearity among explanatory variables, and (iv) the choice of the tuning parameter that controls the amount of shrinkage and the sparsity of the estimates. The particular application considered is association genetics, where LASSO regression can be used to find links between chromosome locations and phenotypic traits in a biological organism. However, the proposed techniques are relevant also in other contexts where LASSO is used for variable selection. We separate the true associations from false positives using the posterior distribution of the effects (regression coefficients) provided by Bayesian LASSO. We propose to solve the multiple comparisons problem by using simultaneous inference based on the joint posterior distribution of the effects. Bayesian LASSO also tends to distribute an effect among collinear variables, making detection of an association difficult. We propose to solve this problem by considering not only individual effects but also their functionals (i.e. sums and differences). Finally, whereas in Bayesian LASSO the tuning parameter is often regarded as a random variable, we adopt a scale space view and consider a whole range of fixed tuning parameters, instead. The effect estimates and the associated inference are considered for all tuning parameters in the selected range and the results are visualized with color maps that provide useful insights into data and the association problem considered. The methods are illustrated using two sets of artificial data and one real data set, all representing typical settings in association genetics.
NASA Astrophysics Data System (ADS)
Koziel, Slawomir; Bekasiewicz, Adrian
2016-10-01
Multi-objective optimization of antenna structures is a challenging task owing to the high computational cost of evaluating the design objectives as well as the large number of adjustable parameters. Design speed-up can be achieved by means of surrogate-based optimization techniques. In particular, a combination of variable-fidelity electromagnetic (EM) simulations, design space reduction techniques, response surface approximation models and design refinement methods permits identification of the Pareto-optimal set of designs within a reasonable timeframe. Here, a study concerning the scalability of surrogate-assisted multi-objective antenna design is carried out based on a set of benchmark problems, with the dimensionality of the design space ranging from six to 24 and a CPU cost of the EM antenna model from 10 to 20 min per simulation. Numerical results indicate that the computational overhead of the design process increases more or less quadratically with the number of adjustable geometric parameters of the antenna structure at hand, which is a promising result from the point of view of handling even more complex problems.
Effect of field of view and monocular viewing on angular size judgements in an outdoor scene
NASA Technical Reports Server (NTRS)
Denz, E. A.; Palmer, E. A.; Ellis, S. R.
1980-01-01
Observers typically overestimate the angular size of distant objects. Significantly, overestimations are greater in outdoor settings than in aircraft visual-scene simulators. The effect of field of view and monocular and binocular viewing conditions on angular size estimation in an outdoor field was examined. Subjects adjusted the size of a variable triangle to match the angular size of a standard triangle set at three greater distances. Goggles were used to vary the field of view from 11.5 deg to 90 deg for both monocular and binocular viewing. In addition, an unrestricted monocular and binocular viewing condition was used. It is concluded that neither restricted fields of view similar to those present in visual simulators nor the restriction of monocular viewing causes a significant loss in depth perception in outdoor settings. Thus, neither factor should significantly affect the depth realism of visual simulators.
NASA Astrophysics Data System (ADS)
Van doninck, Jasper; Tuomisto, Hanna
2017-06-01
Biodiversity mapping in extensive tropical forest areas poses a major challenge for the interpretation of Landsat images, because floristically clearly distinct forest types may show little difference in reflectance. In such cases, the effects of the bidirectional reflection distribution function (BRDF) can be sufficiently strong to cause erroneous image interpretation and classification. Since the opening of the Landsat archive in 2008, several BRDF normalization methods for Landsat have been developed. The simplest of these consist of an empirical view angle normalization, whereas more complex approaches apply the semi-empirical Ross-Li BRDF model and the MODIS MCD43-series of products to normalize directional Landsat reflectance to standard view and solar angles. Here we quantify the effect of surface anisotropy on Landsat TM/ETM+ images over old-growth Amazonian forests, and evaluate five angular normalization approaches. Even for the narrow swath of the Landsat sensors, we observed directional effects in all spectral bands. Those normalization methods that are based on removing the surface reflectance gradient as observed in each image were adequate to normalize TM/ETM+ imagery to nadir viewing, but were less suitable for multitemporal analysis when the solar vector varied strongly among images. Approaches based on the MODIS BRDF model parameters successfully reduced directional effects in the visible bands, but removed only half of the systematic errors in the infrared bands. The best results were obtained when the semi-empirical BRDF model was calibrated using pairs of Landsat observation. This method produces a single set of BRDF parameters, which can then be used to operationally normalize Landsat TM/ETM+ imagery over Amazonian forests to nadir viewing and a standard solar configuration.
Dairy Analytics and Nutrient Analysis (DANA) Prototype System User Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sam Alessi; Dennis Keiser
2012-10-01
This document is a user manual for the Dairy Analytics and Nutrient Analysis (DANA) model. DANA provides an analysis of dairy anaerobic digestion technology and allows users to calculate biogas production, co-product valuation, capital costs, expenses, revenue and financial metrics, for user customizable scenarios, dairy and digester types. The model provides results for three anaerobic digester types; Covered Lagoons, Modified Plug Flow, and Complete Mix, and three main energy production technologies; electricity generation, renewable natural gas generation, and compressed natural gas generation. Additional options include different dairy types, bedding types, backend treatment type as well as numerous production, and economicmore » parameters. DANA’s goal is to extend the National Market Value of Anaerobic Digester Products analysis (informa economics, 2012; Innovation Center, 2011) to include a greater and more flexible set of regional digester scenarios and to provide a modular framework for creation of a tool to support farmer and investor needs. Users can set up scenarios from combinations of existing parameters or add new parameters, run the model and view a variety of reports, charts and tables that are automatically produced and delivered over the web interface. DANA is based in the INL’s analysis architecture entitled Generalized Environment for Modeling Systems (GEMS) , which offers extensive collaboration, analysis, and integration opportunities and greatly speeds the ability construct highly scalable web delivered user-oriented decision tools. DANA’s approach uses server-based data processing and web-based user interfaces, rather a client-based spreadsheet approach. This offers a number of benefits over the client-based approach. Server processing and storage can scale up to handle a very large number of scenarios, so that analysis of county, even field level, across the whole U.S., can be performed. Server based databases allow dairy and digester parameters be held and managed in a single managed data repository, while allows users to customize standard values and perform individual analysis. Server-based calculations can be easily extended, versions and upgrades managed, and any changes are immediately available to all users. This user manual describes how to use and/or modify input database tables, run DANA, view and modify reports.« less
Spine centerline extraction and efficient spine reading of MRI and CT data
NASA Astrophysics Data System (ADS)
Lorenz, C.; Vogt, N.; Börnert, P.; Brosch, T.
2018-03-01
Radiological assessment of the spine is performed regularly in the context of orthopedics, neurology, oncology, and trauma management. Due to the extension and curved geometry of the spinal column, reading is time-consuming and requires substantial user interaction to navigate through the data during inspection. In this paper a spine geometry guided viewing approach is proposed facilitating reading by reducing the degrees of freedom to be manipulated during inspection of the data. The method is using the spine centerline as a representation of the spine geometry. We assume that renderings most useful for reading are those that can be locally defined based on a rotation and translation relative to the spine centerline. The resulting renderings conserve locally the relation to the spine and lead to curved planar reformats that can be adjusted using a small set of parameters to minimize user interaction. The spine centerline is extracted by an automated image to image foveal fully convolutional neural network (FFCN) based approach. The network consists of three parallel convolutional pathways working on different levels of resolution and processed fields of view. The outputs of the parallel pathways are combined by a subsequent feature integration pathway to yield the (final) centerline probability map, which is converted into a set of spine centerline points. The network has been trained separately using two data set types, one comprising a mixture of T1 and T2 weighted spine MR images and one using CT image data. We achieve an average centerline position error of 1.7 mm for MR and 0.9 mm for CT and a DICE coefficient of 0.84 for MR and 0.95 for CT. Based on the thus obtained centerline viewing and multi-planar reformatting can be easily facilitated.
Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei
2016-01-01
Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing special subject information depends on this extraction. On the basis of WorldView-2 high-resolution data, the optimal segmentation parameters methodof object-oriented image segmentation and high-resolution image information extraction, the following processes were conducted in this study. Firstly, the best combination of the bands and weights was determined for the information extraction of high-resolution remote sensing image. An improved weighted mean-variance method was proposed andused to calculatethe optimal segmentation scale. Thereafter, the best shape factor parameter and compact factor parameters were computed with the use of the control variables and the combination of the heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. Ahierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert input judgment by reproducible quantitative measurements. Furthermore the results of this procedure may be incorporated into a classification scheme. PMID:27362762
Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei
2016-01-01
Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing special subject information depends on this extraction. On the basis of WorldView-2 high-resolution data, the optimal segmentation parameters methodof object-oriented image segmentation and high-resolution image information extraction, the following processes were conducted in this study. Firstly, the best combination of the bands and weights was determined for the information extraction of high-resolution remote sensing image. An improved weighted mean-variance method was proposed andused to calculatethe optimal segmentation scale. Thereafter, the best shape factor parameter and compact factor parameters were computed with the use of the control variables and the combination of the heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. Ahierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert input judgment by reproducible quantitative measurements. Furthermore the results of this procedure may be incorporated into a classification scheme.
Yamashita, Wakayo; Wang, Gang; Tanaka, Keiji
2010-01-01
One usually fails to recognize an unfamiliar object across changes in viewing angle when it has to be discriminated from similar distractor objects. Previous work has demonstrated that after long-term experience in discriminating among a set of objects seen from the same viewing angle, immediate recognition of the objects across 30-60 degrees changes in viewing angle becomes possible. The capability for view-invariant object recognition should develop during the within-viewing-angle discrimination, which includes two kinds of experience: seeing individual views and discriminating among the objects. The aim of the present study was to determine the relative contribution of each factor to the development of view-invariant object recognition capability. Monkeys were first extensively trained in a task that required view-invariant object recognition (Object task) with several sets of objects. The animals were then exposed to a new set of objects over 26 days in one of two preparatory tasks: one in which each object view was seen individually, and a second that required discrimination among the objects at each of four viewing angles. After the preparatory period, we measured the monkeys' ability to recognize the objects across changes in viewing angle, by introducing the object set to the Object task. Results indicated significant view-invariant recognition after the second but not first preparatory task. These results suggest that discrimination of objects from distractors at each of several viewing angles is required for the development of view-invariant recognition of the objects when the distractors are similar to the objects.
A New Look at the Eclipse Timing Variation Diagram Analysis of Selected 3-body W UMa Systems
NASA Astrophysics Data System (ADS)
Christopoulou, P.-E.; Papageorgiou, A.
2015-07-01
The light travel effect produced by the presence of tertiary components can reveal much about the origin and evolution of over-contact binaries. Monitoring of W UMa systems over the last decade and/or the use of publicly available photometric surveys (NSVS, ASAS, etc.) has uncovered or suggested the presence of many unseen companions, which calls for an in-depth investigation of the parameters derived from cyclic period variations in order to confirm or reject the assumption of hidden companion(s). Progress in the analysis of eclipse timing variations is summarized here both from the empirical and the theoretical points of view, and a more extensive investigation of the proposed orbital parameters of third bodies is proposed. The code we have developed for this, implemented in Python, is set up to handle heuristic scanning with parameter perturbation in parameter space, and to establish realistic uncertainties from the least squares fitting. A computational example is given for TZ Boo, a W UMa system with a spectroscopically detected third component. Future options to be implemented include MCMC and bootstrapping.
NASA Astrophysics Data System (ADS)
Fritz, Andreas; Enßle, Fabian; Zhang, Xiaoli; Koch, Barbara
2016-08-01
The present study analyses the two earth observation sensors regarding their capability of modelling forest above ground biomass and forest density. Our research is carried out at two different demonstration sites. The first is located in south-western Germany (region Karlsruhe) and the second is located in southern China in Jiangle County (Province Fujian). A set of spectral and spatial predictors are computed from both, Sentinel-2A and WorldView-2 data. Window sizes in the range of 3*3 pixels to 21*21 pixels are computed in order to cover the full range of the canopy sizes of mature forest stands. Textural predictors of first and second order (grey-level-co-occurrence matrix) are calculated and are further used within a feature selection procedure. Additionally common spectral predictors from WorldView-2 and Sentinel-2A data such as all relevant spectral bands and NDVI are integrated in the analyses. To examine the most important predictors, a predictor selection algorithm is applied to the data, whereas the entire predictor set of more than 1000 predictors is used to find most important ones. Out of the original set only the most important predictors are then further analysed. Predictor selection is done with the Boruta package in R (Kursa and Rudnicki (2010)), whereas regression is computed with random forest. Prior the classification and regression a tuning of parameters is done by a repetitive model selection (100 runs), based on the .632 bootstrapping. Both are implemented in the caret R pack- age (Kuhn et al. (2016)). To account for the variability in the data set 100 independent runs are performed. Within each run 80 percent of the data is used for training and the 20 percent are used for an independent validation. With the subset of original predictors mapping of above ground biomass is performed.
Green, R.O.; Pieters, C.; Mouroulis, P.; Eastwood, M.; Boardman, J.; Glavich, T.; Isaacson, P.; Annadurai, M.; Besse, S.; Barr, D.; Buratti, B.; Cate, D.; Chatterjee, A.; Clark, R.; Cheek, L.; Combe, J.; Dhingra, D.; Essandoh, V.; Geier, S.; Goswami, J.N.; Green, R.; Haemmerle, V.; Head, J.; Hovland, L.; Hyman, S.; Klima, R.; Koch, T.; Kramer, G.; Kumar, A.S.K.; Lee, Kenneth; Lundeen, S.; Malaret, E.; McCord, T.; McLaughlin, S.; Mustard, J.; Nettles, J.; Petro, N.; Plourde, K.; Racho, C.; Rodriquez, J.; Runyon, C.; Sellar, G.; Smith, C.; Sobel, H.; Staid, M.; Sunshine, J.; Taylor, L.; Thaisen, K.; Tompkins, S.; Tseng, H.; Vane, G.; Varanasi, P.; White, M.; Wilson, D.
2011-01-01
The NASA Discovery Moon Mineralogy Mapper imaging spectrometer was selected to pursue a wide range of science objectives requiring measurement of composition at fine spatial scales over the full lunar surface. To pursue these objectives, a broad spectral range imaging spectrometer with high uniformity and high signal-to-noise ratio capable of measuring compositionally diagnostic spectral absorption features from a wide variety of known and possible lunar materials was required. For this purpose the Moon Mineralogy Mapper imaging spectrometer was designed and developed that measures the spectral range from 430 to 3000 nm with 10 nm spectral sampling through a 24 degree field of view with 0.7 milliradian spatial sampling. The instrument has a signal-to-noise ratio of greater than 400 for the specified equatorial reference radiance and greater than 100 for the polar reference radiance. The spectral cross-track uniformity is >90% and spectral instantaneous field-of-view uniformity is >90%. The Moon Mineralogy Mapper was launched on Chandrayaan-1 on the 22nd of October. On the 18th of November 2008 the Moon Mineralogy Mapper was turned on and collected a first light data set within 24 h. During this early checkout period and throughout the mission the spacecraft thermal environment and orbital parameters varied more than expected and placed operational and data quality constraints on the measurements. On the 29th of August 2009, spacecraft communication was lost. Over the course of the flight mission 1542 downlinked data sets were acquired that provide coverage of more than 95% of the lunar surface. An end-to-end science data calibration system was developed and all measurements have been passed through this system and delivered to the Planetary Data System (PDS.NASA.GOV). An extensive effort has been undertaken by the science team to validate the Moon Mineralogy Mapper science measurements in the context of the mission objectives. A focused spectral, radiometric, spatial, and uniformity validation effort has been pursued with selected data sets including an Earth-view data set. With this effort an initial validation of the on-orbit performance of the imaging spectrometer has been achieved, including validation of the cross-track spectral uniformity and spectral instantaneous field of view uniformity. The Moon Mineralogy Mapper is the first imaging spectrometer to measure a data set of this kind at the Moon. These calibrated science measurements are being used to address the full set of science goals and objectives for this mission. Copyright 2011 by the American Geophysical Union.
C. Pieters,; P. Mouroulis,; M. Eastwood,; J. Boardman,; Green, R.O.; Glavich, T.; Isaacson, P.; Annadurai, M.; Besse, S.; Cate, D.; Chatterjee, A.; Clark, R.; Barr, D.; Cheek, L.; Combe, J.; Dhingra, D.; Essandoh, V.; Geier, S.; Goswami, J.N.; Green, R.; Haemmerle, V.; Head, J.; Hovland, L.; Hyman, S.; Klima, R.; Koch, T.; Kramer, G.; Kumar, A.S.K.; Lee, K.; Lundeen, S.; Malaret, E.; McCord, T.; McLaughlin, S.; Mustard, J.; Nettles, J.; Petro, N.; Plourde, K.; Racho, C.; Rodriguez, J.; Runyon, C.; Sellar, G.; Smith, C.; Sobel, H.; Staid, M.; Sunshine, J.; Taylor, L.; Thaisen, K.; Tompkins, S.; Tseng, H.; Vane, G.; Varanasi, P.; White, M.; Wilson, D.
2011-01-01
The NASA Discovery Moon Mineralogy Mapper imaging spectrometer was selected to pursue a wide range of science objectives requiring measurement of composition at fine spatial scales over the full lunar surface. To pursue these objectives, a broad spectral range imaging spectrometer with high uniformity and high signal-to-noise ratio capable of measuring compositionally diagnostic spectral absorption features from a wide variety of known and possible lunar materials was required. For this purpose the Moon Mineralogy Mapper imaging spectrometer was designed and developed that measures the spectral range from 430 to 3000 nm with 10 nm spectral sampling through a 24 degree field of view with 0.7 milliradian spatial sampling. The instrument has a signal-to-noise ratio of greater than 400 for the specified equatorial reference radiance and greater than 100 for the polar reference radiance. The spectral cross-track uniformity is >90% and spectral instantaneous field-of-view uniformity is >90%. The Moon Mineralogy Mapper was launched on Chandrayaan-1 on the 22nd of October. On the 18th of November 2008 the Moon Mineralogy Mapper was turned on and collected a first light data set within 24 h. During this early checkout period and throughout the mission the spacecraft thermal environment and orbital parameters varied more than expected and placed operational and data quality constraints on the measurements. On the 29th of August 2009, spacecraft communication was lost. Over the course of the flight mission 1542 downlinked data sets were acquired that provide coverage of more than 95% of the lunar surface. An end-to-end science data calibration system was developed and all measurements have been passed through this system and delivered to the Planetary Data System (PDS.NASA.GOV). An extensive effort has been undertaken by the science team to validate the Moon Mineralogy Mapper science measurements in the context of the mission objectives. A focused spectral, radiometric, spatial, and uniformity validation effort has been pursued with selected data sets including an Earth-view data set. With this effort an initial validation of the on-orbit performance of the imaging spectrometer has been achieved, including validation of the cross-track spectral uniformity and spectral instantaneous field of view uniformity. The Moon Mineralogy Mapper is the first imaging spectrometer to measure a data set of this kind at the Moon. These calibrated science measurements are being used to address the full set of science goals and objectives for this mission.
2. GENERAL VIEW OF SETTING OF TANK T0014 (ON LEFT). ...
2. GENERAL VIEW OF SETTING OF TANK T0014 (ON LEFT). TANK T0015 IS ON RIGHT. VIEW TO SOUTHWEST. - Rocky Mountain Arsenal, Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
2. GENERAL VIEW OF SETTING OF TANK T1060 (ON RIGHT). ...
2. GENERAL VIEW OF SETTING OF TANK T1060 (ON RIGHT). TANK T0161 IS ON LEFT. VIEW TO SOUTHWEST. - Rocky Mountain Arsenal, Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
2. GENERAL VIEW OF SETTING OF TANK T0164 (ON LEFT). ...
2. GENERAL VIEW OF SETTING OF TANK T0164 (ON LEFT). TANK T0165 IS ON RIGHT. VIEW TO SOUTHWEST. - Rocky Mountain Arsenal, Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
1. GENERAL VIEW OF SETTING OF TANK T0164 (ON RIGHT). ...
1. GENERAL VIEW OF SETTING OF TANK T0164 (ON RIGHT). TANK T0165 IS ON LEFT. VIEW TO NORTHEAST. - Rocky Mountain Arsenal, Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
1. GENERAL VIEW OF SETTING OF TANK T0014 (ON LEFT). ...
1. GENERAL VIEW OF SETTING OF TANK T0014 (ON LEFT). TANK T0015 IS ON RIGHT. VIEW TO NORTHWEST. - Rocky Mountain Arsenal, Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
1. GENERAL VIEW OF SETTING OF TANK T0160 (ON LEFT). ...
1. GENERAL VIEW OF SETTING OF TANK T0160 (ON LEFT). TANK T0161 IS ON RIGHT. VIEW TO NORTHWEST. - Rocky Mountain Arsenal, Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
3. GENERAL VIEW OF SETTING OF TANK 0745C (ON LEFT). ...
3. GENERAL VIEW OF SETTING OF TANK 0745C (ON LEFT). TANK 0745B IS ON RIGHT. VIEW TO SOUTHWEST. - Rocky Mountain Arsenal, Gasoline Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
Precise interferometric tracking of the DSCS II geosynchronous orbiter
NASA Astrophysics Data System (ADS)
Border, J. S.; Donivan, F. F., Jr.; Shiomi, T.; Kawano, N.
1986-01-01
A demonstration of the precise tracking of a geosynchronous orbiter by radio metric techniques based on very-long-baseline interferometry (VLBI) has been jointly conducted by the Jet Propulsion Laboratory and Japan's Radio Research Laboratory. Simultaneous observations of a U.S. Air Force communications satellite from tracking stations in California, Australia, and Japan have determined the satellite's position with an accuracy of a few meters. Accuracy claims are based on formal statistics, which include the effects of errors in non-estimated parameters and which are supported by a chi-squared of less than one, and on the consistency of orbit solutions from disjoint data sets. A study made to assess the impact of shorter baselines and reduced data noise concludes that with a properly designed system, similar accuracy could be obtained for either a satellite viewed from stations located within the continental U.S. or for a satellite viewed from stations within Japanese territory.
Managing Spatial Selections With Contextual Snapshots
Mindek, P; Gröller, M E; Bruckner, S
2014-01-01
Spatial selections are a ubiquitous concept in visualization. By localizing particular features, they can be analysed and compared in different views. However, the semantics of such selections often depend on specific parameter settings and it can be difficult to reconstruct them without additional information. In this paper, we present the concept of contextual snapshots as an effective means for managing spatial selections in visualized data. The selections are automatically associated with the context in which they have been created. Contextual snapshots can also be used as the basis for interactive integrated and linked views, which enable in-place investigation and comparison of multiple visual representations of data. Our approach is implemented as a flexible toolkit with well-defined interfaces for integration into existing systems. We demonstrate the power and generality of our techniques by applying them to several distinct scenarios such as the visualization of simulation data, the analysis of historical documents and the display of anatomical data. PMID:25821284
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chremos, Alexandros, E-mail: achremos@imperial.ac.uk; Nikoubashman, Arash, E-mail: arashn@princeton.edu; Panagiotopoulos, Athanassios Z.
In this contribution, we develop a coarse-graining methodology for mapping specific block copolymer systems to bead-spring particle-based models. We map the constituent Kuhn segments to Lennard-Jones particles, and establish a semi-empirical correlation between the experimentally determined Flory-Huggins parameter χ and the interaction of the model potential. For these purposes, we have performed an extensive set of isobaric–isothermal Monte Carlo simulations of binary mixtures of Lennard-Jones particles with the same size but with asymmetric energetic parameters. The phase behavior of these monomeric mixtures is then extended to chains with finite sizes through theoretical considerations. Such a top-down coarse-graining approach is importantmore » from a computational point of view, since many characteristic features of block copolymer systems are on time and length scales which are still inaccessible through fully atomistic simulations. We demonstrate the applicability of our method for generating parameters by reproducing the morphology diagram of a specific diblock copolymer, namely, poly(styrene-b-methyl methacrylate), which has been extensively studied in experiments.« less
van de Geijn, J; Fraass, B A
1984-01-01
The net fractional depth dose (NFD) is defined as the fractional depth dose (FDD) corrected for inverse square law. Analysis of its behavior as a function of depth, field size, and source-surface distance has led to an analytical description with only seven model parameters related to straightforward physical properties. The determination of the characteristic parameter values requires only seven experimentally determined FDDs. The validity of the description has been tested for beam qualities ranging from 60Co gamma rays to 18-MV x rays, using published data from several different sources as well as locally measured data sets. The small number of model parameters is attractive for computer or hand-held calculator applications. The small amount of required measured data is important in view of practical data acquisition for implementation of a computer-based dose calculation system. The generating function allows easy and accurate generation of FDD, tissue-air ratio, tissue-maximum ratio, and tissue-phantom ratio tables.
Net fractional depth dose: a basis for a unified analytical description of FDD, TAR, TMR, and TPR
DOE Office of Scientific and Technical Information (OSTI.GOV)
van de Geijn, J.; Fraass, B.A.
The net fractional depth dose (NFD) is defined as the fractional depth dose (FDD) corrected for inverse square law. Analysis of its behavior as a function of depth, field size, and source-surface distance has led to an analytical description with only seven model parameters related to straightforward physical properties. The determination of the characteristic parameter values requires only seven experimentally determined FDDs. The validity of the description has been tested for beam qualities ranging from /sup 60/Co gamma rays to 18-MV x rays, using published data from several different sources as well as locally measured data sets. The small numbermore » of model parameters is attractive for computer or hand-held calculator applications. The small amount of required measured data is important in view of practical data acquisition for implementation of a computer-based dose calculation system. The generating function allows easy and accurate generation of FDD, tissue-air ratio, tissue-maximum ratio, and tissue-phantom ratio tables.« less
Meteorological Station, general view in setting showing west and north ...
Meteorological Station, general view in setting showing west and north sides; view to southeast - Fort McKinley, Meteorological Station, East side of Weymouth Way, approximately 225 feet south of Cove Side Drive, Great Diamond Island, Portland, Cumberland County, ME
Computer-aided detection of breast masses: Four-view strategy for screening mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei Jun; Chan Heangping; Zhou Chuan
2011-04-15
Purpose: To improve the performance of a computer-aided detection (CAD) system for mass detection by using four-view information in screening mammography. Methods: The authors developed a four-view CAD system that emulates radiologists' reading by using the craniocaudal and mediolateral oblique views of the ipsilateral breast to reduce false positives (FPs) and the corresponding views of the contralateral breast to detect asymmetry. The CAD system consists of four major components: (1) Initial detection of breast masses on individual views, (2) information fusion of the ipsilateral views of the breast (referred to as two-view analysis), (3) information fusion of the corresponding viewsmore » of the contralateral breast (referred to as bilateral analysis), and (4) fusion of the four-view information with a decision tree. The authors collected two data sets for training and testing of the CAD system: A mass set containing 389 patients with 389 biopsy-proven masses and a normal set containing 200 normal subjects. All cases had four-view mammograms. The true locations of the masses on the mammograms were identified by an experienced MQSA radiologist. The authors randomly divided the mass set into two independent sets for cross validation training and testing. The overall test performance was assessed by averaging the free response receiver operating characteristic (FROC) curves of the two test subsets. The FP rates during the FROC analysis were estimated by using the normal set only. The jackknife free-response ROC (JAFROC) method was used to estimate the statistical significance of the difference between the test FROC curves obtained with the single-view and the four-view CAD systems. Results: Using the single-view CAD system, the breast-based test sensitivities were 58% and 77% at the FP rates of 0.5 and 1.0 per image, respectively. With the four-view CAD system, the breast-based test sensitivities were improved to 76% and 87% at the corresponding FP rates, respectively. The improvement was found to be statistically significant (p<0.0001) by JAFROC analysis. Conclusions: The four-view information fusion approach that emulates radiologists' reading strategy significantly improves the performance of breast mass detection of the CAD system in comparison with the single-view approach.« less
Climbing fibers predict movement kinematics and performance errors.
Streng, Martha L; Popa, Laurentiu S; Ebner, Timothy J
2017-09-01
Requisite for understanding cerebellar function is a complete characterization of the signals provided by complex spike (CS) discharge of Purkinje cells, the output neurons of the cerebellar cortex. Numerous studies have provided insights into CS function, with the most predominant view being that they are evoked by error events. However, several reports suggest that CSs encode other aspects of movements and do not always respond to errors or unexpected perturbations. Here, we evaluated CS firing during a pseudo-random manual tracking task in the monkey ( Macaca mulatta ). This task provides extensive coverage of the work space and relative independence of movement parameters, delivering a robust data set to assess the signals that activate climbing fibers. Using reverse correlation, we determined feedforward and feedback CSs firing probability maps with position, velocity, and acceleration, as well as position error, a measure of tracking performance. The direction and magnitude of the CS modulation were quantified using linear regression analysis. The major findings are that CSs significantly encode all three kinematic parameters and position error, with acceleration modulation particularly common. The modulation is not related to "events," either for position error or kinematics. Instead, CSs are spatially tuned and provide a linear representation of each parameter evaluated. The CS modulation is largely predictive. Similar analyses show that the simple spike firing is modulated by the same parameters as the CSs. Therefore, CSs carry a broader array of signals than previously described and argue for climbing fiber input having a prominent role in online motor control. NEW & NOTEWORTHY This article demonstrates that complex spike (CS) discharge of cerebellar Purkinje cells encodes multiple parameters of movement, including motor errors and kinematics. The CS firing is not driven by error or kinematic events; instead it provides a linear representation of each parameter. In contrast with the view that CSs carry feedback signals, the CSs are predominantly predictive of upcoming position errors and kinematics. Therefore, climbing fibers carry multiple and predictive signals for online motor control. Copyright © 2017 the American Physiological Society.
Molecular profiles to biology and pathways: a systems biology approach.
Van Laere, Steven; Dirix, Luc; Vermeulen, Peter
2016-06-16
Interpreting molecular profiles in a biological context requires specialized analysis strategies. Initially, lists of relevant genes were screened to identify enriched concepts associated with pathways or specific molecular processes. However, the shortcoming of interpreting gene lists by using predefined sets of genes has resulted in the development of novel methods that heavily rely on network-based concepts. These algorithms have the advantage that they allow a more holistic view of the signaling properties of the condition under study as well as that they are suitable for integrating different data types like gene expression, gene mutation, and even histological parameters.
Military display performance parameters
NASA Astrophysics Data System (ADS)
Desjardins, Daniel D.; Meyer, Frederick
2012-06-01
The military display market is analyzed in terms of four of its segments: avionics, vetronics, dismounted soldier, and command and control. Requirements are summarized for a number of technology-driving parameters, to include luminance, night vision imaging system compatibility, gray levels, resolution, dimming range, viewing angle, video capability, altitude, temperature, shock and vibration, etc., for direct-view and virtual-view displays in cockpits and crew stations. Technical specifications are discussed for selected programs.
Okamura, Jun-ya; Yamaguchi, Reona; Honda, Kazunari; Tanaka, Keiji
2014-01-01
One fails to recognize an unfamiliar object across changes in viewing angle when it must be discriminated from similar distractor objects. View-invariant recognition gradually develops as the viewer repeatedly sees the objects in rotation. It is assumed that different views of each object are associated with one another while their successive appearance is experienced in rotation. However, natural experience of objects also contains ample opportunities to discriminate among objects at each of the multiple viewing angles. Our previous behavioral experiments showed that after experiencing a new set of object stimuli during a task that required only discrimination at each of four viewing angles at 30° intervals, monkeys could recognize the objects across changes in viewing angle up to 60°. By recording activities of neurons from the inferotemporal cortex after various types of preparatory experience, we here found a possible neural substrate for the monkeys' performance. For object sets that the monkeys had experienced during the task that required only discrimination at each of four viewing angles, many inferotemporal neurons showed object selectivity covering multiple views. The degree of view generalization found for these object sets was similar to that found for stimulus sets with which the monkeys had been trained to conduct view-invariant recognition. These results suggest that the experience of discriminating new objects in each of several viewing angles develops the partially view-generalized object selectivity distributed over many neurons in the inferotemporal cortex, which in turn bases the monkeys' emergent capability to discriminate the objects across changes in viewing angle. PMID:25378169
A new NASA/MSFC mission analysis global cloud cover data base
NASA Technical Reports Server (NTRS)
Brown, S. C.; Jeffries, W. R., III
1985-01-01
A global cloud cover data set, derived from the USAF 3D NEPH Analysis, was developed for use in climate studies and for Earth viewing applications. This data set contains a single parameter - total sky cover - separated in time by 3 or 6 hr intervals and in space by approximately 50 n.mi. Cloud cover amount is recorded for each grid point (of a square grid) by a single alphanumeric character representing each 5 percent increment of sky cover. The data are arranged in both quarterly and monthly formats. The data base currently provides daily, 3-hr observed total sky cover for the Northern Hemisphere from 1972 through 1977 less 1976. For the Southern Hemisphere, there are data at 6-hr intervals for 1976 through 1978 and at 3-hr intervals for 1979 and 1980. More years of data are being added. To validate the data base, the percent frequency of or = 0.3 and or = 0.8 cloud cover was compared with ground observed cloud amounts at several locations with generally good agreement. Mean or other desired cloud amounts can be calculated for any time period and any size area from a single grid point to a hemisphere. The data base is especially useful in evaluating the consequence of cloud cover on Earth viewing space missions. The temporal and spatial frequency of the data allow simulations that closely approximate any projected viewing mission. No adjustments are required to account for cloud continuity.
2. GENERAL VIEW OF SETTING OF TANK T0075 (ON RIGHT). ...
2. GENERAL VIEW OF SETTING OF TANK T0075 (ON RIGHT). TANKS T0076, T0077, AND T0078 ARE ON LEFT. VIEW TO SOUTHWEST. - Rocky Mountain Arsenal, Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
1. GENERAL VIEW OF SETTING OF TANK T0075 (ON LEFT). ...
1. GENERAL VIEW OF SETTING OF TANK T0075 (ON LEFT). TANKS T0076, T0077, AND T0078 ARE ON RIGHT. VIEW TO NORTHEAST. - Rocky Mountain Arsenal, Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
1. GENERAL VIEW OF SETTING OF TANK 0745C (ON RIGHT). ...
1. GENERAL VIEW OF SETTING OF TANK 0745C (ON RIGHT). TANKS 0745A, AND O745B ARE ON LEFT. VIEW TO NORTHEAST. - Rocky Mountain Arsenal, Gasoline Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
2. GENERAL VIEW OF SETTING OF TANK 0745C (ON LEFT). ...
2. GENERAL VIEW OF SETTING OF TANK 0745C (ON LEFT). TANKS 0745A AND 0745B ARE ON RIGHT. VIEW TO NORTHWEST. - Rocky Mountain Arsenal, Gasoline Storage Tank, December Seventh Avenue & D Street, Commerce City, Adams County, CO
GEMAS: Unmixing magnetic properties of European agricultural soil
NASA Astrophysics Data System (ADS)
Fabian, Karl; Reimann, Clemens; Kuzina, Dilyara; Kosareva, Lina; Fattakhova, Leysan; Nurgaliev, Danis
2016-04-01
High resolution magnetic measurements provide new methods for world-wide characterization and monitoring of agricultural soil which is essential for quantifying geologic and human impact on the critical zone environment and consequences of climatic change, for planning economic and ecological land use, and for forensic applications. Hysteresis measurements of all Ap samples from the GEMAS survey yield a comprehensive overview of mineral magnetic properties in European agricultural soil on a continental scale. Low (460 Hz), and high frequency (4600 Hz) magnetic susceptibility k were measured using a Bartington MS2B sensor. Hysteresis properties were determined by a J-coercivity spectrometer, built at the paleomagnetic laboratory of Kazan University, providing for each sample a modified hysteresis loop, backfield curve, acquisition curve of isothermal remanent magnetization, and a viscous IRM decay spectrum. Each measurement set is obtained in a single run from zero field up to 1.5 T and back to -1.5 T. The resulting data are used to create the first continental-scale maps of magnetic soil parameters. Because the GEMAS geochemical atlas contains a comprehensive set of geochemical data for the same soil samples, the new data can be used to map magnetic parameters in relation to chemical and geological parameters. The data set also provides a unique opportunity to analyze the magnetic mineral fraction of the soil samples by unmixing their IRM acquisition curves. The endmember coefficients are interpreted by linear inversion for other magnetic, physical and chemical properties which results in an unprecedented and detailed view of the mineral magnetic composition of European agricultural soils.
Facial recognition using simulated prosthetic pixelized vision.
Thompson, Robert W; Barnett, G David; Humayun, Mark S; Dagnelie, Gislin
2003-11-01
To evaluate a model of simulated pixelized prosthetic vision using noncontiguous circular phosphenes, to test the effects of phosphene and grid parameters on facial recognition. A video headset was used to view a reference set of four faces, followed by a partially averted image of one of those faces viewed through a square pixelizing grid that contained 10x10 to 32x32 dots separated by gaps. The grid size, dot size, gap width, dot dropout rate, and gray-scale resolution were varied separately about a standard test condition, for a total of 16 conditions. All tests were first performed at 99% contrast and then repeated at 12.5% contrast. Discrimination speed and performance were influenced by all stimulus parameters. The subjects achieved highly significant facial recognition accuracy for all high-contrast tests except for grids with 70% random dot dropout and two gray levels. In low-contrast tests, significant facial recognition accuracy was achieved for all but the most adverse grid parameters: total grid area less than 17% of the target image, 70% dropout, four or fewer gray levels, and a gap of 40.5 arcmin. For difficult test conditions, a pronounced learning effect was noticed during high-contrast trials, and a more subtle practice effect on timing was evident during subsequent low-contrast trials. These findings suggest that reliable face recognition with crude pixelized grids can be learned and may be possible, even with a crude visual prosthesis.
Corredor, Germán; Whitney, Jon; Arias, Viviana; Madabhushi, Anant; Romero, Eduardo
2017-01-01
Abstract. Computational histomorphometric approaches typically use low-level image features for building machine learning classifiers. However, these approaches usually ignore high-level expert knowledge. A computational model (M_im) combines low-, mid-, and high-level image information to predict the likelihood of cancer in whole slide images. Handcrafted low- and mid-level features are computed from area, color, and spatial nuclei distributions. High-level information is implicitly captured from the recorded navigations of pathologists while exploring whole slide images during diagnostic tasks. This model was validated by predicting the presence of cancer in a set of unseen fields of view. The available database was composed of 24 cases of basal-cell carcinoma, from which 17 served to estimate the model parameters and the remaining 7 comprised the evaluation set. A total of 274 fields of view of size 1024×1024 pixels were extracted from the evaluation set. Then 176 patches from this set were used to train a support vector machine classifier to predict the presence of cancer on a patch-by-patch basis while the remaining 98 image patches were used for independent testing, ensuring that the training and test sets do not comprise patches from the same patient. A baseline model (M_ex) estimated the cancer likelihood for each of the image patches. M_ex uses the same visual features as M_im, but its weights are estimated from nuclei manually labeled as cancerous or noncancerous by a pathologist. M_im achieved an accuracy of 74.49% and an F-measure of 80.31%, while M_ex yielded corresponding accuracy and F-measures of 73.47% and 77.97%, respectively. PMID:28382314
NASA Astrophysics Data System (ADS)
Lee, Harim; Moon, Y.-J.; Na, Hyeonock; Jang, Soojeong; Lee, Jae-Ok
2015-12-01
To prepare for when only single-view observations are available, we have made a test whether the 3-D parameters (radial velocity, angular width, and source location) of halo coronal mass ejections (HCMEs) from single-view observations are consistent with those from multiview observations. For this test, we select 44 HCMEs from December 2010 to June 2011 with the following conditions: partial and full HCMEs by SOHO and limb CMEs by twin STEREO spacecraft when they were approximately in quadrature. In this study, we compare the 3-D parameters of the HCMEs from three different methods: (1) a geometrical triangulation method, the STEREO CAT tool developed by NASA/CCMC, for multiview observations using STEREO/SECCHI and SOHO/LASCO data, (2) the graduated cylindrical shell (GCS) flux rope model for multiview observations using STEREO/SECCHI data, and (3) an ice cream cone model for single-view observations using SOHO/LASCO data. We find that the radial velocities and the source locations of the HCMEs from three methods are well consistent with one another with high correlation coefficients (≥0.9). However, the angular widths by the ice cream cone model are noticeably underestimated for broad CMEs larger than 100° and several partial HCMEs. A comparison between the 3-D CME parameters directly measured from twin STEREO spacecraft and the above 3-D parameters shows that the parameters from multiview are more consistent with the STEREO measurements than those from single view.
Aerosol particle size distribution in the stratosphere retrieved from SCIAMACHY limb measurements
NASA Astrophysics Data System (ADS)
Malinina, Elizaveta; Rozanov, Alexei; Rozanov, Vladimir; Liebing, Patricia; Bovensmann, Heinrich; Burrows, John P.
2018-04-01
South Fork Telephone Switchboard Building, general view in setting showing ...
South Fork Telephone Switchboard Building, general view in setting showing (N) side; view (S) - Fort McKinley, South Fork Telephone Switchboard Building, South side of Weymouth Way, approximately 100 feet west of East Side Drive, Great Diamond Island, Portland, Cumberland County, ME
Adaptive Phase Delay Generator
NASA Technical Reports Server (NTRS)
Greer, Lawrence
2013-01-01
There are several experimental setups involving rotating machinery that require some form of synchronization. The adaptive phase delay generator (APDG) the Bencic-1000 is a flexible instrument that allows the user to generate pulses synchronized to the rising edge of a tachometer signal from any piece of rotating machinery. These synchronized pulses can vary by the delay angle, pulse width, number of pulses per period, number of skipped pulses, and total number of pulses. Due to the design of the pulse generator, any and all of these parameters can be changed independently, yielding an unparalleled level of versatility. There are two user interfaces to the APDG. The first is a LabVIEW program that has the advantage of displaying all of the pulse parameters and input signal data within one neatly organized window on the PC monitor. Furthermore, the LabVIEW interface plots the rpm of the two input signal channels in real time. The second user interface is a handheld portable device that goes anywhere a computer is not accessible. It consists of a liquid-crystal display and keypad, which enable the user to control the unit by scrolling through a host of command menus and parameter listings. The APDG combines all of the desired synchronization control into one unit. The experimenter can adjust the delay, pulse width, pulse count, number of skipped pulses, and produce a specified number of pulses per revolution. Each of these parameters can be changed independently, providing an unparalleled level of versatility when synchronizing hardware to a host of rotating machinery. The APDG allows experimenters to set up quickly and generate a host of synchronizing configurations using a simple user interface, which hopefully leads to faster results.
A system-level view of optimizing high-channel-count wireless biosignal telemetry.
Chandler, Rodney J; Gibson, Sarah; Karkare, Vaibhav; Farshchi, Shahin; Marković, Dejan; Judy, Jack W
2009-01-01
In this paper we perform a system-level analysis of a wireless biosignal telemetry system. We perform an analysis of each major system component (e.g., analog front end, analog-to-digital converter, digital signal processor, and wireless link), in which we consider physical, algorithmic, and design limitations. Since there are a wide range applications for wireless biosignal telemetry systems, each with their own unique set of requirements for key parameters (e.g., channel count, power dissipation, noise level, number of bits, etc.), our analysis is equally broad. The net result is a set of plots, in which the power dissipation for each component and as the system as a whole, are plotted as a function of the number of channels for different architectural strategies. These results are also compared to existing implementations of complete wireless biosignal telemetry systems.
Okamura, Jun-Ya; Yamaguchi, Reona; Honda, Kazunari; Wang, Gang; Tanaka, Keiji
2014-11-05
One fails to recognize an unfamiliar object across changes in viewing angle when it must be discriminated from similar distractor objects. View-invariant recognition gradually develops as the viewer repeatedly sees the objects in rotation. It is assumed that different views of each object are associated with one another while their successive appearance is experienced in rotation. However, natural experience of objects also contains ample opportunities to discriminate among objects at each of the multiple viewing angles. Our previous behavioral experiments showed that after experiencing a new set of object stimuli during a task that required only discrimination at each of four viewing angles at 30° intervals, monkeys could recognize the objects across changes in viewing angle up to 60°. By recording activities of neurons from the inferotemporal cortex after various types of preparatory experience, we here found a possible neural substrate for the monkeys' performance. For object sets that the monkeys had experienced during the task that required only discrimination at each of four viewing angles, many inferotemporal neurons showed object selectivity covering multiple views. The degree of view generalization found for these object sets was similar to that found for stimulus sets with which the monkeys had been trained to conduct view-invariant recognition. These results suggest that the experience of discriminating new objects in each of several viewing angles develops the partially view-generalized object selectivity distributed over many neurons in the inferotemporal cortex, which in turn bases the monkeys' emergent capability to discriminate the objects across changes in viewing angle. Copyright © 2014 the authors 0270-6474/14/3415047-13$15.00/0.
Efficient view based 3-D object retrieval using Hidden Markov Model
NASA Astrophysics Data System (ADS)
Jain, Yogendra Kumar; Singh, Roshan Kumar
2013-12-01
Recent research effort has been dedicated to view based 3-D object retrieval, because of highly discriminative property of 3-D object and has multi view representation. The state-of-art method is highly depending on their own camera array setting for capturing views of 3-D object and use complex Zernike descriptor, HAC for representative view selection which limit their practical application and make it inefficient for retrieval. Therefore, an efficient and effective algorithm is required for 3-D Object Retrieval. In order to move toward a general framework for efficient 3-D object retrieval which is independent of camera array setting and avoidance of representative view selection, we propose an Efficient View Based 3-D Object Retrieval (EVBOR) method using Hidden Markov Model (HMM). In this framework, each object is represented by independent set of view, which means views are captured from any direction without any camera array restriction. In this, views are clustered (including query view) to generate the view cluster, which is then used to build the query model with HMM. In our proposed method, HMM is used in twofold: in the training (i.e. HMM estimate) and in the retrieval (i.e. HMM decode). The query model is trained by using these view clusters. The EVBOR query model is worked on the basis of query model combining with HMM. The proposed approach remove statically camera array setting for view capturing and can be apply for any 3-D object database to retrieve 3-D object efficiently and effectively. Experimental results demonstrate that the proposed scheme has shown better performance than existing methods. [Figure not available: see fulltext.
Global Seabed Materials and Habitats Mapped: The Computational Methods
NASA Astrophysics Data System (ADS)
Jenkins, C. J.
2016-02-01
What the seabed is made of has proven difficult to map on the scale of whole ocean-basins. Direct sampling and observation can be augmented with proxy-parameter methods such as acoustics. Both avenues are essential to obtain enough detail and coverage, and also to validate the mapping methods. We focus on the direct observations such as samplings, photo and video, probes, diver and sub reports, and surveyed features. These are often in word-descriptive form: over 85% of the records for site materials are in this form, whether as sample/view descriptions or classifications, or described parameters such as consolidation, color, odor, structures and components. Descriptions are absolutely necessary for unusual materials and for processes - in other words, for research. This project dbSEABED not only has the largest collection of seafloor materials data worldwide, but it uses advanced computing math to obtain the best possible coverages and detail. Included in those techniques are linguistic text analysis (e.g., Natural Language Processing, NLP), fuzzy set theory (FST), and machine learning (ML, e.g., Random Forest). These techniques allow efficient and accurate import of huge datasets, thereby optimizing the data that exists. They merge quantitative and qualitative types of data for rich parameter sets, and extrapolate where the data are sparse for best map production. The dbSEABED data resources are now very widely used worldwide in oceanographic research, environmental management, the geosciences, engineering and survey.
A Multistage Approach for Image Registration.
Bowen, Francis; Hu, Jianghai; Du, Eliza Yingzi
2016-09-01
Successful image registration is an important step for object recognition, target detection, remote sensing, multimodal content fusion, scene blending, and disaster assessment and management. The geometric and photometric variations between images adversely affect the ability for an algorithm to estimate the transformation parameters that relate the two images. Local deformations, lighting conditions, object obstructions, and perspective differences all contribute to the challenges faced by traditional registration techniques. In this paper, a novel multistage registration approach is proposed that is resilient to view point differences, image content variations, and lighting conditions. Robust registration is realized through the utilization of a novel region descriptor which couples with the spatial and texture characteristics of invariant feature points. The proposed region descriptor is exploited in a multistage approach. A multistage process allows the utilization of the graph-based descriptor in many scenarios thus allowing the algorithm to be applied to a broader set of images. Each successive stage of the registration technique is evaluated through an effective similarity metric which determines subsequent action. The registration of aerial and street view images from pre- and post-disaster provide strong evidence that the proposed method estimates more accurate global transformation parameters than traditional feature-based methods. Experimental results show the robustness and accuracy of the proposed multistage image registration methodology.
Wireless modification of the intraoperative examination monitor for awake surgery.
Yoshimitsu, Kitaro; Maruyama, Takashi; Muragaki, Yoshihiro; Suzuki, Takashi; Saito, Taiichi; Nitta, Masayuki; Tanaka, Masahiko; Chernov, Mikhail; Tamura, Manabu; Ikuta, Soko; Okamoto, Jun; Okada, Yoshikazu; Iseki, Hiroshi
2011-01-01
The dedicated intraoperative examination monitor for awake surgery (IEMAS) was originally developed by us to facilitate the process of brain mapping during awake craniotomy and successfully used in 186 neurosurgical procedures. This information-sharing device provides the opportunity for all members of the surgical team to visualize a wide spectrum of the integrated intraoperative information related to the condition of the patient, nuances of the surgical procedure, and details of the cortical mapping, practically without interruption of the surgical manipulations. The wide set of both anatomical and functional parameters, such as view of the patient's mimic and face movements while answering the specific questions, type of the examination test, position of the surgical instruments, parameters of the bispectral index monitor, and general view of the surgical field through the operating microscope, is presented compactly in one screen with several displays. However, the initially designed IEMAS system was occasionally affected by interruption or detachment of the connecting cables, which sometimes interfered with its effective clinical use. Therefore, a new modification of the device was developed. The specific feature is installation of wireless information transmitting technology using audio-visual transmitters and receivers for transfer of images and verbal information. The modified IEMAS system is very convenient to use in the narrow space of the operating room.
Stability of the Broad-line Region Geometry and Dynamics in Arp 151 Over Seven Years
NASA Astrophysics Data System (ADS)
Pancoast, A.; Barth, A. J.; Horne, K.; Treu, T.; Brewer, B. J.; Bennert, V. N.; Canalizo, G.; Gates, E. L.; Li, W.; Malkan, M. A.; Sand, D.; Schmidt, T.; Valenti, S.; Woo, J.-H.; Clubb, K. I.; Cooper, M. C.; Crawford, S. M.; Hönig, S. F.; Joner, M. D.; Kandrashoff, M. T.; Lazarova, M.; Nierenberg, A. M.; Romero-Colmenero, E.; Son, D.; Tollerud, E.; Walsh, J. L.; Winkler, H.
2018-04-01
The Seyfert 1 galaxy Arp 151 was monitored as part of three reverberation mapping campaigns spanning 2008–2015. We present modeling of these velocity-resolved reverberation mapping data sets using a geometric and dynamical model for the broad-line region (BLR). By modeling each of the three data sets independently, we infer the evolution of the BLR structure in Arp 151 over a total of 7 yr and constrain the systematic uncertainties in nonvarying parameters such as the black hole mass. We find that the BLR geometry of a thick disk viewed close to face-on is stable over this time, although the size of the BLR grows by a factor of ∼2. The dynamics of the BLR are dominated by inflow, and the inferred black hole mass is consistent for the three data sets, despite the increase in BLR size. Combining the inference for the three data sets yields a black hole mass and statistical uncertainty of log10({M}BH}/{M}ȯ ) = {6.82}-0.09+0.09 with a standard deviation in individual measurements of 0.13 dex.
Jago, R; Zahra, J; Edwards, M J; Kesten, J M; Solomon-Moore, E; Thompson, J L; Sebire, S J
2016-01-01
Objectives The present study used qualitative methods to: (1) examine the strategies that were used by parents of children aged 5–6 years to manage screen viewing; (2) identify key factors that affect the implementation of the strategies and (3) develop suggestions for future intervention content. Design Telephone interviews were conducted with parents of children aged 5–6 years participating in a larger study. Interviews were transcribed verbatim and analysed using an inductive and deductive content analysis. Coding and theme generation was iterative and refined throughout. Setting Parents were recruited through 57 primary schools located in the greater Bristol area (UK). Participants 53 parents of children aged 5–6 years. Results Parents reported that for many children, screen viewing was a highly desirable behaviour that was difficult to manage, and that parents used the provision of screen viewing as a tool for reward and/or punishment. Parents managed screen viewing by setting limits in relation to daily events such as meals, before and after school, and bedtime. Screen-viewing rules were often altered depending on parental preferences and tasks. Inconsistent messaging within and between parents represented a source of conflict at times. Potential strategies to facilitate reducing screen viewing were identified, including setting screen-viewing limits in relation to specific events, collaborative rule setting, monitoring that involves mothers, fathers and the child, developing a family-specific set of alternative activities to screen viewing and developing a child's ability to self-monitor their own screen viewing. Conclusions Managing screen viewing is a challenge for many parents and can often cause tension in the home. The data presented in this paper provide key suggestions of new approaches that could be incorporated into behaviour change programmes to reduce child screen viewing. PMID:26932143
Quantum approximate optimization algorithm for MaxCut: A fermionic view
NASA Astrophysics Data System (ADS)
Wang, Zhihui; Hadfield, Stuart; Jiang, Zhang; Rieffel, Eleanor G.
2018-02-01
Farhi et al. recently proposed a class of quantum algorithms, the quantum approximate optimization algorithm (QAOA), for approximately solving combinatorial optimization problems (E. Farhi et al., arXiv:1411.4028;
An Integrated Approach to Parameter Learning in Infinite-Dimensional Space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyd, Zachary M.; Wendelberger, Joanne Roth
The availability of sophisticated modern physics codes has greatly extended the ability of domain scientists to understand the processes underlying their observations of complicated processes, but it has also introduced the curse of dimensionality via the many user-set parameters available to tune. Many of these parameters are naturally expressed as functional data, such as initial temperature distributions, equations of state, and controls. Thus, when attempting to find parameters that match observed data, being able to navigate parameter-space becomes highly non-trivial, especially considering that accurate simulations can be expensive both in terms of time and money. Existing solutions include batch-parallel simulations,more » high-dimensional, derivative-free optimization, and expert guessing, all of which make some contribution to solving the problem but do not completely resolve the issue. In this work, we explore the possibility of coupling together all three of the techniques just described by designing user-guided, batch-parallel optimization schemes. Our motivating example is a neutron diffusion partial differential equation where the time-varying multiplication factor serves as the unknown control parameter to be learned. We find that a simple, batch-parallelizable, random-walk scheme is able to make some progress on the problem but does not by itself produce satisfactory results. After reducing the dimensionality of the problem using functional principal component analysis (fPCA), we are able to track the progress of the solver in a visually simple way as well as viewing the associated principle components. This allows a human to make reasonable guesses about which points in the state space the random walker should try next. Thus, by combining the random walker's ability to find descent directions with the human's understanding of the underlying physics, it is possible to use expensive simulations more efficiently and more quickly arrive at the desired parameter set.« less
Segmentation and Quantification for Angle-Closure Glaucoma Assessment in Anterior Segment OCT.
Fu, Huazhu; Xu, Yanwu; Lin, Stephen; Zhang, Xiaoqin; Wong, Damon Wing Kee; Liu, Jiang; Frangi, Alejandro F; Baskaran, Mani; Aung, Tin
2017-09-01
Angle-closure glaucoma is a major cause of irreversible visual impairment and can be identified by measuring the anterior chamber angle (ACA) of the eye. The ACA can be viewed clearly through anterior segment optical coherence tomography (AS-OCT), but the imaging characteristics and the shapes and locations of major ocular structures can vary significantly among different AS-OCT modalities, thus complicating image analysis. To address this problem, we propose a data-driven approach for automatic AS-OCT structure segmentation, measurement, and screening. Our technique first estimates initial markers in the eye through label transfer from a hand-labeled exemplar data set, whose images are collected over different patients and AS-OCT modalities. These initial markers are then refined by using a graph-based smoothing method that is guided by AS-OCT structural information. These markers facilitate segmentation of major clinical structures, which are used to recover standard clinical parameters. These parameters can be used not only to support clinicians in making anatomical assessments, but also to serve as features for detecting anterior angle closure in automatic glaucoma screening algorithms. Experiments on Visante AS-OCT and Cirrus high-definition-OCT data sets demonstrate the effectiveness of our approach.
NASA Technical Reports Server (NTRS)
Miller, Richard L.; Georgiou, Ioannis; Glorioso, Mark V.; McCorquodale, J. Alex; Crowder, Keely
2006-01-01
Field measurements from small boats and sparse arrays of instrumented buoys often do not provide sufficient data to capture the dynamic nature of biogeophysical parameters in may coastal aquatic environments. Several investigators have shown the MODIS 250 m images can provide daily synoptic views of suspended sediment concentration in coastal waters to determine sediment transport and fate. However, the use of MODIS for coastal environments can be limited due to a lack of cloud-free images. Sediment transport models are not constrained by sky conditions but often suffer from a lack of in situ observations for model calibration or validation. We demonstrate here the utility of MODIS 250 m to calibrate (set model parameters), validate output, and set or reset initial conditions of a hydrodynamic and sediment transport model (ECOMSED) developed for Lake Pontchartrain, LA USA. We present our approach in the context of how to quickly assess of 'prototype' an application of NASA data to support environmental managers and decision makers. The combination of daily MODIS imagery and model simulations offer a more robust monitoring and prediction system of suspended sediments than available from either system alone.
vSPARQL: A View Definition Language for the Semantic Web
Shaw, Marianne; Detwiler, Landon T.; Noy, Natalya; Brinkley, James; Suciu, Dan
2010-01-01
Translational medicine applications would like to leverage the biological and biomedical ontologies, vocabularies, and data sets available on the semantic web. We present a general solution for RDF information set reuse inspired by database views. Our view definition language, vSPARQL, allows applications to specify the exact content that they are interested in and how that content should be restructured or modified. Applications can access relevant content by querying against these view definitions. We evaluate the expressivity of our approach by defining views for practical use cases and comparing our view definition language to existing query languages. PMID:20800106
Classification of a set of vectors using self-organizing map- and rule-based technique
NASA Astrophysics Data System (ADS)
Ae, Tadashi; Okaniwa, Kaishirou; Nosaka, Kenzaburou
2005-02-01
There exist various objects, such as pictures, music, texts, etc., around our environment. We have a view for these objects by looking, reading or listening. Our view is concerned with our behaviors deeply, and is very important to understand our behaviors. We have a view for an object, and decide the next action (data selection, etc.) with our view. Such a series of actions constructs a sequence. Therefore, we propose a method which acquires a view as a vector from several words for a view, and apply the vector to sequence generation. We focus on sequences of the data of which a user selects from a multimedia database containing pictures, music, movie, etc... These data cannot be stereotyped because user's view for them changes by each user. Therefore, we represent the structure of the multimedia database as the vector representing user's view and the stereotyped vector, and acquire sequences containing the structure as elements. Such a vector can be classified by SOM (Self-Organizing Map). Hidden Markov Model (HMM) is a method to generate sequences. Therefore, we use HMM of which a state corresponds to the representative vector of user's view, and acquire sequences containing the change of user's view. We call it Vector-state Markov Model (VMM). We introduce the rough set theory as a rule-base technique, which plays a role of classifying the sets of data such as the sets of "Tour".
Advanced Interactive Display Formats for Terminal Area Traffic Control
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.; Shaviv, G. E.
1999-01-01
This research project deals with an on-line dynamic method for automated viewing parameter management in perspective displays. Perspective images are optimized such that a human observer will perceive relevant spatial geometrical features with minimal errors. In order to compute the errors at which observers reconstruct spatial features from perspective images, a visual spatial-perception model was formulated. The model was employed as the basis of an optimization scheme aimed at seeking the optimal projection parameter setting. These ideas are implemented in the context of an air traffic control (ATC) application. A concept, referred to as an active display system, was developed. This system uses heuristic rules to identify relevant geometrical features of the three-dimensional air traffic situation. Agile, on-line optimization was achieved by a specially developed and custom-tailored genetic algorithm (GA), which was to deal with the multi-modal characteristics of the objective function and exploit its time-evolving nature.
Isoplanatic patch of the human eye for arbitrary wavelengths
NASA Astrophysics Data System (ADS)
Han, Guoqing; Cao, Zhaoliang; Mu, Quanquan; Wang, Yukun; Li, Dayu; Wang, Shaoxin; Xu, Zihao; Wu, Daosheng; Hu, Lifa; Xuan, Li
2018-03-01
The isoplanatic patch of the human eye is a key parameter for the adaptive optics system (AOS) designed for retinal imaging. The field of view (FOV) usually sets to the same size as the isoplanatic patch to obtain high resolution images. However, it has only been measured at a specific wavelength. Here we investigate the wavelength dependence of this important parameter. An optical setup is initially designed and established in a laboratory to measure the isoplanatic patch at various wavelengths (655 nm, 730 nm and 808 nm). We established the Navarro wide-angle eye model in Zemax software to further validate our results, which suggested high consistency between the two. The isoplanatic patch as a function of wavelength was obtained within the range of visible to near-infrared, which can be expressed as: θ=0.0028 λ - 0 . 74. This work is beneficial for the AOS design for retinal imaging.
Thoma, Patrizia; Soria Bauser, Denise; Suchan, Boris
2013-08-30
This article introduces the freely available Bochum Emotional Stimulus Set (BESST), which contains pictures of bodies and faces depicting either a neutral expression or one of the six basic emotions (happiness, sadness, fear, anger, disgust, and surprise), presented from two different perspectives (0° frontal view vs. camera averted by 45° to the left). The set comprises 565 frontal view and 564 averted view pictures of real-life bodies with masked facial expressions and 560 frontal and 560 averted view faces which were synthetically created using the FaceGen 3.5 Modeller. All stimuli were validated in terms of categorization accuracy and the perceived naturalness of the expression. Additionally, each facial stimulus was morphed into three age versions (20/40/60 years). The results show high recognition of the intended facial expressions, even under speeded forced-choice conditions, as corresponds to common experimental settings. The average naturalness ratings for the stimuli range between medium and high. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Farsalinos, Konstantinos E; Daraban, Ana M; Ünlü, Serkan; Thomas, James D; Badano, Luigi P; Voigt, Jens-Uwe
2015-10-01
This study was planned by the EACVI/ASE/Industry Task Force to Standardize Deformation Imaging to (1) test the variability of speckle-tracking global longitudinal strain (GLS) measurements among different vendors and (2) compare GLS measurement variability with conventional echocardiographic parameters. Sixty-two volunteers were studied using ultrasound systems from seven manufacturers. Each volunteer was examined by the same sonographer on all machines. Inter- and intraobserver variability was determined in a true test-retest setting. Conventional echocardiographic parameters were acquired for comparison. Using the software packages of the respective manufacturer and of two software-only vendors, endocardial GLS was measured because it was the only GLS parameter that could be provided by all manufactures. We compared GLSAV (the average from the three apical views) and GLS4CH (measured in the four-chamber view) measurements among vendors and with the conventional echocardiographic parameters. Absolute values of GLSAV ranged from 18.0% to 21.5%, while GLS4CH ranged from 17.9% to 21.4%. The absolute difference between vendors for GLSAV was up to 3.7% strain units (P < .001). The interobserver relative mean errors were 5.4% to 8.6% for GLSAV and 6.2% to 11.0% for GLS4CH, while the intraobserver relative mean errors were 4.9% to 7.3% and 7.2% to 11.3%, respectively. These errors were lower than for left ventricular ejection fraction and most other conventional echocardiographic parameters. Reproducibility of GLS measurements was good and in many cases superior to conventional echocardiographic measurements. The small but statistically significant variation among vendors should be considered in performing serial studies and reflects a reference point for ongoing standardization efforts. Copyright © 2015 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Xu, Chi; Senaratne, Charutha L.; Culbertson, Robert J.; Kouvetakis, John; Menéndez, José
2017-09-01
The compositional dependence of the lattice parameter in Ge1-ySny alloys has been determined from combined X-ray diffraction and Rutherford Backscattering (RBS) measurements of a large set of epitaxial films with compositions in the 0 < y < 0.14 range. In view of contradictory prior results, a critical analysis of this method has been carried out, with emphasis on nonlinear elasticity corrections and systematic errors in popular RBS simulation codes. The approach followed is validated by showing that measurements of Ge1-xSix films yield a bowing parameter θGeSi =-0.0253(30) Å, in excellent agreement with the classic work by Dismukes. When the same methodology is applied to Ge1-ySny alloy films, it is found that the bowing parameter θGeSn is zero within experimental error, so that the system follows Vegard's law. This is in qualitative agreement with ab initio theory, but the value of the experimental bowing parameter is significantly smaller than the theoretical prediction. Possible reasons for this discrepancy are discussed in detail.
Spectral gap optimization of order parameters for sampling complex molecular systems
Tiwary, Pratyush; Berne, B. J.
2016-01-01
In modern-day simulations of many-body systems, much of the computational complexity is shifted to the identification of slowly changing molecular order parameters called collective variables (CVs) or reaction coordinates. A vast array of enhanced-sampling methods are based on the identification and biasing of these low-dimensional order parameters, whose fluctuations are important in driving rare events of interest. Here, we describe a new algorithm for finding optimal low-dimensional CVs for use in enhanced-sampling biasing methods like umbrella sampling, metadynamics, and related methods, when limited prior static and dynamic information is known about the system, and a much larger set of candidate CVs is specified. The algorithm involves estimating the best combination of these candidate CVs, as quantified by a maximum path entropy estimate of the spectral gap for dynamics viewed as a function of that CV. The algorithm is called spectral gap optimization of order parameters (SGOOP). Through multiple practical examples, we show how this postprocessing procedure can lead to optimization of CV and several orders of magnitude improvement in the convergence of the free energy calculated through metadynamics, essentially giving the ability to extract useful information even from unsuccessful metadynamics runs. PMID:26929365
View-limiting shrouds for insolation radiometers
NASA Technical Reports Server (NTRS)
Dennison, E. W.; Trentelman, G. F.
1985-01-01
Insolation radiometers (normal incidence pyrheliometers) are used to measure the solar radiation incident on solar concentrators for calibrating thermal power generation measurements. The measured insolation value is dependent on the atmospheric transparency, solar elevation angle, circumsolar radiation, and radiometer field of view. The radiant energy entering the thermal receiver is dependent on the same factors. The insolation value and the receiver input will be proportional if the concentrator and the radiometer have similar fields of view. This report describes one practical method for matching the field of view of a radiometer to that of a solar concentrator. The concentrator field of view can be calculated by optical ray tracing methods and the field of view of a radiometer with a simple shroud can be calculated by using geometric equations. The parameters for the shroud can be adjusted to provide an acceptable match between the respective fields of view. Concentrator fields of view have been calculated for a family of paraboloidal concentrators and receiver apertures. The corresponding shroud parameters have also been determined.
Kubik, Martha Y; Gurvich, Olga V; Fulkerson, Jayne A
2017-01-19
Television (TV) viewing is popular among adults and children, and child TV-viewing time is positively associated with parent TV-viewing time. Efforts to limit the TV-viewing time of children typically target parent rule-setting. However, little is known about the association between parent TV-viewing practices and rule-setting. We used baseline height and weight data and survey data collected from 2011 through 2015 on parents and their 8- to 12-year-old children (N = 212 parent/child dyads) who were participants in 2 community-based obesity prevention intervention trials conducted in metropolitan Minnesota. Multivariable binary logistic regression analysis was used to assess the association between parent TV-viewing time on weekdays or weekend days (dichotomized as ≤2 hrs/d vs ≥2.5 hrs/d) and parent rules limiting child TV-viewing time. Child mean age was 10 (standard deviation [SD], 1.4) years, mean body mass index (BMI) percentile was 81 (SD, 16.7), approximately half of the sample were boys, and 42% of the sample was nonwhite. Parent mean age was 41 (SD, 7.5) years, and mean BMI was 29 (SD, 7.5); most of the sample was female, and 36% of the sample was nonwhite. Parents who limited their TV-viewing time on weekend days to 2 hours or fewer per day were almost 3 times more likely to report setting rules limiting child TV-viewing time than were parents who watched 2.5 hours or more per day (P = .01). A similar association was not seen for parent weekday TV-viewing time. For most adults and children, a meaningful decrease in sedentariness will require reductions in TV-viewing time. Family-based interventions to reduce TV-viewing time that target the TV-viewing practices of both children and parents are needed.
Gurvich, Olga V.; Fulkerson, Jayne A.
2017-01-01
Introduction Television (TV) viewing is popular among adults and children, and child TV-viewing time is positively associated with parent TV-viewing time. Efforts to limit the TV-viewing time of children typically target parent rule-setting. However, little is known about the association between parent TV-viewing practices and rule-setting. Methods We used baseline height and weight data and survey data collected from 2011 through 2015 on parents and their 8- to 12-year-old children (N = 212 parent/child dyads) who were participants in 2 community-based obesity prevention intervention trials conducted in metropolitan Minnesota. Multivariable binary logistic regression analysis was used to assess the association between parent TV-viewing time on weekdays or weekend days (dichotomized as ≤2 hrs/d vs ≥2.5 hrs/d) and parent rules limiting child TV-viewing time. Results Child mean age was 10 (standard deviation [SD], 1.4) years, mean body mass index (BMI) percentile was 81 (SD, 16.7), approximately half of the sample were boys, and 42% of the sample was nonwhite. Parent mean age was 41 (SD, 7.5) years, and mean BMI was 29 (SD, 7.5); most of the sample was female, and 36% of the sample was nonwhite. Parents who limited their TV-viewing time on weekend days to 2 hours or fewer per day were almost 3 times more likely to report setting rules limiting child TV-viewing time than were parents who watched 2.5 hours or more per day (P = .01). A similar association was not seen for parent weekday TV-viewing time. Conclusion For most adults and children, a meaningful decrease in sedentariness will require reductions in TV-viewing time. Family-based interventions to reduce TV-viewing time that target the TV-viewing practices of both children and parents are needed. PMID:28103183
Views on Biotic Nature and the Idea of Sustainable Development
NASA Astrophysics Data System (ADS)
Łepko, Zbigniew
2017-12-01
The search for balance between humankind's civilisational aspirations and the durable protection of nature is conditioned by contemporaneous views of biotic nature. Of particular importance in this regard are physiocentric and physiological views that may be set against one another. The first of these was presented by Hans Jonas, the second by Lothar Schäfer. This paper does not confine itself to setting one view against the other, but rather sets minimum conditions for cooperation between their promoters in the interests of balance between the aspirations of the present generation and those of future generations. Both views of nature are in their own way conducive to a break with the illusion present in some areas of the modern natural sciences - that nature is a boundless area of are inexhaustible resources.
Diverse knowledges and competing interests: an essay on socio-technical problem-solving.
di Norcia, Vincent
2002-01-01
Solving complex socio-technical problems, this paper claims, involves diverse knowledges (cognitive diversity), competing interests (social diversity), and pragmatism. To explain this view, this paper first explores two different cases: Canadian pulp and paper mill pollution and siting nuclear reactors in systematically sensitive areas of California. Solving such socio-technically complex problems involves cognitive diversity as well as social diversity and pragmatism. Cognitive diversity requires one to not only recognize relevant knowledges but also to assess their validity. Finally, it is suggested, integrating the resultant set of diverse relevant and valid knowledges determines the parameters of the solution space for the problem.
A laser technique for characterizing the geometry of plant canopies
NASA Technical Reports Server (NTRS)
Vanderbilt, V. C.; Silva, L. F.; Bauer, M. E.
1977-01-01
The interception of solar power by the canopy is investigated as a function of solar zenith angle (time), component of the canopy, and depth into the canopy. The projected foliage area, cumulative leaf area, and view factors within the canopy are examined as a function of the same parameters. Two systems are proposed that are capable of describing the geometrical aspects of a vegetative canopy and of operation in an automatic mode. Either system would provide sufficient data to yield a numerical map of the foliage area in the canopy. Both systems would involve the collection of large data sets in a short time period using minimal manpower.
NASA Technical Reports Server (NTRS)
Randel, D. L.; Campbell, G. G.; Vonder Haar, T. H.; Smith, L.
1986-01-01
Scale factors and assumptions which were applied in calculations of global radiation budget parameters based on ERB data are discussed. The study was performed to examine the relationship between the composite global ERB map that can be generated every six days using all available data and the actual average global ERB. The wide field of view ERB instrument functioned for the first 19 months of the Nimbus-7 life, and furnished sufficient data for calculating actual ERB averages. The composite was most accurate in regions with the least variation in radiation budget.
1. View of Asylum (Western) Avenue viaduct in setting from ...
1. View of Asylum (Western) Avenue viaduct in setting from northwest, facing southeast. - Asylum Avenue Viaduct, Spanning Second Creek & Southern Railroad at State Route 62, Knoxville, Knox County, TN
vSPARQL: a view definition language for the semantic web.
Shaw, Marianne; Detwiler, Landon T; Noy, Natalya; Brinkley, James; Suciu, Dan
2011-02-01
Translational medicine applications would like to leverage the biological and biomedical ontologies, vocabularies, and data sets available on the semantic web. We present a general solution for RDF information set reuse inspired by database views. Our view definition language, vSPARQL, allows applications to specify the exact content that they are interested in and how that content should be restructured or modified. Applications can access relevant content by querying against these view definitions. We evaluate the expressivity of our approach by defining views for practical use cases and comparing our view definition language to existing query languages. Copyright © 2010 Elsevier Inc. All rights reserved.
3. Oblique view of building in setting; view to northwest, ...
3. Oblique view of building in setting; view to northwest, 65mm lens. Railroad cut in foreground was made in 1928 when Southern Pacific Railroad realigned its main line in connection with the construction of its Martinez-Benicia Bridge. It was this cut which led to continual settlement of the southeast corner of the building, resulting in its structural failure. - Benicia Arsenal, Powder Magazine No. 5, Junction of Interstate Highways 680 & 780, Benicia, Solano County, CA
Overview of Icing Physics Relevant to Scaling
NASA Technical Reports Server (NTRS)
Anderson, David N.; Tsao, Jen-Ching
2005-01-01
An understanding of icing physics is required for the development of both scaling methods and ice-accretion prediction codes. This paper gives an overview of our present understanding of the important physical processes and the associated similarity parameters that determine the shape of Appendix C ice accretions. For many years it has been recognized that ice accretion processes depend on flow effects over the model, on droplet trajectories, on the rate of water collection and time of exposure, and, for glaze ice, on a heat balance. For scaling applications, equations describing these events have been based on analyses at the stagnation line of the model and have resulted in the identification of several non-dimensional similarity parameters. The parameters include the modified inertia parameter of the water drop, the accumulation parameter and the freezing fraction. Other parameters dealing with the leading edge heat balance have also been used for convenience. By equating scale expressions for these parameters to the values to be simulated a set of equations is produced which can be solved for the scale test conditions. Studies in the past few years have shown that at least one parameter in addition to those mentioned above is needed to describe surface-water effects, and some of the traditional parameters may not be as significant as once thought. Insight into the importance of each parameter, and the physical processes it represents, can be made by viewing whether ice shapes change, and the extent of the change, when each parameter is varied. Experimental evidence is presented to establish the importance of each of the traditionally used parameters and to identify the possible form of a new similarity parameter to be used for scaling.
Designing for Annual Spacelift Performance
NASA Technical Reports Server (NTRS)
McCleskey, Carey M.; Zapata, Edgar
2017-01-01
This paper presents a methodology for approaching space launch system design from a total architectural point of view. This different approach to conceptual design is contrasted with traditional approaches that focus on a single set of metrics for flight system performance, i.e., payload lift per flight, vehicle mass, specific impulse, etc. The approach presented works with a larger set of metrics, including annual system lift, or "spacelift" performance. Spacelift performance is more inclusive of the flight production capability of the total architecture, i.e., the flight and ground systems working together as a whole to produce flights on a repeated basis. In the proposed methodology, spacelift performance becomes an important design-for-support parameter for flight system concepts and truly advanced spaceport architectures of the future. The paper covers examples of existing system spacelift performance as benchmarks, points out specific attributes of space transportation systems that must be greatly improved over these existing designs, and outlines current activity in this area.
Multi-scale modularity and motif distributional effect in metabolic networks.
Gao, Shang; Chen, Alan; Rahmani, Ali; Zeng, Jia; Tan, Mehmet; Alhajj, Reda; Rokne, Jon; Demetrick, Douglas; Wei, Xiaohui
2016-01-01
Metabolism is a set of fundamental processes that play important roles in a plethora of biological and medical contexts. It is understood that the topological information of reconstructed metabolic networks, such as modular organization, has crucial implications on biological functions. Recent interpretations of modularity in network settings provide a view of multiple network partitions induced by different resolution parameters. Here we ask the question: How do multiple network partitions affect the organization of metabolic networks? Since network motifs are often interpreted as the super families of evolved units, we further investigate their impact under multiple network partitions and investigate how the distribution of network motifs influences the organization of metabolic networks. We studied Homo sapiens, Saccharomyces cerevisiae and Escherichia coli metabolic networks; we analyzed the relationship between different community structures and motif distribution patterns. Further, we quantified the degree to which motifs participate in the modular organization of metabolic networks.
9. VIEW SOUTHSOUTHEAST STERN OF JFK, SCAFFOLDING SET UP FOR ...
9. VIEW SOUTH-SOUTHEAST STERN OF JFK, SCAFFOLDING SET UP FOR REMOUNTING OF PROPELLERS. - Naval Base Philadelphia-Philadelphia Naval Shipyard, Dry Dock No. 5, League Island, Philadelphia, Philadelphia County, PA
Production of a water quality map of Saginaw Bay by computer processing of LANDSAT-2 data
NASA Technical Reports Server (NTRS)
Mckeon, J. B.; Rogers, R. H.; Smith, V. E.
1977-01-01
Surface truth and LANDSAT measurements collected July 31, 1975, for Saginaw Bay were used to demonstrate a technique for producing a color coded water quality map. On this map, color was used as a code to quantify five discrete ranges in the following water quality parameters: (1) temperature, (2) Secchi depth, (3) chloride, (4) conductivity, (5) total Kjeldahl nitrogen, (6) total phosphorous, (7)chlorophyll a, (8) total solids and (9) suspended solids. The LANDSAT and water quality relationship was established through the use of a set of linear regression equations where the water quality parameters are the dependent variables and LANDSAT measurements are the independent variables. Although the procedure is scene and surface truth dependent, it provides both a basis for extrapolating water quality parameters from point samples to unsampled areas and a synoptic view of water mass boundaries over the 3000 sq. km bay area made from one day's ship data that is superior, in many ways, to the traditional machine contoured maps made from three day's ship data.
Determining index of refraction from polarimetric hyperspectral radiance measurements
NASA Astrophysics Data System (ADS)
Martin, Jacob A.; Gross, Kevin C.
2015-09-01
Polarimetric hyperspectral imaging (P-HSI) combines two of the most common remote sensing modalities. This work leverages the combination of these techniques to improve material classification. Classifying and identifying materials requires parameters which are invariant to changing viewing conditions, and most often a material's reflectivity or emissivity is used. Measuring these most often requires assumptions be made about the material and atmospheric conditions. Combining both polarimetric and hyperspectral imaging, we propose a method to remotely estimate the index of refraction of a material. In general, this is an underdetermined problem because both the real and imaginary components of index of refraction are unknown at every spectral point. By modeling the spectral variation of the index of refraction using a few parameters, however, the problem can be made overdetermined. A number of different functions can be used to describe this spectral variation, and some are discussed here. Reducing the number of spectral parameters to fit allows us to add parameters which estimate atmospheric downwelling radiance and transmittance. Additionally, the object temperature is added as a fit parameter. The set of these parameters that best replicate the measured data is then found using a bounded Nelder-Mead simplex search algorithm. Other search algorithms are also examined and discussed. Results show that this technique has promise but also some limitations, which are the subject of ongoing work.
System and method for attitude determination based on optical imaging
NASA Technical Reports Server (NTRS)
Junkins, John L. (Inventor); Pollock, Thomas C. (Inventor); Mortari, Daniele (Inventor)
2003-01-01
A method and apparatus is provide for receiving a first set of optical data from a first field of view and receiving a second set of optical data from a second field of view. A portion of the first set of optical data is communicated and a portion of the second set of optical data is reflected, both toward an optical combiner. The optical combiner then focuses the portions onto the image plane such that information at the image plane that is associated with the first and second fields of view is received by an optical detector and used to determine an attitude characteristic.
Heuristics for multiobjective multiple sequence alignment.
Abbasi, Maryam; Paquete, Luís; Pereira, Francisco B
2016-07-15
Aligning multiple sequences arises in many tasks in Bioinformatics. However, the alignments produced by the current software packages are highly dependent on the parameters setting, such as the relative importance of opening gaps with respect to the increase of similarity. Choosing only one parameter setting may provide an undesirable bias in further steps of the analysis and give too simplistic interpretations. In this work, we reformulate multiple sequence alignment from a multiobjective point of view. The goal is to generate several sequence alignments that represent a trade-off between maximizing the substitution score and minimizing the number of indels/gaps in the sum-of-pairs score function. This trade-off gives to the practitioner further information about the similarity of the sequences, from which she could analyse and choose the most plausible alignment. We introduce several heuristic approaches, based on local search procedures, that compute a set of sequence alignments, which are representative of the trade-off between the two objectives (substitution score and indels). Several algorithm design options are discussed and analysed, with particular emphasis on the influence of the starting alignment and neighborhood search definitions on the overall performance. A perturbation technique is proposed to improve the local search, which provides a wide range of high-quality alignments. The proposed approach is tested experimentally on a wide range of instances. We performed several experiments with sequences obtained from the benchmark database BAliBASE 3.0. To evaluate the quality of the results, we calculate the hypervolume indicator of the set of score vectors returned by the algorithms. The results obtained allow us to identify reasonably good choices of parameters for our approach. Further, we compared our method in terms of correctly aligned pairs ratio and columns correctly aligned ratio with respect to reference alignments. Experimental results show that our approaches can obtain better results than TCoffee and Clustal Omega in terms of the first ratio.
Development and validation of the AFIT scene and sensor emulator for testing (ASSET)
NASA Astrophysics Data System (ADS)
Young, Shannon R.; Steward, Bryan J.; Gross, Kevin C.
2017-05-01
ASSET is a physics-based model used to generate synthetic data sets of wide field of view (WFOV) electro-optical and infrared (EO/IR) sensors with realistic radiometric properties, noise characteristics, and sensor artifacts. It was developed to meet the need for applications where precise knowledge of the underlying truth is required but is impractical to obtain for real sensors. For example, due to accelerating advances in imaging technology, the volume of data available from WFOV EO/IR sensors has drastically increased over the past several decades, and as a result, there is a need for fast, robust, automatic detection and tracking algorithms. Evaluation of these algorithms is difficult for objects that traverse a wide area (100-10,000 km) because obtaining accurate truth for the full object trajectory often requires costly instrumentation. Additionally, tracking and detection algorithms perform differently depending on factors such as the object kinematics, environment, and sensor configuration. A variety of truth data sets spanning these parameters are needed for thorough testing, which is often cost prohibitive. The use of synthetic data sets for algorithm development allows for full control of scene parameters with full knowledge of truth. However, in order for analysis using synthetic data to be meaningful, the data must be truly representative of real sensor collections. ASSET aims to provide a means of generating such representative data sets for WFOV sensors operating in the visible through thermal infrared. The work reported here describes the ASSET model, as well as provides validation results from comparisons to laboratory imagers and satellite data (e.g. Landsat-8).
NASA Astrophysics Data System (ADS)
Khatri, Pradeep; Hayasaka, Tadahiro; Iwabuchi, Hironobu; Takamura, Tamio; Irie, Hitoshi; Nakajima, Takashi Y.; Letu, Husi; Kai, Qin
2017-04-01
Clouds are known to have profound impacts on atmospheric radiation and water budget, climate change, atmosphere-surface interaction, and so on. Cloud optical thickness (COT) and effective radius (Re) are two fundamental cloud parameters required to study clouds from climatological and hydrological point of view. Large spatial-temporal coverages of those cloud parameters from space observation have proved to be very useful for cloud research; however, validation of space-based products is still a challenging task due to lack of reliable data. Ground-based remote sensing instruments, such as sky radiometers distributed around the world through international observation networks of SKYNET (http://atmos2.cr.chiba-u.jp/skynet/) and AERONET (https://aeronet.gsfc.nasa.gov/) have a great potential to produce ground-truth cloud parameters at different parts of the globe to validate satellite products. Focusing to the sky radiometers of SKYNET and AERONET, a few cloud retrieval methods exists, but those methods have some difficulties to address the problem when cloud is optically thin. It is because the observed transmittances at two wavelengths can be originated from more than one set of COD and Re, and the choice of the most plausible set is difficult. At the same time, calibration issue, especially for the wavelength of near infrared (NIR) region, which is important to retrieve Re, is also a difficult task at present. As a result, instruments need to be calibrated at a high mountain or calibration terms need to be transferred from a standard instrument. Taking those points on account, we developed a new retrieval method emphasizing to overcome above-mentioned difficulties. We used observed transmittances of multiple wavelengths to overcome the first problem. We further proposed a method to obtain calibration constant of NIR wavelength channel using observation data. Our cloud retrieval method is found to produce relatively accurate COD and Re when validated them using data of a narrow field of view radiometer of collocated observation in one SKYNET site. Though the method is developed for the sky radiometer of SKYNET, it can be still used for the sky radiometer of AERONET and other instruments observing spectral zenith transmittances. The proposed retrieval method is then applied to retrieve cloud parameters at key sites of SKYNET within Japan, which are then used to validate cloud products obtained from space observations by MODIS sensors onboard TERRA/AQUA satellites and Himawari 8, a Japanese geostationary satellite. Our analyses suggest the underestimation (overestimation) of COD (Re) from space observations.
NASA Astrophysics Data System (ADS)
Steinberg, P. D.; Bednar, J. A.; Rudiger, P.; Stevens, J. L. R.; Ball, C. E.; Christensen, S. D.; Pothina, D.
2017-12-01
The rich variety of software libraries available in the Python scientific ecosystem provides a flexible and powerful alternative to traditional integrated GIS (geographic information system) programs. Each such library focuses on doing a certain set of general-purpose tasks well, and Python makes it relatively simple to glue the libraries together to solve a wide range of complex, open-ended problems in Earth science. However, choosing an appropriate set of libraries can be challenging, and it is difficult to predict how much "glue code" will be needed for any particular combination of libraries and tasks. Here we present a set of libraries that have been designed to work well together to build interactive analyses and visualizations of large geographic datasets, in standard web browsers. The resulting workflows run on ordinary laptops even for billions of data points, and easily scale up to larger compute clusters when available. The declarative top-level interface used in these libraries means that even complex, fully interactive applications can be built and deployed as web services using only a few dozen lines of code, making it simple to create and share custom interactive applications even for datasets too large for most traditional GIS systems. The libraries we will cover include GeoViews (HoloViews extended for geographic applications) for declaring visualizable/plottable objects, Bokeh for building visual web applications from GeoViews objects, Datashader for rendering arbitrarily large datasets faithfully as fixed-size images, Param for specifying user-modifiable parameters that model your domain, Xarray for computing with n-dimensional array data, Dask for flexibly dispatching computational tasks across processors, and Numba for compiling array-based Python code down to fast machine code. We will show how to use the resulting workflow with static datasets and with simulators such as GSSHA or AdH, allowing you to deploy flexible, high-performance web-based dashboards for your GIS data or simulations without needing major investments in code development or maintenance.
NASA Astrophysics Data System (ADS)
Sramek, Benjamin Koerner
The ability to deliver conformal dose distributions in radiation therapy through intensity modulation and the potential for tumor dose escalation to improve treatment outcome has necessitated an increase in localization accuracy of inter- and intra-fractional patient geometry. Megavoltage cone-beam CT imaging using the treatment beam and onboard electronic portal imaging device is one option currently being studied for implementation in image-guided radiation therapy. However, routine clinical use is predicated upon continued improvements in image quality and patient dose delivered during acquisition. The formal statement of hypothesis for this investigation was that the conformity of planned to delivered dose distributions in image-guided radiation therapy could be further enhanced through the application of kilovoltage scatter correction and intermediate view estimation techniques to megavoltage cone-beam CT imaging, and that normalized dose measurements could be acquired and inter-compared between multiple imaging geometries. The specific aims of this investigation were to: (1) incorporate the Feldkamp, Davis and Kress filtered backprojection algorithm into a program to reconstruct a voxelized linear attenuation coefficient dataset from a set of acquired megavoltage cone-beam CT projections, (2) characterize the effects on megavoltage cone-beam CT image quality resulting from the application of Intermediate View Interpolation and Intermediate View Reprojection techniques to limited-projection datasets, (3) incorporate the Scatter and Primary Estimation from Collimator Shadows (SPECS) algorithm into megavoltage cone-beam CT image reconstruction and determine the set of SPECS parameters which maximize image quality and quantitative accuracy, and (4) evaluate the normalized axial dose distributions received during megavoltage cone-beam CT image acquisition using radiochromic film and thermoluminescent dosimeter measurements in anthropomorphic pelvic and head and neck phantoms. The conclusions of this investigation were: (1) the implementation of intermediate view estimation techniques to megavoltage cone-beam CT produced improvements in image quality, with the largest impact occurring for smaller numbers of initially-acquired projections, (2) the SPECS scatter correction algorithm could be successfully incorporated into projection data acquired using an electronic portal imaging device during megavoltage cone-beam CT image reconstruction, (3) a large range of SPECS parameters were shown to reduce cupping artifacts as well as improve reconstruction accuracy, with application to anthropomorphic phantom geometries improving the percent difference in reconstructed electron density for soft tissue from -13.6% to -2.0%, and for cortical bone from -9.7% to 1.4%, (4) dose measurements in the anthropomorphic phantoms showed consistent agreement between planar measurements using radiochromic film and point measurements using thermoluminescent dosimeters, and (5) a comparison of normalized dose measurements acquired with radiochromic film to those calculated using multiple treatment planning systems, accelerator-detector combinations, patient geometries and accelerator outputs produced a relatively good agreement.
7. Southeast elevation showing building in setting; view to northwest; ...
7. Southeast elevation showing building in setting; view to northwest; Highway 101 in foreground, State Street at right. - V.E. Wood Auto Building, 315 State Street, Santa Barbara, Santa Barbara County, CA
ERIC Educational Resources Information Center
Gialamas, Vasilis; Nikolopoulou, Kleopatra
2010-01-01
This paper regards a comparative study which investigates in-service and pre-service Greek early childhood teachers' views and intentions about integrating and using computers in early childhood settings. Views and intentions were investigated via a questionnaire administered to 240 in-service and 428 pre-service early childhood teachers.…
5. GENERAL VIEW SHOWING ORIGINAL SETTING AT WHARFSIDE WITH CONTAINERIZED ...
5. GENERAL VIEW SHOWING ORIGINAL SETTING AT WHARFSIDE WITH CONTAINERIZED FREIGHT LOADING EQUIPMENT AT PORT OF OAKLAND FACILITY - Oakland Army Base, Transit Shed, East of Dunkirk Street & South of Burma Road, Oakland, Alameda County, CA
Testing optimum viewing conditions for mammographic image displays.
Waynant, R W; Chakrabarti, K; Kaczmarek, R A; Dagenais, I
1999-05-01
The viewbox luminance and viewing room light level are important parameters in a medical film display, but these parameters have not had much attention. Spatial variations and too much room illumination can mask real signal or create the false perception of a signal. This presentation looks at how scotopic light sources and dark-adapted radiologists may identify more real diseases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, L; Han, Y; Jin, M
Purpose: To develop an iterative reconstruction method for X-ray CT, in which the reconstruction can quickly converge to the desired solution with much reduced projection views. Methods: The reconstruction is formulated as a convex feasibility problem, i.e. the solution is an intersection of three convex sets: 1) data fidelity (DF) set – the L2 norm of the difference of observed projections and those from the reconstructed image is no greater than an error bound; 2) non-negativity of image voxels (NN) set; and 3) piecewise constant (PC) set - the total variation (TV) of the reconstructed image is no greater thanmore » an upper bound. The solution can be found by applying projection onto convex sets (POCS) sequentially for these three convex sets. Specifically, the algebraic reconstruction technique and setting negative voxels as zero are used for projection onto the DF and NN sets, respectively, while the projection onto the PC set is achieved by solving a standard Rudin, Osher, and Fatemi (ROF) model. The proposed method is named as full sequential POCS (FS-POCS), which is tested using the Shepp-Logan phantom and the Catphan600 phantom and compared with two similar algorithms, TV-POCS and CP-TV. Results: Using the Shepp-Logan phantom, the root mean square error (RMSE) of reconstructed images changing along with the number of iterations is used as the convergence measurement. In general, FS- POCS converges faster than TV-POCS and CP-TV, especially with fewer projection views. FS-POCS can also achieve accurate reconstruction of cone-beam CT of the Catphan600 phantom using only 54 views, comparable to that of FDK using 364 views. Conclusion: We developed an efficient iterative reconstruction for sparse-view CT using full sequential POCS. The simulation and physical phantom data demonstrated the computational efficiency and effectiveness of FS-POCS.« less
Using textons to rank crystallization droplets by the likely presence of crystals
Ng, Jia Tsing; Dekker, Carien; Kroemer, Markus; Osborne, Michael; von Delft, Frank
2014-01-01
The visual inspection of crystallization experiments is an important yet time-consuming and subjective step in X-ray crystallography. Previously published studies have focused on automatically classifying crystallization droplets into distinct but ultimately arbitrary experiment outcomes; here, a method is described that instead ranks droplets by their likelihood of containing crystals or microcrystals, thereby prioritizing for visual inspection those images that are most likely to contain useful information. The use of textons is introduced to describe crystallization droplets objectively, allowing them to be scored with the posterior probability of a random forest classifier trained against droplets manually annotated for the presence or absence of crystals or microcrystals. Unlike multi-class classification, this two-class system lends itself naturally to unidirectional ranking, which is most useful for assisting sequential viewing because images can be arranged simply by using these scores: this places droplets with probable crystalline behaviour early in the viewing order. Using this approach, the top ten wells included at least one human-annotated crystal or microcrystal for 94% of the plates in a data set of 196 plates imaged with a Minstrel HT system. The algorithm is robustly transferable to at least one other imaging system: when the parameters trained from Minstrel HT images are applied to a data set imaged by the Rock Imager system, human-annotated crystals ranked in the top ten wells for 90% of the plates. Because rearranging images is fundamental to the approach, a custom viewer was written to seamlessly support such ranked viewing, along with another important output of the algorithm, namely the shape of the curve of scores, which is itself a useful overview of the behaviour of the plate; additional features with known usefulness were adopted from existing viewers. Evidence is presented that such ranked viewing of images allows faster but more accurate evaluation of drops, in particular for the identification of microcrystals. PMID:25286854
Jago, R; Zahra, J; Edwards, M J; Kesten, J M; Solomon-Moore, E; Thompson, J L; Sebire, S J
2016-03-01
The present study used qualitative methods to: (1) examine the strategies that were used by parents of children aged 5-6 years to manage screen viewing; (2) identify key factors that affect the implementation of the strategies and (3) develop suggestions for future intervention content. Telephone interviews were conducted with parents of children aged 5-6 years participating in a larger study. Interviews were transcribed verbatim and analysed using an inductive and deductive content analysis. Coding and theme generation was iterative and refined throughout. Parents were recruited through 57 primary schools located in the greater Bristol area (UK). 53 parents of children aged 5-6 years. Parents reported that for many children, screen viewing was a highly desirable behaviour that was difficult to manage, and that parents used the provision of screen viewing as a tool for reward and/or punishment. Parents managed screen viewing by setting limits in relation to daily events such as meals, before and after school, and bedtime. Screen-viewing rules were often altered depending on parental preferences and tasks. Inconsistent messaging within and between parents represented a source of conflict at times. Potential strategies to facilitate reducing screen viewing were identified, including setting screen-viewing limits in relation to specific events, collaborative rule setting, monitoring that involves mothers, fathers and the child, developing a family-specific set of alternative activities to screen viewing and developing a child's ability to self-monitor their own screen viewing. Managing screen viewing is a challenge for many parents and can often cause tension in the home. The data presented in this paper provide key suggestions of new approaches that could be incorporated into behaviour change programmes to reduce child screen viewing. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
2. AERIAL VIEW, SHOWING GLENDALE ROAD BRIDGE WITHIN ITS SETTING ...
2. AERIAL VIEW, SHOWING GLENDALE ROAD BRIDGE WITHIN ITS SETTING AT GLENDALE ROAD CROSSING OF DEEP CREEK LAKE (PHOTOGRAPH BY RUTHVAN MORROW) - Glendale Road Bridge, Spanning Deep Creek Lake on Glendale Road, McHenry, Garrett County, MD
1. AERIAL VIEW, SHOWING GLENDALE ROAD BRIDGE WITHIN ITS SETTING ...
1. AERIAL VIEW, SHOWING GLENDALE ROAD BRIDGE WITHIN ITS SETTING AT GLENDALE ROAD CROSSING OF DEEP CREEK LAKE (PHOTOGRAPH BY RUTHVAN MORROW) - Glendale Road Bridge, Spanning Deep Creek Lake on Glendale Road, McHenry, Garrett County, MD
Integrated performance and reliability specification for digital avionics systems
NASA Technical Reports Server (NTRS)
Brehm, Eric W.; Goettge, Robert T.
1995-01-01
This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and result between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS, that will ultimately encompass an integrated set of analytic and simulated based techniques for performance, reliability, and combined design assessment.
The Effects of Age and Set Size on the Fast Extraction of Egocentric Distance
Gajewski, Daniel A.; Wallin, Courtney P.; Philbeck, John W.
2016-01-01
Angular direction is a source of information about the distance to floor-level objects that can be extracted from brief glimpses (near one's threshold for detection). Age and set size are two factors known to impact the viewing time needed to directionally localize an object, and these were posited to similarly govern the extraction of distance. The question here was whether viewing durations sufficient to support object detection (controlled for age and set size) would also be sufficient to support well-constrained judgments of distance. Regardless of viewing duration, distance judgments were more accurate (less biased towards underestimation) when multiple potential targets were presented, suggesting that the relative angular declinations between the objects are an additional source of useful information. Distance judgments were more precise with additional viewing time, but the benefit did not depend on set size and accuracy did not improve with longer viewing durations. The overall pattern suggests that distance can be efficiently derived from direction for floor-level objects. Controlling for age-related differences in the viewing time needed to support detection was sufficient to support distal localization but only when brief and longer glimpse trials were interspersed. Information extracted from longer glimpse trials presumably supported performance on subsequent trials when viewing time was more limited. This outcome suggests a particularly important role for prior visual experience in distance judgments for older observers. PMID:27398065
Machining of bone: Analysis of cutting force and surface roughness by turning process.
Noordin, M Y; Jiawkok, N; Ndaruhadi, P Y M W; Kurniawan, D
2015-11-01
There are millions of orthopedic surgeries and dental implantation procedures performed every year globally. Most of them involve machining of bones and cartilage. However, theoretical and analytical study on bone machining is lagging behind its practice and implementation. This study views bone machining as a machining process with bovine bone as the workpiece material. Turning process which makes the basis of the actually used drilling process was experimented. The focus is on evaluating the effects of three machining parameters, that is, cutting speed, feed, and depth of cut, to machining responses, that is, cutting forces and surface roughness resulted by the turning process. Response surface methodology was used to quantify the relation between the machining parameters and the machining responses. The turning process was done at various cutting speeds (29-156 m/min), depths of cut (0.03 -0.37 mm), and feeds (0.023-0.11 mm/rev). Empirical models of the resulted cutting force and surface roughness as the functions of cutting speed, depth of cut, and feed were developed. Observation using the developed empirical models found that within the range of machining parameters evaluated, the most influential machining parameter to the cutting force is depth of cut, followed by feed and cutting speed. The lowest cutting force was obtained at the lowest cutting speed, lowest depth of cut, and highest feed setting. For surface roughness, feed is the most significant machining condition, followed by cutting speed, and with depth of cut showed no effect. The finest surface finish was obtained at the lowest cutting speed and feed setting. © IMechE 2015.
A Non-Stationary 1981-2012 AVHRR NDVI(sub 3g) Time Series
NASA Technical Reports Server (NTRS)
Pinzon, Jorge E.; Tucker, Compton J.
2014-01-01
The NDVI(sub 3g) time series is an improved 8-km normalized difference vegetation index (NDVI) data set produced from Advanced Very High Resolution Radiometer (AVHRR) instruments that extends from 1981 to the present. The AVHRR instruments have flown or are flying on fourteen polar-orbiting meteorological satellites operated by the National Oceanic and Atmospheric Administration (NOAA) and are currently flying on two European Organization for the Exploitation of Meteorological Satellites (EUMETSAT) polar-orbiting meteorological satellites, MetOp-A and MetOp-B. This long AVHRR record is comprised of data from two different sensors: the AVHRR/2 instrument that spans July 1981 to November 2000 and the AVHRR/3 instrument that continues these measurements from November 2000 to the present. The main difficulty in processing AVHRR NDVI data is to properly deal with limitations of the AVHRR instruments. Complicating among-instrument AVHRR inter-calibration of channels one and two is the dual gain introduced in late 2000 on the AVHRR/3 instruments for both these channels. We have processed NDVI data derived from the Sea-Viewing Wide Field-of-view Sensor (SeaWiFS) from 1997 to 2010 to overcome among-instrument AVHRR calibration difficulties. We use Bayesian methods with high quality well-calibrated SeaWiFS NDVI data for deriving AVHRR NDVI calibration parameters. Evaluation of the uncertainties of our resulting NDVI values gives an error of plus or minus 0.005 NDVI units for our 1981 to present data set that is independent of time within our AVHRR NDVI continuum and has resulted in a non-stationary climate data set.
Speaker verification system using acoustic data and non-acoustic data
Gable, Todd J [Walnut Creek, CA; Ng, Lawrence C [Danville, CA; Holzrichter, John F [Berkeley, CA; Burnett, Greg C [Livermore, CA
2006-03-21
A method and system for speech characterization. One embodiment includes a method for speaker verification which includes collecting data from a speaker, wherein the data comprises acoustic data and non-acoustic data. The data is used to generate a template that includes a first set of "template" parameters. The method further includes receiving a real-time identity claim from a claimant, and using acoustic data and non-acoustic data from the identity claim to generate a second set of parameters. The method further includes comparing the first set of parameters to the set of parameters to determine whether the claimant is the speaker. The first set of parameters and the second set of parameters include at least one purely non-acoustic parameter, including a non-acoustic glottal shape parameter derived from averaging multiple glottal cycle waveforms.
NASA Astrophysics Data System (ADS)
Kumar, R.; Barani, G.; Jagadeesan, S.
2012-10-01
This article reviews the various approaches to defining and Measuring Brand Equity. CRM strategy (Customer Relationship Management) is a business philosophy, stemming from relationship marketing that joins strategy and technology, with the aim of creating value for both customers and the company. In this paper we justify the interest of establishing a formal system to measure CRM performance. It analyses the diverse views regarding the set of attributes relevant for measurement of Brand equity. Existing measures of brand equity have been classified into three categories for the discussion in the paper. One set of measures are those focusing on outcome of Brand Equity at the product market level, the second category is that of measures related to customer mindset while the third set is based on measurement of financial parameters. The paper presents a comprehensive review of the work done by various researchers over the last few decades. It analyses the merits and limitations of the different types of measures. Based on the observations made by experts in related literature the authors suggest the scope for further research in the discipline.
Dipolarization Fronts from Reconnection Onset
NASA Astrophysics Data System (ADS)
Sitnov, M. I.; Swisdak, M. M.; Merkin, V. G.; Buzulukova, N.; Moore, T. E.
2012-12-01
Dipolarization fronts observed in the magnetotail are often viewed as signatures of bursty magnetic reconnection. However, until recently spontaneous reconnection was considered to be fully prohibited in the magnetotail geometry because of the linear stability of the ion tearing mode. Recent theoretical studies showed that spontaneous reconnection could be possible in the magnetotail geometries with the accumulation of magnetic flux at the tailward end of the thin current sheet, a distinctive feature of the magnetotail prior to substorm onset. That result was confirmed by open-boundary full-particle simulations of 2D current sheet equilibria, where two magnetotails were separated by an equilibrium X-line and weak external electric field was imposed to nudge the system toward the instability threshold. To investigate the roles of the equilibrium X-line, driving electric field and other parameters in the reconnection onset process we performed a set of 2D PIC runs with different initial settings. The investigated parameter space includes the critical current sheet thickness, flux tube volume per unit magnetic flux and the north-south component of the magnetic field. Such an investigation is critically important for the implementation of kinetic reconnection onset criteria into global MHD codes. The results are compared with Geotail visualization of the magnetotail during substorms, as well as Cluster and THEMIS observations of dipolarization fronts.
A 3D Freehand Ultrasound System for Multi-view Reconstructions from Sparse 2D Scanning Planes
2011-01-01
Background A significant limitation of existing 3D ultrasound systems comes from the fact that the majority of them work with fixed acquisition geometries. As a result, the users have very limited control over the geometry of the 2D scanning planes. Methods We present a low-cost and flexible ultrasound imaging system that integrates several image processing components to allow for 3D reconstructions from limited numbers of 2D image planes and multiple acoustic views. Our approach is based on a 3D freehand ultrasound system that allows users to control the 2D acquisition imaging using conventional 2D probes. For reliable performance, we develop new methods for image segmentation and robust multi-view registration. We first present a new hybrid geometric level-set approach that provides reliable segmentation performance with relatively simple initializations and minimum edge leakage. Optimization of the segmentation model parameters and its effect on performance is carefully discussed. Second, using the segmented images, a new coarse to fine automatic multi-view registration method is introduced. The approach uses a 3D Hotelling transform to initialize an optimization search. Then, the fine scale feature-based registration is performed using a robust, non-linear least squares algorithm. The robustness of the multi-view registration system allows for accurate 3D reconstructions from sparse 2D image planes. Results Volume measurements from multi-view 3D reconstructions are found to be consistently and significantly more accurate than measurements from single view reconstructions. The volume error of multi-view reconstruction is measured to be less than 5% of the true volume. We show that volume reconstruction accuracy is a function of the total number of 2D image planes and the number of views for calibrated phantom. In clinical in-vivo cardiac experiments, we show that volume estimates of the left ventricle from multi-view reconstructions are found to be in better agreement with clinical measures than measures from single view reconstructions. Conclusions Multi-view 3D reconstruction from sparse 2D freehand B-mode images leads to more accurate volume quantification compared to single view systems. The flexibility and low-cost of the proposed system allow for fine control of the image acquisition planes for optimal 3D reconstructions from multiple views. PMID:21251284
A 3D freehand ultrasound system for multi-view reconstructions from sparse 2D scanning planes.
Yu, Honggang; Pattichis, Marios S; Agurto, Carla; Beth Goens, M
2011-01-20
A significant limitation of existing 3D ultrasound systems comes from the fact that the majority of them work with fixed acquisition geometries. As a result, the users have very limited control over the geometry of the 2D scanning planes. We present a low-cost and flexible ultrasound imaging system that integrates several image processing components to allow for 3D reconstructions from limited numbers of 2D image planes and multiple acoustic views. Our approach is based on a 3D freehand ultrasound system that allows users to control the 2D acquisition imaging using conventional 2D probes.For reliable performance, we develop new methods for image segmentation and robust multi-view registration. We first present a new hybrid geometric level-set approach that provides reliable segmentation performance with relatively simple initializations and minimum edge leakage. Optimization of the segmentation model parameters and its effect on performance is carefully discussed. Second, using the segmented images, a new coarse to fine automatic multi-view registration method is introduced. The approach uses a 3D Hotelling transform to initialize an optimization search. Then, the fine scale feature-based registration is performed using a robust, non-linear least squares algorithm. The robustness of the multi-view registration system allows for accurate 3D reconstructions from sparse 2D image planes. Volume measurements from multi-view 3D reconstructions are found to be consistently and significantly more accurate than measurements from single view reconstructions. The volume error of multi-view reconstruction is measured to be less than 5% of the true volume. We show that volume reconstruction accuracy is a function of the total number of 2D image planes and the number of views for calibrated phantom. In clinical in-vivo cardiac experiments, we show that volume estimates of the left ventricle from multi-view reconstructions are found to be in better agreement with clinical measures than measures from single view reconstructions. Multi-view 3D reconstruction from sparse 2D freehand B-mode images leads to more accurate volume quantification compared to single view systems. The flexibility and low-cost of the proposed system allow for fine control of the image acquisition planes for optimal 3D reconstructions from multiple views.
Implementing an Automated Antenna Measurement System
NASA Technical Reports Server (NTRS)
Valerio, Matthew D.; Romanofsky, Robert R.; VanKeuls, Fred W.
2003-01-01
We developed an automated measurement system using a PC running a LabView application, a Velmex BiSlide X-Y positioner, and a HP85l0C network analyzer. The system provides high positioning accuracy and requires no user supervision. After the user inputs the necessary parameters into the LabView application, LabView controls the motor positioning and performs the data acquisition. Current parameters and measured data are shown on the PC display in two 3-D graphs and updated after every data point is collected. The final output is a formatted data file for later processing.
Characterization of BEGe detectors in the HADES underground laboratory
NASA Astrophysics Data System (ADS)
Andreotti, Erica; Gerda Collaboration
2013-08-01
This paper describes the characterization of newly produced Broad Energy Germanium (BEGe) detectors, enriched in the isotope 76Ge. These detectors have been produced in the frame of the GERDA experiment. The aim of the characterization campaign consists in the determination of all the important operational parameters (active volume, dead layer thickness and uniformity, energy resolution, detector stability in time, quality of pulse shape discrimination). A protocol test procedure and devoted set-ups, partially automated, have been developed in view of the large number (∼ 25) of BEGe's detectors to be tested. The characterization is carried out in the HADES underground laboratory, located 225 m below ground (∼ 500 m water equivalent) in Mol, Belgium.
Automated selection of computed tomography display parameters using neural networks
NASA Astrophysics Data System (ADS)
Zhang, Di; Neu, Scott; Valentino, Daniel J.
2001-07-01
A collection of artificial neural networks (ANN's) was trained to identify simple anatomical structures in a set of x-ray computed tomography (CT) images. These neural networks learned to associate a point in an image with the anatomical structure containing the point by using the image pixels located on the horizontal and vertical lines that ran through the point. The neural networks were integrated into a computer software tool whose function is to select an index into a list of CT window/level values from the location of the user's mouse cursor. Based upon the anatomical structure selected by the user, the software tool automatically adjusts the image display to optimally view the structure.
NASA Technical Reports Server (NTRS)
Deau, E. A.; Spilker, L. J.; Morishima, R.; Brooks, S.; Pilorz, S.; Altobelli, N.
2011-01-01
After more than six years in orbit around Saturn, the Cassini Composite Infrared Spectrometer (CIRS) has acquired an extensive set of measurements of Saturn's main rings (A, B, C and Cassini Division) in the thermal infrared. Temperatures were retrieved for the lit and unlit rings over a variety of ring geometries that include phase angle, solar and spacecraft elevations and local time. We show that some of these parameters (solar and spacecraft elevations, phase angle) play a role in the temperature variations in the first order, while the others (ring and particle local time) produced second order effects. The results of this comparison will be presented.
Coherent broadband sonar signal processing with the environmentally corrected matched filter
NASA Astrophysics Data System (ADS)
Camin, Henry John, III
The matched filter is the standard approach for coherently processing active sonar signals, where knowledge of the transmitted waveform is used in the detection and parameter estimation of received echoes. Matched filtering broadband signals provides higher levels of range resolution and reverberation noise suppression than can be realized through narrowband processing. Since theoretical processing gains are proportional to the signal bandwidth, it is typically desirable to utilize the widest band signals possible. However, as signal bandwidth increases, so do environmental effects that tend to decrease correlation between the received echo and the transmitted waveform. This is especially true for ultra wideband signals, where the bandwidth exceeds an octave or approximately 70% fractional bandwidth. This loss of coherence often results in processing gains and range resolution much lower than theoretically predicted. Wiener filtering, commonly used in image processing to improve distorted and noisy photos, is investigated in this dissertation as an approach to correct for these environmental effects. This improved signal processing, Environmentally Corrected Matched Filter (ECMF), first uses a Wiener filter to estimate the environmental transfer function and then again to correct the received signal using this estimate. This process can be viewed as a smarter inverse or whitening filter that adjusts behavior according to the signal to noise ratio across the spectrum. Though the ECMF is independent of bandwidth, it is expected that ultra wideband signals will see the largest improvement, since they tend to be more impacted by environmental effects. The development of the ECMF and demonstration of improved parameter estimation with its use are the primary emphases in this dissertation. Additionally, several new contributions to the field of sonar signal processing made in conjunction with the development of the ECMF are described. A new, nondimensional wideband ambiguity function is presented as a way to view the behavior of the matched filter with and without the decorrelating environmental effects; a new, integrated phase broadband angle estimation method is developed and compared to existing methods; and a new, asymptotic offset phase angle variance model is presented. Several data sets are used to demonstrate these new contributions. High fidelity Sonar Simulation Toolset (SST) synthetic data is used to characterize the theoretical performance. Two in-water data sets were used to verify assumptions that were made during the development of the ECMF. Finally, a newly collected in-air data set containing ultra wideband signals was used in lieu of a cost prohibitive underwater experiment to demonstrate the effectiveness of the ECMF at improving parameter estimates.
Use of geographic information management systems (GIMS) for nitrogen management
NASA Astrophysics Data System (ADS)
Diker, Kenan
1998-11-01
Geographic Information Management Systems (GIMS) was investigated in this study to develop an efficient nitrogen management scheme for corn. The study was conducted on two experimental corn sites. The first site consisted of six non-replicated plots where the canopy reflectance of corn at six nitrogen fertilizer levels was investigated. The reflectance measurements were conducted for nadir and 75sp° view angles. Data from these plots were used to develop relationships between reflectance data and soil and plant parameters. The second site had four corn plots fertilized by different methods such as spoon-fed, pre-plant and side-dress, which created nitrogen variability within the field. Soil and plant nitrogen as well as leaf area, biomass, percent cover measurements, and canopy reflectance data were collected at various growth stages from both sites during the 1995 and 1996 growing seasons. Relationships were developed between the Nitrogen Reflectance Index (NRI) developed by Bausch et al. (1994) and soil and plant variables. Spatial dependence of data was determined by geostatistical methods; variability was mapped in ArcView. Results of this study indicated that the NRI is a better estimator of plant nitrogen status than chlorophyll meter measurements. The NRI can successfully be used to estimate the spatial distribution of soil nitrogen estimates through the plant nitrogen status as well as plant parameters and the yield potential. GIS mapping of measured and estimated soil nitrogen agreed except in locations where hot spots were measured. The NRI value of 0.95 seemed to be the critical value for plant nitrogen status especially for the 75sp° view. The nadir view tended to underestimate plant and soil parameters, whereas, the 75sp° view slightly overestimated these parameters. If available, the 75sp° view data should be used before the tasseling stage for reflectance measurements to reduce the soil background effect. However, it is sensitive to windy conditions. After tasseling, the nadir view should be used because the 75sp° view is obstructed by tassels. Total soil nitrogen at the V6 growth stage was underestimated by the NRI for both view angles. Results also indicated that a nitrogen prescription could be estimated at various growth stages.
Controlled Vocabulary Service Application for Environmental Data Store
NASA Astrophysics Data System (ADS)
Ji, P.; Piasecki, M.; Lovell, R.
2013-12-01
In this paper we present a controlled vocabulary service application for Environmental Data Store (EDS). The purpose for such application is to help researchers and investigators to archive, manage, share, search, and retrieve data efficiently in EDS. The Simple Knowledge Organization System (SKOS) is used in the application for the representation of the controlled vocabularies coming from EDS. The controlled vocabularies of EDS are created by collecting, comparing, choosing and merging controlled vocabularies, taxonomies and ontologies widely used and recognized in geoscience/environmental informatics community, such as Environment ontology (EnvO), Semantic Web for Earth and Environmental Terminology (SWEET) ontology, CUAHSI Hydrologic Ontology and ODM Controlled Vocabulary, National Environmental Methods Index (NEMI), National Water Information System (NWIS) codes, EPSG Geodetic Parameter Data Set, WQX domain value etc. TemaTres, an open-source, web -based thesaurus management package is employed and extended to create and manage controlled vocabularies of EDS in the application. TemaTresView and VisualVocabulary that work well with TemaTres, are also integrated in the application to provide tree view and graphical view of the structure of vocabularies. The Open Source Edition of Virtuoso Universal Server is set up to provide a Web interface to make SPARQL queries against controlled vocabularies hosted on the Environmental Data Store. The replicas of some of the key vocabularies commonly used in the community, are also maintained as part of the application, such as General Multilingual Environmental Thesaurus (GEMET), NetCDF Climate and Forecast (CF) Standard Names, etc.. The application has now been deployed as an elementary and experimental prototype that provides management, search and download controlled vocabularies of EDS under SKOS framework.
Sentinel-2A image quality commissioning phase final results: geometric calibration and performances
NASA Astrophysics Data System (ADS)
Languille, F.; Gaudel, A.; Dechoz, C.; Greslou, D.; de Lussy, F.; Trémas, T.; Poulain, V.; Massera, S.
2016-10-01
In the frame of the Copernicus program of the European Commission, Sentinel-2 offers multispectral high-spatial-resolution optical images over global terrestrial surfaces. In cooperation with ESA, the Centre National d'Etudes Spatiales (CNES) is in charge of the image quality of the project, and so ensures the CAL/VAL commissioning phase during the months following the launch. Sentinel-2 is a constellation of 2 satellites on a polar sun-synchronous orbit with a revisit time of 5 days (with both satellites), a high field of view - 290km, 13 spectral bands in visible and shortwave infrared, and high spatial resolution - 10m, 20m and 60m. The Sentinel-2 mission offers a global coverage over terrestrial surfaces. The satellites acquire systematically terrestrial surfaces under the same viewing conditions in order to have temporal images stacks. The first satellite was launched in June 2015. Following the launch, the CAL/VAL commissioning phase is then lasting during 6 months for geometrical calibration. This paper will point on observations and results seen on Sentinel-2 images during commissioning phase. It will provide explanations about Sentinel-2 products delivered with geometric corrections. This paper will detail calibration sites, and the methods used for geometrical parameters calibration and will present linked results. The following topics will be presented: viewing frames orientation assessment, focal plane mapping for all spectral bands, results on geolocation assessment, and multispectral registration. There is a systematic images recalibration over a same reference which is a set of S2 images produced during the 6 months of CAL/VAL. This set of images will be presented as well as the geolocation performance and the multitemporal performance after refining over this ground reference.
Reference View Selection in DIBR-Based Multiview Coding.
Maugey, Thomas; Petrazzuoli, Giovanni; Frossard, Pascal; Cagnazzo, Marco; Pesquet-Popescu, Beatrice
2016-04-01
Augmented reality, interactive navigation in 3D scenes, multiview video, and other emerging multimedia applications require large sets of images, hence larger data volumes and increased resources compared with traditional video services. The significant increase in the number of images in multiview systems leads to new challenging problems in data representation and data transmission to provide high quality of experience on resource-constrained environments. In order to reduce the size of the data, different multiview video compression strategies have been proposed recently. Most of them use the concept of reference or key views that are used to estimate other images when there is high correlation in the data set. In such coding schemes, the two following questions become fundamental: 1) how many reference views have to be chosen for keeping a good reconstruction quality under coding cost constraints? And 2) where to place these key views in the multiview data set? As these questions are largely overlooked in the literature, we study the reference view selection problem and propose an algorithm for the optimal selection of reference views in multiview coding systems. Based on a novel metric that measures the similarity between the views, we formulate an optimization problem for the positioning of the reference views, such that both the distortion of the view reconstruction and the coding rate cost are minimized. We solve this new problem with a shortest path algorithm that determines both the optimal number of reference views and their positions in the image set. We experimentally validate our solution in a practical multiview distributed coding system and in the standardized 3D-HEVC multiview coding scheme. We show that considering the 3D scene geometry in the reference view, positioning problem brings significant rate-distortion improvements and outperforms the traditional coding strategy that simply selects key frames based on the distance between cameras.
NASA Astrophysics Data System (ADS)
Tubino, Federica
2018-03-01
The effect of human-structure interaction in the vertical direction for footbridges is studied based on a probabilistic approach. The bridge is modeled as a continuous dynamic system, while pedestrians are schematized as moving single-degree-of-freedom systems with random dynamic properties. The non-dimensional form of the equations of motion allows us to obtain results that can be applied in a very wide set of cases. An extensive Monte Carlo simulation campaign is performed, varying the main non-dimensional parameters identified, and the mean values and coefficients of variation of the damping ratio and of the non-dimensional natural frequency of the coupled system are reported. The results obtained can be interpreted from two different points of view. If the characterization of pedestrians' equivalent dynamic parameters is assumed as uncertain, as revealed from a current literature review, then the paper provides a range of possible variations of the coupled system damping ratio and natural frequency as a function of pedestrians' parameters. Assuming that a reliable characterization of pedestrians' dynamic parameters is available (which is not the case at present, but could be in the future), the results presented can be adopted to estimate the damping ratio and natural frequency of the coupled footbridge-pedestrian system for a very wide range of real structures.
A unified convention for biological assemblies with helical symmetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsai, Chung-Jung, E-mail: tsaic@mail.nih.gov; Nussinov, Ruth; Sackler School of Medicine, Tel Aviv University, Tel Aviv 69978
A new representation of helical structure by four parameters, [n{sub 1}, n{sub 2}, twist, rise], is able to generate an entire helical construct from asymmetric units, including cases of helical assembly with a seam. Assemblies with helical symmetry can be conveniently formulated in many distinct ways. Here, a new convention is presented which unifies the two most commonly used helical systems for generating helical assemblies from asymmetric units determined by X-ray fibre diffraction and EM imaging. A helical assembly is viewed as being composed of identical repetitive units in a one- or two-dimensional lattice, named 1-D and 2-D helical systems,more » respectively. The unification suggests that a new helical description with only four parameters [n{sub 1}, n{sub 2}, twist, rise], which is called the augmented 1-D helical system, can generate the complete set of helical arrangements, including coverage of helical discontinuities (seams). A unified four-parameter characterization implies similar parameters for similar assemblies, can eliminate errors in reproducing structures of helical assemblies and facilitates the generation of polymorphic ensembles from helical atomic models or EM density maps. Further, guidelines are provided for such a unique description that reflects the structural signature of an assembly, as well as rules for manipulating the helical symmetry presentation.« less
Comparison of CME three-dimensional parameters derived from single and multi-spacecraft
NASA Astrophysics Data System (ADS)
LEE, Harim; Moon, Yong-Jae; Na, Hyeonock; Jang, Soojeong
2014-06-01
Several geometrical models (e.g., cone and flux rope models) have been suggested to infer three-dimensional parameters of CMEs using multi-view observations (STEREO/SECCHI) and single-view observations (SOHO/LASCO). To prepare for when only single view observations are available, we have made a test whether the cone model parameters from single-view observations are consistent with those from multi-view ones. For this test, we select 35 CMEs which are identified as CMEs, whose angular widths are larger than 180 degrees, by one spacecraft and as limb CMEs by the other ones. For this we use SOHO/LASCO and STEREO/SECCHI data during the period from 2010 December to 2011 July when two spacecraft were separated by 90±10 degrees. In this study, we compare the 3-D parameters of these CMEs from three different methods: (1) a triangulation method using STEREO/SECCHI and SOHO/LASCO data, (2) a Graduated Cylindrical Shell (GCS) flux rope model using STEREO/SECCHI data, and (3) an ice cream cone model using SOHO/LASCO data. The parameters used for comparison are radial velocities, angular widths and source location (angle γ between the propagation direction and the plan of the sky). We find that the radial velocities and the γ-values from three methods are well correlated with one another (CC > 0.8). However, angular widths from the three methods are somewhat different with the correlation coefficients of CC > 0.4. We also find that the correlation coefficients between the locations from the three methods and the active region locations are larger than 0.9, implying that most of the CMEs are radially ejected.
A technique for automatically extracting useful field of view and central field of view images.
Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar
2016-01-01
It is essential to ensure the uniform response of the single photon emission computed tomography gamma camera system before using it for the clinical studies by exposing it to uniform flood source. Vendor specific acquisition and processing protocol provide for studying flood source images along with the quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min depending both on the activity of Cobalt-57 flood source and the pre specified counts in the vendors protocol (usually 4000K-10,000K counts). In case the acquired total counts are less than the total prespecified counts, and then the vendor's uniformity processing protocol does not precede with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, remove unwanted information, and automatically extract and save the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on Ubuntu Operating system and was verified by subjecting it to the simulated and real flood sources images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in lesser time with fewer constraints.
Kernel-Based Sensor Fusion With Application to Audio-Visual Voice Activity Detection
NASA Astrophysics Data System (ADS)
Dov, David; Talmon, Ronen; Cohen, Israel
2016-12-01
In this paper, we address the problem of multiple view data fusion in the presence of noise and interferences. Recent studies have approached this problem using kernel methods, by relying particularly on a product of kernels constructed separately for each view. From a graph theory point of view, we analyze this fusion approach in a discrete setting. More specifically, based on a statistical model for the connectivity between data points, we propose an algorithm for the selection of the kernel bandwidth, a parameter, which, as we show, has important implications on the robustness of this fusion approach to interferences. Then, we consider the fusion of audio-visual speech signals measured by a single microphone and by a video camera pointed to the face of the speaker. Specifically, we address the task of voice activity detection, i.e., the detection of speech and non-speech segments, in the presence of structured interferences such as keyboard taps and office noise. We propose an algorithm for voice activity detection based on the audio-visual signal. Simulation results show that the proposed algorithm outperforms competing fusion and voice activity detection approaches. In addition, we demonstrate that a proper selection of the kernel bandwidth indeed leads to improved performance.
Visualization and processing of computed solid-state NMR parameters: MagresView and MagresPython.
Sturniolo, Simone; Green, Timothy F G; Hanson, Robert M; Zilka, Miri; Refson, Keith; Hodgkinson, Paul; Brown, Steven P; Yates, Jonathan R
2016-09-01
We introduce two open source tools to aid the processing and visualisation of ab-initio computed solid-state NMR parameters. The Magres file format for computed NMR parameters (as implemented in CASTEP v8.0 and QuantumEspresso v5.0.0) is implemented. MagresView is built upon the widely used Jmol crystal viewer, and provides an intuitive environment to display computed NMR parameters. It can provide simple pictorial representation of one- and two-dimensional NMR spectra as well as output a selected spin-system for exact simulations with dedicated spin-dynamics software. MagresPython provides a simple scripting environment to manipulate large numbers of computed NMR parameters to search for structural correlations. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Low Cost and Efficient 3d Indoor Mapping Using Multiple Consumer Rgb-D Cameras
NASA Astrophysics Data System (ADS)
Chen, C.; Yang, B. S.; Song, S.
2016-06-01
Driven by the miniaturization, lightweight of positioning and remote sensing sensors as well as the urgent needs for fusing indoor and outdoor maps for next generation navigation, 3D indoor mapping from mobile scanning is a hot research and application topic. The point clouds with auxiliary data such as colour, infrared images derived from 3D indoor mobile mapping suite can be used in a variety of novel applications, including indoor scene visualization, automated floorplan generation, gaming, reverse engineering, navigation, simulation and etc. State-of-the-art 3D indoor mapping systems equipped with multiple laser scanners product accurate point clouds of building interiors containing billions of points. However, these laser scanner based systems are mostly expensive and not portable. Low cost consumer RGB-D Cameras provides an alternative way to solve the core challenge of indoor mapping that is capturing detailed underlying geometry of the building interiors. Nevertheless, RGB-D Cameras have a very limited field of view resulting in low efficiency in the data collecting stage and incomplete dataset that missing major building structures (e.g. ceilings, walls). Endeavour to collect a complete scene without data blanks using single RGB-D Camera is not technic sound because of the large amount of human labour and position parameters need to be solved. To find an efficient and low cost way to solve the 3D indoor mapping, in this paper, we present an indoor mapping suite prototype that is built upon a novel calibration method which calibrates internal parameters and external parameters of multiple RGB-D Cameras. Three Kinect sensors are mounted on a rig with different view direction to form a large field of view. The calibration procedure is three folds: 1, the internal parameters of the colour and infrared camera inside each Kinect are calibrated using a chess board pattern, respectively; 2, the external parameters between the colour and infrared camera inside each Kinect are calibrated using a chess board pattern; 3, the external parameters between every Kinect are firstly calculated using a pre-set calibration field and further refined by an iterative closet point algorithm. Experiments are carried out to validate the proposed method upon RGB-D datasets collected by the indoor mapping suite prototype. The effectiveness and accuracy of the proposed method is evaluated by comparing the point clouds derived from the prototype with ground truth data collected by commercial terrestrial laser scanner at ultra-high density. The overall analysis of the results shows that the proposed method achieves seamless integration of multiple point clouds form different RGB-D cameras collected at 30 frame per second.
Parents' views and experiences of childhood obesity management in primary care: a qualitative study.
Turner, Katrina M; Salisbury, Chris; Shield, Julian P H
2012-08-01
Primary care has been viewed as an appropriate setting for childhood obesity management. Little is known about parents' views and experiences of obesity management within this clinical setting. These views and experiences need to be explored, as they could affect treatment success. To explore parents' views and experiences of primary care as a treatment setting for childhood obesity. In-depth interviews were held with 15 parents of obese children aged 5-10 years, to explore their views and experiences of primary care childhood obesity management. Parents were contacted via a hospital-based childhood obesity clinic, general practices and Mind, Exercise, Nutrition … Do it! (MEND) groups based in Bristol, England. The interviews were audio-taped transcribed verbatim and analysed thematically. Parents viewed primary care as an appropriate setting in which to treat childhood obesity but were reluctant to consult due to a fear of being blamed for their child's weight and a concern about their child's mental well-being. They also questioned whether practitioners had the knowledge, time and resources to effectively manage childhood obesity. Parents varied in the extent to which they had found consulting a practitioner helpful, and their accounts suggested that GPs and school nurses offer different types of support. Parents need to be reassured that practitioners will address their child's weight in a non-judgemental sensitive manner and are able to treat childhood obesity effectively. A multidisciplinary team approach might benefit a child, as different practitioners may vary in the type of care they provide.
Development, Integration and Testing of Automated Triggering Circuit for Hybrid DC Circuit Breaker
NASA Astrophysics Data System (ADS)
Kanabar, Deven; Roy, Swati; Dodiya, Chiragkumar; Pradhan, Subrata
2017-04-01
A novel concept of Hybrid DC circuit breaker having combination of mechanical switch and static switch provides arc-less current commutation into the dump resistor during quench in superconducting magnet operation. The triggering of mechanical and static switches in Hybrid DC breaker can be automatized which can effectively reduce the overall current commutation time of hybrid DC circuit breaker and make the operation independent of opening time of mechanical switch. With this view, a dedicated control circuit (auto-triggering circuit) has been developed which can decide the timing and pulse duration for mechanical switch as well as static switch from the operating parameters. This circuit has been tested with dummy parameters and thereafter integrated with the actual test set up of hybrid DC circuit breaker. This paper deals with the conceptual design of the auto-triggering circuit, its control logic and operation. The test results of Hybrid DC circuit breaker using this circuit have also been discussed.
On evaluating the robustness of spatial-proximity-based regionalization methods
NASA Astrophysics Data System (ADS)
Lebecherel, Laure; Andréassian, Vazken; Perrin, Charles
2016-08-01
In absence of streamflow data to calibrate a hydrological model, its parameters are to be inferred by a regionalization method. In this technical note, we discuss a specific class of regionalization methods, those based on spatial proximity, which transfers hydrological information (typically calibrated parameter sets) from neighbor gauged stations to the target ungauged station. The efficiency of any spatial-proximity-based regionalization method will depend on the density of the available streamgauging network, and the purpose of this note is to discuss how to assess the robustness of the regionalization method (i.e., its resilience to an increasingly sparse hydrometric network). We compare two options: (i) the random hydrometrical reduction (HRand) method, which consists in sub-sampling the existing gauging network around the target ungauged station, and (ii) the hydrometrical desert method (HDes), which consists in ignoring the closest gauged stations. Our tests suggest that the HDes method should be preferred, because it provides a more realistic view on regionalization performance.
NASA Astrophysics Data System (ADS)
Xiong, Yan; Reichenbach, Stephen E.
1999-01-01
Understanding of hand-written Chinese characters is at such a primitive stage that models include some assumptions about hand-written Chinese characters that are simply false. So Maximum Likelihood Estimation (MLE) may not be an optimal method for hand-written Chinese characters recognition. This concern motivates the research effort to consider alternative criteria. Maximum Mutual Information Estimation (MMIE) is an alternative method for parameter estimation that does not derive its rationale from presumed model correctness, but instead examines the pattern-modeling problem in automatic recognition system from an information- theoretic point of view. The objective of MMIE is to find a set of parameters in such that the resultant model allows the system to derive from the observed data as much information as possible about the class. We consider MMIE for recognition of hand-written Chinese characters using on a simplified hidden Markov Random Field. MMIE provides improved performance improvement over MLE in this application.
Neural Net Gains Estimation Based on an Equivalent Model
Aguilar Cruz, Karen Alicia; Medel Juárez, José de Jesús; Fernández Muñoz, José Luis; Esmeralda Vigueras Velázquez, Midory
2016-01-01
A model of an Equivalent Artificial Neural Net (EANN) describes the gains set, viewed as parameters in a layer, and this consideration is a reproducible process, applicable to a neuron in a neural net (NN). The EANN helps to estimate the NN gains or parameters, so we propose two methods to determine them. The first considers a fuzzy inference combined with the traditional Kalman filter, obtaining the equivalent model and estimating in a fuzzy sense the gains matrix A and the proper gain K into the traditional filter identification. The second develops a direct estimation in state space, describing an EANN using the expected value and the recursive description of the gains estimation. Finally, a comparison of both descriptions is performed; highlighting the analytical method describes the neural net coefficients in a direct form, whereas the other technique requires selecting into the Knowledge Base (KB) the factors based on the functional error and the reference signal built with the past information of the system. PMID:27366146
Neural Net Gains Estimation Based on an Equivalent Model.
Aguilar Cruz, Karen Alicia; Medel Juárez, José de Jesús; Fernández Muñoz, José Luis; Esmeralda Vigueras Velázquez, Midory
2016-01-01
A model of an Equivalent Artificial Neural Net (EANN) describes the gains set, viewed as parameters in a layer, and this consideration is a reproducible process, applicable to a neuron in a neural net (NN). The EANN helps to estimate the NN gains or parameters, so we propose two methods to determine them. The first considers a fuzzy inference combined with the traditional Kalman filter, obtaining the equivalent model and estimating in a fuzzy sense the gains matrix A and the proper gain K into the traditional filter identification. The second develops a direct estimation in state space, describing an EANN using the expected value and the recursive description of the gains estimation. Finally, a comparison of both descriptions is performed; highlighting the analytical method describes the neural net coefficients in a direct form, whereas the other technique requires selecting into the Knowledge Base (KB) the factors based on the functional error and the reference signal built with the past information of the system.
The Transition Region Explorer: Observing the Multi-Scale Dynamics of Geospace
NASA Astrophysics Data System (ADS)
Donovan, E.
2015-12-01
Meso- and global-scale IT remote sensing is accomplished via satellite imagers and ground-based instruments. On the ground, the approach is arrays providing extensive as possible coverage (the "net") and powerful observatories that drill deep to provide detailed information about small-scale processes (the "drill"). Always, there is a trade between cost, spatial resolution, coverage (extent), number of parameters, and more, such that in general the larger the network the sparser the coverage. Where are we now? There are important gaps. With THEMIS-ASI, we see processes that quickly evolve beyond the field of view of one observatory, but involve space/time scales not captured by existing meso- and large-scale arrays. Many forefront questions require observations at heretofore unexplored space and time scales, and comprehensive inter-hemispheric conjugate observations than are presently available. To address this, a new ground-based observing initiative is being developed in Canada. Called TREx, for Transition Region Explorer, this new facility will incorporate dedicated blueline, redline, and Near-Infrared All-Sky Imagers, together with an unprecedented network of ten imaging riometers, with a combined field of view spanning more than three hours of magnetic local time and from equatorward to poleward of typical auroral latitudes (spanning the ionospheric footprint of the "nightside transition region" that separates the highly stretched tail and the inner magnetosphere). The TREx field-of-view is covered by HF radars, and contains a dense network of magnetometers and VLF receivers, as well as other geospace and upper atmospheric remote sensors. Taken together, TREx and these co-located instruments represent a quantum leap forward in terms of imaging, in multiple parameters (precipitation, ionization, convection, and currents), ionospheric dynamics in the above-mentioned scale gap. This represents an exciting new opportunity for studying geospace at the system level, especially for using the aurora to remote sense magnetospheric plasma physics and dynamics, and comes with a set of Big Data challenges that are going to be exciting. One such challenge is the development of a fundamentally new type of data product, namely time series of multi-parameter, geospatially referenced 'data cubes'.
Haimovitz, Kyla; Dweck, Carol S
2016-06-01
Children's intelligence mind-sets (i.e., their beliefs about whether intelligence is fixed or malleable) robustly influence their motivation and learning. Yet, surprisingly, research has not linked parents' intelligence mind-sets to their children's. We tested the hypothesis that a different belief of parents-their failure mind-sets-may be more visible to children and therefore more prominent in shaping their beliefs. In Study 1, we found that parents can view failure as debilitating or enhancing, and that these failure mind-sets predict parenting practices and, in turn, children's intelligence mind-sets. Study 2 probed more deeply into how parents display failure mind-sets. In Study 3a, we found that children can indeed accurately perceive their parents' failure mind-sets but not their parents' intelligence mind-sets. Study 3b showed that children's perceptions of their parents' failure mind-sets also predicted their own intelligence mind-sets. Finally, Study 4 showed a causal effect of parents' failure mind-sets on their responses to their children's hypothetical failure. Overall, parents who see failure as debilitating focus on their children's performance and ability rather than on their children's learning, and their children, in turn, tend to believe that intelligence is fixed rather than malleable. © The Author(s) 2016.
A Regionalization Approach to select the final watershed parameter set among the Pareto solutions
NASA Astrophysics Data System (ADS)
Park, G. H.; Micheletty, P. D.; Carney, S.; Quebbeman, J.; Day, G. N.
2017-12-01
The calibration of hydrological models often results in model parameters that are inconsistent with those from neighboring basins. Considering that physical similarity exists within neighboring basins some of the physically related parameters should be consistent among them. Traditional manual calibration techniques require an iterative process to make the parameters consistent, which takes additional effort in model calibration. We developed a multi-objective optimization procedure to calibrate the National Weather Service (NWS) Research Distributed Hydrological Model (RDHM), using the Nondominant Sorting Genetic Algorithm (NSGA-II) with expert knowledge of the model parameter interrelationships one objective function. The multi-objective algorithm enables us to obtain diverse parameter sets that are equally acceptable with respect to the objective functions and to choose one from the pool of the parameter sets during a subsequent regionalization step. Although all Pareto solutions are non-inferior, we exclude some of the parameter sets that show extremely values for any of the objective functions to expedite the selection process. We use an apriori model parameter set derived from the physical properties of the watershed (Koren et al., 2000) to assess the similarity for a given parameter across basins. Each parameter is assigned a weight based on its assumed similarity, such that parameters that are similar across basins are given higher weights. The parameter weights are useful to compute a closeness measure between Pareto sets of nearby basins. The regionalization approach chooses the Pareto parameter sets that minimize the closeness measure of the basin being regionalized. The presentation will describe the results of applying the regionalization approach to a set of pilot basins in the Upper Colorado basin as part of a NASA-funded project.
1. View looking north toward downtown, showing setting/context and south ...
1. View looking north toward downtown, showing setting/context and south approach. Showing French-Thompson (Rumford) House, gas holder, railroad switch house and bridge. - Water Street Bridge, Spanning Boston & Maine Railroad tracks at Water Street (U.S. Route 3), Concord, Merrimack County, NH
NASA Astrophysics Data System (ADS)
Wei, Jun; Sahiner, Berkman; Hadjiiski, Lubomir M.; Chan, Heang-Ping; Helvie, Mark A.; Roubidoux, Marilyn A.; Zhou, Chuan; Ge, Jun; Zhang, Yiheng
2006-03-01
We are developing a two-view information fusion method to improve the performance of our CAD system for mass detection. Mass candidates on each mammogram were first detected with our single-view CAD system. Potential object pairs on the two-view mammograms were then identified by using the distance between the object and the nipple. Morphological features, Hessian feature, correlation coefficients between the two paired objects and texture features were used as input to train a similarity classifier that estimated a similarity scores for each pair. Finally, a linear discriminant analysis (LDA) classifier was used to fuse the score from the single-view CAD system and the similarity score. A data set of 475 patients containing 972 mammograms with 475 biopsy-proven masses was used to train and test the CAD system. All cases contained the CC view and the MLO or LM view. We randomly divided the data set into two independent sets of 243 cases and 232 cases. The training and testing were performed using the 2-fold cross validation method. The detection performance of the CAD system was assessed by free response receiver operating characteristic (FROC) analysis. The average test FROC curve was obtained from averaging the FP rates at the same sensitivity along the two corresponding test FROC curves from the 2-fold cross validation. At the case-based sensitivities of 90%, 85% and 80% on the test set, the single-view CAD system achieved an FP rate of 2.0, 1.5, and 1.2 FPs/image, respectively. With the two-view fusion system, the FP rates were reduced to 1.7, 1.3, and 1.0 FPs/image, respectively, at the corresponding sensitivities. The improvement was found to be statistically significant (p<0.05) by the AFROC method. Our results indicate that the two-view fusion scheme can improve the performance of mass detection on mammograms.
An investigation of flat panel equipment variables on image quality with a dedicated cardiac phantom
NASA Astrophysics Data System (ADS)
Dragusin, O.; Bosmans, H.; Pappas, C.; Desmet, W.
2008-09-01
Image quality (IQ) evaluation plays a key role in the process of optimization of new x-ray systems. Ideally, this process should be supported by real clinical images, but ethical issues and differences in anatomy and pathology of patients make it impossible. Phantom studies might overcome these issues. This paper presents the IQ evaluation of 30 cineangiographic films acquired with a cardiac flat panel system. The phantom used simulates the anatomy of the heart and allows the circulation of contrast agent boluses through coronary arteries. Variables investigated with influence on IQ and radiation dose are: tube potential, detector dose, added Copper filters, dynamic density optimization (DDO) and viewing angle. The IQ evaluation consisted of scoring 4 simulated calcified lesions located on different coronary artery segments in terms of degree of visualization. Eight cardiologists rated the lesions using a five-point scale ((1) lesion not visible to (5) very good visibility). Radiation doses associated to the angiograms are expressed in terms of incident air kerma (IAK) and effective dose that has been calculated with PCXMX software (STUK, Finland) from the exposure settings assuming a standard sized patient of 70 Kg. Mean IQ scores ranged from 1.68 to 4.88. The highest IQ scores were obtained for the angiograms acquired with tube potential 80 kVp, no added Cu filters, DDO 60%, RAO and LAO views and the highest entrance detector dose that has been used in the present study, namely 0.17 μGy/im. Radiation doses (IAK ~40 mGy and effective dose of 1 mSv) were estimated for angiograms acquired at 15 frames s-1, detector field-of-view 20 cm, and a length of 5 s. The following parameters improved the IQ factor significantly: a change in tube potential from 96 to 80 kVp, detector dose from 0.10 μGy/im to 0.17 μGy/im, the absence of Copper filtration. DDO variable which is a post-processing parameter should be carefully evaluated because it alters the quality of the images independently of radiation exposure settings. The SAM anthropomorphic phantom has the advantage of visualization of stenotic lesions during the injection of a contrast agent and using an anatomical background. In the future, this phantom could potentially bridge the gap between physics tests and the clinical reality in the catheterization laboratory.
Prediction and typicality in multiverse cosmology
NASA Astrophysics Data System (ADS)
Azhar, Feraz
2014-02-01
In the absence of a fundamental theory that precisely predicts values for observable parameters, anthropic reasoning attempts to constrain probability distributions over those parameters in order to facilitate the extraction of testable predictions. The utility of this approach has been vigorously debated of late, particularly in light of theories that claim we live in a multiverse, where parameters may take differing values in regions lying outside our observable horizon. Within this cosmological framework, we investigate the efficacy of top-down anthropic reasoning based on the weak anthropic principle. We argue contrary to recent claims that it is not clear one can either dispense with notions of typicality altogether or presume typicality, in comparing resulting probability distributions with observations. We show in a concrete, top-down setting related to dark matter, that assumptions about typicality can dramatically affect predictions, thereby providing a guide to how errors in reasoning regarding typicality translate to errors in the assessment of predictive power. We conjecture that this dependence on typicality is an integral feature of anthropic reasoning in broader cosmological contexts, and argue in favour of the explicit inclusion of measures of typicality in schemes invoking anthropic reasoning, with a view to extracting predictions from multiverse scenarios.
Sherzer, Gili; Gao, Peng; Schlangen, Erik; Ye, Guang; Gal, Erez
2017-02-28
Modeling the complex behavior of concrete for a specific mixture is a challenging task, as it requires bridging the cement scale and the concrete scale. We describe a multiscale analysis procedure for the modeling of concrete structures, in which material properties at the macro scale are evaluated based on lower scales. Concrete may be viewed over a range of scale sizes, from the atomic scale (10 -10 m), which is characterized by the behavior of crystalline particles of hydrated Portland cement, to the macroscopic scale (10 m). The proposed multiscale framework is based on several models, including chemical analysis at the cement paste scale, a mechanical lattice model at the cement and mortar scales, geometrical aggregate distribution models at the mortar scale, and the Lattice Discrete Particle Model (LDPM) at the concrete scale. The analysis procedure starts from a known chemical and mechanical set of parameters of the cement paste, which are then used to evaluate the mechanical properties of the LDPM concrete parameters for the fracture, shear, and elastic responses of the concrete. Although a macroscopic validation study of this procedure is presented, future research should include a comparison to additional experiments in each scale.
Chittchang, Montakarn; Gleeson, M Paul; Ploypradith, Poonsakdi; Ruchirawat, Somsak
2010-06-01
Natural products currently represent an underutilized source of leads for the pharmaceutical industry, especially when one considers that almost 50% of all drugs were either derived from such sources or are very closely related. Lamellarins are a class of natural products with diverse biological activities and have entered into preclinical development for the treatment of multidrug-resistant tumors. Although these compounds demonstrated good cell penetration, as observed by their low microM activity in whole cell models, they have not been extensively profiled from a physicochemical point of view, and this is the goal of this study. For this study, we have determined the experimental logP values of a set of 25 lamellarins, given it is the single most important parameter in determining multiple ADMET parameters. We also discuss the relationship between this natural product class, natural product derivatives in development and on the market, oral marketed drugs, as well as drug molecules in development, using a range of physicochemical parameters in conjunction with principal components analysis (PCA). The impact of this systematic analysis on our ongoing medicinal chemistry strategy is also discussed. Copyright (c) 2010 Elsevier Masson SAS. All rights reserved.
Sherzer, Gili; Gao, Peng; Schlangen, Erik; Ye, Guang; Gal, Erez
2017-01-01
Modeling the complex behavior of concrete for a specific mixture is a challenging task, as it requires bridging the cement scale and the concrete scale. We describe a multiscale analysis procedure for the modeling of concrete structures, in which material properties at the macro scale are evaluated based on lower scales. Concrete may be viewed over a range of scale sizes, from the atomic scale (10−10 m), which is characterized by the behavior of crystalline particles of hydrated Portland cement, to the macroscopic scale (10 m). The proposed multiscale framework is based on several models, including chemical analysis at the cement paste scale, a mechanical lattice model at the cement and mortar scales, geometrical aggregate distribution models at the mortar scale, and the Lattice Discrete Particle Model (LDPM) at the concrete scale. The analysis procedure starts from a known chemical and mechanical set of parameters of the cement paste, which are then used to evaluate the mechanical properties of the LDPM concrete parameters for the fracture, shear, and elastic responses of the concrete. Although a macroscopic validation study of this procedure is presented, future research should include a comparison to additional experiments in each scale. PMID:28772605
Electron-acoustic rogue waves in a plasma with Tribeche–Tsallis–Cairns distributed electrons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merriche, Abderrzak; Tribeche, Mouloud, E-mail: mouloudtribeche@yahoo.fr; Algerian Academy of Sciences and Technologies, Algiers
2017-01-15
The problem of electron-acoustic (EA) rogue waves in a plasma consisting of fluid cold electrons, nonthermal nonextensive electrons and stationary ions, is addressed. A standard multiple scale method has been carried out to derive a nonlinear Schrödinger-like equation. The coefficients of dispersion and nonlinearity depend on the nonextensive and nonthermal parameters. The EA wave stability is analyzed. Interestingly, it is found that the wave number threshold, above which the EA wave modulational instability (MI) sets in, increases as the nonextensive parameter increases. As the nonthermal character of the electrons increases, the MI occurs at large wavelength. Moreover, it is shownmore » that as the nonextensive parameter increases, the EA rogue wave pulse grows while its width is narrowed. The amplitude of the EA rogue wave decreases with an increase of the number of energetic electrons. In the absence of nonthermal electrons, the nonextensive effects are more perceptible and more noticeable. In view of the crucial importance of rogue waves, our results can contribute to the understanding of localized electrostatic envelope excitations and underlying physical processes, that may occur in space as well as in laboratory plasmas.« less
Viewing the World: Visual Inquiry in International Settings
ERIC Educational Resources Information Center
Munn, Jean Correll
2012-01-01
This teaching note describes a course, Viewing the World: Visual Inquiry in International Settings, which the author taught in the Czech Republic in 2009. Five students successfully completed the course, which consisted of designing a project, collecting and analyzing visual data, presenting findings, and writing a final report of a qualitative…
1. CONTEXTUAL VIEW OF BRIDGE IN SETTING, LOOKING SOUTHWEST, FROM ...
1. CONTEXTUAL VIEW OF BRIDGE IN SETTING, LOOKING SOUTHWEST, FROM DOWNSTREAM. Crew, vehicles, boats, and equipment are from the California Department of Transportation's Transportation Laboratory conducting test borings for the replacement bridge. - Smith River Bridge, CA State Highway 199 Spanning Smith River, Crescent City, Del Norte County, CA
Terminology supported archiving and publication of environmental science data in PANGAEA.
Diepenbroek, Michael; Schindler, Uwe; Huber, Robert; Pesant, Stéphane; Stocker, Markus; Felden, Janine; Buss, Melanie; Weinrebe, Matthias
2017-11-10
Exemplified on the information system PANGAEA, we describe the application of terminologies for archiving and publishing environmental science data. A terminology catalogue (TC) was embedded into the system, with interfaces allowing to replicate and to manually work on terminologies. For data ingest and archiving, we show how the TC can improve structuring and harmonizing lineage and content descriptions of data sets. Key is the conceptualization of measurement and observation types (parameters) and methods, for which we have implemented a basic syntax and rule set. For data access and dissemination, we have improved findability of data through enrichment of metadata with TC terms. Semantic annotations, e.g. adding term concepts (including synonyms and hierarchies) or mapped terms of different terminologies, facilitate comprehensive data retrievals. The PANGAEA thesaurus of classifying terms, which is part of the TC is used as an umbrella vocabulary that links the various domains and allows drill downs and side drills with various facets. Furthermore, we describe how TC terms can be linked to nominal data values. This improves data harmonization and facilitates structural transformation of heterogeneous data sets to a common schema. Technical developments are complemented by work on the metadata content. Over the last 20 years, more than 100 new parameters have been defined on average per week. Recently, PANGAEA has increasingly been submitting new terms to various terminology services. Matching terms from terminology services with our parameter or method strings is supported programmatically. However, the process ultimately needs manual input by domain experts. The quality of terminology services is an additional limiting factor, and varies with respect to content, editorial, interoperability, and sustainability. Good quality terminology services are the building blocks for the conceptualization of parameters and methods. In our view, they are essential for data interoperability and arguably the most difficult hurdle for data integration. In summary, the application of terminologies has a mutual positive effect for terminology services and information systems such as PANGAEA. On both sides, the application of terminologies improves content, reliability and interoperability. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2003-04-01
A new theory of space is suggested. It represents the new point of view which has arisen from the critical analysis of the foundations of physics (in particular the theory of relativity and quantum mechanics), mathematics, cosmology and philosophy. The main idea following from the analysis is that the concept of movement represents a key to understanding of the essence of space. The starting-point of the theory is represented by the following philosophical (dialectical materialistic) principles. (a) The principle of the materiality (of the objective reality) of the Nature: the Nature (the Universe) is a system (a set) of material objects (particles, bodies, fields); each object has properties, features, and the properties, the features are inseparable characteristics of material object and belong only to material object. (b) The principle of the existence of material object: an object exists as the objective reality, and movement is a form of existence of object. (c) The principle (definition) of movement of object: the movement is change (i.e. transition of some states into others) in general; the movement determines a direction, and direction characterizes the movement. (d) The principle of existence of time: the time exists as the parameter of the system of reference. These principles lead to the following statements expressing the essence of space. (1) There is no space in general, and there exist space only as a form of existence of the properties and features of the object. It means that the space is a set of the measures of the object (the measure is the philosophical category meaning unity of the qualitative and quantitative determinacy of the object). In other words, the space of the object is a set of the states of the object. (2) The states of the object are manifested only in a system of reference. The main informational property of the unitary system researched physical object + system of reference is that the system of reference determines (measures, calculates) the parameters of the subsystem researched physical object (for example, the coordinates of the object M); the parameters characterize the system of reference (for example, the system of coordinates S). (3) Each parameter of the object is its measure. Total number of the mutually independent parameters of the object is called dimension of the space of the object. (4) The set of numerical values (i.e. the range, the spectrum) of each parameter is the subspace of the object. (The coordinate space, the momentum space and the energy space are examples of the subspaces of the object). (5) The set of the parameters of the object is divided into two non intersecting (opposite) classes: the class of the internal parameters and the class of the non internal (i.e. external) parameters. The class of the external parameters is divided into two non intersecting (opposite) subclasses: the subclass of the absolute parameters (characterizing the form, the sizes of the object) and the subclass of the non absolute (relative) parameters (characterizing the position, the coordinates of the object). (6) Set of the external parameters forms the external space of object. It is called geometrical space of object. (7) Since a macroscopic object has three mutually independent sizes, the dimension of its external absolute space is equal to three. Consequently, the dimension of its external relative space is also equal to three. Thus, the total dimension of the external space of the macroscopic object is equal to six. (8) In general case, the external absolute space (i.e. the form, the sizes) and the external relative space (i.e. the position, the coordinates) of any object are mutually dependent because of influence of a medium. The geometrical space of such object is called non Euclidean space. If the external absolute space and the external relative space of some object are mutually independent, then the external relative space of such object is the homogeneous and isotropic geometrical space. It is called Euclidean space of the object. Consequences: (i) the question of true geometry of the Universe is incorrect; (ii) the theory of relativity has no physical meaning.
Obtaining the Grobner Initialization for the Ground Flash Fraction Retrieval Algorithm
NASA Technical Reports Server (NTRS)
Solakiewicz, R.; Attele, R.; Koshak, W.
2011-01-01
At optical wavelengths and from the vantage point of space, the multiple scattering cloud medium obscures one's view and prevents one from easily determining what flashes strike the ground. However, recent investigations have made some progress examining the (easier, but still difficult) problem of estimating the ground flash fraction in a set of N flashes observed from space In the study by Koshak, a Bayesian inversion method was introduced for retrieving the fraction of ground flashes in a set of flashes observed from a (low earth orbiting or geostationary) satellite lightning imager. The method employed a constrained mixed exponential distribution model to describe the lightning optical measurements. To obtain the optimum model parameters, a scalar function of three variables (one of which is the ground flash fraction) was minimized by a numerical method. This method has formed the basis of a Ground Flash Fraction Retrieval Algorithm (GoFFRA) that is being tested as part of GOES-R GLM risk reduction.
A suggestion for royal jelly specifications.
Kanelis, Dimitrios; Tananaki, Chrysoula; Liolios, Vasilis; Dimou, Maria; Goras, Georgios; Rodopoulou, Maria Anna; Karazafiris, Emmanuel; Thrasyvoulou, Andreas
2015-12-01
This article proposes guidelines for quality standards of royal jelly. The proposals are based on two sets of data; the first from our study of the factors that may affect the royal jelly's chemical composition (protein and sugar supplementation of beehives) and the second on the analysis of a great number of samples from across Greece to establish natural variability of this product. We compared our findings with the adopted national limits, the proposals of the working group of the International Honey Commission (IHC), and the draft proposal of the International Organization of Standardization (ISO). The studied parameters included moisture, total proteins, sugars (fructose, glucose, sucrose, total sugars), and 10-hydroxy- 2-decenoic acid (10-HDA). Our results indicate that the limits for royal jelly in some countries should be amended and the proposals of the IHC and the ISO reviewed in view of recent data on variability. We believe that our proposals could be considered for setting global standards for royal jelly, as they incorporate national legislations, proposals of scientific groups, experimental data, and updated information.
Coherent states and parasupersymmetric quantum mechanics
NASA Technical Reports Server (NTRS)
Debergh, Nathalie
1992-01-01
It is well known that Parafermi and Parabose statistics are natural extensions of the usual Fermi and Bose ones, enhancing trilinear (anti)commutation relations instead of bilinear ones. Due to this generalization, positive parameters appear: the so-called orders of paraquantization p (= 1, 2, 3, ...) and h sub 0 (= 1/2, 1, 3/2, ...), respectively, the first value leading in each case to the usual statistics. The superpostion of the parabosonic and parafermionic operators gives rise to parasupermultiplets for which mixed trilinear relations have already been studied leading to two (nonequivalent) sets: the relative Parabose and the relative Parafermi ones. For the specific values p = 1 = 2h sub 0, these sets reduce to the well known supersymmetry. Coherent states associated with this last model have been recently put in evidence through the annihilation operator point of view and the group theoretical approach or displacement operator context. We propose to realize the corresponding studies within the new context p = 2 = 2h sub 0, being then directly extended to any order of paraquantization.
Area Estimation of Deep-Sea Surfaces from Oblique Still Images
Souto, Miguel; Afonso, Andreia; Calado, António; Madureira, Pedro; Campos, Aldino
2015-01-01
Estimating the area of seabed surfaces from pictures or videos is an important problem in seafloor surveys. This task is complex to achieve with moving platforms such as submersibles, towed or remotely operated vehicles (ROV), where the recording camera is typically not static and provides an oblique view of the seafloor. A new method for obtaining seabed surface area estimates is presented here, using the classical set up of two laser devices fixed to the ROV frame projecting two parallel lines over the seabed. By combining lengths measured directly from the image containing the laser lines, the area of seabed surfaces is estimated, as well as the camera’s distance to the seabed, pan and tilt angles. The only parameters required are the distance between the parallel laser lines and the camera’s horizontal and vertical angles of view. The method was validated with a controlled in situ experiment using a deep-sea ROV, yielding an area estimate error of 1.5%. Further applications and generalizations of the method are discussed, with emphasis on deep-sea applications. PMID:26177287
NASA Astrophysics Data System (ADS)
Dong, Shuai; Yu, Shanshan; Huang, Zheng; Song, Shoutan; Shao, Xinxing; Kang, Xin; He, Xiaoyuan
2017-12-01
Multiple digital image correlation (DIC) systems can enlarge the measurement field without losing effective resolution in the area of interest (AOI). However, the results calculated in substereo DIC systems are located in its local coordinate system in most cases. To stitch the data obtained by each individual system, a data merging algorithm is presented in this paper for global measurement of multiple stereo DIC systems. A set of encoded targets is employed to assist the extrinsic calibration, of which the three-dimensional (3-D) coordinates are reconstructed via digital close range photogrammetry. Combining the 3-D targets with precalibrated intrinsic parameters of all cameras, the extrinsic calibration is significantly simplified. After calculating in substereo DIC systems, all data can be merged into a universal coordinate system based on the extrinsic calibration. Four stereo DIC systems are applied to a four point bending experiment of a steel reinforced concrete beam structure. Results demonstrate high accuracy for the displacement data merging in the overlapping field of views (FOVs) and show feasibility for the distributed FOVs measurement.
3D Flow visualization in virtual reality
NASA Astrophysics Data System (ADS)
Pietraszewski, Noah; Dhillon, Ranbir; Green, Melissa
2017-11-01
By viewing fluid dynamic isosurfaces in virtual reality (VR), many of the issues associated with the rendering of three-dimensional objects on a two-dimensional screen can be addressed. In addition, viewing a variety of unsteady 3D data sets in VR opens up novel opportunities for education and community outreach. In this work, the vortex wake of a bio-inspired pitching panel was visualized using a three-dimensional structural model of Q-criterion isosurfaces rendered in virtual reality using the HTC Vive. Utilizing the Unity cross-platform gaming engine, a program was developed to allow the user to control and change this model's position and orientation in three-dimensional space. In addition to controlling the model's position and orientation, the user can ``scroll'' forward and backward in time to analyze the formation and shedding of vortices in the wake. Finally, the user can toggle between different quantities, while keeping the time step constant, to analyze flow parameter relationships at specific times during flow development. The information, data, or work presented herein was funded in part by an award from NYS Department of Economic Development (DED) through the Syracuse Center of Excellence.
Bayesian Parameter Estimation for Heavy-Duty Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Eric; Konan, Arnaud; Duran, Adam
2017-03-28
Accurate vehicle parameters are valuable for design, modeling, and reporting. Estimating vehicle parameters can be a very time-consuming process requiring tightly-controlled experimentation. This work describes a method to estimate vehicle parameters such as mass, coefficient of drag/frontal area, and rolling resistance using data logged during standard vehicle operation. The method uses Monte Carlo to generate parameter sets which is fed to a variant of the road load equation. Modeled road load is then compared to measured load to evaluate the probability of the parameter set. Acceptance of a proposed parameter set is determined using the probability ratio to the currentmore » state, so that the chain history will give a distribution of parameter sets. Compared to a single value, a distribution of possible values provides information on the quality of estimates and the range of possible parameter values. The method is demonstrated by estimating dynamometer parameters. Results confirm the method's ability to estimate reasonable parameter sets, and indicates an opportunity to increase the certainty of estimates through careful selection or generation of the test drive cycle.« less
Cues to viewing distance for stereoscopic depth constancy.
Glennerster, A; Rogers, B J; Bradshaw, M F
1998-01-01
A veridical estimate of viewing distance is required in order to determine the metric structure of objects from binocular stereopsis. One example of a judgment of metric structure, which we used in our experiment, is the apparently circular cylinder task (E B Johnston, 1991 Vision Research 31 1351-1360). Most studies report underconstancy in this task when the stimulus is defined purely by binocular disparities. We examined the effect of two factors on performance: (i) the richness of the cues to viewing distance (using either a naturalistic setting with many cues to viewing distance or a condition in which the room and the monitors were obscured from view), and (ii) the range of stimulus disparities (cylinder depths) presented during an experimental run. We tested both experienced subjects (who had performed the task many times before under full-cue conditions) and naïve subjects. Depth constancy was reduced for the naïve subjects (from 62% to 46%) when the position of the monitors was obscured. Under similar conditions, the experienced subjects showed no reduction in constancy. In a second experiment, using a forced-choice method of constant stimuli, we found that depth constancy was reduced from 64% to 23% in naïve subjects and from 77% to 55% in experienced subjects when the same set of images was presented at all viewing distances rather than using a set of stimulus disparities proportional to the correct setting. One possible explanation of these results is that, under reduced-cue conditions, the range of disparities presented is used by the visual system as a cue to viewing distance.
Knowledge mobilization in healthcare organizations: a view from the resource-based view of the firm.
Ferlie, Ewan; Crilly, Tessa; Jashapara, Ashok; Trenholm, Susan; Peckham, Anna; Currie, Graeme
2015-03-01
This short literature review argues that the Resource-Based View (RBV) school of strategic management has recently become of increased interest to scholars of healthcare organizations. RBV links well to the broader interest in more effective Knowledge Mobilization (KM) in healthcare. The paper outlines and discusses key concepts, texts and authors from the RBV tradition and gives recent examples of how RBV concepts have been applied fruitfully to healthcare settings. It concludes by setting out a future research agenda.
Design and laboratory calibration of the compact pushbroom hyperspectral imaging system
NASA Astrophysics Data System (ADS)
Zhou, Jiankang; Ji, Yiqun; Chen, Yuheng; Chen, Xinhua; Shen, Weimin
2009-11-01
The designed hyperspectral imaging system is composed of three main parts, that is, optical subsystem, electronic subsystem and capturing subsystem. And a three-dimensional "image cube" can be obtained through push-broom. The fore-optics is commercial-off-the-shelf with high speed and three continuous zoom ratios. Since the dispersive imaging part is based on Offner relay configuration with an aberration-corrected convex grating, high power of light collection and variable view field are obtained. The holographic recording parameters of the convex grating are optimized, and the aberration of the Offner configuration dispersive system is balanced. The electronic system adopts module design, which can minimize size, mass, and power consumption. Frame transfer area-array CCD is chosen as the image sensor and the spectral line can be binned to achieve better SNR and sensitivity without any deterioration in spatial resolution. The capturing system based on the computer can set the capturing parameters, calibrate the spectrometer, process and display spectral imaging data. Laboratory calibrations are prerequisite for using precise spectral data. The spatial and spectral calibration minimize smile and keystone distortion caused by optical system, assembly and so on and fix positions of spatial and spectral line on the frame area-array CCD. Gases excitation lamp is used in smile calibration and the keystone calculation is carried out by different viewing field point source created by a series of narrow slit. The laboratory and field imaging results show that this pushbroom hyperspectral imaging system can acquire high quality spectral images.
Finding Intrinsic and Extrinsic Viewing Parameters from a Single Realist Painting
NASA Astrophysics Data System (ADS)
Jordan, Tadeusz; Stork, David G.; Khoo, Wai L.; Zhu, Zhigang
In this paper we studied the geometry of a three-dimensional tableau from a single realist painting - Scott Fraser’s Three way vanitas (2006). The tableau contains a carefully chosen complex arrangement of objects including a moth, egg, cup, and strand of string, glass of water, bone, and hand mirror. Each of the three plane mirrors presents a different view of the tableau from a virtual camera behind each mirror and symmetric to the artist’s viewing point. Our new contribution was to incorporate single-view geometric information extracted from the direct image of the wooden mirror frames in order to obtain the camera models of both the real camera and the three virtual cameras. Both the intrinsic and extrinsic parameters are estimated for the direct image and the images in three plane mirrors depicted within the painting.
Gates, A
1991-12-01
Data were collected from a study of 49 patients in 1990 and 106 patients in 1991 admitted into Country View Treatment Center and Green Country Counseling Center. Country View is a 30-bed chemical dependency residential center operating under St. John Medical Center in Tulsa, Oklahoma. Green Country is an evening partial hospital chemical dependency program operating under St. John Medical Center in Tulsa, Oklahoma, The tools used in this study were the Country View Patient Self-Reporting Questionnaire, the global Rating Scale, and the Model of Recovering Alcoholics Behavior Stages and Goal Setting (Wing, 1990). These assessments were specifically designed to measure the patient's perceptions of goal setting and the patient's perspective on treatment outcome. The study outcome resulted in program improvement (Green Country evening partial hospital program) and the development of the Country View Substance Abuse Intermediate Link (SAIL) Program (day partial hospital).
A Bayesian approach to modeling 2D gravity data using polygon states
NASA Astrophysics Data System (ADS)
Titus, W. J.; Titus, S.; Davis, J. R.
2015-12-01
We present a Bayesian Markov chain Monte Carlo (MCMC) method for the 2D gravity inversion of a localized subsurface object with constant density contrast. Our models have four parameters: the density contrast, the number of vertices in a polygonal approximation of the object, an upper bound on the ratio of the perimeter squared to the area, and the vertices of a polygon container that bounds the object. Reasonable parameter values can be estimated prior to inversion using a forward model and geologic information. In addition, we assume that the field data have a common random uncertainty that lies between two bounds but that it has no systematic uncertainty. Finally, we assume that there is no uncertainty in the spatial locations of the measurement stations. For any set of model parameters, we use MCMC methods to generate an approximate probability distribution of polygons for the object. We then compute various probability distributions for the object, including the variance between the observed and predicted fields (an important quantity in the MCMC method), the area, the center of area, and the occupancy probability (the probability that a spatial point lies within the object). In addition, we compare probabilities of different models using parallel tempering, a technique which also mitigates trapping in local optima that can occur in certain model geometries. We apply our method to several synthetic data sets generated from objects of varying shape and location. We also analyze a natural data set collected across the Rio Grande Gorge Bridge in New Mexico, where the object (i.e. the air below the bridge) is known and the canyon is approximately 2D. Although there are many ways to view results, the occupancy probability proves quite powerful. We also find that the choice of the container is important. In particular, large containers should be avoided, because the more closely a container confines the object, the better the predictions match properties of object.
Oka, Yasunori; Suzuki, Shuhei; Inoue, Yuich
2008-01-01
Bedtime activities, sleep environment, and their impact on sleep/wake patterns were assessed in 509 elementary school children (6-12 years of age; 252 males and 257 females). Television viewing, playing video games, and surfing the Internet had negative impact on sleep/wake parameters. Moreover, presence of a television set or video game in the child's bedroom increased their activity before bedtime. Time to return home later than 8 p.m. from after-school activity also had a negative impact on sleep/wake patterns. Health care practitioners should be aware of the potential negative impact of television, video games, and the Internet before bedtime, and also the possibility that late after-school activity can disturb sleep/wake patterns.
NASA Astrophysics Data System (ADS)
Štaffenová, Daniela; Rybárik, Ján; Jakubčík, Miroslav
2017-06-01
The aim of experimental research in the area of exterior walls and windows suitable for wooden buildings was to build special pavilion laboratories. These laboratories are ideally isolated from the surrounding environment, airtight and controlled by the constant internal climate. The principle of experimental research is measuring and recording of required physical parameters (e.g. temperature or relative humidity). This is done in layers of experimental fragment sections in the direction from exterior to interior, as well as in critical places by stable interior and real exterior climatic conditions. The outputs are evaluations of experimental structures behaviour during the specified time period, possibly during the whole year by stable interior and real exterior boundary conditions. The main aim of this experimental research is processing of long-term measurements of experimental structures and the subsequent analysis. The next part of the research consists of collecting measurements obtained with assistance of the experimental detached weather station, analysis, evaluation for later setting up of reference data set for the research locality, from the point of view of its comparison to the data sets from Slovak Hydrometeorological Institute (SHMU) and to localities with similar climate conditions. Later on, the data sets could lead to recommendations for design of wooden buildings.
3. CONTEXTUAL VIEW OF BRIDGE IN SETTING, LOOKING SOUTHWEST FROM ...
3. CONTEXTUAL VIEW OF BRIDGE IN SETTING, LOOKING SOUTHWEST FROM ELEVATED GRADE OF EUREKA SOUTHERN RAILROAD. EUREKA SOUTHERN TRUSS BRIDGE AT EXTREME LEFT, 1924 HIGHWAY BRIDGE IN CENTER, 1952 HIGHWAY BRIDGE AT RIGHT - Van Duzen River Bridge, Spanning Van Duzen River at CA State Highway 101, Alton, Humboldt County, CA
Explanation of the cw operation of the Er3+ 3-μm crystal laser
NASA Astrophysics Data System (ADS)
Pollnau, M.; Graf, Th.; Balmer, J. E.; Lüthy, W.; Weber, H. P.
1994-05-01
A computer simulation of the Er3+ 3-μm crystal laser considering the full rate-equation scheme up to the 4F7/2 level has been performed. The influence of the important system parameters on lasing and the interaction of these parameters has been clarified with multiple-parameter variations. Stimulated emission is fed mainly by up-conversion from the lower laser level and in many cases is reduced by the quenching of the lifetime of this level. However, also without up-conversion a set of parameters can be found that allows lasing. Up-conversion from the upper laser level is detrimental to stimulated emission but may be compensated by cross relaxation from the 4S3/2 level. For a typical experimental situation we started with the parameters of Er3+:LiYF4. In addition, the host materials Y3Al5O12 (YAG), YAlO3, Y3Sc2Al3O12 (YSGG), and BaY2F8, as well as the possibilities of codoping, are discussed. In view of the consideration of all excited levels up to 4F7/2, all lifetimes and branching ratios, ground-state depletion, excited-state absorption, three up-conversion processes as well as their inverse processes, stimulated emission, and a realistic resonator design, this is, to our knowledge, the most detailed investigation of the Er3+ 3-μm crystal laser performed so far.
Tresadern, Gary; Agrafiotis, Dimitris K
2009-12-01
Stochastic proximity embedding (SPE) and self-organizing superimposition (SOS) are two recently introduced methods for conformational sampling that have shown great promise in several application domains. Our previous validation studies aimed at exploring the limits of these methods and have involved rather exhaustive conformational searches producing a large number of conformations. However, from a practical point of view, such searches have become the exception rather than the norm. The increasing popularity of virtual screening has created a need for 3D conformational search methods that produce meaningful answers in a relatively short period of time and work effectively on a large scale. In this work, we examine the performance of these algorithms and the effects of different parameter settings at varying levels of sampling. Our goal is to identify search protocols that can produce a diverse set of chemically sensible conformations and have a reasonable probability of sampling biologically active space within a small number of trials. Our results suggest that both SPE and SOS are extremely competitive in this regard and produce very satisfactory results with as few as 500 conformations per molecule. The results improve even further when the raw conformations are minimized with a molecular mechanics force field to remove minor imperfections and any residual strain. These findings provide additional evidence that these methods are suitable for many everyday modeling tasks, both high- and low-throughput.
Elk viewing in Pennsylvania: an evolving eco-tourism system
Bruce E. Lord; Charles H. Strauss; Michael J. Powell
2002-01-01
In 1997, the Pennsylvania Game Commission established an Elk Viewing Area within Pennsylvania's elk range. The viewing area has become the focus for a developing eco-tourism system. During the four years of operation, a research team from Penn State has measured the number of visitors, their expenditure patterns, and other parameters of their visit. The trends...
Tracking and characterizing the head motion of unanaesthetized rats in positron emission tomography
Kyme, Andre; Meikle, Steven; Baldock, Clive; Fulton, Roger
2012-01-01
Positron emission tomography (PET) is an important in vivo molecular imaging technique for translational research. Imaging unanaesthetized rats using motion-compensated PET avoids the confounding impact of anaesthetic drugs and enables animals to be imaged during normal or evoked behaviour. However, there is little published data on the nature of rat head motion to inform the design of suitable marker-based motion-tracking set-ups for brain imaging—specifically, set-ups that afford close to uninterrupted tracking. We performed a systematic study of rat head motion parameters for unanaesthetized tube-bound and freely moving rats with a view to designing suitable motion-tracking set-ups in each case. For tube-bound rats, using a single appropriately placed binocular tracker, uninterrupted tracking was possible greater than 95 per cent of the time. For freely moving rats, simulations and measurements of a live subject indicated that two opposed binocular trackers are sufficient (less than 10% interruption to tracking) for a wide variety of behaviour types. We conclude that reliable tracking of head pose can be achieved with marker-based optical-motion-tracking systems for both tube-bound and freely moving rats undergoing PET studies without sedation. PMID:22718992
Anisotropic universe with magnetized dark energy
NASA Astrophysics Data System (ADS)
Goswami, G. K.; Dewangan, R. N.; Yadav, Anil Kumar
2016-04-01
In the present work we have searched the existence of the late time acceleration of the Universe filled with cosmic fluid and uniform magnetic field as source of matter in anisotropic Heckmann-Schucking space-time. The observed acceleration of universe has been explained by introducing a positive cosmological constant Λ in the Einstein's field equation which is mathematically equivalent to vacuum energy with equation of state (EOS) parameter set equal to -1. The present values of the matter and the dark energy parameters (Ωm)0 & (Ω_{Λ})0 are estimated in view of the latest 287 high red shift (0.3 ≤ z ≤1.4) SN Ia supernova data's of observed apparent magnitude along with their possible error taken from Union 2.1 compilation. It is found that the best fit value for (Ωm)0 & (Ω_{Λ})0 are 0.2820 & 0.7177 respectively which are in good agreement with recent astrophysical observations in the latest surveys like WMAP [2001-2013], Planck [latest 2015] & BOSS. Various physical parameters such as the matter and dark energy densities, the present age of the universe and deceleration parameter have been obtained on the basis of the values of (Ωm)0 & (Ω_{Λ})0. Also we have estimated that the acceleration would have begun in the past at z = 0.71131 ˜6.2334 Gyrs before from present.
Closed loop adaptive control of spectrum-producing step using neural networks
Fu, Chi Yung
1998-01-01
Characteristics of the plasma in a plasma-based manufacturing process step are monitored directly and in real time by observing the spectrum which it produces. An artificial neural network analyzes the plasma spectrum and generates control signals to control one or more of the process input parameters in response to any deviation of the spectrum beyond a narrow range. In an embodiment, a plasma reaction chamber forms a plasma in response to input parameters such as gas flow, pressure and power. The chamber includes a window through which the electromagnetic spectrum produced by a plasma in the chamber, just above the subject surface, may be viewed. The spectrum is conducted to an optical spectrometer which measures the intensity of the incoming optical spectrum at different wavelengths. The output of optical spectrometer is provided to an analyzer which produces a plurality of error signals, each indicating whether a respective one of the input parameters to the chamber is to be increased or decreased. The microcontroller provides signals to control respective controls, but these lines are intercepted and first added to the error signals, before being provided to the controls for the chamber. The analyzer can include a neural network and an optional spectrum preprocessor to reduce background noise, as well as a comparator which compares the parameter values predicted by the neural network with a set of desired values provided by the microcontroller.
Closed loop adaptive control of spectrum-producing step using neural networks
Fu, C.Y.
1998-11-24
Characteristics of the plasma in a plasma-based manufacturing process step are monitored directly and in real time by observing the spectrum which it produces. An artificial neural network analyzes the plasma spectrum and generates control signals to control one or more of the process input parameters in response to any deviation of the spectrum beyond a narrow range. In an embodiment, a plasma reaction chamber forms a plasma in response to input parameters such as gas flow, pressure and power. The chamber includes a window through which the electromagnetic spectrum produced by a plasma in the chamber, just above the subject surface, may be viewed. The spectrum is conducted to an optical spectrometer which measures the intensity of the incoming optical spectrum at different wavelengths. The output of optical spectrometer is provided to an analyzer which produces a plurality of error signals, each indicating whether a respective one of the input parameters to the chamber is to be increased or decreased. The microcontroller provides signals to control respective controls, but these lines are intercepted and first added to the error signals, before being provided to the controls for the chamber. The analyzer can include a neural network and an optional spectrum preprocessor to reduce background noise, as well as a comparator which compares the parameter values predicted by the neural network with a set of desired values provided by the microcontroller. 7 figs.
NASA Astrophysics Data System (ADS)
Zeng, Rongping; Badano, Aldo; Myers, Kyle J.
2017-04-01
We showed in our earlier work that the choice of reconstruction methods does not affect the optimization of DBT acquisition parameters (angular span and number of views) using simulated breast phantom images in detecting lesions with a channelized Hotelling observer (CHO). In this work we investigate whether the model-observer based conclusion is valid when using humans to interpret images. We used previously generated DBT breast phantom images and recruited human readers to find the optimal geometry settings associated with two reconstruction algorithms, filtered back projection (FBP) and simultaneous algebraic reconstruction technique (SART). The human reader results show that image quality trends as a function of the acquisition parameters are consistent between FBP and SART reconstructions. The consistent trends confirm that the optimization of DBT system geometry is insensitive to the choice of reconstruction algorithm. The results also show that humans perform better in SART reconstructed images than in FBP reconstructed images. In addition, we applied CHOs with three commonly used channel models, Laguerre-Gauss (LG) channels, square (SQR) channels and sparse difference-of-Gaussian (sDOG) channels. We found that LG channels predict human performance trends better than SQR and sDOG channel models for the task of detecting lesions in tomosynthesis backgrounds. Overall, this work confirms that the choice of reconstruction algorithm is not critical for optimizing DBT system acquisition parameters.
Heisenberg scaling with weak measurement: a quantum state discrimination point of view
2015-03-18
a quantum state discrimination point of view. The Heisenberg scaling of the photon number for the precision of the interaction parameter between...coherent light and a spin one-half particle (or pseudo-spin) has a simple interpretation in terms of the interaction rotating the quantum state to an...release; distribution is unlimited. Heisenberg scaling with weak measurement: a quantum state discrimination point of view The views, opinions and/or
Drill Bit Tip on Mars Rover Curiosity, Head-on View
2013-02-04
This head-on view shows the tip of the drill bit on NASA Mars rover Curiosity. The view merges two exposures taken by the remote micro-imager in the rover ChemCam instrument at different focus settings.
ODTbrain: a Python library for full-view, dense diffraction tomography.
Müller, Paul; Schürmann, Mirjam; Guck, Jochen
2015-11-04
Analyzing the three-dimensional (3D) refractive index distribution of a single cell makes it possible to describe and characterize its inner structure in a marker-free manner. A dense, full-view tomographic data set is a set of images of a cell acquired for multiple rotational positions, densely distributed from 0 to 360 degrees. The reconstruction is commonly realized by projection tomography, which is based on the inversion of the Radon transform. The reconstruction quality of projection tomography is greatly improved when first order scattering, which becomes relevant when the imaging wavelength is comparable to the characteristic object size, is taken into account. This advanced reconstruction technique is called diffraction tomography. While many implementations of projection tomography are available today, there is no publicly available implementation of diffraction tomography so far. We present a Python library that implements the backpropagation algorithm for diffraction tomography in 3D. By establishing benchmarks based on finite-difference time-domain (FDTD) simulations, we showcase the superiority of the backpropagation algorithm over the backprojection algorithm. Furthermore, we discuss how measurment parameters influence the reconstructed refractive index distribution and we also give insights into the applicability of diffraction tomography to biological cells. The present software library contains a robust implementation of the backpropagation algorithm. The algorithm is ideally suited for the application to biological cells. Furthermore, the implementation is a drop-in replacement for the classical backprojection algorithm and is made available to the large user community of the Python programming language.
Improved central confidence intervals for the ratio of Poisson means
NASA Astrophysics Data System (ADS)
Cousins, R. D.
The problem of confidence intervals for the ratio of two unknown Poisson means was "solved" decades ago, but a closer examination reveals that the standard solution is far from optimal from the frequentist point of view. We construct a more powerful set of central confidence intervals, each of which is a (typically proper) subinterval of the corresponding standard interval. They also provide upper and lower confidence limits which are more restrictive than the standard limits. The construction follows Neyman's original prescription, though discreteness of the Poisson distribution and the presence of a nuisance parameter (one of the unknown means) lead to slightly conservative intervals. Philosophically, the issue of the appropriateness of the construction method is similar to the issue of conditioning on the margins in 2×2 contingency tables. From a frequentist point of view, the new set maintains (over) coverage of the unknown true value of the ratio of means at each stated confidence level, even though the new intervals are shorter than the old intervals by any measure (except for two cases where they are identical). As an example, when the number 2 is drawn from each Poisson population, the 90% CL central confidence interval on the ratio of means is (0.169, 5.196), rather than (0.108, 9.245). In the cited literature, such confidence intervals have applications in numerous branches of pure and applied science, including agriculture, wildlife studies, manufacturing, medicine, reliability theory, and elementary particle physics.
Systems and methods for optimal power flow on a radial network
Low, Steven H.; Peng, Qiuyu
2018-04-24
Node controllers and power distribution networks in accordance with embodiments of the invention enable distributed power control. One embodiment includes a node controller including a distributed power control application; a plurality of node operating parameters describing the operating parameter of a node and a set of at least one node selected from the group consisting of an ancestor node and at least one child node; wherein send node operating parameters to nodes in the set of at least one node; receive operating parameters from the nodes in the set of at least one node; calculate a plurality of updated node operating parameters using an iterative process to determine the updated node operating parameters using the node operating parameters that describe the operating parameters of the node and the set of at least one node, where the iterative process involves evaluation of a closed form solution; and adjust node operating parameters.
Svolos, Patricia; Tsougos, Ioannis; Kyrgias, Georgios; Kappas, Constantine; Theodorou, Kiki
2011-04-01
In this study we sought to evaluate and accent the importance of radiobiological parameter selection and implementation to the normal tissue complication probability (NTCP) models. The relative seriality (RS) and the Lyman-Kutcher-Burman (LKB) models were studied. For each model, a minimum and maximum set of radiobiological parameter sets was selected from the overall published sets applied in literature and a theoretical mean parameter set was computed. In order to investigate the potential model weaknesses in NTCP estimation and to point out the correct use of model parameters, these sets were used as input to the RS and the LKB model, estimating radiation induced complications for a group of 36 breast cancer patients treated with radiotherapy. The clinical endpoint examined was Radiation Pneumonitis. Each model was represented by a certain dose-response range when the selected parameter sets were applied. Comparing the models with their ranges, a large area of coincidence was revealed. If the parameter uncertainties (standard deviation) are included in the models, their area of coincidence might be enlarged, constraining even greater their predictive ability. The selection of the proper radiobiological parameter set for a given clinical endpoint is crucial. Published parameter values are not definite but should be accompanied by uncertainties, and one should be very careful when applying them to the NTCP models. Correct selection and proper implementation of published parameters provides a quite accurate fit of the NTCP models to the considered endpoint.
Greek In-Service and Preservice Teachers' Views about Bullying in Early Childhood Settings
ERIC Educational Resources Information Center
Psalti, Anastasia
2017-01-01
Despite the plethora of studies regarding bullying worldwide, there are limited studies at the early childhood level. This article presents the results of a pilot study aiming at exploring preservice and in-service early childhood teachers' views on bullying in Greek early childhood settings. A total of 192 early childhood teachers completed a…
ERIC Educational Resources Information Center
Bossing, Lewis; Mikulcik, Marilyn
Parents from rural and urban areas of Calloway County, Kentucky were surveyed regarding their children's television viewing habits. Fifteen survey questions were asked, among them whether there was a television set in the home; whether the child had a personal set; whether the family ate meals while watching television; whether television sound…
Ice flood velocity calculating approach based on single view metrology
NASA Astrophysics Data System (ADS)
Wu, X.; Xu, L.
2017-02-01
Yellow River is the river in which the ice flood occurs most frequently in China, hence, the Ice flood forecasting has great significance for the river flood prevention work. In various ice flood forecast models, the flow velocity is one of the most important parameters. In spite of the great significance of the flow velocity, its acquisition heavily relies on manual observation or deriving from empirical formula. In recent years, with the high development of video surveillance technology and wireless transmission network, the Yellow River Conservancy Commission set up the ice situation monitoring system, in which live videos can be transmitted to the monitoring center through 3G mobile networks. In this paper, an approach to get the ice velocity based on single view metrology and motion tracking technique using monitoring videos as input data is proposed. First of all, River way can be approximated as a plane. On this condition, we analyze the geometry relevance between the object side and the image side. Besides, we present the principle to measure length in object side from image. Secondly, we use LK optical flow which support pyramid data to track the ice in motion. Combining the result of camera calibration and single view metrology, we propose a flow to calculate the real velocity of ice flood. At last we realize a prototype system by programming and use it to test the reliability and rationality of the whole solution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Guoping; Mayes, Melanie; Parker, Jack C
2010-01-01
We implemented the widely used CXTFIT code in Excel to provide flexibility and added sensitivity and uncertainty analysis functions to improve transport parameter estimation and to facilitate model discrimination for multi-tracer experiments on structured soils. Analytical solutions for one-dimensional equilibrium and nonequilibrium convection dispersion equations were coded as VBA functions so that they could be used as ordinary math functions in Excel for forward predictions. Macros with user-friendly interfaces were developed for optimization, sensitivity analysis, uncertainty analysis, error propagation, response surface calculation, and Monte Carlo analysis. As a result, any parameter with transformations (e.g., dimensionless, log-transformed, species-dependent reactions, etc.) couldmore » be estimated with uncertainty and sensitivity quantification for multiple tracer data at multiple locations and times. Prior information and observation errors could be incorporated into the weighted nonlinear least squares method with a penalty function. Users are able to change selected parameter values and view the results via embedded graphics, resulting in a flexible tool applicable to modeling transport processes and to teaching students about parameter estimation. The code was verified by comparing to a number of benchmarks with CXTFIT 2.0. It was applied to improve parameter estimation for four typical tracer experiment data sets in the literature using multi-model evaluation and comparison. Additional examples were included to illustrate the flexibilities and advantages of CXTFIT/Excel. The VBA macros were designed for general purpose and could be used for any parameter estimation/model calibration when the forward solution is implemented in Excel. A step-by-step tutorial, example Excel files and the code are provided as supplemental material.« less
Conditions for l =1 Pomeranchuk instability in a Fermi liquid
NASA Astrophysics Data System (ADS)
Wu, Yi-Ming; Klein, Avraham; Chubukov, Andrey V.
2018-04-01
We perform a microscopic analysis of how the constraints imposed by conservation laws affect q =0 Pomeranchuk instabilities in a Fermi liquid. The conventional view is that these instabilities are determined by the static interaction between low-energy quasiparticles near the Fermi surface, in the limit of vanishing momentum transfer q . The condition for a Pomeranchuk instability is set by Flc (s )=-1 , where Flc (s ) (a Landau parameter) is a properly normalized partial component of the antisymmetrized static interaction F (k ,k +q ;p ,p -q ) in a charge (c) or spin (s) subchannel with angular momentum l . However, it is known that conservation laws for total spin and charge prevent Pomeranchuk instabilities for l =1 spin- and charge-current order parameters. Our study aims to understand whether this holds only for these special forms of l =1 order parameters or is a more generic result. To this end we perform a diagrammatic analysis of spin and charge susceptibilities for charge and spin density order parameters, as well as perturbative calculations to second order in the Hubbard U . We argue that for l =1 spin-current and charge-current order parameters, certain vertex functions, which are determined by high-energy fermions, vanish at Fl=1 c (s )=-1 , preventing a Pomeranchuk instability from taking place. For an order parameter with a generic l =1 form factor, the vertex function is not expressed in terms of Fl=1 c (s ), and a Pomeranchuk instability may occur when F1c (s )=-1 . We argue that for other values of l , a Pomeranchuk instability may occur at Flc (s )=-1 for an order parameter with any form factor.
Sunrise view taken by the STS-109 crew
2002-03-10
STS109-345-032 (1-12 March 2002) --- One of the astronauts aboard the Space Shuttle Columbia photographed this west-looking view featuring the profile of the atmosphere and the setting sun. The shuttle was located over the Java Sea to the south of Kalimantan (Borneo) in Indonesia when this image was acquired. Visible to the right of the setting sun are cloud tops from some thunderstorms. The sun's reflection (bright spot over the setting sun) can be seen off the upper layers of the earth's atmosphere.
Cost-effectiveness of point-of-care testing for dehydration in the pediatric ED.
Whitney, Rachel E; Santucci, Karen; Hsiao, Allen; Chen, Lei
2016-08-01
Acute gastroenteritis (AGE) and subsequent dehydration account for a large proportion of pediatric emergency department (PED) visits. Point-of-care (POC) testing has been used in conjunction with clinical assessment to determine the degree of dehydration. Despite the wide acceptance of POC testing, little formal cost-effective analysis of POC testing in the PED exists. We aim to examine the cost-effectiveness of using POC electrolyte testing vs traditional serum chemistry testing in the PED for children with AGE. This was a cost-effective analysis using data from a randomized control trial of children with AGE. A decision analysis model was constructed to calculate cost-savings from the point of view of the payer and the provider. We used parameters obtained from the trial, including cost of testing, admission rates, cost of admission, and length of stay. Sensitivity analyses were performed to evaluate the stability of our model. Using the data set of 225 subjects, POC testing results in a cost savings of $303.30 per patient compared with traditional serum testing from the point of the view of the payer. From the point-of-view of the provider, POC testing results in consistent mean savings of $36.32 ($8.29-$64.35) per patient. Sensitivity analyses demonstrated the stability of the model and consistent savings. This decision analysis provides evidence that POC testing in children with gastroenteritis-related moderate dehydration results in significant cost savings from the points of view of payers and providers compared to traditional serum chemistry testing. Copyright © 2016 Elsevier Inc. All rights reserved.
Using simplifications of reality in the real world: Robust benefits of models for decision making
NASA Astrophysics Data System (ADS)
Hunt, R. J.
2008-12-01
Models are by definition simplifications of reality; the degree and nature of simplification, however, is debated. One view is "the world is 3D, heterogeneous, and transient, thus good models are too" - the more a model directly simulates the complexity of the real world the better it is considered to be. An alternative view is to only use simple models up front because real-world complexity can never be truly known. A third view is construct and calibrate as many models as predictions. A fourth is to build highly parameterized models and either look at an ensemble of results, or use mathematical regularization to identify an optimal most reasonable parameter set and fit. Although each view may have utility for a given decision-making process, there are common threads that perhaps run through all views. First, the model-construction process itself can help the decision-making process because it raises the discussion of opposing parties from one of contrasting professional opinions to discussion of reasonable types and ranges of model inputs and processes. Secondly, no matter what view is used to guide the model building, model predictions for the future might be expected to perform poorly in the future due to unanticipated future changes and stressors to the underlying system simulated. Although this does not reduce the obligation of the modeler to build representative tools for the system, it should serve to temper expectations of model performance. Finally, perhaps the most under-appreciated utility of models is for calculating the reduction in prediction uncertainty resulting from different data collection strategies - an attractive feature separate from the calculation and minimization of absolute prediction uncertainty itself. This type of model output facilitates focusing on efficient use of current and future monitoring resources - something valued by many decision-makers regardless of background, system managed, and societal context.
Interactive model evaluation tool based on IPython notebook
NASA Astrophysics Data System (ADS)
Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet
2015-04-01
In hydrological modelling, some kind of parameter optimization is mostly performed. This can be the selection of a single best parameter set, a split in behavioural and non-behavioural parameter sets based on a selected threshold or a posterior parameter distribution derived with a formal Bayesian approach. The selection of the criterion to measure the goodness of fit (likelihood or any objective function) is an essential step in all of these methodologies and will affect the final selected parameter subset. Moreover, the discriminative power of the objective function is also dependent from the time period used. In practice, the optimization process is an iterative procedure. As such, in the course of the modelling process, an increasing amount of simulations is performed. However, the information carried by these simulation outputs is not always fully exploited. In this respect, we developed and present an interactive environment that enables the user to intuitively evaluate the model performance. The aim is to explore the parameter space graphically and to visualize the impact of the selected objective function on model behaviour. First, a set of model simulation results is loaded along with the corresponding parameter sets and a data set of the same variable as the model outcome (mostly discharge). The ranges of the loaded parameter sets define the parameter space. A selection of the two parameters visualised can be made by the user. Furthermore, an objective function and a time period of interest need to be selected. Based on this information, a two-dimensional parameter response surface is created, which actually just shows a scatter plot of the parameter combinations and assigns a color scale corresponding with the goodness of fit of each parameter combination. Finally, a slider is available to change the color mapping of the points. Actually, the slider provides a threshold to exclude non behaviour parameter sets and the color scale is only attributed to the remaining parameter sets. As such, by interactively changing the settings and interpreting the graph, the user gains insight in the model structural behaviour. Moreover, a more deliberate choice of objective function and periods of high information content can be identified. The environment is written in an IPython notebook and uses the available interactive functions provided by the IPython community. As such, the power of the IPython notebook as a development environment for scientific computing is illustrated (Shen, 2014).
NASA Astrophysics Data System (ADS)
Sahiner, Berkman; Gurcan, Metin N.; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Petrick, Nicholas; Helvie, Mark A.
2002-05-01
We are developing new techniques to improve the accuracy of computerized microcalcification detection by using the joint two-view information on craniocaudal (CC) and mediolateral-oblique (MLO) views. After cluster candidates were detected using a single-view detection technique, candidates on CC and MLO views were paired using their radial distances from the nipple. Object pairs were classified with a joint two-view classifier that used the similarity of objects in a pair. Each cluster candidate was also classified as a true microcalcification cluster or a false-positive (FP) using its single-view features. The outputs of these two classifiers were fused. A data set of 38 pairs of mammograms from our database was used to train the new detection technique. The independent test set consisted of 77 pairs of mammograms from the University of South Florida public database. At a per-film sensitivity of 70%, the FP rates were 0.17 and 0.27 with the fusion and single-view detection methods, respectively. Our results indicate that correspondence of cluster candidates on two different views provides valuable additional information for distinguishing false from true microcalcification clusters.
View Estimation Based on Value System
NASA Astrophysics Data System (ADS)
Takahashi, Yasutake; Shimada, Kouki; Asada, Minoru
Estimation of a caregiver's view is one of the most important capabilities for a child to understand the behavior demonstrated by the caregiver, that is, to infer the intention of behavior and/or to learn the observed behavior efficiently. We hypothesize that the child develops this ability in the same way as behavior learning motivated by an intrinsic reward, that is, he/she updates the model of the estimated view of his/her own during the behavior imitated from the observation of the behavior demonstrated by the caregiver based on minimizing the estimation error of the reward during the behavior. From this view, this paper shows a method for acquiring such a capability based on a value system from which values can be obtained by reinforcement learning. The parameters of the view estimation are updated based on the temporal difference error (hereafter TD error: estimation error of the state value), analogous to the way such that the parameters of the state value of the behavior are updated based on the TD error. Experiments with simple humanoid robots show the validity of the method, and the developmental process parallel to young children's estimation of its own view during the imitation of the observed behavior of the caregiver is discussed.
Contact Versus Non-Contact Measurement of a Helicopter Main Rotor Composite Blade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luczak, Marcin; Dziedziech, Kajetan; Peeters, Bart
2010-05-28
The dynamic characterization of lightweight structures is particularly complex as the impact of the weight of sensors and instrumentation (cables, mounting of exciters...) can distort the results. Varying mass loading or constraint effects between partial measurements may determine several errors on the final conclusions. Frequency shifts can lead to erroneous interpretations of the dynamics parameters. Typically these errors remain limited to a few percent. Inconsistent data sets however can result in major processing errors, with all related consequences towards applications based on the consistency assumption, such as global modal parameter identification, model-based damage detection and FRF-based matrix inversion in substructuring,more » load identification and transfer path analysis [1]. This paper addresses the subject of accuracy in the context of the measurement of the dynamic properties of a particular lightweight structure. It presents a comprehensive comparative study between the use of accelerometer, laser vibrometer (scanning LDV) and PU-probe (acoustic particle velocity and pressure) measurements to measure the structural responses, with as final aim the comparison of modal model quality assessment. The object of the investigation is a composite material blade from the main rotor of a helicopter. The presented results are part of an extensive test campaign performed with application of SIMO, MIMO, random and harmonic excitation, and the use of the mentioned contact and non-contact measurement techniques. The advantages and disadvantages of the applied instrumentation are discussed. Presented are real-life measurement problems related to the different set up conditions. Finally an analysis of estimated models is made in view of assessing the applicability of the various measurement approaches for successful fault detection based on modal parameters observation as well as in uncertain non-deterministic numerical model updating.« less
An extensive analysis of the triple W UMa type binary FI BOO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christopoulou, P.-E.; Papageorgiou, A.
We present a detailed analysis of the interesting W UMa binary FI Boo in view of the spectroscopic signature of a third body through photometry, period variation, and a thorough investigation of solution uniqueness. We obtained new BVR{sub c}I{sub c} photometric data that, when combined with spectroscopic data, enable us to analyze the system FI Boo and determine its basic orbital and physical properties through PHOEBE, as well as the period variation by studying the times of the minima. This combined approach allows us to study the long-term period changes in the system for the first time in order tomore » investigate the presence of a third body and to check extensively the solution uniqueness and the uncertainties of derived parameters. Our modeling indicates that FI Boo is a W-type moderate (f = 50.15% ± 8.10%) overcontact binary with component masses of M {sub h} = 0.40 ± 0.05 M {sub ☉} and M {sub c} = 1.07 ± 0.05 M {sub ☉}, temperatures of T {sub h} = 5746 ± 33 K and T {sub c} = 5420 ± 56 K, and a third body, which may play an important role in the formation and evolution. The results were tested by heuristic scanning and parameter kicking to provide the consistent and reliable set of parameters that was used to obtain the initial masses of the progenitors (1.71 ± 0.10 M {sub ☉} and 0.63 ± 0.01 M {sub ☉}, respectively). We also investigated the evolutionary status of massive components with several sets of widely used isochrones.« less
Contact Versus Non-Contact Measurement of a Helicopter Main Rotor Composite Blade
NASA Astrophysics Data System (ADS)
Luczak, Marcin; Dziedziech, Kajetan; Vivolo, Marianna; Desmet, Wim; Peeters, Bart; Van der Auweraer, Herman
2010-05-01
The dynamic characterization of lightweight structures is particularly complex as the impact of the weight of sensors and instrumentation (cables, mounting of exciters…) can distort the results. Varying mass loading or constraint effects between partial measurements may determine several errors on the final conclusions. Frequency shifts can lead to erroneous interpretations of the dynamics parameters. Typically these errors remain limited to a few percent. Inconsistent data sets however can result in major processing errors, with all related consequences towards applications based on the consistency assumption, such as global modal parameter identification, model-based damage detection and FRF-based matrix inversion in substructuring, load identification and transfer path analysis [1]. This paper addresses the subject of accuracy in the context of the measurement of the dynamic properties of a particular lightweight structure. It presents a comprehensive comparative study between the use of accelerometer, laser vibrometer (scanning LDV) and PU-probe (acoustic particle velocity and pressure) measurements to measure the structural responses, with as final aim the comparison of modal model quality assessment. The object of the investigation is a composite material blade from the main rotor of a helicopter. The presented results are part of an extensive test campaign performed with application of SIMO, MIMO, random and harmonic excitation, and the use of the mentioned contact and non-contact measurement techniques. The advantages and disadvantages of the applied instrumentation are discussed. Presented are real-life measurement problems related to the different set up conditions. Finally an analysis of estimated models is made in view of assessing the applicability of the various measurement approaches for successful fault detection based on modal parameters observation as well as in uncertain non-deterministic numerical model updating.
Schleider, Jessica L; Weisz, John R
2018-01-24
Because parents are primary gatekeepers to mental health care for their children, parental expectations that mental health treatment is ineffective may undermine treatment seeking, retention, and response. Thus, a need exists to understand parents' expectations about treatment and to develop scalable interventions that can instill more favorable views. We examined parents' treatment expectancies and preferences for their offspring and themselves in relation to two global beliefs: mind-sets (malleability beliefs) of emotions and anxiety, and views of failure as enhancing versus debilitating. Study 1 (N = 200; 49.5% fathers; 70.4% Caucasian) examined associations among parents' emotion mind-sets, anxiety mind-sets, failure beliefs, and treatment expectancies and preferences. Study 2 (N = 430; 44.70% fathers; 75.80% Caucasian) tested whether online inductions teaching "growth emotion mind-sets" (viewing emotions as malleable), adaptive failure beliefs, or both improved parents' treatment expectancies and hypothetical preferences for treatment (vs. no-treatment). Participants received one of three 8- to 15-min inductions or a psychoeducation control, rating treatment expectancies. and preferences pre- and postinduction. In Study 1, fixed emotion mind-sets and failure-is-debilitating beliefs were associated with lower parent psychotherapy expectancies for offspring and themselves and stronger "no-treatment" preferences for offspring. In Study 2, inductions teaching (a) growth emotion mind-sets only and (b) growth emotion mind-sets and failure-is-enhancing beliefs improved parents' psychotherapy expectancies for themselves (ds = .38, .51) and offspring (ds = .30, .43). No induction increased parents' hypothetical preferences for treatment (vs. no-treatment). Findings suggest scalable strategies for strengthening parents' psychotherapy effectiveness beliefs for themselves and their children.
35. INTERIOR VIEW OF THE GUARD LOCKS LOCK HOUSE: CLOSED ...
35. INTERIOR VIEW OF THE GUARD LOCKS LOCK HOUSE: CLOSED LOCK GATES AND TWO SETS OF MACHINERY TO ASSIST IN OPERATING THEM. VIEW FROM THE FAST END OF THE BUILDING LOOKING WEST 1976 - Pawtucket Canal, Guard Locks, Lowell, Middlesex County, MA
Horiuchi, Masahiro; Endo, Junko; Takayama, Norimasa; Murase, Kazutaka; Nishiyama, Norio; Saito, Haruo; Fujiwara, Akio
2014-01-01
We investigated the impact of viewing versus not viewing a real forest on human subjects’ physiological and psychological responses in the same setting. Fifteen healthy volunteers (11 males, four females, mean age 36 years) participated. Each participant was asked to view a forest while seated in a comfortable chair for 15 min (Forest condition) vs. sitting the same length of time with a curtain obscuring the forest view (Enclosed condition). Both conditions significantly decreased blood pressure (BP) variables, i.e., systolic BP, diastolic BP, and mean arterial pressure between pre and post experimental stimuli, but these reductions showed no difference between conditions. Interestingly, the Forest viewing reduced cerebral oxygenated hemoglobin (HbO2) assessed by near-infrared spectroscopy (NIRS) and improved the subjects’ Profile of Mood States (POMS) scores, whereas the Enclosed condition increased the HbO2 and did not affect the POMS scores. There were no significant differences in saliva amylase or heart rate variability (HRV) between the two conditions. Collectively, these results suggest that viewing a real forest may have a positive effect on cerebral activity and psychological responses. However, both viewing and not viewing the forest had similar effects on cardiovascular responses such as BP variables and HRV. PMID:25333924
2007-03-01
front of a large area blackbody as background. The viewing angle , defined as the angle between surface normal and camera line of sight, was varied by...and polarization angle were derived from the Stokes parameters. The dependence of these polarization characteristics on viewing angle was investigated
Group-based strategy diffusion in multiplex networks with weighted values
NASA Astrophysics Data System (ADS)
Yu, Jianyong; Jiang, J. C.; Xiang, Leijun
2017-03-01
The information diffusion of multiplex social networks has received increasing interests in recent years. Actually, the multiplex networks are made of many communities, and it should be gotten more attention for the influences of community level diffusion, besides of individual level interactions. In view of this, this work explores strategy interactions and diffusion processes in multiplex networks with weighted values from a new perspective. Two different groups consisting of some agents with different influential strength are firstly built in each layer network, the authority and non-authority groups. The strategy interactions between different groups in intralayer and interlayer networks are performed to explore community level diffusion, by playing two classical strategy games, Prisoner's Dilemma and Snowdrift Game. The impact forces from the different groups and the reactive forces from individual agents are simultaneously taken into account in intralayer and interlayer interactions. This paper reveals and explains the evolutions of cooperation diffusion and the influences of interlayer interaction tight degrees in multiplex networks with weighted values. Some thresholds of critical parameters of interaction degrees and games parameters settings are also discussed in group-based strategy diffusion.
Use of Linear Perspective Scene Cues in a Simulated Height Regulation Task
NASA Technical Reports Server (NTRS)
Levison, W. H.; Warren, R.
1984-01-01
As part of a long-term effort to quantify the effects of visual scene cuing and non-visual motion cuing in flight simulators, an experimental study of the pilot's use of linear perspective cues in a simulated height-regulation task was conducted. Six test subjects performed a fixed-base tracking task with a visual display consisting of a simulated horizon and a perspective view of a straight, infinitely-long roadway of constant width. Experimental parameters were (1) the central angle formed by the roadway perspective and (2) the display gain. The subject controlled only the pitch/height axis; airspeed, bank angle, and lateral track were fixed in the simulation. The average RMS height error score for the least effective display configuration was about 25% greater than the score for the most effective configuration. Overall, larger and more highly significant effects were observed for the pitch and control scores. Model analysis was performed with the optimal control pilot model to characterize the pilot's use of visual scene cues, with the goal of obtaining a consistent set of independent model parameters to account for display effects.
Modeling of polymer photodegradation for solar cell modules
NASA Technical Reports Server (NTRS)
Somersall, A. C.; Guillet, J. E.
1982-01-01
It was shown that many of the experimental observations in the photooxidation of hydrocarbon polymers can be accounted for with a computer simulation using an elementary mechanistic model with corresponding rate constants for each reaction. For outdoor applications, however, such as in photovoltaics, the variation of temperature must have important effects on the useful lifetimes of such materials. The data bank necessary to replace the isothermal rate constant values with Arrhenius activation parameters: A (the pre-exponential factor) and E (the activation energy) was searched. The best collection of data assembled to data is summarized. Note, however, that the problem is now considerably enlarged since from a theoretical point of view, with 51 of the input variables replaced with 102 parameters. The sensitivity of the overall scheme is such that even after many computer simulations, a successful photooxidation simulation with the expanded variable set was not completed. Many of the species in the complex process undergo a number of competitive pathways, the relative importance of each being often sensitive to small changes in the calculated rate constant values.
Computational screening of organic materials towards improved photovoltaic properties
NASA Astrophysics Data System (ADS)
Dai, Shuo; Olivares-Amaya, Roberto; Amador-Bedolla, Carlos; Aspuru-Guzik, Alan; Borunda, Mario
2015-03-01
The world today faces an energy crisis that is an obstruction to the development of the human civilization. One of the most promising solutions is solar energy harvested by economical solar cells. Being the third generation of solar cell materials, organic photovoltaic (OPV) materials is now under active development from both theoretical and experimental points of view. In this study, we constructed a parameter to select the desired molecules based on their optical spectra performance. We applied it to investigate a large collection of potential OPV materials, which were from the CEPDB database set up by the Harvard Clean Energy Project. Time dependent density functional theory (TD-DFT) modeling was used to calculate the absorption spectra of the molecules. Then based on the parameter, we screened out the top performing molecules for their potential OPV usage and suggested experimental efforts toward their synthesis. In addition, from those molecules, we summarized the functional groups that provided molecules certain spectrum capability. It is hoped that useful information could be mined out to provide hints to molecular design of OPV materials.
NASA Astrophysics Data System (ADS)
Keane, Tommy P.; Saber, Eli; Rhody, Harvey; Savakis, Andreas; Raj, Jeffrey
2012-04-01
Contemporary research in automated panorama creation utilizes camera calibration or extensive knowledge of camera locations and relations to each other to achieve successful results. Research in image registration attempts to restrict these same camera parameters or apply complex point-matching schemes to overcome the complications found in real-world scenarios. This paper presents a novel automated panorama creation algorithm by developing an affine transformation search based on maximized mutual information (MMI) for region-based registration. Standard MMI techniques have been limited to applications with airborne/satellite imagery or medical images. We show that a novel MMI algorithm can approximate an accurate registration between views of realistic scenes of varying depth distortion. The proposed algorithm has been developed using stationary, color, surveillance video data for a scenario with no a priori camera-to-camera parameters. This algorithm is robust for strict- and nearly-affine-related scenes, while providing a useful approximation for the overlap regions in scenes related by a projective homography or a more complex transformation, allowing for a set of efficient and accurate initial conditions for pixel-based registration.
NASA Astrophysics Data System (ADS)
Arab, M.; Khodam-Mohammadi, A.
2018-03-01
As a deformed matter bounce scenario with a dark energy component, we propose a deformed one with running vacuum model (RVM) in which the dark energy density ρ _{Λ } is written as a power series of H^2 and \\dot{H} with a constant equation of state parameter, same as the cosmological constant, w=-1. Our results in analytical and numerical point of views show that in some cases same as Λ CDM bounce scenario, although the spectral index may achieve a good consistency with observations, a positive value of running of spectral index (α _s) is obtained which is not compatible with inflationary paradigm where it predicts a small negative value for α _s. However, by extending the power series up to H^4, ρ _{Λ }=n_0+n_2 H^2+n_4 H^4, and estimating a set of consistent parameters, we obtain the spectral index n_s, a small negative value of running α _s and tensor to scalar ratio r, which these reveal a degeneracy between deformed matter bounce scenario with RVM-DE and inflationary cosmology.
Lucas Martínez, Néstor; Martínez Ortega, José-Fernán; Hernández Díaz, Vicente; Del Toro Matamoros, Raúl M
2016-05-12
The deployment of the nodes in a Wireless Sensor and Actuator Network (WSAN) is typically restricted by the sensing and acting coverage. This implies that the locations of the nodes may be, and usually are, not optimal from the point of view of the radio communication. Additionally, when the transmission power is tuned for those locations, there are other unpredictable factors that can cause connectivity failures, like interferences, signal fading due to passing objects and, of course, radio irregularities. A control-based self-adaptive system is a typical solution to improve the energy consumption while keeping good connectivity. In this paper, we explore how the communication range for each node evolves along the iterations of an energy saving self-adaptive transmission power controller when using different parameter sets in an outdoor scenario, providing a WSAN that automatically adapts to surrounding changes keeping good connectivity. The results obtained in this paper show how the parameters with the best performance keep a k-connected network, where k is in the range of the desired node degree plus or minus a specified tolerance value.
Single and tandem Fabry-Perot etalons as solar background filters for lidar.
McKay, J A
1999-09-20
Atmospheric lidar is difficult in daylight because of sunlight scattered into the receiver field of view. In this research methods for the design and performance analysis of Fabry-Perot etalons as solar background filters are presented. The factor by which the signal to background ratio is enhanced is defined as a measure of the performance of the etalon as a filter. Equations for evaluating this parameter are presented for single-, double-, and triple-etalon filter systems. The role of reflective coupling between etalons is examined and shown to substantially reduce the contributions of the second and third etalons to the filter performance. Attenuators placed between the etalons can improve the filter performance, at modest cost to the signal transmittance. The principal parameter governing the performance of the etalon filters is the etalon defect finesse. Practical limitations on etalon plate smoothness and parallelism cause the defect finesse to be relatively low, especially in the ultraviolet, and this sets upper limits to the capability of tandem etalon filters to suppress the solar background at tolerable cost to the signal.
Impact of floating windows on the accuracy of depth perception in games
NASA Astrophysics Data System (ADS)
Stanfield, Brodie; Zerebecki, Christopher; Hogue, Andrew; Kapralos, Bill; Collins, Karen
2013-03-01
The floating window technique is commonly employed by stereoscopic 3D filmmakers to reduce the effects of window violations by masking out portions of the screen that contain visual information that doesn't exist in one of the views. Although widely adopted in the film industry, and despite its potential benefits, the technique has not been adopted by video game developers to the same extent possibly because of the lack of understanding of how the floating window can be utilized in such an interactive medium. Here, we describe a quantitative study that investigates how the floating window technique affects users' depth perception in a simple game-like environment. Our goal is to determine how various stereoscopic 3D parameters such as the existence, shape, and size of the floating window affect the user experience and to devise a set of guidelines for game developers wishing to develop stereoscopic 3D content. Providing game designers with quantitative knowledge of how these parameters can affect user experience is invaluable when choosing to design interactive stereoscopic 3D content.
NASA Astrophysics Data System (ADS)
Zakharenko, Olena; Motiyenko, R. A.; Aviles Moreno, Juan-Ramon; Huet, T. R.
2016-06-01
Methacrolein and methyl vinyl ketone are the two major oxidation products of isoprene emitted in the troposphere. New spectroscopic information is provided with the aim to allow unambiguous identification of these molecules, characterized by a large amplitude motion associated with the methyl top. State-of-the-art millimeter-wave spectroscopy experiments coupled to quantum chemical calculations have been performed. Comprehensive sets of molecular parameters have been obtained. The torsion-rotation-vibration effects will be discussed in detail. From the atmospheric application point of view the results provide precise ground state molecular constants essential as a foundation (by using the Ground State Combination Differences method) for the analysis of high resolution spectrum, recorded from 600 to 1600 wn. The infrared range can be then refitted using appropriate Hamiltonian parameters. The present work is funded by the French ANR through the PIA under contract ANR-11-LABX-0005-01 (Labex CaPPA), by the Regional Council Nord-Pas de Calais and by the European Funds for Regional Economic Development (FEDER).
Lucas Martínez, Néstor; Martínez Ortega, José-Fernán; Hernández Díaz, Vicente; del Toro Matamoros, Raúl M.
2016-01-01
The deployment of the nodes in a Wireless Sensor and Actuator Network (WSAN) is typically restricted by the sensing and acting coverage. This implies that the locations of the nodes may be, and usually are, not optimal from the point of view of the radio communication. Additionally, when the transmission power is tuned for those locations, there are other unpredictable factors that can cause connectivity failures, like interferences, signal fading due to passing objects and, of course, radio irregularities. A control-based self-adaptive system is a typical solution to improve the energy consumption while keeping good connectivity. In this paper, we explore how the communication range for each node evolves along the iterations of an energy saving self-adaptive transmission power controller when using different parameter sets in an outdoor scenario, providing a WSAN that automatically adapts to surrounding changes keeping good connectivity. The results obtained in this paper show how the parameters with the best performance keep a k-connected network, where k is in the range of the desired node degree plus or minus a specified tolerance value. PMID:27187397
Using Diffraction Tomography to Estimate Marine Animal Size
NASA Astrophysics Data System (ADS)
Jaffe, J. S.; Roberts, P.
In this article we consider the development of acoustic methods which have the potential to size marine animals. The proposed technique uses scattered sound in order to invert for both animal size and shape. The technique uses the Distorted Wave Born Approximation (DWBA) in order to model sound scattered from these organisms. The use of the DWBA also provides a valuable context for formulating data analysis techniques in order to invert for parameters of the animal. Although 3-dimensional observations can be obtained from a complete set of views, due to the difficulty of collecting full 3-dimensional scatter, it is useful to simplify the inversion by approximating the animal by a few parameters. Here, the animals are modeled as 3-dimensional ellipsoids. This reduces the complexity of the problem to a determination of the 3 semi axes for the x, y and z dimensions from just a few radial spokes through the 3-dimensional Fourier Transform. In order to test the idea, simulated scatter data is taken from a 3-dimensional model of a marine animal and the resultant data are inverted in order to estimate animal shape
Temperature Dependences of Air-Broadening and Shift Parameters in the ν_3 Band of Ozone
NASA Astrophysics Data System (ADS)
Smith, Mary Ann H.; Devi, V. Malathy; Benner, D. Chris
2015-06-01
Line parameter errors can contribute significantly to the total errors in retrievals of terrestrial atmospheric ozone concentration profiles using the strong 9.6-μm band, particularly for nadir-viewing experiments Detailed knowledge of the interfering ozone signal is also needed for retrievals of other atmospheric species in this spectral region. We have determined Lorentz air-broadening and pressure-induced shift coefficients along with their temperature dependences for a number of transitions in the ν_3 fundamental band of 16O_3. These results were obtained by applying the multispectrum nonlinear least-squares fitting technique to a set of 31 high-resolution infrared absorption spectra of O_3 recorded at temperatures between 160 and 300 K with several different room-temperature and coolable sample cells at the McMath-Pierce Fourier transform spectrometer at the National Solar Observatory on Kitt Peak. We compare our results with other available measurements and with the ozone line parameters in the HITRAN database. J.~Worden et al., J.~Geophys.~Res. 109 (2004) 9308-9319. R.~Beer et al., Geophys.~Res.~Lett. 35 (2008) L09801. D.~Chris Benner et al., JQSRT 53 (1995) 705-721. Rothman et al., J. Quant. Spectrosc. Radiat. Transfer 130 (2013) 4. JQSRT 130 (2013) 4-50.
Preliminary Evaluation of a Commercial 360 Multi-Camera Rig for Photogrammetric Purposes
NASA Astrophysics Data System (ADS)
Teppati Losè, L.; Chiabrando, F.; Spanò, A.
2018-05-01
The research presented in this paper is focused on a preliminary evaluation of a 360 multi-camera rig: the possibilities to use the images acquired by the system in a photogrammetric workflow and for the creation of spherical images are investigated and different tests and analyses are reported. Particular attention is dedicated to different operative approaches for the estimation of the interior orientation parameters of the cameras, both from an operative and theoretical point of view. The consistency of the six cameras that compose the 360 system was in depth analysed adopting a self-calibration approach in a commercial photogrammetric software solution. A 3D calibration field was projected and created, and several topographic measurements were performed in order to have a set of control points to enhance and control the photogrammetric process. The influence of the interior parameters of the six cameras were analyse both in the different phases of the photogrammetric workflow (reprojection errors on the single tie point, dense cloud generation, geometrical description of the surveyed object, etc.), both in the stitching of the different images into a single spherical panorama (some consideration on the influence of the camera parameters on the overall quality of the spherical image are reported also in these section).
Probing Primordial Non-Gaussianity with Weak-lensing Minkowski Functionals
NASA Astrophysics Data System (ADS)
Shirasaki, Masato; Yoshida, Naoki; Hamana, Takashi; Nishimichi, Takahiro
2012-11-01
We study the cosmological information contained in the Minkowski functionals (MFs) of weak gravitational lensing convergence maps. We show that the MFs provide strong constraints on the local-type primordial non-Gaussianity parameter f NL. We run a set of cosmological N-body simulations and perform ray-tracing simulations of weak lensing to generate 100 independent convergence maps of a 25 deg2 field of view for f NL = -100, 0 and 100. We perform a Fisher analysis to study the degeneracy among other cosmological parameters such as the dark energy equation of state parameter w and the fluctuation amplitude σ8. We use fully nonlinear covariance matrices evaluated from 1000 ray-tracing simulations. For upcoming wide-field observations such as those from the Subaru Hyper Suprime-Cam survey with a proposed survey area of 1500 deg2, the primordial non-Gaussianity can be constrained with a level of f NL ~ 80 and w ~ 0.036 by weak-lensing MFs. If simply scaled by the effective survey area, a 20,000 deg2 lensing survey using the Large Synoptic Survey Telescope will yield constraints of f NL ~ 25 and w ~ 0.013. We show that these constraints can be further improved by a tomographic method using source galaxies in multiple redshift bins.
Exploring the limits of frequency lowering
Souza, Pamela E.; Arehart, Kathryn H.; Kates, James M.; Croghan, Naomi B.H.; Gehani, Namita
2013-01-01
Objective This study examined how frequency lowering affected sentence intelligibility and quality, for adults with postlingually acquired, mild-to-moderate hearing loss. Method Listeners included adults aged 60–92 years with sloping sensorineural loss and a control group of similarly-aged adults with normal hearing. Sentences were presented in quiet and babble at a range of signal-to-noise ratios. Intelligibility and quality were measured with varying amounts of frequency lowering, implemented using a form of frequency compression. Results Moderate amounts of compression, particularly with high cutoff frequencies, had minimal effects on intelligibility. Listeners with the greatest high-frequency hearing loss showed the greatest benefit. Sentence intelligibility decreased with more compression. Listeners were more affected by a given set of parameters in noise. In quiet, any amount of compression resulted in lower speech quality for most listeners, with the greatest degradation for listeners with better high-frequency hearing. Quality ratings were lower with background noise, and in noise the effect of changing compression parameters was small. Conclusions The benefits of frequency lowering in adults were affected by the compression parameters as well as individual hearing thresholds. Data are consistent with the idea that frequency lowering can be viewed in terms of an improved audibility vs increased distortion tradeoff. PMID:23785188
Generalized multiple kernel learning with data-dependent priors.
Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li
2015-06-01
Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.
Setting priorities in health care organizations: criteria, processes, and parameters of success.
Gibson, Jennifer L; Martin, Douglas K; Singer, Peter A
2004-09-08
Hospitals and regional health authorities must set priorities in the face of resource constraints. Decision-makers seek practical ways to set priorities fairly in strategic planning, but find limited guidance from the literature. Very little has been reported from the perspective of Board members and senior managers about what criteria, processes and parameters of success they would use to set priorities fairly. We facilitated workshops for board members and senior leadership at three health care organizations to assist them in developing a strategy for fair priority setting. Workshop participants identified 8 priority setting criteria, 10 key priority setting process elements, and 6 parameters of success that they would use to set priorities in their organizations. Decision-makers in other organizations can draw lessons from these findings to enhance the fairness of their priority setting decision-making. Lessons learned in three workshops fill an important gap in the literature about what criteria, processes, and parameters of success Board members and senior managers would use to set priorities fairly.
NASA Technical Reports Server (NTRS)
1974-01-01
Activities related to the National Geodetic Satellite Program are reported and include a discussion of Ohio State University's OSU275 set of tracking station coordinates and transformation parameters, determination of network distortions, and plans for data acquisition and processing. The problems encountered in the development of the LAGEOS satellite are reported in an account of activities related to the Earth and Ocean Physics Applications Program. The LAGEOS problem involves transmission and reception of the laser pulse designed to make accurate determinations of the earth's crustal and rotational motions. Pulse motion, ephemeris, arc range measurements, and accuracy estimates are discussed in view of the problem. Personnel involved in the two programs are also listed, along with travel activities and reports published to date.
Up Periscope! Designing a New Perceptual Metric for Imaging System Performance
NASA Technical Reports Server (NTRS)
Watson, Andrew B.
2016-01-01
Modern electronic imaging systems include optics, sensors, sampling, noise, processing, compression, transmission and display elements, and are viewed by the human eye. Many of these elements cannot be assessed by traditional imaging system metrics such as the MTF. More complex metrics such as NVTherm do address these elements, but do so largely through parametric adjustment of an MTF-like metric. The parameters are adjusted through subjective testing of human observers identifying specific targets in a set of standard images. We have designed a new metric that is based on a model of human visual pattern classification. In contrast to previous metrics, ours simulates the human observer identifying the standard targets. One application of this metric is to quantify performance of modern electronic periscope systems on submarines.
Where to look? Automating attending behaviors of virtual human characters
NASA Technical Reports Server (NTRS)
Chopra Khullar, S.; Badler, N. I.
2001-01-01
This research proposes a computational framework for generating visual attending behavior in an embodied simulated human agent. Such behaviors directly control eye and head motions, and guide other actions such as locomotion and reach. The implementation of these concepts, referred to as the AVA, draws on empirical and qualitative observations known from psychology, human factors and computer vision. Deliberate behaviors, the analogs of scanpaths in visual psychology, compete with involuntary attention capture and lapses into idling or free viewing. Insights provided by implementing this framework are: a defined set of parameters that impact the observable effects of attention, a defined vocabulary of looking behaviors for certain motor and cognitive activity, a defined hierarchy of three levels of eye behavior (endogenous, exogenous and idling) and a proposed method of how these types interact.
Normalizing Heterogeneous Medical Imaging Data to Measure the Impact of Radiation Dose.
Silva, Luís A Bastião; Ribeiro, Luís S; Santos, Milton; Neves, Nuno; Francisco, Dulce; Costa, Carlos; Oliveira, José Luis
2015-12-01
The production of medical imaging is a continuing trend in healthcare institutions. Quality assurance for planned radiation exposure situations (e.g. X-ray, computer tomography) requires examination-specific set-ups according to several parameters, such as patient's age and weight, body region and clinical indication. These data are normally stored in several formats and with different nomenclatures, which hinder the continuous and automatic monitoring of these indicators and the comparison between several institutions and equipment. This article proposes a framework that aggregates, normalizes and provides different views over collected indicators. The developed tool can be used to improve the quality of radiologic procedures and also for benchmarking and auditing purposes. Finally, a case study and several experimental results related to radiation exposure and productivity are presented and discussed.
Chouet, B.
1988-01-01
A dynamic source model is presented, in which a 3-D crack containing a viscous compressible fluid is excited into resonance by an impulsive pressure transient applied over a small area DELTA S of the crack surface. The crack excitation depends critically on two dimensionless parameters called the crack stiffness and viscous damping loss. According to the model, the long-period event and harmonic tremor share the same source but differ in the boundary conditions for fluid flow and in the triggering mechanism setting up the resonance of the source, the former being viewed as the impulse response of the tremor generating system and the later representing the excitation due to more complex forcing functions.-from Author
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, M Pauline
2007-06-30
The VisPort visualization portal is an experiment in providing Web-based access to visualization functionality from any place and at any time. VisPort adopts a service-oriented architecture to encapsulate visualization functionality and to support remote access. Users employ browser-based client applications to choose data and services, set parameters, and launch visualization jobs. Visualization products typically images or movies are viewed in the user's standard Web browser. VisPort emphasizes visualization solutions customized for specific application communities. Finally, VisPort relies heavily on XML, and introduces the notion of visualization informatics - the formalization and specialization of information related to the process and productsmore » of visualization.« less
Lift Production on Flapping and Rotary Wings at Low Reynolds Numbers
2016-02-26
though parameter variations were also performed. For the rotating cases, the wing was an aspect ratio 2 rectangular flat plate , and the root cutout (i.e...rectangular flat plate . 2 U (Side View) (a) 1A: Rectilinear pitch U (Side View) (b) 1B: Rectilinear surge (Top View) (Side View) (c) 2A: Rotational...0.5c φ (b) A=2 flat plate wing Figure 2: Schematic of the AVT-202 rotating wing kinematics and geometry, from Ref. 12. 3.2 Experimental Setup Rotating
In silico models for predicting ready biodegradability under REACH: a comparative study.
Pizzo, Fabiola; Lombardo, Anna; Manganaro, Alberto; Benfenati, Emilio
2013-10-01
REACH (Registration Evaluation Authorization and restriction of Chemicals) legislation is a new European law which aims to raise the human protection level and environmental health. Under REACH all chemicals manufactured or imported for more than one ton per year must be evaluated for their ready biodegradability. Ready biodegradability is also used as a screening test for persistent, bioaccumulative and toxic (PBT) substances. REACH encourages the use of non-testing methods such as QSAR (quantitative structure-activity relationship) models in order to save money and time and to reduce the number of animals used for scientific purposes. Some QSAR models are available for predicting ready biodegradability. We used a dataset of 722 compounds to test four models: VEGA, TOPKAT, BIOWIN 5 and 6 and START and compared their performance on the basis of the following parameters: accuracy, sensitivity, specificity and Matthew's correlation coefficient (MCC). Performance was analyzed from different points of view. The first calculation was done on the whole dataset and VEGA and TOPKAT gave the best accuracy (88% and 87% respectively). Then we considered the compounds inside and outside the training set: BIOWIN 6 and 5 gave the best results for accuracy (81%) outside training set. Another analysis examined the applicability domain (AD). VEGA had the highest value for compounds inside the AD for all the parameters taken into account. Finally, compounds outside the training set and in the AD of the models were considered to assess predictive ability. VEGA gave the best accuracy results (99%) for this group of chemicals. Generally, START model gave poor results. Since BIOWIN, TOPKAT and VEGA models performed well, they may be used to predict ready biodegradability. Copyright © 2013 Elsevier B.V. All rights reserved.
Querying graphs in protein-protein interactions networks using feedback vertex set.
Blin, Guillaume; Sikora, Florian; Vialette, Stéphane
2010-01-01
Recent techniques increase rapidly the amount of our knowledge on interactions between proteins. The interpretation of these new information depends on our ability to retrieve known substructures in the data, the Protein-Protein Interactions (PPIs) networks. In an algorithmic point of view, it is an hard task since it often leads to NP-hard problems. To overcome this difficulty, many authors have provided tools for querying patterns with a restricted topology, i.e., paths or trees in PPI networks. Such restriction leads to the development of fixed parameter tractable (FPT) algorithms, which can be practicable for restricted sizes of queries. Unfortunately, Graph Homomorphism is a W[1]-hard problem, and hence, no FPT algorithm can be found when patterns are in the shape of general graphs. However, Dost et al. gave an algorithm (which is not implemented) to query graphs with a bounded treewidth in PPI networks (the treewidth of the query being involved in the time complexity). In this paper, we propose another algorithm for querying pattern in the shape of graphs, also based on dynamic programming and the color-coding technique. To transform graphs queries into trees without loss of informations, we use feedback vertex set coupled to a node duplication mechanism. Hence, our algorithm is FPT for querying graphs with a bounded size of their feedback vertex set. It gives an alternative to the treewidth parameter, which can be better or worst for a given query. We provide a python implementation which allows us to validate our implementation on real data. Especially, we retrieve some human queries in the shape of graphs into the fly PPI network.
Fast alternating projection methods for constrained tomographic reconstruction
Liu, Li; Han, Yongxin
2017-01-01
The alternating projection algorithms are easy to implement and effective for large-scale complex optimization problems, such as constrained reconstruction of X-ray computed tomography (CT). A typical method is to use projection onto convex sets (POCS) for data fidelity, nonnegative constraints combined with total variation (TV) minimization (so called TV-POCS) for sparse-view CT reconstruction. However, this type of method relies on empirically selected parameters for satisfactory reconstruction and is generally slow and lack of convergence analysis. In this work, we use a convex feasibility set approach to address the problems associated with TV-POCS and propose a framework using full sequential alternating projections or POCS (FS-POCS) to find the solution in the intersection of convex constraints of bounded TV function, bounded data fidelity error and non-negativity. The rationale behind FS-POCS is that the mathematically optimal solution of the constrained objective function may not be the physically optimal solution. The breakdown of constrained reconstruction into an intersection of several feasible sets can lead to faster convergence and better quantification of reconstruction parameters in a physical meaningful way than that in an empirical way of trial-and-error. In addition, for large-scale optimization problems, first order methods are usually used. Not only is the condition for convergence of gradient-based methods derived, but also a primal-dual hybrid gradient (PDHG) method is used for fast convergence of bounded TV. The newly proposed FS-POCS is evaluated and compared with TV-POCS and another convex feasibility projection method (CPTV) using both digital phantom and pseudo-real CT data to show its superior performance on reconstruction speed, image quality and quantification. PMID:28253298
Measuring saliency in images: which experimental parameters for the assessment of image quality?
NASA Astrophysics Data System (ADS)
Fredembach, Clement; Woolfe, Geoff; Wang, Jue
2012-01-01
Predicting which areas of an image are perceptually salient or attended to has become an essential pre-requisite of many computer vision applications. Because observers are notoriously unreliable in remembering where they look a posteriori, and because asking where they look while observing the image necessarily in uences the results, ground truth about saliency and visual attention has to be obtained by gaze tracking methods. From the early work of Buswell and Yarbus to the most recent forays in computer vision there has been, perhaps unfortunately, little agreement on standardisation of eye tracking protocols for measuring visual attention. As the number of parameters involved in experimental methodology can be large, their individual in uence on the nal results is not well understood. Consequently, the performance of saliency algorithms, when assessed by correlation techniques, varies greatly across the literature. In this paper, we concern ourselves with the problem of image quality. Specically: where people look when judging images. We show that in this case, the performance gap between existing saliency prediction algorithms and experimental results is signicantly larger than otherwise reported. To understand this discrepancy, we rst devise an experimental protocol that is adapted to the task of measuring image quality. In a second step, we compare our experimental parameters with the ones of existing methods and show that a lot of the variability can directly be ascribed to these dierences in experimental methodology and choice of variables. In particular, the choice of a task, e.g., judging image quality vs. free viewing, has a great impact on measured saliency maps, suggesting that even for a mildly cognitive task, ground truth obtained by free viewing does not adapt well. Careful analysis of the prior art also reveals that systematic bias can occur depending on instrumental calibration and the choice of test images. We conclude this work by proposing a set of parameters, tasks and images that can be used to compare the various saliency prediction methods in a manner that is meaningful for image quality assessment.
NOAA Photo Library - Navigating the Collection
will have to change the setting to 800x600 to view the full image without having to scroll from left to view or download the highest resolution image available, click on the message "High Resolution viewing individual images associated with albums. If wishing to view the image ID number of a thumbnail
VIEW-Station software and its graphical user interface
NASA Astrophysics Data System (ADS)
Kawai, Tomoaki; Okazaki, Hiroshi; Tanaka, Koichiro; Tamura, Hideyuki
1992-04-01
VIEW-Station is a workstation-based image processing system which merges the state-of-the- art software environment of Unix with the computing power of a fast image processor. VIEW- Station has a hierarchical software architecture, which facilitates device independence when porting across various hardware configurations, and provides extensibility in the development of application systems. The core image computing language is V-Sugar. V-Sugar provides a set of image-processing datatypes and allows image processing algorithms to be simply expressed, using a functional notation. VIEW-Station provides a hardware independent window system extension called VIEW-Windows. In terms of GUI (Graphical User Interface) VIEW-Station has two notable aspects. One is to provide various types of GUI as visual environments for image processing execution. Three types of interpreters called (mu) V- Sugar, VS-Shell and VPL are provided. Users may choose whichever they prefer based on their experience and tasks. The other notable aspect is to provide facilities to create GUI for new applications on the VIEW-Station system. A set of widgets are available for construction of task-oriented GUI. A GUI builder called VIEW-Kid is developed for WYSIWYG interactive interface design.
VizieR Online Data Catalog: A catalog of exoplanet physical parameters (Foreman-Mackey+, 2014)
NASA Astrophysics Data System (ADS)
Foreman-Mackey, D.; Hogg, D. W.; Morton, T. D.
2017-05-01
The first ingredient for any probabilistic inference is a likelihood function, a description of the probability of observing a specific data set given a set of model parameters. In this particular project, the data set is a catalog of exoplanet measurements and the model parameters are the values that set the shape and normalization of the occurrence rate density. (2 data files).
ERIC Educational Resources Information Center
Nikolopoulou, Kleopatra; Gialamas, Vasilis
2009-01-01
This paper discusses the compilation of an instrument in order to investigate pre-service early childhood teachers' views and intentions about integrating and using computers in early childhood settings. For the purpose of this study a questionnaire was compiled and administered to 258 pre-service early childhood teachers (PECTs), in Greece. A…
ERIC Educational Resources Information Center
Öztürk Yilmaztekin, Elif; Erden, Feyza Tantekin
2017-01-01
This study investigates early childhood teachers' views about science teaching practices in an early childhood settings. It was conducted in a preschool located in Ankara, Turkey. The data of the study were collected through multiple sources of information such as interviews with early childhood teachers and observations of their practices in the…
NASA Astrophysics Data System (ADS)
Beck, Hylke; de Roo, Ad; van Dijk, Albert; McVicar, Tim; Miralles, Diego; Schellekens, Jaap; Bruijnzeel, Sampurno; de Jeu, Richard
2015-04-01
Motivated by the lack of large-scale model parameter regionalization studies, a large set of 3328 small catchments (< 10000 km2) around the globe was used to set up and evaluate five model parameterization schemes at global scale. The HBV-light model was chosen because of its parsimony and flexibility to test the schemes. The catchments were calibrated against observed streamflow (Q) using an objective function incorporating both behavioral and goodness-of-fit measures, after which the catchment set was split into subsets of 1215 donor and 2113 evaluation catchments based on the calibration performance. The donor catchments were subsequently used to derive parameter sets that were transferred to similar grid cells based on a similarity measure incorporating climatic and physiographic characteristics, thereby producing parameter maps with global coverage. Overall, there was a lack of suitable donor catchments for mountainous and tropical environments. The schemes with spatially-uniform parameter sets (EXP2 and EXP3) achieved the worst Q estimation performance in the evaluation catchments, emphasizing the importance of parameter regionalization. The direct transfer of calibrated parameter sets from donor catchments to similar grid cells (scheme EXP1) performed best, although there was still a large performance gap between EXP1 and HBV-light calibrated against observed Q. The schemes with parameter sets obtained by simultaneously calibrating clusters of similar donor catchments (NC10 and NC58) performed worse than EXP1. The relatively poor Q estimation performance achieved by two (uncalibrated) macro-scale hydrological models suggests there is considerable merit in regionalizing the parameters of such models. The global HBV-light parameter maps and ancillary data are freely available via http://water.jrc.ec.europa.eu.
Optimization of multilayer neural network parameters for speaker recognition
NASA Astrophysics Data System (ADS)
Tovarek, Jaromir; Partila, Pavol; Rozhon, Jan; Voznak, Miroslav; Skapa, Jan; Uhrin, Dominik; Chmelikova, Zdenka
2016-05-01
This article discusses the impact of multilayer neural network parameters for speaker identification. The main task of speaker identification is to find a specific person in the known set of speakers. It means that the voice of an unknown speaker (wanted person) belongs to a group of reference speakers from the voice database. One of the requests was to develop the text-independent system, which means to classify wanted person regardless of content and language. Multilayer neural network has been used for speaker identification in this research. Artificial neural network (ANN) needs to set parameters like activation function of neurons, steepness of activation functions, learning rate, the maximum number of iterations and a number of neurons in the hidden and output layers. ANN accuracy and validation time are directly influenced by the parameter settings. Different roles require different settings. Identification accuracy and ANN validation time were evaluated with the same input data but different parameter settings. The goal was to find parameters for the neural network with the highest precision and shortest validation time. Input data of neural networks are a Mel-frequency cepstral coefficients (MFCC). These parameters describe the properties of the vocal tract. Audio samples were recorded for all speakers in a laboratory environment. Training, testing and validation data set were split into 70, 15 and 15 %. The result of the research described in this article is different parameter setting for the multilayer neural network for four speakers.
Vehicle Re-Identification by Deep Hidden Multi-View Inference.
Zhou, Yi; Liu, Li; Shao, Ling
2018-07-01
Vehicle re-identification (re-ID) is an area that has received far less attention in the computer vision community than the prevalent person re-ID. Possible reasons for this slow progress are the lack of appropriate research data and the special 3D structure of a vehicle. Previous works have generally focused on some specific views (e.g., front); but, these methods are less effective in realistic scenarios, where vehicles usually appear in arbitrary views to cameras. In this paper, we focus on the uncertainty of vehicle viewpoint in re-ID, proposing two end-to-end deep architectures: the Spatially Concatenated ConvNet and convolutional neural network (CNN)-LSTM bi-directional loop. Our models exploit the great advantages of the CNN and long short-term memory (LSTM) to learn transformations across different viewpoints of vehicles. Thus, a multi-view vehicle representation containing all viewpoints' information can be inferred from the only one input view, and then used for learning to measure distance. To verify our models, we also introduce a Toy Car RE-ID data set with images from multiple viewpoints of 200 vehicles. We evaluate our proposed methods on the Toy Car RE-ID data set and the public Multi-View Car, VehicleID, and VeRi data sets. Experimental results illustrate that our models achieve consistent improvements over the state-of-the-art vehicle re-ID approaches.
NASA Astrophysics Data System (ADS)
Neuville, R.; Pouliot, J.; Poux, F.; Hallot, P.; De Rudder, L.; Billen, R.
2017-10-01
This paper deals with the establishment of a comprehensive methodological framework that defines 3D visualisation rules and its application in a decision support tool. Whilst the use of 3D models grows in many application fields, their visualisation remains challenging from the point of view of mapping and rendering aspects to be applied to suitability support the decision making process. Indeed, there exists a great number of 3D visualisation techniques but as far as we know, a decision support tool that facilitates the production of an efficient 3D visualisation is still missing. This is why a comprehensive methodological framework is proposed in order to build decision tables for specific data, tasks and contexts. Based on the second-order logic formalism, we define a set of functions and propositions among and between two collections of entities: on one hand static retinal variables (hue, size, shape…) and 3D environment parameters (directional lighting, shadow, haze…) and on the other hand their effect(s) regarding specific visual tasks. It enables to define 3D visualisation rules according to four categories: consequence, compatibility, potential incompatibility and incompatibility. In this paper, the application of the methodological framework is demonstrated for an urban visualisation at high density considering a specific set of entities. On the basis of our analysis and the results of many studies conducted in the 3D semiotics, which refers to the study of symbols and how they relay information, the truth values of propositions are determined. 3D visualisation rules are then extracted for the considered context and set of entities and are presented into a decision table with a colour coding. Finally, the decision table is implemented into a plugin developed with three.js, a cross-browser JavaScript library. The plugin consists of a sidebar and warning windows that help the designer in the use of a set of static retinal variables and 3D environment parameters.
Performance optimization of the Varian aS500 EPID system.
Berger, Lucie; François, Pascal; Gaboriaud, Geneviève; Rosenwald, Jean-Claude
2006-01-01
Today, electronic portal imaging devices (EPIDs) are widely used as a replacement to portal films for patient position verification, but the image quality is not always optimal. The general aim of this study was to optimize the acquisition parameters of an amorphous silicon EPID commercially available for clinical use in radiation therapy with the view to avoid saturation of the system. Special attention was paid to selection of the parameter corresponding to the number of rows acquired between accelerator pulses (NRP) for various beam energies and dose rates. The image acquisition system (IAS2) has been studied, and portal image acquisition was found to be strongly dependent on the accelerator pulse frequency. This frequency is set for each "energy - dose rate" combination of the linear accelerator. For all combinations, the image acquisition parameters were systematically changed to determine their influence on the performances of the Varian aS500 EPID system. New parameters such as the maximum number of rows (MNR) and the number of pulses per frame (NPF) were introduced to explain portal image acquisition theory. Theoretical and experimental values of MNR and NPF were compared, and they were in good agreement. Other results showed that NRP had a major influence on detector saturation and dose per image. A rule of thumb was established to determine the optimum NRP value to be used. This practical application was illustrated by a clinical example in which the saturation of the aSi EPID was avoided by NRP optimization. Moreover, an additional study showed that image quality was relatively insensitive to this parameter.
A quasi-dense matching approach and its calibration application with Internet photos.
Wan, Yanli; Miao, Zhenjiang; Wu, Q M Jonathan; Wang, Xifu; Tang, Zhen; Wang, Zhifei
2015-03-01
This paper proposes a quasi-dense matching approach to the automatic acquisition of camera parameters, which is required for recovering 3-D information from 2-D images. An affine transformation-based optimization model and a new matching cost function are used to acquire quasi-dense correspondences with high accuracy in each pair of views. These correspondences can be effectively detected and tracked at the sub-pixel level in multiviews with our neighboring view selection strategy. A two-layer iteration algorithm is proposed to optimize 3-D quasi-dense points and camera parameters. In the inner layer, different optimization strategies based on local photometric consistency and a global objective function are employed to optimize the 3-D quasi-dense points and camera parameters, respectively. In the outer layer, quasi-dense correspondences are resampled to guide a new estimation and optimization process of the camera parameters. We demonstrate the effectiveness of our algorithm with several experiments.
Casadebaig, Pierre; Zheng, Bangyou; Chapman, Scott; Huth, Neil; Faivre, Robert; Chenu, Karine
2016-01-01
A crop can be viewed as a complex system with outputs (e.g. yield) that are affected by inputs of genetic, physiology, pedo-climatic and management information. Application of numerical methods for model exploration assist in evaluating the major most influential inputs, providing the simulation model is a credible description of the biological system. A sensitivity analysis was used to assess the simulated impact on yield of a suite of traits involved in major processes of crop growth and development, and to evaluate how the simulated value of such traits varies across environments and in relation to other traits (which can be interpreted as a virtual change in genetic background). The study focused on wheat in Australia, with an emphasis on adaptation to low rainfall conditions. A large set of traits (90) was evaluated in a wide target population of environments (4 sites × 125 years), management practices (3 sowing dates × 3 nitrogen fertilization levels) and CO2 (2 levels). The Morris sensitivity analysis method was used to sample the parameter space and reduce computational requirements, while maintaining a realistic representation of the targeted trait × environment × management landscape (∼ 82 million individual simulations in total). The patterns of parameter × environment × management interactions were investigated for the most influential parameters, considering a potential genetic range of +/- 20% compared to a reference cultivar. Main (i.e. linear) and interaction (i.e. non-linear and interaction) sensitivity indices calculated for most of APSIM-Wheat parameters allowed the identification of 42 parameters substantially impacting yield in most target environments. Among these, a subset of parameters related to phenology, resource acquisition, resource use efficiency and biomass allocation were identified as potential candidates for crop (and model) improvement. PMID:26799483
Casadebaig, Pierre; Zheng, Bangyou; Chapman, Scott; Huth, Neil; Faivre, Robert; Chenu, Karine
2016-01-01
A crop can be viewed as a complex system with outputs (e.g. yield) that are affected by inputs of genetic, physiology, pedo-climatic and management information. Application of numerical methods for model exploration assist in evaluating the major most influential inputs, providing the simulation model is a credible description of the biological system. A sensitivity analysis was used to assess the simulated impact on yield of a suite of traits involved in major processes of crop growth and development, and to evaluate how the simulated value of such traits varies across environments and in relation to other traits (which can be interpreted as a virtual change in genetic background). The study focused on wheat in Australia, with an emphasis on adaptation to low rainfall conditions. A large set of traits (90) was evaluated in a wide target population of environments (4 sites × 125 years), management practices (3 sowing dates × 3 nitrogen fertilization levels) and CO2 (2 levels). The Morris sensitivity analysis method was used to sample the parameter space and reduce computational requirements, while maintaining a realistic representation of the targeted trait × environment × management landscape (∼ 82 million individual simulations in total). The patterns of parameter × environment × management interactions were investigated for the most influential parameters, considering a potential genetic range of +/- 20% compared to a reference cultivar. Main (i.e. linear) and interaction (i.e. non-linear and interaction) sensitivity indices calculated for most of APSIM-Wheat parameters allowed the identification of 42 parameters substantially impacting yield in most target environments. Among these, a subset of parameters related to phenology, resource acquisition, resource use efficiency and biomass allocation were identified as potential candidates for crop (and model) improvement.
Pathway collages: personalized multi-pathway diagrams.
Paley, Suzanne; O'Maille, Paul E; Weaver, Daniel; Karp, Peter D
2016-12-13
Metabolic pathway diagrams are a classical way of visualizing a linked cascade of biochemical reactions. However, to understand some biochemical situations, viewing a single pathway is insufficient, whereas viewing the entire metabolic network results in information overload. How do we enable scientists to rapidly construct personalized multi-pathway diagrams that depict a desired collection of interacting pathways that emphasize particular pathway interactions? We define software for constructing personalized multi-pathway diagrams called pathway-collages using a combination of manual and automatic layouts. The user specifies a set of pathways of interest for the collage from a Pathway/Genome Database. Layouts for the individual pathways are generated by the Pathway Tools software, and are sent to a Javascript Pathway Collage application implemented using Cytoscape.js. That application allows the user to re-position pathways; define connections between pathways; change visual style parameters; and paint metabolomics, gene expression, and reaction flux data onto the collage to obtain a desired multi-pathway diagram. We demonstrate the use of pathway collages in two application areas: a metabolomics study of pathogen drug response, and an Escherichia coli metabolic model. Pathway collages enable facile construction of personalized multi-pathway diagrams.
The uncertainty processing theory of motivation.
Anselme, Patrick
2010-04-02
Most theories describe motivation using basic terminology (drive, 'wanting', goal, pleasure, etc.) that fails to inform well about the psychological mechanisms controlling its expression. This leads to a conception of motivation as a mere psychological state 'emerging' from neurophysiological substrates. However, the involvement of motivation in a large number of behavioural parameters (triggering, intensity, duration, and directedness) and cognitive abilities (learning, memory, decision, etc.) suggest that it should be viewed as an information processing system. The uncertainty processing theory (UPT) presented here suggests that motivation is the set of cognitive processes allowing organisms to extract information from the environment by reducing uncertainty about the occurrence of psychologically significant events. This processing of information is shown to naturally result in the highlighting of specific stimuli. The UPT attempts to solve three major problems: (i) how motivations can affect behaviour and cognition so widely, (ii) how motivational specificity for objects and events can result from nonspecific neuropharmacological causal factors (such as mesolimbic dopamine), and (iii) how motivational interactions can be conceived in psychological terms, irrespective of their biological correlates. The UPT is in keeping with the conceptual tradition of the incentive salience hypothesis while trying to overcome the shortcomings inherent to this view. Copyright 2009 Elsevier B.V. All rights reserved.
Self-adaptive calibration for staring infrared sensors
NASA Astrophysics Data System (ADS)
Kendall, William B.; Stocker, Alan D.
1993-10-01
This paper presents a new, self-adaptive technique for the correlation of non-uniformities (fixed-pattern noise) in high-density infrared focal-plane detector arrays. We have developed a new approach to non-uniformity correction in which we use multiple image frames of the scene itself, and take advantage of the aim-point wander caused by jitter, residual tracking errors, or deliberately induced motion. Such wander causes each detector in the array to view multiple scene elements, and each scene element to be viewed by multiple detectors. It is therefore possible to formulate (and solve) a set of simultaneous equations from which correction parameters can be computed for the detectors. We have tested our approach with actual images collected by the ARPA-sponsored MUSIC infrared sensor. For these tests we employed a 60-frame (0.75-second) sequence of terrain images for which an out-of-date calibration was deliberately used. The sensor was aimed at a point on the ground via an operator-assisted tracking system having a maximum aim point wander on the order of ten pixels. With these data, we were able to improve the calibration accuracy by a factor of approximately 100.
General view, south fourthfloor (attic) room, center block, looking northeast. ...
General view, south fourth-floor (attic) room, center block, looking northeast. Originally two rooms, the partition wall was likely removed when a cistern was installed, formerly set on the platform at the center of this view. - Lazaretto Quarantine Station, Wanamaker Avenue and East Second Street, Essington, Delaware County, PA
Native-View Paradigms: Multiple Cultures and Culture Conflicts in Organizations.
ERIC Educational Resources Information Center
Gregory, Kathleen L.
1983-01-01
After reviewing organizational culture studies done in industrial settings, this paper proposes a native-view paradigm from anthropology for exploring the multiple perspectives of participants in large organizations and describes a study--of Silicon Valley technical professionals' native views--that applies the methods of ethnoscience ethnography.…
A method and data for video monitor sizing. [human CRT viewing requirements
NASA Technical Reports Server (NTRS)
Kirkpatrick, M., III; Shields, N. L., Jr.; Malone, T. B.; Guerin, E. G.
1976-01-01
The paper outlines an approach consisting of using analytical methods and empirical data to determine monitor size constraints based on the human operator's CRT viewing requirements in a context where panel space and volume considerations for the Space Shuttle aft cabin constrain the size of the monitor to be used. Two cases are examined: remote scene imaging and alphanumeric character display. The central parameter used to constrain monitor size is the ratio M/L where M is the monitor dimension and L the viewing distance. The study is restricted largely to 525 line video systems having an SNR of 32 db and bandwidth of 4.5 MHz. Degradation in these parameters would require changes in the empirically determined visual angle constants presented. The data and methods described are considered to apply to cases where operators are required to view via TV target objects which are well differentiated from the background and where the background is relatively sparse. It is also necessary to identify the critical target dimensions and cues.
Dezawa, Akira; Sairyo, Koichi
2014-05-01
Organic electroluminescence displays (OELD) use organic materials that self-emit light with the passage of an electric current. OELD provide high contrast, excellent color reproducibility at low brightness, excellent video images, and less restricted viewing angles. OELD are thus promising for medical use. This study compared the utility of an OELD with conventional liquid crystal displays (LCD) for imaging in orthopedic endoscopic surgery. One OELD and two conventional LCD that were indistinguishable in external appearance were used in this study. Images from 18 patients were displayed simultaneously on three monitors and evaluated by six orthopedic surgeons with extensive surgical experience. Images were shown for 2 min, repeated twice, and viewed from the front and side (diagonally). Surgeon rated both clinical utility (12 parameters) and image quality (11 parameters) for each image on a 5-point scale: 1, very good; 2, good; 3, average; 4, poor; and 5, very poor. For clinical utility in 16 percutaneous endoscopic discectomy cases, mean scores for all 12 parameters were significantly better on the OELD than on the LCD, including organ distinguishability (2.1 vs 3.2, respectively), lesion identification (2.2 vs 3.1), and overall viewing impression (2.1 vs 3.1). For image quality, all 11 parameters were better on the OELD than on LCD. Significant differences were identified in six parameters, including contrast (1.8 vs 2.9), color reproducibility in dark areas (1.8 vs 2.9), and viewing angle (2.2 vs 2.9). The high contrast and excellent color reproducibility of the OELD reduced the constraints of imaging under endoscopy, in which securing a field of view may be difficult. Distinguishability of organs was good, including ligaments, dura mater, nerves, and adipose tissue, contributing to good stereoscopic images of the surgical field. These findings suggest the utility of OELD for excellent display of surgical images and for enabling safe and highly accurate endoscopic surgery. © 2014 Japan Society for Endoscopic Surgery, Asia Endosurgery Task Force and Wiley Publishing Asia Pty Ltd.
Using Ensemble Decisions and Active Selection to Improve Low-Cost Labeling for Multi-View Data
NASA Technical Reports Server (NTRS)
Rebbapragada, Umaa; Wagstaff, Kiri L.
2011-01-01
This paper seeks to improve low-cost labeling in terms of training set reliability (the fraction of correctly labeled training items) and test set performance for multi-view learning methods. Co-training is a popular multiview learning method that combines high-confidence example selection with low-cost (self) labeling. However, co-training with certain base learning algorithms significantly reduces training set reliability, causing an associated drop in prediction accuracy. We propose the use of ensemble labeling to improve reliability in such cases. We also discuss and show promising results on combining low-cost ensemble labeling with active (low-confidence) example selection. We unify these example selection and labeling strategies under collaborative learning, a family of techniques for multi-view learning that we are developing for distributed, sensor-network environments.
Numerical modelling of series-parallel cooling systems in power plant
NASA Astrophysics Data System (ADS)
Regucki, Paweł; Lewkowicz, Marek; Kucięba, Małgorzata
2017-11-01
The paper presents a mathematical model allowing one to study series-parallel hydraulic systems like, e.g., the cooling system of a power boiler's auxiliary devices or a closed cooling system including condensers and cooling towers. The analytical approach is based on a set of non-linear algebraic equations solved using numerical techniques. As a result of the iterative process, a set of volumetric flow rates of water through all the branches of the investigated hydraulic system is obtained. The calculations indicate the influence of changes in the pipeline's geometrical parameters on the total cooling water flow rate in the analysed installation. Such an approach makes it possible to analyse different variants of the modernization of the studied systems, as well as allowing for the indication of its critical elements. Basing on these results, an investor can choose the optimal variant of the reconstruction of the installation from the economic point of view. As examples of such a calculation, two hydraulic installations are described. One is a boiler auxiliary cooling installation including two screw ash coolers. The other is a closed cooling system consisting of cooling towers and condensers.
Vaala, Sarah E.
2014-01-01
Viewing television and video programming has become a normative behavior among US infants and toddlers. Little is understood about parents’ decision-making about the extent of their young children’s viewing, though numerous organizations are interested in reducing time spent viewing among infants and toddlers. Prior research has examined parents’ belief in the educational value of TV/videos for young children and the predictive value of this belief for understanding infant/toddler viewing rates, though other possible salient beliefs remain largely unexplored. This study employs the integrative model of behavioral prediction (Fishbein & Ajzen, 2010) to examine 30 maternal beliefs about infants’ and toddlers’ TV/video viewing which were elicited from a prior sample of mothers. Results indicate that mothers tend to hold more positive than negative beliefs about the outcomes associated with young children’s TV/video viewing, and that the nature of the aggregate set of beliefs is predictive of their general attitudes and intentions to allow their children to view, as well as children’s estimated viewing rates. Analyses also uncover multiple dimensions within the full set of beliefs, which explain more variance in mothers’ attitudes and intentions and children’s viewing than the uni-dimensional index. The theoretical and practical implications of the findings are discussed. PMID:25431537
Soong, David T.; Over, Thomas M.
2015-01-01
Recalibration of the HSPF parameters to the updated inputs and land covers was completed on two representative watershed models selected from the nine by using a manual method (HSPEXP) and an automatic method (PEST). The objective of the recalibration was to develop a regional parameter set that improves the accuracy in runoff volume prediction for the nine study watersheds. Knowledge about flow and watershed characteristics plays a vital role for validating the calibration in both manual and automatic methods. The best performing parameter set was determined by the automatic calibration method on a two-watershed model. Applying this newly determined parameter set to the nine watersheds for runoff volume simulation resulted in “very good” ratings in five watersheds, an improvement as compared to “very good” ratings achieved for three watersheds by the North Branch parameter set.
An enhanced digital line graph design
Guptill, Stephen C.
1990-01-01
In response to increasing information demands on its digital cartographic data, the U.S. Geological Survey has designed an enhanced version of the Digital Line Graph, termed Digital Line Graph - Enhanced (DLG-E). In the DLG-E model, the phenomena represented by geographic and cartographic data are termed entities. Entities represent individual phenomena in the real world. A feature is an abstraction of a set of entities, with the feature description encompassing only selected properties of the entities (typically the properties that have been portrayed cartographically on a map). Buildings, bridges, roads, streams, grasslands, and counties are examples of features. A feature instance, that is, one occurrence of a feature, is described in the digital environment by feature objects and spatial objects. A feature object identifies a feature instance and its nonlocational attributes. Nontopological relationships are associated with feature objects. The locational aspects of the feature instance are represented by spatial objects. Four spatial objects (points, nodes, chains, and polygons) and their topological relationships are defined. To link the locational and nonlocational aspects of the feature instance, a given feature object is associated with (or is composed of) a set of spatial objects. These objects, attributes, and relationships are the components of the DLG-E data model. To establish a domain of features for DLG-E, an approach using a set of classes, or views, of spatial entities was adopted. The five views that were developed are cover, division, ecosystem, geoposition, and morphology. The views are exclusive; each view is a self-contained analytical approach to the entire range of world features. Because each view is independent of the others, a single point on the surface of the Earth can be represented under multiple views. Under the five views, over 200 features were identified and defined. This set constitutes an initial domain of DLG-E features.
Lamprecht, J; Behrens, J; Mau, W; Schubert, M
2011-06-01
An aftercare programme following medical rehabilitation may be beneficial in order to reinforce and stabilize the positive effects of rehabilitation and to encourage individual health-related modifications of behaviour and lifestyle. Medical rehabilitation and the aftercare programme of the German Pension Insurance Fund primarily are intended to sustain earning capacity. As part of an evaluation of the Intensified Rehabilitation Aftercare Programme (IRENA) established by the German Pension Insurance Fund, work-related aspects in orthopaedic patients were analyzed based on various data sources. Firstly, the significance of institutional and individual conditions for utilization of IRENA alongside work was of interest. Secondly, the IRENA participants' judgements of the changes of work-related parameters due to the programme were examined, differentiating specifically by extent of earning capacity impairments as well as by particular work problems. The data set used for the analysis is composed of person-related routine data of the German Pension Insurance Fund relative to IRENA records of the year 2007 (n=30 663), interview data from orthopaedic rehabilitation centres providing IRENA (n=225), and questionnaires of IRENA participants (n=750) that were either collected during a broad evaluation of the IRENA programme or provided by the German Pension Insurance Fund. The results show that the compatibility of IRENA and work is facilitated by the institutional conditions. However, differences between inpatient and outpatient settings have to be recognized. The possibilities to participate in IRENA throughout the day frequently are more diverse in an outpatient setting. In contrast to inpatient centres, outpatient rehabilitation centres see clearly better chances for patients to return to work and to participate in IRENA alongside. With respect to the work-related parameters (work ability, periods of sick leave), clear improvements were reported by participants from the start of rehabilitation to the survey time after the end of IRENA. Particular work problems were reported by 33% of the IRENA participants. The work ability at the end of rehabilitation was found to have been the essential factor for improvement of work ability following IRENA. Particular work problems, however, had no influence, these individuals profited from IRENA to an equal extent. Institutional and individual view show that IRENA is compatible with utilization alongside work. Also, IRENA combined with prior medical rehabilitation will bring about subjective improvements in health and work-related parameters. © Georg Thieme Verlag KG Stuttgart · New York.
Particle size distribution of the stratospheric aerosol from SCIAMACHY limb measurements
NASA Astrophysics Data System (ADS)
Rozanov, Alexei; Malinina, Elizaveta; Bovensmann, Heinrich; Burrows, John
2017-04-01
A crucial role of the stratospheric aerosols for the radiative budget of the Earth's atmosphere and the consequences for the climate change are widely recognized. A reliable knowledge on physical and optical properties of the stratospheric aerosols as well as on their vertical and spatial distributing is a key issue to assure a proper initialization and running conditions for climate models. On a global scale this information can only be gained from space borne measurements. While a series of past, present and future instruments provide extensive date sets of such aerosol characteristics as extinction coefficient or backscattering ratio, information on a size distribution of the stratospheric aerosols is sparse. One of the important sources on vertically and spatially resolved information on the particle size distribution of stratospheric aerosols is provided by space borne measurements of the scattered solar light in limb viewing geometry performed in visible, near-infrared and short-wave infrared spectral ranges. SCIAMACHY (SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY) instrument operated on the European satellite Envisat from 2002 to 2102 was capable of providing spectral information needed to retrieve parameters of aerosol particle size distributions. In this presentation we discuss the retrieval method, present first validation results with SAGE II data and analyze first data sets of stratospheric aerosol particle size distribution parameters obtained from SCIAMACHY limb measurements. The research work was performed in the framework of ROMIC (Role of the middle atmosphere in climate) project.
Evolutionary Computing Methods for Spectral Retrieval
NASA Technical Reports Server (NTRS)
Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seugwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Geivanna
2009-01-01
A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
Chaiyakunapruk, Nathorn; Somkrua, Ratchadaporn; Hutubessy, Raymond; Henao, Ana Maria; Hombach, Joachim; Melegaro, Alessia; Edmunds, John W; Beutels, Philippe
2011-05-12
Several decision support tools have been developed to aid policymaking regarding the adoption of pneumococcal conjugate vaccine (PCV) into national pediatric immunization programs. The lack of critical appraisal of these tools makes it difficult for decision makers to understand and choose between them. With the aim to guide policymakers on their optimal use, we compared publicly available decision-making tools in relation to their methods, influential parameters and results. The World Health Organization (WHO) requested access to several publicly available cost-effectiveness (CE) tools for PCV from both public and private provenance. All tools were critically assessed according to the WHO's guide for economic evaluations of immunization programs. Key attributes and characteristics were compared and a series of sensitivity analyses was performed to determine the main drivers of the results. The results were compared based on a standardized set of input parameters and assumptions. Three cost-effectiveness modeling tools were provided, including two cohort-based (Pan-American Health Organization (PAHO) ProVac Initiative TriVac, and PneumoADIP) and one population-based model (GlaxoSmithKline's SUPREMES). They all compared the introduction of PCV into national pediatric immunization program with no PCV use. The models were different in terms of model attributes, structure, and data requirement, but captured a similar range of diseases. Herd effects were estimated using different approaches in each model. The main driving parameters were vaccine efficacy against pneumococcal pneumonia, vaccine price, vaccine coverage, serotype coverage and disease burden. With a standardized set of input parameters developed for cohort modeling, TriVac and PneumoADIP produced similar incremental costs and health outcomes, and incremental cost-effectiveness ratios. Vaccine cost (dose price and number of doses), vaccine efficacy and epidemiology of critical endpoint (for example, incidence of pneumonia, distribution of serotypes causing pneumonia) were influential parameters in the models we compared. Understanding the differences and similarities of such CE tools through regular comparisons could render decision-making processes in different countries more efficient, as well as providing guiding information for further clinical and epidemiological research. A tool comparison exercise using standardized data sets can help model developers to be more transparent about their model structure and assumptions and provide analysts and decision makers with a more in-depth view behind the disease dynamics. Adherence to the WHO guide of economic evaluations of immunization programs may also facilitate this process. Please see related article: http://www.biomedcentral.com/1741-7007/9/55.
MCNP5 CALCULATIONS REPLICATING ARH-600 NITRATE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
FINFROCK SH
This report serves to extend the previous document: 'MCNP Calculations Replicating ARH-600 Data' by replicating the nitrate curves found in ARH-600. This report includes the MCNP models used, the calculated critical dimension for each analyzed parameter set, and the resulting data libraries for use with the CritView code. As with the ARH-600 data, this report is not meant to replace the analysis of the fissile systems by qualified criticality personnel. The M CNP data is presented without accounting for the statistical uncertainty (although this is typically less than 0.001) or bias and, as such, the application of a reasonable safetymore » margin is required. The data that follows pertains to the uranyl nitrate and plutonium nitrate spheres, infinite cylinders, and infinite slabs of varying isotopic composition, reflector thickness, and molarity. Each of the cases was modeled in MCNP (version 5.1.40), using the ENDF/B-VI cross section set. Given a molarity, isotopic composition, and reflector thickness, the fissile concentration and diameter (or thicknesses in the case of the slab geometries) were varied. The diameter for which k-effective equals 1.00 for a given concentration could then be calculated and graphed. These graphs are included in this report. The pages that follow describe the regions modeled, formulas for calculating the various parameters, a list of cross-sections used in the calculations, a description of the automation routine and data, and finally the data output. The data of most interest are the critical dimensions of the various systems analyzed. This is presented graphically, and in table format, in Appendix B. Appendix C provides a text listing of the same data in a format that is compatible with the CritView code. Appendices D and E provide listing of example Template files and MCNP input files (these are discussed further in Section 4). Appendix F is a complete listing of all of the output data (i.e., all of the analyzed dimensions and the resulting k{sub eff} values).« less
IPO: a tool for automated optimization of XCMS parameters.
Libiseller, Gunnar; Dvorzak, Michaela; Kleb, Ulrike; Gander, Edgar; Eisenberg, Tobias; Madeo, Frank; Neumann, Steffen; Trausinger, Gert; Sinner, Frank; Pieber, Thomas; Magnes, Christoph
2015-04-16
Untargeted metabolomics generates a huge amount of data. Software packages for automated data processing are crucial to successfully process these data. A variety of such software packages exist, but the outcome of data processing strongly depends on algorithm parameter settings. If they are not carefully chosen, suboptimal parameter settings can easily lead to biased results. Therefore, parameter settings also require optimization. Several parameter optimization approaches have already been proposed, but a software package for parameter optimization which is free of intricate experimental labeling steps, fast and widely applicable is still missing. We implemented the software package IPO ('Isotopologue Parameter Optimization') which is fast and free of labeling steps, and applicable to data from different kinds of samples and data from different methods of liquid chromatography - high resolution mass spectrometry and data from different instruments. IPO optimizes XCMS peak picking parameters by using natural, stable (13)C isotopic peaks to calculate a peak picking score. Retention time correction is optimized by minimizing relative retention time differences within peak groups. Grouping parameters are optimized by maximizing the number of peak groups that show one peak from each injection of a pooled sample. The different parameter settings are achieved by design of experiments, and the resulting scores are evaluated using response surface models. IPO was tested on three different data sets, each consisting of a training set and test set. IPO resulted in an increase of reliable groups (146% - 361%), a decrease of non-reliable groups (3% - 8%) and a decrease of the retention time deviation to one third. IPO was successfully applied to data derived from liquid chromatography coupled to high resolution mass spectrometry from three studies with different sample types and different chromatographic methods and devices. We were also able to show the potential of IPO to increase the reliability of metabolomics data. The source code is implemented in R, tested on Linux and Windows and it is freely available for download at https://github.com/glibiseller/IPO . The training sets and test sets can be downloaded from https://health.joanneum.at/IPO .
NASA Technical Reports Server (NTRS)
Minnis, P.; Sun-Mack, S.; Bedka, K. M.; Yost, C. R.; Trepte, Q. Z.; Smith, W. L., Jr.; Painemal, D.; Chen, Y.; Palikonda, R.; Dong, X.;
2016-01-01
Validation is a key component of remote sensing that can take many different forms. The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) is applied to many different imager datasets including those from the geostationary satellites, Meteosat, Himiwari-8, INSAT-3D, GOES, and MTSAT, as well as from the low-Earth orbiting satellite imagers, MODIS, AVHRR, and VIIRS. While each of these imagers have similar sets of channels with wavelengths near 0.65, 3.7, 11, and 12 micrometers, many differences among them can lead to discrepancies in the retrievals. These differences include spatial resolution, spectral response functions, viewing conditions, and calibrations, among others. Even when analyzed with nearly identical algorithms, it is necessary, because of those discrepancies, to validate the results from each imager separately in order to assess the uncertainties in the individual parameters. This paper presents comparisons of various SatCORPS-retrieved cloud parameters with independent measurements and retrievals from a variety of instruments. These include surface and space-based lidar and radar data from CALIPSO and CloudSat, respectively, to assess the cloud fraction, height, base, optical depth, and ice water path; satellite and surface microwave radiometers to evaluate cloud liquid water path; surface-based radiometers to evaluate optical depth and effective particle size; and airborne in-situ data to evaluate ice water content, effective particle size, and other parameters. The results of comparisons are compared and contrasted and the factors influencing the differences are discussed.
Artificial Intelligence in Mitral Valve Analysis
Jeganathan, Jelliffe; Knio, Ziyad; Amador, Yannis; Hai, Ting; Khamooshian, Arash; Matyal, Robina; Khabbaz, Kamal R; Mahmood, Feroze
2017-01-01
Background: Echocardiographic analysis of mitral valve (MV) has become essential for diagnosis and management of patients with MV disease. Currently, the various software used for MV analysis require manual input and are prone to interobserver variability in the measurements. Aim: The aim of this study is to determine the interobserver variability in an automated software that uses artificial intelligence for MV analysis. Settings and Design: Retrospective analysis of intraoperative three-dimensional transesophageal echocardiography data acquired from four patients with normal MV undergoing coronary artery bypass graft surgery in a tertiary hospital. Materials and Methods: Echocardiographic data were analyzed using the eSie Valve Software (Siemens Healthcare, Mountain View, CA, USA). Three examiners analyzed three end-systolic (ES) frames from each of the four patients. A total of 36 ES frames were analyzed and included in the study. Statistical Analysis: A multiple mixed-effects ANOVA model was constructed to determine if the examiner, the patient, and the loop had a significant effect on the average value of each parameter. A Bonferroni correction was used to correct for multiple comparisons, and P = 0.0083 was considered to be significant. Results: Examiners did not have an effect on any of the six parameters tested. Patient and loop had an effect on the average parameter value for each of the six parameters as expected (P < 0.0083 for both). Conclusion: We were able to conclude that using automated analysis, it is possible to obtain results with good reproducibility, which only requires minimal user intervention. PMID:28393769
In-Situ Visualization Experiments with ParaView Cinema in RAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kares, Robert John
2015-10-15
A previous paper described some numerical experiments performed using the ParaView/Catalyst in-situ visualization infrastructure deployed in the Los Alamos RAGE radiation-hydrodynamics code to produce images from a running large scale 3D ICF simulation. One challenge of the in-situ approach apparent in these experiments was the difficulty of choosing parameters likes isosurface values for the visualizations to be produced from the running simulation without the benefit of prior knowledge of the simulation results and the resultant cost of recomputing in-situ generated images when parameters are chosen suboptimally. A proposed method of addressing this difficulty is to simply render multiple images atmore » runtime with a range of possible parameter values to produce a large database of images and to provide the user with a tool for managing the resulting database of imagery. Recently, ParaView/Catalyst has been extended to include such a capability via the so-called Cinema framework. Here I describe some initial experiments with the first delivery of Cinema and make some recommendations for future extensions of Cinema’s capabilities.« less
Patient-specific bronchoscopy visualization through BRDF estimation and disocclusion correction.
Chung, Adrian J; Deligianni, Fani; Shah, Pallav; Wells, Athol; Yang, Guang-Zhong
2006-04-01
This paper presents an image-based method for virtual bronchoscope with photo-realistic rendering. The technique is based on recovering bidirectional reflectance distribution function (BRDF) parameters in an environment where the choice of viewing positions, directions, and illumination conditions are restricted. Video images of bronchoscopy examinations are combined with patient-specific three-dimensional (3-D) computed tomography data through two-dimensional (2-D)/3-D registration and shading model parameters are then recovered by exploiting the restricted lighting configurations imposed by the bronchoscope. With the proposed technique, the recovered BRDF is used to predict the expected shading intensity, allowing a texture map independent of lighting conditions to be extracted from each video frame. To correct for disocclusion artefacts, statistical texture synthesis was used to recreate the missing areas. New views not present in the original bronchoscopy video are rendered by evaluating the BRDF with different viewing and illumination parameters. This allows free navigation of the acquired 3-D model with enhanced photo-realism. To assess the practical value of the proposed technique, a detailed visual scoring that involves both real and rendered bronchoscope images is conducted.
Chen, Liang-Chieh; Papandreou, George; Kokkinos, Iasonas; Murphy, Kevin; Yuille, Alan L
2018-04-01
In this work we address the task of semantic image segmentation with Deep Learning and make three main contributions that are experimentally shown to have substantial practical merit. First, we highlight convolution with upsampled filters, or 'atrous convolution', as a powerful tool in dense prediction tasks. Atrous convolution allows us to explicitly control the resolution at which feature responses are computed within Deep Convolutional Neural Networks. It also allows us to effectively enlarge the field of view of filters to incorporate larger context without increasing the number of parameters or the amount of computation. Second, we propose atrous spatial pyramid pooling (ASPP) to robustly segment objects at multiple scales. ASPP probes an incoming convolutional feature layer with filters at multiple sampling rates and effective fields-of-views, thus capturing objects as well as image context at multiple scales. Third, we improve the localization of object boundaries by combining methods from DCNNs and probabilistic graphical models. The commonly deployed combination of max-pooling and downsampling in DCNNs achieves invariance but has a toll on localization accuracy. We overcome this by combining the responses at the final DCNN layer with a fully connected Conditional Random Field (CRF), which is shown both qualitatively and quantitatively to improve localization performance. Our proposed "DeepLab" system sets the new state-of-art at the PASCAL VOC-2012 semantic image segmentation task, reaching 79.7 percent mIOU in the test set, and advances the results on three other datasets: PASCAL-Context, PASCAL-Person-Part, and Cityscapes. All of our code is made publicly available online.
Vařeková, Radka Svobodová; Jiroušková, Zuzana; Vaněk, Jakub; Suchomel, Šimon; Koča, Jaroslav
2007-01-01
The Electronegativity Equalization Method (EEM) is a fast approach for charge calculation. A challenging part of the EEM is the parameterization, which is performed using ab initio charges obtained for a set of molecules. The goal of our work was to perform the EEM parameterization for selected sets of organic, organohalogen and organometal molecules. We have performed the most robust parameterization published so far. The EEM parameterization was based on 12 training sets selected from a database of predicted 3D structures (NCI DIS) and from a database of crystallographic structures (CSD). Each set contained from 2000 to 6000 molecules. We have shown that the number of molecules in the training set is very important for quality of the parameters. We have improved EEM parameters (STO-3G MPA charges) for elements that were already parameterized, specifically: C, O, N, H, S, F and Cl. The new parameters provide more accurate charges than those published previously. We have also developed new parameters for elements that were not parameterized yet, specifically for Br, I, Fe and Zn. We have also performed crossover validation of all obtained parameters using all training sets that included relevant elements and confirmed that calculated parameters provide accurate charges.
Persistent aerial video registration and fast multi-view mosaicing.
Molina, Edgardo; Zhu, Zhigang
2014-05-01
Capturing aerial imagery at high resolutions often leads to very low frame rate video streams, well under full motion video standards, due to bandwidth, storage, and cost constraints. Low frame rates make registration difficult when an aircraft is moving at high speeds or when global positioning system (GPS) contains large errors or it fails. We present a method that takes advantage of persistent cyclic video data collections to perform an online registration with drift correction. We split the persistent aerial imagery collection into individual cycles of the scene, identify and correct the registration errors on the first cycle in a batch operation, and then use the corrected base cycle as a reference pass to register and correct subsequent passes online. A set of multi-view panoramic mosaics is then constructed for each aerial pass for representation, presentation and exploitation of the 3D dynamic scene. These sets of mosaics are all in alignment to the reference cycle allowing their direct use in change detection, tracking, and 3D reconstruction/visualization algorithms. Stereo viewing with adaptive baselines and varying view angles is realized by choosing a pair of mosaics from a set of multi-view mosaics. Further, the mosaics for the second pass and later can be generated and visualized online as their is no further batch error correction.
Survey of Magnetosheath Plasma Properties at Saturn and Inference of Upstream Flow Conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomsen, M. F.; Coates, A. J.; Jackman, C. M.
A new Cassini magnetosheath data set is introduced that is based on a comprehensive survey of intervals in which the observed magnetosheath flow was encompassed within the plasma analyzer field of view and for which the computed numerical moments are therefore expected to be accurate. The data extend from 2004 day 299 to 2012 day 151 and comprise 19,155 416-s measurements. In addition to the plasma ion moments (density, temperature, and flow velocity), merged values of the plasma electron density and temperature, the energetic particle pressure, and the magnetic field vector are included in the data set. Statistical properties ofmore » various magnetosheath parameters, including dependence on local time, are presented. The magnetosheath field and flow are found to be only weakly aligned, primarily because of a relatively large z-component of the magnetic field, attributable to the field being pulled out of the equatorial orientation by flows at higher latitudes. A new procedure for using magnetosheath properties to estimate the upstream solar wind speed is proposed and used to determine that the amount of electron heating at Saturn's high Mach-number bow shock is ~4% of the dissipated flow energy. The data set is available as an electronic supplement to this paper.« less
Survey of Magnetosheath Plasma Properties at Saturn and Inference of Upstream Flow Conditions
Thomsen, M. F.; Coates, A. J.; Jackman, C. M.; ...
2018-03-01
A new Cassini magnetosheath data set is introduced that is based on a comprehensive survey of intervals in which the observed magnetosheath flow was encompassed within the plasma analyzer field of view and for which the computed numerical moments are therefore expected to be accurate. The data extend from 2004 day 299 to 2012 day 151 and comprise 19,155 416-s measurements. In addition to the plasma ion moments (density, temperature, and flow velocity), merged values of the plasma electron density and temperature, the energetic particle pressure, and the magnetic field vector are included in the data set. Statistical properties ofmore » various magnetosheath parameters, including dependence on local time, are presented. The magnetosheath field and flow are found to be only weakly aligned, primarily because of a relatively large z-component of the magnetic field, attributable to the field being pulled out of the equatorial orientation by flows at higher latitudes. A new procedure for using magnetosheath properties to estimate the upstream solar wind speed is proposed and used to determine that the amount of electron heating at Saturn's high Mach-number bow shock is ~4% of the dissipated flow energy. The data set is available as an electronic supplement to this paper.« less
HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.
Song, Chi; Tseng, George C
2014-01-01
Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values ( r th ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculation and simulation show better performance of rOP compared to classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
NASA Astrophysics Data System (ADS)
Bergeron, Charles; Labelle, Hubert; Ronsky, Janet; Zernicke, Ronald
2005-04-01
Spinal curvature progression in scoliosis patients is monitored from X-rays, and this serial exposure to harmful radiation increases the incidence of developing cancer. With the aim of reducing the invasiveness of follow-up, this study seeks to relate the three-dimensional external surface to the internal geometry, having assumed that that the physiological links between these are sufficiently regular across patients. A database was used of 194 quasi-simultaneous acquisitions of two X-rays and a 3D laser scan of the entire trunk. Data was processed to sets of datapoints representing the trunk surface and spinal curve. Functional data analyses were performed using generalized Fourier series using a Haar basis and functional minimum noise fractions. The resulting coefficients became inputs and outputs, respectively, to an array of support vector regression (SVR) machines. SVR parameters were set based on theoretical results, and cross-validation increased confidence in the system's performance. Predicted lateral and frontal views of the spinal curve from the back surface demonstrated average L2-errors of 6.13 and 4.38 millimetres, respectively, across the test set; these compared favourably with measurement error in data. This constitutes a first robust prediction of the 3D spinal curve from external data using learning techniques.
JAIL: a structure-based interface library for macromolecules.
Günther, Stefan; von Eichborn, Joachim; May, Patrick; Preissner, Robert
2009-01-01
The increasing number of solved macromolecules provides a solid number of 3D interfaces, if all types of molecular contacts are being considered. JAIL annotates three different kinds of macromolecular interfaces, those between interacting protein domains, interfaces of different protein chains and interfaces between proteins and nucleic acids. This results in a total number of about 184,000 database entries. All the interfaces can easily be identified by a detailed search form or by a hierarchical tree that describes the protein domain architectures classified by the SCOP database. Visual inspection of the interfaces is possible via an interactive protein viewer. Furthermore, large scale analyses are supported by an implemented sequential and by a structural clustering. Similar interfaces as well as non-redundant interfaces can be easily picked out. Additionally, the sequential conservation of binding sites was also included in the database and is retrievable via Jmol. A comprehensive download section allows the composition of representative data sets with user defined parameters. The huge data set in combination with various search options allow a comprehensive view on all interfaces between macromolecules included in the Protein Data Bank (PDB). The download of the data sets supports numerous further investigations in macromolecular recognition. JAIL is publicly available at http://bioinformatics.charite.de/jail.
Martinek, Radek; Kelnar, Michal; Koudelka, Petr; Vanus, Jan; Bilik, Petr; Janku, Petr; Nazeran, Homer; Zidek, Jan
2016-02-01
This paper describes the design, construction, and testing of a multi-channel fetal electrocardiogram (fECG) signal generator based on LabVIEW. Special attention is paid to the fetal heart development in relation to the fetus' anatomy, physiology, and pathology. The non-invasive signal generator enables many parameters to be set, including fetal heart rate (FHR), maternal heart rate (MHR), gestational age (GA), fECG interferences (biological and technical artifacts), as well as other fECG signal characteristics. Furthermore, based on the change in the FHR and in the T wave-to-QRS complex ratio (T/QRS), the generator enables manifestations of hypoxic states (hypoxemia, hypoxia, and asphyxia) to be monitored while complying with clinical recommendations for classifications in cardiotocography (CTG) and fECG ST segment analysis (STAN). The generator can also produce synthetic signals with defined properties for 6 input leads (4 abdominal and 2 thoracic). Such signals are well suited to the testing of new and existing methods of fECG processing and are effective in suppressing maternal ECG while non-invasively monitoring abdominal fECG. They may also contribute to the development of a new diagnostic method, which may be referred to as non-invasive trans-abdominal CTG + STAN. The functional prototype is based on virtual instrumentation using the LabVIEW developmental environment and its associated data acquisition measurement cards (DAQmx). The generator also makes it possible to create synthetic signals and measure actual fetal and maternal ECGs by means of bioelectrodes.
Performance characterization of a single bi-axial scanning MEMS mirror-based head-worn display
NASA Astrophysics Data System (ADS)
Liang, Minhua
2002-06-01
The NomadTM Personal Display System is a head-worn display (HWD) with a see-through, high-resolution, high-luminance display capability. It is based on a single bi-axial scanning MEMS mirror. In the Nomad HWD system, a red laser diode emits a beam of light that is scanned bi-axially by a single MEMS mirror. A diffractive beam diffuser and an ocular expand the beam to form a 12mm exit pupil for comfortable viewing. The Nomad display has an SVGA (800x600) resolution, 60Hz frame rate, 23-degree horizontal field of view (FOV) and 3:4 vertical to horizontal aspect ratio, a luminance of 800~900 foot-Lamberts, see-through capability, 30mm eye-relief distance, and 1-foot to infinity focusing adjustment. We have characterized the performance parameters, such as field of view, distortion, contrast ratio (4x4 black and white checker board), modulation depth, exit pupil size, eye relief distance, maximum luminance, dynamic range ratio (full-on-to-full-off ratio), dimming ratio, and luminance uniformity at image plane. The Class-1 eye-safety requirements per IEC 60825-1 Amendment 2 (CDRH Laser Notice No. 50) are analyzed and verified by experiments. The paper describes all of the testing methods and set-ups as well as the representative test results. The test results demonstrate that the Nomad display is an eye-safe display product with good image quality and good user ergonomics.
Securing information display by use of visual cryptography.
Yamamoto, Hirotsugu; Hayasaki, Yoshio; Nishida, Nobuo
2003-09-01
We propose a secure display technique based on visual cryptography. The proposed technique ensures the security of visual information. The display employs a decoding mask based on visual cryptography. Without the decoding mask, the displayed information cannot be viewed. The viewing zone is limited by the decoding mask so that only one person can view the information. We have developed a set of encryption codes to maintain the designed viewing zone and have demonstrated a display that provides a limited viewing zone.
Evaluation of a Passive Nature Viewing Program Set to Music.
Cadman, Sally J
2014-09-01
Research has revealed that passive nature viewing (viewing nature scenes without actually being in nature) has many health benefits but little is known about the best method of offering this complementary modality. The purpose of this pilot program was to evaluate the impact of a passive nature viewing program set to music on stress reduction in adults living in the community. A pre- and postsurvey design along with weekly recordings of stress and relaxation levels were used to evaluate the effect of this passive nature viewing program on stress reduction. Participants watched one of three preselected nature scenes for 5 minutes a day over 1 month and rated their stress and relaxation levels weekly on a 100-mm Visual Analogue Scale before and after viewing the nature DVD. Quantitative analysis were not performed because of the less number of subjects (n = 10) completing the study. Qualitative analysis found five key categories that have an impact on program use: (a) technology, (b) personal preferences, (c) time, (d) immersion, and (e) use of the program. Holistic nurses may consider integrating patient preferences and immersion strategies in the design of future passive nature viewing programs to reduce attrition and improve success. © The Author(s) 2013.
Overall view of test set-up in bldg 13 at JSC during docking set-up tests
1974-08-04
S74-27049 (4 Aug. 1974) --- Overall view of test set-up in Building 23 at the Johnson Space Center during testing of the docking mechanisms for the joint U.S.-USSR Apollo-Soyuz Test Project. The cinematic check was being made when this picture was taken. The test control room is on the right. The Soviet-developed docking system is atop the USA-NASA developed docking system. Both American and Soviet engineers can be seen taking part in the docking testing. The ASTP docking mission in Earth orbit is scheduled for July 1975.
Raspberry Pi: a 35-dollar device for viewing DICOM images.
Paiva, Omir Antunes; Moreira, Renata de Oliveira
2014-01-01
Raspberry Pi is a low-cost computer created with educational purposes. It uses Linux and, most of times, freeware applications, particularly a software for viewing DICOM images. With an external monitor, the supported resolution (1920 × 1200 pixels) allows for the set up of simple viewing workstations at a reduced cost.
Raspberry Pi: a 35-dollar device for viewing DICOM images*
Paiva, Omir Antunes; Moreira, Renata de Oliveira
2014-01-01
Raspberry Pi is a low-cost computer created with educational purposes. It uses Linux and, most of times, freeware applications, particularly a software for viewing DICOM images. With an external monitor, the supported resolution (1920 × 1200 pixels) allows for the set up of simple viewing workstations at a reduced cost. PMID:25741057
Design of 4D x-ray tomography experiments for reconstruction using regularized iterative algorithms
NASA Astrophysics Data System (ADS)
Mohan, K. Aditya
2017-10-01
4D X-ray computed tomography (4D-XCT) is widely used to perform non-destructive characterization of time varying physical processes in various materials. The conventional approach to improving temporal resolution in 4D-XCT involves the development of expensive and complex instrumentation that acquire data faster with reduced noise. It is customary to acquire data with many tomographic views at a high signal to noise ratio. Instead, temporal resolution can be improved using regularized iterative algorithms that are less sensitive to noise and limited views. These algorithms benefit from optimization of other parameters such as the view sampling strategy while improving temporal resolution by reducing the total number of views or the detector exposure time. This paper presents the design principles of 4D-XCT experiments when using regularized iterative algorithms derived using the framework of model-based reconstruction. A strategy for performing 4D-XCT experiments is presented that allows for improving the temporal resolution by progressively reducing the number of views or the detector exposure time. Theoretical analysis of the effect of the data acquisition parameters on the detector signal to noise ratio, spatial reconstruction resolution, and temporal reconstruction resolution is also presented in this paper.
Display gamma is an important factor in Web image viewing
NASA Astrophysics Data System (ADS)
Zhang, Xuemei; Lavin, Yingmei; Silverstein, D. Amnon
2001-06-01
We conducted a perceptual image preference experiment over the web to find our (1) if typical computer users have significant variations in their display gamma settings, and (2) if so, do the gamma settings have significant perceptual effect on the appearance of images in their web browsers. The digital image renderings used were found to have preferred tone characteristics from a previous lab- controlled experiment. They were rendered with 4 different gamma settings. The subjects were asked to view the images over the web, with their own computer equipment and web browsers. The subjects werewe asked to view the images over the web, with their own computer equipment and web browsers. The subjects made pair-wise subjective preference judgements on which rendering they liked bets for each image. Each subject's display gamma setting was estimated using a 'gamma estimator' tool, implemented as a Java applet. The results indicated that (1) the user's gamma settings, as estimated in the experiment, span a wide range from about 1.8 to about 3.0; (2) the subjects preferred images that werewe rendered with a 'correct' gamma value matching their display setting. Subjects disliked images rendered with a gamma value not matching their displays'. This indicates that display gamma estimation is a perceptually significant factor in web image optimization.
1. Distant view, showing bridge in context with agricultural (pastures ...
1. Distant view, showing bridge in context with agricultural (pastures and cornfields) setting; looking southeast. - Eureka Bridge, Spanning Yellow River (Moved to City Park, Castalia), Frankville, Winneshiek County, IA
Whose point-of-view is it anyway?
NASA Astrophysics Data System (ADS)
Garvey, Gregory P.
2011-03-01
Shared virtual worlds such as Second Life privilege a single point-of-view, namely that of the user. When logged into Second Life a user sees the virtual world from a default viewpoint, which is from slightly above and behind the user's avatar (the user's alter ego 'in-world.') This point-of-view is as if the user were viewing his or her avatar using a camera floating a few feet behind it. In fact it is possible to set the view to as if you were seeing the world through the eyes of your avatar or you can even move the camera completely independent of your avatar. A change in point-of-view, means, more than just a different camera point-of-view. The practice of using multiple avatars requires a transformation of identity and personality. When a user 'enacts' the identity of a particular avatar, their 'real' personality is masked by the assumed personality. The technology of virtual worlds permits both a change of point-of -view and also facilitates a change in identity. Does this cause any psychological distress? Or is the ability to be someone else and see a world (a game, a virtual world) through a different set of eyes somehow liberating and even beneficial?
Multifacet structure of observed reconstructed integral images.
Martínez-Corral, Manuel; Javidi, Bahram; Martínez-Cuenca, Raúl; Saavedra, Genaro
2005-04-01
Three-dimensional images generated by an integral imaging system suffer from degradations in the form of grid of multiple facets. This multifacet structure breaks the continuity of the observed image and therefore reduces its visual quality. We perform an analysis of this effect and present the guidelines in the design of lenslet imaging parameters for optimization of viewing conditions with respect to the multifacet degradation. We consider the optimization of the system in terms of field of view, observer position and pupil function, lenslet parameters, and type of reconstruction. Numerical tests are presented to verify the theoretical analysis.
Investigation of designated eye position and viewing zone for a two-view autostereoscopic display.
Huang, Kuo-Chung; Chou, Yi-Heng; Lin, Lang-chin; Lin, Hoang Yan; Chen, Fu-Hao; Liao, Ching-Chiu; Chen, Yi-Han; Lee, Kuen; Hsu, Wan-Hsuan
2014-02-24
Designated eye position (DEP) and viewing zone (VZ) are important optical parameters for designing a two-view autostereoscopic display. Although much research has been done to date, little empirical evidence has been found to establish a direct relationship between design and measurement. More rigorous studies and verifications to investigate DEP and to ascertain the VZ criterion will be valuable. We propose evaluation metrics based on equivalent luminance (EL) and binocular luminance (BL) to figure out DEP and VZ for a two-view autostereoscopic display. Simulation and experimental results prove that our proposed evaluation metrics can be used to find the DEP and VZ accurately.
Military display market segment: avionics (Invited Paper)
NASA Astrophysics Data System (ADS)
Desjardins, Daniel D.; Hopper, Darrel G.
2005-05-01
The military display market is analyzed in terms of one of its segments: avionics. Requirements are summarized for 13 technology-driving parameters for direct-view and virtual-view displays in cockpits and cabins. Technical specifications are discussed for selected programs. Avionics stresses available technology and usually requires custom display designs.
Jig For Stereoscopic Photography
NASA Technical Reports Server (NTRS)
Nielsen, David J.
1990-01-01
Separations between views adjusted precisely for best results. Simple jig adjusted to set precisely, distance between right and left positions of camera used to make stereoscopic photographs. Camera slides in slot between extreme positions, where it takes stereoscopic pictures. Distance between extreme positions set reproducibly with micrometer. In view of trend toward very-large-scale integration of electronic circuits, training method and jig used to make training photographs useful to many companies to reduce cost of training manufacturing personnel.
Visual space under free viewing conditions.
Doumen, Michelle J A; Kappers, Astrid M L; Koenderink, Jan J
2005-10-01
Most research on visual space has been done under restricted viewing conditions and in reduced environments. In our experiments, observers performed an exocentric pointing task, a collinearity task, and a parallelity task in a entirely visible room. We varied the relative distances between the objects and the observer and the separation angle between the two objects. We were able to compare our data directly with data from experiments in an environment with less monocular depth information present. We expected that in a richer environment and under less restrictive viewing conditions, the settings would deviate less from the veridical settings. However, large systematic deviations from veridical settings were found for all three tasks. The structure of these deviations was task dependent, and the structure and the deviations themselves were comparable to those obtained under more restricted circumstances. Thus, the additional information was not used effectively by the observers.
Effect of structured visual environments on apparent eye level.
Stoper, A E; Cohen, M M
1989-11-01
Each of 12 subjects set a binocularly viewed target to apparent eye level; the target was projected on the rear wall of an open box, the floor of which was horizontal or pitched up and down at angles of 7.5 degrees and 15 degrees. Settings of the target were systematically biased by 60% of the pitch angle when the interior of the box was illuminated, but by only 5% when the interior of the box was darkened. Within-subjects variability of the settings was less under illuminated viewing conditions than in the dark, but was independent of box pitch angle. In a second experiment, 11 subjects were tested with an illuminated pitched box, yielding biases of 53% and 49% for binocular and monocular viewing conditions, respectively. The results are discussed in terms of individual and interactive effects of optical, gravitational, and extraretinal eye-position information in determining judgements of eye level.
Farrington, Conor; Burt, Jenni; Boiko, Olga; Campbell, John; Roland, Martin
2017-06-01
Patient experience surveys are increasingly important in the measurement of, and attempts to improve, health-care quality. To date, little research has focused upon doctors' attitudes to surveys which give them personalized feedback. This paper explores doctors' perceptions of patient experience surveys in primary and secondary care settings in order to deepen understandings of how doctors view the plausibility of such surveys. We conducted a qualitative study with doctors in two regions of England, involving in-depth semi-structured interviews with doctors working in primary care (n = 21) and secondary care (n = 20) settings. The doctors in both settings had recently received individualized feedback from patient experience surveys. Doctors in both settings express strong personal commitments to incorporating patient feedback in quality improvement efforts. However, they also concurrently express strong negative views about the credibility of survey findings and patients' motivations and competence in providing feedback. Thus, individual doctors demonstrate contradictory views regarding the plausibility of patient surveys, leading to complex, varied and on balance negative engagements with patient feedback. Doctors' contradictory views towards patient experience surveys are likely to limit the impact of such surveys in quality improvement initiatives in primary and secondary care. We highlight the need for 'sensegiving' initiatives (i.e. attempts to influence perceptions by communicating particular ideas, narratives and visions) to engage with doctors regarding the plausibility of patient experience surveys. This study highlights the importance of engaging with doctors' views about patient experience surveys when developing quality improvement initiatives. © 2016 The Authors. Health Expectations Published by John Wiley & Sons Ltd.
Classical topological paramagnetism
NASA Astrophysics Data System (ADS)
Bondesan, R.; Ringel, Z.
2017-05-01
Topological phases of matter are one of the hallmarks of quantum condensed matter physics. One of their striking features is a bulk-boundary correspondence wherein the topological nature of the bulk manifests itself on boundaries via exotic massless phases. In classical wave phenomena, analogous effects may arise; however, these cannot be viewed as equilibrium phases of matter. Here, we identify a set of rules under which robust equilibrium classical topological phenomena exist. We write simple and analytically tractable classical lattice models of spins and rotors in two and three dimensions which, at suitable parameter ranges, are paramagnetic in the bulk but nonetheless exhibit some unusual long-range or critical order on their boundaries. We point out the role of simplicial cohomology as a means of classifying, writing, and analyzing such models. This opens an experimental route for studying strongly interacting topological phases of spins.
Renormalization of the inflationary perturbations revisited
NASA Astrophysics Data System (ADS)
Markkanen, Tommi
2018-05-01
In this work we clarify aspects of renormalization on curved backgrounds focussing on the potential ramifications on the amplitude of inflationary perturbations. We provide an alternate view of the often used adiabatic prescription by deriving a correspondence between the adiabatic subtraction terms and traditional renormalization. Specifically, we show how adiabatic subtraction can be expressed as a set of counter terms that are introduced by redefining the bare parameters of the action. Our representation of adiabatic subtraction then allows us to easily find other renormalization prescriptions differing only in the finite parts of the counter terms. As our main result, we present for quadratic inflation how one may consistently express the renormalization of the spectrum of perturbations from inflation as a redefinition of the bare cosmological constant and Planck mass such that the observable predictions coincide with the unrenormalized result.
The Evolution of Psychology as a Basic Bio-behavioral Science in Healthcare Education.
Carr, John E
2017-12-01
For over a century, researchers and educators have called for the integration of psychological science into medical school curricula, but such efforts have been impeded by barriers within medicine and psychology. In addressing these barriers, Psychology has re-examined its relationship to Medicine, incorporated psychological practices into health care, and redefined its parameters as a science. In response to interdisciplinary research into the mechanisms of bio-behavioral interaction, Psychology evolved from an ancillary social science to a bio-behavioral science that is fundamental to medicine and health care. However, in recent medical school curriculum innovations, psychological science is being reduced to a set of "clinical skills," and once again viewed as an ancillary social science. These developments warrant concern and consideration of new approaches to integrating psychological science in medical education.
Self-consistent Hartree-Fock RPA calculations in 208Pb
NASA Astrophysics Data System (ADS)
Taqi, Ali H.; Ali, Mohammed S.
2018-01-01
The nuclear structure of 208Pb is studied in the framework of the self-consistent random phase approximation (SCRPA). The Hartree-Fock mean field and single particle states are used to implement a completely SCRPA with Skyrme-type interactions. The Hamiltonian is diagonalised within a model space using five Skyrme parameter sets, namely LNS, SkI3, SkO, SkP and SLy4. In view of the huge number of the existing Skyrme-force parameterizations, the question remains which of them provide the best description of data. The approach attempts to accurately describe the structure of the spherical even-even nucleus 208Pb. To illustrate our approach, we compared the binding energy, charge density distribution, excitation energy levels scheme with the available experimental data. Moreover, we calculated isoscalar and isovector monopole, dipole, and quadrupole transition densities and strength functions.
Using Arden Syntax for the creation of a multi-patient surveillance dashboard.
Kraus, Stefan; Drescher, Caroline; Sedlmayr, Martin; Castellanos, Ixchel; Prokosch, Hans-Ulrich; Toddenroth, Dennis
2015-10-09
Most practically deployed Arden-Syntax-based clinical decision support (CDS) modules process data from individual patients. The specification of Arden Syntax, however, would in principle also support multi-patient CDS. The patient data management system (PDMS) at our local intensive care units does not natively support patient overviews from customizable CDS routines, but local physicians indicated a demand for multi-patient tabular overviews of important clinical parameters such as key laboratory measurements. As our PDMS installation provides Arden Syntax support, we set out to explore the capability of Arden Syntax for multi-patient CDS by implementing a prototypical dashboard for visualizing laboratory findings from patient sets. Our implementation leveraged the object data type, supported by later versions of Arden, which turned out to be serviceable for representing complex input data from several patients. For our prototype, we designed a modularized architecture that separates the definition of technical operations, in particular the control of the patient context, from the actual clinical knowledge. Individual Medical Logic Modules (MLMs) for processing single patient attributes could then be developed according to well-tried Arden Syntax conventions. We successfully implemented a working dashboard prototype entirely in Arden Syntax. The architecture consists of a controller MLM to handle the patient context, a presenter MLM to generate a dashboard view, and a set of traditional MLMs containing the clinical decision logic. Our prototype could be integrated into the graphical user interface of the local PDMS. We observed that with realistic input data the average execution time of about 200ms for generating dashboard views attained applicable performance. Our study demonstrated the general feasibility of creating multi-patient CDS routines in Arden Syntax. We believe that our prototypical dashboard also suggests that such implementations can be relatively easy, and may simultaneously hold promise for sharing dashboards between institutions and reusing elementary components for additional dashboards. Copyright © 2015 Elsevier B.V. All rights reserved.
Advanced engine management of individual cylinders for control of exhaust species
Graves, Ronald L [Knoxville, TN; West, Brian H [Knoxville, TN; Huff, Shean P [Knoxville, TN; Parks, II, James E
2008-12-30
A method and system controls engine-out exhaust species of a combustion engine having a plurality of cylinders. The method typically includes various combinations of steps such as controlling combustion parameters in individual cylinders, grouping the individual cylinders into a lean set and a rich set of one or more cylinders, combusting the lean set in a lean combustion parameter condition having a lean air:fuel equivalence ratio, combusting the rich set in a rich combustion parameter condition having a rich air:fuel equivalence ratio, and adjusting the lean set and the rich set of one or more cylinders to generate net-lean combustion. The exhaust species may have elevated concentrations of hydrogen and oxygen.
Waller, Niels G; Feuerstahler, Leah
2017-01-01
In this study, we explored item and person parameter recovery of the four-parameter model (4PM) in over 24,000 real, realistic, and idealized data sets. In the first analyses, we fit the 4PM and three alternative models to data from three Minnesota Multiphasic Personality Inventory-Adolescent form factor scales using Bayesian modal estimation (BME). Our results indicated that the 4PM fits these scales better than simpler item Response Theory (IRT) models. Next, using the parameter estimates from these real data analyses, we estimated 4PM item parameters in 6,000 realistic data sets to establish minimum sample size requirements for accurate item and person parameter recovery. Using a factorial design that crossed discrete levels of item parameters, sample size, and test length, we also fit the 4PM to an additional 18,000 idealized data sets to extend our parameter recovery findings. Our combined results demonstrated that 4PM item parameters and parameter functions (e.g., item response functions) can be accurately estimated using BME in moderate to large samples (N ⩾ 5, 000) and person parameters can be accurately estimated in smaller samples (N ⩾ 1, 000). In the supplemental files, we report annotated [Formula: see text] code that shows how to estimate 4PM item and person parameters in [Formula: see text] (Chalmers, 2012 ).
Mshana, Simon; Shemilu, Haji; Ndawi, Benedict; Momburi, Roman; Olsen, Oystein Evjen; Byskov, Jens; Martin, Douglas K
2007-01-01
Background Priority setting in every health system is complex and difficult. In less wealthy countries the dominant approach to priority setting has been Burden of Disease (BOD) and cost-effectiveness analysis (CEA), which is helpful, but insufficient because it focuses on a narrow range of values – need and efficiency – and not the full range of relevant values, including legitimacy and fairness. 'Accountability for reasonableness' is a conceptual framework for legitimate and fair priority setting and is empirically based and ethically justified. It connects priority setting to broader, more fundamental, democratic deliberative processes that have an impact on social justice and equity. Can 'accountability for reasonableness' be helpful for improving priority setting in less wealthy countries? Methods In 2005, Tanzanian scholars from the Primary Health Care Institute (PHCI) conducted 6 capacity building workshops with senior health staff, district planners and managers, and representatives of the Tanzanian Ministry of Health to discussion improving priority setting in Tanzania using 'accountability for reasonableness'. The purpose of this paper is to describe this initiative and the participants' views about the approach. Results The approach to improving priority setting using 'accountability for reasonableness' was viewed by district decision makers with enthusiastic favour because it was the first framework that directly addressed their priority setting concerns. High level Ministry of Health participants were also very supportive of the approach. Conclusion Both Tanzanian district and governmental health planners viewed the 'accountability for reasonableness' approach with enthusiastic favour because it was the first framework that directly addressed their concerns. PMID:17997824
NASA Astrophysics Data System (ADS)
Nasonova, O. N.; Gusev, Ye. M.; Kovalev, Ye. E.
2009-04-01
Global estimates of the components of terrestrial water balance depend on a technique of estimation and on the global observational data sets used for this purpose. Land surface modelling is an up-to-date and powerful tool for such estimates. However, the results of modelling are affected by the quality of both a model and input information (including meteorological forcing data and model parameters). The latter is based on available global data sets containing meteorological data, land-use information, and soil and vegetation characteristics. Now there are a lot of global data sets, which differ in spatial and temporal resolution, as well as in accuracy and reliability. Evidently, uncertainties in global data sets will influence the results of model simulations, but to which extent? The present work is an attempt to investigate this issue. The work is based on the land surface model SWAP (Soil Water - Atmosphere - Plants) and global 1-degree data sets on meteorological forcing data and the land surface parameters, provided within the framework of the Second Global Soil Wetness Project (GSWP-2). The 3-hourly near-surface meteorological data (for the period from 1 July 1982 to 31 December 1995) are based on reanalyses and gridded observational data used in the International Satellite Land-Surface Climatology Project (ISLSCP) Initiative II. Following the GSWP-2 strategy, we used a number of alternative global forcing data sets to perform different sensitivity experiments (with six alternative versions of precipitation, four versions of radiation, two pure reanalysis products and two fully hybridized products of meteorological data). To reveal the influence of model parameters on simulations, in addition to GSWP-2 parameter data sets, we produced two alternative global data sets with soil parameters on the basis of their relationships with the content of clay and sand in a soil. After this the sensitivity experiments with three different sets of parameters were performed. As a result, 16 variants of global annual estimates of water balance components were obtained. Application of alternative data sets on radiation, precipitation, and soil parameters allowed us to reveal the influence of uncertainties in input data on global estimates of water balance components.
11. View to southeast. More distant overview of bridge in ...
11. View to southeast. More distant overview of bridge in setting; downstream side. (135mm lens) - South Fork Trinity River Bridge, State Highway 299 spanning South Fork Trinity River, Salyer, Trinity County, CA
The Last of the Gem Elixir (Research).
ERIC Educational Resources Information Center
Otto, Wayne
1988-01-01
Draws a humorous analogy between the Harmonic Convergence ushering in the dawn of the New Age and the convergence of a whole set of different views of reading instruction with established views and practices. (RS)
Clinically relevant hypoglycemia prediction metrics for event mitigation.
Harvey, Rebecca A; Dassau, Eyal; Zisser, Howard C; Bevier, Wendy; Seborg, Dale E; Jovanovič, Lois; Doyle, Francis J
2012-08-01
The purpose of this study was to develop a method to compare hypoglycemia prediction algorithms and choose parameter settings for different applications, such as triggering insulin pump suspension or alerting for rescue carbohydrate treatment. Hypoglycemia prediction algorithms with different parameter settings were implemented on an ambulatory dataset containing 490 days from 30 subjects with type 1 diabetes mellitus using the Dexcom™ (San Diego, CA) SEVEN™ continuous glucose monitoring system. The performance was evaluated using a proposed set of metrics representing the true-positive ratio, false-positive rate, and distribution of warning times. A prospective, in silico study was performed to show the effect of using different parameter settings to prevent or rescue from hypoglycemia. The retrospective study results suggest the parameter settings for different methods of hypoglycemia mitigation. When rescue carbohydrates are used, a high true-positive ratio, a minimal false-positive rate, and alarms with short warning time are desired. These objectives were met with a 30-min prediction horizon and two successive flags required to alarm: 78% of events were detected with 3.0 false alarms/day and 66% probability of alarms occurring within 30 min of the event. This parameter setting selection was confirmed in silico: treating with rescue carbohydrates reduced the duration of hypoglycemia from 14.9% to 0.5%. However, for a different method, such as pump suspension, this parameter setting only reduced hypoglycemia to 8.7%, as can be expected by the low probability of alarming more than 30 min ahead. The proposed metrics allow direct comparison of hypoglycemia prediction algorithms and selection of parameter settings for different types of hypoglycemia mitigation, as shown in the prospective in silico study in which hypoglycemia was alerted or treated with rescue carbohydrates.
Moon view period tabulations (with station masking) for Manned Space Flight Network stations, book 1
NASA Technical Reports Server (NTRS)
Gattie, M. M.; Williams, R. L.
1970-01-01
The times during which MSFN stations can view the moon are tabulated. Station view periods for each month are given. All times and dates refer to Greenwich Mean Time. AOS and LOS refer to the center of the moon at zero degrees elevation for moon rise and set, respectively.
New fundamental parameters for attitude representation
NASA Astrophysics Data System (ADS)
Patera, Russell P.
2017-08-01
A new attitude parameter set is developed to clarify the geometry of combining finite rotations in a rotational sequence and in combining infinitesimal angular increments generated by angular rate. The resulting parameter set of six Pivot Parameters represents a rotation as a great circle arc on a unit sphere that can be located at any clocking location in the rotation plane. Two rotations are combined by linking their arcs at either of the two intersection points of the respective rotation planes. In a similar fashion, linking rotational increments produced by angular rate is used to derive the associated kinematical equations, which are linear and have no singularities. Included in this paper is the derivation of twelve Pivot Parameter elements that represent all twelve Euler Angle sequences, which enables efficient conversions between Pivot Parameters and any Euler Angle sequence. Applications of this new parameter set include the derivation of quaternions and the quaternion composition rule, as well as, the derivation of the analytical solution to time dependent coning motion. The relationships between Pivot Parameters and traditional parameter sets are included in this work. Pivot Parameters are well suited for a variety of aerospace applications due to their effective composition rule, singularity free kinematic equations, efficient conversion to and from Euler Angle sequences and clarity of their geometrical foundation.
Adaptive Local Realignment of Protein Sequences.
DeBlasio, Dan; Kececioglu, John
2018-06-11
While mutation rates can vary markedly over the residues of a protein, multiple sequence alignment tools typically use the same values for their scoring-function parameters across a protein's entire length. We present a new approach, called adaptive local realignment, that in contrast automatically adapts to the diversity of mutation rates along protein sequences. This builds upon a recent technique known as parameter advising, which finds global parameter settings for an aligner, to now adaptively find local settings. Our approach in essence identifies local regions with low estimated accuracy, constructs a set of candidate realignments using a carefully-chosen collection of parameter settings, and replaces the region if a realignment has higher estimated accuracy. This new method of local parameter advising, when combined with prior methods for global advising, boosts alignment accuracy as much as 26% over the best default setting on hard-to-align protein benchmarks, and by 6.4% over global advising alone. Adaptive local realignment has been implemented within the Opal aligner using the Facet accuracy estimator.
Be discs in coplanar circular binaries: Phase-locked variations of emission lines
NASA Astrophysics Data System (ADS)
Panoglou, Despina; Faes, Daniel M.; Carciofi, Alex C.; Okazaki, Atsuo T.; Baade, Dietrich; Rivinius, Thomas; Borges Fernandes, Marcelo
2018-01-01
In this paper, we present the first results of radiative transfer calculations on decretion discs of binary Be stars. A smoothed particle hydrodynamics code computes the structure of Be discs in coplanar circular binary systems for a range of orbital and disc parameters. The resulting disc configuration consists of two spiral arms, and this can be given as input into a Monte Carlo code, which calculates the radiative transfer along the line of sight for various observational coordinates. Making use of the property of steady disc structure in coplanar circular binaries, observables are computed as functions of the orbital phase. Some orbital-phase series of line profiles are given for selected parameter sets under various viewing angles, to allow comparison with observations. Flat-topped profiles with and without superimposed multiple structures are reproduced, showing, for example, that triple-peaked profiles do not have to be necessarily associated with warped discs and misaligned binaries. It is demonstrated that binary tidal effects give rise to phase-locked variability of the violet-to-red (V/R) ratio of hydrogen emission lines. The V/R ratio exhibits two maxima per cycle; in certain cases those maxima are equal, leading to a clear new V/R cycle every half orbital period. This study opens a way to identifying binaries and to constraining the parameters of binary systems that exhibit phase-locked variations induced by tidal interaction with a companion star.
Assessment of Multiresolution Segmentation for Extracting Greenhouses from WORLDVIEW-2 Imagery
NASA Astrophysics Data System (ADS)
Aguilar, M. A.; Aguilar, F. J.; García Lorca, A.; Guirado, E.; Betlej, M.; Cichon, P.; Nemmaoui, A.; Vallario, A.; Parente, C.
2016-06-01
The latest breed of very high resolution (VHR) commercial satellites opens new possibilities for cartographic and remote sensing applications. In this way, object based image analysis (OBIA) approach has been proved as the best option when working with VHR satellite imagery. OBIA considers spectral, geometric, textural and topological attributes associated with meaningful image objects. Thus, the first step of OBIA, referred to as segmentation, is to delineate objects of interest. Determination of an optimal segmentation is crucial for a good performance of the second stage in OBIA, the classification process. The main goal of this work is to assess the multiresolution segmentation algorithm provided by eCognition software for delineating greenhouses from WorldView- 2 multispectral orthoimages. Specifically, the focus is on finding the optimal parameters of the multiresolution segmentation approach (i.e., Scale, Shape and Compactness) for plastic greenhouses. The optimum Scale parameter estimation was based on the idea of local variance of object heterogeneity within a scene (ESP2 tool). Moreover, different segmentation results were attained by using different combinations of Shape and Compactness values. Assessment of segmentation quality based on the discrepancy between reference polygons and corresponding image segments was carried out to identify the optimal setting of multiresolution segmentation parameters. Three discrepancy indices were used: Potential Segmentation Error (PSE), Number-of-Segments Ratio (NSR) and Euclidean Distance 2 (ED2).
Robust camera calibration for sport videos using court models
NASA Astrophysics Data System (ADS)
Farin, Dirk; Krabbe, Susanne; de With, Peter H. N.; Effelsberg, Wolfgang
2003-12-01
We propose an automatic camera calibration algorithm for court sports. The obtained camera calibration parameters are required for applications that need to convert positions in the video frame to real-world coordinates or vice versa. Our algorithm uses a model of the arrangement of court lines for calibration. Since the court model can be specified by the user, the algorithm can be applied to a variety of different sports. The algorithm starts with a model initialization step which locates the court in the image without any user assistance or a-priori knowledge about the most probable position. Image pixels are classified as court line pixels if they pass several tests including color and local texture constraints. A Hough transform is applied to extract line elements, forming a set of court line candidates. The subsequent combinatorial search establishes correspondences between lines in the input image and lines from the court model. For the succeeding input frames, an abbreviated calibration algorithm is used, which predicts the camera parameters for the new image and optimizes the parameters using a gradient-descent algorithm. We have conducted experiments on a variety of sport videos (tennis, volleyball, and goal area sequences of soccer games). Video scenes with considerable difficulties were selected to test the robustness of the algorithm. Results show that the algorithm is very robust to occlusions, partial court views, bad lighting conditions, or shadows.
NASA GIBS & Worldview - Lesson Ready Visualizations
NASA Astrophysics Data System (ADS)
Cechini, M. F.; Boller, R. A.; Baynes, K.; Gunnoe, T.; Wong, M. M.; Schmaltz, J. E.; De Luca, A. P.; King, J.; Roberts, J. T.; Rodriguez, J.; Thompson, C. K.; Alarcon, C.; De Cesare, C.; Pressley, N. N.
2016-12-01
For more than 20 years, the NASA Earth Observing System (EOS) has operated dozens of remote sensing satellites collecting 14 Petabytes of data that span thousands of science parameters. Within these observations are keys the Earth Scientists have used to unlock many things that we understand about our planet. Also contained within these observations are a myriad of opportunities for learning and education. The trick is making them accessible to educators and students in convenient and simple ways so that effort can be spent on lesson enrichment and not overcoming technical hurdles. The NASA Global Imagery Browse Services (GIBS) system and NASA Worldview website provide a unique view into EOS data through daily full resolution visualizations of hundreds of earth science parameters. For many of these parameters, visualizations are available within hours of acquisition from the satellite. For others, visualizations are available for the entire mission of the satellite. Accompanying the visualizations are visual aids such as color legends, place names, and orbit tracks. By using these visualizations, educators and students can observe natural phenomena that enrich a scientific education. This presentation will provide an overview of the visualizations available in NASA GIBS and Worldview and how they are accessed. We will also provide real-world examples of how the visualizations have been used in educational settings including planetariums, visitor centers, hack-a-thons, and public organizations.
N-mixture models for estimating population size from spatially replicated counts
Royle, J. Andrew
2004-01-01
Spatial replication is a common theme in count surveys of animals. Such surveys often generate sparse count data from which it is difficult to estimate population size while formally accounting for detection probability. In this article, i describe a class of models (n-mixture models) which allow for estimation of population size from such data. The key idea is to view site-specific population sizes, n, as independent random variables distributed according to some mixing distribution (e.g., Poisson). Prior parameters are estimated from the marginal likelihood of the data, having integrated over the prior distribution for n. Carroll and lombard (1985, journal of american statistical association 80, 423-426) proposed a class of estimators based on mixing over a prior distribution for detection probability. Their estimator can be applied in limited settings, but is sensitive to prior parameter values that are fixed a priori. Spatial replication provides additional information regarding the parameters of the prior distribution on n that is exploited by the n-mixture models and which leads to reasonable estimates of abundance from sparse data. A simulation study demonstrates superior operating characteristics (bias, confidence interval coverage) of the n-mixture estimator compared to the caroll and lombard estimator. Both estimators are applied to point count data on six species of birds illustrating the sensitivity to choice of prior on p and substantially different estimates of abundance as a consequence.
50. VIEW LOOKING SOUTHEAST AT A MOTORGENERATOR SET LOCATED UNDER ...
50. VIEW LOOKING SOUTHEAST AT A MOTOR-GENERATOR SET LOCATED UNDER CONTROL ROOM. THREE 450 kva., 2500 VOLT, 60 CYCLE MOTOR-GENERATOR UNITS PROVIDED POWER FOR THE RAILROAD SIGNAL SYSTEM. 25 CYCLE POWER WAS PROVIDED TO THE MOTOR (LEFT BACKGROUND). THE MOTOR TURNED THE GENERATOR (CENTER FOREGROUND) WHICH PRODUCED 60 CYCLE POWER TO OPERATE LIGHTS AND SIGNALING DEVICES. - New York, New Haven & Hartford Railroad, Cos Cob Power Plant, Sound Shore Drive, Greenwich, Fairfield County, CT
van Exel, Job; Baker, Rachel; Mason, Helen; Donaldson, Cam; Brouwer, Werner
2015-02-01
Resources available to the health care sector are finite and typically insufficient to fulfil all the demands for health care in the population. Decisions must be made about which treatments to provide. Relatively little is known about the views of the general public regarding the principles that should guide such decisions. We present the findings of a Q methodology study designed to elicit the shared views in the general public across ten countries regarding the appropriate principles for prioritising health care resources. In 2010, 294 respondents rank ordered a set of cards and the results of these were subject to by-person factor analysis to identify common patterns in sorting. Five distinct viewpoints were identified, (I) "Egalitarianism, entitlement and equality of access"; (II) "Severity and the magnitude of health gains"; (III) "Fair innings, young people and maximising health benefits"; (IV) "The intrinsic value of life and healthy living"; (V) "Quality of life is more important than simply staying alive". Given the plurality of views on the principles for health care priority setting, no single equity principle can be used to underpin health care priority setting. Hence, the process of decision making becomes more important, in which, arguably, these multiple perspectives in society should be somehow reflected. Copyright © 2014 Elsevier Ltd. All rights reserved.
The Dynamics of Phonological Planning
ERIC Educational Resources Information Center
Roon, Kevin D.
2013-01-01
This dissertation proposes a dynamical computational model of the timecourse of phonological parameter setting. In the model, phonological representations embrace phonetic detail, with phonetic parameters represented as activation fields that evolve over time and determine the specific parameter settings of a planned utterance. Existing models of…
NASA Astrophysics Data System (ADS)
Sharma, Sanjib; Stello, Dennis; Buder, Sven; Kos, Janez; Bland-Hawthorn, Joss; Asplund, Martin; Duong, Ly; Lin, Jane; Lind, Karin; Ness, Melissa; Huber, Daniel; Zwitter, Tomaz; Traven, Gregor; Hon, Marc; Kafle, Prajwal R.; Khanna, Shourya; Saddon, Hafiz; Anguiano, Borja; Casey, Andrew R.; Freeman, Ken; Martell, Sarah; De Silva, Gayandhi M.; Simpson, Jeffrey D.; Wittenmyer, Rob A.; Zucker, Daniel B.
2018-01-01
The Transiting Exoplanet Survey Satellite (TESS) will provide high-precision time series photometry for millions of stars with at least a half-hour cadence. Of particular interest are the circular regions of 12° radius centred around the ecliptic poles that will be observed continuously for a full year. Spectroscopic stellar parameters are desirable to characterize and select suitable targets for TESS, whether they are focused on exploring exoplanets, stellar astrophysics or Galactic archaeology. Here, we present spectroscopic stellar parameters (Teff, log g, [Fe/H], v sin i, vmicro) for about 16 000 dwarf and subgiant stars in TESS' southern continuous viewing zone. For almost all the stars, we also present Bayesian estimates of stellar properties including distance, extinction, mass, radius and age using theoretical isochrones. Stellar surface gravity and radius are made available for an additional set of roughly 8500 red giants. All our target stars are in the range 10 < V < 13.1. Among them, we identify and list 227 stars belonging to the Large Magellanic Cloud. The data were taken using the High Efficiency and Resolution Multi-Element Spectrograph (HERMES; R ∼ 28 000) at the Anglo-Australian Telescope as part of the TESS-HERMES survey. Comparing our results with the TESS Input Catalogue (TIC) shows that the TIC is generally efficient in separating dwarfs and giants, but it has flagged more than 100 cool dwarfs (Teff < 4800 K) as giants, which ought to be high-priority targets for the exoplanet search. The catalogue can be accessed via http://www.physics.usyd.edu.au/tess-hermes/, or at Mikulski Archive for Space Telescopes (MAST).
Arabi, Hossein; Kamali Asl, Ali Reza; Ay, Mohammad Reza; Zaidi, Habib
2015-07-01
The purpose of this work is to evaluate the impact of optimization of magnification on performance parameters of the variable resolution X-ray (VRX) CT scanner. A realistic model based on an actual VRX CT scanner was implemented in the GATE Monte Carlo simulation platform. To evaluate the influence of system magnification, spatial resolution, field-of-view (FOV) and scatter-to-primary ratio of the scanner were estimated for both fixed and optimum object magnification at each detector rotation angle. Comparison and inference between these performance parameters were performed angle by angle to determine appropriate object position at each opening half angle. Optimization of magnification resulted in a trade-off between spatial resolution and FOV of the scanner at opening half angles of 90°-12°, where the spatial resolution increased up to 50% and the scatter-to-primary ratio decreased from 4.8% to 3.8% at a detector angle of about 90° for the same FOV and X-ray energy spectrum. The disadvantage of magnification optimization at these angles is the significant reduction of the FOV (up to 50%). Moreover, magnification optimization was definitely beneficial for opening half angles below 12° improving the spatial resolution from 7.5 cy/mm to 20 cy/mm. Meanwhile, the FOV increased by more than 50% at these angles. It can be concluded that optimization of magnification is essential for opening half angles below 12°. For opening half angles between 90° and 12°, the VRX CT scanner magnification should be set according to the desired spatial resolution and FOV. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
CAUSAL INFERENCE WITH A GRAPHICAL HIERARCHY OF INTERVENTIONS
Shpitser, Ilya; Tchetgen, Eric Tchetgen
2017-01-01
Identifying causal parameters from observational data is fraught with subtleties due to the issues of selection bias and confounding. In addition, more complex questions of interest, such as effects of treatment on the treated and mediated effects may not always be identified even in data where treatment assignment is known and under investigator control, or may be identified under one causal model but not another. Increasingly complex effects of interest, coupled with a diversity of causal models in use resulted in a fragmented view of identification. This fragmentation makes it unnecessarily difficult to determine if a given parameter is identified (and in what model), and what assumptions must hold for this to be the case. This, in turn, complicates the development of estimation theory and sensitivity analysis procedures. In this paper, we give a unifying view of a large class of causal effects of interest, including novel effects not previously considered, in terms of a hierarchy of interventions, and show that identification theory for this large class reduces to an identification theory of random variables under interventions from this hierarchy. Moreover, we show that one type of intervention in the hierarchy is naturally associated with queries identified under the Finest Fully Randomized Causally Interpretable Structure Tree Graph (FFRCISTG) model of Robins (via the extended g-formula), and another is naturally associated with queries identified under the Non-Parametric Structural Equation Model with Independent Errors (NPSEM-IE) of Pearl, via a more general functional we call the edge g-formula. Our results motivate the study of estimation theory for the edge g-formula, since we show it arises both in mediation analysis, and in settings where treatment assignment has unobserved causes, such as models associated with Pearl’s front-door criterion. PMID:28919652
Hidden from view: coupled dark sector physics and small scales
NASA Astrophysics Data System (ADS)
Elahi, Pascal J.; Lewis, Geraint F.; Power, Chris; Carlesi, Edoardo; Knebe, Alexander
2015-09-01
We study cluster mass dark matter (DM) haloes, their progenitors and surroundings in a coupled dark matter-dark energy (DE) model and compare it to quintessence and Λ cold dark matter (ΛCDM) models with adiabatic zoom simulations. When comparing cosmologies with different expansions histories, growth functions and power spectra, care must be taken to identify unambiguous signatures of alternative cosmologies. Shared cosmological parameters, such as σ8, need not be the same for optimal fits to observational data. We choose to set our parameters to ΛCDM z = 0 values. We find that in coupled models, where DM decays into DE, haloes appear remarkably similar to ΛCDM haloes despite DM experiencing an additional frictional force. Density profiles are not systematically different and the subhalo populations have similar mass, spin, and spatial distributions, although (sub)haloes are less concentrated on average in coupled cosmologies. However, given the scatter in related observables (V_max,R_{V_max}), this difference is unlikely to distinguish between coupled and uncoupled DM. Observations of satellites of Milky Way and M31 indicate a significant subpopulation reside in a plane. Coupled models do produce planar arrangements of satellites of higher statistical significance than ΛCDM models; however, in all models these planes are dynamically unstable. In general, the non-linear dynamics within and near large haloes masks the effects of a coupled dark sector. The sole environmental signature we find is that small haloes residing in the outskirts are more deficient in baryons than their ΛCDM counterparts. The lack of a pronounced signal for a coupled dark sector strongly suggests that such a phenomena would be effectively hidden from view.
CAUSAL INFERENCE WITH A GRAPHICAL HIERARCHY OF INTERVENTIONS.
Shpitser, Ilya; Tchetgen, Eric Tchetgen
2016-12-01
Identifying causal parameters from observational data is fraught with subtleties due to the issues of selection bias and confounding. In addition, more complex questions of interest, such as effects of treatment on the treated and mediated effects may not always be identified even in data where treatment assignment is known and under investigator control, or may be identified under one causal model but not another. Increasingly complex effects of interest, coupled with a diversity of causal models in use resulted in a fragmented view of identification. This fragmentation makes it unnecessarily difficult to determine if a given parameter is identified (and in what model), and what assumptions must hold for this to be the case. This, in turn, complicates the development of estimation theory and sensitivity analysis procedures. In this paper, we give a unifying view of a large class of causal effects of interest, including novel effects not previously considered, in terms of a hierarchy of interventions, and show that identification theory for this large class reduces to an identification theory of random variables under interventions from this hierarchy. Moreover, we show that one type of intervention in the hierarchy is naturally associated with queries identified under the Finest Fully Randomized Causally Interpretable Structure Tree Graph (FFRCISTG) model of Robins (via the extended g-formula), and another is naturally associated with queries identified under the Non-Parametric Structural Equation Model with Independent Errors (NPSEM-IE) of Pearl, via a more general functional we call the edge g-formula. Our results motivate the study of estimation theory for the edge g-formula, since we show it arises both in mediation analysis, and in settings where treatment assignment has unobserved causes, such as models associated with Pearl's front-door criterion.
The SeaView EarthCube project: Lessons Learned from Integrating Across Repositories
NASA Astrophysics Data System (ADS)
Diggs, S. C.; Stocks, K. I.; Arko, R. A.; Kinkade, D.; Shepherd, A.; Olson, C. J.; Pham, A.
2017-12-01
SeaView is an NSF-funded EarthCube Integrative Activity Project working with 5 existing data repositories* to provide oceanographers with highly integrated thematic data collections in user-requested formats. The project has three complementary goals: Supporting Scientists: SeaView targets scientists' need for easy access to data of interest that are ready to import into their preferred tool. Strengthening Repositories: By integrating data from multiple repositories for science use, SeaView is helping the ocean data repositories align their data and processes and make ocean data more accessible and easily integrated. Informing EarthCube (earthcube.org): SeaView's experience as an integration demonstration can inform the larger NSF EarthCube architecture and design effort. The challenges faced in this small-scale effort are informative to geosciences cyberinfrastructure more generally. Here we focus on the lessons learned that may inform other data facilities and integrative architecture projects. (The SeaView data collections will be presented at the Ocean Sciences 2018 meeting.) One example is the importance of shared semantics, with persistent identifiers, for key integration elements across the data sets (e.g. cruise, parameter, and project/program.) These must allow for revision through time and should have an agreed authority or process for resolving conflicts: aligning identifiers and correcting errors were time consuming and often required both deep domain knowledge and "back end" knowledge of the data facilities. Another example is the need for robust provenance, and tools that support automated or semi-automated data transform pipelines that capture provenance. Multiple copies and versions of data are now flowing into repositories, and onward to long-term archives such as NOAA NCEI and umbrella portals such as DataONE. Exact copies can be identified with hashes (for those that have the skills), but it can be painfully difficult to understand the processing or format changes that differentiates versions. As more sensors are deployed, and data re-use increases, this will only become more challenging. We will discuss these, and additional lessons learned, as well as invite discussion and solutions from others doing similar work. * BCO-DMO, CCHDO, OBIS, OOI, R2R
NASA Astrophysics Data System (ADS)
Zhang, Tianzhen; Wang, Xiumei; Gao, Xinbo
2018-04-01
Nowadays, several datasets are demonstrated by multi-view, which usually include shared and complementary information. Multi-view clustering methods integrate the information of multi-view to obtain better clustering results. Nonnegative matrix factorization has become an essential and popular tool in clustering methods because of its interpretation. However, existing nonnegative matrix factorization based multi-view clustering algorithms do not consider the disagreement between views and neglects the fact that different views will have different contributions to the data distribution. In this paper, we propose a new multi-view clustering method, named adaptive multi-view clustering based on nonnegative matrix factorization and pairwise co-regularization. The proposed algorithm can obtain the parts-based representation of multi-view data by nonnegative matrix factorization. Then, pairwise co-regularization is used to measure the disagreement between views. There is only one parameter to auto learning the weight values according to the contribution of each view to data distribution. Experimental results show that the proposed algorithm outperforms several state-of-the-arts algorithms for multi-view clustering.
Setting analyst: A practical harvest planning technique
Olivier R.M. Halleux; W. Dale Greene
2001-01-01
Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...
NASA Astrophysics Data System (ADS)
Kelleher, Christa; McGlynn, Brian; Wagener, Thorsten
2017-07-01
Distributed catchment models are widely used tools for predicting hydrologic behavior. While distributed models require many parameters to describe a system, they are expected to simulate behavior that is more consistent with observed processes. However, obtaining a single set of acceptable parameters can be problematic, as parameter equifinality often results in several behavioral
sets that fit observations (typically streamflow). In this study, we investigate the extent to which equifinality impacts a typical distributed modeling application. We outline a hierarchical approach to reduce the number of behavioral sets based on regional, observation-driven, and expert-knowledge-based constraints. For our application, we explore how each of these constraint classes reduced the number of behavioral
parameter sets and altered distributions of spatiotemporal simulations, simulating a well-studied headwater catchment, Stringer Creek, Montana, using the distributed hydrology-soil-vegetation model (DHSVM). As a demonstrative exercise, we investigated model performance across 10 000 parameter sets. Constraints on regional signatures, the hydrograph, and two internal measurements of snow water equivalent time series reduced the number of behavioral parameter sets but still left a small number with similar goodness of fit. This subset was ultimately further reduced by incorporating pattern expectations of groundwater table depth across the catchment. Our results suggest that utilizing a hierarchical approach based on regional datasets, observations, and expert knowledge to identify behavioral parameter sets can reduce equifinality and bolster more careful application and simulation of spatiotemporal processes via distributed modeling at the catchment scale.
HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.
Schroeder, M J Mark J; Perreault, Bill; Ewert, D L Daniel L; Koenig, S C Steven C
2004-07-01
A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ascii or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework toward which Good Laboratory Practice (GLP) compliance can be obtained. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study specific analysis files according to their specific needs. Copyright 2003 Elsevier Ltd.
Julia Sets in Parameter Spaces
NASA Astrophysics Data System (ADS)
Buff, X.; Henriksen, C.
Given a complex number λ of modulus 1, we show that the bifurcation locus of the one parameter family {fb(z)=λz+bz2+z3}b∈ contains quasi-conformal copies of the quadratic Julia set J(λz+z2). As a corollary, we show that when the Julia set J(λz+z2) is not locally connected (for example when z|-->λz+z2 has a Cremer point at 0), the bifurcation locus is not locally connected. To our knowledge, this is the first example of complex analytic parameter space of dimension 1, with connected but non-locally connected bifurcation locus. We also show that the set of complex numbers λ of modulus 1, for which at least one of the parameter rays has a non-trivial accumulation set, contains a dense Gδ subset of S1.
Parameter Estimation for Thurstone Choice Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vojnovic, Milan; Yun, Seyoung
We consider the estimation accuracy of individual strength parameters of a Thurstone choice model when each input observation consists of a choice of one item from a set of two or more items (so called top-1 lists). This model accommodates the well-known choice models such as the Luce choice model for comparison sets of two or more items and the Bradley-Terry model for pair comparisons. We provide a tight characterization of the mean squared error of the maximum likelihood parameter estimator. We also provide similar characterizations for parameter estimators defined by a rank-breaking method, which amounts to deducing one ormore » more pair comparisons from a comparison of two or more items, assuming independence of these pair comparisons, and maximizing a likelihood function derived under these assumptions. We also consider a related binary classification problem where each individual parameter takes value from a set of two possible values and the goal is to correctly classify all items within a prescribed classification error. The results of this paper shed light on how the parameter estimation accuracy depends on given Thurstone choice model and the structure of comparison sets. In particular, we found that for unbiased input comparison sets of a given cardinality, when in expectation each comparison set of given cardinality occurs the same number of times, for a broad class of Thurstone choice models, the mean squared error decreases with the cardinality of comparison sets, but only marginally according to a diminishing returns relation. On the other hand, we found that there exist Thurstone choice models for which the mean squared error of the maximum likelihood parameter estimator can decrease much faster with the cardinality of comparison sets. We report empirical evaluation of some claims and key parameters revealed by theory using both synthetic and real-world input data from some popular sport competitions and online labor platforms.« less
Real Image Visual Display System
1992-12-01
DTI-100M autostereoscopic display ......................... 15 8. Lenticular screen ........ ............................. 16 9. Lenticular screen...parameters and pixel position ................. 17 10. General viewing of the stereoscopic couple .................... 18 11. Viewing zones for lenticular ...involves using a lenticular screen for imaging. Lenticular screens are probably most familiar in the form of ŗ-D postcards" which 15 consist of an
NASA Astrophysics Data System (ADS)
Odijk, Dennis; Zhang, Baocheng; Khodabandeh, Amir; Odolinski, Robert; Teunissen, Peter J. G.
2016-01-01
The concept of integer ambiguity resolution-enabled Precise Point Positioning (PPP-RTK) relies on appropriate network information for the parameters that are common between the single-receiver user that applies and the network that provides this information. Most of the current methods for PPP-RTK are based on forming the ionosphere-free combination using dual-frequency Global Navigation Satellite System (GNSS) observations. These methods are therefore restrictive in the light of the development of new multi-frequency GNSS constellations, as well as from the point of view that the PPP-RTK user requires ionospheric corrections to obtain integer ambiguity resolution results based on short observation time spans. The method for PPP-RTK that is presented in this article does not have above limitations as it is based on the undifferenced, uncombined GNSS observation equations, thereby keeping all parameters in the model. Working with the undifferenced observation equations implies that the models are rank-deficient; not all parameters are unbiasedly estimable, but only combinations of them. By application of S-system theory the model is made of full rank by constraining a minimum set of parameters, or S-basis. The choice of this S-basis determines the estimability and the interpretation of the parameters that are transmitted to the PPP-RTK users. As this choice is not unique, one has to be very careful when comparing network solutions in different S-systems; in that case the S-transformation, which is provided by the S-system method, should be used to make the comparison. Knowing the estimability and interpretation of the parameters estimated by the network is shown to be crucial for a correct interpretation of the estimable PPP-RTK user parameters, among others the essential ambiguity parameters, which have the integer property which is clearly following from the interpretation of satellite phase biases from the network. The flexibility of the S-system method is furthermore demonstrated by the fact that all models in this article are derived in multi-epoch mode, allowing to incorporate dynamic model constraints on all or subsets of parameters.
What do we perceive from motion pictures? A computational account.
Cheong, Loong-Fah; Xiang, Xu
2007-06-01
Cinema viewed from a location other than a canonical viewing point (CVP) presents distortions to the viewer in both its static and its dynamic aspects. Past works have investigated mainly the static aspect of this problem and attempted to explain why viewers still seem to perceive the scene very well. The dynamic aspect of depth perception, which is known as structure from motion, and its possible distortion, have not been well investigated. We derive the dynamic depth cues perceived by the viewer and use the so-called isodistortion framework to understand its distortion. The result is that viewers seated at a reasonably central position experience a shift in the intrinsic parameters of their visual systems. Despite this shift, the key properties of the perceived depths remain largely the same, being determined in the main by the accuracy to which extrinsic motion parameters can be recovered. For a viewer seated at a noncentral position and watching the movie screen at a slant angle, the view is related to the view at the CVP by a homography, resulting in various aberrations such as noncentral projection.
Image Tiling for Profiling Large Objects
NASA Technical Reports Server (NTRS)
Venkataraman, Ajit; Schock, Harold; Mercer, Carolyn R.
1992-01-01
Three dimensional surface measurements of large objects arc required in a variety of industrial processes. The nature of these measurements is changing as optical instruments arc beginning to replace conventional contact probes scanned over the objects. A common characteristic of the optical surface profilers is the trade off between measurement accuracy and field of view. In order to measure a large object with high accuracy, multiple views arc required. An accurate transformation between the different views is needed to bring about their registration. In this paper, we demonstrate how the transformation parameters can be obtained precisely by choosing control points which lie in the overlapping regions of the images. A good starting point for the transformation parameters is obtained by having a knowledge of the scanner position. The selection of the control points arc independent of the object geometry. By successively recording multiple views and obtaining transformation with respect to a single coordinate system, a complete physical model of an object can be obtained. Since all data arc in the same coordinate system, it can thus be used for building automatic models for free form surfaces.
Santhana Kumar, V; Pandey, P K; Anand, Theivasigamani; Bhuvaneswari, G Rathi; Dhinakaran, A; Kumar, Saurav
2018-06-01
Biofloc technology was evaluated with a view to analyse utilization of nitrogenous waste from the effluent and to improve water quality and growth parameters of Penaeus vannamei in intensive culture system. The experiment was carried out in two different treatment outdoor earthen ponds of 0.12 ha, one supplemented with carbon source (molasses, wheat and sugar) for biofloc formation and other was feed based control pond with a stocking density of 60 animals m -2 in duplicate for 120 days. Water, sediment and P. vannamei were sampled at regular intervals from the both set of ponds for evaluating physico-chemical parameters, nitrogen content and growth parameters, respectively. A significant reduction in the concentration of total ammonia nitrogen (TAN) and nitrite (NO 2 -N) were found in the biofloc pond than that of control pond. A significant low level of nitrogen was recorded in the effluents of biofloc pond in comparison to the control. In biofloc system, a significantly elevated heterotrophic bacterial count along with reduction in total Vibrio count was noticed. A significant improvement in the feed conversion efficiency (FCR) and growth parameters of P. vannamei was noticed in the biofloc pond. Growth of P. vannamei in the biofloc pond showed positive allometric pattern with an increased survival. The microbial biomass grown in biofloc consumes toxic inorganic nitrogen and converts it into useful protein, making it available for the cultured shrimp. This improved FCR and reduced the discharge of nitrogenous waste into adjacent environment, making intensive shrimp farming an eco-friendly enterprise. Copyright © 2018 Elsevier Ltd. All rights reserved.
Analysis of the restricting factors of laser countermeasure active detection technology
NASA Astrophysics Data System (ADS)
Zhang, Yufa; Sun, Xiaoquan
2016-07-01
The detection effect of laser active detection system is affected by various kinds of factors. In view of the application requirement of laser active detection, the influence factors for laser active detection are analyzed. The mathematical model of cat eye target detection distance has been built, influence of the parameters of laser detection system and the environment on detection range and the detection efficiency are analyzed. Various parameters constraint detection performance is simulated. The results show that the discovery distance of laser active detection is affected by the laser divergence angle, the incident angle and the visibility of the atmosphere. For a given detection range, the laser divergence angle and the detection efficiency are mutually restricted. Therefore, in view of specific application environment, it is necessary to select appropriate laser detection parameters to achieve optimal detection effect.
Auto-tuning for NMR probe using LabVIEW
NASA Astrophysics Data System (ADS)
Quen, Carmen; Pham, Stephanie; Bernal, Oscar
2014-03-01
Typical manual NMR-tuning method is not suitable for broadband spectra spanning several megahertz linewidths. Among the main problems encountered during manual tuning are pulse-power reproducibility, baselines, and transmission line reflections, to name a few. We present a design of an auto-tuning system using graphic programming language, LabVIEW, to minimize these problems. The program uses a simplified model of the NMR probe conditions near perfect tuning to mimic the tuning process and predict the position of the capacitor shafts needed to achieve the desirable impedance. The tuning capacitors of the probe are controlled by stepper motors through a LabVIEW/computer interface. Our program calculates the effective capacitance needed to tune the probe and provides controlling parameters to advance the motors in the right direction. The impedance reading of a network analyzer can be used to correct the model parameters in real time for feedback control.
E-Learning: Students Input for Using Mobile Devices in Science Instructional Settings
ERIC Educational Resources Information Center
Yilmaz, Ozkan
2016-01-01
A variety of e-learning theories, models, and strategy have been developed to support educational settings. There are many factors for designing good instructional settings. This study set out to determine functionality of mobile devices, students who already have, and the student needs and views in relation to e-learning settings. The study…
1. GENERAL VIEW TO THE WEST OF THE EMAD FACILITY ...
1. GENERAL VIEW TO THE WEST OF THE E-MAD FACILITY AND THE SURROUNDING ENVIRONMENTAL AND TOPOGRAPHICAL SETTING. - Nevada Test Site, Engine Maintenance Assembly & Disassembly Facility, Area 25, Jackass Flats, Mercury, Nye County, NV
7. Contextual view to eastnortheast showing downstream (west) side of ...
7. Contextual view to east-northeast showing downstream (west) side of bridge in setting, depicting dense riparian nature of area. - Stanislaus River Bridge, Atchison, Topeka & Santa Fe Railway at Stanislaus River, Riverbank, Stanislaus County, CA
Geolocation Accuracy Evaluations of OrbView-3, EROS-A, and SPOT-5 Imagery
NASA Technical Reports Server (NTRS)
Bresnahan, Paul
2007-01-01
This viewgraph presentation evaluates absolute geolocation accuracy of OrbView-3, EROS-A, and SPOT-5 by comparing test imagery-derived ground coordinates to Ground Control Points using SOCET set photogrammetric software.
Optimization of injection molding process parameters for a plastic cell phone housing component
NASA Astrophysics Data System (ADS)
Rajalingam, Sokkalingam; Vasant, Pandian; Khe, Cheng Seong; Merican, Zulkifli; Oo, Zeya
2016-11-01
To produce thin-walled plastic items, injection molding process is one of the most widely used application tools. However, to set optimal process parameters is difficult as it may cause to produce faulty items on injected mold like shrinkage. This study aims at to determine such an optimum injection molding process parameters which can reduce the fault of shrinkage on a plastic cell phone cover items. Currently used setting of machines process produced shrinkage and mis-specified length and with dimensions below the limit. Thus, for identification of optimum process parameters, maintaining closer targeted length and width setting magnitudes with minimal variations, more experiments are needed. The mold temperature, injection pressure and screw rotation speed are used as process parameters in this research. For optimal molding process parameters the Response Surface Methods (RSM) is applied. The major contributing factors influencing the responses were identified from analysis of variance (ANOVA) technique. Through verification runs it was found that the shrinkage defect can be minimized with the optimal setting found by RSM.
Parameter calibration for synthesizing realistic-looking variability in offline handwriting
NASA Astrophysics Data System (ADS)
Cheng, Wen; Lopresti, Dan
2011-01-01
Motivated by the widely accepted principle that the more training data, the better a recognition system performs, we conducted experiments asking human subjects to do evaluate a mixture of real English handwritten text lines and text lines altered from existing handwriting with various distortion degrees. The idea of generating synthetic handwriting is based on a perturbation method by T. Varga and H. Bunke that distorts an entire text line. There are two purposes of our experiments. First, we want to calibrate distortion parameter settings for Varga and Bunke's perturbation model. Second, we intend to compare the effects of parameter settings on different writing styles: block, cursive and mixed. From the preliminary experimental results, we determined appropriate ranges for parameter amplitude, and found that parameter settings should be altered for different handwriting styles. With the proper parameter settings, it should be possible to generate large amount of training and testing data for building better off-line handwriting recognition systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crossno, Patricia J.; Gittinger, Jaxon; Hunt, Warren L.
Slycat™ is a web-based system for performing data analysis and visualization of potentially large quantities of remote, high-dimensional data. Slycat™ specializes in working with ensemble data. An ensemble is a group of related data sets, which typically consists of a set of simulation runs exploring the same problem space. An ensemble can be thought of as a set of samples within a multi-variate domain, where each sample is a vector whose value defines a point in high-dimensional space. To understand and describe the underlying problem being modeled in the simulations, ensemble analysis looks for shared behaviors and common features acrossmore » the group of runs. Additionally, ensemble analysis tries to quantify differences found in any members that deviate from the rest of the group. The Slycat™ system integrates data management, scalable analysis, and visualization. Results are viewed remotely on a user’s desktop via commodity web clients using a multi-tiered hierarchy of computation and data storage, as shown in Figure 1. Our goal is to operate on data as close to the source as possible, thereby reducing time and storage costs associated with data movement. Consequently, we are working to develop parallel analysis capabilities that operate on High Performance Computing (HPC) platforms, to explore approaches for reducing data size, and to implement strategies for staging computation across the Slycat™ hierarchy. Within Slycat™, data and visual analysis are organized around projects, which are shared by a project team. Project members are explicitly added, each with a designated set of permissions. Although users sign-in to access Slycat™, individual accounts are not maintained. Instead, authentication is used to determine project access. Within projects, Slycat™ models capture analysis results and enable data exploration through various visual representations. Although for scientists each simulation run is a model of real-world phenomena given certain conditions, we use the term model to refer to our modeling of the ensemble data, not the physics. Different model types often provide complementary perspectives on data features when analyzing the same data set. Each model visualizes data at several levels of abstraction, allowing the user to range from viewing the ensemble holistically to accessing numeric parameter values for a single run. Bookmarks provide a mechanism for sharing results, enabling interesting model states to be labeled and saved.« less
Hakim, Alex D.
2011-01-01
To record sleep, actigraph devices are worn on the wrist and record movements that can be used to estimate sleep parameters with specialized algorithms in computer software programs. With the recent establishment of a Current Procedural Terminology code for wrist actigraphy, this technology is being used increasingly in clinical settings as actigraphy has the advantage of providing objective information on sleep habits in the patient’s natural sleep environment. Actigraphy has been well validated for the estimation of nighttime sleep parameters across age groups, but the validity of the estimation of sleep-onset latency and daytime sleeping is limited. Clinical guidelines and research suggest that wrist actigraphy is particularly useful in the documentation of sleep patterns prior to a multiple sleep latency test, in the evaluation of circadian rhythm sleep disorders, to evaluate treatment outcomes, and as an adjunct to home monitoring of sleep-disordered breathing. Actigraphy has also been well studied in the evaluation of sleep in the context of depression and dementia. Although actigraphy should not be viewed as a substitute for clinical interviews, sleep diaries, or overnight polysomnography when indicated, it can provide useful information about sleep in the natural sleep environment and/or when extended monitoring is clinically indicated. PMID:21652563
Enhanced visual perception through tone mapping
NASA Astrophysics Data System (ADS)
Harrison, Andre; Mullins, Linda L.; Raglin, Adrienne; Etienne-Cummings, Ralph
2016-05-01
Tone mapping operators compress high dynamic range images to improve the picture quality on a digital display when the dynamic range of the display is lower than that of the image. However, tone mapping operators have been largely designed and evaluated based on the aesthetic quality of the resulting displayed image or how perceptually similar the compressed image appears relative to the original scene. They also often require per image tuning of parameters depending on the content of the image. In military operations, however, the amount of information that can be perceived is more important than the aesthetic quality of the image and any parameter adjustment needs to be as automated as possible regardless of the content of the image. We have conducted two studies to evaluate the perceivable detail of a set of tone mapping algorithms, and we apply our findings to develop and test an automated tone mapping algorithm that demonstrates a consistent improvement in the amount of perceived detail. An automated, and thereby predictable, tone mapping method enables a consistent presentation of perceivable features, can reduce the bandwidth required to transmit the imagery, and can improve the accessibility of the data by reducing the needed expertise of the analyst(s) viewing the imagery.
NASA Astrophysics Data System (ADS)
Takuma, Takehisa; Masugi, Masao
2009-03-01
This paper presents an approach to the assessment of IP-network traffic in terms of the time variation of self-similarity. To get a comprehensive view in analyzing the degree of long-range dependence (LRD) of IP-network traffic, we use a hierarchical clustering scheme, which provides a way to classify high-dimensional data with a tree-like structure. Also, in the LRD-based analysis, we employ detrended fluctuation analysis (DFA), which is applicable to the analysis of long-range power-law correlations or LRD in non-stationary time-series signals. Based on sequential measurements of IP-network traffic at two locations, this paper derives corresponding values for the LRD-related parameter α that reflects the degree of LRD of measured data. In performing the hierarchical clustering scheme, we use three parameters: the α value, average throughput, and the proportion of network traffic that exceeds 80% of network bandwidth for each measured data set. We visually confirm that the traffic data can be classified in accordance with the network traffic properties, resulting in that the combined depiction of the LRD and other factors can give us an effective assessment of network conditions at different times.
Static design of steel-concrete lining for traffic tunnels
NASA Astrophysics Data System (ADS)
Vojtasik, Karel; Mohyla, Marek; Hrubesova, Eva
2017-09-01
Article summarizes the results of research focused on the structural design of traffic tunnel linings that have been achieved in the framework of a research project TE01020168 that supports The Technology Agency of Czech Republic. This research aim is to find and develop a process for design structure parameters of tunnel linings. These are now mostly build up by a shotcrete technology. The shotcrete is commonly endorsed either with steel girders or steel fibres. Since the installation a lining structure is loaded while strength and deformational parameters of shotcrete start to rise till the setting time elapses. That’s reason why conventional approaches of reinforced concrete are not suitable. As well as there are other circumstances to step in shown in this article. Problem is solved by 3D analysis using numerical model that takes into account all the significant features of a tunnel lining construction process inclusive the interaction between lining structure with rock massive. Analysis output is a view into development of stress-strain state in respective construction parts of tunnel lining the whole structure around, including impact on stability of rock massive. The proposed method comprises all features involved in tunnel fabrication including geotechnics and construction technologies.
Optimized linear motor and digital PID controller setup used in Mössbauer spectrometer
NASA Astrophysics Data System (ADS)
Kohout, Pavel; Kouřil, Lukáš; Navařík, Jakub; Novák, Petr; Pechoušek, Jiří
2014-10-01
Optimization of a linear motor and digital PID controller setup used in a Mössbauer spectrometer is presented. Velocity driving system with a digital PID feedback subsystem was developed in the LabVIEW graphical environment and deployed on the sbRIO real-time hardware device (National Instruments). The most important data acquisition processes are performed as real-time deterministic tasks on an FPGA chip. Velocity transducer of a double loudspeaker type with a power amplifier circuit is driven by the system. Series of calibration measurements were proceeded to find the optimal setup of the P, I, D parameters together with velocity error signal analysis. The shape and given signal characteristics of the velocity error signal are analyzed in details. Remote applications for controlling and monitoring the PID system from computer or smart phone, respectively, were also developed. The best setup and P, I, D parameters were set and calibration spectrum of α-Fe sample with an average nonlinearity of the velocity scale below 0.08% was collected. Furthermore, the width of the spectral line below 0.30 mm/s was observed. Powerful and complex velocity driving system was designed.
Cohen, Leeber; Mangers, Kristie; Grobman, William A; Platt, Lawrence D
2009-12-01
The purpose of this study was to determine the frequency with which 3 standard screening views of the fetal heart (4-chamber, left ventricular outflow tract [LVOT], and right ventricular outflow tract [RVOT]) can be obtained satisfactorily with the spatiotemporal image correlation (STIC) technique. A prospective study of 111 patients undergoing anatomic surveys at 18 to 22 weeks was performed. Two ultrasound machines with fetal cardiac settings were used. The best volume set that could be obtained from each patient during a 45-minute examination was graded by 2 sonologists with regard to whether the 4-chamber, LVOT, and RVOT images were satisfactory for screening. All 3 views were judged satisfactory for screening in most patients: 1 sonologist graded the views as satisfactory in 70% of the patients, whereas the other found the views to be satisfactory in 83%. The position of the placenta did not alter the probability of achieving a satisfactory view, but a fetus in the spine anterior position was associated with a significantly lower probability that the views were regarded as satisfactory for screening (odds ratio, 0.28; 95% confidence interval, 0.09-0.70; P < .05). This study suggests that STIC may assist with screening for cardiac anomalies at 18 to 22 weeks' gestation.
Timanaykar, Ramesh T; Anand, Lakesh K; Palta, Sanjeev
2011-04-01
The Truview EVO2™ laryngoscope is a recently introduced device with a unique blade that provides a magnified laryngeal view at 42° anterior reflected view. It facilitates visualization of the glottis without alignment of oral, pharyngeal, and tracheal axes. We compared the view obtained at laryngoscopy, intubating conditions and hemodynamic parameters of Truview with Macintosh blade. In prospective, randomized and controlled manner, 200 patients of ASA I and II of either sex (20-50 years), presenting for surgery requiring tracheal intubation, were assigned to undergo intubation using a Truview or Macintosh laryngoscope. Visualization of the vocal cord, ease of intubation, time taken for intubation, number of attempts, and hemodynamic parameters were evaluated. Truview provided better results for the laryngeal view using Cormack and Lehane grading, particularly in patients with higher airway Mallampati grading (P < 0.05). The time taken for intubation (33.06±5.6 vs. 23.11±57 seconds) was more with Truview than with Macintosh blade (P < 0.01). The Percentage of Glottic Opening (POGO) score was significantly higher (97.26±8) in Truview as that observed with Macintosh blade (83.70±21.5). Hemodynamic parameters increased after tracheal intubation from pre-intubation value (P < 0.05) in both the groups, but they were comparable amongst the groups. No postoperative adverse events were noted. Tracheal intubation using Truview blade provided consistently improved laryngeal view as compared to Macintosh blade without the need to align the oral, pharyngeal and tracheal axes, with equal attempts for successful intubation and similar changes in hemodynamics. However, the time taken for intubation was more with Truview.
Timanaykar, Ramesh T; Anand, Lakesh K; Palta, Sanjeev
2011-01-01
Background: The Truview EVO2™ laryngoscope is a recently introduced device with a unique blade that provides a magnified laryngeal view at 42° anterior reflected view. It facilitates visualization of the glottis without alignment of oral, pharyngeal, and tracheal axes. We compared the view obtained at laryngoscopy, intubating conditions and hemodynamic parameters of Truview with Macintosh blade. Materials and Methods: In prospective, randomized and controlled manner, 200 patients of ASA I and II of either sex (20–50 years), presenting for surgery requiring tracheal intubation, were assigned to undergo intubation using a Truview or Macintosh laryngoscope. Visualization of the vocal cord, ease of intubation, time taken for intubation, number of attempts, and hemodynamic parameters were evaluated. Results: Truview provided better results for the laryngeal view using Cormack and Lehane grading, particularly in patients with higher airway Mallampati grading (P < 0.05). The time taken for intubation (33.06±5.6 vs. 23.11±57 seconds) was more with Truview than with Macintosh blade (P < 0.01). The Percentage of Glottic Opening (POGO) score was significantly higher (97.26±8) in Truview as that observed with Macintosh blade (83.70±21.5). Hemodynamic parameters increased after tracheal intubation from pre-intubation value (P < 0.05) in both the groups, but they were comparable amongst the groups. No postoperative adverse events were noted. Conclusion: Tracheal intubation using Truview blade provided consistently improved laryngeal view as compared to Macintosh blade without the need to align the oral, pharyngeal and tracheal axes, with equal attempts for successful intubation and similar changes in hemodynamics. However, the time taken for intubation was more with Truview. PMID:21772680
Cosmological space-times with resolved Big Bang in Yang-Mills matrix models
NASA Astrophysics Data System (ADS)
Steinacker, Harold C.
2018-02-01
We present simple solutions of IKKT-type matrix models that can be viewed as quantized homogeneous and isotropic cosmological space-times, with finite density of microstates and a regular Big Bang (BB). The BB arises from a signature change of the effective metric on a fuzzy brane embedded in Lorentzian target space, in the presence of a quantized 4-volume form. The Hubble parameter is singular at the BB, and becomes small at late times. There is no singularity from the target space point of view, and the brane is Euclidean "before" the BB. Both recollapsing and expanding universe solutions are obtained, depending on the mass parameters.
Implementation of the EM Algorithm in the Estimation of Item Parameters: The BILOG Computer Program.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Bock, R. Darrell
This paper reviews the basic elements of the EM approach to estimating item parameters and illustrates its use with one simulated and one real data set. In order to illustrate the use of the BILOG computer program, runs for 1-, 2-, and 3-parameter models are presented for the two sets of data. First is a set of responses from 1,000 persons to five…
Moore, C S; Liney, G P; Beavis, A W; Saunderson, J R
2007-09-01
A test methodology using an anthropomorphic-equivalent chest phantom is described for the optimization of the Agfa computed radiography "MUSICA" processing algorithm for chest radiography. The contrast-to-noise ratio (CNR) in the lung, heart and diaphragm regions of the phantom, and the "system modulation transfer function" (sMTF) in the lung region, were measured using test tools embedded in the phantom. Using these parameters the MUSICA processing algorithm was optimized with respect to low-contrast detectability and spatial resolution. Two optimum "MUSICA parameter sets" were derived respectively for maximizing the CNR and sMTF in each region of the phantom. Further work is required to find the relative importance of low-contrast detectability and spatial resolution in chest images, from which the definitive optimum MUSICA parameter set can then be derived. Prior to this further work, a compromised optimum MUSICA parameter set was applied to a range of clinical images. A group of experienced image evaluators scored these images alongside images produced from the same radiographs using the MUSICA parameter set in clinical use at the time. The compromised optimum MUSICA parameter set was shown to produce measurably better images.
Young Children's Responses to Artworks: The Eye, the Mind, and the Body
ERIC Educational Resources Information Center
Yu, Jacqueline Lye Wai; Garces-Bacsal, Rhoda Myra; Wright, Susan Kay
2017-01-01
This study investigates young children's responses to viewing artworks in a preschool setting. Based on the responses of 15 children aged five to six years during five art viewing sessions in a preschool in Singapore, the study examines features of what young children see, think and feel when they view artworks. These sessions were facilitated by…
Hay, L.E.; McCabe, G.J.; Clark, M.P.; Risley, J.C.
2009-01-01
The accuracy of streamflow forecasts depends on the uncertainty associated with future weather and the accuracy of the hydrologic model that is used to produce the forecasts. We present a method for streamflow forecasting where hydrologic model parameters are selected based on the climate state. Parameter sets for a hydrologic model are conditioned on an atmospheric pressure index defined using mean November through February (NDJF) 700-hectoPascal geopotential heights over northwestern North America [Pressure Index from Geopotential heights (PIG)]. The hydrologic model is applied in the Sprague River basin (SRB), a snowmelt-dominated basin located in the Upper Klamath basin in Oregon. In the SRB, the majority of streamflow occurs during March through May (MAM). Water years (WYs) 1980-2004 were divided into three groups based on their respective PIG values (high, medium, and low PIG). Low (high) PIG years tend to have higher (lower) than average MAM streamflow. Four parameter sets were calibrated for the SRB, each using a different set of WYs. The initial set used WYs 1995-2004 and the remaining three used WYs defined as high-, medium-, and low-PIG years. Two sets of March, April, and May streamflow volume forecasts were made using Ensemble Streamflow Prediction (ESP). The first set of ESP simulations used the initial parameter set. Because the PIG is defined using NDJF pressure heights, forecasts starting in March can be made using the PIG parameter set that corresponds with the year being forecasted. The second set of ESP simulations used the parameter set associated with the given PIG year. Comparison of the ESP sets indicates that more accuracy and less variability in volume forecasts may be possible when the ESP is conditioned using the PIG. This is especially true during the high-PIG years (low-flow years). ?? 2009 American Water Resources Association.
Neither Pollyanna nor Chicken Little: Thoughts on the Ethics of Automation
NASA Technical Reports Server (NTRS)
Holloway, C. Michael; Knight, John C.; McDermid, John A.
2014-01-01
This paper has raised issues concerning the ethics of automation in aviation systems, and outlined ways of thinking about the issues that may help in ethical decision making. It is very easy to be carried along by technology and the Pollyanna view, but just because we can do something, doesn't mean we should - which is perhaps a little milder than the Chicken Little view. Both views have merits, and we would view ethical decisions as ones that more appropriately balance or reconcile these conflicting viewpoints. We have set out some of the background to the problems of automation in aviation systems, but are aware that there is much more that could be said (considering military UAS, for example). We hope, however, that the brief introduction provides a foundation for the ethical questions that we have set out. The underlying aim in proposing ESCs is to make understanding ethical issues easier so that ethically-informed decisions can be made. Whilst we have not linked the discussion directly back to specific ethical decisions, we believe that making explicit those issues on which such judgments are based is a contribution to ethically informed decision making. We also believe that the four principles set out by the RAEng are reflected in this approach. We acknowledge that what we have set out, especially the ideas of ESC, goes some way beyond current practice and principles and there are significant technical issues to resolve before such an approach could be implemented. It is hoped, however, that the ideas will help improve the production and presentation of safety cases in a range of industries not just aviation - a Pollyanna view, of course!
45. Photocopy of photograph (negative made from original photograph on ...
45. Photocopy of photograph (negative made from original photograph on file at Veterans Administration in Wichita, Kansas), government photograph certified by W.B. Hayes, Jr., Superintendent of Construction, Veterans Administration, 29 September 1934, listed as Project VAC-507, no. 7 of 7 views, set no. 2 of 3 sets, subject - Awnings, contractor Langdon Tent & Awning Co., Wichita, Kansas." View southeast, Building 8 - Veterans Administration Center, Officers Duplex Quarters, 5302 East Kellogg (Legal Address); 5500 East Kellogg (Common Address), Wichita, Sedgwick County, KS
43. Photocopy of photograph (negative made from original photograph on ...
43. Photocopy of photograph (negative made from original photograph on file at Veterans Administration in Wichita, Kansas), government photograph certified by Willis B. Hayes, Jr., Superintendent of Construction, Veterans Administration, 15 March 1933, listed as "Project VAC-206, No. 26 of 26 views, set no. 4 of 5 sets, subject - final photos, contractor Henry B. Ryan Co." View southeast, Building 8 - Veterans Administration Center, Officers Duplex Quarters, 5302 East Kellogg (Legal Address); 5500 East Kellogg (Common Address), Wichita, Sedgwick County, KS
STS-32 view of the moon setting over the Earth's limb
1990-01-20
STS-32 crew took this view of the moon setting over the Earth's limb. Near the center is a semi-vortex in the clouds - a storm system in the early stages of formation. The moon's image is distorted due to refraction through the Earth's atmosphere. The near side of the moon is visible showing the vast area of the moon's western seas (Mare Occidental), Apollo landing sites: Apollo 14 at Fra Mauro and Apollo 16 at Central Highlands near Descartes.
STS-32 view of the moon setting over the Earth's limb
NASA Technical Reports Server (NTRS)
1990-01-01
STS-32 crew took this view of the moon setting over the Earth's limb. Near the center is a semi-vortex in the clouds - a storm system in the early stages of formation. The moon's image is distorted due to refraction through the Earth's atmosphere. The near side of the moon is visible showing the vast area of the moon's western seas (Mare Occidental), Apollo landing sites: Apollo 14 at Fra Mauro and Apollo 16 at Central Highlands near Descartes.
Measuring the Viewing Angle of GW170817 with Electromagnetic and Gravitational Waves
NASA Astrophysics Data System (ADS)
Finstad, Daniel; De, Soumi; Brown, Duncan A.; Berger, Edo; Biwer, Christopher M.
2018-06-01
The joint detection of gravitational waves (GWs) and electromagnetic (EM) radiation from the binary neutron star merger GW170817 ushered in a new era of multi-messenger astronomy. Joint GW–EM observations can be used to measure the parameters of the binary with better precision than either observation alone. Here, we use joint GW–EM observations to measure the viewing angle of GW170817, the angle between the binary’s angular momentum and the line of sight. We combine a direct measurement of the distance to the host galaxy of GW170817 (NGC 4993) of 40.7 ± 2.36 Mpc with the Laser Interferometer Gravitational-wave Observatory (LIGO)/Virgo GW data and find that the viewing angle is {32}-13+10 +/- 1.7 degrees (90% confidence, statistical, and systematic errors). We place a conservative lower limit on the viewing angle of ≥13°, which is robust to the choice of prior. This measurement provides a constraint on models of the prompt γ-ray and radio/X-ray afterglow emission associated with the merger; for example, it is consistent with the off-axis viewing angle inferred for a structured jet model. We provide for the first time the full posterior samples from Bayesian parameter estimation of LIGO/Virgo data to enable further analysis by the community.
NASA Astrophysics Data System (ADS)
Hermance, J. F.; Jacob, R. W.; Bradley, B. A.; Mustard, J. F.
2005-12-01
In studying vegetation patterns remotely, the objective is to draw inferences on the development of specific or general land surface phenology (LSP) as a function of space and time by determining the behavior of a parameter (in our case NDVI), when the parameter estimate may be biased by noise, data dropouts and obfuscations from atmospheric and other effects. We describe the underpinning concepts of a procedure for a robust interpolation of NDVI data that does not have the limitations of other mathematical approaches which require orthonormal basis functions (e.g. Fourier analysis). In this approach, data need not be uniformly sampled in time, nor do we expect noise to be Gaussian-distributed. Our approach is intuitive and straightforward, and is applied here to the refined modeling of LSP using 7 years of weekly and biweekly AVHRR NDVI data for a 150 x 150 km study area in central Nevada. This site is a microcosm of a broad range of vegetation classes, from irrigated agriculture with annual NDVIvalues of up to 0.7 to playas and alkali salt flats with annual NDVI values of only 0.07. Our procedure involves a form of parameter estimation employing Bayesian statistics. In utilitarian terms, the latter procedure is a method of statistical analysis (in our case, robustified, weighted least-squares recursive curve-fitting) that incorporates a variety of prior knowledge when forming current estimates of a particular process or parameter. In addition to the standard Bayesian approach, we account for outliers due to data dropouts or obfuscations because of clouds and snow cover. An initial "starting model" for the average annual cycle and long term (7 year) trend is determined by jointly fitting a common set of complex annual harmonics and a low order polynomial to an entire multi-year time series in one step. This is not a formal Fourier series in the conventional sense, but rather a set of 4 cosine and 4 sine coefficients with fundamental periods of 12, 6, 3 and 1.5 months. Instabilities during large time gaps in the data are suppressed by introducing an expectation of minimum roughness on the fitted time series. Our next significant computational step involves a constrained least squares fit to the observed NDVI data. Residuals between the observed NDVI value and the predicted starting model are computed, and the inverse of these residuals provide the weights for a weighted least squares analysis whereby a set of annual eighth-order splines are fit to the 7 years of NDVI data. Although a series of independent 8-th order annual functionals over a period of 7 years is intrinsically unstable when there are significant data gaps, the splined versions for this specific application are quite stable due to explicit continuity conditions on the values and derivatives of the functionals across contiguous years, as well as a priori constraints on the predicted values vis-a-vis the assumed initial model. Our procedure allows us to robustly interpolate original unequally-spaced NDVI data with a new time series having the most-appropriate, user-defined time base. We apply this approach to the temporal behavior of vegetation in our 150 x 150 km study area. Such a small area, being so rich in vegetation diversity, is particularly useful to view in map form and by animated annual and multi-year time sequences, since the interrelation between phenology, topography and specific usage patterns becomes clear.
3. Perspective view of north end of Bunker 103 showing ...
3. Perspective view of north end of Bunker 103 showing north set of steel doors. Camera pointed NW. - Puget Sound Naval Shipyard, Munitions Storage Bunker, Naval Ammunitions Depot, North of Campbell Trail, Bremerton, Kitsap County, WA
Li, Baoyue; Lingsma, Hester F; Steyerberg, Ewout W; Lesaffre, Emmanuel
2011-05-23
Logistic random effects models are a popular tool to analyze multilevel also called hierarchical data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized as well as ordinal, with center and/or trial as random effects, and as covariates age, motor score, pupil reactivity or trial. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR, Bayesian approaches included WinBUGS, MLwiN (MCMC), R package MCMCglmm and SAS experimental procedure MCMC.Three data sets (the full data set and two sub-datasets) were analysed using basically two logistic random effects models with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set also a proportional odds model with a random center effect was fitted. The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study and when based on a relatively large number of level-1 (patient level) data compared to the number of level-2 (hospital level) data. However, when based on relatively sparse data set, i.e. when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no explicit preference (of course if there is no preference from a philosophical point of view) for either a frequentist or Bayesian approach (if based on vague priors). The choice for a particular implementation may largely depend on the desired flexibility, and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches the MLE of this variance was often estimated zero with a standard error that is either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may be also critical for the convergence of the Markov chain.
Ramasubramanian, Viswanathan; Glasser, Adrian
2015-01-01
PURPOSE To determine whether relatively low-resolution ultrasound biomicroscopy (UBM) can predict the accommodative optical response in prepresbyopic eyes as well as in a previous study of young phakic subjects, despite lower accommodative amplitudes. SETTING College of Optometry, University of Houston, Houston, USA. DESIGN Observational cross-sectional study. METHODS Static accommodative optical response was measured with infrared photorefraction and an autorefractor (WR-5100K) in subjects aged 36 to 46 years. A 35 MHz UBM device (Vumax, Sonomed Escalon) was used to image the left eye, while the right eye viewed accommodative stimuli. Custom-developed Matlab image-analysis software was used to perform automated analysis of UBM images to measure the ocular biometry parameters. The accommodative optical response was predicted from biometry parameters using linear regression, 95% confidence intervals (CIs), and 95% prediction intervals. RESULTS The study evaluated 25 subjects. Per-diopter (D) accommodative changes in anterior chamber depth (ACD), lens thickness, anterior and posterior lens radii of curvature, and anterior segment length were similar to previous values from young subjects. The standard deviations (SDs) of accommodative optical response predicted from linear regressions for UBM-measured biometry parameters were ACD, 0.15 D; lens thickness, 0.25 D; anterior lens radii of curvature, 0.09 D; posterior lens radii of curvature, 0.37 D; and anterior segment length, 0.42 D. CONCLUSIONS Ultrasound biomicroscopy parameters can, on average, predict accommodative optical response with SDs of less than 0.55 D using linear regressions and 95% CIs. Ultrasound biomicroscopy can be used to visualize and quantify accommodative biometric changes and predict accommodative optical response in prepresbyopic eyes. PMID:26049831
Simplex GPS and InSAR Inversion Software
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay W.; Lyzenga, Gregory A.; Pierce, Marlon E.
2012-01-01
Changes in the shape of the Earth's surface can be routinely measured with precisions better than centimeters. Processes below the surface often drive these changes and as a result, investigators require models with inversion methods to characterize the sources. Simplex inverts any combination of GPS (global positioning system), UAVSAR (uninhabited aerial vehicle synthetic aperture radar), and InSAR (interferometric synthetic aperture radar) data simultaneously for elastic response from fault and fluid motions. It can be used to solve for multiple faults and parameters, all of which can be specified or allowed to vary. The software can be used to study long-term tectonic motions and the faults responsible for those motions, or can be used to invert for co-seismic slip from earthquakes. Solutions involving estimation of fault motion and changes in fluid reservoirs such as magma or water are possible. Any arbitrary number of faults or parameters can be considered. Simplex specifically solves for any of location, geometry, fault slip, and expansion/contraction of a single or multiple faults. It inverts GPS and InSAR data for elastic dislocations in a half-space. Slip parameters include strike slip, dip slip, and tensile dislocations. It includes a map interface for both setting up the models and viewing the results. Results, including faults, and observed, computed, and residual displacements, are output in text format, a map interface, and can be exported to KML. The software interfaces with the QuakeTables database allowing a user to select existing fault parameters or data. Simplex can be accessed through the QuakeSim portal graphical user interface or run from a UNIX command line.
Determination of the Parameter Sets for the Best Performance of IPS-driven ENLIL Model
NASA Astrophysics Data System (ADS)
Yun, Jongyeon; Choi, Kyu-Cheol; Yi, Jonghyuk; Kim, Jaehun; Odstrcil, Dusan
2016-12-01
Interplanetary scintillation-driven (IPS-driven) ENLIL model was jointly developed by University of California, San Diego (UCSD) and National Aeronaucics and Space Administration/Goddard Space Flight Center (NASA/GSFC). The model has been in operation by Korean Space Weather Cetner (KSWC) since 2014. IPS-driven ENLIL model has a variety of ambient solar wind parameters and the results of the model depend on the combination of these parameters. We have conducted researches to determine the best combination of parameters to improve the performance of the IPS-driven ENLIL model. The model results with input of 1,440 combinations of parameters are compared with the Advanced Composition Explorer (ACE) observation data. In this way, the top 10 parameter sets showing best performance were determined. Finally, the characteristics of the parameter sets were analyzed and application of the results to IPS-driven ENLIL model was discussed.
Construct Maps as a Foundation for Standard Setting
ERIC Educational Resources Information Center
Wyse, Adam E.
2013-01-01
Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…
Alternator control for battery charging
Brunstetter, Craig A.; Jaye, John R.; Tallarek, Glen E.; Adams, Joseph B.
2015-07-14
In accordance with an aspect of the present disclosure, an electrical system for an automotive vehicle has an electrical generating machine and a battery. A set point voltage, which sets an output voltage of the electrical generating machine, is set by an electronic control unit (ECU). The ECU selects one of a plurality of control modes for controlling the alternator based on an operating state of the vehicle as determined from vehicle operating parameters. The ECU selects a range for the set point voltage based on the selected control mode and then sets the set point voltage within the range based on feedback parameters for that control mode. In an aspect, the control modes include a trickle charge mode and battery charge current is the feedback parameter and the ECU controls the set point voltage within the range to maintain a predetermined battery charge current.
Short exposure to telestereoscope affects the oculomotor system.
Neveu, Pascaline; Priot, Anne-Emmanuelle; Plantier, Justin; Roumes, Corinne
2010-11-01
Under natural viewing conditions, the accommodation and vergence systems adjust the focus and the binocular alignment of the eyes in response to changes in viewing distance. The two responses are linked via cross-coupling and proceed almost simultaneously. Some optical devices, such as virtual reality or helmet mounted displays, create an oculomotor conflict by modifying demands on both vergence and accommodation. Previous studies extensively investigated the effect of such a conflict on the cross-coupling between vergence and accommodation, but little is known about the plasticity of the whole oculomotor system. In the present study, an oculomotor conflict was induced by a telestereoscope which magnified the standard inter-pupillary separation threefold and thus increased the convergence demand while accommodation remained almost unchanged. The effect of a 10 min exposure was assessed via a series of optometric parameters selected on the basis of existing oculomotor models. Associated with subject's visual complaints, most of the oculomotor parameters tested were modified: there was (1) deterioration of stereoscopic threshold; (2) increase in AC/A ratio; (3) increase in near and far phorias; and (4) shift of the zone of clear and single binocular vision towards convergence. These results showed a change in gain of accommodative vergence and a shift of vergence reserves towards convergence in response to telestereoscopic viewing. The subject's binocular behaviour tended towards esophoria with convergence excess as confirmed by Sheard's and Percival's criteria. Such changes in oculomotor parameters support adaptive behaviour linked with telestereoscopic viewing. © 2010 IRBA.
Topological data analysis (TDA) applied to reveal pedogenetic principles of European topsoil system.
Savic, Aleksandar; Toth, Gergely; Duponchel, Ludovic
2017-05-15
Recent developments in applied mathematics are bringing new tools that are capable to synthesize knowledge in various disciplines, and help in finding hidden relationships between variables. One such technique is topological data analysis (TDA), a fusion of classical exploration techniques such as principal component analysis (PCA), and a topological point of view applied to clustering of results. Various phenomena have already received new interpretations thanks to TDA, from the proper choice of sport teams to cancer treatments. For the first time, this technique has been applied in soil science, to show the interaction between physical and chemical soil attributes and main soil-forming factors, such as climate and land use. The topsoil data set of the Land Use/Land Cover Area Frame survey (LUCAS) was used as a comprehensive database that consists of approximately 20,000 samples, each described by 12 physical and chemical parameters. After the application of TDA, results obtained were cross-checked against known grouping parameters including five types of land cover, nine types of climate and the organic carbon content of soil. Some of the grouping characteristics observed using standard approaches were confirmed by TDA (e.g., organic carbon content) but novel subtle relationships (e.g., magnitude of anthropogenic effect in soil formation), were discovered as well. The importance of this finding is that TDA is a unique mathematical technique capable of extracting complex relations hidden in soil science data sets, giving the opportunity to see the influence of physicochemical, biotic and abiotic factors on topsoil formation through fresh eyes. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Fatimah, F.; Rosadi, D.; Hakim, R. B. F.
2018-03-01
In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.
ERIC Educational Resources Information Center
Archer, Louise; Francis, Becky; Miller, Sarah; Taylor, Becky; Tereshchenko, Antonina; Mazenod, Anna; Pepper, David; Travers, Mary-Claire
2018-01-01
"Setting" is a widespread practice in the UK, despite little evidence of its efficacy and substantial evidence of its detrimental impact on those allocated to the lowest sets. Taking a Bourdieusian approach, we propose that setting can be understood as a practice through which the social and cultural reproduction of dominant power…
The effect of internal and external fields of view on visually induced motion sickness.
Bos, Jelte E; de Vries, Sjoerd C; van Emmerik, Martijn L; Groen, Eric L
2010-07-01
Field of view (FOV) is said to affect visually induced motion sickness. FOV, however, is characterized by an internal setting used by the graphics generator (iFOV) and an external factor determined by screen size and viewing distance (eFOV). We hypothesized that especially the incongruence between iFOV and eFOV would lead to sickness. To that end we used a computer game environment with different iFOV and eFOV settings, and found the opposite effect. We speculate that the relative large differences between iFOV and eFOV used in this experiment caused the discrepancy, as may be explained by assuming an observer model controlling body motion. Copyright 2009 Elsevier Ltd. All rights reserved.
Familiarity, legitimation, and frequency: the influence of others on the criminal self-view.
Asencio, Emily K
2011-01-01
From an identity theory perspective, reflected appraisals from others are relevant for social behavior, because behavior is motivated by the desire to achieve congruence between reflected appraisals and the self-view for a particular identity. This study extends prior identity theory work from the laboratory setting by examining identity processes with respect to the criminal identity in the unique “natural” setting of a total institution. The findings build on prior work which finds that reflected appraisals do have an influence on identities and behavior by demonstrating that the relationship one has to the source of reflected appraisals is important for the way in which reflected appraisals influence the criminal self-view for an incarcerated population.
The Potential of Low-Cost Rpas for Multi-View Reconstruction of Sub-Vertical Rock Faces
NASA Astrophysics Data System (ADS)
Thoeni, K.; Guccione, D. E.; Santise, M.; Giacomini, A.; Roncella, R.; Forlani, G.
2016-06-01
The current work investigates the potential of two low-cost off-the-shelf quadcopters for multi-view reconstruction of sub-vertical rock faces. The two platforms used are a DJI Phantom 1 equipped with a Gopro Hero 3+ Black and a DJI Phantom 3 Professional with integrated camera. The study area is a small sub-vertical rock face. Several flights were performed with both cameras set in time-lapse mode. Hence, images were taken automatically but the flights were performed manually as the investigated rock face is very irregular which required manual adjustment of the yaw and roll for optimal coverage. The digital images were processed with commercial SfM software packages. Several processing settings were investigated in order to find out the one providing the most accurate 3D reconstruction of the rock face. To this aim, all 3D models produced with both platforms are compared to a point cloud obtained with a terrestrial laser scanner. Firstly, the difference between the use of coded ground control targets and the use of natural features was studied. Coded targets generally provide the best accuracy, but they need to be placed on the surface, which is not always possible, as sub-vertical rock faces are not easily accessible. Nevertheless, natural features can provide a good alternative if wisely chosen as shown in this work. Secondly, the influence of using fixed interior orientation parameters or self-calibration was investigated. The results show that, in the case of the used sensors and camera networks, self-calibration provides better results. To support such empirical finding, a numerical investigation using a Monte Carlo simulation was performed.
A Hard X-ray View on Two Distant VHE Blazars: 1ES 1101-232 and 1ES 1553+113
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reimer, A.; Costamente, L.; /Stanford U., HEPL /KIPAC, Menlo Park
2008-05-02
TeV-blazars are known as prominent non-thermal emitters across the entire electromagnetic spectrum with their photon power peaking in the X-ray and TeV-band. If distant, absorption of -ray photons by the extragalactic background light (EBL) alters the intrinsic TeV spectral shape, thereby affecting the overall interpretation. Suzaku observations for two of the more distant TeV-blazars known to date, 1ES 1101-232 and 1ES 1553+113, were carried out in May and July 2006, respectively, including a quasi-simultaneous coverage with the state of the art Cherenkov telescope facilities. We report on the resulting data sets with emphasis on the X-ray band, and set intomore » context to their historical behavior. During our campaign, we did not detect any significant X-ray or {gamma}-ray variability. 1ES 1101-232 was found in a quiescent state with the lowest X-ray flux ever measured. The combined XIS and HXD PIN data for 1ES 1101-232 and 1ES 1553+113 clearly indicate spectral curvature up to the highest hard X-ray data point ({approx} 30 keV), manifesting as softening with increasing energy. We describe this spectral shape by either a broken power law or a log-parabolic fit with equal statistical goodness of fits. The combined 1ES 1553+113 very high energy spectrum (90-500 GeV) did not show any significant changes with respect to earlier observations. The resulting contemporaneous broadband spectral energy distributions of both TeV-blazars are discussed in view of implications for intrinsic blazar parameter values, taking into account the {gamma}-ray absorption in the EBL.« less
NASA Astrophysics Data System (ADS)
Ma, Pei; Gu, Shi; Wang, Yves T.; Jenkins, Michael W.; Rollins, Andrew M.
2016-03-01
Optical mapping (OM) using fluorescent voltage-sensitive dyes (VSD) to measure membrane potential is currently the most effective method for electrophysiology studies in early embryonic hearts due to its noninvasiveness and large field-of-view. Conventional OM acquires bright-field images, collecting signals that are integrated in depth and projected onto a 2D plane, not capturing the 3D structure of the sample. Early embryonic hearts, especially at looping stages, have a complicated, tubular geometry. Therefore, conventional OM cannot provide a full picture of the electrical conduction circumferentially around the heart, and may result in incomplete and inaccurate measurements. Here, we demonstrate OM of Hamburger and Hamilton stage 14 embryonic quail hearts using a new commercially-available VSD, Fluovolt, and depth sectioning using a custom built light-sheet microscopy system. Axial and lateral resolution of the system is 14µm and 8µm respectively. For OM imaging, the field-of-view was set to 900µm×900µm to cover the entire heart. 2D over time OM image sets at multiple cross-sections through the looping-stage heart were recorded. The shapes of both atrial and ventricular action potentials acquired were consistent with previous reports using conventional VSD (di-4-ANNEPS). With Fluovolt, signal-to-noise ratio (SNR) is improved significantly by a factor of 2-10 (compared with di-4-ANNEPS) enabling light-sheet OM, which intrinsically has lower SNR due to smaller sampling volumes. Electrophysiologic parameters are rate dependent. Optical pacing was successfully integrated into the system to ensure heart rate consistency. This will also enable accurately gated reconstruction of full four dimensional conduction maps and 3D conduction velocity measurements.
3D Exploration of Meteorological Data: Facing the challenges of operational forecasters
NASA Astrophysics Data System (ADS)
Koutek, Michal; Debie, Frans; van der Neut, Ian
2016-04-01
In the past years the Royal Netherlands Meteorological Institute (KNMI) has been working on innovation in the field of meteorological data visualization. We are dealing with Numerical Weather Prediction (NWP) model data and observational data, i.e. satellite images, precipitation radar, ground and air-borne measurements. These multidimensional multivariate data are geo-referenced and can be combined in 3D space to provide more intuitive views on the atmospheric phenomena. We developed the Weather3DeXplorer (W3DX), a visualization framework for processing and interactive exploration and visualization using Virtual Reality (VR) technology. We managed to have great successes with research studies on extreme weather situations. In this paper we will elaborate what we have learned from application of interactive 3D visualization in the operational weather room. We will explain how important it is to control the degrees-of-freedom during interaction that are given to the users: forecasters/scientists; (3D camera and 3D slicing-plane navigation appear to be rather difficult for the users, when not implemented properly). We will present a novel approach of operational 3D visualization user interfaces (UI) that for a great deal eliminates the obstacle and the time it usually takes to set up the visualization parameters and an appropriate camera view on a certain atmospheric phenomenon. We have found our inspiration in the way our operational forecasters work in the weather room. We decided to form a bridge between 2D visualization images and interactive 3D exploration. Our method combines WEB-based 2D UI's, pre-rendered 3D visualization catalog for the latest NWP model runs, with immediate entry into interactive 3D session for selected visualization setting. Finally, we would like to present the first user experiences with this approach.
Saheb-Koussa, Djohra; Koussa, Mustapha; Said, Nourredine
2013-01-01
This paper studies the technical, economic, and environmental analysis of wind and photovoltaic power systems connected to a conventional grid. The main interest in such systems is on-site consumption of the produced energy, system hybridization, pooling of resources, and contribution to the environment protection. To ensure a better management of system energy, models have been used for determining the power that the constituting subsystems can deliver under specific weather conditions. Simulation is performed using MATLAB-SIMULINK. While, the economic and environmental study is performed using HOMER software. From an economic point of view, this allows to compare the financial constraints on each part of the system for the case of Adrar site which is located to the northern part of the south of Algeria. It also permits to optimally size and select the system presenting the best features on the basis of two parameters, that is, cost and effectiveness. From an environmental point of view, this study allows highlighting the role of renewable energy in reducing gas emissions related to greenhouse effects. In addition, through a set of sensitivity analysis, it is found that the wind speed has more effects on the environmental and economic performances of grid-connected hybrid (photovoltaic-wind) power systems.
Saheb-Koussa, Djohra; Koussa, Mustapha; Said, Nourredine
2013-01-01
This paper studies the technical, economic, and environmental analysis of wind and photovoltaic power systems connected to a conventional grid. The main interest in such systems is on-site consumption of the produced energy, system hybridization, pooling of resources, and contribution to the environment protection. To ensure a better management of system energy, models have been used for determining the power that the constituting subsystems can deliver under specific weather conditions. Simulation is performed using MATLAB-SIMULINK. While, the economic and environmental study is performed using HOMER software. From an economic point of view, this allows to compare the financial constraints on each part of the system for the case of Adrar site which is located to the northern part of the south of Algeria. It also permits to optimally size and select the system presenting the best features on the basis of two parameters, that is, cost and effectiveness. From an environmental point of view, this study allows highlighting the role of renewable energy in reducing gas emissions related to greenhouse effects. In addition, through a set of sensitivity analysis, it is found that the wind speed has more effects on the environmental and economic performances of grid-connected hybrid (photovoltaic-wind) power systems. PMID:24489488
Airborne multidimensional integrated remote sensing system
NASA Astrophysics Data System (ADS)
Xu, Weiming; Wang, Jianyu; Shu, Rong; He, Zhiping; Ma, Yanhua
2006-12-01
In this paper, we present a kind of airborne multidimensional integrated remote sensing system that consists of an imaging spectrometer, a three-line scanner, a laser ranger, a position & orientation subsystem and a stabilizer PAV30. The imaging spectrometer is composed of two sets of identical push-broom high spectral imager with a field of view of 22°, which provides a field of view of 42°. The spectral range of the imaging spectrometer is from 420nm to 900nm, and its spectral resolution is 5nm. The three-line scanner is composed of two pieces of panchromatic CCD and a RGB CCD with 20° stereo angle and 10cm GSD(Ground Sample Distance) with 1000m flying height. The laser ranger can provide height data of three points every other four scanning lines of the spectral imager and those three points are calibrated to match the corresponding pixels of the spectral imager. The post-processing attitude accuracy of POS/AV 510 used as the position & orientation subsystem, which is the aerial special exterior parameters measuring product of Canadian Applanix Corporation, is 0.005° combined with base station data. The airborne multidimensional integrated remote sensing system was implemented successfully, performed the first flying experiment on April, 2005, and obtained satisfying data.
Generalized estimators of avian abundance from count survey data
Royle, J. Andrew
2004-01-01
I consider modeling avian abundance from spatially referenced bird count data collected according to common protocols such as capture?recapture, multiple observer, removal sampling and simple point counts. Small sample sizes and large numbers of parameters have motivated many analyses that disregard the spatial indexing of the data, and thus do not provide an adequate treatment of spatial structure. I describe a general framework for modeling spatially replicated data that regards local abundance as a random process, motivated by the view that the set of spatially referenced local populations (at the sample locations) constitute a metapopulation. Under this view, attention can be focused on developing a model for the variation in local abundance independent of the sampling protocol being considered. The metapopulation model structure, when combined with the data generating model, define a simple hierarchical model that can be analyzed using conventional methods. The proposed modeling framework is completely general in the sense that broad classes of metapopulation models may be considered, site level covariates on detection and abundance may be considered, and estimates of abundance and related quantities may be obtained for sample locations, groups of locations, unsampled locations. Two brief examples are given, the first involving simple point counts, and the second based on temporary removal counts. Extension of these models to open systems is briefly discussed.
Can we calibrate simultaneously groundwater recharge and aquifer hydrodynamic parameters ?
NASA Astrophysics Data System (ADS)
Hassane Maina, Fadji; Ackerer, Philippe; Bildstein, Olivier
2017-04-01
By groundwater model calibration, we consider here fitting the measured piezometric heads by estimating the hydrodynamic parameters (storage term and hydraulic conductivity) and the recharge. It is traditionally recommended to avoid simultaneous calibration of groundwater recharge and flow parameters because of correlation between recharge and the flow parameters. From a physical point of view, little recharge associated with low hydraulic conductivity can provide very similar piezometric changes than higher recharge and higher hydraulic conductivity. If this correlation is true under steady state conditions, we assume that this correlation is much weaker under transient conditions because recharge varies in time and the parameters do not. Moreover, the recharge is negligible during summer time for many climatic conditions due to reduced precipitation, increased evaporation and transpiration by vegetation cover. We analyze our hypothesis through global sensitivity analysis (GSA) in conjunction with the polynomial chaos expansion (PCE) methodology. We perform GSA by calculating the Sobol indices, which provide a variance-based 'measure' of the effects of uncertain parameters (storage and hydraulic conductivity) and recharge on the piezometric heads computed by the flow model. The choice of PCE has the following two benefits: (i) it provides the global sensitivity indices in a straightforward manner, and (ii) PCE can serve as a surrogate model for the calibration of parameters. The coefficients of the PCE are computed by probabilistic collocation. We perform the GSA on simplified real conditions coming from an already built groundwater model dedicated to a subdomain of the Upper-Rhine aquifer (geometry, boundary conditions, climatic data). GSA shows that the simultaneous calibration of recharge and flow parameters is possible if the calibration is performed over at least one year. It provides also the valuable information of the sensitivity versus time, depending on the aquifer inertia and climatic conditions. The groundwater levels variations during recharge (increase) are sensitive to the storage coefficient whereas the groundwater levels variations after recharge (decrease) are sensitive to the hydraulic conductivity. The performed model calibration on synthetic data sets shows that the parameters and recharge are estimated quite accurately.
A new Bayesian recursive technique for parameter estimation
NASA Astrophysics Data System (ADS)
Kaheil, Yasir H.; Gill, M. Kashif; McKee, Mac; Bastidas, Luis
2006-08-01
The performance of any model depends on how well its associated parameters are estimated. In the current application, a localized Bayesian recursive estimation (LOBARE) approach is devised for parameter estimation. The LOBARE methodology is an extension of the Bayesian recursive estimation (BARE) method. It is applied in this paper on two different types of models: an artificial intelligence (AI) model in the form of a support vector machine (SVM) application for forecasting soil moisture and a conceptual rainfall-runoff (CRR) model represented by the Sacramento soil moisture accounting (SAC-SMA) model. Support vector machines, based on statistical learning theory (SLT), represent the modeling task as a quadratic optimization problem and have already been used in various applications in hydrology. They require estimation of three parameters. SAC-SMA is a very well known model that estimates runoff. It has a 13-dimensional parameter space. In the LOBARE approach presented here, Bayesian inference is used in an iterative fashion to estimate the parameter space that will most likely enclose a best parameter set. This is done by narrowing the sampling space through updating the "parent" bounds based on their fitness. These bounds are actually the parameter sets that were selected by BARE runs on subspaces of the initial parameter space. The new approach results in faster convergence toward the optimal parameter set using minimum training/calibration data and fewer sets of parameter values. The efficacy of the localized methodology is also compared with the previously used BARE algorithm.
Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José
2015-06-04
In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, e.g., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with a few centimeters error. The comparison between different viewing angles revealed that top views showed poorer results due to the fact the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate (frames per second) when dynamic measurements are required.
TCP performance in ATM networks: ABR parameter tuning and ABR/UBR comparisons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chien Fang; Lin, A.
1996-02-27
This paper explores two issues on TOP performance over ATM networks: ABR parameter tuning and performance comparison of binary mode ABR with enhanced UBR services. Of the fifteen parameters defined for ABR, two parameters dominate binary mode ABR performance: Rate Increase Factor (RIF) and Rate Decrease Factor (RDF). Using simulations, we study the effects of these two parameters on TOP over ABR performance. We compare TOP performance with different ABR parameter settings in terms of through-puts and fairness. The effects of different buffer sizes and LAN/WAN distances are also examined. We then compare TOP performance with the best ABR parametermore » setting with corresponding UBR service enhanced with Early Packet Discard and also with a fair buffer allocation scheme. The results show that TOP performance over binary mode ABR is very sensitive to parameter value settings, and that a poor choice of parameters can result in ABR performance worse than that of the much less expensive UBR-EPD scheme.« less
Measurement of Shear Elastic Moduli in Quasi-Incompressible Soft Solids
NASA Astrophysics Data System (ADS)
Rénier, Mathieu; Gennisson, Jean-Luc; Barrière, Christophe; Catheline, Stefan; Tanter, Mickaël; Royer, Daniel; Fink, Mathias
2008-06-01
Recently a nonlinear equation describing the plane shear wave propagation in isotropic quasi-incompressible media has been developed using a new expression of the strain energy density, as a function of the second, third and fourth order shear elastic constants (respectively μ, A, D) [1]. In such a case, the shear nonlinearity parameter βs depends only from these last coefficients. To date, no measurement of the parameter D have been carried out in soft solids. Using a set of two experiments, acoustoelasticity and finite amplitude shear waves, the shear elastic moduli up to the fourth order of soft solids are measured. Firstly, this theoretical background is applied to the acoustoelasticity theory, giving the variations of the shear wave speed as a function of the stress applied to the medium. From such variations, both linear (μ) and third order shear modulus (A) are deduced in agar-gelatin phantoms. Experimentally the radiation force induced by a focused ultrasound beam is used to generate quasi-plane linear shear waves within the medium. Then the shear wave propagation is imaged with an ultrafast ultrasound scanner. Secondly, in order to give rise to finite amplitude plane shear waves, the radiation force generation technique is replaced by a vibrating plate applied at the surface of the phantoms. The propagation is also imaged using the same ultrafast scanner. From the assessment of the third harmonic amplitude, the nonlinearity parameter βS is deduced. Finally, combining these results with the acoustoelasticity experiment, the fourth order modulus (D) is deduced. This set of experiments provides the characterization, up to the fourth order, of the nonlinear shear elastic moduli in quasi-incompressible soft media. Measurements of the A moduli reveal that while the behaviors of both soft solids are close from a linear point of view, the corresponding nonlinear moduli A are quite different. In a 5% agar-gelatin phantom, the fourth order elastic constant D is found to be 30±10 kPa.
Solder doped polycaprolactone scaffold enables reproducible laser tissue soldering.
Bregy, Amadé; Bogni, Serge; Bernau, Vianney J P; Vajtai, Istvan; Vollbach, Felix; Petri-Fink, Alke; Constantinescu, Mihai; Hofmann, Heinrich; Frenz, Martin; Reinert, Michael
2008-12-01
In this in vitro feasibility study we analyzed tissue fusion using bovine serum albumin (BSA) and Indocyanine green (ICG) doped polycaprolactone (PCL) scaffolds in combination with a diode laser as energy source while focusing on the influence of irradiation power and albumin concentration on the resulting tensile strength and induced tissue damage. A porous PCL scaffold doped with either 25% or 40% (w/w) of BSA in combination with 0.1% (w/w) ICG was used to fuse rabbit aortas. Soldering energy was delivered through the vessel from the endoluminal side using a continuous wave diode laser at 808 nm via a 400 microm core fiber. Scaffold surface temperatures were analyzed with an infrared camera. Optimum parameters such as irradiation time, radiation power and temperature were determined in view of maximum tensile strength but simultaneously minimum thermally induced tissue damage. Differential scanning calorimetry (DSC) was performed to measure the influence of PCL on the denaturation temperature of BSA. Optimum parameter settings were found to be 60 seconds irradiation time and 1.5 W irradiation power resulting in tensile strengths of around 2,000 mN. Corresponding scaffold surface temperature was 117.4+/- 12 degrees C. Comparison of the two BSA concentration revealed that 40% BSA scaffold resulted in significant higher tensile strength compared to the 25%. At optimum parameter settings, thermal damage was restricted to the adventitia and its interface with the outermost layer of the tunica media. The DSC showed two endothermic peaks in BSA containing samples, both strongly depending on the water content and the presence of PCL and/or ICG. Diode laser soldering of vascular tissue using BSA-ICG-PCL-scaffolds leads to strong and reproducible tissue bonds, with vessel damage limited to the adventitia. Higher BSA content results in higher tensile strengths. The DSC-measurements showed that BSA denaturation temperature is lowered by addition of water and/or ICG-PCL. (c) 2008 Wiley-Liss, Inc.
Determining "small parameters" for quasi-steady state
NASA Astrophysics Data System (ADS)
Goeke, Alexandra; Walcher, Sebastian; Zerz, Eva
2015-08-01
For a parameter-dependent system of ordinary differential equations we present a systematic approach to the determination of parameter values near which singular perturbation scenarios (in the sense of Tikhonov and Fenichel) arise. We call these special values Tikhonov-Fenichel parameter values. The principal application we intend is to equations that describe chemical reactions, in the context of quasi-steady state (or partial equilibrium) settings. Such equations have rational (or even polynomial) right-hand side. We determine the structure of the set of Tikhonov-Fenichel parameter values as a semi-algebraic set, and present an algorithmic approach to their explicit determination, using Groebner bases. Examples and applications (which include the irreversible and reversible Michaelis-Menten systems) illustrate that the approach is rather easy to implement.
Systematics of hot giant electric dipole resonance widths
NASA Astrophysics Data System (ADS)
Schiller, A.; Thoennessen, M.; McAlpine, K. M.
2007-05-01
Giant Electric Dipole Resonance (GDR) parameters for γ decay to excited states with finite spin and temperature have been compiled by two of the authors ( nucl-ex/0605004). Over 100 original works have been reviewed and from some 70 of them, more than 300 sets of hot GDR parameters for different isotopes, excitation energies, and spin regions have been extracted. All parameter sets have been brought onto a common footing by calculating the equivalent Lorentzian parameters. Together with a complementary compilation by Samuel S. Dietrich and Barry L. Berman [At. Data Nucl. Data Tables 38, 199-338, (1988)] on ground-state photo-neutron and photo-absorption cross sections and their Lorentzian parameters, it is now possible by means of a comparison of the two data sets to shed light on the evolution of GDR parameters with temperature and spin.
General setting from alley, office to left, concrete structure in ...
General setting from alley, office to left, concrete structure in center foreground, garage/shop to right, view to northeast - Former Umatilla Project Headquarters Buildings, Hermiston, Umatilla County, OR
Consilience and a Hierarchy of Species Concepts: Advances Toward Closure on the Species Puzzle
Mayden, Richard L.
1999-01-01
Numerous concepts exist for biological species. This diversity of ideas derives from a number of sources ranging from investigative study of particular taxa and character sets to philosophical aptitude and world view to operationalism and nomenclatorial rules. While usually viewed as counterproductive, in reality these varied concepts can greatly enhance our efforts to discover and understand biological diversity. Moreover, this continued "turf war" and dilemma over species can be resolved if the various concepts are viewed in a hierarchical system and each evaluated for its inherent level of consilience. Under this paradigm a theoretically appropriate, highly consilient concept of species capable of colligating the abundant types of species diversity offers the best guidance for developing and employing secondary operational concepts for identifying diversity. Of all the concepts currently recognized, only the non-operational Evolutionary Species Concept corresponds to the requisite parameters and, therefore, should serve as the theoretical concept appropriate for the category Species. As operational concepts, the remaining ideas have been incompatible with one another in their ability to encompass species diversity because each has restrictive criteria as to what qualifies as a species. However, the operational concepts can complement one another and do serve a vital role under the Evolutionary Species Concept as fundamental tools necessary for discovering diversity compatible with the primary theoretical concept. Thus, the proposed hierarchical system of primary and secondary concepts promises both the most productive framework for mutual respect for varied concepts and the most efficient and effective means for revealing species diversity. PMID:19270881
Young People's Views on the Nature and Purposes of Physical Education: A Sociological Analysis
ERIC Educational Resources Information Center
Smith, Andy; Parr, Michael
2007-01-01
Amid the long-standing debate about the nature and purposes of physical education (PE) in schools, comparatively little research has examined the ways in which PE is viewed by young people themselves. This study set out, therefore, to explore young people's views on the nature and purposes of PE from a sociological perspective in the belief that a…
ERIC Educational Resources Information Center
Wood, Robert W.; Eicher, Charles E.
A sample of 422 students from grades three through eight in Vermillion, South Dakota, schools completed a questionnaire about their television-viewing habits, including a day-by-day record of the amount of television viewed over a two-week period. Analysis of results indicated that the school population had an average of two television sets per…
Linear fully dry polymer actuators
NASA Astrophysics Data System (ADS)
De Rossi, Danilo; Mazzoldi, Alberto
1999-05-01
In the last period, the interest in the development of devices that emulate the properties of the 'par excellence' biological actuator, the human muscle, is considerably grown. The recent advances in the field of conducting polymers open new interesting prospects in this direction: from this point of view polyaniline (PANi), since it is easily produced in fiber form, represents an interesting material. In this conference we report the development of a linear actuator prototype that makes use of PANi fiber. All fabrication steps (fiber extrusion, solid polymer electrolyte preparation, compound realization) and experimental set-up for the electromechanical characterization are described. Quantitative measurements of isotonic length changes and isometric stress generation during electrochemical stimulation are reported. An overall assessment of PANi fibers actuative properties in wet and dry conditions is reported and possible future developments are proposed. Finally, continuum and lumped parameter models formulated to describe passive and active contractile properties of conducting polymer actuators are briefly outlined.
Image and information management system
NASA Technical Reports Server (NTRS)
Robertson, Tina L. (Inventor); Raney, Michael C. (Inventor); Dougherty, Dennis M. (Inventor); Kent, Peter C. (Inventor); Brucker, Russell X. (Inventor); Lampert, Daryl A. (Inventor)
2009-01-01
A system and methods through which pictorial views of an object's configuration, arranged in a hierarchical fashion, are navigated by a person to establish a visual context within the configuration. The visual context is automatically translated by the system into a set of search parameters driving retrieval of structured data and content (images, documents, multimedia, etc.) associated with the specific context. The system places ''hot spots'', or actionable regions, on various portions of the pictorials representing the object. When a user interacts with an actionable region, a more detailed pictorial from the hierarchy is presented representing that portion of the object, along with real-time feedback in the form of a popup pane containing information about that region, and counts-by-type reflecting the number of items that are available within the system associated with the specific context and search filters established at that point in time.
Retina-V1 model of detectability across the visual field
Bradley, Chris; Abrams, Jared; Geisler, Wilson S.
2014-01-01
A practical model is proposed for predicting the detectability of targets at arbitrary locations in the visual field, in arbitrary gray scale backgrounds, and under photopic viewing conditions. The major factors incorporated into the model include (a) the optical point spread function of the eye, (b) local luminance gain control (Weber's law), (c) the sampling array of retinal ganglion cells, (d) orientation and spatial frequency–dependent contrast masking, (e) broadband contrast masking, and (f) efficient response pooling. The model is tested against previously reported threshold measurements on uniform backgrounds (the ModelFest data set and data from Foley, Varadharajan, Koh, & Farias, 2007) and against new measurements reported here for several ModelFest targets presented on uniform, 1/f noise, and natural backgrounds at retinal eccentricities ranging from 0° to 10°. Although the model has few free parameters, it is able to account quite well for all the threshold measurements. PMID:25336179
Role of IAC in large space systems thermal analysis
NASA Technical Reports Server (NTRS)
Jones, G. K.; Skladany, J. T.; Young, J. P.
1982-01-01
Computer analysis programs to evaluate critical coupling effects that can significantly influence spacecraft system performance are described. These coupling effects arise from the varied parameters of the spacecraft systems, environments, and forcing functions associated with disciplines such as thermal, structures, and controls. Adverse effects can be expected to significantly impact system design aspects such as structural integrity, controllability, and mission performance. One such needed design analysis capability is a software system that can integrate individual discipline computer codes into a highly user-oriented/interactive-graphics-based analysis capability. The integrated analysis capability (IAC) system can be viewed as: a core framework system which serves as an integrating base whereby users can readily add desired analysis modules and as a self-contained interdisciplinary system analysis capability having a specific set of fully integrated multidisciplinary analysis programs that deal with the coupling of thermal, structures, controls, antenna radiation performance, and instrument optical performance disciplines.
Image and information management system
NASA Technical Reports Server (NTRS)
Robertson, Tina L. (Inventor); Kent, Peter C. (Inventor); Raney, Michael C. (Inventor); Dougherty, Dennis M. (Inventor); Brucker, Russell X. (Inventor); Lampert, Daryl A. (Inventor)
2007-01-01
A system and methods through which pictorial views of an object's configuration, arranged in a hierarchical fashion, are navigated by a person to establish a visual context within the configuration. The visual context is automatically translated by the system into a set of search parameters driving retrieval of structured data and content (images, documents, multimedia, etc.) associated with the specific context. The system places hot spots, or actionable regions, on various portions of the pictorials representing the object. When a user interacts with an actionable region, a more detailed pictorial from the hierarchy is presented representing that portion of the object, along with real-time feedback in the form of a popup pane containing information about that region, and counts-by-type reflecting the number of items that are available within the system associated with the specific context and search filters established at that point in time.
Spherical Hecke algebra in the Nekrasov-Shatashvili limit
NASA Astrophysics Data System (ADS)
Bourgine, Jean-Emile
2015-01-01
The Spherical Hecke central (SHc) algebra has been shown to act on the Nekrasov instanton partition functions of gauge theories. Its presence accounts for both integrability and AGT correspondence. On the other hand, a specific limit of the Omega background, introduced by Nekrasov and Shatashvili (NS), leads to the appearance of TBA and Bethe like equations. To unify these two points of view, we study the NS limit of the SHc algebra. We provide an expression of the instanton partition function in terms of Bethe roots, and define a set of operators that generates infinitesimal variations of the roots. These operators obey the commutation relations defining the SHc algebra at first order in the equivariant parameter ɛ 2. Furthermore, their action on the bifundamental contributions reproduces the Kanno-Matsuo-Zhang transformation. We also discuss the connections with the Mayer cluster expansion approach that leads to TBA-like equations.
NASA Astrophysics Data System (ADS)
Wei, Xile; Si, Kaili; Yi, Guosheng; Wang, Jiang; Lu, Meili
2016-07-01
In this paper, we use a reduced two-compartment neuron model to investigate the interaction between extracellular subthreshold electric field and synchrony in small world networks. It is observed that network synchronization is closely related to the strength of electric field and geometric properties of the two-compartment model. Specifically, increasing the electric field induces a gradual improvement in network synchrony, while increasing the geometric factor results in an abrupt decrease in synchronization of network. In addition, increasing electric field can make the network become synchronous from asynchronous when the geometric parameter is set to a given value. Furthermore, it is demonstrated that network synchrony can also be affected by the firing frequency and dynamical bifurcation feature of single neuron. These results highlight the effect of weak field on network synchrony from the view of biophysical model, which may contribute to further understanding the effect of electric field on network activity.
A visual detection model for DCT coefficient quantization
NASA Technical Reports Server (NTRS)
Ahumada, Albert J., Jr.; Peterson, Heidi A.
1993-01-01
The discrete cosine transform (DCT) is widely used in image compression, and is part of the JPEG and MPEG compression standards. The degree of compression, and the amount of distortion in the decompressed image are determined by the quantization of the transform coefficients. The standards do not specify how the DCT coefficients should be quantized. Our approach is to set the quantization level for each coefficient so that the quantization error is at the threshold of visibility. Here we combine results from our previous work to form our current best detection model for DCT coefficient quantization noise. This model predicts sensitivity as a function of display parameters, enabling quantization matrices to be designed for display situations varying in luminance, veiling light, and spatial frequency related conditions (pixel size, viewing distance, and aspect ratio). It also allows arbitrary color space directions for the representation of color.