Application of the artificial bee colony algorithm for solving the set covering problem.
Crawford, Broderick; Soto, Ricardo; Cuesta, Rodrigo; Paredes, Fernando
2014-01-01
The set covering problem is a formal model for many practical optimization problems. In the set covering problem the goal is to choose a subset of the columns of minimal cost that covers every row. Here, we present a novel application of the artificial bee colony algorithm to solve the non-unicost set covering problem. The artificial bee colony algorithm is a recent swarm metaheuristic technique based on the intelligent foraging behavior of honey bees. Experimental results show that our artificial bee colony algorithm is competitive in terms of solution quality with other recent metaheuristic approaches for the set covering problem.
PMID:24883356
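The column/row formulation stated in this abstract can be made concrete with a toy instance. The following is a minimal illustrative sketch (the instance data are invented here, not taken from the paper) that brute-forces a minimum-cost cover; the metaheuristic in the paper is of course aimed at instances far too large for exhaustive search:

```python
from itertools import combinations

def min_cost_cover(costs, columns, n_rows):
    """Brute-force the non-unicost set covering problem on a tiny
    instance: pick a minimum-cost subset of columns covering every row."""
    best_cost, best_pick = float("inf"), None
    idx = range(len(columns))
    for r in range(1, len(columns) + 1):
        for pick in combinations(idx, r):
            covered = set().union(*(columns[j] for j in pick))
            if len(covered) == n_rows:           # every row is covered
                cost = sum(costs[j] for j in pick)
                if cost < best_cost:
                    best_cost, best_pick = cost, pick
    return best_cost, best_pick

# rows 0..3; each column is given as the set of rows it covers
costs   = [3, 2, 2, 1]
columns = [{0, 1, 2}, {0, 3}, {1, 2}, {3}]
print(min_cost_cover(costs, columns, 4))  # -> (4, (0, 3))
```

Note that the cheapest cover here uses two columns of total cost 4, even though a cover with fewer high-cost columns exists; that trade-off is exactly what distinguishes the non-unicost variant.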
On the Impact of Local Taxes in a Set Cover Game
NASA Astrophysics Data System (ADS)
Escoffier, Bruno; Gourvès, Laurent; Monnot, Jérôme
Given a collection C of weighted subsets of a ground set E, the set cover problem is to find a minimum weight subset of C which covers all elements of E. We study a strategic game defined upon this classical optimization problem. Every element of E is a player which chooses one set of C where it appears. Following a public tax function, every player is charged a fraction of the weight of the set that it has selected. Our motivation is to design a tax function having the following features: it can be implemented in a distributed manner, existence of an equilibrium is guaranteed, and the social cost for these equilibria is minimized.
A combinatorial approach to the design of vaccines.
Martínez, Luis; Milanič, Martin; Legarreta, Leire; Medvedev, Paul; Malaina, Iker; de la Fuente, Ildefonso M
2015-05-01
We present two new problems of combinatorial optimization and discuss their applications to the computational design of vaccines. In the shortest λ-superstring problem, given a family S1,...,S(k) of strings over a finite alphabet, a set T of "target" strings over that alphabet, and an integer λ, the task is to find a string of minimum length containing, for each i, at least λ of the target strings that occur as substrings of S(i). In the shortest λ-cover superstring problem, given a collection X1,...,X(n) of finite sets of strings over a finite alphabet and an integer λ, the task is to find a string of minimum length containing, for each i, at least λ elements of X(i) as substrings. The two problems are polynomially equivalent, and the shortest λ-cover superstring problem is a common generalization of two well-known combinatorial optimization problems: the shortest common superstring problem and the set cover problem. We present two approaches to obtain exact or approximate solutions to the shortest λ-superstring and λ-cover superstring problems: one based on integer programming, and a hill-climbing algorithm. An application is given to the computational design of vaccines, and the algorithms are applied to experimental data taken from patients infected by H5N1 and HIV-1.
General form of a cooperative gradual maximal covering location problem
NASA Astrophysics Data System (ADS)
Bagherinejad, Jafar; Bashiri, Mahdi; Nikzad, Hamideh
2018-07-01
Cooperative and gradual covering are two recent approaches to developing covering location models. In this paper, a cooperative maximal covering location-allocation model (CMCLAP) is developed. In addition, the cooperative and gradual covering concepts are applied to maximal covering location simultaneously (CGMCLP). We then develop an integrated form of the cooperative gradual maximal covering location problem, called the general CGMCLP. By setting the model parameters, the proposed general model can easily be transformed into other existing models, facilitating general comparisons. The proposed models are developed without allocation for physical signals and with allocation for non-physical signals in discrete location space. Comparison of the previously introduced gradual maximal covering location problem (GMCLP) and cooperative maximal covering location problem (CMCLP) models with our proposed CGMCLP model on similar data sets shows that the proposed model covers more demand and acts more efficiently. Sensitivity analyses are performed to show the effect of the related parameters and the model's validity. Simulated annealing (SA) and tabu search (TS) are proposed as solution algorithms for the developed models for large-sized instances. The results show that the proposed algorithms are efficient solution approaches with respect to solution quality and running time.
Determination of Algorithm Parallelism in NP Complete Problems for Distributed Architectures
1990-03-05
Optimizing sensor cover energy for directional sensors
NASA Astrophysics Data System (ADS)
Astorino, Annabella; Gaudioso, Manlio; Miglionico, Giovanna
2016-10-01
The Directional Sensors Continuous Coverage Problem (DSCCP) aims at covering a given set of targets in a plane by means of a set of directional sensors. The locations of these sensors are known in advance, and the sensors are characterized by a discrete set of possible radii and aperture angles. The decisions to be made concern the orientation (which in our approach can vary continuously), radius, and aperture angle of each sensor. The objective is to obtain a minimum-cost coverage of all targets, if one exists. We introduce a MINLP formulation of the problem and define a Lagrangian heuristic based on a dual ascent procedure operating on one multiplier at a time. Finally, we report the results of the implementation of the method on a set of test problems.
Space-ecology set covering problem for modeling Daiyun Mountain Reserve, China
NASA Astrophysics Data System (ADS)
Lin, Chih-Wei; Liu, Jinfu; Huang, Jiahang; Zhang, Huiguang; Lan, Siren; Hong, Wei; Li, Wenzhou
2018-02-01
Site selection is an important issue in designing nature reserves and has been studied for many years. However, striking a well-balanced relationship between preservation of biodiversity and site selection is still challenging. Unlike existing methods, we consider three critical components, spatial continuity, spatial compactness, and ecological information, to address the reserve design problem. In this paper, we propose a new mathematical model of the set covering problem, called the Space-ecology Set Covering Problem (SeSCP), for designing a reserve network. First, we generate the ecological information from a forest resource investigation. Then, we split the landscape into elementary cells and calculate the ecological score of each cell. Next, we associate the ecological information with spatial properties to select a set of cells forming a nature reserve that improves the protection of biodiversity. Two spatial constraints, continuity and compactness, are imposed in SeSCP: continuity ensures that every selected site is connected to adjacent selected sites, and compactness minimizes the perimeter of the selected sites. In computational experiments, we take Daiyun Mountain as a study area to demonstrate the feasibility and effectiveness of the proposed model.
A set-covering formulation for a drayage problem with single and double container loads
NASA Astrophysics Data System (ADS)
Ghezelsoflu, A.; Di Francesco, M.; Frangioni, A.; Zuddas, P.
2018-01-01
This paper addresses a drayage problem, which is motivated by the case study of a real carrier. Its trucks carry one or two containers from a port to importers and from exporters to the port. Since up to four customers can be served in each route, we propose a set-covering formulation for this problem where all possible routes are enumerated. This model can be efficiently solved to optimality by a commercial solver, significantly outperforming a previously proposed node-arc formulation. Moreover, the model can be effectively used to evaluate a new distribution policy, which results in an enlarged set of feasible routes and can increase savings w.r.t. the policy currently employed by the carrier.
Global land cover mapping and characterization: present situation and future research priorities
Giri, Chandra
2005-01-01
The availability and accessibility of global land cover data sets play an important role in many global change studies. The importance of such science-based information is also reflected in a number of international, regional, and national projects and programs. Recent developments in earth observing satellite technology, information technology, computer hardware and software, and infrastructure have helped produce better-quality land cover data sets. As a result, such data sets are increasingly becoming available, the user base is ever widening, application areas have been expanding, and the potential of many other applications is enormous. Yet, we are far from producing high-quality global land cover data sets. This paper examines the progress in the development of digital global land cover data, their availability, and current applications. Problems and opportunities are also explained. The overview sets the stage for identifying the future research priorities needed for operational land cover assessment and monitoring.
Guturu, Parthasarathy; Dantu, Ram
2008-06-01
Many graph- and set-theoretic problems, because of their tremendous application potential and theoretical appeal, have been well investigated by researchers in complexity theory and were found to be NP-hard. Since the combinatorial complexity of these problems does not permit exhaustive searches for optimal solutions, only near-optimal solutions can be explored, using either various problem-specific heuristic strategies or metaheuristic global-optimization methods such as simulated annealing, genetic algorithms, etc. In this paper, we propose a unified evolutionary algorithm (EA) for the problems of maximum clique finding, maximum independent set, minimum vertex cover, subgraph and double subgraph isomorphism, set packing, set partitioning, and set cover. In the proposed approach, we first map these problems onto the maximum clique-finding problem (MCP), which is then solved using an evolutionary strategy. The proposed impatient EA with probabilistic tabu search (IEA-PTS) for the MCP integrates the best features of earlier successful approaches with a number of new heuristics that we developed to yield a performance that advances the state of the art in EAs for exploring maximum cliques in a graph. Results of experimentation with the 37 DIMACS benchmark graphs, and comparative analyses with six state-of-the-art algorithms, including two from the smaller EA community and four from the larger metaheuristics community, indicate that IEA-PTS outperforms the EAs with respect to a Pareto-lexicographic ranking criterion and offers competitive performance on some graph instances when individually compared to the other heuristic algorithms. It has also set a new benchmark on one graph instance. On another benchmark suite, called Benchmarks with Hidden Optimal Solutions, IEA-PTS ranks second, after a very recent algorithm called COVER, among its peers that have experimented with this suite.
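The mapping the authors describe (reducing problems such as maximum independent set to maximum clique) rests on the classical complement-graph construction: an independent set in G is a clique in G's complement. A minimal sketch, using brute force on a toy graph rather than the paper's IEA-PTS:

```python
from itertools import combinations

def complement(n, edges):
    """Edges of the complement graph: a pair is an edge iff absent in G."""
    present = set(map(frozenset, edges))
    return [(u, v) for u, v in combinations(range(n), 2)
            if frozenset((u, v)) not in present]

def max_clique(n, edges):
    """Brute-force maximum clique (feasible only for tiny graphs)."""
    present = set(map(frozenset, edges))
    for r in range(n, 0, -1):
        for nodes in combinations(range(n), r):
            if all(frozenset(p) in present for p in combinations(nodes, 2)):
                return nodes
    return ()

# A maximum independent set of the path 0-1-2-3 is a maximum clique
# of its complement:
path = [(0, 1), (1, 2), (2, 3)]
print(max_clique(4, complement(4, path)))  # -> (0, 2)
```

The returned vertices (0, 2) are indeed pairwise non-adjacent in the original path, illustrating why a single clique solver can serve all the listed problems once each is reduced to MCP.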
On the Critical Behaviour, Crossover Point and Complexity of the Exact Cover Problem
NASA Technical Reports Server (NTRS)
Morris, Robin D.; Smelyanskiy, Vadim N.; Shumow, Daniel; Koga, Dennis (Technical Monitor)
2003-01-01
Research into quantum algorithms for NP-complete problems has rekindled interest in the detailed study of a broad class of combinatorial problems. A recent paper applied the quantum adiabatic evolution algorithm to the Exact Cover problem for 3-sets (EC3) and provided empirical evidence that the algorithm was polynomial. In this paper we provide a detailed study of the characteristics of the exact cover problem. We present the annealing approximation applied to EC3, which gives an over-estimate of the phase transition point, and we also identify the phase transition point empirically. We also study the complexity of two classical algorithms on this problem: Davis-Putnam and simulated annealing. For these algorithms, EC3 is significantly easier than 3-SAT.
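The Exact Cover problem studied here can be sketched with a simple backtracking solver; this is a hedged stand-in for intuition only (the paper's subject is quantum adiabatic and classical annealing algorithms, not this search), with an invented EC3-style instance:

```python
def exact_cover(universe, sets):
    """Backtracking search for an exact cover: a subfamily of `sets`
    whose members are pairwise disjoint and whose union is `universe`.
    Returns a list of set indices, or None if no exact cover exists."""
    universe = frozenset(universe)

    def solve(remaining, chosen):
        if not remaining:
            return chosen
        # branch on one uncovered element (min, for determinism)
        e = min(remaining)
        for i, s in enumerate(sets):
            if e in s and s <= remaining:      # usable only if disjoint
                res = solve(remaining - s, chosen + [i])
                if res is not None:
                    return res
        return None

    return solve(universe, [])

# An EC3-style instance: universe of 6 elements, candidate 3-sets
sets = [{1, 2, 3}, {4, 5, 6}, {1, 4, 5}, {2, 3, 6}]
print(exact_cover({1, 2, 3, 4, 5, 6}, sets))  # -> [0, 1]
```

The disjointness test `s <= remaining` is what separates exact cover from plain set cover, where overlaps are allowed.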
2015-03-26
Turkish Airborne Early Warning and Control (AEW&C) aircraft in the combat arena. He examines three combat scenarios Turkey might encounter to cover and...to limited SAR assets, constrained budgets, logistic-maintenance problems, and high risk level of military flights. In recent years, the Turkish Air...model, Set Covering Location Problem (SCLP), defines the minimum number of SAR DPs to cover all fighter aircraft training areas (TAs). The second
On Making a Distinguished Vertex Minimum Degree by Vertex Deletion
NASA Astrophysics Data System (ADS)
Betzler, Nadja; Bredereck, Robert; Niedermeier, Rolf; Uhlmann, Johannes
For directed and undirected graphs, we study the problem of making a distinguished vertex the unique minimum-(in)degree vertex through deletion of a minimum number of vertices. The corresponding NP-hard optimization problems are motivated by applications concerning control in elections and social network analysis. Continuing previous work for the directed case, we show that the problem is W[2]-hard when parameterized by the graph's feedback arc set number, whereas it becomes fixed-parameter tractable when combining the parameters "feedback vertex set number" and "number of vertices to delete". For the so far unstudied undirected case, we show that the problem is NP-hard and W[1]-hard when parameterized by the "number of vertices to delete". On the positive side, we show fixed-parameter tractability for several parameterizations measuring tree-likeness, including a vertex-linear problem kernel with respect to the parameter "feedback edge set number". In contrast, we show a non-existence result concerning polynomial-size problem kernels for the combined parameter "vertex cover number and number of vertices to delete", implying corresponding non-existence results when replacing vertex cover number by treewidth or feedback vertex set number.
Unsupervised universal steganalyzer for high-dimensional steganalytic features
NASA Astrophysics Data System (ADS)
Hou, Xiaodan; Zhang, Tao
2016-11-01
The research in developing steganalytic features has been highly successful. These features are extremely powerful when applied to supervised binary classification problems. However, they are incompatible with unsupervised universal steganalysis because an unsupervised method cannot distinguish embedding distortion from the varying levels of noise caused by cover variation. This study attempts to alleviate the problem by introducing similarity retrieval of image statistical properties (SRISP), with the specific aim of mitigating the effect of cover variation on the existing steganalytic features. First, cover images with statistical properties similar to those of a given test image are retrieved from a cover database to establish an aided sample set. Then, unsupervised outlier detection is performed on a test set composed of the given test image and its aided sample set to determine the type (cover or stego) of the given test image. Our proposed framework, called SRISP-aided unsupervised outlier detection, requires no training; thus, it does not suffer from model mismatch. Compared with prior unsupervised outlier detectors that do not consider SRISP, the proposed framework not only retains universality but also exhibits superior performance when applied to high-dimensional steganalytic features.
NASA Technical Reports Server (NTRS)
Tuey, R. C.
1972-01-01
Computer solutions of linear programming problems are outlined. Information covers vector spaces, convex sets, and matrix algebra elements for solving simultaneous linear equations. Dual problems, reduced cost analysis, ranges, and error analysis are illustrated.
NASA Technical Reports Server (NTRS)
Chang, H.
1976-01-01
A computer program using Lemke, Salkin and Spielberg's Set Covering Algorithm (SCA) to optimize a traffic model problem in the Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE) was documented. SCA forms a submodule of SAMPLE and provides for input and output, subroutines, and an interactive feature for performing the optimization and arranging the results in a readily understandable form for output.
NASA Astrophysics Data System (ADS)
Wayan Suletra, I.; Priyandari, Yusuf; Jauhari, Wakhid A.
2018-03-01
We propose a new facility location model for a problem that belongs to the class of set-covering problems, using an integer programming formulation. Our model contains a single objective function, but it represents two goals. The first is to minimize the number of facilities, and the second is to minimize the total distance from customers to facilities. The first goal is mandatory, and the second is an improvement goal that is very useful when alternative optimal solutions for the first goal exist. We use a big number as a weight on the first goal to force the solution algorithm to give it first priority. Besides considering capacity constraints, our model accommodates either-or constraints representing facility dependency. The either-or constraints prevent the solution algorithm from selecting two or more facilities from the same set of mutually exclusive facilities. A real location selection problem, locating a set of wastewater treatment facilities (IPAL) in Surakarta city, Indonesia, illustrates the implementation of our model. A numerical example is given using the data of that real problem.
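The single weighted objective with two prioritized goals described in this abstract can be sketched as follows. This is an illustrative brute-force version (the distances, the exclusion pair, and the big weight M are invented for the example; the real model would go to an integer-programming solver):

```python
from itertools import combinations

def locate(dist, excl_pairs, M=10**6):
    """Pick facilities minimizing M*(#facilities) + total assignment
    distance; the big weight M makes the facility count the first
    priority, with total distance breaking ties among optima.
    dist[c][f] = distance from customer c to facility f.
    excl_pairs = mutually exclusive facility pairs (either-or)."""
    n_f = len(dist[0])
    best_score, best_pick = float("inf"), None
    for r in range(1, n_f + 1):
        for pick in combinations(range(n_f), r):
            if any(a in pick and b in pick for a, b in excl_pairs):
                continue                      # violates an either-or pair
            total = sum(min(row[f] for f in pick) for row in dist)
            score = M * r + total
            if score < best_score:
                best_score, best_pick = score, pick
    return best_pick

dist = [[1, 9, 4], [9, 1, 4], [5, 5, 1]]      # 3 customers x 3 facilities
print(locate(dist, excl_pairs=[(0, 1)]))      # -> (2,): one facility suffices
```

Because M dwarfs any achievable distance sum, a two-facility solution can never beat a feasible one-facility solution, which is exactly the lexicographic effect the big-number weight is meant to produce.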
Approximation algorithms for a genetic diagnostics problem.
Kosaraju, S R; Schäffer, A A; Biesecker, L G
1998-01-01
We define and study a combinatorial problem called WEIGHTED DIAGNOSTIC COVER (WDC) that models the use of a laboratory technique called genotyping in the diagnosis of an important class of chromosomal aberrations. An optimal solution to WDC would enable us to define a genetic assay that maximizes the diagnostic power for a specified cost of laboratory work. We develop approximation algorithms for WDC by making use of the well-known problem SET COVER, for which the greedy heuristic has been extensively studied. We prove worst-case performance bounds on the greedy heuristic for WDC and for another heuristic we call directional greedy (dir-greedy). We implemented both heuristics. We also implemented a local search heuristic that takes the solutions obtained by greedy and dir-greedy and applies swaps until they are locally optimal. We report their performance on a real data set that is representative of the options a clinical geneticist faces for the real diagnostic problem. Many open problems related to WDC remain, both of theoretical interest and practical importance.
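The extensively studied greedy heuristic for SET COVER that these bounds build on picks, at each step, the set minimizing cost per newly covered element (Chvátal's rule). A minimal sketch on invented data:

```python
def greedy_set_cover(universe, sets, costs):
    """Chvátal's greedy heuristic for weighted set cover: repeatedly
    pick the set minimizing cost per newly covered element."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        i = min((j for j in range(len(sets)) if sets[j] & uncovered),
                key=lambda j: costs[j] / len(sets[j] & uncovered))
        chosen.append(i)
        uncovered -= sets[i]
    return chosen

universe = {1, 2, 3, 4, 5}
sets  = [{1, 2, 3}, {2, 4}, {3, 4, 5}, {5}]
costs = [2, 1, 3, 1]
print(greedy_set_cover(universe, sets, costs))  # -> [1, 0, 3]
```

This heuristic achieves the classical logarithmic approximation guarantee for weighted set cover; the paper's contribution is proving analogous worst-case bounds for the WDC variants.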
Solving Set Cover with Pairs Problem using Quantum Annealing
NASA Astrophysics Data System (ADS)
Cao, Yudong; Jiang, Shuxian; Perouli, Debbie; Kais, Sabre
2016-09-01
Here we consider using quantum annealing to solve Set Cover with Pairs (SCP), an NP-hard combinatorial optimization problem that plays an important role in networking, computational biology, and biochemistry. We show an explicit construction of Ising Hamiltonians whose ground states encode the solutions of SCP instances. We numerically simulate the time-dependent Schrödinger equation in order to test the performance of quantum annealing for random instances and compare it with that of simulated annealing. We also discuss explicit embedding strategies for realizing our Hamiltonian construction on the D-Wave type restricted Ising Hamiltonian based on Chimera graphs. Our embedding on the Chimera graph preserves the structure of the original SCP instance, and in particular the embeddings for general complete bipartite graphs and logical disjunctions may be of broader use than the specific problem we deal with.
Distributed Sleep Scheduling in Wireless Sensor Networks via Fractional Domatic Partitioning
NASA Astrophysics Data System (ADS)
Schumacher, André; Haanpää, Harri
We consider setting up sleep scheduling in sensor networks. We formulate the problem as an instance of the fractional domatic partition problem and obtain a distributed approximation algorithm by applying linear programming approximation techniques. Our algorithm is an application of the Garg-Könemann (GK) scheme that requires solving an instance of the minimum weight dominating set (MWDS) problem as a subroutine. Our two main contributions are a distributed implementation of the GK scheme for the sleep-scheduling problem and a novel asynchronous distributed algorithm for approximating MWDS based on a primal-dual analysis of Chvátal's set-cover algorithm. We evaluate our algorithm with
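The MWDS subroutine mentioned above is classically approximated by recasting domination as weighted set cover over closed neighborhoods and applying Chvátal's greedy rule, the algorithm behind the primal-dual analysis cited in the abstract. A minimal centralized sketch (the paper's contribution is the distributed, asynchronous version, which this does not capture):

```python
def dominating_set_greedy(n, edges, weights):
    """Approximate minimum weight dominating set by casting it as
    weighted set cover: vertex v's set is its closed neighborhood N[v]."""
    nbr = {v: {v} for v in range(n)}
    for u, v in edges:
        nbr[u].add(v)
        nbr[v].add(u)
    uncovered, chosen = set(range(n)), []
    while uncovered:
        # pick the vertex with the best weight per newly dominated vertex
        v = min((u for u in range(n) if nbr[u] & uncovered),
                key=lambda u: weights[u] / len(nbr[u] & uncovered))
        chosen.append(v)
        uncovered -= nbr[v]
    return chosen

# Star graph: the cheap-per-coverage center dominates everything.
print(dominating_set_greedy(4, [(0, 1), (0, 2), (0, 3)], [1, 1, 1, 1]))  # -> [0]
```

Making the center expensive (e.g., weight 10) flips the choice: the greedy ratio then favors the leaves, which illustrates how weights steer the cover.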
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Macready, William G.
2005-01-01
Recent work on the mathematical foundations of optimization has begun to uncover its rich structure. In particular, the "No Free Lunch" (NFL) theorems state that any two algorithms are equivalent when their performance is averaged across all possible problems. This highlights the need for exploiting problem-specific knowledge to achieve better than random performance. In this paper we present a general framework covering more search scenarios. In addition to the optimization scenarios addressed in the NFL results, this framework covers multi-armed bandit problems and evolution of multiple co-evolving players. As a particular instance of the latter, it covers "self-play" problems. In these problems the set of players work together to produce a champion, who then engages one or more antagonists in a subsequent multi-player game. In contrast to the traditional optimization case where the NFL results hold, we show that in self-play there are free lunches: in coevolution some algorithms have better performance than other algorithms, averaged across all possible problems. We consider the implications of these results to biology where there is no champion.
Rural Development: Problems and Advantages of Rural Locations for Industrial Plants.
ERIC Educational Resources Information Center
Bishop, C. E.; And Others
The problems and advantages of locating industry in a rural setting were discussed in this conference report. The 10 individual speeches covered: changes in employment and the labor force; problems and advantages of rural locations, rural labor, and site selection; the importance of involving the Black community; the nature of the food processing…
A Hybrid Cellular Genetic Algorithm for Multi-objective Crew Scheduling Problem
NASA Astrophysics Data System (ADS)
Jolai, Fariborz; Assadipour, Ghazal
Crew scheduling is one of the important problems of the airline industry. The problem is to assign crew members to a set of flights such that every flight is covered. In a robust schedule, the assignment should be such that the total cost, delays, and unbalanced utilization are minimized. As the problem is NP-hard and the objectives are in conflict with each other, a multi-objective metaheuristic called CellDE, a hybrid cellular genetic algorithm, is implemented as the optimization method. The proposed algorithm provides the decision maker with a set of non-dominated (Pareto-optimal) solutions and enables them to choose the best one according to their preferences. A set of problems of different sizes is generated and solved using the proposed algorithm. To evaluate the performance of the proposed algorithm, three metrics are suggested, and the diversity and convergence of the achieved Pareto front are appraised. Finally, a comparison is made between CellDE and PAES, another metaheuristic algorithm. The results show the superiority of CellDE.
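The non-dominated (Pareto-optimal) filtering underlying such multi-objective methods can be sketched directly. The objective vectors below are invented (say, cost and delay), all to be minimized:

```python
def pareto_front(points):
    """Non-dominated subset when minimizing every objective:
    p dominates q if p <= q componentwise and p differs from q."""
    def dominates(p, q):
        return all(a <= b for a, b in zip(p, q)) and p != q
    return [p for p in points
            if not any(dominates(q, p) for q in points)]

# (cost, delay) vectors of five candidate schedules
print(pareto_front([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]))
# -> [(1, 5), (2, 2), (5, 1)]
```

The three surviving points are exactly the trade-off set a decision maker would choose from; (3, 3) and (4, 4) are dropped because (2, 2) is at least as good in both objectives.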
NASA Aviation Safety Reporting System
NASA Technical Reports Server (NTRS)
1980-01-01
Problems in the briefing of relief by air traffic controllers are discussed, including problems that arise when duty positions are changed by controllers. Altimeter reading and setting errors as factors in aviation safety are discussed, including problems associated with altitude-indicating instruments. A sample of reports from pilots and controllers is included, covering the topics of ATIS broadcasts and clearance readback problems. A selection of Alert Bulletins, with their responses, is included.
A set-covering based heuristic algorithm for the periodic vehicle routing problem.
Cacchiani, V; Hemmelmayr, V C; Tricoire, F
2014-01-30
We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive every time the required quantity of product, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP-relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP-solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011) [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems.
PMID:24748696
NASA Astrophysics Data System (ADS)
Wei, Xiao-Ran; Zhang, Yu-He; Geng, Guo-Hua
2016-09-01
In this paper, we examine how to print hollow objects without infill via fused deposition modeling, one of the most widely used 3D-printing technologies, by partitioning the objects into shell parts. More specifically, we link the partition to the exact cover problem. Given an input watertight mesh shape S, we develop region-growing schemes to derive a set of candidate shell parts whose inside surfaces are printable without support. We then employ Monte Carlo tree search over the candidate parts to obtain the optimal set cover. All candidate subsets of the optimal set cover that could form an exact cover are then obtained, and a bounded tree search is used to find the optimal exact cover. We orient each shell part to the optimal position to guarantee that the inside surface is printed without support, while the outside surface is printed with minimum support. Our solution can be applied to a variety of models, closed-hollowed or semi-closed, with or without holes, as evidenced by experiments and performance evaluation of our proposed algorithm.
On Efficient Deployment of Wireless Sensors for Coverage and Connectivity in Constrained 3D Space.
Wu, Chase Q; Wang, Li
2017-10-10
Sensor networks have been used in a rapidly increasing number of applications in many fields. This work generalizes a sensor deployment problem to place a minimum set of wireless sensors at candidate locations in constrained 3D space to k-cover a given set of target objects. By exhausting the combinations of discreteness/continuousness constraints on either sensor locations or target objects, we formulate four classes of sensor deployment problems in 3D space: deploy sensors at Discrete/Continuous Locations (D/CL) to cover Discrete/Continuous Targets (D/CT). We begin with the design of an approximation algorithm for DLDT and then reduce DLCT, CLDT, and CLCT to DLDT by discretizing continuous sensor locations or target objects into a set of divisions without sacrificing sensing precision. Furthermore, we consider a connected version of each problem, where the deployed sensors must form a connected network, and design an approximation algorithm to minimize the number of deployed sensors with a connectivity guarantee. For performance comparison, we design and implement an optimal solution and a genetic algorithm (GA)-based approach. Extensive simulation results show that the proposed deployment algorithms consistently outperform the GA-based heuristic, achieve close-to-optimal performance on small-scale problem instances, and achieve an overall performance significantly better than the theoretical upper bound would suggest.
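A simple greedy baseline conveys the flavor of the k-coverage requirement (every target covered by at least k chosen sensors). This sketch is not the paper's DLDT approximation algorithm, and the candidate coverage sets are invented:

```python
def greedy_k_cover(candidates, targets, k):
    """Greedily choose candidate sensor locations until every target is
    covered by at least k chosen sensors. candidates[i] is the set of
    targets location i covers. Returns chosen indices, or None if the
    remaining demand cannot be met."""
    need = {t: k for t in targets}          # residual coverage demand
    chosen, avail = [], set(range(len(candidates)))

    def gain(i):                            # deficient targets i would help
        return sum(1 for t in candidates[i] if need.get(t, 0) > 0)

    while any(v > 0 for v in need.values()):
        best = max(sorted(avail), key=gain, default=None)
        if best is None or gain(best) == 0:
            return None                     # demand cannot be satisfied
        for t in candidates[best]:
            if need.get(t, 0) > 0:
                need[t] -= 1
        chosen.append(best)
        avail.discard(best)
    return chosen

candidates = [{0, 1}, {1, 2}, {0, 2}, {0, 1, 2}]
print(greedy_k_cover(candidates, {0, 1, 2}, k=2))  # -> [3, 0, 1]
```

Each location is used at most once, so k-coverage needs k distinct locations per target; the `None` branch handles instances where the candidate pool simply cannot meet the demand.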
NASA Astrophysics Data System (ADS)
Kodali, Anuradha
In this thesis, we develop dynamic multiple fault diagnosis (DMFD) algorithms to diagnose faults that are sporadic and coupled. Firstly, we formulate a coupled factorial hidden Markov model-based (CFHMM) framework to diagnose dependent faults occurring over time (dynamic case). Here, we implement a mixed memory Markov coupling model to determine the most likely sequence of (dependent) fault states, the one that best explains the observed test outcomes over time. An iterative Gauss-Seidel coordinate ascent optimization method is proposed for solving the problem. A soft Viterbi algorithm is also implemented within the framework for decoding dependent fault states over time. We demonstrate the algorithm on simulated and real-world systems with coupled faults; the results show that this approach improves the correct isolation rate as compared to the formulation where independent fault states are assumed. Secondly, we formulate a generalization of set-covering, termed dynamic set-covering (DSC), which involves a series of coupled set-covering problems over time. The objective of the DSC problem is to infer the most probable time sequence of a parsimonious set of failure sources that explains the observed test outcomes over time. The DSC problem is NP-hard and intractable due to the fault-test dependency matrix that couples the failed tests and faults via the constraint matrix, and the temporal dependence of failure sources over time. Here, the DSC problem is motivated from the viewpoint of a dynamic multiple fault diagnosis problem, but it has wide applications in operations research, e.g., the facility location problem. Thus, we also formulate the DSC problem in the context of a dynamically evolving facility location problem. Here, a facility can be opened, closed, or can be temporarily unavailable at any time for a given requirement of demand points.
These activities are associated with costs or penalties, viz., phase-in or phase-out for the opening or closing of a facility, respectively. The set-covering matrix encapsulates the relationship among the rows (tests or demand points) and columns (faults or locations) of the system at each time. By relaxing the coupling constraints using Lagrange multipliers, the DSC problem can be decoupled into independent subproblems, one for each column. Each subproblem is solved using the Viterbi decoding algorithm, and a primal feasible solution is constructed by modifying the Viterbi solutions via a heuristic. The proposed Viterbi-Lagrangian relaxation algorithm (VLRA) provides a measure of suboptimality via an approximate duality gap. As a major practical extension of the above problem, we also consider the problem of diagnosing faults with delayed test outcomes, termed delay-dynamic set-covering (DDSC), and experiment with real-world problems that exhibit masking faults. Also, we present simulation results on OR-library datasets (set-covering formulations are predominantly validated on these matrices in the literature), posed as facility location problems. Finally, we implement these algorithms to solve problems in aerospace and automotive applications. Firstly, we address the diagnostic ambiguity problem in aerospace and automotive applications by developing a dynamic fusion framework that includes dynamic multiple fault diagnosis algorithms. This improves the correct fault isolation rate, while minimizing the false alarm rates, by considering multiple faults instead of the traditional data-driven techniques based on single fault (class)-single epoch (static) assumption. The dynamic fusion problem is formulated as a maximum a posteriori decision problem of inferring the fault sequence based on uncertain outcomes of multiple binary classifiers over time. 
The fusion process involves three steps: the first step transforms the multi-class problem into dichotomies using error correcting output codes (ECOC), thereby solving the concomitant binary classification problems; the second step fuses the outcomes of multiple binary classifiers over time using a sliding window or block dynamic fusion method that exploits temporal data correlations over time. We solve this NP-hard optimization problem via a Lagrangian relaxation (variational) technique. The third step optimizes the classifier parameters, viz., probabilities of detection and false alarm, using a genetic algorithm. The proposed algorithm is demonstrated by computing the diagnostic performance metrics on a twin-spool commercial jet engine, an automotive engine, and UCI datasets (problems with high classification error are specifically chosen for experimentation). We show that the primal-dual optimization framework performed consistently better than any traditional fusion technique, even when it is forced to give a single fault decision across a range of classification problems. Secondly, we implement the inference algorithms to diagnose faults in vehicle systems that are controlled by a network of electronic control units (ECUs). The faults, originating from various interactions and especially between hardware and software, are particularly challenging to address. Our basic strategy is to divide the fault universe of such cyber-physical systems in a hierarchical manner, and monitor the critical variables/signals that have impact at different levels of interactions. The proposed diagnostic strategy is validated on an electrical power generation and storage system (EPGS) controlled by two ECUs in an environment with CANoe/MATLAB co-simulation. Eleven faults are injected with the failures originating in actuator hardware, sensor, controller hardware and software components. 
A diagnostic matrix is established to represent the relationship between the faults and the test outcomes (also known as fault signatures) via simulations. The results show that the proposed diagnostic strategy is effective in addressing interaction-caused faults.
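The Viterbi decoding used for each DSC subproblem is the textbook dynamic program below. The two-state fault on/off example in the usage is illustrative only, and the notation is generic rather than the thesis's exact formulation:

```python
def viterbi(states, init_lp, trans_lp, emit_lp, obs):
    """Most likely state sequence of an HMM, in log-probabilities.
    V[t][s] is the best log-prob of any path ending in state s at time t."""
    V = [{s: init_lp[s] + emit_lp[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[-1][p] + trans_lp[p][s])
            row[s] = V[-1][prev] + trans_lp[prev][s] + emit_lp[s][o]
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    path = [max(states, key=lambda s: V[-1][s])]  # best final state
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))
```

With a persistent fault state and a test that fails reliably under a fault, the decoder flips to "fault" once the evidence outweighs the prior, which is the per-column inference the Lagrangian decomposition exploits.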
Identifying the minor set cover of dense connected bipartite graphs via random matching edge sets
NASA Astrophysics Data System (ADS)
Hamilton, Kathleen E.; Humble, Travis S.
2017-04-01
Using quantum annealing to solve an optimization problem requires minor embedding a logic graph into a known hardware graph. In an effort to reduce the complexity of the minor embedding problem, we introduce the minor set cover (MSC) of a known graph G: a subset of graph minors which contain any remaining minor of the graph as a subgraph. Any graph that can be embedded into G will be embeddable into a member of the MSC. Focusing on embedding into the hardware graph of commercially available quantum annealers, we establish the MSC for a particular known virtual hardware, which is a complete bipartite graph. We show that the complete bipartite graph K_{N,N} has an MSC of N minors, from which K_{N+1} is identified as the largest clique minor of K_{N,N}. The case of determining the largest clique minor of hardware with faults is briefly discussed but remains an open question.
Identifying the minor set cover of dense connected bipartite graphs via random matching edge sets
Hamilton, Kathleen E.; Humble, Travis S.
2017-02-23
Using quantum annealing to solve an optimization problem requires minor embedding a logic graph into a known hardware graph. In an effort to reduce the complexity of the minor embedding problem, we introduce the minor set cover (MSC) of a known graph G: a subset of graph minors which contain any remaining minor of the graph as a subgraph. Any graph that can be embedded into G will be embeddable into a member of the MSC. Focusing on embedding into the hardware graph of commercially available quantum annealers, we establish the MSC for a particular known virtual hardware, which is a complete bipartite graph. Furthermore, we show that the complete bipartite graph K_{N,N} has an MSC of N minors, from which K_{N+1} is identified as the largest clique minor of K_{N,N}. The case of determining the largest clique minor of hardware with faults is briefly discussed but remains an open question.
NASA Technical Reports Server (NTRS)
Wray, J. R.
1982-01-01
Selecting a site for a nuclear powerplant can be helped by digitizing land use and land cover data, population data, and other pertinent data sets, and then placing them in a geographic information system. Such a system begins with a set of standardized maps for location reference and then provides for retrieval and analysis of spatial data keyed to the maps. This makes possible thematic mapping by computer, or interactive visual display for decision making. It also permits correlating land use area measurements with census and other data (such as fallout dosages), and the updating of all data sets. The system is thus a tool for dealing with resource management problems and for analyzing the interaction between people and their environment. An explanation of a computer-plotted map of land use and cover for Three Mile Island and vicinity is given.
Unbiased Taxonomic Annotation of Metagenomic Samples
Fosso, Bruno; Pesole, Graziano; Rosselló, Francesc
2018-01-01
Abstract The classification of reads from a metagenomic sample using a reference taxonomy is usually based on first mapping the reads to the reference sequences and then classifying each read at a node under the lowest common ancestor of the candidate sequences in the reference taxonomy with the least classification error. However, this taxonomic annotation can be biased by an imbalanced taxonomy and also by the presence of multiple nodes in the taxonomy with the least classification error for a given read. In this article, we show that the Rand index is a better indicator of classification error than the often used area under the receiver operating characteristic (ROC) curve and F-measure for both balanced and imbalanced reference taxonomies, and we also address the second source of bias by reducing the taxonomic annotation problem for a whole metagenomic sample to a set cover problem, for which a logarithmic approximation can be obtained in linear time and an exact solution can be obtained by integer linear programming. Experimental results with a proof-of-concept implementation of the set cover approach to taxonomic annotation in a next release of the TANGO software show that the set cover approach further reduces ambiguity in the taxonomic annotation obtained with TANGO without distorting the relative abundance profile of the metagenomic sample. PMID:29028181
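The logarithmic approximation mentioned here is the classic greedy algorithm for set cover, sketched below in a generic form (this is the textbook algorithm, not TANGO's implementation):

```python
def greedy_set_cover(universe, subsets):
    """Classic greedy approximation for set cover: repeatedly take the
    subset covering the most uncovered elements.  Achieves an
    H(n) <= ln(n) + 1 approximation factor."""
    uncovered = set(universe)
    cover = []
    while uncovered:
        best = max(range(len(subsets)), key=lambda i: len(subsets[i] & uncovered))
        if not subsets[best] & uncovered:
            raise ValueError("universe not coverable by the given subsets")
        cover.append(best)
        uncovered -= subsets[best]
    return cover
```

For an exact solution, the same instance can instead be posed as an integer linear program with one 0/1 variable per subset, as the article describes.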
Proficiency Standards and Cut-Scores for Language Proficiency Tests.
ERIC Educational Resources Information Center
Moy, Raymond H.
1984-01-01
Discusses the problems associated with "grading on a curve," the approach often used for standard setting on language proficiency tests. Proposes four main steps in the setting of a non-arbitrary cut-score. These steps not only establish a proficiency standard checked by external criteria, but also check to see that the test covers the…
Primal-dual techniques for online algorithms and mechanisms
NASA Astrophysics Data System (ADS)
Liaghat, Vahid
An offline algorithm is one that knows the entire input in advance. An online algorithm, however, processes its input in a serial fashion. In contrast to offline algorithms, an online algorithm works in a local fashion and has to make irrevocable decisions without having the entire input. Online algorithms are often not optimal since their irrevocable decisions may turn out to be inefficient after receiving the rest of the input. For a given online problem, the goal is to design algorithms which are competitive against the offline optimal solutions. In a classical offline scenario, it is often common to see a dual analysis of problems that can be formulated as a linear or convex program. Primal-dual and dual-fitting techniques have been successfully applied to many such problems. Unfortunately, the usual tricks fall short in an online setting since an online algorithm must make decisions without knowing even the whole program. In this thesis, we study the competitive analysis of fundamental problems in the literature, such as different variants of online matching and online Steiner connectivity, via online dual techniques. Although there are many generic tools for solving an optimization problem in the offline paradigm, in comparison, much less is known for tackling online problems. The main focus of this work is to design generic techniques for solving integral linear optimization problems where the solution space is restricted via a set of linear constraints. A general family of these problems are online packing/covering problems. Our work shows that for several seemingly unrelated problems, primal-dual techniques can be successfully applied as a unifying approach for analyzing these problems. We believe this leads to generic algorithmic frameworks for solving online problems. In the first part of the thesis, we show the effectiveness of our techniques in stochastic settings and their applications in Bayesian mechanism design.
In particular, we introduce new techniques for solving a fundamental linear optimization problem, namely, the stochastic generalized assignment problem (GAP). This packing problem generalizes various problems such as online matching, ad allocation, bin packing, etc. We furthermore show applications of such results in mechanism design by introducing Prophet Secretary, a novel Bayesian model for online auctions. In the second part of the thesis, we focus on covering problems. We develop the framework of "Disk Painting" for a general class of network design problems that can be characterized by proper functions. This class generalizes the node-weighted and edge-weighted variants of several well-known Steiner connectivity problems. We furthermore design a generic technique for solving the prize-collecting variants of these problems when there exists a dual analysis for the non-prize-collecting counterparts. Hence, we solve the online prize-collecting variants of several network design problems for the first time. Finally, we focus on designing techniques for online problems with mixed packing/covering constraints. We initiate the study of degree-bounded graph optimization problems in the online setting by designing an online algorithm with a tight competitive ratio for the degree-bounded Steiner forest problem. We hope these techniques establish a starting point for the analysis of the important class of online degree-bounded optimization on graphs.
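A minimal sketch of the primal update used in online fractional covering, in the Buchbinder-Naor multiplicative-weights style, gives the flavor of these online primal-dual techniques (the exact update rule and constants here are illustrative, not the thesis's algorithms):

```python
def online_fractional_cover(costs, element_stream):
    """Online fractional set cover.  `costs[s]` is the cost of set s;
    each arriving element is the list of set ids containing it.  When an
    element arrives uncovered, the variables x[s] of its sets are raised
    multiplicatively until it is fractionally covered (sum >= 1)."""
    x = {s: 0.0 for s in costs}
    for sets_of_e in element_stream:
        while sum(x[s] for s in sets_of_e) < 1.0:
            d = len(sets_of_e)
            for s in sets_of_e:
                # multiplicative increase plus a small additive kick-start
                x[s] = x[s] * (1 + 1.0 / costs[s]) + 1.0 / (d * costs[s])
    return x
```

Because each increase is paid for by a matching dual update in the full analysis, the fractional solution stays within an O(log n) factor of the offline optimum.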
ERIC Educational Resources Information Center
Farrell, Albert D.; Sullivan, Terri N.
2004-01-01
Two studies used latent growth-curve analysis to examine the relation between witnessing violence and changes in problem behaviors (drug use, aggression, and delinquency) and attitudes during early adolescence. In Study 1, six waves of data covering 6th to 8th grades were collected from 731 students in urban schools serving mostly African-American…
ERIC Educational Resources Information Center
Goldston, J. W.
This unit introduces analytic solutions of ordinary differential equations. The objective is to enable the student to decide whether a given function solves a given differential equation. Examples of problems from biology and chemistry are covered. Problem sets, quizzes, and a model exam are included, and answers to all items are provided. The…
7 CFR 764.457 - Vendor requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Maintain and use a financial management information system to make financial decisions; (3) Understand and... budget; and (6) Use production records and other production information to identify problems, evaluate... general goal setting, risk management, and planning. (2) Financial management courses, covering all...
A Reading Resource Center: Why and How
ERIC Educational Resources Information Center
Minkoff, Henry
1974-01-01
Hunter College has set up a Reading Resource Center where students receive individualized help in specific problem areas not covered in their reading classes and where teachers can find materials either for their own edification or for use in the classroom. (Author)
ERIC Educational Resources Information Center
Walling, Linda Lucas
1992-01-01
Summarizes federal legislation regarding equal access for students with disabilities and discusses environmental barriers to accessibility in the library media center. Solutions to these design problems are suggested in the following areas: material formats and space requirements; the physical setting, including furniture, floor coverings,…
NASA Astrophysics Data System (ADS)
Gutowitz, Howard
1991-08-01
Cellular automata, dynamic systems in which space and time are discrete, are yielding interesting applications in both the physical and natural sciences. The thirty-four contributions in this book cover many aspects of contemporary studies on cellular automata and include reviews, research reports, and guides to recent literature and available software. Chapters cover mathematical analysis; the structure of the space of cellular automata; learning rules with specified properties; cellular automata in biology, physics, chemistry, and computation theory; and generalizations of cellular automata in neural nets, Boolean nets, and coupled map lattices. Current work on cellular automata may be viewed as revolving around two central and closely related problems: the forward problem and the inverse problem. The forward problem concerns the description of properties of given cellular automata. Properties considered include reversibility, invariants, criticality, fractal dimension, and computational power. The role of cellular automata in computation theory is seen as a particularly exciting venue for exploring parallel computers as theoretical and practical tools in mathematical physics. The inverse problem, an area of study gaining prominence particularly in the natural sciences, involves designing rules that possess specified properties or perform specified tasks. A long-term goal is to develop a set of techniques that can find a rule or set of rules that can reproduce quantitative observations of a physical system. Studies of the inverse problem take up the organization and structure of the set of automata, in particular the parameterization of the space of cellular automata. Optimization and learning techniques, like the genetic algorithm and adaptive stochastic cellular automata, are applied to find cellular automaton rules that model such physical phenomena as crystal growth or perform such adaptive-learning tasks as balancing an inverted pole.
Howard Gutowitz is Collaborateur in the Service de Physique du Solide et Résonance Magnetique, Commissariat à l'Energie Atomique, Saclay, France.
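The "forward problem" for the simplest class of cellular automata fits in a few lines. This elementary-CA stepper (illustrative, not from the book) applies a Wolfram rule number to a two-state, radius-1 lattice with periodic boundaries:

```python
def step(cells, rule):
    """One synchronous update of an elementary (1D, two-state, radius-1)
    cellular automaton under Wolfram rule number `rule`, with periodic
    boundaries.  Bit i of `rule` gives the output for neighborhood i."""
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]
```

Iterating `step` from an initial condition explores forward-problem properties; the inverse problem asks which `rule` produces a desired behavior, which is where the genetic-algorithm searches mentioned above come in.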
Predicting protein-protein interactions from protein domains using a set cover approach.
Huang, Chengbang; Morcos, Faruck; Kanaan, Simon P; Wuchty, Stefan; Chen, Danny Z; Izaguirre, Jesús A
2007-01-01
One goal of contemporary proteome research is the elucidation of cellular protein interactions. Based on currently available protein-protein interaction and domain data, we introduce a novel method, Maximum Specificity Set Cover (MSSC), for the prediction of protein-protein interactions. In our approach, we map the relationship between interactions of proteins and their corresponding domain architectures to a generalized weighted set cover problem. The application of a greedy algorithm provides sets of domain interactions which explain the presence of protein interactions to the largest degree of specificity. Utilizing domain and protein interaction data of S. cerevisiae, MSSC enables prediction of previously unknown protein interactions, links that are well supported by a high tendency of coexpression and functional homogeneity of the corresponding proteins. Focusing on concrete examples, we show that MSSC reliably predicts protein interactions in well-studied molecular systems, such as the 26S proteasome and RNA polymerase II of S. cerevisiae. We also show that the quality of the predictions is comparable to the Maximum Likelihood Estimation while MSSC is faster. This new algorithm and all data sets used are accessible through a Web portal at http://ppi.cse.nd.edu.
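MSSC maps prediction to a generalized weighted set cover solved greedily. The sketch below shows a standard cost-effectiveness greedy for weighted set cover; the actual MSSC specificity scoring differs, so treat this as a generic stand-in:

```python
def greedy_weighted_set_cover(universe, subsets, costs):
    """Greedy for weighted set cover: at each step pick the set
    minimizing cost per newly covered element."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best, best_ratio = None, float('inf')
        for i, s in enumerate(subsets):
            new = len(s & uncovered)
            if new and costs[i] / new < best_ratio:
                best, best_ratio = i, costs[i] / new
        if best is None:
            raise ValueError("universe not coverable")
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen
```

In the MSSC setting the universe is the observed protein interactions and the subsets are candidate domain-domain interactions, so the greedy output is a parsimonious set of domain pairs explaining the observed links.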
Guideline 3: Psychosocial Treatment.
ERIC Educational Resources Information Center
American Journal on Mental Retardation, 2000
2000-01-01
The third in seven sets of guidelines based on the consensus of experts in the treatment of psychiatric and behavioral problems in mental retardation (MR) focuses on psychosocial treatment. Guidelines cover general principles, choosing among psychosocial treatments, severity of MR and psychiatric/behavior symptoms, diagnosable disorders, target…
Vehicle and driver scheduling for public transit.
DOT National Transportation Integrated Search
2009-08-01
The problem of driver scheduling involves the construction of a legal set of shifts, including allowance of overtime, which cover the blocks in a particular vehicle schedule. A shift is the work scheduled to be performed by a driver in one day, w...
Points of View in Problem Solving and Learning: Interactive Microworlds for Instruction.
1985-05-01
writing or mathematics tools, or educational games in afterschool or classroom settings all learned most effectively when they were provided initially with...
Evaluating Environmental Chemistry Textbooks.
ERIC Educational Resources Information Center
Hites, Ronald A.
2001-01-01
A director of the Indiana University Center for Environmental Science Research reviews textbooks on environmental chemistry. Highlights clear writing, intellectual depth, presence of problem sets covering both the qualitative and quantitative aspects of the material, and full coverage of the topics of concern. Discusses the director's own approach…
NASA Astrophysics Data System (ADS)
Phillips, Carolyn L.; Anderson, Joshua A.; Huber, Greg; Glotzer, Sharon C.
2012-05-01
We present filling as a type of spatial subdivision problem similar to covering and packing. Filling addresses the optimal placement of overlapping objects lying entirely inside an arbitrary shape so as to cover the most interior volume. In n-dimensional space, if the objects are polydisperse n-balls, we show that solutions correspond to sets of maximal n-balls. For polygons, we provide a heuristic for finding solutions of maximal disks. We consider the properties of ideal distributions of N disks as N→∞. We note an analogy with energy landscapes.
Millwright Apprenticeship. Related Training Modules. 6.1-6.12 Human Relations.
ERIC Educational Resources Information Center
Lane Community Coll., Eugene, OR.
This packet, part of the instructional materials for the Oregon apprenticeship program for millwright training, contains 12 modules covering human relations. The modules provide information on the following topics: communications skills, feedback, individual strengths, interpersonal conflicts, group problem solving, goal setting and decision…
A Group Theoretic Approach to Metaheuristic Local Search for Partitioning Problems
2005-05-01
A Group Theoretic Approach to Metaheuristic Local Search for Partitioning Problems, by Gary W. Kinney Jr. Dissertation, The University of Texas at Austin, May 2005.
Benchmark problems for numerical implementations of phase field models
Jokisaari, A. M.; Voorhees, P. W.; Guyer, J. E.; ...
2016-10-01
Here, we present the first set of benchmark problems for phase field models that are being developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST). While many scientific research areas use a limited set of well-established software, the growing phase field community continues to develop a wide variety of codes and lacks benchmark problems to consistently evaluate the numerical performance of new implementations. Phase field modeling has become significantly more popular as computational power has increased and is now becoming mainstream, driving the need for benchmark problems to validate and verify new implementations. We follow the example set by the micromagnetics community to develop an evolving set of benchmark problems that test the usability, computational resources, numerical capabilities and physical scope of phase field simulation codes. In this paper, we propose two benchmark problems that cover the physics of solute diffusion and growth and coarsening of a second phase via a simple spinodal decomposition model and a more complex Ostwald ripening model. We demonstrate the utility of benchmark problems by comparing the results of simulations performed with two different adaptive time stepping techniques, and we discuss the needs of future benchmark problems. The development of benchmark problems will enable the results of quantitative phase field models to be confidently incorporated into integrated computational materials science and engineering (ICME), an important goal of the Materials Genome Initiative.
The Approximability of Partial Vertex Covers in Trees.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mkrtchyan, Vahan; Parekh, Ojas D.; Segev, Danny
Motivated by applications in risk management of computational systems, we focus our attention on a special case of the partial vertex cover problem, where the underlying graph is assumed to be a tree. Here, we consider four possible versions of this setting, depending on whether vertices and edges are weighted or not. Two of these versions, where edges are assumed to be unweighted, are known to be polynomial-time solvable (Gandhi, Khuller, and Srinivasan, 2004). However, the computational complexity of this problem with weighted edges, and possibly with weighted vertices, has not been determined yet. The main contribution of this paper is to resolve these questions, by fully characterizing which variants of partial vertex cover remain intractable in trees, and which can be efficiently solved. In particular, we propose a pseudo-polynomial DP-based algorithm for the most general case of having weights on both edges and vertices, which is proven to be NP-hard. This algorithm provides a polynomial-time solution method when weights are limited to edges, and combined with additional scaling ideas, leads to an FPTAS for the general case. A secondary contribution of this work is to propose a novel way of using centroid decompositions in trees, which could be useful in other settings as well.
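The flavor of such a DP can be seen in the polynomially solvable case of unweighted edges and weighted vertices: a tree DP over (is the subtree root chosen, how many subtree edges are covered). This is a minimal sketch of the standard approach for that easy case, not the authors' pseudo-polynomial algorithm:

```python
import sys
from collections import defaultdict

def partial_vertex_cover_tree(edges, weight, k):
    """Minimum-weight vertex set covering >= k edges of a tree rooted at
    vertex 0.  dp[t][j] = min weight in the current subtree, where t = 1
    iff the subtree root is chosen and j is the number of covered
    subtree edges (capped at k)."""
    INF = float('inf')
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    sys.setrecursionlimit(10000)

    def dfs(v, parent):
        dp = [[INF] * (k + 1) for _ in range(2)]
        dp[0][0] = 0
        dp[1][0] = weight[v]
        for c in adj[v]:
            if c == parent:
                continue
            dc = dfs(c, v)
            ndp = [[INF] * (k + 1) for _ in range(2)]
            for t in range(2):
                for j in range(k + 1):
                    if dp[t][j] == INF:
                        continue
                    for tc in range(2):
                        for jc in range(k + 1):
                            if dc[tc][jc] == INF:
                                continue
                            cov = 1 if (t or tc) else 0  # edge (v, c) covered?
                            nj = min(k, j + jc + cov)
                            cost = dp[t][j] + dc[tc][jc]
                            if cost < ndp[t][nj]:
                                ndp[t][nj] = cost
            dp = ndp
        return dp

    root = dfs(0, -1)
    return min(root[0][k], root[1][k])
```

With edge weights, j would range over covered edge *weight* instead of edge count, which is what makes the general DP pseudo-polynomial and motivates the scaling/FPTAS step.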
Human Sexuality and Relationship Counseling: A Workshop for Paraprofessionals.
ERIC Educational Resources Information Center
Rosenman, Martin F.
A four-day workshop dealing with problems related to human sexuality and relationship counseling followed the assumption that impact and desensitization at the personal level increases the participants' ability to apply the material covered to their particular counseling setting. Initially, desensitization was facilitated through the use of…
What if ? On alternative conceptual models and the problem of their implementation
NASA Astrophysics Data System (ADS)
Neuberg, Jurgen
2015-04-01
Seismic and other monitoring techniques rely on a set of conceptual models on the base of which data sets can be interpreted. In order to do this on an operational level in volcano observatories these models need to be tested and ready for an interpretation in a timely manner. Once established, scientists in charge advising stakeholders and decision makers often stick firmly to these models to avoid confusion by giving alternative versions of interpretations to non-experts. This talk gives an overview of widely accepted conceptual models to interpret seismic and deformation data, and highlights in a few case studies some of the arising problems. Aspects covered include knowledge transfer between research institutions and observatories, data sharing, the problem of up-taking advice, and some hidden problems which turn out to be much more critical in assessing volcanic hazard than the actual data interpretation.
A Projection free method for Generalized Eigenvalue Problem with a nonsmooth Regularizer.
Hwang, Seong Jae; Collins, Maxwell D; Ravi, Sathya N; Ithapu, Vamsi K; Adluru, Nagesh; Johnson, Sterling C; Singh, Vikas
2015-12-01
Eigenvalue problems are ubiquitous in computer vision, covering a very broad spectrum of applications ranging from estimation problems in multi-view geometry to image segmentation. Few other linear algebra problems have a more mature set of numerical routines available and many computer vision libraries leverage such tools extensively. However, the ability to call the underlying solver only as a "black box" can often become restrictive. Many 'human in the loop' settings in vision frequently exploit supervision from an expert, to the extent that the user can be considered a subroutine in the overall system. In other cases, there is additional domain knowledge, side or even partial information that one may want to incorporate within the formulation. In general, regularizing a (generalized) eigenvalue problem with such side information remains difficult. Motivated by these needs, this paper presents an optimization scheme to solve generalized eigenvalue problems (GEP) involving a (nonsmooth) regularizer. We start from an alternative formulation of GEP where the feasibility set of the model involves the Stiefel manifold. The core of this paper presents an end-to-end stochastic optimization scheme for the resultant problem. We show how this general algorithm enables improved statistical analysis of brain imaging data where the regularizer is derived from other 'views' of the disease pathology, involving clinical measurements and other image-derived representations.
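As a reminder of the underlying problem, the unregularized GEP det(A - λB) = 0 has a closed form in the 2x2 case. The toy below (not the paper's method) expands that determinant into a quadratic in λ:

```python
def gen_eigvals_2x2(A, B):
    """Generalized eigenvalues of the 2x2 pencil det(A - lam*B) = 0,
    expanded into a quadratic a*lam^2 + b*lam + c.  Assumes two real
    roots, e.g. symmetric A with positive definite B."""
    a = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    b = -(A[0][0] * B[1][1] + A[1][1] * B[0][0]
          - A[0][1] * B[1][0] - A[1][0] * B[0][1])
    c = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = (b * b - 4 * a * c) ** 0.5
    return sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
```

No such closed form survives once a nonsmooth regularizer is added to the objective, which is why the paper resorts to stochastic optimization over the Stiefel manifold.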
The In-Transit Vigilant Covering Tour Problem of Routing Unmanned Ground Vehicles
2012-08-01
of vertices in both vertex sets V and W, rather than exclusively in the vertex set V. A metaheuristic algorithm which follows the Greedy Randomized...window (VRPTW) approach, with the application of a Java-encoded metaheuristic, was used [O'Rourke et al., 2001] for the dynamic routing of UAVs. Harder et...minimize the two conflicting objectives, tour length and coverage distance, via a multi-objective evolutionary algorithm. This approach avoids a
Three Preparatory Schools' Syllabus Designs in the Turkish Republic of Northern Cyprus
ERIC Educational Resources Information Center
Bensen, Hanife; Silman, Fatos
2012-01-01
Problem Statement: Curriculum development involves the process of planning, setting up and running courses. The knowledge about designing syllabi, making choices of content and materials and assessing student performances plays an important role in curriculum development. Syllabus design is one aspect of curriculum development. It covers the kind…
The Two and a Half Learning Model: A Consequence of Academic Dishonesty
ERIC Educational Resources Information Center
Sahin, Mehmet
2016-01-01
Academic dishonesty has been regarded as a problem but not a visible and declared one in every type of educational setting from elementary school to graduate level all over the world. Dishonesty or misconduct in the academic realm covers plagiarism, fabrication, deception, cheating, bribery, sabotage, professorial misconduct and impersonation.…
Aerobic Digestion. Biological Treatment Process Control. Instructor's Guide.
ERIC Educational Resources Information Center
Klopping, Paul H.
This unit on aerobic sludge digestion covers the theory of the process, system components, factors that affect the process performance, standard operational concerns, indicators of steady-state operations, and operational problems. The instructor's guide includes: (1) an overview of the unit; (2) lesson plan; (3) lecture outline (keyed to a set of…
"Project Outreach," Final Report, December 31, 1979.
ERIC Educational Resources Information Center
Latzer, Robert M.
Fall 1979 activities for Project Outreach, a faculty development program at Jersey City State College, are described. Problems involved in carrying out these activities, a statement of expenditures and balances, and a set of recommendations are covered. Four areas of activity planned at the college were: a second series of study skills seminars, a…
Integer Linear Programming for Constrained Multi-Aspect Committee Review Assignment
Karimzadehgan, Maryam; Zhai, ChengXiang
2011-01-01
Automatic review assignment can significantly improve the productivity of many people such as conference organizers, journal editors and grant administrators. A general setup of the review assignment problem involves assigning a set of reviewers on a committee to a set of documents to be reviewed under the constraint of review quota so that the reviewers assigned to a document can collectively cover multiple topic aspects of the document. No previous work has addressed such a setup of committee review assignments while also considering matching multiple aspects of topics and expertise. In this paper, we tackle the problem of committee review assignment with multi-aspect expertise matching by casting it as an integer linear programming problem. The proposed algorithm can naturally accommodate any probabilistic or deterministic method for modeling multiple aspects to automate committee review assignments. Evaluation using a multi-aspect review assignment test set constructed using ACM SIGIR publications shows that the proposed algorithm is effective and efficient for committee review assignments based on multi-aspect expertise matching. PMID:22711970
Optimization Strategies for Sensor and Actuator Placement
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Kincaid, Rex K.
1999-01-01
This paper provides a survey of actuator and sensor placement problems from a wide range of engineering disciplines and a variety of applications. Combinatorial optimization methods are recommended as a means for identifying sets of actuators and sensors that maximize performance. Several sample applications from NASA Langley Research Center, such as active structural acoustic control, are covered in detail. Laboratory and flight tests of these applications indicate that actuator and sensor placement methods are effective and important. Lessons learned in solving these optimization problems can guide future research.
Solving the influence maximization problem reveals regulatory organization of the yeast cell cycle.
Gibbs, David L; Shmulevich, Ilya
2017-06-01
The Influence Maximization Problem (IMP) aims to discover the set of nodes with the greatest influence on network dynamics. The problem has previously been applied in epidemiology and social network analysis. Here, we demonstrate the application to cell cycle regulatory network analysis for Saccharomyces cerevisiae. Fundamentally, gene regulation is linked to the flow of information. Therefore, our implementation of the IMP was framed as an information theoretic problem using network diffusion. Utilizing more than 26,000 regulatory edges from YeastMine, gene expression dynamics were encoded as edge weights using time lagged transfer entropy, a method for quantifying information transfer between variables. By picking a set of source nodes, a diffusion process covers a portion of the network. The size of the network cover relates to the influence of the source nodes. The set of nodes that maximizes influence is the solution to the IMP. By solving the IMP over different numbers of source nodes, an influence ranking on genes was produced. The influence ranking was compared to other metrics of network centrality. Although the top genes from each centrality ranking contained well-known cell cycle regulators, there was little agreement and no clear winner. However, it was found that influential genes tend to directly regulate or sit upstream of genes ranked by other centrality measures. The influential nodes act as critical sources of information flow, potentially having a large impact on the state of the network. Biological events that affect influential nodes and thereby affect information flow could have a strong effect on network dynamics, potentially leading to disease. Code and data can be found at: https://github.com/gibbsdavidl/miergolf.
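The IMP objective described above is a coverage-maximization problem, and the standard baseline for it is the greedy rule: repeatedly add the source node whose diffusion cover gains the most still-uncovered nodes. A minimal sketch with invented toy data (not the yeast network or the authors' miergolf code):

```python
# Hypothetical sketch of greedy influence maximization: 'cover' maps each
# candidate source node to the set of nodes its diffusion process reaches
# (toy data, not the YeastMine regulatory network).
cover = {
    "A": {"A", "B", "C", "D"},
    "B": {"B", "C"},
    "C": {"C", "E", "F"},
    "D": {"D", "G"},
}

def greedy_sources(cover, k):
    chosen, covered = [], set()
    for _ in range(k):
        # Pick the source adding the most newly covered nodes.
        best = max(cover, key=lambda s: len(cover[s] - covered))
        if not cover[best] - covered:
            break  # no remaining gain
        chosen.append(best)
        covered |= cover[best]
    return chosen, covered

sources, covered = greedy_sources(cover, k=2)
print(sources)       # -> ['A', 'C']
print(len(covered))  # -> 6
```

Because the network-cover objective is monotone submodular, this greedy strategy carries the classical (1 - 1/e) approximation guarantee, which is why it is the usual starting point for IMP solvers.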
An analysis of IGBP global land-cover characterization process
Loveland, Thomas R.; Zhu, Zhiliang; Ohlen, Donald O.; Brown, Jesslyn F.; Reed, Bradley C.; Yang, Limin
1999-01-01
The International Geosphere-Biosphere Programme (IGBP) has called for the development of improved global land-cover data for use in increasingly sophisticated global environmental models. To meet this need, the staff of the U.S. Geological Survey and the University of Nebraska-Lincoln developed and applied a global land-cover characterization methodology using 1992-1993 1-km resolution Advanced Very High Resolution Radiometer (AVHRR) and other spatial data. The methodology, based on unsupervised classification with extensive postclassification refinement, yielded a multi-layer database consisting of eight land-cover data sets, descriptive attributes, and source data. An independent IGBP accuracy assessment reports a global accuracy of 73.5 percent, and continental results vary from 63 percent to 83 percent. Although data quality, methodology, interpreter performance, and logistics affected the results, significant problems were associated with the relationship between AVHRR data and fine-scale, spectrally similar land-cover patterns in complex natural or disturbed landscapes.
NASA Astrophysics Data System (ADS)
Krinitskiy, Mikhail; Sinitsyn, Alexey
2017-04-01
Shortwave radiation is an important component of the surface heat budget over sea and land. To estimate it, accurate observations of cloud conditions are needed, including total cloud cover and the spatial and temporal cloud structure. While cloud conditions are routinely observed visually, building accurate SW radiation parameterizations requires that cloud structure also be quantified with precise instrumental measurements. Although several state-of-the-art land-based cloud cameras already satisfy researchers' needs, their major disadvantage is the inaccuracy of all-sky image processing algorithms, which typically yield uncertainties of 2-4 octa in cloud cover estimates, with a resulting true-scoring cloud cover accuracy of about 7%. Moreover, none of these algorithms determines cloud types. We developed an approach to estimating cloud cover and structure that provides much more accurate estimates and also allows additional characteristics to be measured. This method is based on a synthetic controlling index, the "grayness rate index", which we introduced in 2014. Since then this index has demonstrated high efficiency when used together with the "background sunburn effect suppression" technique to detect thin clouds. This made it possible to significantly increase the accuracy of total cloud cover estimation across various sky image states using this extension of the routine algorithm. Errors in the cloud cover estimates decreased significantly, resulting in a mean squared error of about 1.5 octa. The resulting true-scoring accuracy is more than 38%. The main source of uncertainty in this approach is errors in determining the state of the solar disk. While a deep neural network approach lets us estimate the solar disk state with 94% accuracy, the final result of total cloud estimation is still not satisfying.
To solve this problem completely, we applied a set of machine learning algorithms directly to the problem of total cloud cover estimation. The accuracy of this approach varies with the choice of algorithm; deep neural networks demonstrated the best accuracy, at more than 96%. We will demonstrate several approaches and the most influential statistical features of all-sky images that let the algorithm reach this high accuracy. Using our new optical package, a set of over 480,000 samples was collected during several sea missions in 2014-2016, along with concurrent standard human-observed and instrumentally recorded meteorological parameters. We will present the results of the field measurements and discuss some remaining problems and the potential for further development of the machine learning approach.
Towards a Framework for Modeling Space Systems Architectures
NASA Technical Reports Server (NTRS)
Shames, Peter; Skipper, Joseph
2006-01-01
Topics covered include: 1) Statement of the problem: a) Space system architecture is complex; b) Existing terrestrial approaches must be adapted for space; c) Need a common architecture methodology and information model; d) Need appropriate set of viewpoints. 2) Requirements on a space systems model. 3) Model Based Engineering and Design (MBED) project: a) Evaluated different methods; b) Adapted and utilized RASDS & RM-ODP; c) Identified useful set of viewpoints; d) Did actual model exchanges among selected subset of tools. 4) Lessons learned & future vision.
Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms
NASA Astrophysics Data System (ADS)
Kwok, Kwan S.; Driessen, Brian J.; Phillips, Cynthia A.; Tovey, Craig A.
1997-09-01
This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. We wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which we must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently. Solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents in each were analyzed using this method. We have found these mobile robot problems to be a very interesting application of network optimization methods, and we expect this to be a fruitful area for future research.
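The min-max objective above is the bottleneck assignment problem, which can be solved by binary-searching over candidate distances and checking, at each threshold, whether a perfect robot-to-slot matching exists. A small sketch with invented distances (the augmenting-path matching is a generic technique, not the paper's specific solver):

```python
# Sketch of the min-max (bottleneck) assignment: choose a robot-to-slot
# assignment minimizing the maximum travel distance. Binary search over
# candidate distances; feasibility is checked with a simple augmenting-path
# bipartite matching. Toy distance matrix, not the paper's 100-robot cases.

def full_matching_exists(dist, limit):
    # Can every slot be matched to a distinct robot using only edges <= limit?
    n = len(dist)
    match = [-1] * n  # match[robot] = slot assigned to that robot

    def augment(slot, seen):
        for r in range(n):
            if dist[r][slot] <= limit and r not in seen:
                seen.add(r)
                if match[r] == -1 or augment(match[r], seen):
                    match[r] = slot
                    return True
        return False

    return all(augment(s, set()) for s in range(n))

def bottleneck_assignment(dist):
    candidates = sorted({d for row in dist for d in row})
    lo, hi = 0, len(candidates) - 1
    while lo < hi:  # binary search on the maximum allowed distance
        mid = (lo + hi) // 2
        if full_matching_exists(dist, candidates[mid]):
            hi = mid
        else:
            lo = mid + 1
    return candidates[lo]

# dist[r][s] = distance from robot r to grid slot s (invented values)
dist = [[4.0, 9.0, 3.0],
        [2.0, 7.0, 8.0],
        [6.0, 5.0, 1.0]]
print(bottleneck_assignment(dist))  # -> 5.0
```

Note the contrast with the common min-sum assignment: here the assignment (r0→s2, r1→s0, r2→s1) is optimal because its worst single leg (5.0) beats every alternative's worst leg, even if its total were not minimal.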
NASA Astrophysics Data System (ADS)
Takabe, Satoshi; Hukushima, Koji
2016-05-01
Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdős-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in a common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α=2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c=e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c=1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α≥3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c=e/(α-1) where the replica symmetry is broken.
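The IP/LP gap for vertex cover is easy to see on a toy instance. Because the vertex-cover LP is half-integral (its optimum is attained at values in {0, 1/2, 1}), both optima of a small graph can be found by brute force; this sketch is illustrative only and is unrelated to the paper's random-graph analysis.

```python
from itertools import product

# Minimum vertex cover on a triangle: the IP optimum is 2, while the LP
# relaxation can assign 1/2 to every vertex for a value of 3/2. By the
# half-integrality of the vertex-cover LP, brute force over {0, 0.5, 1}
# suffices on small graphs (illustrative toy example only).
edges = [(0, 1), (1, 2), (0, 2)]
n = 3

def optimum(values):
    # Minimize sum(x) subject to x[u] + x[v] >= 1 for every edge (u, v),
    # with each x[i] restricted to the given value set.
    best = float("inf")
    for x in product(values, repeat=n):
        if all(x[u] + x[v] >= 1 for u, v in edges):
            best = min(best, sum(x))
    return best

print(optimum([0, 1]))       # IP optimum -> 2
print(optimum([0, 0.5, 1]))  # LP optimum -> 1.5
```

The odd cycle is exactly the kind of structure behind the LP's failure regime: the all-halves solution is feasible and strictly cheaper than any integral cover.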
ERIC Educational Resources Information Center
King, Michael A.
2009-01-01
Business intelligence derived from data warehousing and data mining has become one of the most strategic management tools today, providing organizations with long-term competitive advantages. Business school curriculums and popular database textbooks cover data warehousing, but the examples and problem sets typically are small and unrealistic. The…
Dare To Be You Program. Leaders' Manual. 2nd Edition.
ERIC Educational Resources Information Center
Miller-Heyl, Janet Lynne; Shores, Wanda, Ed.
This manual contains a complete set of program formats, activities, and references covered by the Dare to be You Training Program. The program, piloted in a rural Colorado County, is designed for use by parents, youth, teachers, church, and service organizations to help them deal with adolescents' problems and to assist youth in developing skills…
ERIC Educational Resources Information Center
Edwards, Dan
A model is provided for an inservice workshop to provide systematic project review, conduct individual volunteer support and problem solving, and conduct future work planning. Information on model use and general instructions are presented. Materials are provided for 12 sessions covering a 5-day period. The first session on climate setting and…
Solving Partial Differential Equations on Overlapping Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henshaw, W D
2008-09-22
We discuss the solution of partial differential equations (PDEs) on overlapping grids. This is a powerful technique for efficiently solving problems in complex, possibly moving, geometry. An overlapping grid consists of a set of structured grids that overlap and cover the computational domain. By allowing the grids to overlap, grids for complex geometries can be more easily constructed. The overlapping grid approach can also be used to remove coordinate singularities by, for example, covering a sphere with two or more patches. We describe the application of the overlapping grid approach to a variety of different problems. These include the solution of incompressible fluid flows with moving and deforming geometry, the solution of high-speed compressible reactive flow with rigid bodies using adaptive mesh refinement (AMR), and the solution of the time-domain Maxwell's equations of electromagnetism.
NASA Astrophysics Data System (ADS)
Conrad, Jon M.
2000-01-01
Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation and optimization of resource and environmental systems. The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems. Conrad is co-author of a previous book for the Press on the subject for graduate students. The approach is extremely student-friendly, giving students the tools to apply research results to actual environmental issues.
Resource-constrained scheduling with hard due windows and rejection penalties
NASA Astrophysics Data System (ADS)
Garcia, Christopher
2016-09-01
This work studies a scheduling problem where each job must be either accepted and scheduled to complete within its specified due window, or rejected altogether. Each job has a certain processing time and contributes a certain profit if accepted or penalty cost if rejected. There is a set of renewable resources, and no resource limit can be exceeded at any time. Each job requires a certain amount of each resource when processed, and the objective is to maximize total profit. A mixed-integer programming formulation and three approximation algorithms are presented: a priority rule heuristic, an algorithm based on the metaheuristic for randomized priority search and an evolutionary algorithm. Computational experiments comparing these four solution methods were performed on a set of generated benchmark problems covering a wide range of problem characteristics. The evolutionary algorithm outperformed the other methods in most cases, often significantly, and never significantly underperformed any method.
NASA Astrophysics Data System (ADS)
Eyono Obono, S. D.; Basak, Sujit Kumar
2011-12-01
The general formulation of the assignment problem consists in the optimal allocation of a given set of tasks to a workforce. This problem is covered by the existing literature for domains such as distributed databases, distributed systems, transportation, packet radio networks, IT outsourcing, and teaching allocation. This paper presents a new version of the assignment problem for the allocation of academic tasks to staff members in departments with long leave opportunities. It presents the description of a workload allocation scheme and its algorithm, for the allocation of an equitable number of tasks in academic departments where long leaves are necessary.
Environmental trade-offs of tunnels vs cut-and-cover subways
Walton, M.
1978-01-01
Heavy construction projects in cities entail two kinds of cost - internal cost, which can be defined in terms of payments from one set of parties to another, and external cost, which is the cost borne by the community at large as the result of disutilities entailed in construction and operation. Environmental trade-offs involve external costs, which are commonly difficult to measure. Cut-and-cover subway construction probably entails higher external and internal cost than deep tunnel construction in many urban geological environments, but uncertainty concerning the costs and environmental trade-offs of tunneling leads to limited and timid use of tunneling by American designers. Thus uncertainty becomes a major trade-off which works against tunneling. The reverse is true in Sweden after nearly 30 years of subway construction. Econometric methods for measuring external costs exist in principle, but are limited in application. Economic theory based on market pressure does not address the real problem of urban environmental trade-offs. Nevertheless, the problem of uncertainty can be addressed by comparative studies of estimated and as-built costs of cut-and-cover vs tunnel projects and a review of environmental issues associated with such construction. Such a study would benefit the underground construction industry and the design of transportation systems. It would also help solve an aspect of the urban problem. © 1978.
UMAP Modules-Units 71, 72, 73, 74, 75, 81-83, 234.
ERIC Educational Resources Information Center
Horelick, Brindell; And Others
The first four units cover aspects of medical applications of calculus: 71-Measuring Cardiac Output; 72-Prescribing Safe and Effective Dosage; 73-Epidemics; and 74-Tracer Methods in Permeability. All units include a set of exercises and answers to at least some of the problems. Unit 72 also contains a model exam and answers to this exam. The fifth…
Branded for Life: The Case of Detained Black Youth.
ERIC Educational Resources Information Center
Swan, L. Alex
The concept represented by the term juvenile delinquency covers a wide range of activities and a complex set of juvenile offenses. Moreover, the concept is vague and its definition is often left to the discretion of law enforcement agencies and the courts. A problem for black youth is that their acts are assessed by outsiders who have the power to…
ERIC Educational Resources Information Center
Pragma Corp., Falls Church, VA.
This manual is intended to assist Peace Corps trainers in providing inservice technical training in small enterprise development. The following topics are covered: expectations and sharing of resources, problem analysis as a part of project identification, procedures in setting goals and objectives, steps in identifying project resources, the…
Research in the Child Psychiatric and Guidance Clinics: Supplementary Bibliography II (1972).
ERIC Educational Resources Information Center
Klein, Zanvel E.
This bibliography is the third in a series of compilations of research related to pre-adolescent problems commonly treated through outpatient mental health settings such as clinics and schools. The original document (ED 073 849) covered literature up to 1971, and the first supplement (PS 007 263) focused on research that appeared in 1971. This,…
ERIC Educational Resources Information Center
Kieren, Thomas E.
This last paper in a set of four reviews research on a wide variety of computer applications in the mathematics classroom. It covers computer-based instruction, especially drill-and-practice and tutorial modes; computer-managed instruction; and computer-augmented problem-solving. Analytical comments on the findings and status of the research are…
Kirschneck, Michaela; Sabariego, Carla; Singer, Susanne; Tschiesner, Uta
2014-07-01
The International Classification of Functioning, Disability and Health Core Set for Head and Neck Cancer (ICF-HNC) covers the typical spectrum of problems in functioning in head and neck cancer. This study is part of a multistep process to develop practical guidelines in Germany. The purpose of this study was to identify instruments for the assessment of functioning using the ICF-HNC as reference. Four Delphi surveys with physicians, physiotherapists, psychologists, and social workers were performed to identify which aspects of the ICF-HNC are being treated and which assessment tools are recommended for the assessment of functioning. Ninety-seven percent of the categories of the ICF-HNC were treated by the healthcare professionals participating in the current study. Altogether, 33 assessment tools were recommended for therapy monitoring, food intake, pain, further organic problems/laboratory tests, and psychosocial areas. Although the ICF-HNC is currently being implemented by head and neck cancer experts, several areas are not covered regularly. Additionally, validated tools were rarely recommended. Copyright © 2013 Wiley Periodicals, Inc.
Kaneko, Makoto; Ohta, Ryuichi; Nago, Naoki; Fukushi, Motoharu; Matsushima, Masato
2017-09-13
The Japanese health care system has yet to establish structured training for primary care physicians; therefore, physicians who received an internal medicine-based training program continue to play a principal role in the primary care setting. To promote the development of a more efficient primary health care system, assessing its current status with regard to the spectrum of patients' reasons for encounters (RFEs) and health problems is an important step. Recognizing the proportions of patients' RFEs and health problems that are not generally covered by an internist can provide valuable information to promote the development of a primary care physician-centered system. We conducted a systematic review in which we searched six databases (PubMed, the Cochrane Library, Google Scholar, Ichushi-Web, JDreamIII and CiNii) for observational studies in Japan coded by the International Classification of Health Problems in Primary Care (ICHPPC) and the International Classification of Primary Care (ICPC) up to March 2015. We employed population density as an index of accessibility. We calculated Spearman's rank correlation coefficient to examine the correlation between the proportion of "non-internal medicine-related" RFEs and health problems in each study area and the population density. We found 17 studies with diverse designs and settings. Among these studies, "non-internal medicine-related" RFEs, which were not thought to be covered by internists, ranged from about 4% to 40%. In addition, "non-internal medicine-related" health problems ranged from about 10% to 40%. However, no significant correlation was found between population density and the proportion of "non-internal medicine-related" RFEs and health problems. This is the first systematic review on RFEs and health problems coded by ICHPPC and ICPC undertaken to reveal the diversity of health problems in Japanese primary care.
These results suggest that primary care physicians in some rural areas of Japan need to be able to deal with "non-internal-medicine-related" RFEs and health problems, and that a curriculum including practical non-internal-medicine-related training is likely to be important.
A heuristic for deriving the optimal number and placement of reconnaissance sensors
NASA Astrophysics Data System (ADS)
Nanda, S.; Weeks, J.; Archer, M.
2008-04-01
A key to mastering asymmetric warfare is the acquisition of accurate intelligence on adversaries and their assets in urban and open battlefields. To achieve this, one needs adequate numbers of tactical sensors placed in locations that optimize coverage, where optimality means covering a given area of interest with the fewest sensors, or covering the largest possible subsection of an area of interest with a fixed set of sensors. Unfortunately, neither problem is known to admit a polynomial-time algorithm, and therefore the placement of such sensors must rely on intelligent heuristics instead. In this paper, we present a scheme implemented on parallel SIMD processing architectures to yield significantly faster results, and that is highly scalable with respect to dynamic changes in the area of interest. Furthermore, the solution to the first problem immediately translates to serve as a solution to the latter if and when any sensors are rendered inoperable.
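The first variant above (cover a given area with the fewest sensors) is an instance of set cover, and the usual fallback is the greedy ln(n)-approximation. A serial toy sketch follows; candidate sites and sensor footprints are invented, and the paper's SIMD parallelization is not shown.

```python
# Greedy set-cover heuristic: cover every grid point in the area of
# interest with as few sensors as possible, repeatedly placing the sensor
# whose footprint covers the most still-unguarded points.
# Sites and footprints are toy data, not the paper's scheme.
footprint = {
    "site1": {(0, 0), (0, 1), (1, 0), (1, 1)},
    "site2": {(1, 1), (1, 2)},
    "site3": {(2, 0), (2, 1), (2, 2)},
    "site4": {(0, 2), (1, 2)},
}
area = set().union(*footprint.values())  # the region to be guarded

def min_sensors(footprint, area):
    placed, covered = [], set()
    while covered != area:
        # Pick the site covering the most still-unguarded points.
        site = max(footprint, key=lambda s: len(footprint[s] - covered))
        if not footprint[site] - covered:
            raise ValueError("area cannot be fully covered")
        placed.append(site)
        covered |= footprint[site]
    return placed

print(min_sensors(footprint, area))  # -> ['site1', 'site3', 'site4']
```

Note the greedy choice in the last step: "site4" beats "site2" because it adds two unguarded points rather than one, even though both overlap the already-covered region.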
NASA Astrophysics Data System (ADS)
Li, Youping; Lu, Jinsong; Cheng, Jian; Yin, Yongzhen; Wang, Jianlan
2017-04-01
Based on a summary of the rules for vibration measurement of hydro-generator sets in the relevant standards, the key issues of vibration measurement, such as measurement modes and transducer selection, are illustrated. In addition, the problems existing in vibration measurement are pointed out. Acquisition data for head cover vertical vibration, obtained with a seismic transducer and an eddy current transducer respectively during site hydraulic turbine performance tests as the upstream reservoir level rose at a certain hydraulic power plant, are compared. The difference between the data obtained by the two types of transducers and the potential reasons are presented. Based on this analysis, the application conditions of seismic transducers and eddy current transducers for hydro-generator set vibration measurement are given. Research topics related to this subject that deserve further attention are suggested.
Huang, Jia Hang; Liu, Jin Fu; Lin, Zhi Wei; Zheng, Shi Qun; He, Zhong Sheng; Zhang, Hui Guang; Li, Wen Zhou
2017-01-01
Designing the nature reserves is an effective approach to protecting biodiversity. The traditional approaches to designing the nature reserves could only identify the core area for protecting the species without specifying an appropriate land area of the nature reserve. The site selection approaches, which are based on mathematical model, can select part of the land from the planning area to compose the nature reserve and to protect specific species or ecosystem. They are useful approaches to alleviating the contradiction between ecological protection and development. The existing site selection methods do not consider the ecological differences between each unit and has the bottleneck of computational efficiency in optimization algorithm. In this study, we first constructed the ecological value assessment system which was appropriated for forest ecosystem and that was used for calculating ecological value of Daiyun Mountain and for drawing its distribution map. Then, the Ecological Set Covering Problem (ESCP) was established by integrating the ecological values and then the Space-ecology Set Covering Problem (SSCP) was generated based on the spatial compactness of ESCP. Finally, the STS algorithm which possessed good optimizing performance was utilized to search the approximate optimal solution under diverse protection targets, and the optimization solution of the built-up area of Daiyun Mountain was proposed. According to the experimental results, the difference of ecological values in the spatial distribution was obvious. The ecological va-lue of selected sites of ESCP was higher than that of SCP. SSCP could aggregate the sites with high ecological value based on ESCP. From the results, the level of the aggregation increased with the weight of the perimeter. We suggested that the range of the existing reserve could be expanded for about 136 km 2 and the site of Tsuga longibracteata should be included, which was located in the northwest of the study area. 
Our research aimed at providing an optimization scheme for the sustai-nable development of Daiyun Mountain nature reserve and the optimal allocation of land resource, and a novel idea for designing the nature reserve of forest ecosystem in China.
Simplex-stochastic collocation method with improved scalability
NASA Astrophysics Data System (ADS)
Edeling, W. N.; Dwight, R. P.; Cinnella, P.
2016-04-01
The Simplex-Stochastic Collocation (SSC) method is a robust tool used to propagate uncertain input distributions through a computer code. However, it becomes prohibitively expensive for problems with more than five dimensions. The main purpose of this paper is to identify bottlenecks and to improve upon this poor scalability. In order to do so, we propose an alternative interpolation stencil technique based upon the Set-Covering problem, and we integrate the SSC method into the High-Dimensional Model-Reduction framework. In addition, we address the issue of ill-conditioned sample matrices, and we present an analytical map to facilitate uniformly-distributed simplex sampling.
Combinatorial therapy discovery using mixed integer linear programming.
Pang, Kaifang; Wan, Ying-Wooi; Choi, William T; Donehower, Lawrence A; Sun, Jingchun; Pant, Dhruv; Liu, Zhandong
2014-05-15
Combinatorial therapies play increasingly important roles in combating complex diseases. Owing to the huge cost associated with experimental methods of identifying optimal drug combinations, computational approaches can provide a guide to limit the search space and reduce cost. However, few computational approaches have been developed for this purpose, and thus there is a great need for new algorithms for drug combination prediction. Here we propose formulating the optimal combinatorial therapy problem as two complementary mathematical models, Balanced Target Set Cover (BTSC) and Minimum Off-Target Set Cover (MOTSC). Given a disease gene set, BTSC seeks a balanced solution that maximizes the coverage of the disease genes and minimizes the off-target hits at the same time. MOTSC seeks full coverage of the disease gene set while minimizing the off-target set. Through simulation, both BTSC and MOTSC demonstrated a much faster running time than exhaustive search with the same accuracy. When applied to real disease gene sets, our algorithms not only identified known drug combinations, but also predicted novel drug combinations that are worth further testing. In addition, we developed a web-based tool to allow users to iteratively search for optimal drug combinations given a user-defined gene set. Our tool is freely available for noncommercial use at http://www.drug.liuzlab.org/. zhandong.liu@bcm.edu Supplementary data are available at Bioinformatics online.
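The MOTSC objective can be stated concretely with a toy exhaustive search: among all drug combinations whose targets fully cover the disease gene set, pick one with the fewest off-target hits. The drug names and gene sets below are invented, and the paper's actual algorithms use mixed integer linear programming rather than enumeration, which is tractable only for tiny instances:

```python
# Brute-force Minimum Off-Target Set Cover: require full coverage of the
# disease genes, then minimize |targets hit outside the disease set|.
from itertools import combinations

def motsc_bruteforce(disease_genes, drug_targets):
    disease = set(disease_genes)
    best = None  # (off_target_count, combo)
    drugs = list(drug_targets)
    for r in range(1, len(drugs) + 1):
        for combo in combinations(drugs, r):
            hit = set().union(*(drug_targets[d] for d in combo))
            if disease <= hit:  # full coverage required
                off = len(hit - disease)
                if best is None or off < best[0]:
                    best = (off, combo)
    return best

targets = {  # hypothetical drug -> gene target sets
    "drugA": {"g1", "g2", "g9"},
    "drugB": {"g3", "g4"},
    "drugC": {"g1", "g3", "g4", "g8", "g7"},
}
off, combo = motsc_bruteforce({"g1", "g2", "g3", "g4"}, targets)
```

Here drugA plus drugB covers all four disease genes with a single off-target gene (g9), whereas any combination involving drugC hits more off-target genes.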
NASA Technical Reports Server (NTRS)
Hepner, George F.; Logan, Thomas; Ritter, Niles; Bryant, Nevin
1990-01-01
Recent research has shown artificial neural networks (ANNs) to be capable of pattern recognition and the classification of image data. This paper examines the potential application of neural network computing to satellite image processing. A second objective is to provide a preliminary comparison of conventional and ANN classification. An artificial neural network can be trained to do land-cover classification of satellite imagery using selected sites representative of each class, in a manner similar to conventional supervised classification. One of the major problems associated with the recognition and classification of patterns from remotely sensed data is the time and cost of developing a set of training sites. This research compares the use of an ANN back-propagation classification procedure with a conventional supervised maximum likelihood classification procedure using a minimal training set. When using a minimal training set, the neural network is able to provide a land-cover classification superior to the classification derived from the conventional procedure. This research is the foundation for developing application parameters for further prototyping of software and hardware implementations of artificial neural networks in satellite image and geographic information processing.
Introductory Course Based on a Single Problem: Learning Nucleic Acid Biochemistry from AIDS Research
ERIC Educational Resources Information Center
Grover, Neena
2004-01-01
In departure from the standard approach of using several problems to cover specific topics in a class, I use a single problem to cover the contents of the entire semester-equivalent biochemistry classes. I have developed a problem-based service-learning (PBSL) problem on HIV/AIDS to cover nucleic acid concepts that are typically taught in the…
A distributed-memory approximation algorithm for maximum weight perfect bipartite matching
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azad, Ariful; Buluc, Aydin; Li, Xiaoye S.
We design and implement an efficient parallel approximation algorithm for the problem of maximum weight perfect matching in bipartite graphs, i.e. the problem of finding a set of non-adjacent edges that covers all vertices and has maximum weight. This problem differs from the maximum weight matching problem, for which scalable approximation algorithms are known. It is primarily motivated by finding good pivots in scalable sparse direct solvers before factorization, where sequential implementations of maximum weight perfect matching algorithms, such as those available in MC64, are widely used due to the lack of scalable alternatives. To overcome this limitation, we propose a fully parallel distributed-memory algorithm that first generates a perfect matching and then searches for weight-augmenting cycles of length four in parallel and iteratively augments the matching with a vertex-disjoint set of such cycles. For most practical problems the weights of the perfect matchings generated by our algorithm are very close to the optimum. An efficient implementation of the algorithm scales up to 256 nodes (17,408 cores) on a Cray XC40 supercomputer and can solve instances that are too large to be handled by a single node using the sequential algorithm.
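On a tiny instance, the object the abstract describes, a maximum-weight perfect bipartite matching, can be found exactly by trying every assignment of right vertices to left vertices. This brute force is only for illustration (the weights below are made up); the paper's contribution is precisely a scalable approximation for matrices far too large for anything like this:

```python
# Exact maximum-weight perfect bipartite matching by enumerating all
# permutations (assignments of right vertex perm[i] to left vertex i).
from itertools import permutations

def max_weight_perfect_matching(w):
    """w[i][j]: weight of edge (left i, right j); None if the edge is absent."""
    n = len(w)
    best_weight, best_match = None, None
    for perm in permutations(range(n)):
        if any(w[i][perm[i]] is None for i in range(n)):
            continue  # not a perfect matching: a required edge is missing
        total = sum(w[i][perm[i]] for i in range(n))
        if best_weight is None or total > best_weight:
            best_weight, best_match = total, perm
    return best_weight, best_match

w = [[3, None, 5],
     [1, 4, None],
     [None, 2, 2]]
weight, match = max_weight_perfect_matching(w)
```

The factorial cost of this enumeration is what makes heuristics such as weight-augmenting cycle search attractive at scale.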
Summer Thermal Performance of Ventilated Roofs with Tiled Coverings
NASA Astrophysics Data System (ADS)
Bortoloni, M.; Bottarelli, M.; Piva, S.
2017-01-01
The thermal performance of a ventilated pitched roof with tiled coverings is analysed and compared with unventilated roofs. The analysis is carried out by means of a finite element numerical code, by solving both the fluid and thermal problems in steady-state. A whole one-floor building with a pitched roof is schematized as a 2D computational domain including the air-permeability of tiled covering. Realistic data sets for wind, temperature and solar radiation are used to simulate summer conditions at different times of the day. The results demonstrate that the batten space in pitched roofs is an effective solution for reducing the solar heat gain in summer and thus for achieving better indoor comfort conditions. The efficiency of the ventilation is strictly linked to the external wind conditions and to buoyancy forces occurring due to the heating of the tiles.
NASA Astrophysics Data System (ADS)
Conrad, Jon M.
1999-10-01
Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. Through these examples and additional exercises at the end of each chapter, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation and optimization of resource and environmental systems.
A Spiral-Based Downscaling Method for Generating 30 m Time Series Image Data
NASA Astrophysics Data System (ADS)
Liu, B.; Chen, J.; Xing, H.; Wu, H.; Zhang, J.
2017-09-01
The spatial detail and updating frequency of land cover data are important factors influencing land surface dynamic monitoring applications at high spatial resolution. However, the fragmented patches and seasonal variability of some land cover types (e.g. small crop fields, wetland) make the generation of land cover data labor-intensive and difficult. Utilizing high spatial resolution multi-temporal image data is a possible solution. Unfortunately, the spatial and temporal resolution of available remote sensing data like the Landsat or MODIS datasets can hardly satisfy the minimum mapping unit and the update frequency of current land cover mapping/updating at the same time. The generation of high resolution time series may be a compromise to cover the shortage in the land cover updating process. One popular way is to downscale multi-temporal MODIS data with other high spatial resolution auxiliary data like Landsat. However, the usual manner of downscaling pixels based on a window may lead to an underdetermined problem in heterogeneous areas, resulting in uncertainty for some high spatial resolution pixels. Therefore, the downscaled multi-temporal data can hardly reach the spatial resolution of Landsat data. A spiral-based method was introduced to downscale low spatial, high temporal resolution image data to high spatial, high temporal resolution image data. By searching for similar pixels in the adjacent region along a spiral, the pixel set was built up pixel by pixel. Adopting the constructed pixel set largely prevents the underdetermined problem when solving the linear system. With the help of ordinary least squares, the method inverted the endmember values of the linear system. The high spatial resolution image was reconstructed on the basis of the high spatial resolution class map and the endmember values, band by band.
Then, the high spatial resolution time series was formed from these high spatial resolution images, image by image. A simulated experiment and a remote sensing image downscaling experiment were conducted. In the simulated experiment, the 30 m class map dataset Globeland30 was adopted to investigate the effect of avoiding the underdetermined problem in the downscaling procedure, and a comparison between the spiral and the window was conducted. Further, MODIS NDVI and Landsat image data were adopted to generate a 30 m NDVI time series in the remote sensing image downscaling experiment. The simulated experiment results showed that the proposed method had robust performance in downscaling pixels in heterogeneous regions and indicated that it was superior to the traditional window-based methods. The high resolution time series generated may benefit the mapping and updating of land cover data.
Terrain intelligence Chita Oblast (U.S.S.R.)
,
1943-01-01
The following folio of maps and explanatory tables outlines the principal terrain features of the Chita Oblast. Each map and table is devoted to a specialized set of problems; together they cover such subjects as terrain appreciations, rivers, surface-water and ground-water supplies, construction materials, fuels, suitability for temporary roads and airfields, mineral resources, and geology. These maps and data were compiled by the United States Geological Survey.
ERIC Educational Resources Information Center
Schlenker, Richard M.; Dillon, Timothy
This document contains lab activities, problem sets, and a tape script to be accompanied by a slide show. The minicourse covers the following topics of general chemistry: kinetic-molecular theory, the Bohr atom, acids, bases, and salts, the periodic table, bonding, chemical equations, the metric system, computation of density, mass, and volume,…
Estimation of Subpixel Snow-Covered Area by Nonparametric Regression Splines
NASA Astrophysics Data System (ADS)
Kuter, S.; Akyürek, Z.; Weber, G.-W.
2016-10-01
Measurement of the areal extent of snow cover with high accuracy plays an important role in hydrological and climate modeling. Remotely sensed data acquired by earth-observing satellites offer great advantages for timely monitoring of snow cover. However, the main obstacle is the tradeoff between the temporal and spatial resolution of satellite imagery. Soft, or subpixel, classification of low or moderate resolution satellite images is a preferred technique to overcome this problem. The most frequently employed snow cover fraction methods applied to Moderate Resolution Imaging Spectroradiometer (MODIS) data have evolved from spectral unmixing and empirical Normalized Difference Snow Index (NDSI) methods to the latest machine-learning-based artificial neural networks (ANNs). This study demonstrates the implementation of subpixel snow-covered area estimation based on the state-of-the-art nonparametric spline regression method, namely, Multivariate Adaptive Regression Splines (MARS). MARS models were trained by using MODIS top of atmospheric reflectance values of bands 1-7 as predictor variables. Reference percentage snow cover maps were generated from higher spatial resolution Landsat ETM+ binary snow cover maps. A multilayer feed-forward ANN with one hidden layer trained with backpropagation was also employed to estimate the percentage snow-covered area on the same data set. The results indicated that the developed MARS model performed better than the ANN model.
The Dynamic Multi-objective Multi-vehicle Covering Tour Problem
2013-06-01
This work introduces a new formalization, the Dynamic Multi-Objective Multi-vehicle Covering Tour Problem (DMOMCTP). Related works from routing problems, including the Covering Salesman Problem (CSP), the Covering Tour Problem (CTP), the Clover Leaf Problem (CLP), and the Capacitated Vehicle Routing Problem (CVRP), from Artificial Intelligence (AI), and from multi-objective problems (MOPs) are discussed briefly. The codebase is based on jMetal 4.2 but differs in that it can handle the time- and decision-maker-dependent nature of the DMOMCTP.
The TSP-approach to approximate solving the m-Cycles Cover Problem
NASA Astrophysics Data System (ADS)
Gimadi, Edward Kh.; Rykov, Ivan; Tsidulko, Oxana
2016-10-01
In the m-Cycles Cover problem it is required to find a collection of m vertex-disjoint cycles that covers all vertices of the graph such that the total weight of edges in the cover is minimum (or maximum). The problem is a generalization of the Traveling Salesman Problem and is strongly NP-hard. We discuss a TSP-approach that gives polynomial-time approximate solutions for this problem: it transforms an approximation TSP algorithm into an approximation m-CCP algorithm. In this paper we present a number of successful transformations with proven performance guarantees for the obtained solutions.
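The general shape of such a transformation can be sketched crudely: build an approximate TSP tour and split it into m segments, closing each into a cycle. This is only an illustration of the scheme with an invented distance matrix; the paper's transformations are more careful and come with proven performance guarantees, which this naive split does not:

```python
# Sketch: nearest-neighbour TSP tour, then cut into m vertex-disjoint
# segments; each segment closed into a cycle yields an m-cycles cover.
def nearest_neighbour_tour(dist):
    n = len(dist)
    tour, seen = [0], {0}
    while len(tour) < n:
        last = tour[-1]
        nxt = min((j for j in range(n) if j not in seen),
                  key=lambda j: dist[last][j])
        tour.append(nxt)
        seen.add(nxt)
    return tour

def split_into_cycles(tour, m):
    size = -(-len(tour) // m)  # ceiling division
    return [tour[i:i + size] for i in range(0, len(tour), size)]

dist = [[0, 2, 9, 10, 7],   # hypothetical symmetric distance matrix
        [2, 0, 6, 4, 3],
        [9, 6, 0, 8, 5],
        [10, 4, 8, 0, 6],
        [7, 3, 5, 6, 0]]
cycles = split_into_cycles(nearest_neighbour_tour(dist), 2)
```

The cycles produced are vertex-disjoint and cover every vertex, which is the feasibility condition of the m-Cycles Cover problem.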
Making Debris Avoidance Decisions for ESMO's EOS Mission Set
NASA Technical Reports Server (NTRS)
Mantziaras, Dimitrios
2016-01-01
The presentation will cover the aspects of making debris risk decisions from the NASA Mission Director's perspective, specifically for the NASA Earth Science Mission Operations (ESMO) Earth Observing System (EOS) mission set. ESMO has been involved in analyzing potential debris conjunctions with secondary objects since the inception of this discipline. Through accumulated years of experience and continued exposure to various debris scenarios, ESMO's understanding of the problem and its process for dealing with this issue have evolved. The presentation will describe the evolution of the ESMO process, specifically as it relates to the maneuver execution and spacecraft risk management decision process. It will briefly cover the transition from the original Drag Make-Up Maneuver approach, a several-day, methodical, manually intensive process with ramp-up and waive-off steps, to the present-day, more automated, tools-based approach using pre-canned onboard commands. The presentation will also cover the key information needed to make debris decisions and the challenges in doing so while still trying to meet science goals, respect constellation constraints, and manage resources. A slide or two at the end of the presentation will be devoted to discussing what further improvements could be helpful to decision making, along with future process improvement plans and challenges.
Application of the maximal covering location problem to habitat reserve site selection: a review
Stephanie A. Snyder; Robert G. Haight
2016-01-01
The Maximal Covering Location Problem (MCLP) is a classic model from the location science literature which has found wide application. One important application is to a fundamental problem in conservation biology, the Maximum Covering Species Problem (MCSP), which identifies land parcels to protect to maximize the number of species represented in the selected sites. We...
Measurement standards for interdisciplinary medical rehabilitation.
Johnston, M V; Keith, R A; Hinderer, S R
1992-12-01
Rehabilitation must address problems inherent in the measurement of human function and health-related quality of life, as well as problems in diagnosis and measurement of impairment. This educational document presents an initial set of standards to be used as guidelines for development and use of measurement and evaluation procedures and instruments for interdisciplinary, health-related rehabilitation. Part I covers general measurement principles and technical standards, beginning with validity, the central consideration for use of measures. Subsequent sections focus on reliability and errors of measurement, norms and scaling, development of measures, and technical manuals and guides. Part II covers principles and standards for use of measures. General principles of application of measures in practice are discussed first, followed by standards to protect persons being measured and then by standards for administrative applications. Many explanations, examples, and references are provided to help professionals understand measurement principles. Improved measurement will ensure the basis of rehabilitation as a science and nourish its success as a clinical service.
NASA Technical Reports Server (NTRS)
Hartman, Edwin P; Biermann, David
1938-01-01
Negative thrust and torque data for 2-, 3-, and 4-blade metal propellers having Clark Y and R.A.F. 6 airfoil sections were obtained from tests in the NACA 20-foot tunnel. The propellers were mounted in front of a radial engine nacelle, and the blade-angle settings covered in the tests ranged from 15 degrees to 90 degrees. One propeller was also tested at blade-angle settings of 0, 5, and 10 degrees. A considerable portion of the report deals with the various applications of the negative thrust and torque data to flight problems. A controllable propeller is shown to have a number of interesting, and perhaps valuable, uses within the negative thrust and torque range of operation. A small amount of engine-friction data is included to facilitate the application of the propeller data.
Rational Approximations with Hankel-Norm Criterion
1980-01-01
The approximation problem is proved to be reducible to obtaining a two-variable all-pass rational function interpolating a set of parametric values at specified points.
Optimal shortening of uniform covering arrays
Rangel-Valdez, Nelson; Avila-George, Himer; Carrizalez-Turrubiates, Oscar
2017-01-01
Software test suites based on the concept of interaction testing are very useful for testing software components in an economical way. Test suites of this kind may be created using mathematical objects called covering arrays. A covering array, denoted by CA(N; t, k, v), is an N × k array over Z_v = {0,…,v-1} with the property that every N × t sub-array covers all t-tuples of Z_v^t at least once. Covering arrays can be used to test systems in which failures occur as a result of interactions among components or subsystems. They are often used in areas such as hardware Trojan detection, software testing, and network design. Because system testing is expensive, it is critical to reduce the amount of testing required. This paper addresses the Optimal Shortening of Covering ARrays (OSCAR) problem, an optimization problem whose objective is to construct, from an existing covering array matrix of uniform level, an array with dimensions of (N − δ) × (k − Δ) such that the number of missing t-tuples is minimized. Two applications of the OSCAR problem are (a) to produce smaller covering arrays from larger ones and (b) to obtain quasi-covering arrays (covering arrays in which the number of missing t-tuples is small) to be used as input to a meta-heuristic algorithm that produces covering arrays. In addition, it is proven that the OSCAR problem is NP-complete, and twelve different algorithms are proposed to solve it. An experiment was performed on 62 problem instances, and the results demonstrate the effectiveness of solving the OSCAR problem to facilitate the construction of new covering arrays. PMID:29267343
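The quantity the OSCAR problem minimizes, the number of missing t-tuples after shortening, follows directly from the covering array definition and can be computed by enumeration. A minimal sketch for t = 2 (the small CA(4; 2, 3, 2) below is a standard example, not taken from the paper):

```python
# Count t-tuples of Z_v^t that no row realizes, summed over all column
# t-subsets: zero for a true covering array, positive after shortening.
from itertools import combinations

def missing_t_tuples(array, t, v):
    k = len(array[0])
    missing = 0
    for cols in combinations(range(k), t):
        seen = {tuple(row[c] for c in cols) for row in array}
        missing += v ** t - len(seen)
    return missing

# CA(4; 2, 3, 2): every pair of columns covers all four 2-tuples over {0, 1}.
ca = [(0, 0, 0),
      (0, 1, 1),
      (1, 0, 1),
      (1, 1, 0)]
assert missing_t_tuples(ca, 2, 2) == 0
shortened = ca[:3]  # delete one row (delta = 1): some pairs go missing
```

Deleting a row from this array leaves each of the three column pairs missing one 2-tuple, so the shortened array is a quasi-covering array with three missing tuples in total.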
Development of a Refined Space Vehicle Rollout Forcing Function
NASA Technical Reports Server (NTRS)
James, George; Tucker, Jon-Michael; Valle, Gerard; Grady, Robert; Schliesing, John; Fahling, James; Emory, Benjamin; Armand, Sasan
2016-01-01
For several decades, American manned spaceflight vehicles and the associated launch platforms have been transported from final assembly to the launch pad via a pre-launch phase called rollout. The rollout environment is rich with forced harmonics and higher-order effects, and it can be used for extracting structural dynamics information. To enable this utilization, processing tools are needed to move from measured and analytical data to dynamic metrics such as transfer functions, mode shapes, modal frequencies, and damping. This paper covers the range of systems and tests that are available to estimate rollout forcing functions for the Space Launch System (SLS). The specific information covered in this paper includes: the different definitions of rollout forcing functions; the operational and developmental data sets that are available; the suite of analytical processes that are currently in place or in development; and the plans and future work underway to solve two immediate problems related to rollout forcing functions. Problem 1 involves estimating enforced accelerations to drive finite element models for developing design requirements for the SLS class of launch vehicles. Problem 2 involves processing rollout measured data in near real time to understand the structural dynamics properties of a specific vehicle and the class to which it belongs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.
2013-07-15
Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
Scaling of plane-wave functions in statistically optimized near-field acoustic holography.
Hald, Jørgen
2014-11-01
Statistically Optimized Near-field Acoustic Holography (SONAH) is a Patch Holography method, meaning that it can be applied in cases where the measurement area covers only part of the source surface. The method performs projections directly in the spatial domain, avoiding the use of spatial discrete Fourier transforms and the associated errors. First, an inverse problem is solved using regularization. For each calculation point a multiplication must then be performed with two transfer vectors--one to get the sound pressure and the other to get the particle velocity. Considering SONAH based on sound pressure measurements, existing derivations consider only pressure reconstruction when setting up the inverse problem, so the evanescent wave amplification associated with the calculation of particle velocity is not taken into account in the regularized solution of the inverse problem. The present paper introduces a scaling of the applied plane wave functions that takes the amplification into account, and it is shown that the previously published virtual source-plane retraction has almost the same effect. The effectiveness of the different solutions is verified through a set of simulated measurements.
Comprehensive data set of global land cover change for land surface model applications
NASA Astrophysics Data System (ADS)
Sterling, Shannon; Ducharne, Agnès
2008-09-01
To increase our understanding of how humans have altered the Earth's surface and to facilitate land surface modeling experiments aimed at elucidating the direct impact of land cover change on the Earth system, we create and analyze a database of global land use/cover change (LUCC). From a combination of sources including satellite imagery and other remote sensing, ecological modeling, and country surveys, we adapt and synthesize existing maps of potential land cover and layers of the major anthropogenic land covers, including a layer of wetland loss, that are then tailored for land surface modeling studies. Our map database shows that anthropogenic land cover totals approximately 40% of the Earth's surface, consistent with literature estimates. Almost all (92%) of the natural grassland on the Earth has been converted to human use, mostly grazing land, and the natural temperate savanna with mixed C3/C4 is almost completely lost (~90%), due mostly to conversion to cropland. Yet the resultant change in the functioning of the Earth system, in terms of plant functional types, is dominated by a loss of tree cover. Finally, we identify the need for standardization of percent bare soil across global land covers and for a global map of tree plantations. Estimates of land cover change are inherently uncertain, and these uncertainties propagate into modeling studies of the impact of land cover change on the Earth system; to begin to address this problem, modelers need to document fully the areas of land cover change used in their studies.
A Target Coverage Scheduling Scheme Based on Genetic Algorithms in Directional Sensor Networks
Gil, Joon-Min; Han, Youn-Hee
2011-01-01
As a promising tool for monitoring the physical world, directional sensor networks (DSNs) consisting of a large number of directional sensors are attracting increasing attention. As directional sensors in DSNs have limited battery power and restricted angles of sensing range, maximizing the network lifetime while monitoring all the targets in a given area remains a challenge. A major technique to conserve the energy of directional sensors is to use a node wake-up scheduling protocol by which some sensors remain active to provide sensing services, while the others are inactive to conserve their energy. In this paper, we first address a Maximum Set Covers for DSNs (MSCD) problem, which is known to be NP-complete, and present a greedy algorithm-based target coverage scheduling scheme that can solve this problem by heuristics. This scheme is used as a baseline for comparison. We then propose a target coverage scheduling scheme based on a genetic algorithm that can find the optimal cover sets to extend the network lifetime while monitoring all targets by the evolutionary global search technique. To verify and evaluate these schemes, we conducted simulations and showed that the schemes can contribute to extending the network lifetime. Simulation results indicated that the genetic algorithm-based scheduling scheme had better performance than the greedy algorithm-based scheme in terms of maximizing network lifetime. PMID:22319387
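The MSCD objective can be illustrated with a toy greedy baseline in the spirit of the abstract's comparison scheme: partition the sensors into as many disjoint groups ("covers") as possible, each monitoring every target, so that groups can be activated in turn to stretch network lifetime. The sensor coverage sets below are invented, and this sketch ignores directional sensing angles and energy models:

```python
# Greedy construction of disjoint set covers: build one cover at a time,
# always adding the sensor that covers the most still-unmonitored targets,
# and retire each sensor once it has been assigned to a cover.
def greedy_disjoint_covers(targets, sensors):
    remaining = dict(sensors)  # sensor -> set of targets it can monitor
    covers = []
    while True:
        uncovered, group = set(targets), []
        while uncovered:
            candidates = [s for s in sorted(remaining) if s not in group]
            if not candidates:
                return covers
            best = max(candidates, key=lambda s: len(remaining[s] & uncovered))
            if not remaining[best] & uncovered:
                return covers  # no sensor helps: cannot form another cover
            group.append(best)
            uncovered -= remaining[best]
        covers.append(group)
        for s in group:
            del remaining[s]  # each sensor belongs to exactly one cover

sensors = {"s1": {"t1", "t2"}, "s2": {"t3"},   # hypothetical coverage sets
           "s3": {"t1"}, "s4": {"t2", "t3"}}
covers = greedy_disjoint_covers({"t1", "t2", "t3"}, sensors)
```

Each cover monitors all three targets, so activating the two covers in sequence roughly doubles the lifetime relative to running all sensors at once; the paper's genetic algorithm searches for better cover sets than such greedy choices.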
Koskinen, Sanna; Hokkinen, Eeva-Maija; Wilson, Lindsay; Sarajuuri, Jaana; Von Steinbüchel, Nicole; Truelle, Jean-Luc
2011-01-01
The aim is to examine two aspects of outcome after traumatic brain injury (TBI). Functional outcome was assessed by the Glasgow Outcome Scale - Extended (GOSE) and by clinician ratings, while health-related quality of life (HRQoL) was assessed by the Quality of Life after Brain Injury (QOLIBRI). The GOSE and the QOLIBRI were linked to the International Classification of Functioning, Disability and Health (ICF) to analyse their content. Functional outcome on ICF categories was assessed by rehabilitation clinicians in 55 participants with TBI and was compared to the participants' own judgements of their HRQoL. The QOLIBRI was linked to 42 and the GOSE to 57 two-level ICF categories covering 78% of the categories on the ICF brief core set for TBI. The closest agreement in the views of the professionals and the participants was found on the Physical Problems and Cognition scales of the QOLIBRI. The problems encountered after TBI are well covered by the QOLIBRI and the GOSE. They capture important domains that are not traditionally sufficiently documented, especially in the domains of interpersonal relationships, social and leisure activities, self and the environment. The findings indicate that they are useful and complementary outcome measures for TBI. In rehabilitation, they can serve as tools in assessment, setting meaningful goals and creating therapeutic alliance.
Reformulation of the covering and quantizer problems as ground states of interacting particles.
Torquato, S
2010-11-01
It is known that the sphere-packing problem and the number-variance problem (closely related to an optimization problem in number theory) can be posed as energy minimizations associated with an infinite number of point particles in d-dimensional Euclidean space R(d) interacting via certain repulsive pair potentials. We reformulate the covering and quantizer problems as the determination of the ground states of interacting particles in R(d) that generally involve single-body, two-body, three-body, and higher-body interactions. This is done by linking the covering and quantizer problems to certain optimization problems involving the "void" nearest-neighbor functions that arise in the theory of random media and statistical mechanics. These reformulations, which again exemplify the deep interplay between geometry and physics, allow one now to employ theoretical and numerical optimization techniques to analyze and solve these energy minimization problems. The covering and quantizer problems have relevance in numerous applications, including wireless communication network layouts, the search of high-dimensional data parameter spaces, stereotactic radiation therapy, data compression, digital communications, meshing of space for numerical analysis, and coding and cryptography, among other examples. In the first three space dimensions, the best known solutions of the sphere-packing and number-variance problems (or their "dual" solutions) are directly related to those of the covering and quantizer problems, but such relationships may or may not exist for d≥4 , depending on the peculiarities of the dimensions involved. Our reformulation sheds light on the reasons for these similarities and differences. We also show that disordered saturated sphere packings provide relatively thin (economical) coverings and may yield thinner coverings than the best known lattice coverings in sufficiently large dimensions. 
In the case of the quantizer problem, we derive improved upper bounds on the quantizer error using sphere-packing solutions, which are generally substantially sharper than an existing upper bound in low to moderately large dimensions. We also demonstrate that disordered saturated sphere packings yield relatively good quantizers. Finally, we remark on possible applications of our results for the detection of gravitational waves.
Reformulation of the covering and quantizer problems as ground states of interacting particles
NASA Astrophysics Data System (ADS)
Torquato, S.
2010-11-01
It is known that the sphere-packing problem and the number-variance problem (closely related to an optimization problem in number theory) can be posed as energy minimizations associated with an infinite number of point particles in d -dimensional Euclidean space Rd interacting via certain repulsive pair potentials. We reformulate the covering and quantizer problems as the determination of the ground states of interacting particles in Rd that generally involve single-body, two-body, three-body, and higher-body interactions. This is done by linking the covering and quantizer problems to certain optimization problems involving the “void” nearest-neighbor functions that arise in the theory of random media and statistical mechanics. These reformulations, which again exemplify the deep interplay between geometry and physics, allow one now to employ theoretical and numerical optimization techniques to analyze and solve these energy minimization problems. The covering and quantizer problems have relevance in numerous applications, including wireless communication network layouts, the search of high-dimensional data parameter spaces, stereotactic radiation therapy, data compression, digital communications, meshing of space for numerical analysis, and coding and cryptography, among other examples. In the first three space dimensions, the best known solutions of the sphere-packing and number-variance problems (or their “dual” solutions) are directly related to those of the covering and quantizer problems, but such relationships may or may not exist for d≥4 , depending on the peculiarities of the dimensions involved. Our reformulation sheds light on the reasons for these similarities and differences. We also show that disordered saturated sphere packings provide relatively thin (economical) coverings and may yield thinner coverings than the best known lattice coverings in sufficiently large dimensions. 
In the case of the quantizer problem, we derive improved upper bounds on the quantizer error using sphere-packing solutions, which are generally substantially sharper than an existing upper bound in low to moderately large dimensions. We also demonstrate that disordered saturated sphere packings yield relatively good quantizers. Finally, we remark on possible applications of our results for the detection of gravitational waves.
Classification of simple vegetation types using POLSAR image data
NASA Technical Reports Server (NTRS)
Freeman, A.
1993-01-01
Mapping basic vegetation or land cover types is a fairly common problem in remote sensing. Knowledge of the land cover type is a key input to algorithms which estimate geophysical parameters, such as soil moisture, surface roughness, leaf area index or biomass from remotely sensed data. In an earlier paper, an algorithm for fitting a simple three-component scattering model to POLSAR data was presented. The algorithm yielded estimates for surface scatter, double-bounce scatter and volume scatter for each pixel in a POLSAR image data set. In this paper, we show how the relative levels of each of the three components can be used as inputs to a simple classifier for vegetation type. Vegetation classes include no vegetation cover (e.g. bare soil or desert), low vegetation cover (e.g. grassland), moderate vegetation cover (e.g. fully developed crops), forest and urban areas. Implementation of the approach requires estimates for the three components from all three frequencies available using the NASA/JPL AIRSAR, i.e. C-, L- and P-bands. The research described in this paper was carried out by the Jet Propulsion Laboratory, California Institute of Technology under a contract with the National Aeronautics and Space Administration.
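A classifier of the kind described, thresholds on the relative levels of the three scattering components, can be sketched as follows. The thresholds, decision order, and class labels are invented for illustration and are not the paper's actual decision rules:

```python
def classify_cover(surface, double_bounce, volume):
    """Classify land cover from relative scattering powers.

    Hypothetical thresholds for illustration only; the paper's real
    decision rules and class boundaries are not reproduced here.
    """
    total = surface + double_bounce + volume
    s, d, v = surface / total, double_bounce / total, volume / total
    if d > 0.4:
        return "urban"                 # wall-ground double-bounce dominates
    if v > 0.5:
        return "forest"                # canopy volume scattering dominates
    if v > 0.25:
        return "moderate vegetation"   # e.g. fully developed crops
    if v > 0.1:
        return "low vegetation"        # e.g. grassland
    return "no vegetation"             # e.g. bare soil or desert
```

Under these toy thresholds, a pixel dominated by volume scatter (e.g. powers 0.2, 0.1, 0.7) would be labeled "forest".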
Bridging education and training in ageing and disability: the European Care Certificate (ECC)
Churchill, James; Gyorki, Eva
2009-01-01
Introduction There has been significant movement of workers between EU countries seeking work in the social care sector, causing problems for workers and employers who cannot easily evaluate the worth of qualifications gained abroad. The European Care Certificate (ECC) helps workers start work in the social care sector by defining basic knowledge and offering recognition for their learning. Development of product A LEONARDO project involving six countries (BE, UK, AT, DE, RO, PO) established a set of learning outcomes—the BESCLO (Basic European Social Care Learning Outcomes) covering eight key areas of knowledge (not competence). Existing awards and courses become ‘ECC compliant’ by demonstrating coverage of all the BESCLO. Students pass a multi-choice exam to gain the Certificate. There is a developing system of Lead and Delivery Partners spreading the ECC across Europe. Conclusion The BESCLO covers essential knowledge with a common set of values in social care. The ECC fits within existing training courses, is cheap and easy to operate, is at entry level, covers all client groups, can be made available in any language and is equally useful in recruitment, workplace induction training, or more formal college/university courses as an early achievement marker. Website: http://www.eclicence.eu
Human Performance on Hard Non-Euclidean Graph Problems: Vertex Cover
ERIC Educational Resources Information Center
Carruthers, Sarah; Masson, Michael E. J.; Stege, Ulrike
2012-01-01
Recent studies on a computationally hard visual optimization problem, the Traveling Salesperson Problem (TSP), indicate that humans are capable of finding close to optimal solutions in near-linear time. The current study is a preliminary step in investigating human performance on another hard problem, the Minimum Vertex Cover Problem, in which…
Williams, Perry J.; Kendall, William L.
2017-01-01
Choices in ecological research and management are the result of balancing multiple, often competing, objectives. Multi-objective optimization (MOO) is a formal decision-theoretic framework for solving multiple objective problems. MOO is used extensively in other fields including engineering, economics, and operations research. However, its application for solving ecological problems has been sparse, perhaps due to a lack of widespread understanding. Thus, our objective was to provide an accessible primer on MOO, including a review of methods common in other fields, a review of their application in ecology, and a demonstration on an applied resource management problem. A large class of methods for solving MOO problems can be separated into two strategies: modelling preferences pre-optimization (the a priori strategy), or modelling preferences post-optimization (the a posteriori strategy). The a priori strategy requires describing preferences among objectives without knowledge of how preferences affect the resulting decision. In the a posteriori strategy, the decision maker simultaneously considers a set of solutions (the Pareto optimal set) and makes a choice based on the trade-offs observed in the set. We describe several methods for modelling preferences pre-optimization, including: the bounded objective function method, the lexicographic method, and the weighted-sum method. We discuss modelling preferences post-optimization through examination of the Pareto optimal set. We applied each MOO strategy to the natural resource management problem of selecting a population target for cackling goose (Branta hutchinsii minima) abundance. Cackling geese provide food security to Native Alaskan subsistence hunters in the goose's nesting area, but depredate crops on private agricultural fields in wintering areas.
We developed objective functions to represent the competing objectives related to the cackling goose population target and identified an optimal solution first using the a priori strategy, and then by examining trade-offs in the Pareto set using the a posteriori strategy. We used four approaches for selecting a final solution within the a posteriori strategy; the most common optimal solution, the most robust optimal solution, and two solutions based on maximizing a restricted portion of the Pareto set. We discuss MOO with respect to natural resource management, but MOO is sufficiently general to cover any ecological problem that contains multiple competing objectives that can be quantified using objective functions.
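The a priori weighted-sum strategy described above can be sketched in a few lines. The two objective functions and the candidate targets below are invented for illustration; they are not the actual cackling goose model:

```python
def f_subsistence(x):
    """Toy benefit to subsistence hunters: grows with abundance x."""
    return x / 100.0

def f_crop_damage(x):
    """Toy crop-depredation cost: grows faster with abundance x."""
    return (x / 100.0) ** 2

def weighted_sum_optimum(targets, w_benefit, w_cost):
    """A priori strategy: scalarize the competing objectives with fixed
    weights, then optimize the single resulting function."""
    return max(targets,
               key=lambda x: w_benefit * f_subsistence(x)
                             - w_cost * f_crop_damage(x))

# With equal weights, the toy optimum over targets 0, 10, ..., 200 is x = 50.
best = weighted_sum_optimum(range(0, 201, 10), 1.0, 1.0)
```

Changing the weights traces out different points of the Pareto set, which is exactly why the a posteriori strategy instead presents the whole set of trade-offs to the decision maker.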
NASA Astrophysics Data System (ADS)
Armstrong, R. L.; Brodzik, M.; Savoie, M. H.
2007-12-01
Over the past several decades both visible and passive microwave satellite data have been utilized for snow mapping at the continental to global scale. Snow mapping using visible data has been based primarily on the magnitude of the surface reflectance, and in more recent cases on specific spectral signatures, while microwave data can be used to identify snow cover because the microwave energy emitted by the underlying soil is scattered by the snow grains resulting in a sharp decrease in brightness temperature and a characteristic negative spectral gradient. Both passive microwave and visible data sets indicate a similar pattern of inter-annual variability, although the maximum snow extents derived from the microwave data are consistently less than those provided by the visible satellite data and the visible data typically show higher monthly variability. We describe the respective problems as well as the advantages and disadvantages of these two types of satellite data for snow cover mapping and demonstrate how a multi-sensor approach is optimal. For the period 1978 to present we combine data from the NOAA weekly snow charts with snow cover derived from the SMMR and SSM/I brightness temperature data. For the period since 2002 we blend NASA EOS MODIS and AMSR-E data sets. Our current product incorporates MODIS data from the Climate Modelers Grid (CMG) at approximately 5 km (0.05 deg.) with microwave-derived snow water equivalent (SWE) at 25 km, resulting in a blended product that includes percent snow cover in the larger grid cell whenever the microwave SWE signal is absent. Validation of AMSR-E at the brightness temperature level is provided through the comparison with data from the well-calibrated heritage SSM/I sensor over large homogeneous snow-covered surfaces (e.g. Dome C region, Antarctica). We also describe how the application of the higher frequency microwave channels (85 and 89 GHz)enhances accurate mapping of shallow and intermittent snow cover.
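The microwave signature described here, snow grains scattering the higher-frequency emission and producing a negative spectral gradient, underlies simple detection tests. A minimal sketch with an invented threshold and none of the operational screens (wet snow, precipitation, frozen ground):

```python
def snow_flag(tb19h, tb37h, threshold=0.0):
    """Flag snow where the 19 GHz horizontal-pol brightness temperature
    exceeds the 37 GHz one: snow grains scatter the shorter-wavelength
    37 GHz emission more strongly, so the difference turns positive.
    The threshold is illustrative; real retrievals apply many more screens."""
    return (tb19h - tb37h) > threshold

snow_flag(245.0, 215.0)   # deep dry snowpack: strong scattering signal
snow_flag(260.0, 262.0)   # snow-free ground: no scattering signal
```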
A Strategic Approach to Optimizing the U.S. Army’s Aeromedical Evacuation System in Afghanistan
2009-07-10
arise on distinct nodes and the facilities are restricted to a finite set of candidate locations (Daskin 2008). Here, this problem classifies as a ...Research Logistics, 55(4), 283-294. Daskin, M. (1983) A maximum expected covering location model: formulation, properties and heuristic solution...
Sales, Célia Md; Neves, Inês Td; Alves, Paula G; Ashworth, Mark
2017-11-22
There is increasing interest in individualized patient-reported outcome measures (I-PROMS), where patients themselves indicate the specific problems they want to address in therapy and these problems are used as items within the outcome measurement tool. This paper examined the extent to which 279 items reported in an I-PROM (PSYCHLOPS) added qualitative information which was not captured by two well-established outcome measures (CORE-OM and PHQ-9). Comparison of items was only conducted for patients scoring above the "caseness" threshold on the standardized measures. 107 patients were participating in therapy within addiction and general psychiatric clinical settings. Almost every patient (95%) reported at least one item whose content was not covered by PHQ-9, and 71% reported at least one item not covered by CORE-OM. Results demonstrate the relevance of individualized outcome assessment for capturing data describing the issues of greatest concern to patients, as nomothetic measures do not always seem to capture the whole story. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.
Aerosol climatology over Mexico City basin: Characterization of their optical properties
NASA Astrophysics Data System (ADS)
Carabali-Sandoval, Giovanni; Valdéz-Barrón, Mauro; Bonifaz-Alfonso, Roberto; Riveros-Rosas, David; Estévez, Héctor
2015-04-01
Climatology of aerosol optical depth (AOD), single scattering albedo (SSA) and size parameters was analyzed using a 15-year (1999-2014) data set from AErosol RObotic NETwork (AERONET) observations over the Mexico City basin. Since urban air pollution is one of the biggest problems facing this megacity, many studies addressing these issues have been published. However, few studies have examined the climatology of aerosols taking into account their optical properties over a long time period. Pollution problems in Mexico City have been generated by the daily activities of some 21 million people coupled with the vast amount of industry located within the city's metropolitan area. Another contributing factor is the unique geographical setting of the basin encompassing Mexico City. The basin covers approximately 5000 km2 of the Mexican Plateau at an average elevation of 2250 m above sea level (ASL) and is surrounded on three sides by mountains averaging over 3000 m ASL. In this work we present preliminary results of the aerosol climatology in Mexico City.
NASA Technical Reports Server (NTRS)
Heric, Matthew; Cox, William; Gordon, Daniel K.
1987-01-01
In an attempt to improve the land cover/use classification accuracy obtainable from remotely sensed multispectral imagery, Airborne Imaging Spectrometer-1 (AIS-1) images were analyzed in conjunction with Thematic Mapper Simulator (NS001) Large Format Camera color infrared photography and black and white aerial photography. Specific portions of the combined data set were registered and used for classification. Following this procedure, the resulting derived data were tested using an overall accuracy assessment method. Precise photogrammetric 2D-3D-2D geometric modeling techniques are not the basis for this study. Instead, the discussion presents the spectral findings resulting from the image-to-image registrations. Problems associated with the AIS-1 TMS integration are considered, and useful applications of the imagery combination are presented. More advanced methodologies for imagery integration are needed if multisystem data sets are to be utilized fully. Nevertheless, the research described herein provides a formulation for future Earth Observation Station related multisensor studies.
Engineering design skills coverage in K-12 engineering program curriculum materials in the USA
NASA Astrophysics Data System (ADS)
Chabalengula, Vivien M.; Mumba, Frackson
2017-11-01
The current K-12 Science Education framework and Next Generation Science Standards (NGSS) in the United States emphasise the integration of engineering design in science instruction to promote scientific literacy and engineering design skills among students. As such, many engineering education programmes have developed curriculum materials that are being used in K-12 settings. However, little is known about the nature and extent to which engineering design skills outlined in NGSS are addressed in these K-12 engineering education programme curriculum materials. We analysed nine K-12 engineering education programmes for the nature and extent of their engineering design skills coverage. Results show that developing possible solutions and actual designing of prototypes were the most highly covered engineering design skills; specification of clear goals, criteria, and constraints received medium coverage; and defining and identifying an engineering problem, optimising the design solution, demonstrating how a prototype works, and making iterations to improve designs received low coverage. These trends were similar across grade levels and across discipline-specific curriculum materials. These results have implications for engineering design-integrated science teaching and learning in K-12 settings.
NASA Astrophysics Data System (ADS)
Chaerani, D.; Lesmana, E.; Tressiana, N.
2018-03-01
In this paper, an application of Robust Optimization to an agricultural water resource management problem under gross margin and water demand uncertainty is presented. Water resource management is a series of activities that includes planning, developing, distributing and managing the use of water resources optimally. Water resource management for agriculture can be one of the efforts to optimize the benefits of agricultural output. The objective of the agricultural water resource management problem is to maximize total benefits from water allocation to the agricultural areas covered by the irrigation network over the planning horizon. Due to gross margin and water demand uncertainty, we assume that the uncertain data lie within an ellipsoidal uncertainty set. We employ the robust counterpart methodology to obtain the robust optimal solution.
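For a single uncertain linear constraint, the ellipsoidal-uncertainty robust counterpart has a well-known closed form: a^T x <= b for all a in {a_bar + P u : ||u||_2 <= 1} holds iff a_bar^T x + ||P^T x||_2 <= b. The sketch below checks this textbook reformulation on invented numbers; it is not the paper's irrigation model:

```python
import math

def robust_feasible(a_bar, P, x, b):
    """Robust counterpart of a^T x <= b over the ellipsoid
    {a_bar + P u : ||u||_2 <= 1}: feasible iff
    a_bar^T x + ||P^T x||_2 <= b."""
    nominal = sum(ai * xi for ai, xi in zip(a_bar, x))
    # (P^T x)_j = sum_i P[i][j] * x[i]; P is given as a list of rows
    ptx = [sum(P[i][j] * x[i] for i in range(len(x)))
           for j in range(len(P[0]))]
    return nominal + math.sqrt(sum(v * v for v in ptx)) <= b

a_bar = [1.0, 2.0]               # nominal constraint coefficients (toy)
P = [[0.5, 0.0], [0.0, 0.5]]     # shape of the ellipsoidal uncertainty (toy)
x = [1.0, 1.0]
# nominal value 3.0, protection term ||P^T x||_2 = sqrt(0.5) ~ 0.707
```

The extra norm term is the "price of robustness": the constraint must hold with slack proportional to how strongly x is exposed to the uncertainty directions in P.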
On twelve types of covering-based rough sets.
Safari, Samira; Hooshmandasl, Mohammad Reza
2016-01-01
Covering approximation spaces are a generalization of equivalence-based rough set theories. In this paper, we will consider twelve types of covering based approximation operators by combining four types of covering lower approximation operators and three types of covering upper approximation operators. Then, we will study the properties of these new pairs and show they have most of the common properties among existing covering approximation pairs. Finally, the relation between these new pairs is studied.
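One representative approximation pair of the kind combined in the paper can be illustrated directly: a lower approximation as the union of covering blocks contained in X, and an upper approximation as the union of blocks meeting X. This is a minimal sketch of that single pair (the paper studies twelve such combinations):

```python
def cover_lower(cover, X):
    """Lower approximation: union of blocks K in the cover with K ⊆ X."""
    return set().union(*(K for K in cover if K <= X))

def cover_upper(cover, X):
    """Upper approximation: union of blocks K in the cover with K ∩ X ≠ ∅."""
    return set().union(*(K for K in cover if K & X))

# Toy covering of the universe {1, 2, 3, 4} (blocks may overlap).
C = [frozenset({1, 2}), frozenset({2, 3}), frozenset({3, 4})]
X = {1, 2, 3}
# cover_lower(C, X) == {1, 2, 3} and cover_upper(C, X) == {1, 2, 3, 4},
# so cover_lower(C, X) <= X <= cover_upper(C, X), as expected of such pairs.
```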
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator)
1975-01-01
The author has identified the following significant results. One of the most significant results of this Skylab research involved the geometric correction and overlay of the Skylab multispectral scanner data with the LANDSAT multispectral scanner data, and also with a set of topographic data, including elevation, slope, and aspect. The Skylab S192 multispectral scanner data had distinct differences in noise level of the data in the various wavelength bands. Results of the temporal evaluation of the SL-2 and SL-3 photography were found to be particularly important for proper interpretation of the computer-aided analysis of the SL-2 and SL-3 multispectral scanner data. There was a quality problem involving the ringing effect introduced by digital filtering. The modified clustering technique was found valuable when working with multispectral scanner data involving many wavelength bands and covering large geographic areas. Analysis of the SL-2 scanner data involved classification of major cover types and also forest cover types. Comparison of the results obtained with Skylab MSS data and LANDSAT MSS data indicated that the improved spectral resolution of the Skylab scanner system enabled a higher classification accuracy to be obtained for forest cover types, although the classification performance for major cover types was not significantly different.
A population-based model for priority setting across the care continuum and across modalities
Segal, Leonie; Mortimer, Duncan
2006-01-01
Background The Health-sector Wide (HsW) priority setting model is designed to shift the focus of priority setting away from 'program budgets' – that are typically defined by modality or disease-stage – and towards well-defined target populations with a particular disease/health problem. Methods The key features of the HsW model are i) a disease/health problem framework, ii) a sequential approach to covering the entire health sector, iii) comprehensiveness of scope in identifying intervention options and iv) the use of objective evidence. The HsW model redefines the unit of analysis over which priorities are set to include all mutually exclusive and complementary interventions for the prevention and treatment of each disease/health problem under consideration. The HsW model is therefore incompatible with the fragmented approach to priority setting across multiple program budgets that currently characterises allocation in many health systems. The HsW model employs standard cost-utility analyses and decision-rules with the aim of maximising QALYs contingent upon the global budget constraint for the set of diseases/health problems under consideration. It is recognised that the objective function may include non-health arguments that would imply a departure from simple QALY maximisation and that political constraints frequently limit degrees of freedom. In addressing these broader considerations, the HsW model can be modified to maximise value-weighted QALYs contingent upon the global budget constraint and any political constraints bearing upon allocation decisions. Results The HsW model has been applied in several contexts, recently to osteoarthritis, that has demonstrated both its practical application and its capacity to derive clear evidenced-based policy recommendations. 
Conclusion Comparisons with other approaches to priority setting, such as Programme Budgeting and Marginal Analysis (PBMA) and modality-based cost-effectiveness comparisons, as typified by Australia's Pharmaceutical Benefits Advisory Committee process for the listing of pharmaceuticals for government funding, demonstrate the value added by the HsW model notably in its greater likelihood of contributing to allocative efficiency. PMID:16566841
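In its simplest 0/1 form, the HsW decision rule of maximizing QALYs subject to a global budget constraint is a knapsack problem. The sketch below is a generic illustration with invented costs and QALY gains, not the HsW model's actual evidence base (value-weighted QALYs would simply reweight the gains):

```python
def max_qalys(options, budget):
    """Pick a subset of intervention options maximizing total QALYs
    gained subject to a global budget (0/1 knapsack, integer costs)."""
    dp = [0.0] * (budget + 1)          # dp[b] = best QALYs with budget b
    for cost, qalys in options:
        for b in range(budget, cost - 1, -1):
            dp[b] = max(dp[b], dp[b - cost] + qalys)
    return dp[budget]

# (cost, QALYs gained) for four hypothetical interventions
options = [(3, 2.0), (2, 1.5), (2, 1.4), (4, 2.6)]
best = max_qalys(options, 5)   # funds the (3, 2.0) and (2, 1.5) options
```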
Uncertainty management by relaxation of conflicting constraints in production process scheduling
NASA Technical Reports Server (NTRS)
Dorn, Juergen; Slany, Wolfgang; Stary, Christian
1992-01-01
Mathematical-analytical methods as used in Operations Research approaches are often insufficient for scheduling problems. This is due to three reasons: the combinatorial complexity of the search space, conflicting objectives for production optimization, and the uncertainty in the production process. Knowledge-based techniques, especially approximate reasoning and constraint relaxation, are promising ways to overcome these problems. A case study from an industrial CIM environment, namely high-grade steel production, is presented to demonstrate how knowledge-based scheduling with the desired capabilities could work. By using fuzzy set theory, the applied knowledge representation technique covers the uncertainty inherent in the problem domain. Based on this knowledge representation, a classification of jobs according to their importance is defined which is then used for the straightforward generation of a schedule. A control strategy which comprises organizational, spatial, temporal, and chemical constraints is introduced. The strategy supports the dynamic relaxation of conflicting constraints in order to improve tentative schedules.
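The fuzzy-set classification of jobs by importance can be suggested with a generic sketch: grade each job's membership in importance classes and rank jobs by the strongest membership. The membership functions and class names below are invented placeholders, not the steel-plant knowledge base described in the paper:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function: support [a, c], peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def job_importance(hours_to_due_date):
    """Fuzzy degrees of membership in three importance classes
    (illustrative shapes and breakpoints)."""
    return {
        "urgent":  triangular(hours_to_due_date, -1, 0, 12),
        "normal":  triangular(hours_to_due_date, 6, 24, 48),
        "relaxed": triangular(hours_to_due_date, 36, 72, 10**6),
    }

grades = job_importance(8)
label = max(grades, key=grades.get)   # a job due in 8 hours grades as urgent
```

A scheduler can then generate a tentative schedule straightforwardly by placing jobs in decreasing order of their strongest membership grade.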
Normalization in Lie algebras via mould calculus and applications
NASA Astrophysics Data System (ADS)
Paul, Thierry; Sauzin, David
2017-11-01
We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.
Genetics, biometrics and the informatization of the body.
van der Ploeg, Irma
2007-01-01
"Genetics" is a term covering a wide set of theories, practices, and technologies, only some of which overlap with the practices and technologies of biometrics. In this paper some current technological developments relating to biometric applications of genetics will be highlighted. Next, the author will elaborate the notion of the informatization of the body, by means of a brief philosophical detour on the dualisms of language and reality, words and things. In the subsequent sections she will then draw out some of the questions relevant to the purposes of Biometrics Identification Technology Ethics (BITE), and discuss the ethical problems associated with the informatization of the body. There are, however, some problems and limitations in the capacity of the currently dominant ethical discourse to deal with all things ethical in relation to information technology in general, and biometrics or genetics in particular. The final section will discuss some of these meta-problems.
On Bipartite Graphs Trees and Their Partial Vertex Covers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caskurlu, Bugra; Mkrtchyan, Vahan; Parekh, Ojas D.
2015-03-01
Graphs can be used to model risk management in various systems. Particularly, Caskurlu et al. in [7] have considered a system, which has threats, vulnerabilities and assets, and which essentially represents a tripartite graph. The goal in this model is to reduce the risk in the system below a predefined risk threshold level. One can either restrict the permissions of the users or encapsulate the system assets. These two strategies correspond to deleting a minimum number of elements corresponding to vulnerabilities and assets, such that the flow between threats and assets is reduced below the predefined threshold level. It can be shown that the main goal in this risk management system can be formulated as a Partial Vertex Cover problem on bipartite graphs. It is well known that the Vertex Cover problem is in P on bipartite graphs; however, the computational complexity of the Partial Vertex Cover problem on bipartite graphs has remained open. In this paper, we establish that the Partial Vertex Cover problem is NP-hard on bipartite graphs, which was also recently independently demonstrated [N. Apollonio and B. Simeone, Discrete Appl. Math., 165 (2014), pp. 37–48; G. Joret and A. Vetta, preprint, arXiv:1211.4853v1 [cs.DS], 2012]. We then identify interesting special cases of bipartite graphs, for which the Partial Vertex Cover problem, the closely related Budgeted Maximum Coverage problem, and their weighted extensions can be solved in polynomial time. We also present an 8/9-approximation algorithm for the Budgeted Maximum Coverage problem in the class of bipartite graphs. We show that this matches and resolves the integrality gap of the natural LP relaxation of the problem and improves upon a recent 4/5-approximation.
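For context, the Budgeted Maximum Coverage problem mentioned above admits a classical cost-benefit greedy heuristic. The sketch below is that generic baseline on an invented instance, not the paper's 8/9-approximation for bipartite graphs:

```python
def greedy_budgeted_max_coverage(sets, costs, budget):
    """Cost-benefit greedy for Budgeted Maximum Coverage: repeatedly add
    the affordable set with the best new-elements-per-cost ratio."""
    covered, chosen, spent = set(), [], 0
    while True:
        best, best_ratio = None, 0.0
        for i, S in enumerate(sets):
            if i in chosen or spent + costs[i] > budget:
                continue
            gain = len(S - covered)
            if gain and gain / costs[i] > best_ratio:
                best, best_ratio = i, gain / costs[i]
        if best is None:
            break
        chosen.append(best)
        covered |= sets[best]
        spent += costs[best]
    return chosen, covered

# Toy instance: four weighted subsets of {1, ..., 5}, budget 3.
sets = [{1, 2, 3}, {3, 4}, {5}, {1, 4, 5}]
costs = [2, 1, 1, 3]
chosen, covered = greedy_budgeted_max_coverage(sets, costs, 3)
```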
NASA Technical Reports Server (NTRS)
1943-01-01
This is the third of a series of reports covering an investigation of the general instability problem by the California Institute of Technology. The first five reports of this series cover investigations of the general instability problem under the loading conditions of pure bending and were prepared under the sponsorship of the Civil Aeronautics Administration. The succeeding reports of this series cover the work done on other loading conditions under the sponsorship of the National Advisory Committee for Aeronautics. This report is concerned primarily with the continuation of the tests of wire-braced specimens, and preliminary tests of sheet-covered specimens that had been made in the experimental investigation on the problem of the general instability of stiffened metal cylinders at the C.I.T.
Minimizing embedding impact in steganography using trellis-coded quantization
NASA Astrophysics Data System (ADS)
Filler, Tomáš; Judas, Jan; Fridrich, Jessica
2010-01-01
In this paper, we propose a practical approach to minimizing embedding impact in steganography based on syndrome coding and trellis-coded quantization and contrast its performance with bounds derived from appropriate rate-distortion bounds. We assume that each cover element can be assigned a positive scalar expressing the impact of making an embedding change at that element (single-letter distortion). The problem is to embed a given payload with minimal possible average embedding impact. This task, which can be viewed as a generalization of matrix embedding or writing on wet paper, has been approached using heuristic and suboptimal tools in the past. Here, we propose a fast and very versatile solution to this problem that can theoretically achieve performance arbitrarily close to the bound. It is based on syndrome coding using linear convolutional codes with the optimal binary quantizer implemented using the Viterbi algorithm run in the dual domain. The complexity and memory requirements of the embedding algorithm are linear w.r.t. the number of cover elements. For practitioners, we include detailed algorithms for finding good codes and their implementation. Finally, we report extensive experimental results for a large set of relative payloads and for different distortion profiles, including the wet paper channel.
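The core optimization, embedding a payload as a syndrome while minimizing total single-letter distortion, can be shown by brute force on a toy parity-check matrix. This sketch searches all stego vectors, whereas the paper does this efficiently with convolutional codes and the Viterbi algorithm; the matrix, cover, and distortion profile below are invented:

```python
from itertools import product

def embed_min_impact(x, H, m, rho):
    """Brute-force syndrome coding: among all y with H·y = m (mod 2),
    return one minimizing the summed distortion of changed cover bits."""
    n = len(x)
    best, best_cost = None, float("inf")
    for y in product([0, 1], repeat=n):
        syndrome = tuple(sum(h * b for h, b in zip(row, y)) % 2 for row in H)
        if syndrome != tuple(m):
            continue
        cost = sum(r for xi, yi, r in zip(x, y, rho) if xi != yi)
        if cost < best_cost:
            best, best_cost = y, cost
    return best, best_cost

x = (1, 0, 1, 1)                   # cover bits (toy)
H = [(1, 0, 1, 0), (0, 1, 1, 1)]   # toy parity-check matrix
m = (0, 1)                         # payload: desired syndrome
rho = (1.0, 1.0, 5.0, 1.0)         # per-element impact; "wet"-ish 3rd bit
y, cost = embed_min_impact(x, H, m, rho)
```

The recipient recovers the payload simply by recomputing H·y, with no shared side information about the distortion profile.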
[Economics of health system transformation].
González Pier, Eduardo
2012-01-01
Health conditions in Mexico have evolved along with socioeconomic conditions. As a result, today's health system faces several problems characterized by four overlapping transitions: demand, expectations, funding and health resources. These transitions engender significant pressures on the system itself. Additionally, fragmentation of the health system creates disparities in access to services and generates problems in terms of efficiency and use of available resources. To address these complications and to improve equity in access and efficiency, thorough analysis is required in how the right to access health care should be established at a constitutional level without differentiating across population groups. This should be followed by careful discussion about what rules of health care financing should exist, which set of interventions ought to be covered and how services must be organized to meet the health needs of the population.
Reveal, A General Reverse Engineering Algorithm for Inference of Genetic Network Architectures
NASA Technical Reports Server (NTRS)
Liang, Shoudan; Fuhrman, Stefanie; Somogyi, Roland
1998-01-01
Given the imminent gene expression mapping covering whole genomes during development, health and disease, we seek computational methods to maximize functional inference from such large data sets. Is it possible, in principle, to completely infer a complex regulatory network architecture from input/output patterns of its variables? We investigated this possibility using binary models of genetic networks. Trajectories, or state transition tables of Boolean nets, resemble time series of gene expression. By systematically analyzing the mutual information between input states and output states, one is able to infer the sets of input elements controlling each element or gene in the network. This process is unequivocal and exact for complete state transition tables. We implemented this REVerse Engineering ALgorithm (REVEAL) in a C program, and found the problem to be tractable within the conditions tested so far. For n = 50 (elements) and k = 3 (inputs per element), the analysis of incomplete state transition tables (100 state transition pairs out of a possible 10^15) reliably produced the original rule and wiring sets. While this study is limited to synchronous Boolean networks, the algorithm is generalizable to include multi-state models, essentially allowing direct application to realistic biological data sets. The ability to adequately solve the inverse problem may enable in-depth analysis of complex dynamic systems in biology and other fields.
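The mutual-information test at the heart of REVEAL can be sketched on a toy synchronous Boolean network: an input set fully controls an element exactly when MI(inputs; output) equals H(output). This is a from-scratch Python illustration (the original is a C program), and the three-element rules are invented:

```python
from itertools import product, combinations
from math import log2
from collections import Counter

def entropy(seq):
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in Counter(seq).values())

def reveal_inputs(transitions, target, k):
    """Find a k-element input set whose current state fully determines
    the target element's next state, i.e. MI(inputs; output) = H(output)."""
    n = len(transitions[0][0])
    outputs = [nxt[target] for _, nxt in transitions]
    for inputs in combinations(range(n), k):
        joint = [tuple(cur[i] for i in inputs) for cur, _ in transitions]
        # MI(X; Y) = H(X) + H(Y) - H(X, Y)
        mi = entropy(joint) + entropy(outputs) - entropy(list(zip(joint, outputs)))
        if abs(mi - entropy(outputs)) < 1e-9:
            return inputs
    return None

# Toy 3-element Boolean net: next state is (g1 AND g2, g0, g1 XOR g2).
def step(s):
    return (s[1] & s[2], s[0], s[1] ^ s[2])

# Complete state transition table: all 8 (state, next state) pairs.
transitions = [(s, step(s)) for s in product([0, 1], repeat=3)]
# reveal_inputs(transitions, 0, 2) recovers the wiring (1, 2) of element 0
```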
Puliafico, Anthony C.; Kurtz, Steven M. S.; Pincus, Donna B.; Comer, Jonathan S.
2014-01-01
Although efficacious psychological treatments for internalizing disorders are now well established for school-aged children, until recently there have regrettably been limited empirical efforts to clarify indicated psychological intervention methods for the treatment of mood and anxiety disorders presenting in early childhood. Young children lack many of the developmental capacities required to effectively participate in established treatments for mood and anxiety problems presenting in older children, making simple downward extensions of these treatments for the management of preschool internalizing problems misguided. In recent years, a number of research groups have successfully adapted and modified parent–child interaction therapy (PCIT), originally developed to treat externalizing problems in young children, to treat various early internalizing problems with a set of neighboring protocols. As in traditional PCIT, these extensions target child symptoms by directly reshaping parent–child interaction patterns associated with the maintenance of symptoms. The present review outlines this emerging set of novel PCIT adaptations and modifications for mood and anxiety problems in young children and reviews preliminary evidence supporting their use. Specifically, we cover (a) PCIT for early separation anxiety disorder; (b) the PCIT-CALM (Coaching Approach behavior and Leading by Modeling) Program for the full range of early anxiety disorders; (c) the group Turtle Program for behavioral inhibition; and (d) the PCIT-ED (Emotional Development) Program for preschool depression. In addition, emerging PCIT-related protocols in need of empirical attention—such as the PCIT-SM (selective mutism) Program for young children with SM—are also considered. Implications of these protocols are discussed with regard to their unique potential to address the clinical needs of young children with internalizing problems. 
Obstacles to broad dissemination are addressed, and we consider potential solutions, including modular treatment formats and innovative applications of technology. PMID:25212716
Online Case-Based Discussions: Examining Coverage of the Afforded Problem Space
ERIC Educational Resources Information Center
Ertmer, Peggy A.; Koehler, Adrie A.
2014-01-01
Case studies hold great potential for engaging students in disciplinary content. However, little is known about the extent to which students actually cover the problem space afforded by a particular case study. In this research, we compared the problem space afforded by an instructional design case study with the actual content covered by 16…
NASA Astrophysics Data System (ADS)
Neggers, R. A. J.; Ackerman, A. S.; Angevine, W. M.; Bazile, E.; Beau, I.; Blossey, P. N.; Boutle, I. A.; de Bruijn, C.; Cheng, A.; van der Dussen, J.; Fletcher, J.; Dal Gesso, S.; Jam, A.; Kawai, H.; Cheedela, S. K.; Larson, V. E.; Lefebvre, M.-P.; Lock, A. P.; Meyer, N. R.; de Roode, S. R.; de Rooy, W.; Sandu, I.; Xiao, H.; Xu, K.-M.
2017-10-01
Results are presented of the GASS/EUCLIPSE single-column model intercomparison study on the subtropical marine low-level cloud transition. A central goal is to establish the performance of state-of-the-art boundary-layer schemes for weather and climate models for this cloud regime, using large-eddy simulations of the same scenes as a reference. A novelty is that the comparison covers four different cases instead of one, in order to broaden the covered parameter space. Three cases are situated in the North-Eastern Pacific, while one reflects conditions in the North-Eastern Atlantic. A set of variables is considered that reflects key aspects of the transition process, making use of simple metrics to establish the model performance. Using this method, some longstanding problems in low-level cloud representation are identified. Considerable spread exists among models concerning the cloud amount, its vertical structure, and the associated impact on radiative transfer. The sign and amplitude of these biases differ somewhat per case, depending on how far the transition has progressed. After cloud breakup the ensemble median exhibits the well-known "too few too bright" problem. The boundary-layer deepening rate and its state of decoupling are both underestimated, while the representation of the thin capping cloud layer appears complicated by a lack of vertical resolution. Encouragingly, some models are successful in representing the full set of variables, in particular, the vertical structure and diurnal cycle of the cloud layer in transition. An intriguing result is that the median of the model ensemble performs best, inspiring a new approach in subgrid parameterization.
Bradley, William G
2004-04-01
Radiologists are responsible for providing prompt emergency radiology interpretations 24 hours a day, every day of the year. As a result of the increasing use of multidetector computed tomography, emergency radiology has increased significantly in volume over the past 5 years. Simultaneously, radiologists are working harder during the day because of the workforce shortage. Although teleradiology services located in the continental United States have been providing efficient coverage until recently, they are now having increasing difficulty recruiting radiologists who are willing to work at night. Addressing this problem is "offshore teleradiology." With the increasing use of several enabling technologies--Digital Imaging and Communication in Medicine, the picture archiving and communication system, and the Internet--it is now possible to cover a domestic radiology practice at night from any location in the world where it is daytime. Setting up such a practice is nontrivial, however. The radiologists must all be American trained and certified by the American Board of Radiology. They must have medical licenses in every state and privileges at every hospital they cover. This article describes some of the details involved in setting up an offshore teleradiology practice. It also attempts to make a financial case for using such a practice, particularly in the current economic environment.
BOREAS RSS-14 Level-1 GOES-8 Visible, IR and Water Vapor Images
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Faysash, David; Cooper, Harry J.; Smith, Eric A.; Newcomer, Jeffrey A.
2000-01-01
The BOREAS RSS-14 team collected and processed several GOES-7 and GOES-8 image data sets that covered the BOREAS study region. The level-1 BOREAS GOES-8 images are raw data values collected by RSS-14 personnel at FSU and delivered to BORIS. The data cover 14-Jul-1995 to 21-Sep-1995 and 01-Jan-1996 to 03-Oct-1996. The data start out containing three 8-bit spectral bands and end up containing five 10-bit spectral bands. No major problems with the data have been identified. The data are contained in binary image format files. Due to the large size of the images, the level-1 GOES-8 data are not contained on the BOREAS CD-ROM set. An inventory listing file is supplied on the CD-ROM to inform users of what data were collected. The level-1 GOES-8 image data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). See sections 15 and 16 for more information. The data files are available on a CD-ROM (see document number 20010000884).
An algorithm for designing minimal microbial communities with desired metabolic capacities
Eng, Alexander; Borenstein, Elhanan
2016-01-01
Motivation: Recent efforts to manipulate various microbial communities, such as fecal microbiota transplant and bioreactor systems’ optimization, suggest a promising route for microbial community engineering with numerous medical, environmental and industrial applications. However, such applications are currently restricted in scale and often rely on mimicking or enhancing natural communities, calling for the development of tools for designing synthetic communities with specific, tailored, desired metabolic capacities. Results: Here, we present a first step toward this goal, introducing a novel algorithm for identifying minimal sets of microbial species that collectively provide the enzymatic capacity required to synthesize a set of desired target product metabolites from a predefined set of available substrates. Our method integrates a graph theoretic representation of network flow with the set cover problem in an integer linear programming (ILP) framework to simultaneously identify possible metabolic paths from substrates to products while minimizing the number of species required to catalyze these metabolic reactions. We apply our algorithm to successfully identify minimal communities both in a set of simple toy problems and in more complex, realistic settings, and to investigate metabolic capacities in the gut microbiome. Our framework adds to the growing toolset for supporting informed microbial community engineering and for ultimately realizing the full potential of such engineering efforts. Availability and implementation: The algorithm source code, compilation, usage instructions and examples are available under a non-commercial research use only license at https://github.com/borenstein-lab/CoMiDA. Contact: elbo@uw.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153571
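The set-cover core of the community-design problem described above can be illustrated with a small sketch. The paper formulates it as an integer linear program; the greedy heuristic below is only a simplified stand-in for that formulation, and all species and reaction names are invented.

```python
# Toy sketch of the species-selection idea: choose a small set of species
# whose combined enzymatic reactions cover all required reactions.
# Greedy heuristic only; the paper itself uses an ILP formulation.

def min_species_cover(species_reactions, required):
    """Greedily pick the species that covers the most still-uncovered
    required reactions, until everything is covered."""
    uncovered = set(required)
    chosen = []
    while uncovered:
        best = max(species_reactions,
                   key=lambda s: len(species_reactions[s] & uncovered))
        gained = species_reactions[best] & uncovered
        if not gained:
            raise ValueError("required reactions cannot be covered")
        chosen.append(best)
        uncovered -= gained
    return chosen

# Hypothetical community: each species contributes a set of reactions.
community = {
    "sp_A": {"r1", "r2"},
    "sp_B": {"r2", "r3", "r4"},
    "sp_C": {"r4"},
}
print(min_species_cover(community, {"r1", "r2", "r3", "r4"}))
# → ['sp_B', 'sp_A']
```

The greedy rule gives a logarithmic approximation guarantee for set cover, whereas the ILP in the paper returns provably minimal communities at higher computational cost.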
Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Tutorial
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. L. Smith; S. T. Beck; S. T. Wood
2008-08-01
The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). This volume is the tutorial manual for the SAPHIRE system. In this document, a series of lessons is provided that guides the user through the basic steps common to most analyses performed with SAPHIRE. The tutorial is divided into two major sections covering both basic and advanced features. The section covering basic topics contains lessons that lead the reader through development of a probabilistic hypothetical problem involving a vehicle accident, highlighting the program's most fundamental features. The advanced features section contains additional lessons that expand on fundamental analysis features of SAPHIRE and provide insights into more complex analysis techniques. Together, these two elements provide an overview of the operation and capabilities of the SAPHIRE software.
Classification Based on Pruning and Double Covered Rule Sets for the Internet of Things Applications
Zhou, Zhongmei; Wang, Weiping
2014-01-01
The Internet of things (IOT) is a hot issue in recent years. It accumulates large amounts of data from IOT users, which poses a great challenge for mining useful knowledge from IOT. Classification is an effective strategy which can predict the needs of users in IOT. However, many traditional rule-based classifiers cannot guarantee that all instances are covered by at least two classification rules. Thus, these algorithms cannot achieve high accuracy on some datasets. In this paper, we propose a new rule-based classification, CDCR-P (Classification based on the Pruning and Double Covered Rule sets). CDCR-P can induce two different rule sets A and B. Every instance in the training set can be covered by at least one rule not only in rule set A, but also in rule set B. In order to improve the quality of rule set B, we take measures to prune the length of rules in rule set B. Our experimental results indicate that CDCR-P is not only feasible, but can also achieve high accuracy. PMID:24511304
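The double-covering property described above, that every training instance must match at least one rule in each of the two induced rule sets, can be checked with a toy sketch. This is not the authors' implementation; the rule representation and all attribute names are invented for illustration.

```python
# Toy check of the "double covered" property: every instance matches at
# least one rule in rule set A AND at least one rule in rule set B.
# Rules are modeled as dicts of attribute -> required value.

def matches(rule, instance):
    # A rule fires when all of its attribute-value conditions hold.
    return all(instance.get(attr) == val for attr, val in rule.items())

def double_covered(instances, rule_set_a, rule_set_b):
    return all(
        any(matches(r, x) for r in rule_set_a) and
        any(matches(r, x) for r in rule_set_b)
        for x in instances
    )

# Invented toy data and rule sets.
data = [{"temp": "hot", "wind": "weak"}, {"temp": "cool", "wind": "strong"}]
A = [{"temp": "hot"}, {"temp": "cool"}]
B = [{"wind": "weak"}, {"wind": "strong"}]
print(double_covered(data, A, B))  # → True
```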
ERIC Educational Resources Information Center
Poncy, Brian C.; Skinner, Christopher H.; McCallum, Elizabeth
2012-01-01
An adapted alternating treatments design was used to compare the effects of class-wide applications of Taped Problems (TP) and Cover, Copy, and Compare (CCC) procedures designed to enhance subtraction fact fluency in an intact third-grade classroom. During the TP procedure, a tape provided an auditory prompt (i.e., the problem), followed by a…
V&V of MCNP 6.1.1 Beta Against Intermediate and High-Energy Experimental Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mashnik, Stepan G
This report presents a set of validation and verification (V&V) MCNP 6.1.1 beta results calculated in parallel, with MPI, obtained using its event generators at intermediate and high energies, compared against various experimental data. It also contains several examples of results using the models at energies below 150 MeV, down to 10 MeV, where data libraries are normally used. This report can be considered the fourth part of a set of MCNP6 Testing Primers, after its first, LA-UR-11-05129, second, LA-UR-11-05627, and third, LA-UR-26944, publications, but is devoted to V&V with the latest, 1.1 beta version of MCNP6. The MCNP6 test problems discussed here are presented in the /VALIDATION_CEM/ and /VALIDATION_LAQGSM/ subdirectories in the MCNP6/Testing/ directory. README files that contain short descriptions of every input file, the experiment, the quantity of interest that the experiment measures and its description in the MCNP6 output files, and the publication reference of that experiment are presented for every test problem. Templates for plotting the corresponding results with xmgrace, as well as pdf files with figures representing the final results of our V&V efforts, are presented. Several technical “bugs” in MCNP 6.1.1 beta were discovered during our current V&V of MCNP6 while running it in parallel with MPI using its event generators. These “bugs” are to be fixed in the following version of MCNP6. Our results show that MCNP 6.1.1 beta, using its CEM03.03, LAQGSM03.03, Bertini, and INCL+ABLA event generators, describes, as a rule, reasonably well different intermediate- and high-energy measured data. This primer isn’t meant to be read from cover to cover. Readers may skip some sections and go directly to any test problem in which they are interested.
A cloud cover model based on satellite data
NASA Technical Reports Server (NTRS)
Somerville, P. N.; Bean, S. J.
1980-01-01
A model for worldwide cloud cover using a satellite data set containing infrared radiation measurements is proposed. The satellite data set containing day IR, night IR and incoming and absorbed solar radiation measurements on a 2.5 degree latitude-longitude grid covering a 45 month period was converted to estimates of cloud cover. The global area was then classified into homogeneous cloud cover regions for each of the four seasons. It is noted that the developed maps can be of use to the practicing climatologist who can obtain a considerable amount of cloud cover information without recourse to large volumes of data.
Gruen, Russell L; Knox, Stephanie; Britt, Helena; Bailie, Ross S
2004-01-01
Background The interface between primary care and specialist medical services is an important domain for health services research and policy. Of particular concern is optimising specialist services and the organisation of the specialist workforce to meet the needs and demands for specialist care, particularly those generated by referral from primary care. However, differences in the disease classification and reporting of the work of primary and specialist surgical sectors hamper such research. This paper describes the development of a bridging classification for use in the study of potential surgical problems in primary care settings, and for classifying referrals to surgical specialties. Methods A three stage process was undertaken, which involved: (1) defining the categories of surgical disorders from a specialist perspective that were relevant to the specialist-primary care interface; (2) classifying the 'terms' in the International Classification of Primary Care Version 2-Plus (ICPC-2 Plus) to the surgical categories; and (3) using referral data from 303,000 patient encounters in the BEACH study of general practice activity in Australia to define a core set of surgical conditions. Inclusion of terms was based on the probability of specialist referral of patients with such problems, and specialists' perception that they constitute part of normal surgical practice. Results A four-level hierarchy was developed, containing 8, 27 and 79 categories in the first, second and third levels, respectively. These categories classified 2050 ICPC-2 Plus terms that constituted the fourth level, and which covered the spectrum of problems that were managed in primary care and referred to surgical specialists. Conclusion Our method of classifying terms from a primary care classification system to categories delineated by specialists should be applicable to research addressing the interface between primary and specialist care. 
By describing the process and putting the bridging classification system in the public domain, we invite comment and application in other settings where similar problems might be faced. PMID:15142280
NASA Astrophysics Data System (ADS)
Kim, Jin-Hong; Lee, Jun-Seok; Lim, Jungshik; Seo, Jung-Kyo
2009-03-01
Narrow gap distance in a cover-layer incident near-field recording (NFR) configuration causes a collision problem at the interface between a solid immersion lens and a disk surface. A polymer cover-layer with a smooth surface results in a stable gap servo, while a nanocomposite cover-layer with a high refractive index shows a collision problem during the gap servo test. Even though a dielectric cover-layer, whose surface is rougher than the polymer's, supplements the mechanical properties, an unclear eye pattern due to an unstable gap servo can be obtained after chemical mechanical polishing. Not only a smooth surface but also good mechanical properties of the cover-layer are required for a stable gap servo in NFR.
Treatment of ice cover and other thin elastic layers with the parabolic equation method.
Collins, Michael D
2015-03-01
The parabolic equation method is extended to handle problems involving ice cover and other thin elastic layers. Parabolic equation solutions are based on rational approximations that are designed using accuracy constraints to ensure that the propagating modes are handled properly and stability constraints to ensure that the non-propagating modes are annihilated. The non-propagating modes are especially problematic for problems involving thin elastic layers. It is demonstrated that stable results may be obtained for such problems by using rotated rational approximations [Milinazzo, Zala, and Brooke, J. Acoust. Soc. Am. 101, 760-766 (1997)] and generalizations of these approximations. The approach is applied to problems involving ice cover with variable thickness and sediment layers that taper to zero thickness.
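As a sketch of the kind of approximation involved (standard parabolic-equation practice, not reproduced from the paper itself): the one-way field is advanced in range by the operator $\exp\bigl(ik_0\Delta x(\sqrt{1+X}-1)\bigr)$, where $X$ is the depth operator, and this exponential is replaced by a rational approximation,

```latex
e^{ik_0\Delta x\left(\sqrt{1+X}-1\right)}
  \;\approx\; \prod_{j=1}^{n} \frac{1+\alpha_j X}{1+\beta_j X}.
```

The rotated variants cited above apply the Padé expansion after a rotation of the branch cut, writing $\sqrt{1+X} = e^{i\theta/2}\sqrt{1+z}$ with $z = e^{-i\theta}(1+X)-1$, which is what damps the troublesome non-propagating modes.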
Sakhalin Island terrain intelligence
1943-01-01
This folio of maps and explanatory tables outlines the principal terrain features of Sakhalin Island. Each map and table is devoted to a specialized set of problems; together they cover the subjects of terrain appreciation, climate, rivers, water supply, construction materials, suitability for roads, suitability for airfields, fuels and other mineral resources, and geology. In most cases, the map of the island is divided into two parts: N. of latitude 50° N., Russian Sakhalin, and south of latitude 50° N., Japanese Sakhalin or Karafuto. These maps and data were compiled by the United States Geological Survey during the period from March to September, 1943.
Statement of Ethics in Neurosurgery of the World Federation of Neurosurgical Societies.
Umansky, Felix; Black, Peter L; DiRocco, Concenzio; Ferrer, Enrique; Goel, Atul; Malik, Ghaus M; Mathiesen, Tiit; Mendez, Ivar; Palmer, James D; Juanotena, Jorge Rodriguez; Fraifeld, Shifra; Rosenfeld, Jeffrey V
2011-01-01
This Statement of Ethics in Neurosurgery was developed by the Committee for Ethics and Medico-Legal Affairs of the World Federation of Neurosurgical Societies to help neurosurgeons resolve problems in the treatment of individual patients and meet obligations to the larger society. This document is intended as a framework rather than a set of rules. It cannot cover every situation and should be used with flexibility. However, it is our intent that the fundamental principles enunciated here should serve as a guide in the day-to-day practice of neurosurgery. Copyright © 2011 Elsevier Inc. All rights reserved.
BOOK REVIEW: A First Course in Loop Quantum Gravity A First Course in Loop Quantum Gravity
NASA Astrophysics Data System (ADS)
Dittrich, Bianca
2012-12-01
Students who are interested in quantum gravity usually face the difficulty of working through a large amount of prerequisite material before being able to deal with actual quantum gravity. A First Course in Loop Quantum Gravity by Rodolfo Gambini and Jorge Pullin, aimed at undergraduate students, marvellously succeeds in starting from the basics of special relativity and covering basic topics in Hamiltonian dynamics, Yang Mills theory, general relativity and quantum field theory, ending with a tour on current (loop) quantum gravity research. This is all done in a short 173 pages! As such the authors cannot cover any of the subjects in depth and indeed this book should be seen more as a motivation and orientation guide so that students can go on to follow the hints for further reading. Also, as there are many subjects to cover beforehand, slightly more than half of the book is concerned with more general subjects (special and general relativity, Hamiltonian dynamics, constrained systems, quantization) before the starting point for loop quantum gravity, the Ashtekar variables, are introduced. The approach taken by the authors is heuristic and uses simplifying examples in many places. However they take care in motivating all the main steps and succeed in presenting the material pedagogically. Problem sets are provided throughout and references for further reading are given. Despite the shortness of space, alternative viewpoints are mentioned and the reader is also referred to experimental results and bounds. In the second half of the book the reader gets a ride through loop quantum gravity; the material covers geometric operators and their spectra, the Hamiltonian constraints, loop quantum cosmology and, more broadly, black hole thermodynamics. A glimpse of recent developments and open problems is given, for instance a discussion on experimental predictions, where the authors carefully point out the very preliminary nature of the results. 
The authors close with an 'open issues and controversies' section, addressing some of the criticism of loop quantum gravity and pointing to weak points of the theory. Again, readers aiming at starting research in loop quantum gravity should take this as a guide and motivation for further study, as many technicalities are naturally left out. In summary this book fully reaches the aim set by the authors - to introduce the topic in a way that is widely accessible to undergraduates - and as such is highly recommended.
Time series change detection: Algorithms for land cover change
NASA Astrophysics Data System (ADS)
Boriah, Shyam
The climate and earth sciences have recently undergone a rapid transformation from a data-poor to a data-rich environment. In particular, climate and ecosystem related observations from remote sensors on satellites, as well as outputs of climate or earth system models from large-scale computational platforms, provide terabytes of temporal, spatial and spatio-temporal data. These massive and information-rich datasets offer huge potential for advancing the science of land cover change, climate change and anthropogenic impacts. One important area where remote sensing data can play a key role is in the study of land cover change. Specifically, the conversion of natural land cover into human-dominated cover types continues to be a change of global proportions with many unknown environmental consequences. In addition, being able to assess the carbon risk of changes in forest cover is of critical importance for both economic and scientific reasons. In fact, changes in forests account for as much as 20% of the greenhouse gas emissions in the atmosphere, an amount second only to fossil fuel emissions. Thus, there is a need in the earth science domain to systematically study land cover change in order to understand its impact on local climate, radiation balance, biogeochemistry, hydrology, and the diversity and abundance of terrestrial species. Land cover conversions include tree harvests in forested regions, urbanization, and agricultural intensification in former woodland and natural grassland areas. These types of conversions also have significant public policy implications due to issues such as water supply management and atmospheric CO2 output.
In spite of the importance of this problem and the considerable advances made over the last few years in high-resolution satellite data, data mining, and online mapping tools and services, end users still lack practical tools to help them manage and transform this data into actionable knowledge of changes in forest ecosystems that can be used for decision making and policy planning purposes. In particular, previous change detection studies have primarily relied on examining differences between two or more satellite images acquired on different dates. Thus, a technological solution that detects global land cover change using high temporal resolution time series data will represent a paradigm-shift in the field of land cover change studies. To realize these ambitious goals, a number of computational challenges in spatio-temporal data mining need to be addressed. Specifically, analysis and discovery approaches need to be cognizant of climate and ecosystem data characteristics such as seasonality, non-stationarity/inter-region variability, multi-scale nature, spatio-temporal autocorrelation, high-dimensionality and massive data size. This dissertation, a step in that direction, translates earth science challenges to computer science problems, and provides computational solutions to address these problems. In particular, three key technical capabilities are developed: (1) Algorithms for time series change detection that are effective and can scale up to handle the large size of earth science data; (2) Change detection algorithms that can handle large numbers of missing and noisy values present in satellite data sets; and (3) Spatio-temporal analysis techniques to identify the scale and scope of disturbance events.
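A minimal, generic illustration of the time-series change detection task described above: remove the seasonal cycle, then flag the time step with the largest shift in segment means. This is a simplified stand-in, not the dissertation's algorithms, and the data are synthetic.

```python
# Toy change detection on a seasonal time series: deseasonalize, then
# find the split point maximizing the difference of segment means.

def deseasonalize(series, period):
    # Subtract the mean of each position in the seasonal cycle.
    seasonal = [
        sum(series[i] for i in range(k, len(series), period)) /
        len(range(k, len(series), period))
        for k in range(period)
    ]
    return [x - seasonal[i % period] for i, x in enumerate(series)]

def change_point(series, period):
    """Return the index that best splits the deseasonalized series into
    two segments with different means."""
    r = deseasonalize(series, period)

    def split_score(t):
        a, b = r[:t], r[t:]
        return abs(sum(a) / len(a) - sum(b) / len(b))

    return max(range(1, len(r)), key=split_score)

# Synthetic vegetation-index-like signal: a seasonal cycle with a level
# drop after step 12, mimicking a land cover conversion.
signal = [5, 7, 9, 7] * 3 + [2, 4, 6, 4] * 3
print(change_point(signal, period=4))  # → 12
```

Real land cover applications add the complications noted above (missing and noisy observations, spatial autocorrelation, massive data volumes) that this sketch deliberately ignores.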
Multitriangulations, pseudotriangulations and some problems of realization of polytopes
NASA Astrophysics Data System (ADS)
Pilaud, Vincent
2010-09-01
This thesis explores two specific topics of discrete geometry, multitriangulations and polytopal realizations of products, whose connection is the problem of finding polytopal realizations of a given combinatorial structure. A k-triangulation is a maximal set of chords of the convex n-gon such that no k+1 of them mutually cross. We propose a combinatorial and geometric study of multitriangulations based on their stars, which play the same role as the triangles of triangulations. This study leads to interpreting multitriangulations, by duality, as pseudoline arrangements with contact points covering a given support. Finally, we exploit these results to discuss some open problems on multitriangulations, in particular the question of the polytopal realization of their flip graphs. Secondly, we study the polytopality of Cartesian products. We investigate the existence of polytopal realizations of Cartesian products of graphs, and we study the minimal dimension that a polytope whose k-skeleton is that of a product of simplices can have.
Graphite distortion, "C" Reactor. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, N.H.
1962-02-08
This report covers the efforts of the Laboratory in an investigation of the graphite distortion in the "C" reactor at Hanford. The particular aspects of the problem to be covered by the Laboratory were possible "fixes" to the control rod sticking problem caused by VSR channel distortion.
A study of dynamical behavior of space environment
NASA Technical Reports Server (NTRS)
Wu, S. T.
1974-01-01
Studies have covered a wide range of problems in the space environment, such as the problems of the dynamical behavior of the thermosphere, hydromagnetic wave propagation in the ionosphere, and interplanetary space environment. The theories used to analyze these problems range from a continuum theory of magnetohydrodynamics to the kinetic theory of free molecular flow. This is because the problems encountered covered the entire range of the Knudsen number (i.e., the ratio of mean free path to the characteristic length). Significant results are summarized.
NASA Technical Reports Server (NTRS)
Stoner, E. R.; May, G. A.; Kalcic, M. T. (Principal Investigator)
1981-01-01
Sample segments of ground-verified land cover data collected in conjunction with the USDA/ESS June Enumerative Survey were merged with LANDSAT data and served as a focus for unsupervised spectral class development and accuracy assessment. Multitemporal data sets were created from single-date LANDSAT MSS acquisitions from a nominal scene covering an eleven-county area in north central Missouri. Classification accuracies for the four land cover types predominant in the test site showed significant improvement in going from unitemporal to multitemporal data sets. Transformed LANDSAT data sets did not significantly improve classification accuracies. Regression estimators yielded mixed results for different land covers. Misregistration of the two LANDSAT data sets by as much as one and one-half pixels did not significantly alter overall classification accuracies. Existing algorithms for scene-to-scene overlay proved adequate for multitemporal data analysis as long as statistical class development and accuracy assessment were restricted to field-interior pixels.
Absence of Critical Points of Solutions to the Helmholtz Equation in 3D
NASA Astrophysics Data System (ADS)
Alberti, Giovanni S.
2016-11-01
The focus of this paper is to show the absence of critical points for the solutions to the Helmholtz equation in a bounded domain $\Omega \subset \mathbb{R}^3$, given by $\operatorname{div}(a\,\nabla u_{\omega}^{g}) - \omega\, q\, u_{\omega}^{g} = 0$ in $\Omega$, with $u_{\omega}^{g} = g$ on $\partial\Omega$. We prove that for an admissible $g$ there exists a finite set of frequencies $K$ in a given interval and an open cover $\overline{\Omega} = \cup_{\omega \in K} \Omega_{\omega}$ such that $|\nabla u_{\omega}^{g}(x)| > 0$ for every $\omega \in K$ and $x \in \Omega_{\omega}$. The set $K$ is explicitly constructed. If the spectrum of this problem is simple, which is true for a generic domain $\Omega$, the admissibility condition on $g$ is a generic property.
Huff, Trevor J; Ludwig, Parker E; Zuniga, Jorge M
2018-05-01
3D-printed anatomical models play an important role in medical and research settings. The recent successes of 3D anatomical models in healthcare have led many institutions to adopt the technology. However, there remain several issues that must be addressed before it can become more widespread. Of importance are the problems of cost and time of manufacturing. Machine learning (ML) could be utilized to solve these issues by streamlining the 3D modeling process through rapid medical image segmentation and improved patient selection and image acquisition. The current challenges, potential solutions, and future directions for ML and 3D anatomical modeling in healthcare are discussed. Areas covered: This review covers research articles in the field of machine learning as related to 3D anatomical modeling. Topics discussed include automated image segmentation, cost reduction, and related time constraints. Expert commentary: ML-based segmentation of medical images could potentially improve the process of 3D anatomical modeling. However, until more research is done to validate these technologies in clinical practice, their impact on patient outcomes will remain unknown. We have the necessary computational tools to tackle the problems discussed. The difficulty now lies in our ability to collect sufficient data.
ERIC Educational Resources Information Center
Huitzing, Hiddo A.
2004-01-01
This article shows how set covering with item sampling (SCIS) methods can be used in the analysis and preanalysis of linear programming models for test assembly (LPTA). LPTA models can construct tests, fulfilling a set of constraints set by the test assembler. Sometimes, no solution to the LPTA model exists. The model is then said to be…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neggers, R. A. J.; Ackerman, A. S.; Angevine, W. M.
Results are presented of the GASS/EUCLIPSE single-column model inter-comparison study on the subtropical marine low-level cloud transition. A central goal is to establish the performance of state-of-the-art boundary-layer schemes for weather and climate models for this cloud regime, using large-eddy simulations of the same scenes as a reference. A novelty is that the comparison covers four different cases instead of one, in order to broaden the covered parameter space. Three cases are situated in the North-Eastern Pacific, while one reflects conditions in the North-Eastern Atlantic. A set of variables is considered that reflects key aspects of the transition process, making use of simple metrics to establish the model performance. Using this method some longstanding problems in low level cloud representation are identified. Considerable spread exists among models concerning the cloud amount, its vertical structure and the associated impact on radiative transfer. The sign and amplitude of these biases differ somewhat per case, depending on how far the transition has progressed. After cloud breakup the ensemble median exhibits the well-known “too few too bright” problem. The boundary layer deepening rate and its state of decoupling are both underestimated, while the representation of the thin capping cloud layer appears complicated by a lack of vertical resolution. Encouragingly, some models are successful in representing the full set of variables, in particular the vertical structure and diurnal cycle of the cloud layer in transition. An intriguing result is that the median of the model ensemble performs best, inspiring a new approach in subgrid parameterization.
Use of Web-based library resources by medical students in community and ambulatory settings*
Tannery, Nancy Hrinya; Foust, Jill E.; Gregg, Amy L.; Hartman, Linda M.; Kuller, Alice B.; Worona, Paul; Tulsky, Asher A.
2002-01-01
Purpose: The purpose was to evaluate the use of Web-based library resources by third-year medical students. Setting/Participants/Resources: Third-year medical students (147) in a twelve-week multidisciplinary primary care rotation in community and ambulatory settings. Methodology: Individual user surveys and log file analysis of Website were used. Results/Outcomes: Twenty resource topics were compiled into a Website to provide students with access to electronic library resources from any community-based clerkship location. These resource topics, covering subjects such as hypertension and back pain, linked to curriculum training problems, full-text journal articles, MEDLINE searches, electronic book chapters, and relevant Websites. More than half of the students (69%) accessed the Website on a daily or weekly basis. Over 80% thought the Website was a valuable addition to their clerkship. Discussion/Conclusion: Web-based information resources can provide curriculum support to students for whom access to the library is difficult and time consuming. PMID:12113515
Bagging Voronoi classifiers for clustering spatial functional data
NASA Astrophysics Data System (ADS)
Secchi, Piercesare; Vantini, Simone; Vitelli, Valeria
2013-06-01
We propose a bagging strategy based on random Voronoi tessellations for the exploration of geo-referenced functional data, suitable for different purposes (e.g., classification, regression, dimensional reduction, …). Urged by an application to environmental data contained in the Surface Solar Energy database, we focus in particular on the problem of clustering functional data indexed by the sites of a spatial finite lattice. We thus illustrate our strategy by implementing a specific algorithm whose rationale is to (i) replace the original data set with a reduced one, composed by local representatives of neighborhoods covering the entire investigated area; (ii) analyze the local representatives; (iii) repeat the previous analysis many times for different reduced data sets associated to randomly generated different sets of neighborhoods, thus obtaining many different weak formulations of the analysis; (iv) finally, bag together the weak analyses to obtain a conclusive strong analysis. Through an extensive simulation study, we show that this new procedure - which does not require an explicit model for spatial dependence - is statistically and computationally efficient.
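The four-step strategy above lends itself to a compact sketch. The following Python toy runs steps (i)-(iv) on synthetic lattice data; all sizes, the data-generating process, the minimal k-means routine, and the label-alignment convention are our own illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy geo-referenced functional data: n_sites lattice sites, each carrying
# a curve of length T; two hidden spatial clusters split at x = 0.5.
n_sites, T, n_clusters, n_bags, n_cells = 200, 30, 2, 50, 25
sites = rng.uniform(0, 1, size=(n_sites, 2))
labels_true = (sites[:, 0] > 0.5).astype(int)
data = rng.normal(labels_true[:, None] * 2.0, 1.0, size=(n_sites, T))

def kmeans(X, k, iters=20):
    """Minimal k-means, to keep the sketch dependency-free."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[lab == j].mean(0) if (lab == j).any() else centers[j]
                            for j in range(k)])
    return lab

votes = np.zeros((n_sites, n_clusters))
for _ in range(n_bags):
    # (iii) a random Voronoi tessellation: nuclei drawn among the sites
    nuclei = rng.choice(n_sites, n_cells, replace=False)
    cell = ((sites[:, None] - sites[nuclei][None]) ** 2).sum(-1).argmin(1)
    # (i) local representatives: the mean curve of each neighborhood
    reps = np.array([data[cell == c].mean(0) for c in range(n_cells)])
    # (ii) one weak analysis: cluster the representatives
    rep_lab = kmeans(reps, n_clusters)
    # Align cluster labels across bags by sorting on each cluster's mean level
    # (a convention for this toy, so that votes are comparable between bags).
    means = [reps[rep_lab == j].mean() if (rep_lab == j).any() else np.inf
             for j in range(n_clusters)]
    remap = np.empty(n_clusters, dtype=int)
    remap[np.argsort(means)] = np.arange(n_clusters)
    votes[np.arange(n_sites), remap[rep_lab][cell]] += 1

final = votes.argmax(1)   # (iv) bag the weak analyses into a strong one
```

Note how each bag sees only the cell representatives, so the spatial dependence is handled implicitly by the random neighborhoods rather than by an explicit model.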
Segmentation schema for enhancing land cover identification: A case study using Sentinel 2 data
NASA Astrophysics Data System (ADS)
Mongus, Domen; Žalik, Borut
2018-04-01
Land monitoring is performed increasingly using high and medium resolution optical satellites, such as the Sentinel-2. However, optical data is inevitably subjected to the variable operational conditions under which it was acquired. Overlapping of features caused by shadows, soft transitions between shadowed and non-shadowed regions, and temporal variability of the observed land-cover types require radiometric corrections. This study examines a new approach to enhancing the accuracy of land cover identification that resolves this problem. The proposed method constructs an ensemble-type classification model with weak classifiers tuned to the particular operational conditions under which the data was acquired. Iterative segmentation over the learning set is applied for this purpose, where feature space is partitioned according to the likelihood of misclassifications introduced by the classification model. As these are a consequence of overlapping features, such partitioning avoids the need for radiometric corrections of the data, and divides land cover types implicitly into subclasses. As a result, improved performance of all tested classification approaches was measured during the validation that was conducted on Sentinel-2 data. The highest accuracies in terms of F1-scores were achieved using the Naive Bayes Classifier as the weak classifier, while supplementing original spectral signatures with normalised difference vegetation index and texture analysis features, namely, average intensity, contrast, homogeneity, and dissimilarity. In total, an F1-score of nearly 95% was achieved in this way, with F1-scores of each particular land cover type reaching above 90%.
NASA Technical Reports Server (NTRS)
1943-01-01
This is the fourth of a series of reports covering an investigation of the general instability problem by the California Institute of Technology. The first five reports of this series cover investigations of the general instability problem under the loading conditions of pure bending and were prepared under the sponsorship of the Civil Aeronautics Administration. The succeeding reports of this series cover the work done on other loading conditions under the sponsorship of the National Advisory Committee for Aeronautics. This report is to deal primarily with the continuation of tests of sheet-covered specimens and studies of the buckling phenomena of unstiffened circular cylinders.
What you should know about land-cover data
Gallant, Alisa L.
2009-01-01
Wildlife biologists are using land-characteristics data sets for a variety of applications. Many kinds of landscape variables have been characterized and the resultant data sets or maps are readily accessible. Often, too little consideration is given to the accuracy or traits of these data sets, most likely because biologists do not know how such data are compiled and rendered, or the potential pitfalls that can be encountered when applying these data. To increase understanding of the nature of land-characteristics data sets, I introduce aspects of source information and data-handling methodology that include the following: ambiguity of land characteristics; temporal considerations and the dynamic nature of the landscape; type of source data versus landscape features of interest; data resolution, scale, and geographic extent; data entry and positional problems; rare landscape features; and interpreter variation. I also include guidance for determining the quality of land-characteristics data sets through metadata or published documentation, visual clues, and independent information. The quality or suitability of the data sets for wildlife applications may be improved with thematic or spatial generalization, avoidance of transitional areas on maps, and merging of multiple data sources. Knowledge of the underlying challenges in compiling such data sets will help wildlife biologists to better assess the strengths and limitations and determine how best to use these data.
Data sets for snow cover monitoring and modelling from the National Snow and Ice Data Center
NASA Astrophysics Data System (ADS)
Holm, M.; Daniels, K.; Scott, D.; McLean, B.; Weaver, R.
2003-04-01
A wide range of snow cover monitoring and modelling data sets are pending or are currently available from the National Snow and Ice Data Center (NSIDC). In-situ observations support validation experiments that enhance the accuracy of remote sensing data. In addition, remote sensing data are available in near-real time, providing coarse-resolution snow monitoring capability. Time series data beginning in 1966 are valuable for modelling efforts. NSIDC holdings include SMMR and SSM/I snow cover data, MODIS snow cover extent products, in-situ and satellite data collected for NASA's recent Cold Land Processes Experiment, and soon-to-be-released AMSR-E passive microwave products. The AMSR-E and MODIS sensors are part of NASA's Earth Observing System flying on the Terra and Aqua satellites. Characteristics of these NSIDC-held data sets, appropriateness of products for specific applications, and data set access and availability will be presented.
Introduction: energy and the subsurface.
Christov, Ivan C; Viswanathan, Hari S
2016-10-13
This theme issue covers topics at the forefront of scientific research on energy and the subsurface, ranging from carbon dioxide (CO2) sequestration to the recovery of unconventional shale oil and gas resources through hydraulic fracturing. As such, the goal of this theme issue is to have an impact on the scientific community, broadly, by providing a self-contained collection of articles contributing to and reviewing the state-of-the-art of the field. This collection of articles could be used, for example, to set the next generation of research directions, while also being useful as a self-study guide for those interested in entering the field. Review articles are included on the topics of hydraulic fracturing as a multiscale problem, numerical modelling of hydraulic fracture propagation, the role of computational sciences in the upstream oil and gas industry and chemohydrodynamic patterns in porous media. Complementing the reviews is a set of original research papers covering growth models for branched hydraulic crack systems, fluid-driven crack propagation in elastic matrices, elastic and inelastic deformation of fluid-saturated rock, reaction front propagation in fracture matrices, the effects of rock mineralogy and pore structure on stress-dependent permeability of shales, topographic viscous fingering and plume dynamics in porous media convection. This article is part of the themed issue 'Energy and the subsurface'. © 2016 The Author(s).
Introduction: energy and the subsurface
Viswanathan, Hari S.
2016-01-01
This theme issue covers topics at the forefront of scientific research on energy and the subsurface, ranging from carbon dioxide (CO2) sequestration to the recovery of unconventional shale oil and gas resources through hydraulic fracturing. As such, the goal of this theme issue is to have an impact on the scientific community, broadly, by providing a self-contained collection of articles contributing to and reviewing the state-of-the-art of the field. This collection of articles could be used, for example, to set the next generation of research directions, while also being useful as a self-study guide for those interested in entering the field. Review articles are included on the topics of hydraulic fracturing as a multiscale problem, numerical modelling of hydraulic fracture propagation, the role of computational sciences in the upstream oil and gas industry and chemohydrodynamic patterns in porous media. Complementing the reviews is a set of original research papers covering growth models for branched hydraulic crack systems, fluid-driven crack propagation in elastic matrices, elastic and inelastic deformation of fluid-saturated rock, reaction front propagation in fracture matrices, the effects of rock mineralogy and pore structure on stress-dependent permeability of shales, topographic viscous fingering and plume dynamics in porous media convection. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597784
EFFECTS OF LANDSCAPE CHARACTERISTICS ON LAND-COVER CLASS ACCURACY
Utilizing land-cover data gathered as part of the National Land-Cover Data (NLCD) set accuracy assessment, several logistic regression models were formulated to analyze the effects of patch size and land-cover heterogeneity on classification accuracy. Specific land-cover ...
Effects of Cover Stories on Problem Solving in a Statistics Course
ERIC Educational Resources Information Center
Ricks, Travis Rex; Wiley, Jennifer
2014-01-01
Does having more knowledge or interest in the topics used in example problems facilitate or hinder learning in statistics? Undergraduates enrolled in Introductory Psychology received a lesson on central tendency. Following the lesson, half of the students completed a worksheet with a baseball cover story while the other half received a weather…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-31
... Conservation Program: Proposed Determination of Set-Top Boxes and Network Equipment as a Covered Consumer... published June 15, 2011 that set-top boxes (STBs) and network equipment qualify as a covered product under... action in light of a consensus agreement entered by a broadly representative group that DOE believes has...
A graph-theoretic approach for inparalog detection.
Tremblay-Savard, Olivier; Swenson, Krister M
2012-01-01
Understanding the history of a gene family that evolves through duplication, speciation, and loss is a fundamental problem in comparative genomics. Features such as function, position, and structural similarity between genes are intimately connected to this history; relationships between genes such as orthology (genes related through a speciation event) or paralogy (genes related through a duplication event) are usually correlated with these features. For example, recent work has shown that in human and mouse there is a strong connection between function and inparalogs, the paralogs that were created since the speciation event separating the human and mouse lineages. Methods exist for detecting inparalogs that either use information from only two species, or consider a set of species but rely on clustering methods. In this paper we present a graph-theoretic approach for finding lower bounds on the number of inparalogs for a given set of species; we pose an edge covering problem on the similarity graph and give an efficient 2/3-approximation as well as a faster heuristic. Since the physical position of inparalogs corresponding to recent speciations is not likely to have changed since the duplication, we also use our predictions to estimate the types of duplications that have occurred in some vertebrates and drosophila.
The role of collaborative ontology development in the knowledge negotiation process
NASA Astrophysics Data System (ADS)
Rivera, Norma
Interdisciplinary research (IDR) collaboration can be defined as the process of integrating experts' knowledge, perspectives, and resources to advance scientific discovery. The emergence of increasingly complex research problems, together with the growth of scientific and technical knowledge, has resulted in the need for researchers from diverse fields to provide different expertise and points of view to tackle these problems. These collaborations, however, introduce a new set of "culture" barriers as participating experts are trained to communicate in discipline-specific languages, theories, and research practices. We propose that building a common knowledge base for research using ontology development techniques can provide a starting point for interdisciplinary knowledge exchange, negotiation, and integration. The goal of this work is to extend ontology development techniques to support the knowledge negotiation process in IDR groups. Towards this goal, this work presents a methodology that extends previous work in collaborative ontology development and integrates learning strategies and tools to enhance interdisciplinary research practices. We evaluate the effectiveness of applying such methodology in three different scenarios that cover educational and research settings. The results of this evaluation confirm that integrating learning strategies can, in fact, be advantageous to overall collaborative practices in IDR groups.
Anxiety in adolescents: Update on its diagnosis and treatment for primary care providers
Siegel, Rebecca S; Dickstein, Daniel P
2012-01-01
Anxiety disorders are the most prevalent mental health concern facing adolescents today, yet they are largely undertreated. This is especially concerning given that there are fairly good data to support an evidence-based approach to the diagnosis and treatment of anxiety, and also that untreated, these problems can continue into adulthood, growing in severity. Thus, knowing how to recognize and respond to anxiety in adolescents is of the utmost importance in primary care settings. To that end, this article provides an up-to-date review of the diagnosis and treatment of anxiety disorders geared towards professionals in primary care settings. Topics covered include subtypes, clinical presentation, the etiology and biology, effective screening instruments, evidence-based treatments (both medication and therapy), and the long-term prognosis for adolescents with anxiety. Importantly, we focus on the most common types of anxiety disorders, often known as phobias, which include generalized anxiety disorder, social anxiety/social phobia, separation anxiety disorder, panic disorder, and specific phobias. In summary, anxiety is a common psychiatric problem for adolescents, but armed with the right tools, primary care providers can make a major impact. PMID:24600282
Analysis of male volleyball players' motor activities during a top level match.
Mroczek, Dariusz; Januszkiewicz, Aleksander; Kawczyński, Adam S; Borysiuk, Zbigniew; Chmura, Jan
2014-08-01
The present study aims to assess motor activity of volleyball players using an original video recording method developed by the authors. Twenty-eight volleyball players taking part in 4 matches of the Polish Volleyball League were examined. The recorded data were analyzed in view of the mean total distance covered by volleyball players on different court positions during a match, set, and rally. The results showed that volleyball players cover the mean total distance of 1221 ± 327 m (mean ± SD) in a 3-set match, and 1757 ± 462 m in a 4-set match. A statistically significant difference (p ≤ 0.005) was found between the distance covered by the middle blockers and setters, defenders, spikers, and libero players in a match and in a set. The study revealed a tendency for players to cover longer distances in the final sets, which is indicative of the extended duration of individual rallies. The mean distance covered in a single rally amounted to 10.92 ± 0.9 m in 4 matches (between 9.12 and 12.56 m). Considering the limited size of the field of play, volleyball players cover relatively long distances during a match and individual sets, with the shortest distance covered by middle blockers, and the longest by setters. From a practical application point of view, detailed topographic analysis of a player's movements on the court as well as precise data on the time of activity and rest breaks provide the coach with valuable information on the ways of development of arrhythmic, changing and dynamic training loads.
Covert photo classification by fusing image features and visual attributes.
Lang, Haitao; Ling, Haibin
2015-10-01
In this paper, we study a novel problem of classifying covert photos, whose acquisition processes are intentionally concealed from the subjects being photographed. Covert photos are often privacy invasive and, if distributed over the Internet, can cause serious consequences. Automatic identification of such photos, therefore, serves as an important initial step toward further privacy protection operations. The problem is, however, very challenging due to the large semantic similarity between covert and noncovert photos, the enormous diversity in the photographing process and environment of covert photos, and the difficulty to collect an effective data set for the study. Attacking these challenges, we make three consecutive contributions. First, we collect a large data set containing 2500 covert photos, each of them is verified rigorously and carefully. Second, we conduct a user study on how humans distinguish covert photos from noncovert ones. The user study not only provides an important evaluation baseline, but also suggests fusing heterogeneous information for an automatic solution. Our third contribution is a covert photo classification algorithm that fuses various image features and visual attributes in the multiple kernel learning framework. We evaluate the proposed approach on the collected data set in comparison with other modern image classifiers. The results show that our approach achieves an average classification rate (1-EER) of 0.8940, which significantly outperforms other competitors as well as humans' performance.
Using economy of means to evolve transition rules within 2D cellular automata.
Ripps, David L
2010-01-01
Running a cellular automaton (CA) on a rectangular lattice is a time-honored method for studying artificial life on a digital computer. Commonly, the researcher wishes to investigate some specific or general mode of behavior, say, the ability of a coherent pattern of points to glide within the lattice, or to generate copies of itself. This technique has a problem: how to design the transition table, the set of distinct rules that specify the next content of a cell from its current content and that of its near neighbors. Often the table is painstakingly designed manually, rule by rule. The problem is exacerbated by the potentially vast number of individual rules that need be specified to cover all combinations of center and neighbors when there are several symbols in the alphabet of the CA. In this article a method is presented to have the set of rules evolve automatically while running the CA. The transition table is initially empty, with rules being added as the need arises. A novel principle drives the evolution: maximum economy of means, maximizing the reuse of rules introduced on previous cycles. This method may not be a panacea applicable to all CA studies. Nevertheless, it is sufficiently potent to evolve sets of rules and associated patterns of points that glide (periodically regenerate themselves at another location) and to generate gliding "children" that then "mate" by collision.
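The on-demand rule evolution described above can be sketched compactly. The Python toy below is reduced to a one-dimensional lattice for brevity (the article works on a 2-D lattice), and its reuse heuristic, preferring an output already produced by rules that share the same center symbol, is a hypothetical stand-in for the article's economy-of-means principle:

```python
import random
from collections import Counter

random.seed(1)
ALPHABET = (0, 1, 2)   # 0 plays the role of the quiescent symbol
table = {}             # the transition table: initially empty

def next_state(neigh):
    """Look up a neighborhood (left, center, right); on a miss, evolve a rule.

    Economy-of-means stand-in: reuse the output most common among existing
    rules with the same center symbol; only invent a fresh random output
    when no such rule exists yet.
    """
    if neigh not in table:
        peers = Counter(out for pat, out in table.items() if pat[1] == neigh[1])
        table[neigh] = peers.most_common(1)[0][0] if peers else random.choice(ALPHABET)
    return table[neigh]

def step(row):
    """Advance the 1-D CA one cycle, with quiescent boundary cells."""
    padded = (0,) + tuple(row) + (0,)
    return [next_state(tuple(padded[i:i + 3])) for i in range(len(row))]

row = [0, 1, 2, 1, 0]
history = [row]
for _ in range(6):
    row = step(row)
    history.append(row)
```

Once a rule enters the table it is deterministic, so re-running a cycle on an earlier configuration reproduces the recorded successor; only genuinely new neighborhoods trigger rule creation.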
Fitzpatrick, F.A.; Scudder, B.C.; Lenz, B.N.; Sullivan, D.J.
2001-01-01
The U.S. Geological Survey examined 25 agricultural streams in eastern Wisconsin to determine relations between fish, invertebrate, and algal metrics and multiple spatial scales of land cover, geologic setting, hydrologic, aquatic habitat, and water chemistry data. Spearman correlation and redundancy analyses were used to examine relations among biotic metrics and environmental characteristics. Riparian vegetation, geologic, and hydrologic conditions affected the response of biotic metrics to watershed agricultural land cover but the relations were aquatic assemblage dependent. It was difficult to separate the interrelated effects of geologic setting, watershed and buffer land cover, and base flow. Watershed and buffer land cover, geologic setting, reach riparian vegetation width, and stream size affected the fish IBI, invertebrate diversity, diatom IBI, and number of algal taxa; however, the invertebrate FBI, percentage of EPT, and the diatom pollution index were more influenced by nutrient concentrations and flow variability. Fish IBI scores seemed most sensitive to land cover in the entire stream network buffer, more so than watershed-scale land cover and segment or reach riparian vegetation width. All but one stream with more than approximately 10 percent buffer agriculture had fish IBI scores of fair or poor. In general, the invertebrate and algal metrics used in this study were not as sensitive to land cover effects as fish metrics. Some of the reach-scale characteristics, such as width/depth ratios, velocity, and bank stability, could be related to watershed influences of both land cover and geologic setting. The Wisconsin habitat index was related to watershed geologic setting, watershed and buffer land cover, riparian vegetation width, and base flow, and appeared to be a good indicator of stream quality. 
Results from this study emphasize the value of using more than one or two biotic metrics to assess water quality and the importance of environmental characteristics at multiple scales.
Research requirements to improve reliability of civil helicopters
NASA Technical Reports Server (NTRS)
Dougherty, J. J., III; Barrett, L. D.
1978-01-01
The major reliability problems of the civil helicopter fleet as reported by helicopter operational and maintenance personnel are documented. An assessment of each problem is made to determine if the reliability can be improved by application of present technology or whether additional research and development are required. The reliability impact is measured in three ways: (1) The relative frequency of each problem in the fleet. (2) The relative on-aircraft manhours to repair, associated with each fleet problem. (3) The relative cost of repair materials or replacement parts associated with each fleet problem. The data reviewed covered the period 1971 through 1976 and included only turbine-engine aircraft.
Adolescents' Exposure to Disasters and Substance Use.
Schiff, Miriam; Fang, Lin
2016-06-01
This paper reviews the impact of exposure to man-made or natural disasters on adolescent substance use. It covers empirical studies published from 2005 to 2015 concerning (a) the scope of the problem, (b) vulnerable groups and risk and protective factors, and (c) evidence-based interventions. The review suggests a strong link between adolescent substance use and exposure to either man-made or natural disaster. Vulnerable groups include adolescents with previous exposure to traumatic events, living in areas that are continually exposed to disasters, and ethnic minorities. Risk and protective factors at the individual, familial, community, and societal levels are described based on the bioecological model of mass trauma. Given that mass trauma is unfortunately a global problem, it is important to establish international interdisciplinary working teams to set gold standards for comparative studies on the etiology for adolescent substance use in the context of disasters.
Xie, Rui; Wan, Xianrong; Hong, Sheng; Yi, Jianxin
2017-06-14
The performance of a passive radar network can be greatly improved by an optimal radar network structure. Generally, radar network structure optimization consists of two aspects, namely the placement of receivers in suitable places and the selection of appropriate illuminators. The present study investigates issues concerning the joint optimization of receiver placement and illuminator selection for a passive radar network. Firstly, the required radar cross section (RCS) for target detection is chosen as the performance metric, and the joint optimization model boils down to the partition p-center problem (PPCP). The PPCP is then solved by a proposed bisection algorithm. The key of the bisection algorithm lies in solving the partition set covering problem (PSCP), which can be solved by a hybrid algorithm developed by coupling convex optimization with a greedy dropping algorithm. In the end, the performance of the proposed algorithm is validated via numerical simulations.
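For readers unfamiliar with set covering heuristics, the classic greedy algorithm below illustrates the general flavor of such building blocks. It is not the paper's hybrid of convex optimization and greedy dropping, and the instance data are invented for illustration:

```python
def greedy_set_cover(universe, subsets, costs):
    """Classic greedy heuristic for weighted set cover.

    Repeatedly picks the subset with the lowest cost per newly covered
    element; achieves a ln(n)-factor approximation guarantee.
    """
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = min(
            (i for i in range(len(subsets)) if subsets[i] & uncovered),
            key=lambda i: costs[i] / len(subsets[i] & uncovered),
        )
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen

# Invented toy instance: rows 1..7 must be covered by weighted columns.
universe = range(1, 8)
subsets = [{1, 2, 3}, {3, 4, 5}, {5, 6, 7}, {1, 4, 7}, {2, 6}]
costs = [3, 3, 3, 2, 1]
picked = greedy_set_cover(universe, subsets, costs)
covered = set().union(*(subsets[i] for i in picked))
```

On this instance the heuristic first grabs the cheap set {2, 6}, then {1, 4, 7}, then {3, 4, 5}, covering all seven elements with three columns.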
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, Kathleen E.; Humble, Travis S.
Using quantum annealing to solve an optimization problem requires minor embedding a logic graph into a known hardware graph. We introduce the minor set cover (MSC) of a known graph G: a subset of graph minors which contain any remaining minor of the graph as a subgraph, in an effort to reduce the complexity of the minor embedding problem. Any graph that can be embedded into G will be embeddable into a member of the MSC. Focusing on embedding into the hardware graph of commercially available quantum annealers, we establish the MSC for a particular known virtual hardware, which is a complete bipartite graph. Furthermore, we show that the complete bipartite graph K_{N,N} has a MSC of N minors, from which K_{N+1} is identified as the largest clique minor of K_{N,N}. In the case of determining the largest clique minor of hardware with faults we briefly discuss this open question.
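The claim that K_{N+1} is a clique minor of K_{N,N} rests on a standard contraction: merge the matching pairs (a_i, b_i) for all but one index, and keep the last a-vertex and b-vertex as singletons, giving N+1 branch sets. The small Python check below (our own illustration, not from the paper) verifies that these branch sets are pairwise adjacent in K_{N,N}:

```python
import itertools

def clique_minor_witness_ok(N):
    """Verify N+1 pairwise-adjacent branch sets inside K_{N,N}.

    Vertices of K_{N,N} are ('a', i) and ('b', i); every a-vertex is
    adjacent to every b-vertex. Branch sets: {a_i, b_i} for i < N-1,
    plus the singletons {a_{N-1}} and {b_{N-1}}.
    """
    branches = [{('a', i), ('b', i)} for i in range(N - 1)]
    branches += [{('a', N - 1)}, {('b', N - 1)}]

    def adjacent(u, v):
        return u[0] != v[0]   # edges only run between the two sides

    # Each 2-vertex branch set is connected by its own a-b edge; it remains
    # to check that every pair of branch sets is joined by at least one edge,
    # so contracting the branch sets yields a complete graph on N+1 vertices.
    return len(branches) == N + 1 and all(
        any(adjacent(u, v) for u in B1 for v in B2)
        for B1, B2 in itertools.combinations(branches, 2)
    )

ok = all(clique_minor_witness_ok(N) for N in range(2, 9))
```

Contracting each two-vertex branch set along its matching edge produces the K_{N+1} clique minor named in the abstract.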
Conflation and integration of archived geologic maps and associated uncertainties
Shoberg, Thomas G.
2016-01-01
Old, archived geologic maps are often available with little or no associated metadata. This creates special problems in terms of extracting their data to use with a modern database. This research focuses on some problems and uncertainties associated with conflating older geologic maps in regions where modern geologic maps are, as yet, non-existent as well as vertically integrating the conflated maps with layers of modern GIS data (in this case, The National Map of the U.S. Geological Survey). Ste. Genevieve County, Missouri was chosen as the test area. It is covered by six archived geologic maps constructed in the years between 1928 and 1994. Conflating these maps results in a map that is internally consistent with these six maps, is digitally integrated with hydrography, elevation and orthoimagery data, and has a 95% confidence interval useful for further data set integration.
Advanced space transportation system support contract
NASA Technical Reports Server (NTRS)
1988-01-01
The general focus is on a phase 2 lunar base, or a lunar base during the period after the first return of a crew to the Moon, but before permanent occupancy. The software effort produced a series of trajectory programs covering low earth orbit (LEO) to various node locations, the node locations to the lunar surface, and then back to LEO. The surface operations study took a lunar scenario in the civil needs data base (CNDB) and attempted to estimate the amount of space-suit work or extravehicular activity (EVA) required to set up the base. The maintenance and supply options study was a first look at the problems of supplying and maintaining the base. A lunar surface launch and landing facility was conceptually designed. The lunar storm shelter study examined the problems of radiation protection. The lunar surface construction and equipment assembly study defined twenty surface construction and assembly tasks in detail.
NASA Technical Reports Server (NTRS)
Kuhlen, H.; Horn, P.
1990-01-01
A new concept for a satellite based public mobile communications system, LOOPUS Mob-D, is introduced, whereby most of the classical problems in mobile satellite systems are approached in a different way. The LOOPUS system will offer a total capacity of 6000 high rate channels in three service areas (Europe, Asia, and North America), covering the entire Northern Hemisphere with a set of group special mobile (GSM) compatible mobile services, eventually providing the 'office in the car'. Special characteristics of the LOOPUS orbit and the communications network architecture are highlighted.
What does industry need in the way of meteorology for air pollution problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crow, L.W.
1978-01-01
A discussion covers information which should be supplied to the consulting meteorologist; typical requests made by industrial concerns of various consultants; feeding meteorological data to models; the use of meteorological information in developing prevention-of-significant-deterioration requirements; sources of error, e.g., assuming that digital values have a higher validity than the original analog records; the need for industrial concerns to develop sets of site-specific data; and the greater reliability of air flow and stability frequencies estimated by professional meteorologists, compared with historical data transposed from the nearest airport station, for use as model input in preliminary planning decisions.
Reavley, Nicola J; Morgan, Amy J; Jorm, Anthony F
2017-03-01
The aim of the study was to assess the factors predicting experiences of avoidance, discrimination and positive treatment in people with mental health problems. In 2014, telephone interviews were carried out with 5220 Australians aged 18+, 1381 of whom reported a mental health problem or scored highly on a symptom screening questionnaire. Questions covered experiences of avoidance, discrimination and positive treatment by friends, spouse, other family, workplace, educational institution and others in the community; as well as disclosure of mental health problems. Avoidance, discrimination and positive treatment scores were calculated by counting the number of domains in which each occurred. Predictors of avoidance, discrimination and positive treatment were modelled with negative binomial regression analyses. After adjusting for the effects of other predictors in multivariate analyses, symptom severity and a diagnosis of 'any other disorder' (most commonly psychotic disorders or eating disorders) predicted experiences of both avoidance and discrimination but not positive treatment. Disclosing a mental health problem in more settings was also associated with higher rates of avoidance and discrimination, but also with positive treatment. Disclosure of mental health problems to others may increase experiences of discrimination, but may also increase experiences of positive treatment. These findings can help to inform decision making by people with mental health problems about disclosure, particularly in the case of more severe or low-prevalence disorders.
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Macready, William G.
2005-01-01
Recent work on the foundations of optimization has begun to uncover its underlying rich structure. In particular, the "No Free Lunch" (NFL) theorems [WM97] state that any two algorithms are equivalent when their performance is averaged across all possible problems. This highlights the need for exploiting problem-specific knowledge to achieve better than random performance. In this paper we present a general framework covering most search scenarios. In addition to the optimization scenarios addressed in the NFL results, this framework covers multi-armed bandit problems and evolution of multiple co-evolving agents. As a particular instance of the latter, it covers "self-play" problems. In these problems the agents work together to produce a champion, who then engages one or more antagonists in a subsequent multi-player game. In contrast to the traditional optimization case where the NFL results hold, we show that in self-play there are free lunches: in coevolution some algorithms have better performance than other algorithms, averaged across all possible problems. However, in the typical coevolutionary scenarios encountered in biology, where there is no champion, NFL still holds.
NASA Technical Reports Server (NTRS)
Dillard, J. P.
1981-01-01
The study objective was to develop or modify methods in an operational framework that would allow incorporation of satellite-derived snow cover observations for prediction of snowmelt-derived runoff. Data were reviewed and verified for five basins in the Pacific Northwest. The data were analyzed for up to a 6-year period ending July 1978, and in all cases cover a low, average, and high snow cover/runoff year. Cloud cover is a major problem in these springtime runoff analyses and has hampered data collection for periods of up to 52 days. Dense tree cover and rugged terrain have also caused problems. The interpretation of snowlines from satellite data was compared with conventional ground truth data and tested in operational streamflow forecasting models. When the satellite snow-covered area (SCA) data are incorporated in the SSARR (Streamflow Synthesis and Reservoir Regulation) model, there is a definite but minor improvement.
Eastern Siberia terrain intelligence
1942-01-01
The following folio of terrain intelligence maps, charts and explanatory tables represents an attempt to bring together available data on natural physical conditions such as will affect military operations in Eastern Siberia. The area covered is the easternmost section of the U.S.S.R.; that is, the area east of the Yenisei River. Each map and accompanying table is devoted to a specialized set of problems; together they cover such subjects as geology, construction materials, mineral fuels, terrain, water supply, rivers and climate. The data is somewhat generalized due to the scale of treatment as well as to the scarcity of basic data. Each of the maps is rated as to reliability according to the reliability scale on the following page. Much of the data shown is of an interpretative nature, although precise data from literature was used wherever possible. The maps and tables were compiled by a special group from the United States Geological Survey in cooperation with the Intelligence Branch of the Office, Chief of Engineers, War Department.
On Connected Target k-Coverage in Heterogeneous Wireless Sensor Networks.
Yu, Jiguo; Chen, Ying; Ma, Liran; Huang, Baogui; Cheng, Xiuzhen
2016-01-15
Coverage and connectivity are two important performance evaluation indices for wireless sensor networks (WSNs). In this paper, we focus on the connected target k-coverage (CTCk) problem in heterogeneous wireless sensor networks (HWSNs). A centralized connected target k-coverage algorithm (CCTCk) and a distributed connected target k-coverage algorithm (DCTCk) are proposed so as to generate connected cover sets for energy-efficient connectivity and coverage maintenance. To be specific, our proposed algorithms aim at achieving minimum connected target k-coverage, where each target in the monitored region is covered by at least k active sensor nodes. In addition, these two algorithms strive to minimize the total number of active sensor nodes and guarantee that each sensor node is connected to a sink, such that the sensed data can be forwarded to the sink. Our theoretical analysis and simulation results show that our proposed algorithms outperform a state-of-the-art connected k-coverage protocol for HWSNs.
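The k-coverage condition itself is simple to state: every target must lie within sensing range of at least k active sensors. A minimal sketch of that check (with a hypothetical disc sensing model and made-up coordinates, not the paper's algorithms):

```python
import math

def is_k_covered(targets, active_sensors, radius, k):
    # every target must lie within `radius` of at least k active sensors
    return all(
        sum(math.dist(s, t) <= radius for s in active_sensors) >= k
        for t in targets
    )

targets = [(0.0, 0.0), (4.0, 0.0)]
sensors = [(1.0, 0.0), (-1.0, 0.0), (3.5, 0.0), (4.5, 0.0)]
assert is_k_covered(targets, sensors, radius=1.5, k=2)
assert not is_k_covered(targets, sensors, radius=1.5, k=3)
```

The CCTCk/DCTCk algorithms additionally require the chosen active set to form a connected subgraph reaching a sink, which this predicate alone does not capture.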
Life on the Edge - Improved Forest Cover Mapping in Mixed-Use Tropical Regions
NASA Astrophysics Data System (ADS)
Anderson, C.; Mendenhall, C. D.; Daily, G.
2016-12-01
Tropical ecosystems and biodiversity are experiencing rapid change, primarily due to conversion of forest habitat to agriculture. Protected areas, while effective for conservation, only manage 15% of terrestrial area, whereas approximately 58% is privately owned. To incentivize private forest management and slow the loss of biodiversity, payments for ecosystem services (PES) programs were established in Costa Rica that pay landowners who maintain trees on their property. While this program is effective in improving livelihoods and preventing forest conversion, it is only managing payments to landowners on 1% of eligible, non-protected forested land. A major bottleneck for this program is access to accurate, national-scale tree cover maps. While the remote sensing community has made great progress in global-scale tree cover mapping, these maps are not sufficient to guide investments for PES programs. The major limitations of current global-scale tree-cover maps are that they (a) do not distinguish between forest and agriculture and (b) overestimate tree cover in mixed land-use areas (e.g., Global Forest Change overestimates by 20% on average in this region). This is especially problematic in biodiversity-rich Costa Rica, where small patches of forest intermix with agricultural production, and where the conservation value of tree cover is high. To address this problem, we are developing a new forest cover mapping method that (a) performs a least-squares spectral mixture analysis (SMA) using repeat Landsat imagery and canopy radiative transfer modeling; (b) combines Landsat data, SMA results, and radar backscatter data using multi-sensor fusion techniques; and (c) trains tree-cover classification models using high-resolution data sets along a land-use-intensity gradient.
Our method predicted tree cover with 85% accuracy when compared to a fine-scale map of tree cover in a tropical, agricultural landscape, whereas the next-best method, the Global Forest Change map, predicted tree cover with 72% accuracy. Next steps will aim to test, improve, and apply this method globally to guide investments in nature in agricultural landscapes where forest stewardship will sustain biodiversity.
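The least-squares unmixing step in (a) can be sketched in miniature: model each pixel's spectrum as a linear mix of endmember spectra and solve for the fractions. Below is a minimal two-endmember version using the 2x2 normal equations, with made-up reflectance values (not the paper's data or endmembers):

```python
def unmix2(endmembers, pixel):
    # least-squares fractions for two endmembers via the 2x2 normal equations
    # endmembers: per-band (e0, e1) reflectance pairs; pixel: per-band values
    a = sum(e[0] * e[0] for e in endmembers)
    b = sum(e[0] * e[1] for e in endmembers)
    c = sum(e[1] * e[1] for e in endmembers)
    r0 = sum(e[0] * p for e, p in zip(endmembers, pixel))
    r1 = sum(e[1] * p for e, p in zip(endmembers, pixel))
    det = a * c - b * b
    return ((c * r0 - b * r1) / det, (a * r1 - b * r0) / det)

# hypothetical per-band (tree, soil) reflectances -- illustrative values only
E = [(0.05, 0.25), (0.45, 0.30), (0.30, 0.55)]
pixel = [0.7 * t + 0.3 * s for t, s in E]   # synthetic 70%-tree mixed pixel
f_tree, f_soil = unmix2(E, pixel)
assert abs(f_tree - 0.7) < 1e-9 and abs(f_soil - 0.3) < 1e-9
```

Real SMA works with more endmembers and noisy spectra, typically with sum-to-one and non-negativity constraints on the fractions; this sketch shows only the unconstrained least-squares core.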
A comparative analysis of the Global Land Cover 2000 and MODIS land cover data sets
Giri, C.; Zhu, Z.; Reed, B.
2005-01-01
Accurate and up-to-date global land cover data sets are necessary for various global change research studies including climate change, biodiversity conservation, ecosystem assessment, and environmental modeling. In recent years, substantial advancement has been achieved in generating such data products. Yet, we are far from producing geospatially consistent high-quality data at an operational level. We compared the recently available Global Land Cover 2000 (GLC-2000) and MODerate resolution Imaging Spectrometer (MODIS) global land cover data to evaluate the similarities and differences in methodologies and results, and to identify areas of spatial agreement and disagreement. These two global land cover data sets were prepared using different data sources, classification systems, and methodologies, but using the same spatial resolution (i.e., 1 km) satellite data. Our analysis shows a general agreement at the class aggregate level except for savannas/shrublands, and wetlands. The disagreement, however, increases when comparing detailed land cover classes. Similarly, percent agreement between the two data sets was found to be highly variable among biomes. The identified areas of spatial agreement and disagreement will be useful for both data producers and users. Data producers may use the areas of spatial agreement for training area selection and pay special attention to areas of disagreement for further improvement in future land cover characterization and mapping. Users can conveniently use the findings in the areas of agreement, whereas they might need to verify the information in the areas of disagreement with the help of secondary information.
Learning from past experience and building on the existing infrastructure (e.g., regional networks), further research is necessary to (1) reduce ambiguity in land cover definitions, (2) increase availability of improved spatial, spectral, radiometric, and geometric resolution satellite data, and (3) develop advanced classification algorithms.
Pham, Tuan Anh; Hà, Minh Hoàng; Nguyen, Xuan Hoai
2018-06-01
This data article contains data related to the research article entitled "Solving the multi-vehicle multi-covering tour problem" (Pham et al., 2017) [4]. All data in this article were generated from instances kroA100, kroB100, kroC100, kroD100, kroA200, and kroB200 from TSPLIB, and can be downloaded from a public repository. These data can be used as benchmarks for covering tour problem (CTP) variants such as m-CTP-p, m-CTP, mm-CTP-p, mm-CTP, mm-CTP-o, and mm-CTP-wo. We tested our algorithm on these data and the results are shown in Pham et al. (2017) [4].
Towards monitoring land-cover and land-use changes at a global scale: the global land survey 2005
Gutman, G.; Byrnes, Raymond A.; Masek, J.; Covington, S.; Justice, C.; Franks, S.; Headley, Rachel
2008-01-01
Land cover is a critical component of the Earth system, influencing land-atmosphere interactions, greenhouse gas fluxes, ecosystem health, and availability of food, fiber, and energy for human populations. The recent Integrated Global Observations of Land (IGOL) report calls for the generation of maps documenting global land cover at resolutions between 10m and 30m at least every five years (Townshend et al., in press). Moreover, despite 35 years of Landsat observations, there has not been a unified global analysis of land-cover trends nor has there been a global assessment of land-cover change at Landsat-like resolution. Since the 1990s, the National Aeronautics and Space Administration (NASA) and the U.S. Geological Survey (USGS) have supported development of data sets based on global Landsat observations (Tucker et al., 2004). These land survey data sets, usually referred to as GeoCover™, provide global, orthorectified, typically cloud-free Landsat imagery centered on the years 1975, 1990, and 2000, with a preference for leaf-on conditions. Collectively, these data sets provided a consistent set of observations to assess land-cover changes at a decadal scale. These data are freely available via the Internet from the USGS Center for Earth Resources Observation and Science (EROS) (see http://earthexplorer.usgs.gov or http://glovis.usgs.gov). This has resulted in unprecedented downloads of data, which are widely used in scientific studies of land-cover change (e.g., Boone et al., 2007; Harris et al., 2005; Hilbert, 2006; Huang et al., 2007; Jantz et al., 2005; Kim et al., 2007; Leimgruber, 2005; Masek et al., 2006). NASA and USGS are continuing to support land-cover change research through the development of GLS2005, an additional global Landsat assessment circa 2005.
Going beyond the earlier initiatives, this data set will establish a baseline for monitoring changes on a 5-year interval and will pave the way toward continuous global land-cover monitoring at Landsat-like resolution in the next decade.
BOREAS TGB-5 Fire History of Manitoba 1980 to 1991 in Raster Format
NASA Technical Reports Server (NTRS)
Stocks, Brian J.; Zepp, Richard; Knapp, David; Hall, Forrest G. (Editor); Conrad, Sara K. (Editor)
2000-01-01
The BOReal Ecosystem-Atmosphere Study Trace Gas Biogeochemistry (BOREAS TGB-5) team collected several data sets related to the effects of fire on the exchange of trace gases between the surface and the atmosphere. This raster format data set covers the province of Manitoba between 1980 and 1991. The data were gridded into the Albers Equal-Area Conic (AEAC) projection from the original vector data. The original vector data were produced by Forestry Canada from hand-drawn boundaries of fires on photocopies of 1:250,000-scale maps. The locational accuracy of the data is considered fair to poor. When the locations of some fire boundaries were compared to Landsat TM images, they were found to be off by as much as a few kilometers. This problem should be kept in mind when using these data. The data are stored in binary, image format files.
Use of Web-based library resources by medical students in community and ambulatory settings.
Tannery, Nancy Hrinya; Foust, Jill E; Gregg, Amy L; Hartman, Linda M; Kuller, Alice B; Worona, Paul; Tulsky, Asher A
2002-07-01
The purpose was to evaluate the use of Web-based library resources by third-year medical students. Third-year medical students (147) in a twelve-week multidisciplinary primary care rotation in community and ambulatory settings. Individual user surveys and log file analysis of the Website were used. Twenty resource topics were compiled into a Website to provide students with access to electronic library resources from any community-based clerkship location. These resource topics, covering subjects such as hypertension and back pain, linked to curriculum training problems, full-text journal articles, MEDLINE searches, electronic book chapters, and relevant Websites. More than half of the students (69%) accessed the Website on a daily or weekly basis. Over 80% thought the Website was a valuable addition to their clerkship. Web-based information resources can provide curriculum support to students for whom access to the library is difficult and time consuming.
Moreno-Martinez, Francisco Javier; Montoro, Pedro R; Laws, Keith R
2011-05-01
This paper presents a new corpus of 140 high-quality colour images belonging to 14 subcategories and covering a range of naming difficulty. One hundred and six Spanish speakers named the items and provided data for several psycholinguistic variables: age of acquisition, familiarity, manipulability, name agreement, typicality and visual complexity. Furthermore, we also present lexical frequency data derived from internet search hits. Apart from the large number of variables evaluated, these stimuli present an important advantage with respect to other comparable image corpora in so far as naming performance in healthy individuals is less prone to ceiling effect problems. Reliability and validity indexes showed that our items display similar psycholinguistic characteristics to those of other corpora. In sum, this set of ecologically valid stimuli provides a useful tool for scientists engaged in cognitive and neuroscience-based research.
NASA Technical Reports Server (NTRS)
1943-01-01
This is the second of a series of reports covering an investigation of the general instability problem by the California Institute of Technology. The first five reports of this series cover investigations of the general instability problem under the loading conditions of pure bending and were prepared under the sponsorship of the Civil Aeronautics Administration. The succeeding reports of this series cover the work done on other loading conditions under the sponsorship of the National Advisory Committee for Aeronautics.
Crisis management during anaesthesia: the development of an anaesthetic crisis management manual
Runciman, W; Kluger, M; Morris, R; Paix, A; Watterson, L; Webb, R
2005-01-01
Background: All anaesthetists have to handle life threatening crises with little or no warning. However, some cognitive strategies and work practices that are appropriate for speed and efficiency under normal circumstances may become maladaptive in a crisis. It was judged in a previous study that the use of a structured "core" algorithm (based on the mnemonic COVER ABCD–A SWIFT CHECK) would diagnose and correct the problem in 60% of cases and provide a functional diagnosis in virtually all of the remaining 40%. It was recommended that specific sub-algorithms be developed for managing the problems underlying the remaining 40% of crises and assembled in an easy-to-use manual. Sub-algorithms were therefore developed for these problems so that they could be checked for applicability and validity against the first 4000 anaesthesia incidents reported to the Australian Incident Monitoring Study (AIMS). Methods: The need for 24 specific sub-algorithms was identified. Teams of practising anaesthetists were assembled and sets of incidents relevant to each sub-algorithm were identified from the first 4000 reported to AIMS. Based largely on successful strategies identified in these reports, a set of 24 specific sub-algorithms was developed for trial against the 4000 AIMS reports and assembled into an easy-to-use manual. A process was developed for applying each component of the core algorithm COVER at one of four levels (scan-check-alert/ready-emergency) according to the degree of perceived urgency, and incorporated into the manual. The manual was disseminated at a World Congress and feedback was obtained. Results: Each of the 24 specific crisis management sub-algorithms was tested against the relevant incidents among the first 4000 reported to AIMS and compared with the actual management by the anaesthetist at the time. 
It was judged that, if the core algorithm had been correctly applied, the appropriate sub-algorithm would have resolved the problem better and/or faster in one in eight of all incidents, and would have been unlikely to have caused harm to any patient. The descriptions of the validation of each of the 24 sub-algorithms constitute the remaining 24 papers in this set. Feedback from five meetings each attended by 60–100 anaesthetists was then collated and is included. Conclusion: The 24 sub-algorithms developed form the basis for developing a rational evidence-based approach to crisis management during anaesthesia. The COVER component has been found to be satisfactory in real life resuscitation situations and the sub-algorithms have been used successfully for several years. It would now be desirable for carefully designed simulator based studies, using naive trainees at the start of their training, to systematically examine the merits and demerits of various aspects of the sub-algorithms. It would seem prudent that these sub-algorithms be regarded, for the moment, as decision aids to support and back up clinicians' natural responses to a crisis when all is not progressing as expected. PMID:15933282
MODIS Snow Cover Mapping Decision Tree Technique: Snow and Cloud Discrimination
NASA Technical Reports Server (NTRS)
Riggs, George A.; Hall, Dorothy K.
2010-01-01
Accurate mapping of snow cover continues to challenge cryospheric scientists and modelers. The Moderate-Resolution Imaging Spectroradiometer (MODIS) snow data products have been used since 2000 by many investigators to map and monitor snow cover extent for various applications. Users have reported on the utility of the products and also on problems encountered. Three problems or hindrances in the use of the MODIS snow data products that have been reported in the literature are: cloud obscuration, snow/cloud confusion, and snow omission errors in thin or sparse snow cover conditions. Implementation of the MODIS snow algorithm in a decision tree technique using surface reflectance input to mitigate those problems is being investigated. The objective of this work is to use a decision tree structure for the snow algorithm. This should alleviate snow/cloud confusion and omission errors and provide a snow map with classes that convey information on how snow was detected, e.g. snow under clear sky, snow under cloud, to enable users' flexibility in interpreting and deriving a snow map. Results of a snow cover decision tree algorithm are compared to the standard MODIS snow map and found to exhibit improved ability to alleviate snow/cloud confusion in some situations, allowing up to about a 5% increase in mapped snow cover extent, and thus accuracy, in some scenes.
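The decision tree structure described above can be sketched as a cascade of per-pixel tests. The toy classifier below uses the widely cited NDSI snow test as one branch, but the thresholds and inputs are illustrative assumptions, not the operational MODIS algorithm:

```python
def classify_pixel(ndsi, nir_reflectance, cloud_flag):
    # illustrative decision tree -- thresholds are assumptions, not the
    # operational MODIS values
    if cloud_flag:
        return "cloud"                      # upstream cloud mask fired
    if ndsi > 0.4 and nir_reflectance > 0.11:
        return "snow"                       # NDSI-style snow test
    return "land"

assert classify_pixel(0.6, 0.3, cloud_flag=False) == "snow"
assert classify_pixel(0.6, 0.3, cloud_flag=True) == "cloud"
assert classify_pixel(0.1, 0.3, cloud_flag=False) == "land"
```

The point of the tree form is that each leaf records which tests fired, so a product can report classes like "snow under clear sky" rather than a bare binary mask.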
A Multi-Hop Clustering Mechanism for Scalable IoT Networks.
Sung, Yoonyoung; Lee, Sookyoung; Lee, Meejeong
2018-03-23
It is expected that up to 26 billion Internet of Things (IoT) devices equipped with sensors and wireless communication capabilities will be connected to the Internet by 2020 for various purposes. With a large-scale IoT network, having each node connected to the Internet with an individual connection may face serious scalability issues. The scalability problem of the IoT network may be alleviated by grouping the nodes of the IoT network into clusters and having a representative node in each cluster connect to the Internet on behalf of the other nodes in the cluster, instead of having a per-node Internet connection and communication. In this paper, we propose a multi-hop clustering mechanism for IoT networks to minimize the number of required Internet connections. Specifically, the objective of the proposed mechanism is to select the minimum number of coordinators, which take the role of a representative node for the cluster, i.e., having the Internet connection on behalf of the rest of the nodes in the cluster, and to map a partition of the IoT nodes onto the selected set of coordinators to minimize the total distance between the nodes and their respective coordinator, under a certain constraint on the maximum hop count between the IoT nodes and their respective coordinator. Since this problem can be mapped into a set cover problem, which is known to be NP-hard, we pursue a heuristic approach to solve the problem and analyze the complexity of the proposed solution. Through a set of experiments with varying parameters, the proposed scheme shows a 63-87.3% reduction in Internet connections depending on the number of IoT nodes, while that of the optimal solution is 65.6-89.9% in a small-scale network. Moreover, it is shown that the performance characteristics of the proposed mechanism coincide with the expected performance characteristics of the optimal solution in a large-scale network.
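Because coordinator selection maps to minimum set cover, a natural baseline heuristic is the classic greedy algorithm (with its ln n approximation guarantee): repeatedly pick the candidate covering the most still-uncovered nodes. A minimal sketch with hypothetical node sets (not the paper's mechanism, which also weighs hop-count and distance constraints):

```python
def greedy_set_cover(universe, subsets):
    # repeatedly pick the subset covering the most still-uncovered elements
    uncovered, cover = set(universe), []
    while uncovered:
        best = max(subsets, key=lambda s: len(uncovered & set(s)))
        if not uncovered & set(best):
            raise ValueError("universe not coverable by the given subsets")
        cover.append(best)
        uncovered -= set(best)
    return cover

# hypothetical: the IoT nodes each coordinator candidate could serve
candidates = [{1, 2, 3}, {2, 4}, {3, 4, 5}, {5}]
cover = greedy_set_cover({1, 2, 3, 4, 5}, candidates)
assert set().union(*cover) >= {1, 2, 3, 4, 5}
assert len(cover) == 2  # {1, 2, 3} first, then {3, 4, 5}
```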
Psychosocial difficulties from the perspective of persons with neuropsychiatric disorders.
Coenen, Michaela; Cabello, Maria; Umlauf, Silvia; Ayuso-Mateos, José Luis; Anczewska, Marta; Tourunen, Jouni; Leonardi, Matilde; Cieza, Alarcos
2016-01-01
The objective of this study is to determine whether persons with neuropsychiatric disorders experience a common set of psychosocial difficulties, using qualitative data from focus groups and individual interviews. The study was performed in five European countries (Finland, Italy, Germany, Poland and Spain) using focus groups and individual interviews with persons with nine neuropsychiatric disorders (dementia, depression, epilepsy, migraine, multiple sclerosis, Parkinson's disease, schizophrenia, stroke and substance dependence). Digitally recorded sessions were analysed using a step-by-step qualitative and quantitative methodology, resulting in the compilation of a common set of psychosocial difficulties using the International Classification of Functioning, Disability and Health (ICF) as a framework. Sixty-seven persons participated in the study. Most persons with neuropsychiatric disorders experience difficulties in emotional functions, sleeping, carrying out daily routine, working and interpersonal relationships in common. Sixteen of the 33 psychosocial difficulties made up the common set. This set includes mental functions, pain and issues addressing activities and participation, and provides first evidence for the hypothesis of horizontal epidemiology of psychosocial difficulties in neuropsychiatric disorders. This study provides information about psychosocial difficulties that should be covered in the treatment and rehabilitation of persons with neuropsychiatric disorders regardless of clinical diagnosis. Emotional, work and sleep problems should be addressed in the treatment of all neuropsychiatric disorders regardless of their specific diagnosis, etiology and severity. Personality issues should be targeted in the treatment of neurological disorders, whereas communication skill training may also be useful for mental disorders. The effects of medication and social environment on patients' daily life should be considered in all neuropsychiatric conditions.
MODIS Snow Cover Recovery Using Variational Interpolation
NASA Astrophysics Data System (ADS)
Tran, H.; Nguyen, P.; Hsu, K. L.; Sorooshian, S.
2017-12-01
Cloud obscuration is one of the major problems limiting the use of satellite images in general, and of NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) global Snow-Covered Area (SCA) products in particular. Among the approaches to resolving the problem, the Variational Interpolation (VI) algorithm proposed by Xia et al., 2012 obtains cloud-free dynamic SCA images from MODIS. This method is automatic and robust. However, its computational cost is a main drawback that hinders applying the method at larger spatial and temporal scales. To overcome this difficulty, this study introduces an improved version of the original VI algorithm. The modified VI algorithm integrates the MINimum RESidual (MINRES) iteration (Paige and Saunders, 1975) to prevent the linear system from breaking down when applied at much broader scales. An experiment demonstrates the crash-proof ability of the new algorithm in comparison with the original VI method, an ability obtained by maintaining the distribution of the weight set after solving the linear system. The new VI algorithm was then applied to the whole Contiguous United States (CONUS) over four winter months of 2016 and 2017, and validated using the snow station network (SNOTEL). The resulting cloud-free images have high accuracy in capturing the dynamic changes of snow, in contrast with the MODIS snow cover maps. Lastly, the algorithm was applied to create a cloud-free image dataset from March 10, 2000 to February 28, 2017, which provides an overview of snow trends over CONUS for nearly two decades. ACKNOWLEDGMENTS: We would like to acknowledge NASA, NOAA Office of Hydrologic Development (OHD) National Weather Service (NWS), Cooperative Institute for Climate and Satellites (CICS), Army Research Office (ARO), ICIWaRM, and UNESCO for supporting this research.
Investigating the effect of mental set on insight problem solving.
Ollinger, Michael; Jones, Gary; Knoblich, Günther
2008-01-01
Mental set is the tendency to solve certain problems in a fixed way based on previous solutions to similar problems. The moment of insight occurs when a problem cannot be solved using solution methods suggested by prior experience and the problem solver suddenly realizes that the solution requires different solution methods. Mental set and insight have often been linked together, and yet no attempt thus far has systematically examined the interplay between the two. Three experiments are presented that examine the extent to which sets of noninsight and insight problems affect the subsequent solutions of insight test problems. The results indicate a subtle interplay between mental set and insight: when the set involves noninsight problems, no mental set effects are shown for the insight test problems, yet when the set involves insight problems, both facilitation and inhibition can be seen depending on the type of insight problem presented in the set. A two-process model is detailed to explain these findings, combining the representational change mechanism with that of proceduralization.
Validation of national land-cover characteristics data for regional water-quality assessment
Zelt, Ronald B.; Brown, Jesslyn F.; Kelley, M.S.
1995-01-01
Land-cover information is used routinely to support the interpretation of water-quality data. The Prototype 1990 Conterminous US Land Cover Characteristics Data Set, developed primarily from Advanced Very High Resolution Radiometer (AVHRR) data, was made available to the US Geological Survey's National Water-Quality Assessment (NAWQA) Program. The study described in this paper explored the utility of the 1990 national data set for developing quantitative estimates of the areal extent of principal land-cover types within large areal units. Land-cover data were collected in 1993 at 210 sites in the Central Nebraska Basins, one of the NAWQA study units. Median percentage-corn estimates for each sampling stratum were used to produce areally weighted estimates of the percentage-corn cover for hydrologic units. Comparison of those areal estimates with an independent source of 1992 land-cover data showed good agreement. -Authors
Kreitler, Jason R.; Stoms, David M.; Davis, Frank W.
2014-01-01
Quantitative methods of spatial conservation prioritization have traditionally been applied to issues in conservation biology and reserve design, though their use in other types of natural resource management is growing. The utility maximization problem is one form of a covering problem where multiple criteria can represent the expected social benefits of conservation action. This approach allows flexibility with a problem formulation that is more general than typical reserve design problems, though the solution methods are very similar. However, few studies have addressed optimization in utility maximization problems for conservation planning, and the effect of solution procedure is largely unquantified. Therefore, this study mapped five criteria describing elements of multifunctional agriculture to determine a hypothetical conservation resource allocation plan for agricultural land conservation in the Central Valley of CA, USA. We compared solution procedures within the utility maximization framework to determine the difference between an open source integer programming approach and a greedy heuristic, and find gains from optimization of up to 12%. We also model land availability for conservation action as a stochastic process and determine the decline in total utility compared to the globally optimal set using both solution algorithms. Our results are comparable to other studies illustrating the benefits of optimization for different conservation planning problems, and highlight the importance of maximizing the effectiveness of limited funding for conservation and natural resource management. PMID:25538868
Artificial immune algorithm for multi-depot vehicle scheduling problems
NASA Astrophysics Data System (ADS)
Wu, Zhongyi; Wang, Donggen; Xia, Linyuan; Chen, Xiaoling
2008-10-01
In the fast-developing logistics and supply chain management fields, one of the key problems in decision support systems is how to arrange, for many customers and suppliers, the supplier-to-customer assignment and produce a detailed supply schedule under a set of constraints. Solutions to the multi-depot vehicle scheduling problem (MDVSP) help in solving this problem in transportation applications. The objective of the MDVSP is to minimize the total distance covered by all vehicles, which can be considered as delivery costs or time consumption. The MDVSP is a nondeterministic polynomial-time hard (NP-hard) problem which cannot be solved to optimality within polynomially bounded computational time. Many different approaches have been developed to tackle the MDVSP, such as exact algorithms (EA), the one-stage approach (OSA), the two-phase heuristic method (TPHM), tabu search algorithms (TSA), genetic algorithms (GA) and hierarchical multiplex structure (HIMS). Most of the methods mentioned above are time consuming and run a high risk of becoming trapped in a local optimum. In this paper, a new search algorithm is proposed to solve the MDVSP based on Artificial Immune Systems (AIS), which are inspired by vertebrate immune systems. The proposed AIS algorithm is tested with 30 customers and 6 vehicles located in 3 depots. Experimental results show that the artificial immune system algorithm is an effective and efficient method for solving MDVSP problems.
Pin-Retraction Mechanism On Quick-Release Cover
NASA Technical Reports Server (NTRS)
Macmartin, Malcolm
1994-01-01
The quick-release cover includes a pin-retraction mechanism that quickly releases the cover from the lower of two sets of pin connections holding it. The cover is released at the top by pulling a lever as described in "Lever-Arm Pin Puller" (NPO-18788). Removal of the cover begins when a technician or robot pulls the upper-pin-release lever. The cover swings downward until the tabs on the lower pins are pulled through slots in their receptacles. The lower pins are then free.
LAND COVER MAPPING IN AN AGRICULTURAL SETTING USING MULTISEASONAL THEMATIC MAPPER DATA
A multiseasonal Landsat Thematic Mapper (TM) data set consisting of five image dates from a single year was used to characterize agricultural and related land cover in the Willamette River Basin (WRB) of western Oregon. Image registration was accomplished using an automated grou...
Adamu, Yilikal; Macleod, Colin; Adamu, Liknaw; Fikru, Wirtu; Kidu, Beyene; Abashawl, Aida; Dejene, Michael; Chu, Brian K; Flueckiger, Rebecca M; Willis, Rebecca; Pavluck, Alexandre L; Solomon, Anthony W
2016-01-01
Trachoma is a major cause of blindness in Ethiopia, and targeted for elimination as a public health problem by the year 2020. Prevalence data are needed to plan interventions. We set out to estimate the prevalence of trachoma in each evaluation unit of grouped districts ("woredas") in Benishangul Gumuz region, Ethiopia. We conducted seven cross-sectional community-based surveys, covering 20 woredas, between December 2013 and January 2014, as part of the Global Trachoma Mapping Project (GTMP). The standardized GTMP training package and methodologies were used. A total of 5828 households and 21,919 individuals were enumerated in the surveys. 19,583 people (89.3%) were present when survey teams visited. A total of 19,530 (99.7%) consented to examination, 11,063 (56.6%) of whom were female. The region-wide age- and sex-adjusted trichiasis prevalence in adults aged ≥15 years was 1.3%. Two evaluation units covering four woredas (Pawe, Mandura, Bulen and Dibate) with a combined rural population of 166,959 require implementation of the A, F and E components of the SAFE strategy (surgery, antibiotics, facial cleanliness and environmental improvement) for at least three years before re-survey, and intervention planning should begin for these woredas as soon as possible. Both active trachoma and trichiasis are public health problems in Benishangul Gumuz, which needs implementation of the full SAFE strategy.
The crack problem for a half plane stiffened by elastic cover plates
NASA Technical Reports Server (NTRS)
Delale, F.; Erdogan, F.
1981-01-01
An elastic half plane containing a crack and stiffened by a cover plate is discussed. The asymptotic nature of the stress state in the half plane around an end point of the stiffener was studied to determine the likely orientation of possible fracture initiation and growth. The problem is formulated, for an arbitrarily oriented radial crack, as a system of singular integral equations. The problem is solved for an internal crack and for an edge crack, and the stress intensity factors at the crack tips and the interface stress are calculated. A cracked half plane with two symmetrically located cover plates is also considered. It is concluded that the case of two stiffeners appears to be more severe than that of a single stiffener.
V-TECS Guide for Automobile Engine Performance Technician.
ERIC Educational Resources Information Center
Meyer, Calvin F.; Benson, Robert T.
This guide is intended to assist teachers responsible for instructing future auto engine performance technicians. The following topics are covered: diagnosing engine performance problems, ignition system problems, fuel system problems, mechanically related performance problems, emission control system problems, and electronic control systems;…
Screening for Physical Problems in Classrooms for Severely Handicapped Students.
ERIC Educational Resources Information Center
Dever, Richard; Knapczyk, Dennis
1980-01-01
The authors present a screening device with which teachers of severely handicapped students may detect the presence of a physical problem. The screening approach covers vision, auditory problems, seizures, orthopedic problems, and pain. (CL)
Zhang, Jinshui; Yuan, Zhoumiqi; Shuai, Guanyuan; Pan, Yaozhong; Zhu, Xiufang
2017-04-26
This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for the support vector data description (SVDD) model to map specific land cover by integrating training and window-based validation sets. Compared to the conventional approach, where the validation set included target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tightened hypersphere because of the compact constraint imposed by the outlier pixels, which were located neighboring the target class in the spectral feature space. The overall accuracies achieved for wheat and bare land were as high as 89.25% and 83.65%, respectively. However, the target class was underestimated because the validation set covered only a small fraction of the heterogeneous spectra of the target class. Different window sizes were then tested to acquire more wheat pixels for the validation set. The results showed that classification accuracy increased with increasing window size, and the overall accuracies were higher than 88% at all window size scales. Moreover, WVS-SVDD showed much less sensitivity to untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits in using the optimal parameters, tradeoff coefficient (C) and kernel width (s), in mapping homogeneous specific land cover.
[siRNAs with high specificity to the target: a systematic design by CRM algorithm].
Alsheddi, T; Vasin, L; Meduri, R; Randhawa, M; Glazko, G; Baranova, A
2008-01-01
'Off-target' silencing effects hinder the development of siRNA-based therapeutic and research applications. A common solution to this problem is to employ BLAST, which may miss significant alignments, or the exhaustive Smith-Waterman algorithm, which is very time-consuming. We have developed a Comprehensive Redundancy Minimizer (CRM) approach for mapping all unique sequences ("targets") 9-to-15 nt in size within large sets of sequences (e.g. transcriptomes). CRM outputs a list of potential siRNA candidates for every transcript of the particular species. These candidates can be further analyzed by traditional "set-of-rules" types of siRNA design tools. For human, 91% of transcripts are covered by candidate siRNAs with kernel targets of N = 15. We tested our approach on a collection of previously described, experimentally assessed siRNAs and found that the correlation between efficacy and presence in the CRM-approved set is significant (r = 0.215, p-value = 0.0001). An interactive database that contains a precompiled set of all human siRNA candidates with minimized redundancy is available at http://129.174.194.243. Application of CRM-based filtering minimizes potential "off-target" silencing effects and could improve routine siRNA applications.
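The core idea of mapping k-mers that are unique across a transcriptome can be sketched in a few lines. This is a simplified illustration only: real redundancy minimizers such as CRM must also handle reverse complements, near-matches and efficient indexing, none of which appear here, and the function name is ours.

```python
from collections import Counter

def unique_kmers(transcripts, k=15):
    """Map transcript name -> list of its k-mers that occur exactly once
    across the whole transcript set (candidate high-specificity targets).

    `transcripts` is a dict of {name: nucleotide string}. A toy sketch of
    unique-target enumeration, not the CRM algorithm itself.
    """
    counts = Counter()
    for seq in transcripts.values():
        for i in range(len(seq) - k + 1):
            counts[seq[i:i + k]] += 1
    return {name: [seq[i:i + k]
                   for i in range(len(seq) - k + 1)
                   if counts[seq[i:i + k]] == 1]
            for name, seq in transcripts.items()}

# Toy example with k=8: the shared prefix "ACGTACGT" is redundant,
# so only the k-mers covering the differing tails remain.
targets = unique_kmers({"t1": "ACGTACGTAA", "t2": "ACGTACGTCC"}, k=8)
```

Counting every k-mer once over the whole corpus and then filtering per transcript keeps the pass over the data linear in total sequence length, which is the property that makes this approach scale to transcriptome-sized inputs.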
This paper presents a fuzzy set-based method of mapping spatial accuracy of thematic map and computing several ecological indicators while taking into account spatial variation of accuracy associated with different land cover types and other factors (e.g., slope, soil type, etc.)...
ERIC Educational Resources Information Center
Poncy, Brian C.; Skinner, Christopher H.; Jaspers, Kathryn E.
2007-01-01
An adapted alternating treatments design was used to evaluate and compare the effects of two procedures designed to enhance math fact accuracy and fluency in an elementary student with low cognitive functioning. Results showed that although the cover, copy, compare (CCC) and the taped problems (TP) procedures both increased the student's math fact…
Data layer integration for the national map of the united states
Usery, E.L.; Finn, M.P.; Starbuck, M.
2009-01-01
The integration of geographic data layers in multiple raster and vector formats, from many different organizations and at a variety of resolutions and scales, is a significant problem for The National Map of the United States being developed by the U.S. Geological Survey. Our research has examined data integration from a layer-based approach for five of The National Map data layers: digital orthoimages, elevation, land cover, hydrography, and transportation. An empirical approach has included visual assessment by a set of respondents with statistical analysis to establish the meaning of various types of integration. A separate theoretical approach with established hypotheses tested against actual data sets has resulted in an automated procedure for integration of specific layers and is being tested. The empirical analysis has established resolution bounds on meanings of integration with raster datasets and distance bounds for vector data. The theoretical approach has used a combination of theories on cartographic transformation and generalization, such as Töpfer's radical law, and additional research concerning optimum viewing scales for digital images to establish a set of guiding principles for integrating data of different resolutions.
The construction of an EST database for Bombyx mori and its application
Mita, Kazuei; Morimyo, Mitsuoki; Okano, Kazuhiro; Koike, Yoshiko; Nohata, Junko; Kawasaki, Hideki; Kadono-Okuda, Keiko; Yamamoto, Kimiko; Suzuki, Masataka G.; Shimada, Toru; Goldsmith, Marian R.; Maeda, Susumu
2003-01-01
To build a foundation for the complete genome analysis of Bombyx mori, we have constructed an EST database. Because gene expression patterns deeply depend on tissues as well as developmental stages, we analyzed many cDNA libraries prepared from various tissues and different developmental stages to cover the entire set of Bombyx genes. So far, the Bombyx EST database contains 35,000 ESTs from 36 cDNA libraries, which are grouped into ≈11,000 nonredundant ESTs with the average length of 1.25 kb. The comparison with FlyBase suggests that the present EST database, SilkBase, covers >55% of all genes of Bombyx. The fraction of library-specific ESTs in each cDNA library indicates that we have not yet reached saturation, showing the validity of our strategy for constructing an EST database to cover all genes. To tackle the coming saturation problem, we have checked two methods, subtraction and normalization, to increase coverage and decrease the number of housekeeping genes, resulting in a 5–11% increase of library-specific ESTs. The identification of a number of genes and comprehensive cloning of gene families have already emerged from the SilkBase search. Direct links of SilkBase with FlyBase and WormBase provide ready identification of candidate Lepidoptera-specific genes. PMID:14614147
Dimer covering and percolation frustration.
Haji-Akbari, Amir; Haji-Akbari, Nasim; Ziff, Robert M
2015-09-01
Covering a graph or a lattice with nonoverlapping dimers is a problem that has received considerable interest in areas such as discrete mathematics, statistical physics, chemistry, and materials science. Yet the problem of percolation on dimer-covered lattices has received little attention. In particular, percolation on lattices that are fully covered by nonoverlapping dimers has evidently not been considered. Here, we propose a procedure for generating random dimer coverings of a given lattice. We then compute the bond percolation threshold on random and ordered coverings of the square and the triangular lattices on the remaining bonds connecting the dimers. We obtain p_c = 0.367713(2) and p_c = 0.235340(1) for random coverings of the square and the triangular lattices, respectively. We observe that the percolation frustration induced as a result of dimer covering is larger in the low-coordination-number square lattice. There is also no relationship between the existence of long-range order in a covering of the square lattice and its percolation threshold. In particular, an ordered covering of the square lattice, denoted by shifted covering in this paper, has an unusually low percolation threshold and is topologically identical to the triangular lattice. This is in contrast to the other ordered dimer coverings considered in this paper, which have higher percolation thresholds than the random covering. In the case of the triangular lattice, the percolation thresholds of the ordered and random coverings are very close, suggesting the lack of sensitivity of the percolation threshold to microscopic details of the covering in highly coordinated networks.
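One standard way to generate random dimer coverings of a square lattice, sketched below, is to start from an ordered (columnar) covering and randomize it with plaquette-rotation moves: whenever a 2x2 plaquette carries two parallel dimers, they can be rotated by 90 degrees while the configuration remains a perfect matching. The abstract does not specify the authors' exact procedure, so this generic Monte Carlo sampler is an assumption on our part; names and parameters are illustrative.

```python
import random

def random_dimer_covering(m, n, sweeps=200, seed=0):
    """Random dimer covering of an m x n square lattice (n even).

    Starts from a covering by horizontal dimers and applies plaquette
    rotations. Returns partner[site] = the site its dimer connects to.
    """
    rng = random.Random(seed)
    partner = {}
    for r in range(m):                      # initial columnar covering
        for c in range(0, n, 2):
            partner[(r, c)] = (r, c + 1)
            partner[(r, c + 1)] = (r, c)
    for _ in range(sweeps * m * n):
        r, c = rng.randrange(m - 1), rng.randrange(n - 1)
        a, b = (r, c), (r, c + 1)           # top edge of the plaquette
        d, e = (r + 1, c), (r + 1, c + 1)   # bottom edge
        if partner[a] == b and partner[d] == e:    # two horizontal dimers
            partner[a], partner[d] = d, a          # rotate to vertical
            partner[b], partner[e] = e, b
        elif partner[a] == d and partner[b] == e:  # two vertical dimers
            partner[a], partner[b] = b, a          # rotate to horizontal
            partner[d], partner[e] = e, d
    return partner
```

Because each move only ever exchanges two parallel dimers on a plaquette, every intermediate configuration stays a valid perfect matching; percolation could then be studied on the lattice bonds not occupied by dimers, as in the abstract.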
Regional land cover characterization using Landsat thematic mapper data and ancillary data sources
Vogelmann, James E.; Sohl, Terry L.; Campbell, P.V.; Shaw, D.M.; ,
1998-01-01
As part of the activities of the Multi-Resolution Land Characteristics (MRLC) Interagency Consortium, an intermediate-scale land cover data set is being generated for the conterminous United States. This effort is being conducted on a region-by-region basis using U.S. Standard Federal Regions. To date, land cover data sets have been generated for Federal Regions 3 (Pennsylvania, West Virginia, Virginia, Maryland, and Delaware) and 2 (New York and New Jersey). Classification work is currently under way in Federal Region 4 (the southeastern United States), and land cover mapping activities have been started in Federal Regions 5 (the Great Lakes region) and 1 (New England). It is anticipated that a land cover data set for the conterminous United States will be completed by the end of 1999. A standard land cover classification legend is used, which is analogous to and compatible with other classification schemes. The primary MRLC regional classification scheme contains 23 land cover classes. The primary source of data for the project is the Landsat thematic mapper (TM) sensor. For each region, TM scenes representing both leaf-on and leaf-off conditions are acquired, preprocessed, and georeferenced to MRLC specifications. Mosaicked data are clustered using unsupervised classification, and individual clusters are labeled using aerial photographs. Individual clusters that represent more than one land cover unit are split using spatial modeling with multiple ancillary spatial data layers (most notably, digital elevation model, population, land use and land cover, and wetlands information). This approach yields regional land cover information suitable for a wide array of applications, including landscape metric analyses, land management, land cover change studies, and nutrient and pesticide runoff modeling.
29 CFR 778.325 - Effect on salary covering more than 40 hours' pay.
Code of Federal Regulations, 2010 CFR
2010-07-01
COMPENSATION, Special Problems: Reduction in Workweek Schedule with No Change in Pay. § 778.325 Effect on salary covering more than 40 hours' pay. The same reasoning applies to salary covering straight time pay for a...
ERIC Educational Resources Information Center
Grafman, Joel M.; Cates, Gary L.
2010-01-01
This study compared the fluency and error rates produced when using the Cover, Copy, and Compare (CCC) and a modified CCC procedure (MCCC) called Copy, Cover, and Compare to complete subtraction math problems. Two second-grade classrooms consisting of 47 total students participated in the study. The following items were administered to…
Wissmann, F; Reginatto, M; Möller, T
2010-09-01
The problem of finding a simple, generally applicable description of worldwide measured ambient dose equivalent rates at aviation altitudes between 8 and 12 km is difficult to solve due to the large variety of functional forms and parametrisations that are possible. We present an approach that uses Bayesian statistics and Monte Carlo methods to fit mathematical models to a large set of data and to compare the different models. About 2500 data points measured in the periods 1997-1999 and 2003-2006 were used. Since the data cover wide ranges of barometric altitude, vertical cut-off rigidity and phases in the solar cycle 23, we developed functions which depend on these three variables. Whereas the dependence on the vertical cut-off rigidity is described by an exponential, the dependences on barometric altitude and solar activity may be approximated by linear functions in the ranges under consideration. Therefore, a simple Taylor expansion was used to define different models and to investigate the relevance of the different expansion coefficients. With the method presented here, it is possible to obtain probability distributions for each expansion coefficient and thus to extract reliable uncertainties even for the dose rate evaluated. The resulting function agrees well with new measurements made at fixed geographic positions and during long haul flights covering a wide range of latitudes.
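The functional form described above, linear in barometric altitude and solar-activity phase and exponential in vertical cut-off rigidity, can be written down directly. The factorized shape, the coefficient names (a0, a1, a2, b) and the example values below are illustrative assumptions for a sketch; the paper's actual Taylor-expansion coefficients come from its Bayesian fit and are not given in the abstract.

```python
import math

def dose_rate(h_km, rc_gv, s, coeff):
    """Ambient dose equivalent rate model sketch:
    linear in barometric altitude h_km and solar-activity phase s,
    exponential in vertical cut-off rigidity rc_gv (GV).

    coeff = (a0, a1, a2, b); all values here are hypothetical, not the
    fitted parametrisation from the paper.
    """
    a0, a1, a2, b = coeff
    return (a0 + a1 * h_km + a2 * s) * math.exp(-b * rc_gv)

# Example with made-up coefficients: dose rate rises with altitude and
# falls off exponentially toward low geomagnetic latitudes (high rigidity).
example = dose_rate(10.0, 3.0, 0.5, (5.0, 0.5, -1.0, 0.1))
```

A Bayesian treatment like the one in the abstract would place priors on (a0, a1, a2, b) and sample their posterior with Monte Carlo, yielding a probability distribution, and hence an uncertainty, for the evaluated dose rate rather than a single number.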
Clostridium difficile Infection and Fecal Microbiota Transplant
Liubakka, Alyssa; Vaughn, Byron P.
2017-01-01
Clostridium difficile infection (CDI) is a major source of morbidity and mortality for hospitalized patients. Although most patients have a clinical response to existing antimicrobial therapies, recurrent infection develops in up to 30% of patients. Fecal microbiota transplant is a novel approach to this complex problem, with an efficacy rate of nearly 90% in the setting of multiple recurrent CDI. This review covers the current epidemiology of CDI (including toxigenic and nontoxigenic strains, risk factors for infection, and recurrent infection), methods of diagnosis, existing first-line therapies in CDI, the role of fecal microbiota transplant for multiple recurrent CDIs, and the potential use of fecal microbial transplant for patients with severe or refractory infection. PMID:27959316
The Affordable Care Act versus Medicare for All.
Seidman, Laurence
2015-08-01
Many problems facing the Affordable Care Act would disappear if the nation were instead implementing Medicare for All - the extension of Medicare to every age group. Every American would be automatically covered for life. Premiums would be replaced with a set of Medicare taxes. There would be no patient cost sharing. Individuals would have free choice of doctors. Medicare's single-payer bargaining power would slow price increases and reduce medical cost as a percentage of gross domestic product (GDP). Taxes as a percentage of GDP would rise from below average to average for economically advanced nations. Medicare for All would be phased in by age. Copyright © 2015 by Duke University Press.
2012-01-01
Computational approaches to generate hypotheses from biomedical literature have been studied intensively in recent years. Nevertheless, it still remains a challenge to automatically discover novel, cross-silo biomedical hypotheses from large-scale literature repositories. In order to address this challenge, we first model a biomedical literature repository as a comprehensive network of biomedical concepts and formulate hypotheses generation as a process of link discovery on the concept network. We extract the relevant information from the biomedical literature corpus and generate a concept network and concept-author map on a cluster using the Map-Reduce framework. We extract a set of heterogeneous features such as random walk based features, neighborhood features and common author features. The potential number of links to consider for the possibility of link discovery is large in our concept network, and to address the scalability problem, the features from a concept network are extracted using a cluster with the Map-Reduce framework. We further model link discovery as a classification problem carried out on a training data set automatically extracted from two network snapshots taken in two consecutive time durations. A set of heterogeneous features, which cover both topological and semantic features derived from the concept network, have been studied with respect to their impacts on the accuracy of the proposed supervised link discovery process. A case study of hypotheses generation based on the proposed method has been presented in the paper. PMID:22759614
Lok, Willeke; Anteunis, Lucien J. C.; Chenault, Michelene N.; Meesters, Cor; Haggard, Mark P.
2012-01-01
Objective: The present study investigates whether general practitioner (GP) consultation initiated by failing the population hearing screening at age nine months, or GP consultation because of parental concern over ear/hearing problems, was more important in the decision on referral and/or surgical treatment of otitis media (OM). Design: A questionnaire covering the history between birth and 21 months of age was used to obtain information on referral after failing the hearing screening, GP consultations for ear/hearing problems, and subsequent referral to a specialist and possible surgical treatment at an ENT department. Setting: The province of Limburg, the Netherlands. Subjects: Healthy infants invited for the hearing screening at age nine months, who responded in an earlier study called PEPPER (Persistent Ear Problems, Providing Evidence for Referral; response rate 58%). Main outcome measures: The odds of a child being surgically treated for OM. Results: The response rate for the present questionnaire was 72%. Of all children tested, 3.9% failed the hearing screening and were referred to their GP. Of all 2619 children in this study, 18.6% visited their GP with ear/hearing problems. Children failing the hearing screening without GP consultation for ear/hearing problems were significantly more often treated surgically for OM than children passing the hearing screening but with GP consultation for ear/hearing problems. Conclusion: Objectified hearing loss, i.e. failing the hearing screening, was important in the decision for surgical treatment in infants in the Netherlands. PMID:22794165
Eldercare at Home: Dental Problems
SCALING-UP INFORMATION IN LAND-COVER DATA FOR LARGE-SCALE ENVIRONMENTAL ASSESSMENTS
The NLCD project provides national-scope land-cover data for the conterminous United States. The first land-cover data set was completed in 2000, and the continuing need for recent land-cover information has motivated continuation of the project to provide current and change info...
Competitive code-based fast palmprint identification using a set of cover trees
NASA Astrophysics Data System (ADS)
Yue, Feng; Zuo, Wangmeng; Zhang, David; Wang, Kuanquan
2009-06-01
A palmprint identification system recognizes a query palmprint image by searching for its nearest neighbor from among all the templates in a database. When applied on a large-scale identification system, it is often necessary to speed up the nearest-neighbor searching process. We use competitive code, which has very fast feature extraction and matching speed, for palmprint identification. To speed up the identification process, we extend the cover tree method and propose to use a set of cover trees to facilitate the fast and accurate nearest-neighbor searching. We can use the cover tree method because, as we show, the angular distance used in competitive code can be decomposed into a set of metrics. Using the Hong Kong PolyU palmprint database (version 2) and a large-scale palmprint database, our experimental results show that the proposed method searches for nearest neighbors faster than brute force searching.
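The key observation in the abstract is that a distance which decomposes into a sum of per-position metrics is itself a metric, which is what makes metric-tree indexes such as cover trees applicable. A minimal sketch of that decomposition with a brute-force baseline follows; the six-orientation coding and the toy template database are assumptions for illustration, and the paper's actual speed-up comes from cover trees, which are not implemented here:

```python
import random

N_ORI = 6  # competitive code assigns one of six filter orientations per point

def ang(a, b):
    """Angular difference between two orientation codes: a metric on Z_6
    with wraparound (distance between 0 and 5 is 1, not 5)."""
    d = abs(a - b) % N_ORI
    return min(d, N_ORI - d)

def angular_distance(code1, code2):
    # The total distance decomposes into a sum of per-position metrics,
    # so the whole thing is a metric and cover-tree indexing applies.
    return sum(ang(a, b) for a, b in zip(code1, code2))

def nearest(query, templates):
    """Brute-force nearest-neighbor search, the baseline a cover tree beats."""
    return min(templates, key=lambda t: angular_distance(query, templates[t]))

random.seed(0)
db = {f"palm{i}": [random.randrange(N_ORI) for _ in range(64)] for i in range(100)}
probe = list(db["palm7"])
probe[0] = (probe[0] + 1) % N_ORI  # slightly perturbed copy of palm7
print(nearest(probe, db))
```

The brute-force search visits every template; the point of the set-of-cover-trees construction is to prune most of those comparisons while returning the same answer.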
Optimal selection of epitopes for TXP-immunoaffinity mass spectrometry.
Planatscher, Hannes; Supper, Jochen; Poetz, Oliver; Stoll, Dieter; Joos, Thomas; Templin, Markus F; Zell, Andreas
2010-06-25
Mass spectrometry (MS) based protein profiling has become one of the key technologies in biomedical research and biomarker discovery. One bottleneck in MS-based protein analysis is sample preparation, in particular an efficient fractionation step to reduce the complexity of biological samples, which are too complex to be analyzed directly with MS. Sample preparation strategies that reduce the complexity of tryptic digests using immunoaffinity-based methods have been shown to lead to a substantial increase in throughput and sensitivity in proteomic mass spectrometry. The limitation of such immunoaffinity-based approaches is the availability of appropriate peptide-specific capture antibodies. Recent developments, in which subsets of peptides with short identical terminal sequences are enriched using antibodies directed against short terminal epitopes, promise a significant gain in efficiency. We show that the minimal set of terminal epitopes covering a target protein list can be found by formulating the task as a set cover problem, preceded by a filtering pipeline that excludes peptides and target epitopes with undesirable properties. For small datasets (a few hundred proteins) it is possible to solve the problem to optimality with moderate computational effort using commercial or free solvers. Larger datasets, such as full proteomes, require the use of heuristics.
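The set cover formulation above is typically attacked with the classical greedy heuristic before resorting to exact solvers. A minimal sketch follows; the epitope and protein names are invented for illustration and are not taken from the paper:

```python
def greedy_set_cover(universe, subsets):
    """Classical greedy heuristic for set cover: repeatedly pick the subset
    covering the most still-uncovered elements (ln(n)-approximation)."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(subsets, key=lambda s: len(uncovered & subsets[s]))
        gain = uncovered & subsets[best]
        if not gain:
            raise ValueError("remaining elements cannot be covered")
        chosen.append(best)
        uncovered -= gain
    return chosen

# Toy instance: terminal epitopes (columns) covering target proteins (rows).
proteins = {"P1", "P2", "P3", "P4", "P5"}
epitopes = {
    "AAK": {"P1", "P2", "P3"},
    "GLR": {"P3", "P4"},
    "TVD": {"P4", "P5"},
    "WQE": {"P5"},
}
print(greedy_set_cover(proteins, epitopes))  # → ['AAK', 'TVD']
```

For the small instances the abstract mentions, an integer-programming solver would return a provably optimal cover instead; the greedy answer is only guaranteed to be within a logarithmic factor of it.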
Registered nurses' clinical reasoning skills and reasoning process: A think-aloud study.
Lee, JuHee; Lee, Young Joo; Bae, JuYeon; Seo, Minjeong
2016-11-01
As complex chronic diseases increase, nurses' prompt and accurate clinical reasoning skills are essential. However, little is known about the reasoning skills of registered nurses. This study aimed to determine how registered nurses use their clinical reasoning skills and how the reasoning process proceeds in complex clinical situations in a hospital setting. A qualitative exploratory design was used with a think-aloud method. A total of 13 registered nurses (mean years of experience = 11.4) participated in the study, solving an ill-structured clinical problem based on cases of patients with complex chronic conditions in a hospital setting. Data were analyzed using deductive content analysis. Findings showed that the registered nurses used a variety of clinical reasoning skills. The most commonly used skill was 'checking accuracy and reliability.' The reasoning process of registered nurses covered the assessment, analysis, diagnosis, planning/implementation, and evaluation phases. It is critical that registered nurses apply appropriate clinical reasoning skills in complex clinical practice. The main focus of registered nurses' reasoning in this study was assessing a patient's health problem, and their reasoning process was cyclic rather than linear. There is a need to develop educational strategies to enhance registered nurses' competency in determining appropriate interventions in a timely and accurate fashion. Copyright © 2016 Elsevier Ltd. All rights reserved.
High Level Rule Modeling Language for Airline Crew Pairing
NASA Astrophysics Data System (ADS)
Mutlu, Erdal; Birbil, Ş. Ilker; Bülbül, Kerem; Yenigün, Hüsnü
2011-09-01
The crew pairing problem is an airline optimization problem in which a set of least costly pairings (sequences of consecutive flights to be flown by a single crew) that covers every flight in a given flight network is sought. A pairing is defined by a very complex set of feasibility rules imposed by international and national regulatory agencies, and also by the airline itself. The cost of a pairing is likewise defined by complicated rules. When an optimization engine generates a sequence of flights from a given flight network, it has to check all these feasibility rules to determine whether the sequence forms a valid pairing. Likewise, the engine needs to calculate the cost of the pairing using certain rules. However, the rules used for checking feasibility and calculating costs are usually not static. Furthermore, airline companies carry out what-if analyses by testing several alternative scenarios in each planning period. Therefore, embedding the implementation of feasibility-checking and cost-calculation rules into the source code of the optimization engine is not a practical approach. In this work, a high-level language called ARUS is introduced for describing the feasibility and cost calculation rules. A compiler for ARUS is also implemented in this work to generate a dynamic link library to be used by crew pairing optimization engines.
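The design argument here, keeping rules out of the engine's source, can be illustrated with a small rule-registry sketch. The rule names, record fields, and the 14-hour threshold below are invented for illustration and are not part of ARUS:

```python
# Hypothetical illustration: feasibility rules registered as data, so the
# optimization engine iterates over them instead of hard-coding each check.
RULES = []

def rule(fn):
    """Decorator that registers a feasibility rule with the engine."""
    RULES.append(fn)
    return fn

@rule
def max_duty_hours(pairing):
    # Invented regulatory-style limit on total flying time in a pairing.
    return sum(f["hours"] for f in pairing) <= 14

@rule
def connects(pairing):
    # Consecutive flights must depart where the previous one arrived.
    return all(a["to"] == b["frm"] for a, b in zip(pairing, pairing[1:]))

def feasible(pairing):
    """The engine's only job: every registered rule must pass."""
    return all(r(pairing) for r in RULES)

flights = [
    {"frm": "IST", "to": "FRA", "hours": 3.5},
    {"frm": "FRA", "to": "IST", "hours": 3.5},
]
print(feasible(flights))  # → True
```

Swapping or adding rules then changes only the registry, not the engine, which is the same separation ARUS achieves by compiling rule descriptions into a dynamic link library.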
Science: Grade 7. Interim Guide.
ERIC Educational Resources Information Center
Manitoba Dept. of Education and Training, Winnipeg.
This guide is one of a set of 10 science guides, each covering a separate grade in Manitoba, together covering kindergarten through grade 9. The guides have been designed to provide a framework for building scientific concepts and developing the learning of process skills. They replace an earlier set of guides dated 1979. Each guide is essentially…
Science: Grade 6. Interim Guide.
ERIC Educational Resources Information Center
Manitoba Dept. of Education and Training, Winnipeg.
This guide is one of a set of 10 science guides, each covering a separate grade in Manitoba, together covering kindergarten through grade 9. The guides have been designed to provide a framework for building scientific concepts and developing the learning of process skills. They replace an earlier set of guides dated 1979. Each guide is essentially…
Science: Grade 8. Interim Guide.
ERIC Educational Resources Information Center
Manitoba Dept. of Education and Training, Winnipeg.
This guide is one of a set of 10 science guides, each covering a separate grade in Manitoba, together covering kindergarten through grade 9. The guides have been designed to provide a framework for building scientific concepts and developing the learning of process skills. They replace an earlier set of guides dated 1979. Each guide is essentially…
Science: Grade 2. Interim Guide.
ERIC Educational Resources Information Center
Manitoba Dept. of Education and Training, Winnipeg.
This guide is one of a set of 10 science guides, each covering a separate grade in Manitoba, together covering kindergarten through grade 9. The guides have been designed to provide a framework for building scientific concepts and developing the learning of process skills. They replace an earlier set of guides dated 1979. Each guide is essentially…
Science: Grade 1. Interim Guide.
ERIC Educational Resources Information Center
Manitoba Dept. of Education and Training, Winnipeg.
This guide is one of a set of 10 science guides, each covering a separate grade in Manitoba, together covering kindergarten through grade 9. The guides have been designed to provide a framework for building scientific concepts and developing the learning of process skills. They replace an earlier set of guides dated 1979. Each guide is essentially…
Science: Kindergarten. Interim Guide.
ERIC Educational Resources Information Center
Manitoba Dept. of Education and Training, Winnipeg.
This guide is one of a set of 10 science guides, each covering a separate grade in Manitoba, together covering kindergarten through grade 9. The guides have been designed to provide a framework for building scientific concepts and developing the learning of process skills. They replace an earlier set of guides dated 1979. Each guide is essentially…
ExSTraCS 2.0: Description and Evaluation of a Scalable Learning Classifier System.
Urbanowicz, Ryan J; Moore, Jason H
2015-09-01
Algorithmic scalability is a major concern for any machine learning strategy in this age of 'big data'. A large number of potentially predictive attributes is emblematic of problems in bioinformatics, genetic epidemiology, and many other fields. Previously, ExSTraCS was introduced as an extended Michigan-style supervised learning classifier system that combined a set of powerful heuristics to successfully tackle the challenges of classification, prediction, and knowledge discovery in complex, noisy, and heterogeneous problem domains. While Michigan-style learning classifier systems are powerful and flexible learners, they are not considered to be particularly scalable. For the first time, this paper presents a complete description of the ExSTraCS algorithm and introduces an effective strategy to dramatically improve learning classifier system scalability. ExSTraCS 2.0 addresses scalability with (1) a rule specificity limit, (2) new approaches to expert-knowledge-guided covering and mutation mechanisms, and (3) the implementation and utilization of the TuRF algorithm for improving the quality of expert knowledge discovery in larger datasets. Performance over a complex spectrum of simulated genetic datasets demonstrated that these new mechanisms dramatically improve nearly every performance metric on datasets with 20 attributes and made it possible for ExSTraCS to reliably scale up to perform on related 200- and 2000-attribute datasets. ExSTraCS 2.0 was also able to reliably solve the 6, 11, 20, 37, 70, and 135 multiplexer problems, and did so in similar or fewer learning iterations than previously reported, with smaller finite training sets, and without using building blocks discovered from simpler multiplexer problems. Furthermore, ExSTraCS usability was made simpler through the elimination of previously critical run parameters.
Systematic investigation of non-Boussinesq effects in variable-density groundwater flow simulations.
Guevara Morel, Carlos R; van Reeuwijk, Maarten; Graf, Thomas
2015-12-01
The validity of three mathematical models describing variable-density groundwater flow is systematically evaluated: (i) a model which invokes the Oberbeck-Boussinesq approximation (OB approximation), (ii) a model of intermediate complexity (NOB1) and (iii) a model which solves the full set of equations (NOB2). The NOB1 and NOB2 descriptions have been added to the HydroGeoSphere (HGS) model, which originally contained an implementation of the OB description. We define the Boussinesq parameter ε_ρ = β_ω Δω, where β_ω is the solutal expansivity and Δω is the characteristic difference in solute mass fraction. The Boussinesq parameter ε_ρ is used to systematically investigate three flow scenarios covering a range of free and mixed convection problems: 1) the low Rayleigh number Elder problem (Van Reeuwijk et al., 2009), 2) a convective fingering problem (Xie et al., 2011) and 3) a mixed convective problem (Schincariol et al., 1994). Results indicate that small density differences (ε_ρ ≤ 0.05) produce no apparent changes in the total solute mass in the system, plume penetration depth, center of mass and mass flux, independent of the mathematical model used. Deviations between OB, NOB1 and NOB2 occur for large density differences (ε_ρ > 0.12), where lower description levels underestimate the vertical plume position and overestimate mass flux. Based on the cases considered here, we suggest the following guidelines for saline convection: the OB approximation is valid for cases with ε_ρ < 0.05, and the full NOB set of equations needs to be used for cases with ε_ρ > 0.10. Whether NOB effects are important in the intermediate region differs from case to case. Copyright © 2015 Elsevier B.V. All rights reserved.
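The guideline in the final sentences maps to a simple decision rule on the Boussinesq parameter. The function below merely restates the abstract's thresholds and is not part of HydroGeoSphere; the numerical inputs in the example are invented:

```python
def boussinesq_model(beta_omega, delta_omega):
    """Suggest a variable-density flow description from the Boussinesq
    parameter eps = beta_omega * delta_omega, per the paper's guideline."""
    eps = beta_omega * delta_omega
    if eps < 0.05:
        return "OB approximation valid"
    if eps > 0.10:
        return "full NOB equations required"
    return "intermediate: case-dependent (compare OB/NOB1/NOB2)"

# e.g. a solutal expansivity of 0.7 and a mass-fraction contrast of 0.035
print(boussinesq_model(0.7, 0.035))  # → OB approximation valid
```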
USDA-ARS?s Scientific Manuscript database
The quality of the seal provided by the plastic cover is a key issue for minimizing losses in bunker and pile silos. Most bunker covers are 6 to 8 mil polyethylene sheets held in place by tires or tire sidewalls. Frequently there are problems with spoilage at the shoulders (i.e., against the walls),...
Vogelmann, James E.; DeFelice, Thomas P.
2003-01-01
Landsat-7 and Landsat-5 have orbits that are offset from each other by 8 days. During the time that the sensors on both satellites are operational, there is an opportunity for conducting analyses that incorporate multiple intra-annual high spatial resolution data sets for characterizing the Earth's land surface. In the current study, nine Landsat thematic mapper (TM) and enhanced thematic mapper plus (ETM+) data sets, covering the same path and row on different dates, were acquired during a 1-year time interval for a region in southeastern South Dakota and analyzed. Scenes were normalized using pseudoinvariant objects, and digital data from a series of test sites were extracted from the imagery and converted to surface reflectance. Sunphotometer data acquired on site were used to atmospherically correct the data. Ground observations that were made throughout the growing season by a large group of volunteers were used to help interpret spectroradiometric patterns and trends. Normalized images were found to be very effective in portraying the seasonal patterns of reflectance change that occurred throughout the region. Many of the radiometric patterns related to plant growth and development, but some also related to different background properties. The different kinds of land cover in the region were spectrally and radiometrically characterized and were found to have different seasonal patterns of reflectance. The degree to which the land cover classes could be separated spectrally and radiometrically, however, depended on the time of year during which the data sets were acquired, and no single data set appeared to be adequate for separating all types of land cover. This has practical implications for classification studies because known patterns of seasonal reflectance properties for the different types of land cover within a region will facilitate selection of the most appropriate data sets for producing land cover classifications.
Jenkyn, JF; Gutteridge, RJ; White, RP
2014-01-01
Experiments on the Rothamsted and Woburn Experimental Farms studied the effects on take-all of different break crops and of set-aside/conservation covers that interrupted sequences of winter wheat. There was no evidence for different effects on take-all of the break crops per se but the presence of volunteers, in crops of oilseed rape, increased the amounts of take-all in the following wheat. Severity of take-all was closely related to the numbers of volunteers in the preceding break crops and covers, and was affected by the date of their destruction. Early destruction of set-aside/conservation covers was usually effective in preventing damaging take-all in the following wheat except, sometimes, when populations of volunteers were very large. The experiments were not designed to test the effects of sowing dates but different amounts of take-all in the first wheats after breaks or covers apparently affected the severity of take-all in the following (second) wheats only where the latter were relatively late sown. In earlier-sown second wheats, take-all was consistently severe and unrelated to the severity of the disease in the preceding (first) wheats. Results from two very simple experiments suggested that substituting set-aside/conservation covers for winter wheat, for 1 year only, did not seriously interfere with the development of take-all disease or with the development or maintenance of take-all decline (TAD). With further research, it might be possible for growers wishing to exploit TAD to incorporate set-aside/conservation covers into their cropping strategies, and especially to avoid the worst effects of the disease on grain yield during the early stages of epidemics. PMID:25653455
El Toro Library Solar Heating and Cooling Demonstration Project. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report is divided into a number of essentially independent sections, each of which covers a specific topic. The sections, and the topics covered, are as follows. Section 1 provides a brief summary description of the solar energy heating and cooling system including the key final design parameters. Section 2 contains a copy of the final Acceptance Test Report. Section 3 consists of a reduced set of final updated as-built mechanical, electrical, control and instrumentation drawings of the solar energy heating and cooling system. Section 4 provides a summary of system maintenance requirements, in the form of a maintenance schedule which lists necessary maintenance tasks to be performed at monthly, quarterly, semi-annual, and annual intervals. Section 5 contains a series of photographs of the final solar energy system installation, including the collector field and the mechanical equipment room. Section 6 provides a concise summary of system operation and performance for the period of December 1981 through June 1982, as measured, computed and reported by Vitro Laboratories Division of Automation Industries, Inc., for the DOE National Solar Data Network. Section 7 provides a summary of key as-built design parameters, compared with the corresponding original design concept parameters. Section 8 provides a description of a series of significant problems encountered during construction, start-up and check-out of the solar energy heating and cooling system, together with the method employed to solve each problem at the time and/or recommendations for avoiding the problem in future designs of similar systems. Appendices A through H contain the installation, operation and maintenance submittals of the various manufacturers on the major items of equipment in the system. Reference CAPE-2823.
Fact Sheet on Evapotranspiration Cover Systems for Waste Containment
This Fact Sheet updates the 2003 Fact Sheet on Evapotranspiration Covers and provides information on the regulatory setting for ET covers, general considerations in their design, performance, and monitoring, and status at the time of writing (2011).
Application of high-order numerical schemes and Newton-Krylov method to two-phase drift-flux model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2017-08-07
This study concerns the application and solver robustness of the Newton-Krylov method in solving two-phase flow drift-flux model problems using high-order numerical schemes. In our previous studies, the Newton-Krylov method was shown to be a promising solver for two-phase flow drift-flux model problems. However, those studies were limited to first-order numerical schemes. Moreover, the previous approach to treating the drift-flux closure correlations was later revealed to deteriorate solver convergence when the mesh was highly refined, and also when higher-order numerical schemes were employed. In this study, a second-order spatial discretization scheme that had been tested with the two-fluid two-phase flow model was extended to solve drift-flux model problems. In order to improve solver robustness, and therefore efficiency, a new approach was proposed in which the mean drift velocity of the gas phase is treated as a primary nonlinear variable of the equation system. With this new approach, significant improvement in solver robustness was achieved. With highly refined mesh, the proposed treatment along with the Newton-Krylov solver was extensively tested with two-phase flow problems that cover a wide range of thermal-hydraulics conditions. Satisfactory convergence performance was observed for all test cases. Numerical verification was then performed in the form of mesh convergence studies, from which the expected orders of accuracy were obtained for both the first-order and the second-order spatial discretization schemes. Finally, the drift-flux model, along with the numerical methods presented, was validated with three sets of flow boiling experiments that cover different flow channel geometries (round tube, rectangular tube, and rod bundle) and a wide range of test conditions (pressure, mass flux, wall heat flux, inlet subcooling and outlet void fraction).
10 CFR 950.32 - Final determination on covered events.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 4 2012-01-01 2012-01-01 false Final determination on covered events. 950.32 Section 950... Process § 950.32 Final determination on covered events. (a) If the parties reach a Final Determination on Covered Events through mediation, or Summary Binding Decision as set forth in this subpart, the Final...
10 CFR 950.32 - Final determination on covered events.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 4 2011-01-01 2011-01-01 false Final determination on covered events. 950.32 Section 950... Process § 950.32 Final determination on covered events. (a) If the parties reach a Final Determination on Covered Events through mediation, or Summary Binding Decision as set forth in this subpart, the Final...
10 CFR 950.32 - Final determination on covered events.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Final determination on covered events. 950.32 Section 950... Process § 950.32 Final determination on covered events. (a) If the parties reach a Final Determination on Covered Events through mediation, or Summary Binding Decision as set forth in this subpart, the Final...
10 CFR 950.32 - Final determination on covered events.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 4 2013-01-01 2013-01-01 false Final determination on covered events. 950.32 Section 950... Process § 950.32 Final determination on covered events. (a) If the parties reach a Final Determination on Covered Events through mediation, or Summary Binding Decision as set forth in this subpart, the Final...
10 CFR 950.32 - Final determination on covered events.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 4 2014-01-01 2014-01-01 false Final determination on covered events. 950.32 Section 950... Process § 950.32 Final determination on covered events. (a) If the parties reach a Final Determination on Covered Events through mediation, or Summary Binding Decision as set forth in this subpart, the Final...
Retrieval of Soil Moisture and Roughness from the Polarimetric Radar Response
NASA Technical Reports Server (NTRS)
Sarabandi, Kamal; Ulaby, Fawwaz T.
1997-01-01
The main objective of this investigation was the characterization of soil moisture using imaging radars. In order to accomplish this task, a number of intermediate steps had to be undertaken. In this proposal, the theoretical, numerical, and experimental aspects of electromagnetic scattering from natural surfaces was considered with emphasis on remote sensing of soil moisture. In the general case, the microwave backscatter from natural surfaces is mainly influenced by three major factors: (1) the roughness statistics of the soil surface, (2) soil moisture content, and (3) soil surface cover. First the scattering problem from bare-soil surfaces was considered and a hybrid model that relates the radar backscattering coefficient to soil moisture and surface roughness was developed. This model is based on extensive experimental measurements of the radar polarimetric backscatter response of bare soil surfaces at microwave frequencies over a wide range of moisture conditions and roughness scales in conjunction with existing theoretical surface scattering models in limiting cases (small perturbation, physical optics, and geometrical optics models). Also a simple inversion algorithm capable of providing accurate estimates of soil moisture content and surface rms height from single-frequency multi-polarization radar observations was developed. The accuracy of the model and its inversion algorithm is demonstrated using independent data sets. Next the hybrid model for bare-soil surfaces is made fully polarimetric by incorporating the parameters of the co- and cross-polarized phase difference into the model. Experimental data in conjunction with numerical simulations are used to relate the soil moisture content and surface roughness to the phase difference statistics. For this purpose, a novel numerical scattering simulation for inhomogeneous dielectric random surfaces was developed. 
Finally the scattering problem of short vegetation cover above a rough soil surface was considered. A general scattering model for grass-blades of arbitrary cross section was developed and incorporated in a first order random media model. The vegetation model and the bare-soil model are combined and the accuracy of the combined model is evaluated against experimental observations from a wheat field over the entire growing season. A complete set of ground-truth data and polarimetric backscatter data were collected. Also an inversion algorithm for estimating soil moisture and surface roughness from multi-polarized multi-frequency observations of vegetation-covered ground is developed.
NASA Astrophysics Data System (ADS)
Hagensieker, Ron; Roscher, Ribana; Rosentreter, Johannes; Jakimow, Benjamin; Waske, Björn
2017-12-01
Remote sensing satellite data offer the unique possibility to map land use land cover transformations by providing spatially explicit information. However, detection of short-term processes and land use patterns of high spatial-temporal variability is a challenging task. We present a novel framework using multi-temporal TerraSAR-X data and machine learning techniques, namely discriminative Markov random fields with spatio-temporal priors, and import vector machines, in order to advance the mapping of land cover characterized by short-term changes. Our study region covers a current deforestation frontier in the Brazilian state Pará with land cover dominated by primary forests, different types of pasture land and secondary vegetation, and land use dominated by short-term processes such as slash-and-burn activities. The data set comprises multi-temporal TerraSAR-X imagery acquired over the course of the 2014 dry season, as well as optical data (RapidEye, Landsat) for reference. Results show that land use land cover is reliably mapped, resulting in spatially adjusted overall accuracies of up to 79% in a five class setting, yet limitations for the differentiation of different pasture types remain. The proposed method is applicable on multi-temporal data sets, and constitutes a feasible approach to map land use land cover in regions that are affected by high-frequent temporal changes.
NASA Astrophysics Data System (ADS)
Roelfsema, Chris M.; Kovacs, Eva M.; Phinn, Stuart R.
2015-08-01
This paper describes seagrass species and percentage cover point-based field data sets derived from georeferenced photo transects. Annually or biannually over a ten-year period (2004-2014), data sets were collected using 30-50 transects, 500-800 m in length, distributed across a 142 km2 shallow, clear-water seagrass habitat, the Eastern Banks, Moreton Bay, Australia. Each of the eight data sets includes seagrass property information derived from approximately 3000 georeferenced, downward-looking photographs captured at 2-4 m intervals along the transects. Photographs were manually interpreted to estimate seagrass species composition and percentage cover (Coral Point Count with Excel extensions; CPCe). Understanding seagrass biology, ecology and dynamics for scientific and management purposes requires point-based data on species composition and cover. This data set, and the methods used to derive it, are a globally unique example for seagrass ecological applications. It provides the basis for multiple further studies at this site, for regional to global comparative studies, and for the design of similar monitoring programs elsewhere.
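The per-photo point classifications translate into percentage cover by simple aggregation. A minimal sketch of that step follows; the category labels and function name are hypothetical, and this is not the CPCe software itself:

```python
from collections import Counter

def percent_cover(photo_points):
    """Aggregate per-photo point classifications into percentage cover.

    `photo_points` maps a photo id to the list of category labels
    assigned to the points overlaid on that photo. Returns the share
    of all points falling in each category, as a percentage.
    Assumes at least one classified point exists.
    """
    counts = Counter()
    total = 0
    for labels in photo_points.values():
        counts.update(labels)
        total += len(labels)
    return {cat: 100.0 * n / total for cat, n in counts.items()}
```

For example, two photos with points labelled `["Zostera", "sand"]` and `["Zostera", "Zostera"]` yield 75% Zostera and 25% sand.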
NASA Technical Reports Server (NTRS)
McBride, Bonnie J.; Gordon, Sanford
1996-01-01
This user's manual is the second part of a two-part report describing the NASA Lewis CEA (Chemical Equilibrium with Applications) program. The program obtains chemical equilibrium compositions of complex mixtures with applications to several types of problems. The topics presented in this manual are: (1) details for preparing input data sets; (2) a description of output tables for various types of problems; (3) the overall modular organization of the program with information on how to make modifications; (4) a description of the function of each subroutine; (5) error messages and their significance; and (6) a number of examples that illustrate various types of problems handled by CEA and that cover many of the options available in both input and output. Seven appendixes give information on the thermodynamic and thermal transport data used in CEA; some information on common variables used in or generated by the equilibrium module; and output tables for 14 example problems. The CEA program was written in ANSI standard FORTRAN 77. CEA should work on any system with sufficient storage. There are about 6300 lines in the source code, which uses about 225 kilobytes of memory. The compiled program takes about 975 kilobytes.
NASA Astrophysics Data System (ADS)
Chen, Liang; Ma, Zhuguo; Mahmood, Rezaul; Zhao, Tianbao; Li, Zhenhua; Li, Yanping
2017-08-01
Reliable land cover data are important for improving numerical simulations by regional climate models, because land surface properties directly affect climate simulation through the partitioning of energy, water and momentum fluxes and by determining temperature and moisture at the interface between the land surface and the atmosphere. China has experienced significant land cover change in recent decades, and accurate representation of these changes is hence essential. In this study, we used a climate model to examine the changes in the regional climate caused by the different land cover data sets of recent decades. Three sets of experiments are performed using the same settings, except for the land use/cover (LC) data, for the years 1990, 2000 and 2009, against the model default LC data. Three warm-season periods are selected in each set of experiments, representing a wet (1998), a normal (2000) and a dry (2011) year for China. The results show that all three sets of land cover experiments simulate a warm bias, relative to the control with default LC data, for near-surface temperature in summertime in most parts of China. It is especially noticeable in southwest China and south of the Yangtze River, where significant changes of LC occurred. Deforestation in southwest China and to the south of the Yangtze River in the experiment cases may have contributed to the negative precipitation bias relative to the control cases. Large LC changes in the northwestern Tibetan Plateau in the 2000 and 2009 datasets are also associated with changes in surface temperature, precipitation, and heat fluxes. Wind anomalies and energy budget changes are consistent with the precipitation and temperature changes.
A biologically inspired controller to solve the coverage problem in robotics.
Rañó, Iñaki; Santos, José A
2017-06-05
The coverage problem consists of computing a path or trajectory for a robot that passes over all the points in some free area, with applications ranging from floor cleaning to demining. Coverage is solved either as a planning problem, which provides theoretical validation of the solution, or through heuristic techniques that rely on experimental validation. Through a combination of theoretical results and simulations, this paper presents a novel solution to the coverage problem that exploits the chaotic behaviour of a simple biologically inspired motion controller, the Braitenberg vehicle 2b. Although chaos has been used for coverage before, our approach makes much less restrictive assumptions about the environment and can be implemented using on-board sensors. First, we prove theoretically that this vehicle, a well-known model of animal tropotaxis, behaves like a charge in an electromagnetic field. The motion equations can be reduced to a Hamiltonian system, and therefore the vehicle follows quasi-periodic or chaotic trajectories, which pass arbitrarily close to any point in the workspace, i.e. it solves the coverage problem. Second, through a set of extensive simulations, we show that the trajectories cover regions of bounded workspaces, and full coverage is achieved when the perceptual range of the vehicle is short. We compare the performance of this new approach with different types of random motion controllers in the same bounded environments.
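The kinematics of a Braitenberg vehicle 2b (crossed excitatory sensor-to-wheel connections on a differential-drive base) can be sketched as below. This is an illustrative simplification of the model analysed in the paper, with made-up parameter values and a single point stimulus, not the paper's Hamiltonian formulation:

```python
import math

def simulate_vehicle_2b(steps=2000, dt=0.01, base=1.0, gain=2.0,
                        source=(0.0, 0.0), start=(1.0, 1.0, 0.0),
                        wheelbase=0.1, sensor_offset=0.05):
    """Simulate a Braitenberg vehicle 2b with crossed excitatory wiring.

    Each wheel speed grows with the stimulus sensed on the OPPOSITE
    side, so the vehicle turns toward the source. The stimulus decays
    with squared distance. All parameter values are illustrative.
    Returns the (x, y) trajectory.
    """
    x, y, th = start
    trajectory = [(x, y)]
    for _ in range(steps):
        cx, cy = math.cos(th), math.sin(th)
        # Left/right sensor positions at the front of the vehicle
        lx, ly = x + cx * wheelbase - cy * sensor_offset, y + cy * wheelbase + cx * sensor_offset
        rx, ry = x + cx * wheelbase + cy * sensor_offset, y + cy * wheelbase - cx * sensor_offset
        s_left = 1.0 / (1.0 + (lx - source[0])**2 + (ly - source[1])**2)
        s_right = 1.0 / (1.0 + (rx - source[0])**2 + (ry - source[1])**2)
        # Crossed connections: right sensor drives the left wheel and vice versa
        v_left = base + gain * s_right
        v_right = base + gain * s_left
        v = 0.5 * (v_left + v_right)
        w = (v_right - v_left) / wheelbase
        x += v * cx * dt
        y += v * cy * dt
        th += w * dt
        trajectory.append((x, y))
    return trajectory
```

Plotting the returned trajectory for long runs shows the wandering, non-repeating paths the paper exploits for coverage.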
A Hybrid Memetic Framework for Coverage Optimization in Wireless Sensor Networks.
Chen, Chia-Pang; Mukhopadhyay, Subhas Chandra; Chuang, Cheng-Long; Lin, Tzu-Shiang; Liao, Min-Sheng; Wang, Yung-Chung; Jiang, Joe-Air
2015-10-01
One of the critical concerns in wireless sensor networks (WSNs) is the continuous maintenance of sensing coverage. Many applications, such as battlefield intrusion detection and object tracking, require full coverage at all times, which is typically achieved by adding redundant sensor nodes. With abundant energy, previous studies suggested that network lifetime can be maximized while maintaining full coverage by organizing sensor nodes into a maximum number of disjoint sets and turning them on alternately. Since the power of sensor nodes is unevenly consumed over time, and early failure of sensor nodes leads to coverage loss, WSNs require dynamic coverage maintenance. Thus, the task of permanently sustaining full coverage is formulated as a hybrid of the disjoint-set-covers and dynamic-coverage-maintenance problems, both of which have been proven NP-complete. In this paper, a hybrid memetic framework for coverage optimization (Hy-MFCO) is presented to cope with the hybrid problem using two major components: 1) a memetic algorithm (MA)-based scheduling strategy and 2) a heuristic recursive algorithm (HRA). First, the MA-based scheduling strategy adopts a dynamic chromosome structure to create disjoint sets; then the HRA compensates for the loss of coverage by waking some of the hibernating nodes in local regions when a disjoint set fails to maintain full coverage. The results obtained from real-world experiments using a WSN test-bed and from computer simulations indicate that the proposed Hy-MFCO is able to maximize sensing coverage while achieving energy efficiency at the same time. Moreover, the results also show that Hy-MFCO significantly outperforms existing methods with respect to coverage preservation and energy efficiency.
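The disjoint-set-covers subproblem can be made concrete with a simple greedy first-fit partition; this is a baseline sketch under simplified assumptions (binary coverage, known sensor footprints), not the paper's MA-based scheduler:

```python
def greedy_disjoint_covers(sensors, targets):
    """Greedily partition sensors into disjoint covers of `targets`.

    `sensors` maps a sensor id to the set of targets it covers.
    Each returned cover is a disjoint group of sensors that jointly
    covers every target; sensors that cannot complete a further cover
    are left unused. A first-fit illustration of the disjoint set
    covers problem, which is NP-complete in general.
    """
    remaining = dict(sensors)
    covers = []
    while True:
        cover, covered = [], set()
        # Try larger-footprint sensors first, skipping redundant ones
        for sid, area in sorted(remaining.items(), key=lambda kv: -len(kv[1])):
            if area - covered:          # sensor adds new coverage
                cover.append(sid)
                covered |= area
            if covered >= targets:
                break
        if cover and covered >= targets:
            for sid in cover:
                del remaining[sid]      # keep the covers disjoint
            covers.append(cover)
        else:
            return covers
```

Alternating the resulting covers (turning one on while the others sleep) is the lifetime-extension scheme the abstract describes; the HRA step would then wake spare nodes when a live cover loses a sensor.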
National treatment programme of hepatitis C in Egypt: Hepatitis C virus model of care.
El-Akel, W; El-Sayed, M H; El Kassas, M; El-Serafy, M; Khairy, M; Elsaeed, K; Kabil, K; Hassany, M; Shawky, A; Yosry, A; Shaker, M K; ElShazly, Y; Waked, I; Esmat, G; Doss, W
2017-04-01
Hepatitis C virus (HCV) infection is a major health problem in Egypt, as the nation bears the highest prevalence rate worldwide. This necessitated establishing a novel model of care (MOC) to contain the epidemic, deliver patient care and ensure global treatment access. In this review, we describe the process of development of the Egyptian model and future strategies for sustainability. Although the magnitude of the HCV problem was known for many years, the HCV MOC only came into being in 2006 with the establishment of the National Committee for Control of Viral Hepatitis (NCCVH) to set up and implement a national control strategy for the disease and other causes of viral hepatitis. The strategy outlines best practices for patient care delivery by applying a set of service principles through identified clinical streams and patient flow continuums. The Egyptian national viral hepatitis treatment programme is considered one of the most successful and effective public health programmes. To date, more than one million patients have been evaluated and more than 850 000 have received treatment under the umbrella of the programme since 2006. The NCCVH has been successful in establishing a strong infrastructure for controlling viral hepatitis in Egypt. It established a nationwide network of digitally connected viral hepatitis-specialized treatment centres spanning the country to enhance treatment access. Practice guidelines suited to local circumstances were issued, are regularly updated, and are applied in all affiliated centres. This review illustrates the model and the successful Egyptian experience. It serves as an exemplar for states, organizations and policy-makers setting up programmes for the care and management of people with hepatitis C. © 2017 John Wiley & Sons Ltd.
Vego, Goran; Kucar-Dragicević, Savka; Koprivanac, Natalija
2008-11-01
The efficiency of providing a waste management system in the coastal part of Croatia, consisting of four Dalmatian counties, has been modelled. Two multi-criteria decision-making (MCDM) methods, PROMETHEE and GAIA, were applied to assist with the systematic analysis and evaluation of the alternatives. The analysis covered two levels: first, the potential number of waste management centres resulting from possible inter-county cooperation; and second, the relative merits of siting waste management centres in the coastal or hinterland zone. The problem was analysed according to several criteria, with ecological, economic, social and functional criteria sets identified as relevant to the decision-making process. The PROMETHEE and GAIA methods were shown to be efficient tools for analysing the problem considered. This approach provided new insights into waste management planning at the strategic level, and gave a reason for rethinking some of the existing strategic waste management documents in Croatia.
Baseline evaluation of nutritional status and government feeding programs in Chiclayo, Peru.
Gross, Rainer; Lechtig, Aarón; López de Romaña, Daniel
2006-01-01
Because of the rapid growth of the urban population in Peru, food and nutrition insecurity will occur increasingly in this population. For appropriate policy setting and programming, the food and nutrition situation of the urban poor requires better understanding. To gain information about the nature, magnitude, severity, and causes of the nutritional problems of the population in low-income areas of the city of Chiclayo, Peru, a cross-sectional nutrition survey was conducted in 1,604 households, covering children under 5 years of age and their parents. The prevalence rates of stunting, wasting, overweight, and anemia in children were 15.4%, 1.3%, 4.6%, and 65.7%, respectively; one third of adults were overweight, and one tenth were obese; 2.1% of the mothers were underweight; and 34.3% of mothers and 12.2% of fathers had anemia. Governmental feeding programs did not address these problems adequately. Interventions must have adequate targeting; address appropriate responses at the household, community, and national levels; and reduce stunting, obesity, and iron-deficiency anemia.
NASA Astrophysics Data System (ADS)
McDermott, Lillian C.; Shaffer, Peter S.; Somers, Mark D.
1994-01-01
A problem on the Atwood's machine is often introduced early in the teaching of dynamics to demonstrate the application of Newton's laws to the motion of a compound system. In a series of preliminary studies, student understanding of the Atwood's machine was examined after this topic had been covered in a typical calculus-based course. Analysis of the data revealed that many students had serious difficulties with the acceleration, the internal and external forces, and the role of the string. The present study was undertaken to obtain more detailed information about the nature and prevalence of these difficulties and thus provide a sound basis for the design of more effective instruction. The context for the investigation is a group of related problems involving less complicated compound systems. Specific examples illustrate how this research, which was conducted primarily in a classroom setting, has served as a guide in the development of tutorial materials to supplement the lectures and textbook in a standard introductory course.
Accuracy Evaluation of Two Global Land Cover Data Sets Over Wetlands of China
NASA Astrophysics Data System (ADS)
Niu, Z. G.; Shan, Y. X.; Gong, P.
2012-07-01
Although wetlands are well known as one of the most important ecosystems in the world, there are still few global wetland mapping efforts at present. To evaluate the accuracy of the wetland-related types in both the Global Land Cover 2000 (GLC2000) data set and the MODIS land cover data set (MOD12Q1), we used the China wetland map of 2000, which was interpreted manually from Landsat TM images, to examine the precision of these global land cover data sets in two respects (class area accuracy and spatial agreement) across China. The results show that the area consistency coefficients of wetland-related types between the two global data sets and the reference data are 77.27% and 56.85%, respectively. However, the overall accuracy of the relevant wetland types in GLC2000 is only 19.81% based on the confusion matrix of spatial consistency, and similarly, that of MOD12Q1 is merely 18.91%. Furthermore, the accuracy for peatlands is much lower than that for water bodies according to the per-pixel comparison. The categories where errors occurred frequently include grasslands, croplands, bare lands and parts of the woodland classes (deciduous coniferous forest, deciduous broadleaf forest and open shrubland). Possible reasons for the low precision of the wetland-related land cover types include (1) the different aims of the various products and therefore the inconsistent wetland definitions in their classification systems; (2) the coarse spatial resolution of the satellite images used in the global data sets; and (3) discrepancies in the dates when images were acquired between the global data sets and the reference data. Overall, the unsatisfactory results highlight that more attention should be paid to the application of these two global data products, especially for wetland-relevant types across China.
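The per-pixel spatial-agreement check reduces to a confusion matrix between the reference map and each global product. A minimal sketch follows; it assumes the two maps are already co-registered and resampled to a common grid, which the real evaluation must handle first:

```python
def overall_accuracy(reference, classified, labels):
    """Build a confusion matrix from two co-registered label sequences
    and return (matrix, overall accuracy).

    Rows index the reference labels, columns the classified labels;
    overall accuracy is the trace divided by the total pixel count.
    Assumes at least one pixel and that every value appears in `labels`.
    """
    index = {lab: i for i, lab in enumerate(labels)}
    n = len(labels)
    m = [[0] * n for _ in range(n)]
    total = 0
    for r, c in zip(reference, classified):
        m[index[r]][index[c]] += 1
        total += 1
    correct = sum(m[i][i] for i in range(n))
    return m, correct / total
```

Per-class (producer's/user's) accuracies, used above to separate peatlands from water bodies, come from the same matrix's row and column sums.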
CLIPSITS - CLIPS INTELLIGENT TUTORING SYSTEM
NASA Technical Reports Server (NTRS)
Riley, G.
1994-01-01
The CLIPS Intelligent Tutoring System (CLIPSITS) is designed to be used to learn CLIPS, the C-language Integrated Production System expert system shell developed by the Software Technology Branch at Johnson Space Center. The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. This version of CLIPSITS is compatible with the Version 4.2 and 4.3 CLIPS User's Guide. However, the program does not cover any new features of CLIPS v4.3 that were added since the release of v4.2. The chapter numbers in the CLIPS User's Guide correspond directly with the lesson numbers in CLIPSITS. Each lesson in the program contains anywhere from 1 to 10 problems. Most of these have multiple parts. The student is given a subset of these problems from each lesson to work. The actual number of problems presented depends on how well the student masters the previous problem(s). The progression through these lessons is maintained in a personalized file under the student's name. As with most computer languages, there is usually more than one way to solve a problem. CLIPSITS attempts to be as flexible as possible and to allow as many correct solutions as possible. CLIPSITS gives the student the option of setting his/her own colors for the screen interface and the option of redefining special keystroke combinations used within the program. CLIPSITS requires an IBM PC compatible with 640K RAM and optional 2 or 3 button mouse. A 286- or 386-based machine is preferable. Performance will be somewhat slower on an XT class machine. The program must be installed on a hard disk with 825 KB space available. The program was developed in 1989. 
The standard distribution medium is three 5.25" IBM PC DOS format diskettes. The program is also sold bundled with CLIPS for a special combined price as COS-10025. NOTE: Only the executable code is distributed. Supporting documentation is included on the diskettes. IBM, IBM PC and XT are registered trademarks of International Business Machines Corporation.
Using movies in family medicine teaching: A reference to EURACT Educational Agenda
Švab, Igor
2017-01-01
Introduction: Cinemeducation is a teaching method where popular movies or movie clips are used. We aimed to determine whether family physicians' competencies as listed in the Educational Agenda produced by the European Academy of Teachers in General Practice/Family Medicine (EURACT) can be found in movies, and to propose a template for teaching by these movies. Methods: A group of family medicine teachers provided a list of movies that they would use in cinemeducation. The movies were categorised according to the key family medicine competencies, thus creating a framework of competences covered by different movies. These key competencies are Primary care management, Person-centred care, Specific problem-solving skills, Comprehensive approach, Community orientation, and Holistic approach. Results: The list consisted of 17 movies. Nine covered primary care management. Person-centred care was covered in 13 movies. Eight movies covered specific problem-solving skills. Comprehensive approach was covered in five movies. Five movies covered community orientation. Holistic approach was covered in five movies. Conclusions: All key family medicine competencies listed in the Educational Agenda can be taught using movies. Our results can serve as a template for teachers on how to use any appropriate movies in family medicine education. PMID:28289469
Using movies in family medicine teaching: A reference to EURACT Educational Agenda.
Klemenc Ketiš, Zalika; Švab, Igor
2017-06-01
Cinemeducation is a teaching method where popular movies or movie clips are used. We aimed to determine whether family physicians' competencies as listed in the Educational Agenda produced by the European Academy of Teachers in General Practice/Family Medicine (EURACT) can be found in movies, and to propose a template for teaching by these movies. A group of family medicine teachers provided a list of movies that they would use in cinemeducation. The movies were categorised according to the key family medicine competencies, thus creating a framework of competences, covered by different movies. These key competencies are Primary care management, Person-centred care, Specific problem-solving skills, Comprehensive approach, Community orientation, and Holistic approach. The list consisted of 17 movies. Nine covered primary care management. Person-centred care was covered in 13 movies. Eight movies covered specific problem-solving skills. Comprehensive approach was covered in five movies. Five movies covered community orientation. Holistic approach was covered in five movies. All key family medicine competencies listed in the Educational Agenda can be taught using movies. Our results can serve as a template for teachers on how to use any appropriate movies in family medicine education.
Survey of Technologies for Monitoring Containment Liners and Covers
The report provides information on innovative long-term monitoring technologies to detect contaminant releases beneath a liner containment system and identify potential problems with the integrity of final containment covers.
Environmental problems caused by Istanbul subway excavation and suggestions for remediation
NASA Astrophysics Data System (ADS)
Ocak, Ibrahim
2009-10-01
Many environmental problems caused by subway excavations have inevitably become an important issue in city life. These problems can be categorized as transporting and stocking of excavated material, traffic jams, noise, vibrations, piles of dust and mud, and lack of supplies. Although these problems cause many difficulties, the most pressing for a big city like Istanbul is excavation, since the other listed difficulties result from it. Moreover, these problems are environmentally and regionally restricted to the period over which construction projects are underway and disappear when construction is finished. Currently, in Istanbul, there are nine subway construction projects in operation, covering approximately 73 km in length, with over 200 km more to be constructed in the near future. The amount of material excavated from the ongoing construction projects is approximately 12 million m3. In this study, the problems caused by subway excavation, primarily the problem of excavation waste (EW), are analyzed and suggestions for remediation are offered.
Estimation of neutron energy distributions from prompt gamma emissions
NASA Astrophysics Data System (ADS)
Panikkath, Priyada; Udupi, Ashwini; Sarkar, P. K.
2017-11-01
A technique for estimating the incident neutron energy distribution from the prompt gamma intensities emitted by a system exposed to neutrons is presented. The emitted prompt gamma intensities, or the measured photo peaks in a gamma detector, are related to the incident neutron energy distribution through a convolution with the response of the system to mono-energetic neutrons. The system studied here is a cylinder of high-density polyethylene (HDPE) placed inside another cylinder of borated HDPE (BHDPE) with an outer Pb cover, exposed to neutrons. The five emitted prompt gamma peaks from hydrogen, boron, carbon and lead can be utilized to unfold the incident neutron energy distribution as an under-determined deconvolution problem. Such an under-determined set of equations is solved using the genetic-algorithm-based Monte Carlo deconvolution code GAMCD. Feasibility of the proposed technique is demonstrated theoretically using the Monte Carlo calculated response matrix and intensities of emitted prompt gammas from the Pb-covered BHDPE-HDPE system for several incident neutron spectra spanning different energy ranges.
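The unfolding step solves measured = R * phi for a non-negative spectrum phi with fewer equations (gamma peaks) than unknowns (energy bins). The sketch below uses a bare-bones stochastic search as a stand-in for the GA-based GAMCD code; the matrices and all parameters are toy values for illustration only:

```python
import random

def stochastic_unfold(response, measured, n_bins, iters=20000, seed=1):
    """Find one non-negative spectrum phi consistent with the measured
    prompt gamma intensities: measured[j] ~ sum_i response[j][i]*phi[i].

    With fewer peaks than energy bins the system is under-determined,
    so this accepts any low-residual solution found by a simple
    accept-if-better coordinate search (a crude stand-in for a
    genetic-algorithm-based Monte Carlo unfolding).
    """
    rng = random.Random(seed)

    def residual(phi):
        return sum((sum(response[j][i] * phi[i] for i in range(n_bins))
                    - measured[j]) ** 2 for j in range(len(measured)))

    phi = [1.0] * n_bins          # flat initial guess
    best = residual(phi)
    for _ in range(iters):
        i = rng.randrange(n_bins)
        trial = list(phi)
        # Perturb one bin, clipping to keep the spectrum non-negative
        trial[i] = max(0.0, trial[i] + rng.gauss(0.0, 0.1))
        r = residual(trial)
        if r < best:
            phi, best = trial, r
    return phi, best
```

Because the system is under-determined, many spectra fit the data equally well; a real unfolding code constrains the search further (e.g. smoothness or prior spectral shape) to pick a physical solution.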
Prince Albert National Park Forest Cover Data in Vector Format
NASA Technical Reports Server (NTRS)
Fitzsimmons, Michael; Nickeson, Jaime; Hall, Forrest G. (Editor)
2000-01-01
This data set provides detailed canopy, understory, and ground cover height, density, and condition information for PANP in the western portion of the BOReal Ecosystem-Atmosphere Study (BOREAS) Southern Study Area (SSA) in vector form. The original biophysical resource data set was produced in 1978 based on aerial photographs taken in 1968 and field work conducted in the mid-1970s, and PANP's update/revision of the data set was completed in 1994. The data are stored in an ARC/INFO export file.
USDA-ARS?s Scientific Manuscript database
The compilation of global Landsat data-sets and the ever-lowering costs of computing now make it feasible to monitor the Earth’s land cover at Landsat resolutions of 30 m. In this article, we describe the methods to create global products of forest cover and cover change at Landsat resolutions. Neve...
Code of Federal Regulations, 2010 CFR
2010-01-01
... include a provision setting forth the type of events that are covered events under the contract. The type...) Litigation in State, Federal, local, or tribal courts, including appeals of Commission decisions related to..., including but not limited to the following types of events: (i) The sponsor's failure to comply with...
USDA-ARS?s Scientific Manuscript database
A retrospective land cover analysis covering the time period from the early 1970s to early 1990s was conducted to gain a sense of the dynamics of land cover changes on the Little Washita River and Fort Cobb Reservoir experimental watersheds (LWREW, FCREW), located in southwestern Oklahoma. This stu...
Rachel Riemann; Jarlath O' Neil-Dunne; Greg C. Liknes
2012-01-01
Tree canopy cover and canopy height information are essential for estimating volume, biomass, and carbon; defining forest cover; and characterizing wildlife habitat. The amount of tree canopy cover also influences water quality and quantity in both rural and urban settings. Tree canopy cover and canopy height are currently collected at FIA plots either in the field or...
NASA Astrophysics Data System (ADS)
Tian, Y.; Dickinson, R. E.; Zhou, L.; Shaikh, M.
2004-10-01
This paper uses the Community Land Model (CLM2) to investigate the improvements that a new land surface data set, created from multiple high-quality Collection 4 Moderate Resolution Imaging Spectroradiometer data sets of leaf area index (LAI), plant functional type, and vegetation continuous fields, brings to modeled land surface variables. The previous land surface data in CLM2 underestimate LAI and overestimate the percent cover of grass/crop over most of the globe. For snow-covered regions with abundant solar energy, the increased LAI and percent cover of tree/shrub in the new data set decrease the percent cover of surface snow and increase net radiation, and thus ground and surface (2-m) air temperature, which removes most of the model's cold bias. For snow-free regions, the increased LAI and the change in percent cover from grass/crop to tree or shrub decrease ground and surface air temperature by converting most of the increased net radiation to latent heat flux, which reduces the model's warm bias. Furthermore, the new data set greatly decreases ground evaporation and increases canopy evapotranspiration over tropical forests, especially during the wet season, owing to the higher LAI and greater tree cover in the new data set. This brings the simulated ground evaporation and canopy evapotranspiration closer to reality and also reduces the warm biases over tropical regions.
Individualized Math Problems in Whole Numbers. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. Problems in this set require computations involving whole numbers.…
PRISM 8 degrees X 10 degrees North Hemisphere paleoclimate reconstruction; digital data
Barron, John A.; Cronin, Thomas M.; Dowsett, Harry J.; Fleming, Farley R.; Holtz, Thomas R.; Ishman, Scott E.; Poore, Richard Z.; Thompson, Robert S.; Willard, Debra A.
1994-01-01
The PRISM 8°x10° data set represents several years of investigation by PRISM (Pliocene Research, Interpretation, and Synoptic Mapping) Project members. One of the goals of PRISM is to produce time-slice reconstructions of intervals of warmer-than-modern climate within the Pliocene Epoch. The first of these was chosen to be at 3.0 Ma (time scale of Berggren et al., 1985) and is published in Global and Planetary Change (Dowsett et al., 1994). This document contains the actual data sets and a brief explanation of how they were constructed. For paleoenvironmental interpretations and discussion of each data set, see Dowsett et al., in press. The data sets include sea level, land ice distribution, vegetation or land cover, sea surface temperature, and sea-ice cover matrices. This reconstruction of Middle Pliocene climate is organized as a series of data sets representing different environmental attributes. The data sets are designed for use with the GISS Model II atmospheric general circulation model (GCM) at an 8°x10° resolution (Hansen et al., 1983). The first step in documenting the Pliocene climate involves assigning an appropriate fraction of land versus ocean to each grid box. Following the grid-cell-by-grid-cell land versus ocean allocations, winter and summer sea ice coverage of ocean areas is assigned, and then winter and summer sea surface temperatures are assigned to open ocean areas. Average land ice cover is recorded for land areas, and land areas not covered by ice are then assigned proportions of six vegetation or land cover categories modified from Hansen et al. (1983).
Two Methods for Efficient Solution of the Hitting-Set Problem
NASA Technical Reports Server (NTRS)
Vatan, Farrokh; Fijany, Amir
2005-01-01
A paper addresses much of the same subject matter as that of Fast Algorithms for Model-Based Diagnosis (NPO-30582), which appears elsewhere in this issue of NASA Tech Briefs. However, in the paper, the emphasis is more on the hitting-set problem (also known as the transversal problem), which is well known among experts in combinatorics. The authors' primary interest in the hitting-set problem lies in its connection to the diagnosis problem: it is a theorem of model-based diagnosis that, in the set-theory representation of the components of a system, the minimal diagnoses of a system are the minimal hitting sets of the system. In the paper, the hitting-set problem (and, hence, the diagnosis problem) is translated from a combinatorial to a computational problem by mapping it onto the Boolean-satisfiability and integer-programming problems. The paper goes on to describe developments nearly identical to those summarized in the cited companion NASA Tech Briefs article, including the utilization of Boolean-satisfiability and integer-programming techniques to reduce the computation time and/or memory needed to solve the hitting-set problem.
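The theorem that minimal diagnoses are exactly the minimal hitting sets of the conflict sets can be checked on toy instances with a brute-force enumerator. This sketch is exponential and exists only to make the definition concrete; the paper's contribution is precisely the scalable mapping onto SAT and integer programming instead:

```python
from itertools import chain, combinations

def minimal_hitting_sets(conflict_sets):
    """Enumerate all minimal hitting sets of a family of sets.

    A hitting set intersects every set in the family; it is minimal if
    no proper subset also does. Candidates are generated by increasing
    size, so any superset of an already-found hitting set is skipped.
    In model-based diagnosis, the inputs are the conflict sets and the
    outputs are the minimal diagnoses.
    """
    universe = sorted(set(chain.from_iterable(conflict_sets)))
    found = []
    for size in range(1, len(universe) + 1):
        for cand in combinations(universe, size):
            s = set(cand)
            hits_all = all(s & c for c in conflict_sets)
            is_minimal = not any(set(f) <= s for f in found)
            if hits_all and is_minimal:
                found.append(cand)
    return [set(f) for f in found]
```

For conflicts {1,2} and {2,3}, the minimal hitting sets (diagnoses) are {2} and {1,3}; the ILP view would instead minimize the sum of 0/1 component variables subject to one covering constraint per conflict set.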
Developing consistent Landsat data sets for large area applications: the MRLC 2001 protocol
Chander, G.; Huang, Chengquan; Yang, Limin; Homer, Collin G.; Larson, C.
2009-01-01
One of the major efforts in large area land cover mapping over the last two decades was the completion of two U.S. National Land Cover Data sets (NLCD), developed with nominal 1992 and 2001 Landsat imagery under the auspices of the Multi-Resolution Land Characteristics (MRLC) Consortium. Following the successful generation of NLCD 1992, a second-generation MRLC initiative was launched with two primary goals: (1) to develop a consistent Landsat imagery data set for the U.S. and (2) to develop a second-generation National Land Cover Database (NLCD 2001). One of the key enhancements was the formulation of an image preprocessing protocol and the implementation of a consistent image processing method. The core data set of the NLCD 2001 database consists of Landsat 7 Enhanced Thematic Mapper Plus (ETM+) images. This letter details the procedures for processing the original ETM+ images and more recent scenes added to the database. NLCD 2001 products include Anderson Level II land cover classes, percent tree canopy, and percent urban imperviousness at 30-m resolution derived from Landsat imagery. The products are freely available for download to the general public from the MRLC Consortium Web site at http://www.mrlc.gov.
Cover crops to improve soil health and pollinator habitat in nut orchards: Part II
Jerry Van Sambeek
2017-01-01
Integrating cover crops into a nut orchard can have some unique benefits and problems not found when cover crops are used during the fallow period between cash crops. Studies show ground covers can reduce hardwood tree growth anywhere from a few percent to more than 70 percent in the case of tall fescue. This means if it takes 3 years to put on one inch of diameter growth...
Use of Cover Crops in Hardwood Production
Randy Rentz
2005-01-01
Cover crops are as essential a practice in hardwood production as in pine production or any other nursery operation. Without proper cover crop rotation in a nursery plan, we open ourselves up to an array of problems: more diseases, wrong pH, more weeds, reduced fertility, and less downward percolation of soil moisture due, in part, to compaction....
Facilitating the exploitation of ERTS imagery using snow enhancement techniques
NASA Technical Reports Server (NTRS)
Wobber, F. J.; Martin, K. (Principal Investigator); Amato, R. V.; Leshendok, T.
1973-01-01
The author has identified the following significant results. Comparative analysis of snow-free and snow-covered imagery of the New England Test Area has resulted in a larger number of lineaments mapped from snow-covered imagery in three out of four sets of comparative imagery. Analysts unfamiliar with the New England Test Area were utilized; the quality of imagery was independently judged to be uniform. In all image sets, a greater total length of lineaments was mapped with the snow-covered imagery. The value of this technique for fracture mapping in areas with thick soil cover is suggested. A number of potentially useful environmental applications of snow enhancement related to such areas as mining, land use, and hydrology have been identified.
[On risk-oriented model of sanitary epidemiologic surveillance in occupational hygiene].
Zaitseval, N V; Mai, I V; Kostarev, V G; Bashketova, N S
2015-01-01
In 2015, the Federal Service for Surveillance on Consumer Rights Protection and Human Well-Being set a task to organize the planned work of its regional agencies on the basis of a risk-oriented model of control and supervision. Based on the results of a pilot project in the Rospotrebnadzor Departments of Perm Krai and St. Petersburg, the article covers methodological approaches to the classification of objects subject to surveillance in occupational hygiene. The classification considers the likelihood of sanitary law violations, the severity of their consequences, and the number of workers exposed to risk factors, including hazardous work conditions. The authors give recommendations on the periodicity and forms of planned inspections based on the evaluation of potential risk to human health, and identify problems that must be solved to implement the risk-oriented model of surveillance.
Parallel PAB3D: Experiences with a Prototype in MPI
NASA Technical Reports Server (NTRS)
Guerinoni, Fabio; Abdol-Hamid, Khaled S.; Pao, S. Paul
1998-01-01
PAB3D is a three-dimensional Navier-Stokes solver that has gained acceptance in the research and industrial communities. Its computational domain is a set of disjoint blocks covering the physical domain. This is the first report on the implementation of PAB3D using the Message Passing Interface (MPI), a standard for parallel processing. We discuss briefly the characteristics of the code and define a prototype for testing. The principal data structure used for communication is derived from preprocessing "patching". We describe a simple interface (COMMSYS) for MPI communication, and some general techniques likely to be encountered when working on problems of this nature. Last, we identify levels of improvement from the current version and outline future work.
Does inequality in health impede economic growth?
Grimm, Michael
2011-01-01
This paper investigates the effects of inequality in health on economic growth in low and middle income countries. The empirical part of the paper uses an original cross-national panel data set covering 62 low and middle income countries over the period 1985 to 2007. I find a substantial and relatively robust negative effect of health inequality on income levels and income growth controlling for life expectancy, country and time fixed-effects and a large number of other effects that have been shown to matter for growth. The effect also holds if health inequality is instrumented to circumvent a potential problem of reverse causality. Hence, reducing inequality in the access to health care and to health-related information can make a substantial contribution to economic growth.
A data reduction, management, and analysis system for a 10-terabyte data set
NASA Technical Reports Server (NTRS)
DeMajistre, R.; Suther, L.
1995-01-01
Within 12 months a 5-year space-based research investigation with an estimated daily data volume of 10 to 15 gigabytes will be launched. Our instrument/analysis team will analyze 2 to 8 gigabytes per day from this mission. Most of these data will be spatial and multispectral collected from nine sensors covering the UV/Visible/NlR spectrum. The volume and diversity of these data and the nature of its analysis require a very robust reduction and management system. This paper is a summary of the systems requirements and a high-level description of a solution. The paper is intended as a case study of the problems and potential solutions faced by the new generation of Earth observation data support systems.
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
[Standard operating procedures in ethic committees].
Czarkowski, Marek
2006-02-01
Polish ethics committees need to work together in order to maintain and develop high quality standards in the protection of human subjects. To exchange knowledge, know-how, and information, Polish ethics committees need to implement standard operating procedures. Such procedures should improve the quality and proficiency of all types of ethics committee activities. Standard operating procedures should cover such important problems as conflicts of interest, trial insurance, or the election of ethics committees. The opinions of experts who have been reviewing medical research projects for several years may prove especially valuable in this setting. Governmental initiatives and the creation of a forum for Polish ethics committees are essential for the effective standardisation, coordination, and implementation of procedures in regional ethics committees. These projects need support via public funding from the authorities.
NASA Astrophysics Data System (ADS)
Liu, GaiYun; Chao, Daniel Yuh
2015-08-01
To date, research on the supervisor design for flexible manufacturing systems focuses on speeding up the computation of optimal (maximally permissive) liveness-enforcing controllers. Recent deadlock prevention policies for systems of simple sequential processes with resources (S3PR) reduce the computation burden by considering only the minimal portion of all first-met bad markings (FBMs). Maximal permissiveness is ensured by not forbidding any live state. This paper proposes a method to further reduce the size of minimal set of FBMs to efficiently solve integer linear programming problems while maintaining maximal permissiveness using a vector-covering approach. This paper improves the previous work and achieves the simplest structure with the minimal number of monitors.
Uncertainty quantification and propagation in nuclear density functional theory
Schunck, N.; McDonnell, J. D.; Higdon, D.; ...
2015-12-23
Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While on-going efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this study, we review recent efforts to quantify the related uncertainties, and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties and Bayesian inference methods. Illustrative examples are taken from the literature.
Radar polarimetry - Analysis tools and applications
NASA Technical Reports Server (NTRS)
Evans, Diane L.; Farr, Tom G.; Van Zyl, Jakob J.; Zebker, Howard A.
1988-01-01
The authors have developed several techniques to analyze polarimetric radar data from the NASA/JPL airborne SAR for earth science applications. The techniques determine the heterogeneity of scatterers within subregions, optimize the return power from these areas, and identify probable scattering mechanisms for each pixel in a radar image. These techniques are applied to the discrimination and characterization of geologic surfaces and vegetation cover, and it is found that their utility varies depending on the terrain type. It is concluded that there are several classes of problems amenable to single-frequency polarimetric data analysis, including characterization of surface roughness and vegetation structure, and estimation of vegetation density. Polarimetric radar remote sensing can thus be a useful tool for monitoring a set of earth science parameters.
Making Safe Surgery Affordable: Design of a Surgical Drill Cover System for Scale.
Buchan, Lawrence L; Black, Marianne S; Cancilla, Michael A; Huisman, Elise S; Kooyman, Jeremy J R; Nelson, Scott C; OʼHara, Nathan N; OʼBrien, Peter J; Blachut, Piotr A
2015-10-01
Many surgeons in low-resource settings do not have access to safe, affordable, or reliable surgical drilling tools. Surgeons often resort to nonsterile hardware drills because they are affordable, robust, and efficient, but they are impossible to sterilize using steam. A promising alternative is to use a Drill Cover system (a sterilizable fabric bag plus surgical chuck adapter) so that a nonsterile hardware drill can be used safely for surgical bone drilling. Our objective was to design a safe, effective, affordable Drill Cover system for scale in low-resource settings. We designed our device based on feedback from users at Mulago Hospital (Kampala, Uganda) and focused on 3 main aspects. First, the design included a sealed barrier between the surgical field and hardware drill that withstands pressurized fluid. Second, the selected hardware drill had a maximum speed of 1050 rpm to match common surgical drills and reduce risk of necrosis. Third, the fabric cover was optimized for ease of assembly while maintaining a sterile technique. Furthermore, with the Drill Cover approach, multiple Drill Covers can be provided with a single battery-powered drill in a "kit," so that the drill can be used in back-to-back surgeries without requiring immediate sterilization. The Drill Cover design presented here provides a proof-of-concept for a product that can be commercialized, produced at scale, and used in low-resource settings globally to improve access to safe surgery.
Unified Health Gamification can significantly improve well-being in corporate environments.
Shahrestani, Arash; Van Gorp, Pieter; Le Blanc, Pascale; Greidanus, Fabrizio; de Groot, Kristel; Leermakers, Jelle
2017-07-01
There is a multitude of mHealth applications that aim to solve societal health problems by stimulating specific types of physical activities via gamification. However, physical health activities cover just one of the three World Health Organization (WHO) dimensions of health. This paper introduces the novel notion of Unified Health Gamification (UHG), which covers, in addition to physical health, social and cognitive health and well-being. Instead of rewarding activities in the three WHO dimensions using different mHealth competitions, UHG combines the scores for such activities on unified leaderboards and lets people interact in social circles beyond personal interests. This approach is promising in corporate environments since UHG can connect employees with intrinsic motivation for physical health with those who have quite different interests. To evaluate this approach, we realized an app prototype and evaluated it in two corporate pilot studies. In total, eighteen pilot users participated voluntarily for six weeks. Half of the participants were recruited from an occupational health setting and the other half from a treatment setting. Our results suggest that the UHG principles are worth further investigation: various positive health effects were found based on a validated survey. Mean mental health improved significantly at one pilot location, and at the level of individual pilot participants multiple other effects were found to be significant: among others, significant mental health improvements were found for 28% of the participants. Most participants intended to use the app beyond the pilot, especially if it were further developed.
NASA Astrophysics Data System (ADS)
Ioannidis, Eleftherios; Lolis, Christos J.; Papadimas, Christos D.; Hatzianastassiou, Nikolaos; Bartzokas, Aristides
2017-04-01
The seasonal variability of total cloud cover in the Mediterranean region is examined for the period 1948-2014 using a multivariate statistical methodology. The data used consist of: i) daily gridded (1.875°x1.905°) values of total cloud cover over the broader Mediterranean region for the 66-year period 1948-2014, obtained from NCEP/NCAR Reanalysis data set, ii) daily gridded (1°x1°) values of total cloud cover for the period 2003-2014 obtained from the Moderate resolution Imaging Spectroradiometer (MODIS) satellite data set and iii) daily station cloud cover data for the period 2003-2014 obtained from the European Climate Assessment & Dataset (ECA&D). At first, the multivariate statistical method of Factor Analysis (S-mode) with varimax rotation is applied as a dimensionality reduction tool on the mean day to day intra-annual variation of NCEP/NCAR cloud cover for the period 1948-2014. According to the results, three main modes of intra-annual variation of cloud cover are found. The first mode is characterized by a winter maximum and a summer minimum and prevails mainly over the sea; a weak see-saw teleconnection over the Alps represents the opposite intra-annual marching. The second mode presents maxima in early autumn and late spring, and minima in late summer and winter, and prevails over the SW Europe and NW Africa inland regions. The third mode shows a maximum in June and a minimum in October and prevails over the eastern part of central Europe. Next, the mean day to day intra-annual variation of NCEP/NCAR cloud cover over the core regions of the above factors is calculated for the entire period 1948-2014 and the three 22-year sub-periods 1948-70, 1970-92 and 1992-2014. A comparison is carried out between each of the three sub-periods and the total period in order to reveal possible long-term changes in seasonal march of total cloud cover. 
The results show that cloud cover was reduced above all regions during the last 22-year sub-period 1992-2014 throughout the year, but especially in winter. Finally, given the different nature of the utilized NCEP/NCAR (reanalysis), MODIS (satellite) and ECA&D (station) cloud cover data sets, an inter-comparison is made among them as concerns the intra-annual variation of cloud cover for the common period 2003-2014. The results show good agreement among the three data sets, with some differences in magnitude during the cold period of the year.
Malkin, Robert; Keane, Allison
2010-07-01
Much of the laboratory and medical equipment in resource-poor settings is out-of-service. The most commonly cited reasons are (1) a lack of spare parts and (2) a lack of highly trained technicians. However, there is little data to support these hypotheses, or to generate evidence-based solutions to the problem. We studied 2,849 equipment-repair requests (of which 2,529 were out-of-service medical equipment) from 60 resource-poor hospitals located in 11 nations in Africa, Europe, Asia, and Central America. Each piece of equipment was analyzed by an engineer or an engineering student and a repair was attempted using only locally available materials. If the piece was placed back into service, we assumed that the engineer's problem analysis was correct. A total of 1,821 pieces of medical equipment were placed back into service, or 72%, without requiring the use of imported spare parts. Of those pieces repaired, 1,704 were sufficiently documented to determine what knowledge was required to place the equipment back into service. We found that six domains of knowledge were required to accomplish 99% of the repairs: electrical (18%), mechanical (18%), power supply (14%), plumbing (19%), motors (5%), and installation or user training (25%). A further analysis of the domains shows that 66% of the out-of-service equipment was placed back into service using only 107 skills covering basic knowledge in each domain; far less knowledge than that required of a biomedical engineer or biomedical engineering technician. We conclude that a great majority of laboratory and medical equipment can be put back into service without importing spare parts and using only basic knowledge. Capacity building in resource-poor settings should first focus on a limited set of knowledge; a body of knowledge that we call the biomedical technician's assistant (BTA). 
This data set suggests that a supported BTA could place 66% of the out-of-service laboratory and medical equipment in their hospital back into service.
Interleaved Observation Execution and Rescheduling on Earth Observing Systems
NASA Technical Reports Server (NTRS)
Khatib, Lina; Frank, Jeremy; Smith, David; Morris, Robert; Dungan, Jennifer
2003-01-01
Observation scheduling for Earth orbiting satellites solves the following problem: given a set of requests for images of the Earth, a set of instruments for acquiring those images distributed on a collection of orbiting satellites, and a set of temporal and resource constraints, generate a set of assignments of instruments and viewing times to those requests that satisfy those constraints. Observation scheduling is often construed as a constrained optimization problem with the objective of maximizing the overall utility of the science data acquired. The utility of an image is typically based on the intrinsic importance of acquiring it (for example, its importance in meeting a mission or science campaign objective) as well as the expected value of the data given current viewing conditions (for example, if the image is occluded by clouds, its value is usually diminished). Currently, science observation scheduling for Earth Observing Systems is done on the ground, for periods covering a day or more. Schedules are uplinked to the satellites and are executed rigorously. An alternative to this scenario is to do some of the decision-making about what images are to be acquired on-board. The principal argument for this capability is that the desirability of making an observation can change dynamically, because of changes in meteorological conditions (e.g. cloud cover), unforeseen events such as fires, floods, or volcanic eruptions, or un-expected changes in satellite or ground station capability. Furthermore, since satellites can only communicate with the ground between 5% to 10% of the time, it may be infeasible to make the desired changes to the schedule on the ground, and uplink the revisions in time for the on-board system to execute them. Examples of scenarios that motivate an on-board capability for revising schedules include the following. First, if a desired visual scene is completely obscured by clouds, then there is little point in taking it. 
In this case, satellite resources, such as power and storage space can be better utilized taking another image that is higher quality. Second, if an unexpected but important event occurs (such as a fire, flood, or volcanic eruption), there may be good reason to take images of it, instead of expending satellite resources on some of the lower priority scheduled observations. Finally, if there is unexpected loss of capability, it may be impossible to carry out the schedule of planned observations. For example, if a ground station goes down temporarily, a satellite may not be able to free up enough storage space to continue with the remaining schedule of observations. This paper describes an approach for interleaving execution of observation schedules with dynamic schedule revision based on changes to the expected utility of the acquired images. We describe the problem in detail, formulate an algorithm for interleaving schedule revision and execution, and discuss refinements to the algorithm based on the need for search efficiency. We summarize with a brief discussion of the tests performed on the system.
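The utility-driven selection described in this abstract can be caricatured with a tiny greedy scheduler. This is only an illustrative sketch under strong assumptions (single instrument, fixed viewing windows, precomputed utilities; all names are hypothetical), not the authors' algorithm:

```python
from dataclasses import dataclass

@dataclass
class Request:
    name: str
    start: float    # viewing-window start (hypothetical time units)
    end: float
    utility: float  # intrinsic priority x expected data value (e.g. cloud score)

def greedy_schedule(requests):
    """Pick requests in decreasing utility, skipping any whose viewing
    window overlaps one already scheduled (single-instrument sketch)."""
    chosen = []
    for r in sorted(requests, key=lambda r: -r.utility):
        if all(r.end <= c.start or r.start >= c.end for c in chosen):
            chosen.append(r)
    return chosen

reqs = [Request("fire", 0, 2, 9.0),      # unexpected event, high value
        Request("scene-a", 1, 3, 4.0),   # partly cloudy, discounted
        Request("scene-b", 3, 5, 5.0)]
print([r.name for r in greedy_schedule(reqs)])  # ['fire', 'scene-b']
```

On-board rescheduling would amount to re-running such a selection whenever a utility changes (a scene clouds over, a volcano erupts) and splicing the result into the remaining plan.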
A Model Evaluation Data Set for the Tropical ARM Sites
Jakob, Christian
2008-01-15
This data set has been derived from various ARM and external data sources with the main aim of providing modelers easy access to quality controlled data for model evaluation. The data set contains highly aggregated (in time) data from a number of sources at the tropical ARM sites at Manus and Nauru. It spans the years of 1999 and 2000. The data set contains information on downward surface radiation; surface meteorology, including precipitation; atmospheric water vapor and cloud liquid water content; hydrometeor cover as a function of height; and cloud cover, cloud optical thickness and cloud top pressure information provided by the International Satellite Cloud Climatology Project (ISCCP).
A Course on Surface Phenomena.
ERIC Educational Resources Information Center
Woods, Donald R.
1983-01-01
Describes a graduate or senior elective course combining fundamentals of surface phenomena with practical problem-solving structured around a series of case problems. Discusses topics covered and their development through acquiring new knowledge applied to the case problem, practical calculations of solutions, and applications to additional…
Abdel-Ghany, Ahmed M.; Al-Helal, Ibrahim M.; Alzahrani, Saeed M.; Alsadon, Abdullah A.; Ali, Ilias M.; Elleithy, Rabeh M.
2012-01-01
Cooling greenhouses is essential to provide a suitable environment for plant growth in arid regions characterized by brackish water resources. However, conventional cooling methods face many challenges. Filtering out near-infrared radiation (NIR) at the greenhouse cover can significantly reduce the heating load and can solve the overheating problem of the greenhouse air. This paper reviews (i) the problems of using conventional cooling methods and (ii) the advantages of greenhouse covers that incorporate NIR reflectors. This survey focuses on how the cover type affects the transmittance of photosynthetically active radiation (PAR), the reflectance or absorptance of NIR, and the greenhouse air temperature. NIR-reflecting plastic films seem to be the most suitable, low-cost and simple cover for greenhouses under arid conditions. Therefore, this review discusses how various additives should be incorporated into plastic film to increase its mechanical properties, durability and ability to stand up to extremely harsh weather. Presently, NIR-reflecting covers are able to reduce greenhouse air temperature by no more than 5°C. This reduction is not enough in regions where the ambient temperature may exceed 45°C in summer. There is a need to develop improved NIR-reflecting plastic film covers. PMID:22629223
Statistical physics of hard combinatorial optimization: Vertex cover problem
NASA Astrophysics Data System (ADS)
Zhao, Jin-Hua; Zhou, Hai-Jun
2014-07-01
Typical-case computation complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computation complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physical methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. A reader unfamiliar with the field should nevertheless be able to understand, to a large extent, the physics behind the mean field approaches and to adapt the mean field methods to other optimization problems.
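As a concrete companion to this abstract, the classic matching-based 2-approximation for minimum vertex cover takes only a few lines. It is not the message-passing approach the paper studies, just the standard combinatorial baseline for the same problem:

```python
def vertex_cover_2approx(edges):
    """Matching-based 2-approximation for minimum vertex cover:
    greedily pick an uncovered edge and add both its endpoints."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# 5-cycle: the optimum cover has 3 vertices; the bound guarantees <= 6.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
c = vertex_cover_2approx(edges)
print(sorted(c))  # [0, 1, 2, 3]
```

The chosen endpoint pairs form a matching, and any cover must contain at least one endpoint of each matched edge, which is where the factor-2 guarantee comes from.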
[Geriatric assessment. Development, status quo and perspectives].
Lüttje, D; Varwig, D; Teigel, B; Gilhaus, B
2011-08-01
Multimorbidity is typical for geriatric patients. Problems not identified in time may lead to increased hospitalisation or a prolonged hospital stay. The problems of multimorbidity are not covered by most guidelines or clinical pathways. The geriatric assessment supplements standard clinical and technical assessment. Geriatric identification screening is a basic tool for general practitioners and in emergency rooms to filter out those patients bearing a special risk. The geriatric basic assessment covers most of the problems relevant for people in old age, revealing even problems that had so far been hidden. It permits structuring a comprehensive and holistic therapeutic approach and evaluating the targets of treatment relevant for independent living and well-being. This results in a reduction of morbidity and mortality. Assessment tools focusing on pain, nutrition and frailty should be added to the standardized geriatric basic assessment in Germany.
1992-01-30
Agent Resistant Coating, retractable nylon handles, ethylene propylenediene monomer (EPDM) rubber securing straps, and a woven monofilament polypropylene...without Z-folds: overlapping layers of fabric sewn as reinforcement in the cover opposite the spreader bars. These covers were made from fabrics that...problem is not isolated to the decon litter and mattress. Comparable slippage should be expected when the vinyl mattress is used with the new nylon
Selecting indicators for patient safety at the health system level in OECD countries.
McLoughlin, Vivienne; Millar, John; Mattke, Soeren; Franca, Margarida; Jonsson, Pia Maria; Somekh, David; Bates, David
2006-09-01
Concerns about patient safety have arisen with growing documentation of the extent and nature of harm. Yet there are no robust and meaningful data that can be used internationally to assess the extent of the problem, and there are considerable methodological difficulties. This article describes a project undertaken as part of the Organization for Economic Cooperation and Development (OECD) Quality Indicator Project, which aimed at developing an initial set of patient safety indicators. Patient safety indicators from OECD countries were identified and then rated against three principal criteria: importance to patient safety, scientific soundness, and potential feasibility. Although some countries are developing multi-source monitoring systems, these are not yet mature enough for international exchange. This project reviewed routine data collections as a starting point. Of an initial set of 59 candidate indicators identified, 21 were selected which cover known areas of harm to patients. This project is an important initial step towards defining a usable set of patient safety indicators that will allow comparisons to be made internationally and will support mutual learning and quality improvement in health care. Measures of harm should be complemented over time with measures of effective improvement factors.
Fast, Safe, Propellant-Efficient Spacecraft Motion Planning Under Clohessy-Wiltshire-Hill Dynamics
NASA Technical Reports Server (NTRS)
Starek, Joseph A.; Schmerling, Edward; Maher, Gabriel D.; Barbee, Brent W.; Pavone, Marco
2016-01-01
This paper presents a sampling-based motion planning algorithm for real-time and propellant-optimized autonomous spacecraft trajectory generation in near-circular orbits. Specifically, this paper leverages recent algorithmic advances in the field of robot motion planning to the problem of impulsively actuated, propellant-optimized rendezvous and proximity operations under the Clohessy-Wiltshire-Hill dynamics model. The approach calls upon a modified version of the FMT* algorithm to grow a set of feasible trajectories over a deterministic, low-dispersion set of sample points covering the free state space. To enforce safety, the tree is only grown over the subset of actively safe samples, from which there exists a feasible one-burn collision-avoidance maneuver that can safely circularize the spacecraft orbit along its coasting arc under a given set of potential thruster failures. Key features of the proposed algorithm include 1) theoretical guarantees in terms of trajectory safety and performance, 2) amenability to real-time implementation, and 3) generality, in the sense that a large class of constraints can be handled directly. As a result, the proposed algorithm offers the potential for widespread application, ranging from on-orbit satellite servicing to orbital debris removal and autonomous inspection missions.
Condon, Lea; Pyke, David A.
2016-01-01
Biological soil crusts contribute to ecosystem functions and occupy space that could be available to invasive annual grasses. Given disturbances in the semiarid shrub steppe communities, we embarked on a set of studies to investigate restoration potential of mosses in sagebrush steppe ecosystems. We examined establishment and growth of two moss species common to the Great Basin, USA: Bryum argenteum and Syntrichia ruralis from two environmental settings (warm dry vs. cool moist). Moss fragments were inoculated into a third warm dry setting, on bare soil in spring and fall, both with and without a jute net and with and without spring irrigation. Moss cover was monitored in spring seasons of three consecutive years. Both moss species increased in cover over the winter. When Bryum received spring irrigation that was out of sync with natural precipitation patterns, moss cover increased and then crashed, taking two seasons to recover. Syntrichia did not respond to the irrigation treatment. The addition of jute net increased moss cover under all conditions, except Syntrichia following fall inoculation, which required a second winter to increase in cover. The warm dry population of Bryum combined with jute achieved on average 60% cover compared to the cool moist population that achieved only 28% cover by the end of the study. Differences were less pronounced for Syntrichia where moss from the warm dry population with jute achieved on average 51% cover compared to the cool moist population that achieved 43% cover by the end of the study. Restoration of arid land mosses may quickly protect soils from erosion while occupying sites before invasive plants. We show that higher moss cover will be achieved quickly with the addition of organic matter and when moss fragments originate from sites with a climate that is similar to that of the restoration site.
DOT National Transportation Integrated Search
2016-08-01
Attaining adequate vegetation cover along highways is important for the Nebraska Department of Roads (NDOR) to comply with the Environmental Protection Agency's (EPA's) stormwater regulations. However, low plant cover is a common problem on shoulders (...
Representative landscapes in the forested area of Canada.
Cardille, Jeffrey A; White, Joanne C; Wulder, Mike A; Holland, Tara
2012-01-01
Canada is a large nation with forested ecosystems that occupy over 60% of the national land base, and knowledge of the patterns of Canada's land cover is important to proper environmental management of this vast resource. To this end, a circa 2000 Landsat-derived land cover map of the forested ecosystems of Canada has created a new window into understanding the composition and configuration of land cover patterns in forested Canada. Strategies for summarizing such large expanses of land cover are increasingly important, as land managers work to study and preserve distinctive areas, as well as to identify representative examples of current land-cover and land-use assemblages. Meanwhile, the development of extremely efficient clustering algorithms has become increasingly important in the world of computer science, in which billions of pieces of information on the internet are continually sifted for meaning for a vast variety of applications. One recently developed clustering algorithm quickly groups large numbers of items of any type in a given data set while simultaneously selecting a representative-or "exemplar"-from each cluster. In this context, the availability of both advanced data processing methods and a nationally available set of landscape metrics presents an opportunity to identify sets of representative landscapes to better understand landscape pattern, variation, and distribution across the forested area of Canada. In this research, we first identify and provide context for a small, interpretable set of exemplar landscapes that objectively represent land cover in each of Canada's ten forested ecozones. Then, we demonstrate how this approach can be used to identify flagship and satellite long-term study areas inside and outside protected areas in the province of Ontario. These applications aid our understanding of Canada's forest while augmenting its management toolbox, and may signal a broad range of applications for this versatile approach.
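The clustering algorithm the abstract alludes to, which picks an "exemplar" from each cluster while grouping, is most likely affinity propagation (Frey and Dueck, 2007). As an illustration of the core idea only, here is a much simpler k-medoids-style routine in which every cluster is summarized by one of its own members; the function and variable names are invented for this sketch, not taken from the study:

```python
import numpy as np

def select_exemplars(X, init_medoids, n_iter=20):
    """Toy exemplar-based clustering (k-medoids flavor): each cluster is
    represented by an actual data point, its exemplar. Empty clusters are
    not handled; this is a sketch, not production code."""
    # Pairwise Euclidean distances between all rows of X.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    medoids = np.asarray(init_medoids)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)  # nearest exemplar per point
        # Within each cluster, the new exemplar minimizes total distance to members.
        new = np.array([
            members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
            for j in range(len(medoids))
            for members in [np.flatnonzero(labels == j)]
        ])
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids, labels
```

Applied to landscape-metric vectors, the returned exemplar indices would identify the representative landscapes; the real affinity-propagation algorithm additionally chooses the number of clusters automatically.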
Shi, Xiaohe; Lu, Wen-Cong; Cai, Yu-Dong; Chou, Kuo-Chen
2011-01-01
Background With the huge amount of uncharacterized protein sequences generated in the post-genomic age, it is highly desirable to develop effective computational methods for quickly and accurately predicting their functions. The information thus obtained would be very useful for both basic research and drug development in a timely manner. Methodology/Principal Findings Although many efforts have been made in this regard, most of them were based on either sequence similarity or protein-protein interaction (PPI) information. However, the former often fails to work if a query protein has no or very little sequence similarity to any function-known proteins, while the latter suffers a similar problem if the relevant PPI information is not available. In view of this, a new approach is proposed that hybridizes the PPI information and the biochemical/physicochemical features of protein sequences. The overall first-order success rates of the new predictor for the functions of mouse proteins on the training set and test set were 69.1% and 70.2%, respectively, and the success rate covered by the results of the top-4 order from a total of 24 orders was 65.2%. Conclusions/Significance The results indicate that the new approach is quite promising and may open a new avenue for addressing this difficult and complicated problem. PMID:21283518
Code of Federal Regulations, 2010 CFR
2010-01-01
... (pooling agreement) with other land owners or operators to solve mutual water quality problems. Each participant must enter into an RCWP contract to treat water quality problems not covered by the joint...
Dynamics of on-orbit construction process
NASA Technical Reports Server (NTRS)
Chiou, J. C.; Alexander, S.; Natori, M. C.; Mikulas, M.; Park, K. C.
1991-01-01
The topics covered are presented in viewgraph form and include the following: problem definition and motivation; survey of current technology; focus problems; approach; progress/discussion; and future direction and anticipated results.
2010-05-13
This map sheet presents a 15-series image set covering the entire surface of Enceladus. The map data were acquired by the NASA Cassini imaging experiment. Individual images can be viewed via the Photojournal.
Characterization and classification of South American land cover types using satellite data
NASA Technical Reports Server (NTRS)
Townshend, J. R. G.; Justice, C. O.; Kalb, V.
1987-01-01
Various methods are compared for carrying out land cover classifications of South America using multitemporal Advanced Very High Resolution Radiometer data. Fifty-two images of the normalized difference vegetation index (NDVI) from a 1-year period are used to generate multitemporal data sets. Three main approaches to land cover classification are considered, namely the use of the principal components transformed images, the use of a characteristic curves procedure based on NDVI values plotted against time, and finally application of the maximum likelihood rule to multitemporal data sets. Comparison of results from training sites indicates that the last approach yields the most accurate results. Despite the reliance on training site figures for performance assessment, the results are nevertheless extremely encouraging, with accuracies for several cover types exceeding 90 per cent.
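The NDVI values composited in the study above are the standard normalized difference of near-infrared and red reflectances (for AVHRR, channels 2 and 1 respectively). A minimal sketch; the maximum-value compositing step is a common AVHRR practice assumed here, not stated in the abstract:

```python
import numpy as np

def ndvi(nir, red, eps=1e-12):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    eps guards against division by zero over dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

def max_value_composite(ndvi_stack):
    """Per-pixel maximum over a multitemporal NDVI stack, the usual way
    periodic NDVI images are composited to suppress cloud contamination."""
    return np.max(ndvi_stack, axis=0)
```

A 52-image annual series like the one in the study would then be a stack of such composites, one per week, fed to the classifier.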
The Effect of Distributed Practice in Undergraduate Statistics Homework Sets: A Randomized Trial
ERIC Educational Resources Information Center
Crissinger, Bryan R.
2015-01-01
Most homework sets in statistics courses are constructed so that students concentrate or "mass" their practice on a certain topic in one problem set. Distributed practice homework sets include review problems in each set so that practice on a topic is distributed across problem sets. There is a body of research that points to the…
Mathematical modeling of a dynamic thin plate deformation in acoustoelasticity problems
NASA Astrophysics Data System (ADS)
Badriev, I. B.; Paimuhin, V. N.
2018-01-01
The coupled problem of planar acoustic wave propagation through a composite plate covered with a second, damping layer having a large logarithmic decrement of oscillations is formulated. The aerohydrodynamic interaction of the plate with the external acoustic environment is described by three-dimensional wave equations, and the mechanical behavior of the two-layer plate by the classical Kirchhoff-Love model. An exact analytic solution of the problem is found for the case of hinged support of the plate edges. On this basis, parameters of the damping cover layer were found for which practically complete damping of the plate vibration is achieved under resonant modes of its acoustic loading.
NASA Astrophysics Data System (ADS)
Lin, Y.; Chen, X.
2016-12-01
Land cover classification systems used with remote sensing image data have been developed to meet the needs for depicting land covers in scientific investigations and policy decisions. However, accuracy assessments of numerous data sets demonstrate that, compared with the real physiognomy, each thematic map built on a specific land cover classification system contains some unavoidable flaws and unintended deviation. This work proposes a web-based land cover classification system, an integrated prototype, based on an ontology model of various classification systems, each of which is assigned the same weight in the final determination of land cover type. Ontology, a formal explication of specific concepts and relations, is employed in this prototype to build up the connections among different systems and to resolve naming conflicts. The process is initialized by measuring semantic similarity between terminologies in the systems and the search key to produce a set of satisfying classifications, and continues by searching the predefined relations among concepts of all classification systems to generate classification maps with the user-specified land cover type highlighted, based on probabilities calculated from votes by data sets with different classification systems adopted. The present system is verified and validated by comparing its classification results with those of the most common systems. Owing to the full consideration and meaningful expression of each classification system using ontology, and the convenience of the web, this system, as a preliminary model, proposes a flexible and extensible architecture for classification system integration and data fusion, thereby providing a strong foundation for future work.
Pedagogy and/or technology: Making difference in improving students' problem solving skills
NASA Astrophysics Data System (ADS)
Hrepic, Zdeslav; Lodder, Katherine; Shaw, Kimberly A.
2013-01-01
Pen input computers combined with interactive software may have substantial potential for promoting active instructional methodologies and for facilitating students' problem solving ability. An excellent example is a study in which introductory physics students improved retention, conceptual understanding and problem solving abilities when one of three weekly lectures was replaced with group problem solving sessions facilitated with Tablet PCs and DyKnow software [1,2]. The research goal of the present study was to isolate the effect of the methodology itself (using additional time to teach problem solving) from that of the involved technology. In Fall 2011 we compared the performance of students taking the same introductory physics lecture course while enrolled in two separate problem-solving sections. One section used pen-based computing to facilitate group problem solving while the other section used low-tech methods for one third of the semester (covering Kinematics), and then traded technologies for the middle third of the term (covering Dynamics). Analysis of quiz, exam and standardized pre-post test results indicated no significant difference in scores of the two groups. Combining this result with those of previous studies implies primacy of pedagogy (collaborative problem solving itself) over technology for student learning in problem solving recitations.
Evaluating the Use of Problem-Based Video Podcasts to Teach Mathematics in Higher Education
ERIC Educational Resources Information Center
Kay, Robin; Kletskin, Ilona
2012-01-01
Problem-based video podcasts provide short, web-based, audio-visual explanations of how to solve specific procedural problems in subject areas such as mathematics or science. A series of 59 problem-based video podcasts covering five key areas (operations with functions, solving equations, linear functions, exponential and logarithmic functions,…
2016-01-01
Objective To explore the experiences of athletes with spinal cord injury (SCI) in Korea with respect to the dilemmas of participating in sports, in terms of facilitators and barriers, using the International Classification of Functioning, Disability and Health (ICF). Methods The facilitators of and barriers to sports participation of individuals with SCI were examined using 112 ICF categories. A dichotomous-scale questionnaire was administered, covering the components 'Body functions', 'Body structures', 'Activity and participation' and 'Environmental factors'. Data analysis included the use of descriptive statistics to examine the frequency and magnitude of reported issues. Results Sixty-two community-dwelling participants were recruited. Frequently addressed barriers in 'Body functions' were mobility-related problems such as muscle and joint problems, bladder and bowel problems, pressure ulcers, and pain. In 'Activity and participation', the most frequently reported were mobility and self-care problems. Highly addressed barriers in 'Environmental factors' were sports facilities, financial cost, transportation problems, and lack of information. Relationships such as peers, family, and friends were the most important facilitators. Conclusion Numerous barriers still exist to sports participation by SCI survivors, especially in the areas of health care needs and environmental factors. Our results support the need for a multidisciplinary approach to promote sports participation. PMID:27847720
Relation between SM-covers and SM-decompositions of Petri nets
NASA Astrophysics Data System (ADS)
Karatkevich, Andrei; Wiśniewski, Remigiusz
2015-12-01
A task of finding, for a given Petri net, a set of sequential components able to represent together the behavior of the net arises often in formal analysis of Petri nets and in applications of Petri nets to logical control. Such a task comes in two variants: obtaining a Petri net cover or a decomposition. A Petri net cover supposes that a set of subnets of the given net is selected, whereas the sequential nets forming a decomposition may have additional places, which do not belong to the decomposed net. The paper discusses the differences and relations between the two tasks and their results.
NASA Technical Reports Server (NTRS)
1972-01-01
A handbook which sets forth the Kennedy Space Center radiation protection policy is presented. The book also covers administrative direction and guidance on organizational and procedural requirements of the program. Only ionizing radiation is covered.
ERIC Educational Resources Information Center
Freeman, Richard B.; And Others
This collection of papers on the youth employment problem consists of 15 papers that cover the dimensions, causes, and consequences of youth unemployment and that also focus on problems in measuring the extent of the problem, the dynamic aspects of youth labor force participation, and problems associated with adequately assessing the consequences…
Argentina spectral-agronomic multitemporal data set
NASA Technical Reports Server (NTRS)
Helmer, D.; Kinzler, C.; Tomppkins, M. A.; Badhwar, G. D.
1983-01-01
A multitemporal LANDSAT spectral data set was created. The data set covers five 5 nm-by-6 nm areas over Argentina and contains, by field, the spectral data, vegetation type, and cloud cover information.
Darrell N. Ueckert; Robert A. Phillips; Joseph L. Petersen; X. Ben Wu
2001-01-01
Redberry juniper (Juniperus pinchotii) is a major problem on Texas rangelands, yet little is known about the rate it is increasing. This study estimated long-term rates of change of redberry juniper canopy cover on undisturbed sites and adjacent sites that were either chained or grubbed at five locations in western Texas. Juniper cover was estimated from positive...
USDA-ARS?s Scientific Manuscript database
Numerous studies have been conducted that evaluate the utility of remote sensing for monitoring and assessing vegetation and ground cover to support land management decisions and complement ground-measurements. However, few land cover comparisons have been made using high-resolution imagery and obj...
42 CFR 403.904 - Reports of payments or other transfers of value to covered recipients.
Code of Federal Regulations, 2014 CFR
2014-10-01
... research payment, including all research-related costs for activities outlined in a written agreement... reporting food and beverage. (1) When allocating the cost of food and beverage among covered recipients in a group setting where the cost of each individual covered recipient's meal is not separately identifiable...
42 CFR 403.904 - Reports of payments or other transfers of value to covered recipients.
Code of Federal Regulations, 2013 CFR
2013-10-01
... research payment, including all research-related costs for activities outlined in a written agreement... reporting food and beverage. (1) When allocating the cost of food and beverage among covered recipients in a group setting where the cost of each individual covered recipient's meal is not separately identifiable...
Managing wilderness recreation use: common problems and potential solutions
David N. Cole; Margaret E. Petersen; Robert C. Lucas
1987-01-01
Describes pros and cons of potential solutions to common wilderness recreation problems. Covers the purpose of each potential solution, costs to visitors and management, effectiveness, other considerations, and sources of additional information.
Geopolymer for protective coating of transportation infrastructures.
DOT National Transportation Integrated Search
1998-09-01
Surface deterioration of exposed transportation structures is a major problem. In most cases, surface deterioration could lead to structural problems because of the loss of cover and ensuing reinforcement corrosion. To minimize the deterioration,...
Using Clickers to Facilitate Development of Problem-Solving Skills
ERIC Educational Resources Information Center
Levesque, Aime A.
2011-01-01
Classroom response systems, or clickers, have become pedagogical staples of the undergraduate science curriculum at many universities. In this study, the effectiveness of clickers in promoting problem-solving skills in a genetics class was investigated. Students were presented with problems requiring application of concepts covered in lecture and…
On the Beauty of Mathematics as Exemplified by a Problem in Combinatorics.
ERIC Educational Resources Information Center
Dence, Thomas P.
1982-01-01
The beauty of discovering some simple yet elegant proof either to something new or to an already established fact is discussed. A combinatorial problem that deals with covering a checkerboard with dominoes is presented as a starting point for individual investigation of similar problems. (MP)
ERIC Educational Resources Information Center
Rowan, Helen
The purpose of this paper, prepared for the U. S. Commission on Civil Rights, is to indicate the types and ranges of problems facing the Mexican American community and to suggest ways in which these problems are peculiar to Mexican Americans. Specific examples are cited to illustrate major problems and personal experiences. Topics covered in the…
Symposium on Spina Bifida (Denver, Colorado, November, 1969).
ERIC Educational Resources Information Center
Colorado Univ., Denver. Medical Center.
The objectives of the symposium were to define the problems of the child with spina bifida and to present practical means of management, using a multi-disciplinary team approach. Eight papers defining the problem cover the epidemiology of spina bifida, pathophysiology, musculoskeletal defects, incontinence of bladder and bowel, problems of…
Probing for quantum speedup in spin-glass problems with planted solutions
NASA Astrophysics Data System (ADS)
Hen, Itay; Job, Joshua; Albash, Tameem; Rønnow, Troels F.; Troyer, Matthias; Lidar, Daniel A.
2015-10-01
The availability of quantum annealing devices with hundreds of qubits has made the experimental demonstration of a quantum speedup for optimization problems a coveted, albeit elusive goal. Going beyond earlier studies of random Ising problems, here we introduce a method to construct a set of frustrated Ising-model optimization problems with tunable hardness. We study the performance of a D-Wave Two device (DW2) with up to 503 qubits on these problems and compare it to a suite of classical algorithms, including a highly optimized algorithm designed to compete directly with the DW2. The problems are generated around predetermined ground-state configurations, called planted solutions, which makes them particularly suitable for benchmarking purposes. The problem set exhibits properties familiar from constraint satisfaction (SAT) problems, such as a peak in the typical hardness of the problems, determined by a tunable clause density parameter. We bound the hardness regime where the DW2 device either does not or might exhibit a quantum speedup for our problem set. While we do not find evidence for a speedup for the hardest and most frustrated problems in our problem set, we cannot rule out that a speedup might exist for some of the easier, less frustrated problems. Our empirical findings pertain to the specific D-Wave processor and problem set we studied and leave open the possibility that future processors might exhibit a quantum speedup on the same problem set.
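To illustrate what a "planted solution" means, here is a deliberately simplified construction: a Mattis-style gauge in which every coupling is chosen so the planted configuration satisfies every bond. Note this yields easy, frustration-free instances, unlike the tunable-hardness frustrated-loop instances the paper actually constructs; all names here are illustrative:

```python
import numpy as np

def planted_ising(n_spins, n_edges, seed=0):
    """Random Ising instance with a known ground state (Mattis gauge).
    Choosing J_ij = s_i * s_j makes every bond term of
    H = -sum_(i,j) J_ij * s_i * s_j equal to -1 at the planted state s,
    so H(s) = -n_edges is the global minimum."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=n_spins)          # the planted ground state
    edges = set()
    while len(edges) < n_edges:                    # random simple graph edges
        i, j = rng.choice(n_spins, size=2, replace=False)
        edges.add((min(i, j), max(i, j)))
    J = {e: s[e[0]] * s[e[1]] for e in sorted(edges)}
    return s, J

def energy(s, J):
    """Ising energy of configuration s under couplings J."""
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
```

Flipping any single spin breaks all bonds incident to it, so the planted state is a local (here, global) minimum by construction, which is what makes planted instances convenient for benchmarking.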
Minimum triplet covers of binary phylogenetic X-trees.
Huber, K T; Moulton, V; Steel, M
2017-12-01
Trees with labelled leaves and with all other vertices of degree three play an important role in systematic biology and other areas of classification. A classical combinatorial result ensures that such trees can be uniquely reconstructed from the distances between the leaves (when the edges are given any strictly positive lengths). Moreover, a linear number of these pairwise distance values suffices to determine both the tree and its edge lengths. A natural set of pairs of leaves is provided by any 'triplet cover' of the tree (based on the fact that each non-leaf vertex is the median vertex of three leaves). In this paper we describe a number of new results concerning triplet covers of minimum size. In particular, we characterize such covers in terms of an associated graph being a 2-tree. Also, we show that minimum triplet covers are 'shellable' and thereby provide a set of pairs for which the inter-leaf distance values will uniquely determine the underlying tree and its associated branch lengths.
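The pairwise distances a triplet cover supplies are ordinary path lengths in the edge-weighted tree. A small sketch computing them on a toy tree (the tree, weights, and labels are invented for illustration):

```python
from collections import deque

def leaf_distance(adj, u, v):
    """Path length between two vertices of an edge-weighted tree, by
    breadth-first traversal. In a tree the path between any two vertices
    is unique, so no shortest-path machinery is needed."""
    dist = {u: 0.0}
    queue = deque([u])
    while queue:
        x = queue.popleft()
        if x == v:
            return dist[x]
        for y, w in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + w
                queue.append(y)
    raise ValueError("v is not in the same tree as u")

# Toy binary phylogenetic tree: leaves a, b, c, d; internal vertices p, q
# (both of degree three, as the abstract requires).
adj = {
    "a": [("p", 1.0)], "b": [("p", 2.0)],
    "c": [("q", 4.0)], "d": [("q", 5.0)],
    "p": [("a", 1.0), ("b", 2.0), ("q", 3.0)],
    "q": [("c", 4.0), ("d", 5.0), ("p", 3.0)],
}
```

A triplet cover would prescribe which of these leaf pairs must be measured so that all edge lengths can be recovered; here we only show how the distances themselves arise.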
Using Minimum-Surface Bodies for Iteration Space Partitioning
NASA Technical Reports Server (NTRS)
Frumkin, Michael; VanderWijngaart, Rob F.; Biegel, Bryan (Technical Monitor)
2001-01-01
A number of known techniques for improving cache performance in scientific computations involve the reordering of the iteration space. Some of these reorderings can be considered as coverings of the iteration space with the sets having good surface-to-volume ratio. Use of such sets reduces the number of cache misses in computations of local operators having the iteration space as a domain. We study coverings of iteration spaces represented by structured and unstructured grids. For structured grids we introduce a covering based on successive minima tiles of the interference lattice of the grid. We show that the covering has good surface-to-volume ratio and present a computer experiment showing actual reduction of the cache misses achieved by using these tiles. For unstructured grids no cache efficient covering can be guaranteed. We present a triangulation of a 3-dimensional cube such that any local operator on the corresponding grid has significantly larger number of cache misses than a similar operator on a structured grid.
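The tile-based covering described above can be sketched as ordinary loop blocking: the iteration space is swept tile by tile so that a local operator works on a compact set with good surface-to-volume ratio. A minimal, illustrative version with rectangular tiles (the paper's actual tiles come from successive minima of the interference lattice, which this sketch does not attempt):

```python
def tiled_indices(nx, ny, bx, by):
    """Yield every (i, j) of an nx-by-ny iteration space, tile by tile.
    Compact tiles have a good surface-to-volume ratio, so a stencil-like
    operator reuses cached neighbors while a tile is being processed."""
    for ii in range(0, nx, bx):                      # tile origins
        for jj in range(0, ny, by):
            for i in range(ii, min(ii + bx, nx)):    # sweep inside the tile
                for j in range(jj, min(jj + by, ny)):
                    yield i, j
```

The covering property is exactly that every point of the iteration space is visited once; the cache benefit comes from the order of the visits, not their number.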
Human factors in aviation operations: The hearback problem
NASA Technical Reports Server (NTRS)
Monan, William P.
1988-01-01
This report covers a study of ASRS reports wherein ATC controllers failed to adequately monitor (hearback) incorrect readbacks of ATC clearances. A total of 417 reports received over a period of 29 months from April 1981 through July 1983 comprised the study data set. Factors examined were: the reasons flight crews received clearances incorrectly, the operating factors that caused controllers to mishear or not hear the correct readbacks, and the consequences of the various types of hearback misses. The principal conclusion of the study takes the form of a precaution to flight crews: a controller's not challenging a readback does not necessarily mean the readback is correct, and flight crews must explicitly question any doubtful or unusual aspects of clearances rather than depending on controllers to detect readback errors.
Software Safety Risk in Legacy Safety-Critical Computer Systems
NASA Technical Reports Server (NTRS)
Hill, Janice L.; Baggs, Rhoda
2007-01-01
Safety standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective and some cover just the software in the system; NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest. NASA programs/projects will have their own set of safety requirements derived from the standard. Safety cases: (a) a documented demonstration that a system complies with the specified safety requirements; (b) evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]; (c) problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.
The effect of an outdoor setting on the transfer of earth science concepts
NASA Astrophysics Data System (ADS)
Simmons, Jerry Marvin
The ability of students to transfer concepts learned in school to future learning and employment settings is critical to their academic and career success. Concept transfer can best be studied by defining it as a process rather than an isolated event. Preparation for future learning (PFL) is a process definition of transfer which recognizes the student's ability to draw from past experiences, make assumptions, and generate potential questions and strategies for problem resolution. The purpose of this study was to use the PFL definition of concept transfer to examine whether a knowledge-rich outdoor setting better prepares students for future learning of science concepts than the classroom setting alone does. The research hypothesis was that sixth-grade students experiencing a geology-rich outdoor setting would be better prepared to learn advanced earth science concepts than students experiencing classroom learning only. A quasi-experimental research design was used for this study on two non-equivalent, self-contained sixth-grade rural public school classes. After a pretest was given on prior geology knowledge, the outdoor treatment group was taken on a geology-rich field excursion which introduced them to the concepts of mineral formation and mining. The indoor treatment group received exposure to the same concepts in the classroom setting via color slides and identification of mineral specimens. Subsequently, both groups received direct instruction on advanced concepts about mineral formation and mining. They were then given a posttest, which presented the students with a problem-solving scenario and questions related to concepts covered in the direct instruction. A t-test done on pretest data revealed that the indoor treatment group had previously learned classroom geology material significantly better than the outdoor treatment group had. 
Therefore an analysis of covariance was performed on posttest data which showed that the outdoor treatment group was better prepared for future learning of advanced geology concepts than the indoor treatment group. Because the environment chosen for this study was by nature one that contained variables outside the control of the researcher, it can only be speculated that the outdoor environment was the agent of transfer. Subsequent studies need to be done to substantiate this hypothesis.
Patterns of Home and School Behavior Problems in Rural and Urban Settings
Hope, Timothy L; Bierman, Karen L
2009-01-01
This study examined the cross-situational patterns of behavior problems shown by children in rural and urban communities at school entry. Behavior problems exhibited in home settings were not expected to vary significantly across urban and rural settings. In contrast, it was anticipated that child behavior at school would be heavily influenced by the increased exposure to aggressive models and deviant peer support experienced by children in urban as compared to rural schools, leading to higher rates of school conduct problems for children in urban settings. Statistical comparisons of the patterns of behavior problems shown by representative samples of 89 rural and 221 urban children provided support for these hypotheses, as significant rural-urban differences emerged in school and not in home settings. Cross-situational patterns of behavior problems also varied across settings, with home-only patterns of problems characterizing more children at the rural site and school-only patterns of behavior problems characterizing more children at the urban sites. In addition, whereas externalizing behavior was the primary school problem exhibited by urban children, rural children displayed significantly higher rates of internalizing problems at school. The implications of these results are discussed for developmental models of behavior problems and for preventive interventions. PMID:19834584
Enhanced Historical Land-Use and Land-Cover Data Sets of the U.S. Geological Survey
Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.; Clawges, Rick M.
2007-01-01
Historical land-use and land-cover data, available from the U.S. Geological Survey (USGS) for the conterminous United States and Hawaii, have been enhanced for use in geographic information systems (GIS) applications. The original digital data sets were created by the USGS in the late 1970s and early 1980s and were later converted by the USGS and the U.S. Environmental Protection Agency (USEPA) to a GIS format in the early 1990s. These data have been available on USEPA's Web site since the early 1990s and have been used for many national applications, despite minor coding and topological errors. During the 1990s, a group of USGS researchers made modifications to the data set for use in the National Water-Quality Assessment Program. These edited files have been further modified to create a more accurate, topologically clean, and seamless national data set. Several different methods, including custom editing software and several batch processes, were applied to create this enhanced version of the national data set. The data sets are included in this report in the commonly used shapefile and Tagged Image File Format (TIFF) formats. In addition, this report includes two polygon data sets (in shapefile format) representing (1) land-use and land-cover source documentation extracted from the previously published USGS data files, and (2) the extent of each polygon data file.
Wong, Alex W K; Lau, Stephen C L; Fong, Mandy W M; Cella, David; Lai, Jin-Shei; Heinemann, Allen W
2018-04-03
To determine the extent to which the content of the Quality of Life in Neurological Disorders (Neuro-QoL) measurement system covers the International Classification of Functioning, Disability and Health (ICF) Core Sets for multiple sclerosis (MS), stroke, spinal cord injury (SCI), and traumatic brain injury (TBI) using summary linkage indicators. Content analysis linking the content of the Neuro-QoL to the corresponding ICF codes of each Core Set for MS, stroke, SCI, and TBI. Three academic centers. None. None. Four summary linkage indicators proposed by MacDermid et al were estimated to compare the content coverage between the Neuro-QoL and the ICF codes of the Core Sets for MS, stroke, SCI, and TBI. The Neuro-QoL represented 20% to 30% of Core Set codes across conditions, with more codes covered in the Core Sets for MS (29%), stroke (28%), and TBI (28%) than in those for SCI in the long-term (20%) and early postacute (19%) contexts. The Neuro-QoL represented nearly half of the unique Activity and Participation codes (43%-49%) and less than one third of the unique Body Function codes (12%-32%). It represented few Environmental Factors codes (2%-6%) and no Body Structures codes. Absolute linkage indicators showed that at least 60% of Neuro-QoL items were linked to Core Set codes (63%-95%), but many items covered the same codes, as revealed by unique linkage indicators (7%-13%), suggesting high concept redundancy among items. The Neuro-QoL links more closely to the ICF Core Sets for stroke, MS, and TBI than to those for SCI, and primarily covers the activity and participation ICF domains. Other instruments are needed to address concepts not measured by the Neuro-QoL when a comprehensive health assessment is needed. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
He, Tao; Liang, Shunlin; Song, Dan-Xia
2014-09-01
For several decades, long-term time series data sets of multiple global land surface albedo products have been generated from satellite observations. These data sets have been used as one of the key variables in climate change studies. This study aims to assess the surface albedo climatology and to analyze long-term albedo changes, from nine satellite-based data sets for the period 1981-2010, on a global basis. Results show that climatological surface albedo data sets derived from satellite observations can be used to validate, calibrate, and further improve surface albedo simulations and parameterizations in current climate models. However, the albedo products derived from the International Satellite Cloud Climatology Project and the Global Energy and Water Exchanges Project have large seasonal biases. At latitudes higher than 50°, the maximal difference in winter zonal albedo ranges from 0.1 to 0.4 among the nine satellite data sets. Satellite-based albedo data sets agree relatively well during the summer at high latitudes, with a standard deviation of 0.04 for the 70°-80° zone in both hemispheres. The fine-resolution (0.05°) data sets agree well with each other for all the land cover types in middle to low latitudes; however, large spread was identified for their albedos at middle to high latitudes over land covers with mixed snow and sparse vegetation. By analyzing the time series of satellite-based albedo products over the past three decades, albedo of the Northern Hemisphere was found to be decreasing in July, likely due to the shrinking snow cover. Meanwhile, albedo in January was found to be increasing, likely because of the expansion of snow cover in northern winter. However, to improve the albedo estimation at high latitudes, and ultimately the climate models used for long-term climate change studies, a still better understanding of differences between satellite-based albedo data sets is required.
Smit, Izak P J; Prins, Herbert H T
2015-01-01
With grasslands and savannas covering 20% of the world's land surface, accounting for 30-35% of worldwide Net Primary Productivity and supporting hundreds of millions of people, predicting changes in tree/grass systems is a priority. Inappropriate land management and rising atmospheric CO2 levels result in increased woody cover in savannas. Although woody encroachment occurs worldwide, Africa's tourism and livestock grazing industries may be particularly vulnerable. Forecasts of the responses of African wildlife and available grazing biomass to increases in woody cover are thus urgently needed. These predictions are hard to make due to non-linear responses and poorly understood feedback mechanisms between woody cover and other ecological responders, problems further amplified by the lack of long-term and large-scale datasets. We propose that a space-for-time analysis along an existing woody cover gradient overcomes some of these forecasting problems. Here we show, using an existing woody cover gradient (0-65%) across the Kruger National Park, South Africa, that increased woody cover is associated with (i) changed herbivore assemblage composition, (ii) reduced grass biomass, and (iii) reduced fire frequency. Furthermore, although increased woody cover is associated with reduced livestock production, we found that indigenous herbivore biomass (excluding elephants) remains unchanged between 20% and 65% woody cover. This is due to a significant reorganization of the herbivore assemblage composition, mostly as a result of meso-grazers being substituted by browsers at increasing woody cover. Our results suggest that woody encroachment will have cascading consequences for Africa's grazing systems, fire regimes and iconic wildlife. These effects will pose challenges and require adaptation by livelihoods and industries dependent on conditions currently prevailing.
Computational and Statistical Models: A Comparison for Policy Modeling of Childhood Obesity
NASA Astrophysics Data System (ADS)
Mabry, Patricia L.; Hammond, Ross; Ip, Edward Hak-Sing; Huang, Terry T.-K.
As systems science methodologies have begun to emerge as a set of innovative approaches to address complex problems in behavioral, social science, and public health research, some apparent conflicts with traditional statistical methodologies for public health have arisen. Computational modeling is an approach set in context that integrates diverse sources of data to test the plausibility of working hypotheses and to elicit novel ones. Statistical models are reductionist approaches geared towards testing a null hypothesis. While these two approaches may seem contrary to each other, we propose that they are in fact complementary and can be used jointly to advance solutions to complex problems. Outputs from statistical models can be fed into computational models, and outputs from computational models can lead to further empirical data collection and statistical models. Together, this presents an iterative process that refines the models and contributes to a greater understanding of the problem and its potential solutions. The purpose of this panel is to foster communication and understanding between statistical and computational modelers. Our goal is to shed light on the differences between the approaches and convey what kinds of research inquiries each one is best for addressing and how they can serve complementary (and synergistic) roles in the research process, to mutual benefit. For each approach the panel will cover the relevant "assumptions" and how the differences in what is assumed can foster misunderstandings. The interpretations of the results from each approach will be compared and contrasted and the limitations for each approach will be delineated. We will use illustrative examples from CompMod, the Comparative Modeling Network for Childhood Obesity Policy. The panel will also incorporate interactive discussions with the audience on the issues raised here.
ERIC Educational Resources Information Center
Anwar, Rahmad Bustanul; Rahmawati, Dwi
2017-01-01
The purpose of this research was to reveal the construction process of symbolic and verbal representations made by students in problem solving. The construction process in this study referred to the problem-solving stages of Polya, covering: 1) understanding the problem, 2) devising a plan, 3) carrying out the plan, and 4) looking…
A Least-Squares-Based Weak Galerkin Finite Element Method for Second Order Elliptic Equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mu, Lin; Wang, Junping; Ye, Xiu
Here, in this article, we introduce a least-squares-based weak Galerkin finite element method for the second order elliptic equation. This new method is shown to provide very accurate numerical approximations for both the primal and the flux variables. In contrast to other existing least-squares finite element methods, this new method allows us to use discontinuous approximating functions on finite element partitions consisting of arbitrary polygon/polyhedron shapes. We also develop a Schur complement algorithm for the resulting discretization problem by eliminating all the unknowns that represent the solution information in the interior of each element. Optimal order error estimates for both the primal and the flux variables are established. An extensive set of numerical experiments is conducted to demonstrate the robustness, reliability, flexibility, and accuracy of the least-squares-based weak Galerkin finite element method. Finally, the numerical examples cover a wide range of applied problems, including singularly perturbed reaction-diffusion equations and the flow of fluid in porous media with strong anisotropy and heterogeneity.
A Least-Squares-Based Weak Galerkin Finite Element Method for Second Order Elliptic Equations
Mu, Lin; Wang, Junping; Ye, Xiu
2017-08-17
Here, in this article, we introduce a least-squares-based weak Galerkin finite element method for the second order elliptic equation. This new method is shown to provide very accurate numerical approximations for both the primal and the flux variables. In contrast to other existing least-squares finite element methods, this new method allows us to use discontinuous approximating functions on finite element partitions consisting of arbitrary polygon/polyhedron shapes. We also develop a Schur complement algorithm for the resulting discretization problem by eliminating all the unknowns that represent the solution information in the interior of each element. Optimal order error estimates for both the primal and the flux variables are established. An extensive set of numerical experiments is conducted to demonstrate the robustness, reliability, flexibility, and accuracy of the least-squares-based weak Galerkin finite element method. Finally, the numerical examples cover a wide range of applied problems, including singularly perturbed reaction-diffusion equations and the flow of fluid in porous media with strong anisotropy and heterogeneity.
A visualization framework for design and evaluation
NASA Astrophysics Data System (ADS)
Blundell, Benjamin J.; Ng, Gary; Pettifer, Steve
2006-01-01
The creation of compelling visualisation paradigms is a craft often dominated by intuition and issues of aesthetics, with relatively few models to support good design. The majority of problem cases are approached by simply applying a previously evaluated visualisation technique. A large body of work exists covering the individual aspects of visualisation design, such as human cognition, visualisation methods for specific problem areas, psychology studies and so forth, yet most frameworks regarding visualisation are applied after the fact as an evaluation measure. We present an extensible framework for visualisation aimed at structuring the design process, increasing decision traceability and delineating the notions of function, aesthetics and usability. The framework can be used to derive a set of requirements for good visualisation design and to evaluate existing visualisations, presenting possible improvements. Our framework achieves this by being both broad and general, built on top of existing works, with hooks for extensions and customisations. This paper shows how existing theories of information visualisation fit into the scheme, presents our experience in the application of this framework on several designs, and offers our evaluation of the framework and the designs studied.
Ernren, A.T.; Arthur, R.; Glynn, P.D.; McMurry, J.
1999-01-01
Four researchers were asked to provide independent modeled estimates of the solubility of a radionuclide solid phase, specifically Pu(OH)4, under five specified sets of conditions. The objectives of the study were to assess the variability in the results obtained and to determine the primary causes for this variability. In the exercise, modelers were supplied with the composition, pH and redox properties of the water and with a description of the mineralogy of the surrounding fracture system. A standard thermodynamic data base was provided to all modelers. Each modeler was encouraged to use other data bases in addition to the standard data base and to try different approaches to solving the problem. In all, about fifty approaches were used, some of which included a large number of solubility calculations. For each of the five test cases, the calculated solubilities from different approaches covered several orders of magnitude. The variability resulting from the use of different thermodynamic data bases was, in most cases, far smaller than that resulting from the use of different approaches to solving the problem.
Path scheduling for multiple mobile actors in wireless sensor network
NASA Astrophysics Data System (ADS)
Trapasiya, Samir D.; Soni, Himanshu B.
2017-05-01
In wireless sensor networks (WSNs), energy is the main constraint. In this work we address this issue for single as well as multiple mobile sensor actor networks. We propose a Rendezvous Point Selection Scheme (RPSS) in which Rendezvous Nodes are selected by a set covering problem approach and, from these, Rendezvous Points are selected so as to reduce the tour length. The mobile actors' tours are scheduled to pass through those Rendezvous Points as per the Travelling Salesman Problem (TSP). We also propose a novel rendezvous node rotation scheme for fair utilisation of all the nodes. We compared RPSS with the Stationary Actor scheme as well as RD-VT, RD-VT-SMT and WRP-SMT on performance metrics such as energy consumption, network lifetime and route length, and found better outcomes in all cases for a single actor. We also applied RPSS to multiple mobile actor cases, namely Multi-Actor Single Depot (MASD) termination and Multi-Actor Multiple Depot (MAMD) termination, and observed through extensive simulation that MAMD saves network energy in an optimised way and enhances network lifetime compared to all other schemes.
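The abstract above frames rendezvous-node selection as a set covering problem. As a purely illustrative sketch, and not the RPSS algorithm itself, the standard greedy heuristic for set cover picks, at each step, the candidate set that covers the most still-uncovered elements:

```python
def greedy_set_cover(universe, subsets):
    """Greedy set-cover heuristic: repeatedly choose the subset that
    covers the largest number of still-uncovered elements.

    The result is within a ln(n) factor of the optimal cover size."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # Pick the candidate with the largest overlap with what remains.
        best = max(subsets, key=lambda s: len(uncovered & set(s)))
        if not uncovered & set(best):
            raise ValueError("the given subsets cannot cover the universe")
        chosen.append(set(best))
        uncovered -= set(best)
    return chosen
```

For instance, covering elements {1, 2, 3, 4, 5} with candidates {1,2,3}, {2,4}, {3,4}, {4,5} yields the two-set cover [{1,2,3}, {4,5}].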
Multiscale modelling for tokamak pedestals
NASA Astrophysics Data System (ADS)
Abel, I. G.
2018-04-01
Pedestal modelling is crucial to predict the performance of future fusion devices. Current modelling efforts suffer either from a lack of kinetic physics, or an excess of computational complexity. To ameliorate these problems, we take a first-principles multiscale approach to the pedestal. We will present three separate sets of equations, covering the dynamics of edge localised modes (ELMs), the inter-ELM pedestal and pedestal turbulence, respectively. Precisely how these equations should be coupled to each other is covered in detail. This framework is completely self-consistent; it is derived from first principles by means of an asymptotic expansion of the fundamental Vlasov-Landau-Maxwell system in appropriate small parameters. The derivation exploits the narrowness of the pedestal region, the smallness of the thermal gyroradius and the low plasma β (the ratio of thermal to magnetic pressures) typical of current pedestal operation to achieve its simplifications. The relationship between this framework and gyrokinetics is analysed, and possibilities to directly match our systems of equations onto multiscale gyrokinetics are explored. A detailed comparison between our model and other models in the literature is performed. Finally, the potential for matching this framework onto an open-field-line region is briefly discussed.
Spectral response data for development of cool coloured tile coverings
NASA Astrophysics Data System (ADS)
Libbra, Antonio; Tarozzi, Luca; Muscio, Alberto; Corticelli, Mauro A.
2011-03-01
Most ancient or traditional buildings in Italy show steep-slope roofs covered by red clay tiles. As the rooms immediately below the roof are often inhabited in historical or densely urbanized centres, the combination of the low solar reflectance of tile coverings and the low thermal inertia of either wooden roof structures or sub-tile insulation panels makes summer overheating a major problem. The problem can be mitigated by using tiles coated with cool colours, that is, colours with the same spectral response as clay tiles in the visible range but highly reflective in the near infrared range, which includes more than half of solar radiation. Cool colours can yield the same visible aspect as common building surfaces, but higher solar reflectance. Studies aimed at developing cool colour tile coverings for traditional Italian buildings have been started. A few coating solutions with the typical red terracotta colour have been produced and tested in the laboratory, using easily available materials. Their spectral response and solar reflectance have been measured and compared with those of standard tiles.
Satellite Studies of Cirrus Clouds for Project Fire
NASA Technical Reports Server (NTRS)
1997-01-01
Examine global cloud climatologies for evidence of human-caused changes in cloud cover and their effect on the Earth's heat budget through radiative processes. Quantify climatological changes in global cloud cover and estimate their effect on the Earth's heat budget. Improve our knowledge of global cloud cover and its changes through the merging of several satellite data sets.
The effects of changing land cover on streamflow simulation in Puerto Rico
A.E. Van Beusekom; L.E. Hay; R.J. Viger; W.A. Gould; J.A. Collazo; A. Henareh Khalyani
2014-01-01
This study quantitatively explores whether land cover changes have a substantive impact on simulated streamflow within the tropical island setting of Puerto Rico. The Precipitation Runoff Modeling System (PRMS) was used to compare streamflow simulations based on five static parameterizations of land cover with those based on dynamically varying parameters derived from...
2014-01-01
Background In complex large-scale experiments, in addition to simultaneously considering a large number of features, multiple hypotheses are often being tested for each feature. This leads to a problem of multi-dimensional multiple testing. For example, in gene expression studies over ordered categories (such as time-course or dose-response experiments), interest is often in testing differential expression across several categories for each gene. In this paper, we consider a framework for testing multiple sets of hypotheses, which can be applied to a wide range of problems. Results We adopt the concept of the overall false discovery rate (OFDR) for controlling false discoveries at the hypothesis-set level. Based on an existing procedure for identifying differentially expressed gene sets, we discuss a general two-step hierarchical hypothesis-set testing procedure, which controls the overall false discovery rate under independence across hypothesis sets. In addition, we discuss the concept of the mixed-directional false discovery rate (mdFDR), and extend the general procedure to enable directional decisions for two-sided alternatives. We applied the framework to the case of microarray time-course/dose-response experiments, and proposed three procedures for testing differential expression and making multiple directional decisions for each gene. Simulation studies confirm the control of the OFDR and mdFDR by the proposed procedures under independence and positive correlations across genes. Simulation results also show that two of our new procedures achieve higher power than previous methods. Finally, the proposed methodology is applied to a microarray dose-response study, to identify 17β-estradiol-sensitive genes in breast cancer cells that are induced at low concentrations. Conclusions The framework we discuss provides a platform for multiple testing procedures covering situations involving two (or potentially more) sources of multiplicity.
The framework is easy to use and adaptable to various practical settings that frequently occur in large-scale experiments. Procedures generated from the framework are shown to maintain control of the OFDR and mdFDR, quantities that are especially relevant in the case of multiple hypothesis set testing. The procedures work well in both simulations and real datasets, and are shown to have better power than existing methods. PMID:24731138
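The OFDR and mdFDR procedures summarized above extend single-family false discovery rate control to collections of hypothesis sets. As a hedged point of reference only (this is the classical building block, not the paper's two-step procedure), the Benjamini-Hochberg step-up rule controls the FDR for one family of p-values:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: return a boolean list
    marking which hypotheses are rejected at FDR level alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Find the largest rank k with p_(k) <= k * alpha / m.
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank * alpha / m:
            k = rank
    # Reject the k hypotheses with the smallest p-values.
    reject = [False] * m
    for i in order[:k]:
        reject[i] = True
    return reject
```

For p-values [0.01, 0.02, 0.03, 0.9] at alpha = 0.05, the rule rejects the first three hypotheses; a hierarchical scheme of the kind described above would first screen each hypothesis set and then apply such a rule within the sets that pass.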
Cosmological Inflation: A Personal Perspective
NASA Technical Reports Server (NTRS)
Kazanas, D.
2007-01-01
Approximately twenty-five years ago a novel proposal was made to explain two of the outstanding cosmological conundrums, namely the Horizon Problem and the Flatness Problem of the Universe. These are the facts that widely separated parts of the sky that have never been in causal contact during the evolution of the Universe have apparently the same CMB temperature, and that the mean density of the Universe is very close to the critical one, i.e. very close to the density that separates the closed and open models. These coincidences implied that the corresponding initial conditions of the Universe must have been set to exquisite accuracy. This novel proposal posited that at these very early times the energy density of the Universe was dominated by a fluid with the equation of state attributed to the vacuum (i.e. dominated by tension rather than pressure), and that this led to an exponential expansion of the Universe, which was "inflated" by many orders of magnitude beyond its original size. It was then shown that this "inflation" could provide a resolution of the above outstanding problems. The talk will cover the speaker's personal perspective on and contributions to this idea and the subsequent developments over the following 25 years since its inception.
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator); Knowlton, D. J.; Dean, M. E.
1981-01-01
A set of training statistics for the 30 meter resolution simulated thematic mapper MSS data was generated based on land use/land cover classes. In addition to this supervised data set, a nonsupervised multicluster block of training statistics is being defined in order to compare the classification results and evaluate the effect of the different training selection methods on classification performance. Two test data sets, one defined using a stratified sampling procedure incorporating a grid system with dimensions of 50 lines by 50 columns, and another based on an analyst-supervised set of test fields, were used to evaluate the classifications of the TMS data. Training statistics were generated from the supervised training data set, and a per-point Gaussian maximum likelihood classification of the 1979 TMS data was obtained. The August 1980 MSS data were radiometrically adjusted. The SAR data were redigitized and the SAR imagery was qualitatively analyzed.
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2005-10-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
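One of the problems the book investigates, minimum vertex cover, also admits a simple combinatorial baseline. As an illustrative sketch (standard textbook material, not taken from the book), the maximal-matching heuristic adds both endpoints of any uncovered edge, giving a cover at most twice the optimal size:

```python
def vertex_cover_2approx(edges):
    """Matching-based 2-approximation for minimum vertex cover:
    scan the edges and, whenever an edge has neither endpoint in the
    cover yet, add both endpoints. The selected edges form a maximal
    matching, so the cover is at most twice the optimum."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover
```

On the path 1-2-3-4 this returns {1, 2, 3, 4}, twice the size of the optimal cover {2, 3}.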
Benchmark problems and solutions
NASA Technical Reports Server (NTRS)
Tam, Christopher K. W.
1995-01-01
The scientific committee, after careful consideration, adopted six categories of benchmark problems for the workshop. These problems do not cover all the important computational issues relevant to Computational Aeroacoustics (CAA). The deciding factor in limiting the number of categories to six was the amount of effort needed to solve these problems. For reference purposes, the benchmark problems are provided here. They are followed by the exact or approximate analytical solutions. At present, an exact solution for the Category 6 problem is not available.
Trends and uncertainties in U.S. cloud cover from weather stations and satellite data
NASA Astrophysics Data System (ADS)
Free, M. P.; Sun, B.; Yoo, H. L.
2014-12-01
Cloud cover data from ground-based weather observers can be an important source of climate information, but the record of such observations in the U.S. is disrupted by the introduction of automated observing systems and other artificial shifts that interfere with our ability to assess changes in cloudiness at climate time scales. A new dataset using 54 National Weather Service (NWS) and 101 military stations that continued to make human-augmented cloud observations after the 1990s has been adjusted using statistical changepoint detection and visual scrutiny. The adjustments substantially reduce the trends in U.S. mean total cloud cover while increasing the agreement between the cloud cover time series and those of physically related climate variables such as diurnal temperature range and number of precipitation days. For 1949-2009, the adjusted time series give a trend in U.S. mean total cloud of 0.11 ± 0.22 %/decade for the military data, 0.55 ± 0.24 %/decade for the NWS data, and 0.31 ± 0.22 %/decade for the combined dataset. These trends are less than half those in the original data. For 1976-2004, the original data give a significant increase but the adjusted data show an insignificant trend of -0.17 (military stations) to 0.66 %/decade (NWS stations). The differences between the two sets of station data illustrate the uncertainties in the U.S. cloud cover record. We compare the adjusted station data to cloud cover time series extracted from several satellite datasets: ISCCP (International Satellite Cloud Climatology Project), PATMOS-x (AVHRR Pathfinder Atmospheres Extended) and CLARA-a1 (CM SAF cLoud Albedo and RAdiation), and the recently developed PATMOS-x diurnally corrected dataset. Like the station data, satellite cloud cover time series may contain inhomogeneities due to changes in the observing systems and problems with retrieval algorithms. 
Overall we find good agreement between interannual variability in most of the satellite data and that in our station data, with the diurnally corrected PATMOS-x product generally showing the best match. For the satellite period 1984-2007, trends in the U.S. mean cloud cover from satellite data vary widely among the datasets, and all are more negative than those in the station data, with PATMOS-x having the trends closest to those in the station data.
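The statistical changepoint detection applied to the station records is not detailed in this summary. As an illustrative sketch only (hypothetical function names, and a least-squares mean-shift model rather than whatever test the authors used), a single changepoint in a series can be located by minimizing the total within-segment squared error over all split points:

```python
def best_changepoint(series):
    """Locate a single mean-shift changepoint by exhaustive search:
    try every split index and keep the one that minimizes the sum of
    squared deviations from each segment's own mean."""
    def sse(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs)

    best_k, best_cost = None, float("inf")
    for k in range(1, len(series)):
        cost = sse(series[:k]) + sse(series[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k
```

A series that jumps from one level to another, e.g. four values near 0 followed by four near 5, yields a split at index 4; adjustment would then subtract the estimated shift from one segment before computing trends.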
Swinford, A E; McKeag, D B
1990-01-01
There has been recent interest in the development of problem-based human genetics curricula in U.S. medical schools. The College of Human Medicine at Michigan State University has had a problem-based curriculum since 1974. The vertical integration of genetics within the problem-based curriculum, called "Track II," has recently been revised. On first inspection, the curriculum appeared to lack a significant genetics component; however, on further analysis it was found that many genetics concepts were covered in the biochemistry, microbiology, pathology, and clinical science components. Both basic science concepts and clinical applications of genetics are covered in the curriculum by providing appropriate references for basic concepts and including inherited conditions within the differential diagnosis in the cases studied. Evaluations consist of a multiple-choice content exam and a modified essay exam based on a clinical case, allowing evaluation of both basic concepts and problem-solving ability. This curriculum prepares students to use genetics in a clinical context in their future careers. PMID:2220816
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maclaurin, Galen; Sengupta, Manajit; Xie, Yu
A significant source of bias in the transposition of global horizontal irradiance to plane-of-array (POA) irradiance arises from inaccurate estimations of surface albedo. The current physics-based model used to produce the National Solar Radiation Database (NSRDB) relies on model estimations of surface albedo from a reanalysis climatology produced at relatively coarse spatial resolution compared to that of the NSRDB. As an input to spectral decomposition and transposition models, more accurate surface albedo data from remotely sensed imagery at finer spatial resolutions would improve accuracy in the final product. The National Renewable Energy Laboratory (NREL) developed an improved white-sky (bi-hemispherical reflectance) broadband (0.3-5.0 μm) surface albedo data set for processing the NSRDB from two existing data sets: a gap-filled albedo product and a daily snow cover product. The Moderate Resolution Imaging Spectroradiometer (MODIS) sensors onboard the Terra and Aqua satellites have provided high-quality measurements of surface albedo at 30 arc-second spatial resolution and 8-day temporal resolution since 2001. The high spatial and temporal resolutions and the temporal coverage of the MODIS sensor will allow for improved modeling of POA irradiance in the NSRDB. However, cloud and snow cover interfere with MODIS observations of ground surface albedo, and thus they require post-processing. The MODIS production team applied a gap-filling methodology to interpolate observations obscured by clouds or ephemeral snow. This approach filled pixels with ephemeral snow cover because the 8-day temporal resolution is too coarse to accurately capture the variability of snow cover and its impact on albedo estimates. However, for this project, accurate representation of daily snow cover change is important in producing the NSRDB.
Therefore, NREL also used the Interactive Multisensor Snow and Ice Mapping System data set, which provides daily snow cover observations of the Northern Hemisphere for the temporal extent of the NSRDB (1998-2015). We provide a review of validation studies conducted on these two products and describe the methodology developed by NREL to remap the data products to the NSRDB grid and integrate them into a seamless daily data set.
Simultaneous isoform discovery and quantification from RNA-seq.
Hiller, David; Wong, Wing Hung
2013-05-01
RNA sequencing is a recent technology which has seen an explosion of methods addressing all levels of analysis, from read mapping to transcript assembly to differential expression modeling. In particular, the discovery of isoforms at the transcript assembly stage is a complex problem, and current approaches suffer from various limitations. For instance, many approaches use graphs to construct a minimal set of isoforms which covers the observed reads, then perform a separate algorithm to quantify the isoforms, which can result in a loss of power. Current methods also use ad hoc solutions to deal with the vast number of possible isoforms which can be constructed from a given set of reads. Finally, while the need to take into account features such as read pairing and sampling rate of reads has been acknowledged, most existing methods do not seamlessly integrate these features as part of the model. We present Montebello, an integrated statistical approach which performs simultaneous isoform discovery and quantification by using a Monte Carlo simulation to find the most likely isoform composition leading to a set of observed reads. We compare Montebello to Cufflinks, a popular isoform discovery approach, on a simulated data set and on 46.3 million brain reads from an Illumina tissue panel. On this data set Montebello appears to offer a modest improvement over Cufflinks when considering discovery and parsimony metrics. In addition, Montebello mitigates specific difficulties inherent in the Cufflinks approach. Finally, Montebello can be fine-tuned depending on the type of solution desired.
ERIC Educational Resources Information Center
Flannery, Carol A.
This manuscript provides information and problems for teaching mathematics to vocational education students. Problems reflect applications of mathematical concepts to specific technical areas. The materials are organized into six chapters. Chapter 1 covers basic arithmetic, including fractions, decimals, ratio and proportions, percentages, and…
Training for Corrections: Rationale and Techniques.
ERIC Educational Resources Information Center
Southern Illinois Univ., Carbondale. Center for the Study of Crime, Delinquency and Corrections.
A manual focuses on how to teach in inservice training programs for professional personnel in correctional agencies. A chapter on rationale discusses training objectives and curriculum. A second chapter covers learning environment, lesson plans, and learning problems. One, on teaching techniques, covers lecture, group discussion, case study,…
Educational Television, the Next 10 Years.
ERIC Educational Resources Information Center
Schramm, Wilbur
This document is a compilation of studies on the problems and potentials of educational television during the time period 1961-71. Six topic areas were covered: (1) recommendations, (2) the future of educational television, (3) the community job of educational television, (4) the problem of improving programs, (5) the problem of financing, (6) the…
Effect of a "Look-Ahead" Problem on Undergraduate Engineering Students' Concept Comprehension
ERIC Educational Resources Information Center
Goodman, Kevin; Davis, Julian; McDonald, Thomas
2016-01-01
In an effort to motivate undergraduate engineering students to prepare for class by reviewing material before lectures, a "Look-Ahead" problem was utilized. Students from two undergraduate engineering courses, Statics and Electronic Circuits, were assigned problems from course material that had not yet been covered in class. These…
Suzuki, Takayuki; Siddiqui, Ali; Taylor, Linda J; Cox, Kristen; Hasan, Raza A; Laique, Sobia N; Mathew, Arun; Wrobel, Piotr; Adler, Douglas G
2016-01-01
Esophageal stents are commonly used to treat benign esophageal conditions including refractory benign esophageal strictures, anastomotic strictures, fistulae, perforations and anastomotic leaks. Data on outcomes in these settings remain limited. We performed a retrospective multicenter study of patients who underwent fully or partially covered self-expandable stent placement for benign esophageal diseases. Esophageal stent placements were performed for the following indications: (1) benign refractory esophageal strictures, (2) surgical anastomotic strictures, (3) esophageal perforations, (4) esophageal fistulae, and (5) surgical anastomotic leaks. A total of 70 patients underwent esophageal stent placement for benign esophageal conditions. A total of 114 separate procedures were performed. The most common indication for esophageal stent placement was refractory benign esophageal stricture (48.2%). Global treatment success rate was 55.7%. Treatment success rate was 33.3% in refractory benign strictures, 23.1% in anastomotic strictures, 100% in perforations, 71.4% in fistulae, and 80% in anastomotic leaks. Stent migration was noted in 28 of 70 patients (40%), most commonly seen in refractory benign strictures. This is one of the largest studies to date of esophageal stents to treat benign esophageal diseases. Success rates are lowest in benign esophageal strictures. These patients have few other options beyond chronic dilations, feeding tubes, and surgery, and fully covered self-expandable metallic stents give patients a chance to have their problem fixed endoscopically and still eat by mouth. Perforations, fistulae, and leaks respond very well to esophageal stenting, and stenting should be considered as a first-line therapy in these settings.
Practicability of hygienic wrapping of touchscreen operated mobile devices in a clinical setting.
Hammon, Matthias; Kunz, Bernd; Dinzl, Veronika; Kammerer, Ferdinand J; Schwab, Siegfried A; Bogdan, Christian; Uder, Michael; Schlechtweg, Philipp M
2014-01-01
To prove effectiveness of wrapping tablet computers in order to reduce microbiological contamination and to evaluate whether a plastic bag-covered tablet leads to impaired user satisfaction or touchscreen functionality. Within a period of 11 days 115 patients were provided with a tablet computer while waiting for their magnetic resonance imaging examination. Every day the contamination of the surface of the tablet was determined before the first and after the final use. Before the device was handed over to a patient, it was enclosed in a customized single-use plastic bag, which was analyzed for bacterial contamination after each use. A questionnaire was applied to determine whether the plastic bag impairs the user satisfaction and the functionality of the touchscreen. Following the use by patients the outside of the plastic bags was found to be contaminated with various bacteria (657.5 ± 368.5 colony forming units/day); some of them were potentially pathogenic. In contrast, the plastic bag covered surface of the tablet was significantly less contaminated (1.7 ± 1.9 colony forming units/day). Likewise, unused plastic bags did not show any contamination. 11% of the patients reported problems with the functionality of the touchscreen. These patients admitted that they had never used a tablet or a smartphone before. Tablets get severely contaminated during usage in a clinical setting. Wrapping with a customized single-use plastic bag significantly reduces microbiological contamination of the device, protects patients from the acquisition of potentially pathogenic bacteria and hardly impairs the user satisfaction and the functionality of the touchscreen.
Practicability of Hygienic Wrapping of Touchscreen Operated Mobile Devices in a Clinical Setting
Hammon, Matthias; Kunz, Bernd; Dinzl, Veronika; Kammerer, Ferdinand J.; Schwab, Siegfried A.; Bogdan, Christian; Uder, Michael; Schlechtweg, Philipp M.
2014-01-01
Background: To prove effectiveness of wrapping tablet computers in order to reduce microbiological contamination and to evaluate whether a plastic bag-covered tablet leads to impaired user satisfaction or touchscreen functionality. Materials and Methods: Within a period of 11 days 115 patients were provided with a tablet computer while waiting for their magnetic resonance imaging examination. Every day the contamination of the surface of the tablet was determined before the first and after the final use. Before the device was handed over to a patient, it was enclosed in a customized single-use plastic bag, which was analyzed for bacterial contamination after each use. A questionnaire was applied to determine whether the plastic bag impairs the user satisfaction and the functionality of the touchscreen. Results: Following the use by patients the outside of the plastic bags was found to be contaminated with various bacteria (657.5 ± 368.5 colony forming units/day); some of them were potentially pathogenic. In contrast, the plastic bag covered surface of the tablet was significantly less contaminated (1.7 ± 1.9 colony forming units/day). Likewise, unused plastic bags did not show any contamination. 11% of the patients reported problems with the functionality of the touchscreen. These patients admitted that they had never used a tablet or a smartphone before. Conclusions: Tablets get severely contaminated during usage in a clinical setting. Wrapping with a customized single-use plastic bag significantly reduces microbiological contamination of the device, protects patients from the acquisition of potentially pathogenic bacteria and hardly impairs the user satisfaction and the functionality of the touchscreen. PMID:25180580
Using modern analogues to reconstruct past landcover
NASA Astrophysics Data System (ADS)
Brewer, Simon
2016-04-01
The physical cover of the earth plays an important role in the earth system. It affects the climate through feedbacks such as albedo and surface roughness, forms part of the carbon cycle as both sink and source and is both affected by and can affect human societies. Reconstructing past changes in land use and land cover helps to understand how these interactions may have changed over time, and provides important boundary conditions for paleoclimate models. Pollen assemblages, extracted from sedimentary sequences, provide one of the most abundant sources of information about past changes in land cover over the Holocene period. However, the relationship between plant cover and sedimentary pollen abundance is complex and non-linear, being affected by differential dispersal, production and taxonomic resolution. One method to correct for this and provide quantified estimates of past land cover is to calibrate modern pollen assemblages against contemporary remotely sensed estimates of land cover. Results will be presented from developing such a calibration for a set of European modern pollen samples and AVHRR-based tree cover estimates. An emphasis will be placed on the output of validation tests of the calibration, and what this indicates for the predictive skill of this approach. The calibration will then be applied to a set of pollen sequences for the European continent for the past 11,000 years, and the patterns of reconstructed land cover will be discussed.
Individualized Math Problems in Percent. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. This volume includes problems concerned with computing percents.…
Individualized Math Problems in Algebra. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic, and contains problems related to diverse vocations. Solutions are provided for all problems. Problems presented in this package concern ratios used in food…
Individualized Math Problems in Fractions. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. This package contains problems involving computation with common…
Individualized Math Problems in Geometry. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. The volume contains problems in applied geometry. Measurement of…
Individualized Math Problems in Measurement and Conversion. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. This volume includes problems involving measurement, computation of…
Individualized Math Problems in Integers. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. This volume presents problems involving operations with positive and…
Li, Mingzhong; Xue, Jianquan; Li, Yanchao; Tang, Shukai
2014-01-01
Considering the influence of particle shape and the rheological properties of fluid, two artificial intelligence methods (Artificial Neural Network and Support Vector Machine) were used to predict the wall factor, which is widely introduced to deduce the net hydrodynamic drag force of confining boundaries on settling particles. 513 data points were culled from the experimental data of previous studies and divided into a training set and a test set. Particles with various shapes were divided into three kinds: sphere, cylinder, and rectangular prism; feature parameters of each kind of particle were extracted; and prediction models for spheres and cylinders using an artificial neural network were established. Due to the small number of rectangular prism samples, a support vector machine, which is better suited to small-sample problems, was used to predict the wall factor. The characteristic dimension was presented to describe the shape and size of the diverse particles, and a comprehensive prediction model for particles with arbitrary shapes was established to cover all types of conditions. Comparisons were conducted between the predicted values and the experimental results. PMID:24772024
Investigation of virtual reality concept based on system analysis of conceptual series
NASA Astrophysics Data System (ADS)
Romanova, A.; Shuklin, D. A.; Kalinkina, M. E.; Gotskaya, I. B.; Ponomarev, Y. E.
2018-05-01
The paper covers approaches to the definition of virtual reality from the points of view of the humanities and of technology. Each approach is singled out by analyzing how the concept is perceived, as interpreted by representatives of philosophy, psychology, and sociology. A terminological analysis of the basic concepts is carried out, and their refinement is developed by comparing the concepts of virtuality and virtual reality. From an analysis of selected sources, a number of distinguishing characteristics of the concept are identified and its definition is made more precise. The results combine the interpretations of all approaches into a single definition of virtual reality. The use of a comprehensive approach to the definition of the investigated concept, which allows the object of research to be considered as a set of elements studied with a corresponding set of methods, leads to the conclusion that the concept under study is complex and multifaceted. The authors note that virtual reality technologies have a flexible concept depending on the field of application.
ACS experiment for atmospheric studies on "ExoMars-2016" Orbiter
NASA Astrophysics Data System (ADS)
Korablev, O. I.; Montmessin, F.; Fedorova, A. A.; Ignatiev, N. I.; Shakun, A. V.; Trokhimovskiy, A. V.; Grigoriev, A. V.; Anufreichik, K. A.; Kozlova, T. O.
2015-12-01
ACS is a set of spectrometers for atmospheric studies (Atmospheric Chemistry Suite). It is one of the Russian instruments for the Trace Gas Orbiter (TGO) of the Russian-European "ExoMars" program. The purpose of the experiment is to study the Martian atmosphere in two observation regimes: sensitive trace-gas measurements in solar occultation and monitoring of the atmospheric state in nadir observations. The experiment will allow us to approach global problems of Mars research such as current volcanism, and the modern climate status and its evolution. Also, the experiment is intended to solve the mystery of methane presence in the Martian atmosphere. Spectrometers of the ACS set cover the spectral range from the near IR-range (0.7 μm) to the thermal IR-range (17 μm) with spectral resolution λ/Δλ reaching 50000. The ACS instrument consists of three independent IR spectrometers and an electronics module, all integrated in a single unit with common mechanical, electrical and thermal interfaces. The article gives an overview of scientific tasks and presents the concept of the experiment.
Li, Mingzhong; Zhang, Guodong; Xue, Jianquan; Li, Yanchao; Tang, Shukai
2014-01-01
Considering the influence of particle shape and the rheological properties of fluid, two artificial intelligence methods (Artificial Neural Network and Support Vector Machine) were used to predict the wall factor, which is widely introduced to deduce the net hydrodynamic drag force of confining boundaries on settling particles. 513 data points were culled from the experimental data of previous studies and divided into a training set and a test set. Particles with various shapes were divided into three kinds: sphere, cylinder, and rectangular prism; feature parameters of each kind of particle were extracted; and prediction models for spheres and cylinders using an artificial neural network were established. Due to the small number of rectangular prism samples, a support vector machine, which is better suited to small-sample problems, was used to predict the wall factor. The characteristic dimension was presented to describe the shape and size of the diverse particles, and a comprehensive prediction model for particles with arbitrary shapes was established to cover all types of conditions. Comparisons were conducted between the predicted values and the experimental results.
The role of taxonomy in species conservation.
Mace, Georgina M
2004-01-01
Taxonomy and species conservation are often assumed to be completely interdependent activities. However, a shortage of taxonomic information and skills, and confusion over where the limits to 'species' should be set, both cause problems for conservationists. There is no simple solution because species lists used for conservation planning (e.g. threatened species, species richness estimates, species covered by legislation) are often also used to determine which units should be the focus of conservation actions; this despite the fact that the two processes have such different goals and information needs. Species conservation needs two kinds of taxonomic solution: (i) a set of practical rules to standardize the species units included on lists; and (ii) an approach to the units chosen for conservation recovery planning which recognizes the dynamic nature of natural systems and the differences from the units in listing processes that result. These solutions are well within our grasp but require a new kind of collaboration among conservation biologists, taxonomists and legislators, as well as an increased resource of taxonomists with relevant and high-quality skills. PMID:15253356
Signal Detection and Monitoring Based on Longitudinal Healthcare Data
Suling, Marc; Pigeot, Iris
2012-01-01
Post-marketing detection and surveillance of potential safety hazards are crucial tasks in pharmacovigilance. To uncover such safety risks, a wide set of techniques has been developed for spontaneous reporting data and, more recently, for longitudinal data. This paper gives a broad overview of the signal detection process and introduces some types of data sources typically used. The most commonly applied signal detection algorithms are presented, covering simple frequentist methods like the proportional reporting rate or the reporting odds ratio, more advanced Bayesian techniques for spontaneous and longitudinal data, e.g., the Bayesian Confidence Propagation Neural Network or the Multi-item Gamma-Poisson Shrinker, and methods developed for longitudinal data only, like the IC temporal pattern detection. Additionally, the problem of adjustment for underlying confounding is discussed and the most common strategies to automatically identify false-positive signals are addressed. A drug monitoring technique based on Wald's sequential probability ratio test is presented. For each method, a real-life application is given, and a wide set of literature for further reading is referenced. PMID:24300373
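The proportional reporting rate and reporting odds ratio mentioned above are both computed from a 2x2 contingency table of spontaneous reports. A minimal sketch of the standard formulas follows; the counts are hypothetical, for illustration only:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 report table:
       a: reports with the drug and the event of interest
       b: reports with the drug and other events
       c: reports with other drugs and the event
       d: reports with other drugs and other events"""
    return (a / (a + b)) / (c / (c + d))

def ror(a, b, c, d):
    """Reporting odds ratio for the same 2x2 table."""
    return (a * d) / (b * c)

# Hypothetical counts: 200 reports mention the drug, 20 of them the event.
a, b, c, d = 20, 180, 40, 1960
print(prr(a, b, c, d))  # (20/200) / (40/2000) = 5.0
print(ror(a, b, c, d))
```

A PRR (or ROR) well above 1, typically combined with a minimum report count and a significance criterion, flags a drug-event pair for further review.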
Remote sensing techniques in cultural resource management archaeology
NASA Astrophysics Data System (ADS)
Johnson, Jay K.; Haley, Bryan S.
2003-04-01
Cultural resource management archaeology in the United States concerns compliance with legislation set in place to protect archaeological resources from the impact of modern activities. Traditionally, surface collection, shovel testing, test excavation, and mechanical stripping are used in these projects. These methods are expensive, time consuming, and may poorly represent the features within archaeological sites. The use of remote sensing techniques in cultural resource management archaeology may provide an answer to these problems. Near-surface geophysical techniques, including magnetometry, resistivity, electromagnetics, and ground penetrating radar, have proven to be particularly successful at efficiently locating archaeological features. Research has also indicated airborne and satellite remote sensing may hold some promise in the future for large-scale archaeological survey, although this is difficult in many areas of the world where ground cover reflects archaeological features only indirectly. A cost simulation of a hypothetical data recovery project on a large complex site in Mississippi is presented to illustrate the potential advantages of remote sensing in a cultural resource management setting. The results indicate these techniques can save a substantial amount of time and money for these projects.
Racking Response of Reinforced Concrete Cut and Cover Tunnel
DOT National Transportation Integrated Search
2016-01-01
Currently, the knowledge base and quantitative data sets concerning cut and cover tunnel seismic response are scarce. In this report, a large-scale experimental program is conducted to assess: i) stiffness, capacity, and potential seismically-induced...
NASA Technical Reports Server (NTRS)
Cibula, W. G.
1981-01-01
Four LANDSAT frames, each corresponding to one of the four seasons, were spectrally classified and processed using NASA-developed computer programs. One data set was selected, or two or more data sets were merged, to improve surface cover classifications. Selected areas representing each spectral class were chosen and transferred to USGS 1:62,500 topographic maps for field use. Ground truth data were gathered to verify the accuracy of the classifications. Acreages were computed for each of the land cover types. The application of elevational data to seasonal LANDSAT frames resulted in the separation of high elevation meadows (both with and without recently emergent perennial vegetation) as well as areas in oak forests which have an evergreen understory as opposed to other areas which do not.
Sharpe, Jennifer B.; Soong, David T.
2015-01-01
This study used the National Land Cover Dataset (NLCD) and developed an automated process for determining the area of the three land cover types, thereby allowing faster updating of future models, and for evaluating land cover changes by use of historical NLCD datasets. The study also carried out a raingage partitioning analysis so that the segmentation of land cover and rainfall in each modeled unit is directly applicable to the HSPF modeling. Historical and existing impervious, grass, and forest land acreages partitioned by percentages covered by two sets of raingages for the Lake Michigan diversion SCAs, gaged basins, and ungaged basins are presented.
NASA Technical Reports Server (NTRS)
Jackson, C. E., Jr.
1977-01-01
A sample problem library containing 20 problems covering most facets of Nastran Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.
Optimization of Self-Directed Target Coverage in Wireless Multimedia Sensor Network
Yang, Yang; Wang, Yufei; Pi, Dechang; Wang, Ruchuan
2014-01-01
Video and image sensors in wireless multimedia sensor networks (WMSNs) have a directed view and a limited sensing angle, so methods for the target coverage problem in traditional sensor networks, which assume a circular sensing model, are not suitable for WMSNs. Based on the proposed FoV (field of view) sensing model and FoV disk model, the expected coverage of a target by a multimedia sensor is defined by the deflection angle between the target and the sensor's current orientation and by the distance between target and sensor. Target coverage optimization algorithms based on the expected coverage value are then presented separately for the single-sensor single-target, multisensor single-target, and single-sensor multitarget problems. For the multisensor multitarget problem, which is NP-complete, candidate orientations are generated by selecting, for each sensor, the orientations that cover the targets falling in that sensor's FoV disk, and a genetic algorithm then yields an approximate minimum subset of sensors that covers all the targets in the network. Simulation results show the algorithms' performance and the effect of the number of targets on the resulting subset. PMID:25136667
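The directional coverage rule described above (a deflection-angle bound combined with a distance bound) can be sketched as follows. The function name and the degree-based angle convention are illustrative assumptions, not the paper's notation:

```python
import math

def covers(sensor_pos, orient_deg, radius, half_fov_deg, target):
    """Directional (FoV) sensing model sketch: the target is covered when it
    lies within the sensing radius AND the deflection between the sensor's
    orientation and the direction to the target is at most the half FoV angle."""
    dx = target[0] - sensor_pos[0]
    dy = target[1] - sensor_pos[1]
    dist = math.hypot(dx, dy)
    if dist > radius or dist == 0:
        return dist == 0  # out of range, or the target sits on the sensor
    bearing = math.degrees(math.atan2(dy, dx))
    # wrap the angular difference into [-180, 180] before taking |.|
    deflection = abs((bearing - orient_deg + 180) % 360 - 180)
    return deflection <= half_fov_deg

print(covers((0, 0), 0, 10, 30, (5, 2)))   # bearing ~21.8 deg -> covered
print(covers((0, 0), 0, 10, 30, (0, 5)))   # bearing 90 deg -> outside the FoV
```

In the optimization algorithms, a rule like this decides which targets fall inside a candidate orientation's field of view before an expected coverage value is assigned.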
Statistical mechanics of the vertex-cover problem
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2003-10-01
We review recent progress in the study of the vertex-cover problem (VC). The VC belongs to the class of NP-complete graph theoretical problems, which plays a central role in theoretical computer science. On ensembles of random graphs, VC exhibits a coverable-uncoverable phase transition. Very close to this transition, depending on the solution algorithm, easy-hard transitions in the typical running time of the algorithms occur. We explain a statistical mechanics approach, which works by mapping the VC to a hard-core lattice gas, and then applying techniques such as the replica trick or the cavity approach. Using these methods, the phase diagram of the VC could be obtained exactly for connectivities c < e, where the VC is replica symmetric. Recently, this result could be confirmed using traditional mathematical techniques. For c > e, the solution of the VC exhibits full replica symmetry breaking. The statistical mechanics approach can also be used to study analytically the typical running time of simple complete and incomplete algorithms for the VC. Finally, we describe recent results for the VC when studied on other ensembles of finite- and infinite-dimensional graphs.
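Alongside the statistical-mechanics analysis reviewed above, the standard constructive baseline for the vertex-cover problem is the matching-based 2-approximation. A minimal sketch (this is the textbook algorithm, not the replica or cavity machinery of the paper):

```python
def vertex_cover_2approx(edges):
    """Matching-based 2-approximation for minimum vertex cover: greedily take
    both endpoints of any still-uncovered edge (i.e. build a maximal matching),
    which yields a cover at most twice the optimal size."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Small example graph: a 4-cycle with one chord.
edges = [(1, 2), (2, 3), (3, 4), (4, 1), (2, 4)]
cover = vertex_cover_2approx(edges)
# Every edge must have at least one endpoint in the cover.
assert all(u in cover or v in cover for u, v in edges)
```

Simple complete and incomplete algorithms like this one are exactly the kind whose typical running time the statistical-mechanics approach analyzes near the coverable-uncoverable transition.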
NASA Astrophysics Data System (ADS)
Ebrahimi Zade, Amir; Sadegheih, Ahmad; Lotfi, Mohammad Mehdi
2014-07-01
Hubs are centers for collection, rearrangement, and redistribution of commodities in transportation networks. In this paper, non-linear multi-objective formulations for single and multiple allocation hub maximal covering problems, as well as their linearized versions, are proposed. The formulations substantially mitigate the complexity of the existing models due to the smaller number of constraints and variables. Uncertain shipments are also studied in the context of hub maximal covering problems. In many real-world applications, any link on the path from origin to destination may fail to work due to disruption. Therefore, in the proposed bi-objective model, maximizing the safety of the weakest path in the network is considered as a second objective together with the traditional maximum coverage goal. Furthermore, to solve the bi-objective model, a modified version of NSGA-II with a new dynamic immigration operator is developed, in which the exact number of immigrants depends on the results of the other two common NSGA-II operators, i.e. mutation and crossover. Besides validating the proposed models, computational results confirm better performance of the modified NSGA-II versus the traditional one.
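The maximal covering objective above is submodular, so a simple greedy baseline is a natural reference point. This sketch uses a hypothetical instance and is not the paper's NSGA-II approach:

```python
def greedy_max_cover(cover_sets, demand, p):
    """Greedy heuristic for the maximal covering problem: with a budget of p
    facilities, repeatedly open the one adding the most uncovered demand.
    For this submodular objective the greedy gives a (1 - 1/e)-approximation."""
    covered = set()
    opened = []
    for _ in range(p):
        best = max(
            (i for i in range(len(cover_sets)) if i not in opened),
            key=lambda i: sum(demand[j] for j in cover_sets[i] - covered),
        )
        opened.append(best)
        covered |= cover_sets[best]
    return opened, sum(demand[j] for j in covered)

# Hypothetical instance: 3 candidate hubs covering demand nodes 0..4.
cover_sets = [{0, 1, 2}, {2, 3}, {3, 4}]
demand = [5, 1, 1, 4, 2]
opened, total = greedy_max_cover(cover_sets, demand, p=2)
```

With these numbers the greedy opens hubs 0 and 2, covering all 13 units of demand; metaheuristics such as NSGA-II become attractive once a second objective (here, path safety) must be traded off against coverage.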
Query-Adaptive Reciprocal Hash Tables for Nearest Neighbor Search.
Liu, Xianglong; Deng, Cheng; Lang, Bo; Tao, Dacheng; Li, Xuelong
2016-02-01
Recent years have witnessed the success of binary hashing techniques in approximate nearest neighbor search. In practice, multiple hash tables are usually built using hashing to cover more desired results in the hit buckets of each table. However, little work has studied a unified approach to constructing multiple informative hash tables using any type of hashing algorithm. Meanwhile, multiple table search also lacks a generic query-adaptive and fine-grained ranking scheme that can alleviate the binary quantization loss suffered in standard hashing techniques. To solve the above problems, in this paper, we first regard table construction as a selection problem over a set of candidate hash functions. With the graph representation of the function set, we propose an efficient solution that sequentially applies normalized dominant set to finding the most informative and independent hash functions for each table. To further reduce the redundancy between tables, we explore the reciprocal hash tables in a boosting manner, where the hash function graph is updated with high weights emphasized on the misclassified neighbor pairs of previous hash tables. To refine the ranking of the retrieved buckets within a certain Hamming radius from the query, we propose a query-adaptive bitwise weighting scheme to enable fine-grained bucket ranking in each hash table, exploiting the discriminative power of its hash functions and their complement for nearest neighbor search. Moreover, we integrate such a scheme into multiple table search using a fast, yet reciprocal table lookup algorithm within the adaptive weighted Hamming radius. Both the construction method and the query-adaptive search method are general and compatible with different types of hashing algorithms using different feature spaces and/or parameter settings. Our extensive experiments on several large-scale benchmarks demonstrate that the proposed techniques can significantly outperform both the naive construction methods and the state-of-the-art hashing algorithms.
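Multi-table lookup followed by a finer re-ranking of the hit buckets, as discussed above, can be sketched as follows. The chunked-key table construction and the plain Hamming-distance re-ranking are generic stand-ins, not the paper's dominant-set construction or bitwise weighting scheme:

```python
from collections import defaultdict

def build_tables(codes, n_tables=4, bits_per_table=8):
    """Index binary codes (ints) into several hash tables keyed on disjoint
    bit chunks; an item sharing any chunk with the query lands in a hit bucket."""
    tables = [defaultdict(list) for _ in range(n_tables)]
    for idx, code in enumerate(codes):
        for t in range(n_tables):
            key = (code >> (t * bits_per_table)) & ((1 << bits_per_table) - 1)
            tables[t][key].append(idx)
    return tables

def query(tables, codes, q, n_tables=4, bits_per_table=8):
    """Union the hit buckets across tables, then re-rank candidates by full
    Hamming distance to the query (a coarse stand-in for bitwise weighting)."""
    candidates = set()
    for t in range(n_tables):
        key = (q >> (t * bits_per_table)) & ((1 << bits_per_table) - 1)
        candidates.update(tables[t].get(key, []))
    return sorted(candidates, key=lambda i: bin(codes[i] ^ q).count("1"))

codes = [0xDEADBEEF, 0xDEADBEEE, 0x12345678, 0xDEAD0000]
tables = build_tables(codes)
result = query(tables, codes, 0xDEADBEEF)
```

For this toy set, the query hits items 0, 1, and 3 (item 2 shares no 8-bit chunk with the query), and the final Hamming re-ranking orders them `[0, 1, 3]`; the paper's contribution is choosing the hash functions per table and weighting the bits, rather than this fixed chunking.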
Afghanistan Glacier Diminution
NASA Astrophysics Data System (ADS)
Shroder, J. F.; Bishop, M.; Haritashya, U.; Olsenholler, J.
2008-12-01
Glaciers in Afghanistan represent a late summer - early fall source of melt water for late season crop irrigation in a chronically drought-torn region. Precise river discharge figures associated with glacierized drainage basins are generally unavailable because of the destruction of hydrological gauging stations built in pre-war times although historic discharge data and prior (1960s) mapped glacier regions offer some analytical possibilities. The best satellite data sets for glacier-change detection are declassified Cornona and Keyhole satellite data sets, standard Landsat sources, and new ASTER images assessed in our GLIMS (Global Land Ice Measurements from Space) Regional Center for Southwest Asia (Afghanistan and Pakistan). The new hyperspectral remote sensing survey of Afghanistan completed by the US Geological Survey and the Afghanistan Ministry of Mines offers potential for future detailed assessments. Long-term climate change in southwest Asia has decreased precipitation for millennia so that glaciers, rivers and lakes have all declined from prehistoric and historic highs. As many glaciers declined in ice volume, they increased in debris cover until they were entirely debris-covered or became rock glaciers, and the ice was protected thereby from direct solar radiation, to presumably reduce ablation rates. We have made a preliminary assessment of glacier location and extent for the country, with selected, more-detailed, higher-resolution studies underway. In the Great Pamir of the Wakhan Corridor where the largest glaciers occur, we assessed fluctuations of a randomly selected 30 glaciers from 1976 to 2003. Results indicate that 28 glacier-terminus positions have retreated, and the largest average retreat rate was 36 m/yr. High albedo, non-vegetated glacier forefields formed prior to 1976, and geomorphological evidence shows apparent glacier-surface downwasting after 1976. 
Climatic conditions and glacier retreat have resulted in the disconnection of tributary glaciers from their main trunks, the formation of high-altitude lakes, and an increased frequency and size of proglacial lakes that are, however, generally unavailable as irrigation sources. Similar conditions of glacier diminution have occurred in almost all other high-altitude parts of the country. Generally decreased precipitation in all seasons, coupled with decreased glacier storage of potential melt water, augurs continued severe problems for beleaguered Afghan agriculture, along with concomitant social problems.
International Spinal Cord Injury Data Sets for non-traumatic spinal cord injury.
New, P W; Marshall, R
2014-02-01
Multifaceted: extensive discussions at workshop and conference presentations, survey of experts and feedback. Present the background, purpose and development of the International Spinal Cord Injury (SCI) Data Sets for Non-Traumatic SCI (NTSCI), including a hierarchical classification of aetiology. International. Consultation via e-mail, presentations and discussions at ISCoS conferences (2006-2009), and a workshop (1 September 2008). The consultation processes aimed to: (1) clarify aspects of the classification structure, (2) determine placement of certain aetiologies and identify important missing causes of NTSCI and (3) resolve coding issues and refine definitions. Every effort was made to consider feedback and suggestions from participants. The International Data Sets for NTSCI include a basic and an extended version. The extended data set includes a two-axis classification system for the causes of NTSCI. Axis 1 consists of a five-level, two-tier (congenital-genetic and acquired) hierarchy that allows for increasing detail to specify the aetiology. Axis 2 uses the International Statistical Classification of Diseases (ICD) and Related Health Problems for coding the initiating disease(s) that may have triggered the events that resulted in the axis 1 diagnosis, where appropriate. Additional items cover the timeframe of onset of NTSCI symptoms and the presence of iatrogenicity. Complete instructions for data collection, the data sheet and training cases are available at the websites of ISCoS (http://www.iscos.org.uk) and ASIA (http://www.asia-spinalinjury.org). The data sets should facilitate comparative research involving NTSCI participants, especially epidemiological studies and prevention projects. Further work is anticipated to refine the data sets, particularly regarding iatrogenicity.
BOREAS Regional Soils Data in Raster Format and AEAC Projection
NASA Technical Reports Server (NTRS)
Monette, Bryan; Knapp, David; Hall, Forrest G. (Editor); Nickeson, Jaime (Editor)
2000-01-01
This data set was gridded by BOREAS Information System (BORIS) Staff from a vector data set received from the Canadian Soil Information System (CanSIS). The original data came in two parts that covered Saskatchewan and Manitoba. The data were gridded and merged into one data set of 84 files covering the BOREAS region. The data were gridded into the AEAC projection. Because the mapping of the two provinces was done separately in the original vector data, there may be discontinuities in some of the soil layers because of different interpretations of certain soil properties. The data are stored in binary, image format files.
ERIC Educational Resources Information Center
Erbas, Dilek; Turan, Yasemin; Ozen, Arzu; Halle, James W.
2006-01-01
The purpose of the present study was to assess the effectiveness of the "cover write" method of teaching word-naming and spelling to two Turkish students with developmental disabilities. A multiple-probe design across three, 5-word sets was employed to assess the effectiveness of the intervention. The "cover write" method was…
Paul Dunham; Dale Weyermann; Dale Azuma
2002-01-01
Stratifications developed from National Land Cover Data (NLCD) and from photointerpretation (PI) were tested for effectiveness in reducing sampling error associated with estimates of timberland area and volume from FIA plots in western Oregon. Strata were created from NLCD through the aggregation of cover classes and the creation of 'edge' strata by...
NASA Technical Reports Server (NTRS)
Townshend, John R.; Masek, Jeffrey G.; Huang, ChengQuan; Vermote, Eric F.; Gao, Feng; Channan, Saurabh; Sexton, Joseph O.; Feng, Min; Narasimhan, Ramghuram; Kim, Dohyung;
2012-01-01
The compilation of global Landsat data sets and the ever-lowering costs of computing now make it feasible to monitor the Earth's land cover at Landsat resolutions of 30 m. In this article, we describe the methods to create global products of forest cover and cover change at Landsat resolutions. Nevertheless, there are many challenges in ensuring the creation of high-quality products, and we propose various ways in which the challenges can be overcome. Among the challenges are the need for atmospheric correction, incorrect calibration coefficients in some of the data sets, the different phenologies between compilations, the need for terrain correction, the lack of consistent reference data for training and accuracy assessment, and the need for highly automated characterization and change detection. We propose and evaluate the creation and use of surface reflectance products, improved selection of scenes to reduce phenological differences, terrain illumination correction, automated training selection, and the use of information extraction procedures robust to errors in training data, along with several other issues. At several stages we use Moderate Resolution Imaging Spectroradiometer (MODIS) data and products to assist our analysis. A global working prototype product of forest cover and forest cover change is included.
Classification of surface types using SIR-C/X-SAR, Mount Everest Area, Tibet
Albright, Thomas P.; Painter, Thomas H.; Roberts, Dar A.; Shi, Jiancheng; Dozier, Jeff; Fielding, Eric
1998-01-01
Imaging radar is a promising tool for mapping snow and ice cover in alpine regions. It combines a high-resolution, day or night, all-weather imaging capability with sensitivity to hydrologic and climatic snow and ice parameters. We use the spaceborne imaging radar-C/X-band synthetic aperture radar (SIR-C/X-SAR) to map snow and glacial ice on the rugged north slope of Mount Everest. From interferometrically derived digital elevation data, we compute the terrain calibration factor and cosine of the local illumination angle. We then process and terrain-correct radar data sets acquired on April 16, 1994. In addition to the spectral data, we include surface slope to improve discrimination among several surface types. These data sets are then used in a decision tree to generate an image classification. This method is successful in identifying and mapping scree/talus, dry snow, dry snow-covered glacier, wet snow-covered glacier, and rock-covered glacier, as corroborated by comparison with existing surface cover maps and other ancillary information. Application of the classification scheme to data acquired on October 7 of the same year yields accurate results for most surface types but underreports the extent of dry snow cover.
Mathematical Foundation for Plane Covering Using Hexagons
NASA Technical Reports Server (NTRS)
Johnson, Gordon G.
1999-01-01
This work presents the development and mathematical underpinnings of the algorithms previously developed for covering the plane and for addressing the elements of the covering. The algorithms are of interest in that they provide a simple, systematic way of increasing or decreasing resolution, in the sense that if we have the covering in place and there is an image superimposed upon the covering, then we may view the image in a rough form or in a very detailed form with minimal effort. Such ability allows for quick searches of crude forms to determine a class in which to make a detailed search. In addition, the addressing algorithms provide an efficient way to process large data sets that have related subsets. The algorithms produced were based in part upon the work of D. Lucas, "A Multiplication in N Space," which suggested a set of three vectors, any two of which would serve as a basis for the plane, and also that the hexagon is the natural geometric object to be used in a covering with the suggested basis. The second portion is a refinement of the eyeball vision system, the globular viewer.
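The three-vector construction referenced in the abstract above can be illustrated with a small sketch. This is a hedged illustration only: the coordinates, the `hex_center` function, and the integer addressing scheme here are hypothetical stand-ins, not the paper's actual addressing algorithm.

```python
import math

# Three unit vectors at 120-degree spacing; any two span the plane
# (after D. Lucas, "A Multiplication in N Space"). Note U + V + W = (0, 0).
U = (1.0, 0.0)
V = (-0.5, math.sqrt(3) / 2)
W = (-0.5, -math.sqrt(3) / 2)

def hex_center(i, j, basis=(U, V)):
    """Cartesian center of the hexagon at integer address (i, j),
    using two of the three vectors as a basis."""
    (ux, uy), (vx, vy) = basis
    return (i * ux + j * vx, i * uy + j * vy)

# Integer addresses map to hexagon centers on a triangular lattice;
# coarser/finer coverings reuse the same addressing at a different scale.
x, y = hex_center(2, 1)
print(x, y)  # → 1.5 0.8660254037844386
```

The redundancy of the third vector (W = -U - V) is what makes any pair of the three an equally valid basis for the addressing.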
Learning Activity Package, Algebra 93-94, LAPs 12-22.
ERIC Educational Resources Information Center
Evans, Diane
A set of 11 teacher-prepared Learning Activity Packages (LAPs) in beginning algebra, these units cover sets, properties of operations, operations over real numbers, open expressions, solution sets of equations and inequalities, equations and inequalities with two variables, solution sets of equations with two variables, exponents, factoring and…
Yi, S.; Li, N.; Xiang, B.; Wang, X.; Ye, B.; McGuire, A.D.
2013-01-01
Soil surface temperature is a critical boundary condition for the simulation of soil temperature by environmental models. It is influenced by atmospheric and soil conditions and by vegetation cover. In sophisticated land surface models, it is simulated iteratively by solving surface energy budget equations. In ecosystem, permafrost, and hydrology models, the consideration of soil surface temperature is generally simple. In this study, we developed a methodology for representing the effects of vegetation cover and atmospheric factors on the estimation of soil surface temperature for alpine grassland ecosystems on the Qinghai-Tibetan Plateau. Our approach integrated measurements from meteorological stations with simulations from a sophisticated land surface model to develop an equation set for estimating soil surface temperature. After implementing this equation set into an ecosystem model and evaluating the performance of the ecosystem model in simulating soil temperature at different depths in the soil profile, we applied the model to simulate interactions among vegetation cover, freeze-thaw cycles, and soil erosion to demonstrate potential applications made possible through the implementation of the methodology developed in this study. Results showed that (1) to properly estimate daily soil surface temperature, algorithms should use air temperature, downward solar radiation, and vegetation cover as independent variables; (2) the equation set developed in this study performed better than soil surface temperature algorithms used in other models; and (3) the ecosystem model performed well in simulating soil temperature throughout the soil profile using the equation set developed in this study. 
Our application of the model indicates that the representation in ecosystem models of the effects of vegetation cover on the simulation of soil thermal dynamics has the potential to substantially improve our understanding of the vulnerability of alpine grassland ecosystems to changes in climate and grazing regimes.
NASA Astrophysics Data System (ADS)
Yi, S.; Li, N.; Xiang, B.; Wang, X.; Ye, B.; McGuire, A. D.
2013-07-01
Soil surface temperature is a critical boundary condition for the simulation of soil temperature by environmental models. It is influenced by atmospheric and soil conditions and by vegetation cover. In sophisticated land surface models, it is simulated iteratively by solving surface energy budget equations. In ecosystem, permafrost, and hydrology models, the consideration of soil surface temperature is generally simple. In this study, we developed a methodology for representing the effects of vegetation cover and atmospheric factors on the estimation of soil surface temperature for alpine grassland ecosystems on the Qinghai-Tibetan Plateau. Our approach integrated measurements from meteorological stations with simulations from a sophisticated land surface model to develop an equation set for estimating soil surface temperature. After implementing this equation set into an ecosystem model and evaluating the performance of the ecosystem model in simulating soil temperature at different depths in the soil profile, we applied the model to simulate interactions among vegetation cover, freeze-thaw cycles, and soil erosion to demonstrate potential applications made possible through the implementation of the methodology developed in this study. Results showed that (1) to properly estimate daily soil surface temperature, algorithms should use air temperature, downward solar radiation, and vegetation cover as independent variables; (2) the equation set developed in this study performed better than soil surface temperature algorithms used in other models; and (3) the ecosystem model performed well in simulating soil temperature throughout the soil profile using the equation set developed in this study.
Our application of the model indicates that the representation in ecosystem models of the effects of vegetation cover on the simulation of soil thermal dynamics has the potential to substantially improve our understanding of the vulnerability of alpine grassland ecosystems to changes in climate and grazing regimes.
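The abstract above states that daily soil surface temperature algorithms should use air temperature, downward solar radiation, and vegetation cover as independent variables. A minimal sketch of fitting such an empirical equation set by least squares, assuming synthetic data: the functional form, the coefficients, and the variable names below are illustrative assumptions, not the paper's published equations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
t_air = rng.uniform(-10, 15, n)   # daily mean air temperature (deg C), synthetic
rad = rng.uniform(50, 300, n)     # downward solar radiation (W/m^2), synthetic
lai = rng.uniform(0.0, 3.0, n)    # vegetation cover proxy (leaf area index), synthetic

# Synthetic "truth": radiation warms the surface; vegetation cover damps
# that warming (an assumed interaction, for illustration only).
t_surf = 1.2 * t_air + 0.02 * rad * np.exp(-0.5 * lai) + rng.normal(0, 0.5, n)

# Fit a linear equation set: T_surf ≈ a*T_air + b*R + c*LAI + d
X = np.column_stack([t_air, rad, lai, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, t_surf, rcond=None)
pred = X @ coef
rmse = float(np.sqrt(np.mean((pred - t_surf) ** 2)))
print(coef, rmse)
```

Because the synthetic truth contains a radiation-vegetation interaction, a purely linear fit leaves some residual; the study's actual equation set would be calibrated against station measurements and land surface model output rather than synthetic data.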
Complex fuzzy soft expert sets
NASA Astrophysics Data System (ADS)
Selvachandran, Ganeshsree; Hafeed, Nisren A.; Salleh, Abdul Razak
2017-04-01
Complex fuzzy sets and their accompanying theory, although in their infancy, have proven superior to classical type-1 fuzzy sets, due to their ability to represent time-periodic problem parameters and to capture the seasonality of the fuzziness that exists in the elements of a set. These are important characteristics that are pervasive in most real-world problems. However, there are two major problems inherent in complex fuzzy sets: they lack a sufficient parameterization tool, and they have no mechanism to validate the values assigned to the membership functions of the elements in a set. To overcome these problems, we propose the notion of complex fuzzy soft expert sets, a hybrid model of complex fuzzy sets and soft expert sets. This model incorporates the advantages of complex fuzzy sets and soft sets, besides having the added advantage of allowing users to know the opinions of all the experts in a single model without the need for any additional cumbersome operations. As such, this model effectively improves the accuracy of representation of problem parameters that are periodic in nature, besides having a higher level of computational efficiency compared to similar models in the literature.
NASA Astrophysics Data System (ADS)
Mugo, R. M.; Limaye, A. S.; Nyaga, J. W.; Farah, H.; Wahome, A.; Flores, A.
2016-12-01
The water quality of inland lakes is largely influenced by land use and land cover changes within the lake's catchment. In Africa, some of the major land use changes are driven by a number of factors, which include urbanization, intensification of agricultural practices, unsustainable farm management practices, deforestation, land fragmentation and degradation. Often, the impacts of these factors are observable in changes in the land cover, and eventually in the hydrological systems. When the natural vegetation cover is reduced or changed, the surface water flow patterns and the water and nutrient retention capacities are also changed. This can lead to high nutrient inputs into lakes, leading to eutrophication, siltation and infestation of floating aquatic vegetation. To assess the relationship between land use and land cover changes in part of the Lake Victoria Basin, a series of land cover maps were derived from Landsat imagery. Changes in land cover were identified through change maps and statistics. Further, the surface water chlorophyll-a concentration and turbidity were derived from MODIS-Aqua data for Lake Victoria. Chlorophyll-a and turbidity are good proxy indicators of nutrient inputs and siltation, respectively. The trends in chlorophyll-a and turbidity concentrations were analyzed and compared to the land cover changes over time. Certain land cover changes related to agriculture and urban development were clearly identifiable. While these changes might not be solely responsible for variability in chlorophyll-a and turbidity concentrations in the lake, they are potentially contributing factors to this problem. This work illustrates the importance of addressing watershed degradation while seeking to solve water quality related problems.
Teaching Astronomy in UK Schools
ERIC Educational Resources Information Center
Roche, Paul; Roberts, Sarah; Newsam, Andy; Barclay, Charles
2012-01-01
This article attempts to summarise the good, bad and (occasionally) ugly aspects of teaching astronomy in UK schools. It covers the most common problems reported by teachers when asked about covering the astronomy/space topics in school. Particular focus is given to the GCSE Astronomy qualification offered by Edexcel (which is currently the…
Air Pollution. Part A: Analysis.
ERIC Educational Resources Information Center
Ledbetter, Joe O.
Two facets of the engineering control of air pollution (the analysis of possible problems and the application of effective controls) are covered in this two-volume text. Part A covers Analysis, and Part B, Prevention and Control. (This review is concerned with Part A only.) This volume deals with the terminology, methodology, and symptomatology…
Predictions for snow cover, glaciers and runoff in a changing climate
USDA-ARS?s Scientific Manuscript database
The problem of evaluating the hydrological effects of climate change has opened a new field of applications for snowmelt runoff models. The Snowmelt Runoff Model (SRM) has been used to evaluate climate change effects on basins in North America, the Swiss Alps, and the Himalayas. Snow covered area ...
Studies in matter antimatter separation and in the origin of lunar magnetism
NASA Technical Reports Server (NTRS)
Barker, W. A.; Greeley, R.; Parkin, C.; Aggarwal, H.; Schultz, P.
1975-01-01
A progress report covering lunar and planetary research is introduced. Data cover lunar ionospheric models, lunar and planetary geology, and lunar magnetism. Wind tunnel simulations of Mars aeolian problems and a comparative study of basaltic analogs of lunar and Martian volcanic features are discussed.
NASA Technical Reports Server (NTRS)
Strack, John E.; Pielke, Roger A.; Steyaert, Louis T.; Knox, Robert G.
2008-01-01
Land cover changes alter the near surface weather and climate. Changes in land surface properties such as albedo, roughness length, stomatal resistance, and leaf area index alter the surface energy balance, leading to differences in near surface temperatures. This study utilized a newly developed land cover data set for the eastern United States to examine the influence of historical land cover change on June temperatures and precipitation. The new data set contains representations of the land cover and associated biophysical parameters for 1650, 1850, 1920, and 1992, capturing the clearing of the forest and the expansion of agriculture over the eastern United States from 1650 to the early twentieth century and the subsequent forest regrowth. The data set also includes the inferred distribution of potentially water-saturated soils at each time slice for use in the sensitivity tests. The Regional Atmospheric Modeling System, equipped with the Land Ecosystem-Atmosphere Feedback (LEAF-2) land surface parameterization, was used to simulate the weather of June 1996 using the 1992, 1920, 1850, and 1650 land cover representations. The results suggest that changes in surface roughness and stomatal resistance have caused present-day maximum and minimum temperatures in the eastern United States to warm by about 0.3 C and 0.4 C, respectively, when compared to values in 1650. In contrast, the maximum temperatures have remained about the same, while the minimums have cooled by about 0.1 C when compared to 1920. Little change in precipitation was found.
Strack, John E.; Pielke, Roger A.; Steyaert, Louis T.; Knox, Robert G.
2008-01-01
Land cover changes alter the near surface weather and climate. Changes in land surface properties such as albedo, roughness length, stomatal resistance, and leaf area index alter the surface energy balance, leading to differences in near surface temperatures. This study utilized a newly developed land cover data set for the eastern United States to examine the influence of historical land cover change on June temperatures and precipitation. The new data set contains representations of the land cover and associated biophysical parameters for 1650, 1850, 1920, and 1992, capturing the clearing of the forest and the expansion of agriculture over the eastern United States from 1650 to the early twentieth century and the subsequent forest regrowth. The data set also includes the inferred distribution of potentially water‐saturated soils at each time slice for use in the sensitivity tests. The Regional Atmospheric Modeling System, equipped with the Land Ecosystem‐Atmosphere Feedback (LEAF‐2) land surface parameterization, was used to simulate the weather of June 1996 using the 1992, 1920, 1850, and 1650 land cover representations. The results suggest that changes in surface roughness and stomatal resistance have caused present‐day maximum and minimum temperatures in the eastern United States to warm by about 0.3°C and 0.4°C, respectively, when compared to values in 1650. In contrast, the maximum temperatures have remained about the same, while the minimums have cooled by about 0.1°C when compared to 1920. Little change in precipitation was found.
ERIC Educational Resources Information Center
Lang, Susan S.
2001-01-01
Argues that carpets do not contribute to student complaints of respiratory problems, allergies, and asthma as long as they are properly cleaned with high-efficiency microfiltration bags. Discusses contributions to mite problems made by smooth floor covering compared to carpeting. (GR)
WE-D-213-04: Preparing for Parts 2 & 3 of the ABR Nuclear Medicine Physics Exam
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacDougall, R.
Adequate, efficient preparation for the ABR Diagnostic and Nuclear Medical Physics exams is key to successfully obtaining ABR professional certification. Each part of the ABR exam presents its own challenges: Part I: Determine the scope of basic medical physics study material, efficiently review this material, and solve related written questions/problems. Part II: Understand imaging principles, modalities, and systems, including image acquisition, processing, and display. Understand the relationship between imaging techniques, image quality, patient dose and safety, and solve related written questions/problems. Part III: Gain crucial, practical, clinical medical physics experience. Effectively communicate and explain the practice, performance, and significance of all aspects of clinical medical physics. All three parts of the ABR exam require specific skill sets and preparation: mastery of basic physics and imaging principles; written problem solving often involving rapid calculation; responding clearly and succinctly to oral questions about the practice, methods, and significance of clinical medical physics. This symposium focuses on the preparation and skill sets necessary for each part of the ABR exam. Although there is some overlap, the nuclear exam covers a different body of knowledge than the diagnostic exam. A separate speaker will address those aspects that are unique to the nuclear exam. Medical physicists who have recently completed each part of the ABR exam will share their experiences, insights, and preparation methods to help attendees best prepare for the challenges of each part of the ABR exam. In accordance with ABR exam security policy, no recalls or exam questions will be discussed.
Learning Objectives: How to prepare for Part 1 of the ABR exam by determining the scope of basic medical physics study material and related problem solving/calculations. How to prepare for Part 2 of the ABR exam by understanding diagnostic and/or nuclear imaging physics, systems, dosimetry, safety and related problem solving/calculations. How to prepare for Part 3 of the ABR exam by effectively communicating the practice, methods, and significance of clinical diagnostic and/or nuclear medical physics.
WE-D-213-00: Preparing for the ABR Diagnostic and Nuclear Medicine Physics Exams
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Adequate, efficient preparation for the ABR Diagnostic and Nuclear Medical Physics exams is key to successfully obtaining ABR professional certification. Each part of the ABR exam presents its own challenges: Part I: Determine the scope of basic medical physics study material, efficiently review this material, and solve related written questions/problems. Part II: Understand imaging principles, modalities, and systems, including image acquisition, processing, and display. Understand the relationship between imaging techniques, image quality, patient dose and safety, and solve related written questions/problems. Part III: Gain crucial, practical, clinical medical physics experience. Effectively communicate and explain the practice, performance, and significance of all aspects of clinical medical physics. All three parts of the ABR exam require specific skill sets and preparation: mastery of basic physics and imaging principles; written problem solving often involving rapid calculation; responding clearly and succinctly to oral questions about the practice, methods, and significance of clinical medical physics. This symposium focuses on the preparation and skill sets necessary for each part of the ABR exam. Although there is some overlap, the nuclear exam covers a different body of knowledge than the diagnostic exam. A separate speaker will address those aspects that are unique to the nuclear exam. Medical physicists who have recently completed each part of the ABR exam will share their experiences, insights, and preparation methods to help attendees best prepare for the challenges of each part of the ABR exam. In accordance with ABR exam security policy, no recalls or exam questions will be discussed.
Learning Objectives: How to prepare for Part 1 of the ABR exam by determining the scope of basic medical physics study material and related problem solving/calculations. How to prepare for Part 2 of the ABR exam by understanding diagnostic and/or nuclear imaging physics, systems, dosimetry, safety and related problem solving/calculations. How to prepare for Part 3 of the ABR exam by effectively communicating the practice, methods, and significance of clinical diagnostic and/or nuclear medical physics.
WE-D-213-01: Preparing for Part 1 of the ABR Diagnostic Physics Exam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simiele, S.
Adequate, efficient preparation for the ABR Diagnostic and Nuclear Medical Physics exams is key to successfully obtaining ABR professional certification. Each part of the ABR exam presents its own challenges: Part I: Determine the scope of basic medical physics study material, efficiently review this material, and solve related written questions/problems. Part II: Understand imaging principles, modalities, and systems, including image acquisition, processing, and display. Understand the relationship between imaging techniques, image quality, patient dose and safety, and solve related written questions/problems. Part III: Gain crucial, practical, clinical medical physics experience. Effectively communicate and explain the practice, performance, and significance of all aspects of clinical medical physics. All three parts of the ABR exam require specific skill sets and preparation: mastery of basic physics and imaging principles; written problem solving often involving rapid calculation; responding clearly and succinctly to oral questions about the practice, methods, and significance of clinical medical physics. This symposium focuses on the preparation and skill sets necessary for each part of the ABR exam. Although there is some overlap, the nuclear exam covers a different body of knowledge than the diagnostic exam. A separate speaker will address those aspects that are unique to the nuclear exam. Medical physicists who have recently completed each part of the ABR exam will share their experiences, insights, and preparation methods to help attendees best prepare for the challenges of each part of the ABR exam. In accordance with ABR exam security policy, no recalls or exam questions will be discussed.
Learning Objectives: How to prepare for Part 1 of the ABR exam by determining the scope of basic medical physics study material and related problem solving/calculations. How to prepare for Part 2 of the ABR exam by understanding diagnostic and/or nuclear imaging physics, systems, dosimetry, safety and related problem solving/calculations. How to prepare for Part 3 of the ABR exam by effectively communicating the practice, methods, and significance of clinical diagnostic and/or nuclear medical physics.
WE-D-213-03: Preparing for Part 3 of the ABR Diagnostic Physics Exam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bevins, N.
Adequate, efficient preparation for the ABR Diagnostic and Nuclear Medical Physics exams is key to successfully obtaining ABR professional certification. Each part of the ABR exam presents its own challenges: Part I: Determine the scope of basic medical physics study material, efficiently review this material, and solve related written questions/problems. Part II: Understand imaging principles, modalities, and systems, including image acquisition, processing, and display. Understand the relationship between imaging techniques, image quality, patient dose and safety, and solve related written questions/problems. Part III: Gain crucial, practical, clinical medical physics experience. Effectively communicate and explain the practice, performance, and significance of all aspects of clinical medical physics. All three parts of the ABR exam require specific skill sets and preparation: mastery of basic physics and imaging principles; written problem solving often involving rapid calculation; responding clearly and succinctly to oral questions about the practice, methods, and significance of clinical medical physics. This symposium focuses on the preparation and skill sets necessary for each part of the ABR exam. Although there is some overlap, the nuclear exam covers a different body of knowledge than the diagnostic exam. A separate speaker will address those aspects that are unique to the nuclear exam. Medical physicists who have recently completed each part of the ABR exam will share their experiences, insights, and preparation methods to help attendees best prepare for the challenges of each part of the ABR exam. In accordance with ABR exam security policy, no recalls or exam questions will be discussed.
Learning Objectives: How to prepare for Part 1 of the ABR exam by determining the scope of basic medical physics study material and related problem solving/calculations. How to prepare for Part 2 of the ABR exam by understanding diagnostic and/or nuclear imaging physics, systems, dosimetry, safety and related problem solving/calculations. How to prepare for Part 3 of the ABR exam by effectively communicating the practice, methods, and significance of clinical diagnostic and/or nuclear medical physics.
WE-D-213-02: Preparing for Part 2 of the ABR Diagnostic Physics Exam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zambelli, J.
Adequate, efficient preparation for the ABR Diagnostic and Nuclear Medical Physics exams is key to successfully obtain ABR professional certification. Each part of the ABR exam presents its own challenges: Part I: Determine the scope of basic medical physics study material, efficiently review this material, and solve related written questions/problems. Part II: Understand imaging principles, modalities, and systems, including image acquisition, processing, and display. Understand the relationship between imaging techniques, image quality, patient dose and safety, and solve related written questions/problems. Part III: Gain crucial, practical, clinical medical physics experience. Effectively communicate and explain the practice, performance, and significance ofmore » all aspects of clinical medical physics. All three parts of the ABR exam require specific skill sets and preparation: mastery of basic physics and imaging principles; written problem solving often involving rapid calculation; responding clearly and succinctly to oral questions about the practice, methods, and significance of clinical medical physics. This symposium focuses on the preparation and skill sets necessary for each part of the ABR exam. Although there is some overlap, the nuclear exam covers a different body of knowledge than the diagnostic exam. A separate speaker will address those aspects that are unique to the nuclear exam. Medical physicists who have recently completed each of part of the ABR exam will share their experiences, insights, and preparation methods to help attendees best prepare for the challenges of each part of the ABR exam. In accordance with ABR exam security policy, no recalls or exam questions will be discussed. 
Learning Objectives: How to prepare for Part 1 of the ABR exam by determining the scope of basic medical physics study material and related problem solving/calculations How to Prepare for Part 2 of the ABR exam by understanding diagnostic and/or nuclear imaging physics, systems, dosimetry, safety and related problem solving/calculations How to Prepare for Part 3 of the ABR exam by effectively communicating the practice, methods, and significance of clinical diagnostic and/or nuclear medical physics.
Discrete Ordinate Quadrature Selection for Reactor-based Eigenvalue Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarrell, Joshua J; Evans, Thomas M; Davidson, Gregory G
2013-01-01
In this paper we analyze the effect of various quadrature sets on the eigenvalues of several reactor-based problems, including a two-dimensional (2D) fuel pin, a 2D lattice of fuel pins, and a three-dimensional (3D) reactor core problem. While many quadrature sets have been applied to neutral particle discrete ordinate transport calculations, the Level Symmetric (LS) and the Gauss-Chebyshev product (GC) sets are the most widely used in production-level reactor simulations. Other quadrature sets, such as Quadruple Range (QR) sets, have been shown to be more accurate in shielding applications. In this paper, we compare the LS, GC, QR, and the recently developed linear-discontinuous finite element (LDFE) sets, as well as give a brief overview of other proposed quadrature sets. We show that, for a given number of angles, the QR sets are more accurate than the LS and GC in all types of reactor problems analyzed (2D and 3D). We also show that the LDFE sets are more accurate than the LS and GC sets for these problems. We conclude that, for problems where tens to hundreds of quadrature points (directions) per octant are appropriate, QR sets should regularly be used because they have similar integration properties as the LS and GC sets, have no noticeable impact on the speed of convergence of the solution when compared with other quadrature sets, and yield more accurate results. We note that, for very high-order scattering problems, the QR sets exactly integrate fewer angular flux moments over the unit sphere than the GC sets. The effects of those inexact integrations have yet to be analyzed. We also note that the LDFE sets only exactly integrate the zeroth and first angular flux moments. Pin power comparisons and analyses are not included in this paper and are left for future work.
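The Gauss-Chebyshev product construction mentioned in the abstract is straightforward to sketch: Gauss-Legendre nodes and weights in the polar cosine, combined with equally spaced, equally weighted (Chebyshev) azimuthal angles. The sketch below is illustrative only (function names `gauss_legendre` and `gc_product_set` are not from the paper); a basic sanity check is that the weights of any such angular set sum to the full solid angle 4π.

```python
import math

def gauss_legendre(n):
    """Nodes and weights for n-point Gauss-Legendre quadrature on [-1, 1],
    found by Newton iteration on the Legendre three-term recurrence."""
    nodes, weights = [], []
    for i in range(1, n + 1):
        x = math.cos(math.pi * (i - 0.25) / (n + 0.5))  # standard initial guess
        for _ in range(100):
            p0, p1 = 1.0, x
            for k in range(2, n + 1):
                p0, p1 = p1, ((2 * k - 1) * x * p1 - (k - 1) * p0) / k
            dp = n * (x * p1 - p0) / (x * x - 1)  # P_n'(x)
            dx = p1 / dp
            x -= dx
            if abs(dx) < 1e-14:
                break
        nodes.append(x)
        weights.append(2.0 / ((1.0 - x * x) * dp * dp))
    return nodes, weights

def gc_product_set(n_polar, n_azi):
    """Gauss-Chebyshev product quadrature over the unit sphere:
    Gauss-Legendre in the polar cosine mu, uniform points in azimuth."""
    mus, w_mu = gauss_legendre(n_polar)
    dirs, wts = [], []
    for mu, wm in zip(mus, w_mu):
        s = math.sqrt(1.0 - mu * mu)
        for j in range(n_azi):
            phi = 2.0 * math.pi * (j + 0.5) / n_azi
            dirs.append((s * math.cos(phi), s * math.sin(phi), mu))
            wts.append(wm * 2.0 * math.pi / n_azi)
    return dirs, wts
```

Since the polar weights sum to 2 and the azimuthal weights to 2π, the full set integrates a constant exactly, and low-order flux moments such as the integral of mu² (which equals 4π/3) are also reproduced exactly.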
Discrete ordinate quadrature selection for reactor-based Eigenvalue problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarrell, J. J.; Evans, T. M.; Davidson, G. G.
2013-07-01
In this paper we analyze the effect of various quadrature sets on the eigenvalues of several reactor-based problems, including a two-dimensional (2D) fuel pin, a 2D lattice of fuel pins, and a three-dimensional (3D) reactor core problem. While many quadrature sets have been applied to neutral particle discrete ordinate transport calculations, the Level Symmetric (LS) and the Gauss-Chebyshev product (GC) sets are the most widely used in production-level reactor simulations. Other quadrature sets, such as Quadruple Range (QR) sets, have been shown to be more accurate in shielding applications. In this paper, we compare the LS, GC, QR, and the recently developed linear-discontinuous finite element (LDFE) sets, as well as give a brief overview of other proposed quadrature sets. We show that, for a given number of angles, the QR sets are more accurate than the LS and GC in all types of reactor problems analyzed (2D and 3D). We also show that the LDFE sets are more accurate than the LS and GC sets for these problems. We conclude that, for problems where tens to hundreds of quadrature points (directions) per octant are appropriate, QR sets should regularly be used because they have similar integration properties as the LS and GC sets, have no noticeable impact on the speed of convergence of the solution when compared with other quadrature sets, and yield more accurate results. We note that, for very high-order scattering problems, the QR sets exactly integrate fewer angular flux moments over the unit sphere than the GC sets. The effects of those inexact integrations have yet to be analyzed. We also note that the LDFE sets only exactly integrate the zeroth and first angular flux moments. Pin power comparisons and analyses are not included in this paper and are left for future work. (authors)
Two Quantum Protocols for Oblivious Set-member Decision Problem
NASA Astrophysics Data System (ADS)
Shi, Run-Hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2015-10-01
In this paper, we define a new secure multi-party computation problem, called the Oblivious Set-member Decision problem, which allows one party to decide, in an oblivious manner, whether a secret held by another party belongs to his private set. The Oblivious Set-member Decision problem has many important applications in privacy-preserving multi-party collaborative computation, such as private set intersection and union, anonymous authentication, electronic voting, and electronic auctions. Furthermore, we present two quantum protocols to solve the Oblivious Set-member Decision problem. Protocol I takes advantage of powerful quantum oracle operations and therefore has lower communication and computation complexity; Protocol II uses photons as quantum resources and performs only simple single-particle projective measurements, making it more feasible with present technology.
Two Quantum Protocols for Oblivious Set-member Decision Problem
Shi, Run-hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2015-01-01
In this paper, we define a new secure multi-party computation problem, called the Oblivious Set-member Decision problem, which allows one party to decide, in an oblivious manner, whether a secret held by another party belongs to his private set. The Oblivious Set-member Decision problem has many important applications in privacy-preserving multi-party collaborative computation, such as private set intersection and union, anonymous authentication, electronic voting, and electronic auctions. Furthermore, we present two quantum protocols to solve the Oblivious Set-member Decision problem. Protocol I takes advantage of powerful quantum oracle operations and therefore has lower communication and computation complexity; Protocol II uses photons as quantum resources and performs only simple single-particle projective measurements, making it more feasible with present technology. PMID:26514668
Two Quantum Protocols for Oblivious Set-member Decision Problem.
Shi, Run-Hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2015-10-30
In this paper, we define a new secure multi-party computation problem, called the Oblivious Set-member Decision problem, which allows one party to decide, in an oblivious manner, whether a secret held by another party belongs to his private set. The Oblivious Set-member Decision problem has many important applications in privacy-preserving multi-party collaborative computation, such as private set intersection and union, anonymous authentication, electronic voting, and electronic auctions. Furthermore, we present two quantum protocols to solve the Oblivious Set-member Decision problem. Protocol I takes advantage of powerful quantum oracle operations and therefore has lower communication and computation complexity; Protocol II uses photons as quantum resources and performs only simple single-particle projective measurements, making it more feasible with present technology.
Generalized minimum dominating set and application in automatic text summarization
NASA Astrophysics Data System (ADS)
Xu, Yi-Zhi; Zhou, Hai-Jun
2016-03-01
For a graph formed by vertices and weighted edges, a generalized minimum dominating set (MDS) is a vertex set of smallest cardinality such that the summed weight of edges from each outside vertex to vertices in this set equals or exceeds a certain threshold value. This generalized MDS problem reduces to the conventional MDS problem in the limiting case of all edge weights being equal to the threshold value. In the present paper we treat the generalized MDS problem with a replica-symmetric spin-glass theory and derive a set of belief-propagation equations. As a practical application we consider the problem of extracting a set of sentences that best summarizes a given input text document. We carry out a preliminary test of this statistical-physics-inspired method on the automatic text summarization problem.
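The paper's own solver is the belief-propagation scheme derived from the replica-symmetric theory. As a minimal illustration of the problem definition itself, here is a greedy baseline sketch; the function name and the greedy selection rule are illustrative assumptions, not the paper's method:

```python
def greedy_generalized_mds(n, edges, threshold):
    """Greedy baseline for the generalized minimum dominating set: grow the
    set until every vertex outside it has summed edge weight into the set
    of at least `threshold`.

    n         -- number of vertices, labeled 0..n-1
    edges     -- dict mapping (i, j) to a positive edge weight
    threshold -- required covered weight for each outside vertex
    """
    adj = {i: {} for i in range(n)}
    for (i, j), w in edges.items():
        adj[i][j] = w
        adj[j][i] = w

    dom = set()
    covered = {i: 0.0 for i in range(n)}  # weight from vertex i into dom

    def deficit(i):
        return max(0.0, threshold - covered[i])

    while any(v not in dom and deficit(v) > 0 for v in range(n)):
        # Gain of adding v: its own deficit disappears, and it contributes
        # weight toward the deficits of its still-uncovered neighbors.
        def gain(v):
            return deficit(v) + sum(min(w, deficit(u))
                                    for u, w in adj[v].items() if u not in dom)
        best = max((v for v in range(n) if v not in dom), key=gain)
        dom.add(best)
        for u, w in adj[best].items():
            covered[u] += w
    return dom
```

On a star graph with unit weights and threshold 1 (the conventional MDS limit), the greedy rule correctly returns just the hub.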
Individualized Math Problems in Ratio and Proportion. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. This volume contains problems involving ratio and proportion. Some…
Individualized Math Problems in Graphs and Tables. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. Problems involving the construction and interpretation of graphs and…
Individualized Math Problems in Simple Equations. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. Problems in this volume require solution of linear equations, systems…
Individualized Math Problems in Trigonometry. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. Problems in this volume require the use of trigonometric and inverse…
Individualized Math Problems in Decimals. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. Problems in this volume concern use of decimals and are related to the…
Individualized Math Problems in Volume. Oregon Vo-Tech Mathematics Problem Sets.
ERIC Educational Resources Information Center
Cosler, Norma, Ed.
This is one of eighteen sets of individualized mathematics problems developed by the Oregon Vo-Tech Math Project. Each of these problem packages is organized around a mathematical topic and contains problems related to diverse vocations. Solutions are provided for all problems. Problems in this booklet require the computation of volumes of solids,…
The Cognitive Consequences of Patterns of Information Flow
NASA Technical Reports Server (NTRS)
Hutchins, Edwin
1999-01-01
The flight deck of a modern commercial airliner is a complex system consisting of two or more crew and a suite of technological devices. When everything goes right, all modern flight decks are easy to use. When things go sour, however, automated flight decks provide opportunities for new kinds of problems. A recent article in Aviation Week cited industry concern over the problem of verifying the safety of complex systems on automated, digital aircraft, stating that the industry must "guard against the kind of incident in which people and the automation seem to mismanage a minor occurrence or non-routine situation into larger trouble." The design of automated flight deck systems that flight crews find easy to use safely is a challenge in part because this design activity requires a theoretical perspective which can simultaneously cover the interactions of people with each other and with technology. In this paper, I will introduce some concepts that can be used to understand the flight deck as a system that is composed of two or more pilots and a complex suite of automated devices. As I will try to show, without a theory, we can repeat what seems to work, but we may not know why it worked or how to make it work in novel circumstances. Theory allows us to rise above the particulars of specific situations and makes the roots of success in one setting applicable to other settings.
Hemodynamic Performance of a Novel Right Ventricular Assist Device (PERKAT).
Kretzschmar, Daniel; Schulze, P Christian; Ferrari, Markus W
Acute right ventricular failure (RVF) is an increasing clinical problem and a life-threatening condition. Right ventricular assist devices represent a reasonable treatment option for patients with refractory RVF. We here present a novel percutaneously implantable device for right ventricular support. The PERKAT device is based on a nitinol stent cage, which is covered with valve-carrying foils. A flexible outlet trunk with a pigtail tip is connected to the distal part. The device is driven by an intra-aortic balloon pump (IABP) drive unit, which inflates/deflates a standard IABP-balloon placed within the stent cage. In-vitro evaluation was done in a liquid bath containing water or blood analog. The PERKAT device was tested in different afterload settings using two different IABP-balloons and varying inflation/deflation rates. We detected flow rates ranging from 1.97 to 3.93 L/min depending on the afterload setting, inflation/deflation rate, balloon size, and the medium used. Flow rates between water and blood analog were nearly comparable, and in the higher inflation/deflation rate settings slightly higher with water. Based on this promising in vitro data, the innovative percutaneously implantable PERKAT device has a potential to become a therapeutic option for patients with RVF refractory to medical treatment.
Reconstruction of limited-angle dual-energy CT using mutual learning and cross-estimation (MLCE)
NASA Astrophysics Data System (ADS)
Zhang, Huayu; Xing, Yuxiang
2016-03-01
Dual-energy CT (DECT) imaging has gained a lot of attention because of its capability to discriminate materials. We propose a flexible DECT scan strategy that can be realized on a system with general X-ray sources and detectors. In order to lower dose and scanning time, our DECT acquires two projection data sets on two arcs of limited angular coverage (one for each energy). Meanwhile, a certain number of rays from the two data sets form conjugate sampling pairs. Our reconstruction method for such a DECT scan mainly tackles the consequent limited-angle problem. Using the idea of an artificial neural network, we exploit the connection between projections at the two energies by constructing a relationship between the linear attenuation coefficient at the high energy and that at the low one. We use this relationship to cross-estimate the missing projections and reconstruct attenuation images from an augmented data set that, for each energy, includes projections at views covered by itself (projections collected in scanning) and by the other energy (projections estimated). Validated by a numerical experiment on a dental phantom with rather complex structures, our DECT is effective in recovering small structures in severe limited-angle situations. This scanning strategy can greatly broaden DECT system design in practice.
NASA technology utilization program: The small business market
NASA Technical Reports Server (NTRS)
Vannoy, J. K.; Garcia-Otero, F.; Johnson, F. D.; Staskin, E.
1980-01-01
Technology transfer programs were studied to determine how they might be more useful to the small business community. The status, needs, and technology use patterns of small firms are reported. Small business problems and failures are considered. Innovation, capitalization, R and D, and market share problems are discussed. Pocket, captive, and new markets are summarized. Small manufacturers and technology acquisition are discussed, covering external and internal sources, and NASA technology. Small business and the technology utilization program are discussed, covering publications and industrial applications centers. Observations and recommendations include small business market development and contracting, and NASA management technology.
Enhanced hemispheric-scale snow mapping through the blending of optical and microwave satellite data
NASA Astrophysics Data System (ADS)
Armstrong, R. L.; Brodzik, M. J.; Savoie, M.; Knowles, K.
2003-04-01
Snow cover is an important variable for climate and hydrologic models due to its effects on energy and moisture budgets. Seasonal snow can cover more than 50% of the Northern Hemisphere land surface during the winter resulting in snow cover being the land surface characteristic responsible for the largest annual and interannual differences in albedo. Passive microwave satellite remote sensing can augment measurements based on visible satellite data alone because of the ability to acquire data through most clouds or during darkness as well as to provide a measure of snow depth or water equivalent. Global snow cover fluctuation can now be monitored over a 24 year period using passive microwave data (Scanning Multichannel Microwave Radiometer (SMMR) 1978-1987 and Special Sensor Microwave/Imager (SSM/I), 1987-present). Evaluation of snow extent derived from passive microwave algorithms is presented through comparison with the NOAA Northern Hemisphere weekly snow extent data. For the period 1978 to 2002, both passive microwave and visible data sets show a similar pattern of inter-annual variability, although the maximum snow extents derived from the microwave data are consistently less than those provided by the visible satellite data and the visible data typically show higher monthly variability. Decadal trends and their significance are compared for the two data types. During shallow snow conditions of the early winter season microwave data consistently indicate less snow-covered area than the visible data. This underestimate of snow extent results from the fact that shallow snow cover (less than about 5.0 cm) does not provide a scattering signal of sufficient strength to be detected by the algorithms. As the snow cover continues to build during the months of January through March, as well as throughout the melt season, agreement between the two data types continually improves. 
This occurs because as the snow becomes deeper and the layered structure more complex, the negative spectral gradient driving the passive microwave algorithm is enhanced. Because the current generation of microwave snow algorithms is unable to consistently detect shallow and intermittent snow, we combine visible satellite data with the microwave data in a single blended product to overcome this problem. For the period 1978 to 2002 we combine data from the NOAA weekly snow charts with passive microwave data from the SMMR and SSM/I brightness temperature record. For the current and future time period we blend MODIS and AMSR-E data sets, both of which have greatly enhanced spatial resolution compared to the earlier data sources. Because it is not possible to determine snow depth or snow water equivalent from visible data, the regions where only the NOAA or MODIS data indicate snow are defined as "shallow snow". However, because our current blended product is being developed in the 25 km EASE-Grid and the MODIS data being used are in the Climate Modelers Grid (CMG) at approximately 5 km (0.05 deg.) the blended product also includes percent snow cover over the larger grid cell. A prototype version of the blended MODIS/AMSR-E product will be available in near real-time from NSIDC during the 2002-2003 winter season.
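The per-cell blending rule described above (a microwave detection wins, since it carries depth/water-equivalent information; a visible-only detection is classed as shallow snow) can be sketched in a few lines. Function names and string labels are illustrative, not from the product's documentation:

```python
def blend_snow(visible_snow, microwave_snow):
    """Per-grid-cell blend of a visible (binary) snow flag and a passive
    microwave snow flag: microwave detection wins; visible-only detection is
    classed 'shallow snow' (roughly < 5 cm, too thin for a microwave
    scattering signal); otherwise the cell is snow-free."""
    if microwave_snow:
        return "snow"
    if visible_snow:
        return "shallow snow"
    return "no snow"

def percent_cover(fine_cells):
    """Percent snow cover of a coarse cell (e.g. 25 km EASE-Grid) from the
    binary flags of the finer cells (e.g. ~5 km MODIS CMG) inside it."""
    return 100.0 * sum(fine_cells) / len(fine_cells)
```

This is exactly the situation discussed in the abstract: early-season shallow snow seen by NOAA/MODIS but missed by SMMR/SSM/I or AMSR-E maps to the "shallow snow" class.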
Ultrafast adiabatic quantum algorithm for the NP-complete exact cover problem
Wang, Hefeng; Wu, Lian-Ao
2016-01-01
An adiabatic quantum algorithm may lose quantumness such as quantum coherence entirely in its long runtime, and consequently the expected quantum speedup of the algorithm does not show up. Here we present a general ultrafast adiabatic quantum algorithm. We show that by applying a sequence of fast random or regular signals during evolution, the runtime can be reduced substantially, whereas advantages of the adiabatic algorithm remain intact. We also propose a randomized Trotter formula and show that the driving Hamiltonian and the proposed sequence of fast signals can be implemented simultaneously. We illustrate the algorithm by solving the NP-complete 3-bit exact cover problem (EC3), where NP stands for nondeterministic polynomial time, and put forward an approach to implementing the problem with trapped ions. PMID:26923834
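For concreteness, the classical problem the algorithm targets is easy to state: an EC3 instance over n bits is satisfied when every clause (a triple of bit indices) contains exactly one 1. A brute-force classical solver, whose exponential cost is what motivates the quantum approach, might look like the following illustrative sketch:

```python
from itertools import product

def solve_ec3(n, clauses):
    """Brute-force solver for the 3-bit exact cover problem (EC3): find an
    assignment of n bits such that each clause, a triple (i, j, k) of bit
    indices, has exactly one bit set. Runs in O(2^n) time -- precisely the
    regime where an adiabatic quantum algorithm is hoped to help."""
    for bits in product((0, 1), repeat=n):
        if all(bits[i] + bits[j] + bits[k] == 1 for i, j, k in clauses):
            return bits
    return None
```

A small unsatisfiable example: with four bits and all four triples as clauses, each bit appears in three clauses, so the clause sums would force 3·(number of ones) = 4, which has no integer solution.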
ERIC Educational Resources Information Center
Higgins, Jon L., Ed.
This document provides abstracts of 20 research reports. Topics covered include: children's comprehension of simple story problems; field independence and group instruction; problem-solving competence and memory; spatial visualization and the use of manipulative materials; effects of games on mathematical skills; problem-solving ability and right…
Using Problem-Based Pre-Class Activities to Prepare Students for In-Class Learning
ERIC Educational Resources Information Center
Alayont, Feryal
2014-01-01
This article presents a problem-based approach that prepares students for future learning in the classroom. In this approach, students complete problem-based activities before coming to class to familiarize themselves with the topics to be covered. After the discussion on how the use of these activities relate to the learning and transfer…
Fractional Snow Cover Mapping by Artificial Neural Networks and Support Vector Machines
NASA Astrophysics Data System (ADS)
Çiftçi, B. B.; Kuter, S.; Akyürek, Z.; Weber, G.-W.
2017-11-01
Snow is an important land cover whose distribution over space and time plays a significant role in various environmental processes. Hence, snow cover mapping with high accuracy is necessary to have a real understanding for present and future climate, water cycle, and ecological changes. This study aims to investigate and compare the design and use of artificial neural networks (ANNs) and support vector machines (SVMs) algorithms for fractional snow cover (FSC) mapping from satellite data. ANN and SVM models with different model building settings are trained by using Moderate Resolution Imaging Spectroradiometer surface reflectance values of bands 1-7, normalized difference snow index and normalized difference vegetation index as predictor variables. Reference FSC maps are generated from higher spatial resolution Landsat ETM+ binary snow cover maps. Results on the independent test data set indicate that the developed ANN model with hyperbolic tangent transfer function in the output layer and the SVM model with radial basis function kernel produce high FSC mapping accuracies with the corresponding values of R = 0.93 and R = 0.92, respectively.
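The two index predictors named above are simple band ratios. Assuming the usual MODIS band conventions (band 4 green and band 6 shortwave infrared for NDSI; band 2 near infrared and band 1 red for NDVI), the nine-element feature vector can be assembled as below; function names are illustrative:

```python
def ndsi(green, swir):
    """Normalized difference snow index from green and shortwave-infrared
    reflectance (for MODIS, commonly bands 4 and 6)."""
    return (green - swir) / (green + swir)

def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and red
    reflectance (for MODIS, bands 2 and 1)."""
    return (nir - red) / (nir + red)

def predictors(refl):
    """Predictor vector as described in the study: surface reflectances of
    bands 1-7 plus NDSI and NDVI. refl is a dict keyed by band number 1..7."""
    return ([refl[b] for b in range(1, 8)]
            + [ndsi(refl[4], refl[6]), ndvi(refl[2], refl[1])])
```

Vectors like these, paired with reference FSC values from higher-resolution Landsat ETM+ maps, form the training samples for the ANN and SVM models.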
Cao, F; Ramaseshan, R; Corns, R; Harrop, S; Nuraney, N; Steiner, P; Aldridge, S; Liu, M; Carolan, H; Agranovich, A; Karva, A
2012-07-01
Craniospinal irradiation traditionally treats the central nervous system using two or three adjacent field sets. An intensity-modulated radiotherapy (IMRT) plan (Jagged-Junction IMRT) was developed which overcomes problems associated with field junctions and beam edge matching, and improves planning and treatment setup efficiency with a homogeneous target dose distribution. Jagged-Junction IMRT was retrospectively planned on three patients with a prescription of 36 Gy in 20 fractions and compared to conventional treatment plans. The planning target volume (PTV) included the whole brain and spinal canal to the S3 vertebral level. The plan employed three field sets, each with a unique isocentre. One field set with seven fields treated the cranium. Two field sets treated the spine, each set using three fields. Fields from adjacent sets were overlapped, and the optimization process smoothly integrated the dose inside the overlapped junction. For the Jagged-Junction IMRT plans vs the conventional technique, the average homogeneity index equaled 0.08±0.01 vs 0.12±0.02, and the conformity number equaled 0.79±0.01 vs 0.47±0.12. The 95% isodose surface covered (99.5±0.3)% of the PTV vs (98.1±2.0)%. Both the Jagged-Junction IMRT plans and the conventional plans had good sparing of the organs at risk. Jagged-Junction IMRT planning provided good dose homogeneity and conformity to the target while maintaining a low dose to the organs at risk. Jagged-Junction IMRT optimization smoothly distributed dose in the junction between field sets. Since there is no beam matching, this treatment technique is less likely to produce hot or cold spots at the junction, in contrast to conventional techniques. © 2012 American Association of Physicists in Medicine.
An Integrated Approach to Damage Accommodation in Flight Control
NASA Technical Reports Server (NTRS)
Boskovic, Jovan D.; Knoebel, Nathan; Mehra, Raman K.; Gregory, Irene
2008-01-01
In this paper we present an integrated approach to in-flight damage accommodation in flight control. The approach is based on Multiple Models, Switching and Tuning (MMST), and consists of three steps: In the first step the main objective is to acquire a realistic aircraft damage model. Modeling of in-flight damage is a highly complex problem since there is a large number of issues that need to be addressed. One of the most important is that there is strong coupling between structural dynamics, aerodynamics, and flight control. These effects cannot be studied separately due to this coupling. Once a realistic damage model is available, in the second step a large number of models corresponding to different damage cases are generated. One possibility is to generate many linear models and interpolate between them to cover a large portion of the flight envelope. Once these models have been generated, we will implement a recently developed Model Set Reduction (MSR) technique. The technique is based on parameterizing damage in terms of uncertain parameters, and uses concepts from robust control theory to arrive at a small number of "centered" models such that the controllers corresponding to these models assure desired stability and robustness properties over a subset in the parametric space. By devising a suitable model placement strategy, the entire parametric set is covered with a relatively small number of models and controllers. The third step consists of designing a Multiple Models, Switching and Tuning (MMST) strategy for estimating the current operating regime (damage case) of the aircraft, and switching to the corresponding controller to achieve effective damage accommodation and the desired performance. In the paper we present a comprehensive approach to damage accommodation using Model Set Design, MMST, and Variable Structure compensation for coupling nonlinearities.
The approach was evaluated on a model of F/A-18 aircraft dynamics under control effector damage, augmented by nonlinear cross-coupling terms and a structural dynamics model. The proposed approach achieved excellent performance under severe damage effects.
Diffusion archeology for diffusion progression history reconstruction.
Sefer, Emre; Kingsford, Carl
2016-11-01
Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring - perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial diffusion data. Here, we tackle the problem of reconstructing a diffusion history from one or more snapshots of the diffusion state. This ability can be invaluable to learn when certain computer nodes are infected or which people are the initial disease spreaders to control future diffusions. We formulate this problem over discrete-time SEIRS-type diffusion models in terms of maximum likelihood. We design methods that are based on submodularity and a novel prize-collecting dominating-set vertex cover (PCDSVC) relaxation that can identify likely diffusion steps with some provable performance guarantees. Our methods are the first to be able to reconstruct complete diffusion histories accurately in real and simulated situations. As a special case, they can also identify the initial spreaders better than the existing methods for that problem. Our results for both meme and contaminant diffusion show that the partial diffusion data problem can be overcome with proper modeling and methods, and that hidden temporal characteristics of diffusion can be predicted from limited data.
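The forward model behind this reconstruction problem is easy to sketch. The paper works with SEIRS-type models; the simpler SI special case below shows how a snapshot (the observed data) arises from hidden seeds and hidden time steps. Everything here (function names, the SI simplification) is illustrative, not the paper's code:

```python
import random

def si_step(adj, infected, p, rng):
    """One discrete-time step of a stochastic SI diffusion: each infected
    node independently infects each susceptible neighbor with probability p."""
    new = set(infected)
    for u in infected:
        for v in adj.get(u, ()):
            if v not in infected and rng.random() < p:
                new.add(v)
    return new

def snapshot_after(adj, seeds, p, t, seed=0):
    """Run t steps from the seed set and return the observed snapshot --
    the kind of partial data from which a diffusion history must be
    reconstructed. adj maps each node to a list of neighbors."""
    rng = random.Random(seed)
    state = set(seeds)
    for _ in range(t):
        state = si_step(adj, state, p, rng)
    return state
```

History reconstruction is the inverse task: given only the final `state`, infer the likely `seeds` and the per-step infection sets, which the paper attacks via maximum likelihood with the PCDSVC relaxation.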
Diffusion archeology for diffusion progression history reconstruction
Sefer, Emre; Kingsford, Carl
2015-01-01
Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring — perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial diffusion data. Here, we tackle the problem of reconstructing a diffusion history from one or more snapshots of the diffusion state. This ability can be invaluable to learn when certain computer nodes are infected or which people are the initial disease spreaders to control future diffusions. We formulate this problem over discrete-time SEIRS-type diffusion models in terms of maximum likelihood. We design methods that are based on submodularity and a novel prize-collecting dominating-set vertex cover (PCDSVC) relaxation that can identify likely diffusion steps with some provable performance guarantees. Our methods are the first to be able to reconstruct complete diffusion histories accurately in real and simulated situations. As a special case, they can also identify the initial spreaders better than the existing methods for that problem. Our results for both meme and contaminant diffusion show that the partial diffusion data problem can be overcome with proper modeling and methods, and that hidden temporal characteristics of diffusion can be predicted from limited data. PMID:27821901
NASA Technical Reports Server (NTRS)
Burley, Richard K.; Guirguis, Kamal S.
1991-01-01
Simple, cheap device locks valve stem so its setting cannot be changed by unauthorized people. Device covers valve stem; cover locked in place with standard padlock. Valve lock made of PVC pipe and packing band. Shears, drill or punch, and forming rod only tools needed.
Double coverings with h^{2,0} = 0 over compact Kähler manifolds
NASA Astrophysics Data System (ADS)
Lee, Nam-Hoon
2018-04-01
We give a formula for the Hodge numbers of double coverings with h^{2,0} = 0 over compact Kähler manifolds. As an application, we consider Calabi-Yau double coverings and calculate their Hodge numbers. In this way, we find several pairs (h^{1,1}, h^{1,2}) of Hodge numbers of Calabi-Yau threefolds that do not come from a toric setting.
Code of Federal Regulations, 2014 CFR
2014-10-01
... covered with protective tubing. A set of two pieces of poly braid rope covered with light duty garden hose... line (400-lb test) or polypropylene multistrand material, known as braided or tarred mainline, and must... m) lengths of poly braid rope (3/8-inch (9.52 mm) diameter suggested), each covered with an 8-inch...
Extraordinarily Adaptive Properties of the Genetically Encoded Amino Acids
Ilardo, Melissa; Meringer, Markus; Freeland, Stephen; Rasulev, Bakhtiyor; Cleaves II, H. James
2015-01-01
Using novel advances in computational chemistry, we demonstrate that the set of 20 genetically encoded amino acids, used nearly universally to construct all coded terrestrial proteins, has been highly influenced by natural selection. We defined an adaptive set of amino acids as one whose members thoroughly cover relevant physico-chemical properties, or “chemistry space.” Using this metric, we compared the encoded amino acid alphabet to random sets of amino acids. These random sets were drawn from a computationally generated compound library containing 1913 alternative amino acids that lie within the molecular weight range of the encoded amino acids. Sets that cover chemistry space better than the genetically encoded alphabet are extremely rare and energetically costly. Further analysis of more adaptive sets reveals common features and anomalies, and we explore their implications for synthetic biology. We present these computations as evidence that the set of 20 amino acids found within the standard genetic code is the result of considerable natural selection. The amino acids used for constructing coded proteins may represent a largely global optimum, such that any aqueous biochemistry would use a very similar set. PMID:25802223
Cross-cultural Education at the International Space University
NASA Astrophysics Data System (ADS)
Burke, J. D.; Peeters, W.; Hill, H.
2006-12-01
A typical nine-week summer session of the International Space University includes 100 graduate students and young professionals from as many as thirty countries. In addition to lectures and team projects covering a variety of space-related disciplines, the curriculum contains several modes of cross-cultural education. In role- playing workshops, students (asked to behave in accord with known cultural norms of various countries) engage in negotiations on problems such as the rescue and return of astronauts as mandated in international agreements and treaties. Culture shock that could derail such negotiations in actual practice is observed, and the participants come away with heightened sensitivity to cultural differences. This technique could be extended to other educational settings, such as the activities of the UN's Regional Centres in developing countries and the outreach efforts associated with the International Heliophysical Year.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rerknimitr, Rungsun, E-mail: Rungsun@pol.net; Naprasert, Pisit; Kongkam, Pradermchai
Background. Distal migration is one of the common complications after insertion of a covered metallic stent. Stent repositioning or removal is not always possible in every patient. Therefore, trimming using an argon plasma coagulator (APC) may be a good alternative method to solve this problem. Methods. Metallic stent trimming by APC was performed in 2 patients with biliary Wallstent migration and in another patient with esophageal Ultraflex stent migration. The power setting was 60-100 watts with an argon flow of 0.8 l/min. Observations. The procedure was successfully performed and all distal parts of the stents were removed. No significant collateral damage to the nearby mucosa was observed. Conclusions. In a patient with a distally migrated metallic stent, trimming of the stent is possible by means of an APC. This new method may be applicable to other sites of metallic stent migration.
Measuring (shared) decision-making--a review of psychometric instruments.
Simon, Daniela; Loh, Andreas; Härter, Martin
2007-01-01
In recent years shared decision-making (SDM) has gained importance as an appropriate approach to patient-physician communication and decision-making. However, there is a conceptual variety that implies problems of inconsistent measurement, of defining relationships of SDM and outcome measures, and of comparisons across different studies. This article presents the results of a literature search of psychometric instruments measuring aspects of decision-making. Altogether 18 scales were found. The majority covers the patients' perspective and relates to preferences for information and participation, decisional conflict, self-efficacy as well as to the evaluation of decision-making process and outcomes. The scales differ widely in their extent of validation. Although this review is not exhaustive, it presents a variety of available decision-making instruments. Yet, many of them still need to show their psychometric quality for other settings in further studies.
NASA Astrophysics Data System (ADS)
Vicente-Vicente, Jose Luis; García-Ruiz, Roberto; Calero, Julio; Aranda, Victor
2016-04-01
Spain has 2.5 million hectares of olive groves, 60% of which are situated in Andalusia (Southern Spain). The most common agricultural management consists of conventional or reduced tillage combined with herbicides to eliminate weeds. This may lead to ecological problems (e.g. erosion, losses of soil nutrients and organic carbon). The recommended management consists of a cover of spontaneous herbaceous plants in the inter-rows of olive orchards, usually mowed early in spring. In this study, we assessed the influence of i) two soil managements (non-covered and weed-covered) and ii) soil parent material (carbonated and siliceous) on soil organic carbon (SOC) fractions. In addition, we assessed the existence of a saturation limit for the different SOC fractions by including calcareous and siliceous soils under natural vegetation. Weed-covered soils accumulated more total SOC than soils under the non-covered management, independently of the parent material type, and the same was true for most of the SOC fractions. However, the relative proportion of the SOC fractions was not affected by the presence of weeds but by the parent material type: carbonated soils had more unprotected and physically protected SOC, whereas the siliceous soils were relatively enriched in the biochemically protected pool. In contrast, Table 1 shows that the chemically protected SOC pool was best fit by a saturation function, especially in the siliceous plots, while the other fractions were best fit by a linear function. These results therefore suggest that the chemically protected pools are the only protected fractions that can be saturated, taking the SOC in the natural vegetation soils as the SOC limit. Considering SOC levels in the weed-covered and non-covered managements of all protected fractions and their respective limits of total SOC, saturation deficits in the non-covered and weed-covered plots were 75% and 60% of total SOC, respectively.
Table 1. Significance of the linear and saturation models between total SOC and SOC of each isolated fraction, for the whole set of plots and for plots of similar mineralogy. The physically protected fraction comprises three sub-fractions: iPOM, and the chemically and biochemically protected pools within microaggregates. "-" stands for non-analysed fractions.

Fraction/Sub-fraction                            Whole set of plots     Siliceous              Carbonated
                                                 Linear   Saturation    Linear   Saturation    Linear   Saturation
Unprotected                                      0.87     0.76          -        -             -        -
Physically protected                             0.82     0.86          -        -             -        -
  iPOM                                           0.75     0.73          -        -             -        -
  Chemically protected within microaggregates    0.26     0.49          0.72     0.79          0.63     0.65
  Biochemically protected within microaggregates 0.75     0.66          0.87     0.82          0.73     0.66
Chemically protected                             0.41     0.62          0.69     0.79          0.78     0.71
Biochemically protected                          0.76     0.69          0.89     0.90          0.72     0.62

These results suggest that there is a high potential for SOC sequestration in Andalusian olive grove soils. Nevertheless, it is very important to analyse in detail the influence of soil mineralogical properties on SOC accumulation. The management clearly affects the total amount of SOC and its fractions, whereas the parent material type mainly affects the proportion of these.
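The linear versus saturation comparison in Table 1 amounts to fitting two candidate models of fraction SOC against total SOC and comparing goodness of fit. A minimal least-squares sketch of that idea, using synthetic numbers rather than the study's measurements (the grid-search fit and all parameter values are illustrative assumptions):

```python
def sse(model, xs, ys):
    """Sum of squared errors of a model over paired data."""
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys))

def fit_linear(xs, ys):
    """Least-squares fit of y = a*x (no intercept), closed form."""
    a = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    return lambda x: a * x

def fit_saturation(xs, ys, smax_grid, k_grid):
    """Coarse grid search for y = Smax*x/(k + x), minimizing SSE."""
    best = None
    for smax in smax_grid:
        for k in k_grid:
            model = lambda x, s=smax, kk=k: s * x / (kk + x)
            err = sse(model, xs, ys)
            if best is None or err < best[0]:
                best = (err, model)
    return best[1]
```

On data that actually level off, the saturation model yields a clearly lower residual error than the straight line, which is the pattern reported for the chemically protected pool.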
Development of a global land cover characteristics database and IGBP DISCover from 1 km AVHRR data
Loveland, Thomas R.; Reed, B.C.; Brown, Jesslyn F.; Ohlen, D.O.; Zhu, Z.; Yang, L.; Merchant, J.W.
2000-01-01
Researchers from the U.S. Geological Survey, University of Nebraska-Lincoln and the European Commission's Joint Research Centre, Ispra, Italy produced a 1 km resolution global land cover characteristics database for use in a wide range of continental- to global-scale environmental studies. This database provides a unique view of the broad patterns of the biogeographical and ecoclimatic diversity of the global land surface, and presents a detailed interpretation of the extent of human development. The project was carried out as an International Geosphere-Biosphere Programme, Data and Information Systems (IGBP-DIS) initiative. The IGBP DISCover global land cover product is an integral component of the global land cover database. DISCover includes 17 general land cover classes defined to meet the needs of IGBP core science projects. A formal accuracy assessment of the DISCover data layer will be completed in 1998. The 1 km global land cover database was developed through a continent-by-continent unsupervised classification of 1 km monthly Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) composites covering 1992-1993. Extensive post-classification stratification was necessary to resolve spectral/temporal confusion between disparate land cover types. The complete global database consists of 961 seasonal land cover regions that capture patterns of land cover, seasonality and relative primary productivity. The seasonal land cover regions were aggregated to produce seven separate land cover data sets used for global environmental modelling and assessment. The data sets include IGBP DISCover, U.S. Geological Survey Anderson System, Simple Biosphere Model, Simple Biosphere Model 2, Biosphere-Atmosphere Transfer Scheme, Olson Ecosystems and Running Global Remote Sensing Land Cover. The database also includes all digital sources that were used in the classification. 
The complete database can be sourced from the website: http://edcwww.cr.usgs.gov/landdaac/glcc/glcc.html.
Couples' Reports of Relationship Problems in a Naturalistic Therapy Setting
ERIC Educational Resources Information Center
Boisvert, Marie-Michele; Wright, John; Tremblay, Nadine; McDuff, Pierre
2011-01-01
Understanding couples' relationship problems is fundamental to couple therapy. Although research has documented common relationship problems, no study has used open-ended questions to explore problems in couples seeking therapy in naturalistic settings. The present study used a reliable coding system to explore the relationship problems reported…
A new NASA/MSFC mission analysis global cloud cover data base
NASA Technical Reports Server (NTRS)
Brown, S. C.; Jeffries, W. R., III
1985-01-01
A global cloud cover data set, derived from the USAF 3D NEPH Analysis, was developed for use in climate studies and for Earth viewing applications. This data set contains a single parameter - total sky cover - separated in time by 3 or 6 hr intervals and in space by approximately 50 n.mi. Cloud cover amount is recorded for each grid point (of a square grid) by a single alphanumeric character representing each 5 percent increment of sky cover. The data are arranged in both quarterly and monthly formats. The data base currently provides daily, 3-hr observed total sky cover for the Northern Hemisphere from 1972 through 1977 less 1976. For the Southern Hemisphere, there are data at 6-hr intervals for 1976 through 1978 and at 3-hr intervals for 1979 and 1980. More years of data are being added. To validate the data base, the percent frequency of ≤0.3 and ≥0.8 cloud cover was compared with ground observed cloud amounts at several locations with generally good agreement. Mean or other desired cloud amounts can be calculated for any time period and any size area from a single grid point to a hemisphere. The data base is especially useful in evaluating the consequence of cloud cover on Earth viewing space missions. The temporal and spatial frequency of the data allow simulations that closely approximate any projected viewing mission. No adjustments are required to account for cloud continuity.
Selected Bibliography of Materials; Algeria, Libya, Morocco, Tunisia. Volume 1, Number 2, 1967.
ERIC Educational Resources Information Center
Azzouz, Azzedine; And Others
A bibliography with abstracts of 106 items from books and articles covers materials on education in the Maghreb countries of Tunisia, Libya, Algeria, and Morocco. Special emphasis is given to the two problems besetting the area's educational system: illiteracy and multilingualism. The entries cover philosophy and theory of education,…
The Formation of Citizenship through Community Theatre. A Study in Aguascalientes, Mexico
ERIC Educational Resources Information Center
Moschou, Christiana; Anaya Rodriguez, Roberto
2016-01-01
Purpose: The aim of the research was to examine whether adolescents can develop abilities of democratic interaction through Community Theatre. Design/methodology: Firstly, two instruments were applied: a questionnaire covering the socio-moral problems of the students and a questionnaire covering the educational ideologies of the professors. Then, a…
Cover of coastal vegetation as an indicator of eutrophication along environmental gradients.
Wikström, Sofia A; Carstensen, Jacob; Blomqvist, Mats; Krause-Jensen, Dorte
2016-01-01
Coastal vegetation communities are important for primary production, biodiversity, coastal protection, carbon and nutrient cycling which, in combination with their sensitivity to eutrophication, render them potential indicators of environmental status for environmental policies like the EU Water and Marine Strategy Framework Directives. We evaluated one potential indicator for coastal vegetation, the cumulative cover at depths where the vegetation is light limited, by investigating its response to eutrophication along gradients in natural conditions. We used a large data set covering the Swedish coastline, spanning broad gradients in nutrient level, water clarity, seabed substrate, physical exposure and climate in addition to a salinity gradient from 0.5 to 30.5. Macroalgal cover increased significantly along gradients of declining nutrient concentration and increasing water clarity when we had accounted for diver effects, spatio-temporal sampling variability, salinity gradients, wave exposure and latitude. The developed empirical model explained 79% of the variation in algal cover across 130 areas. Based on this, we identified macroalgal cover as a promising indicator across the Baltic Sea, Kattegat and Skagerrak. A parallel analysis of soft-substrate macrophytes similarly identified significant increases in cover with decreasing concentrations of total nitrogen and increasing salinity, but the resulting empirical model explained only 52% of the variation in cover, probably due to the spatially more variable nature of soft-substrate vegetation. The identified general responses of vegetation cover to gradients of eutrophication across wide ranges in environmental settings may be useful for monitoring and management of marine vegetation in areas with strong environmental gradients.
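The "79%" and "52% of the variation" figures above are coefficients of determination (R^2) for the empirical models. As a minimal illustration of how that statistic is computed (synthetic numbers, not the study's data):

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot
```

A perfect model gives R^2 = 1, while a model that always predicts the mean gives R^2 = 0; values like 0.79 and 0.52 fall between these extremes.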
Management Techniques for Librarians.
ERIC Educational Resources Information Center
Evans, G. Edward
This textbook on library management techniques is concerned with basic management problems. Examples of problems in planning, organization, and coordination are drawn from situations in libraries or information centers. After an introduction to library management, the history of management is covered. Several styles of management and organization…
Learning Activity Package, Algebra.
ERIC Educational Resources Information Center
Evans, Diane
A set of ten teacher-prepared Learning Activity Packages (LAPs) in beginning algebra and nine in intermediate algebra, these units cover sets, properties of operations, number systems, open expressions, solution sets of equations and inequalities in one and two variables, exponents, factoring and polynomials, relations and functions, radicals,…
Delgiudice, Glenn D; Fieberg, John R; Sampson, Barry A
2013-01-01
Long-term studies allow capture of a wide breadth of environmental variability and a broader context within which to maximize our understanding of relationships to specific aspects of wildlife behavior. The goal of our study was to improve our understanding of the biological value of dense conifer cover to deer on winter range relative to snow depth and ambient temperature. We examined variation among deer in their use of dense conifer cover during a 12-year study period as potentially influenced by winter severity and cover availability. Female deer were fitted with a mixture of very high frequency (VHF, n = 267) and Global Positioning System (GPS, n = 24) collars for monitoring use of specific cover types at the population and individual levels, respectively. We developed habitat composites for four study sites. We fit multinomial response models to VHF (daytime) data to describe population-level use patterns as a function of snow depth, ambient temperature, and cover availability. To develop alternative hypotheses regarding expected spatio-temporal patterns in the use of dense conifer cover, we considered two sets of competing sub-hypotheses. The first set addressed whether or not dense conifer cover was limiting on the four study sites. The second set considered four alternative sub-hypotheses regarding the potential influence of snow depth and ambient temperature on space use patterns. Deer use of dense conifer cover increased the most with increasing snow depth and most abruptly on the two sites where it was most available, suggestive of an energy conservation strategy. Deer use of dense cover decreased the most with decreasing temperatures on the sites where it was most available. At all four sites deer made greater daytime use (55 to >80% probability of use) of open vegetation types at the lowest daily minimum temperatures indicating the importance of thermal benefits afforded from increased exposure to solar radiation. 
Date-time plots of GPS data (24 hr) allowed us to explore individual diurnal and seasonal patterns of habitat use relative to changes in snow depth. There was significant among-animal variability in their propensity to be found in three density classes of conifer cover and other open types, but little difference between diurnal and nocturnal patterns of habitat use. Consistent with our findings reported elsewhere that snow depth has a greater impact on deer survival than ambient temperature, herein our population-level results highlight the importance of dense conifer cover as snow shelter rather than thermal cover. Collectively, our findings suggest that maximizing availability of dense conifer cover in an energetically beneficial arrangement with quality feeding sites should be a prominent component of habitat management for deer.
DelGiudice, Glenn D.; Fieberg, John R.; Sampson, Barry A.
2013-01-01
Background Long-term studies allow capture of a wide breadth of environmental variability and a broader context within which to maximize our understanding of relationships to specific aspects of wildlife behavior. The goal of our study was to improve our understanding of the biological value of dense conifer cover to deer on winter range relative to snow depth and ambient temperature. Methodology/Principal Findings We examined variation among deer in their use of dense conifer cover during a 12-year study period as potentially influenced by winter severity and cover availability. Female deer were fitted with a mixture of very high frequency (VHF, n = 267) and Global Positioning System (GPS, n = 24) collars for monitoring use of specific cover types at the population and individual levels, respectively. We developed habitat composites for four study sites. We fit multinomial response models to VHF (daytime) data to describe population-level use patterns as a function of snow depth, ambient temperature, and cover availability. To develop alternative hypotheses regarding expected spatio-temporal patterns in the use of dense conifer cover, we considered two sets of competing sub-hypotheses. The first set addressed whether or not dense conifer cover was limiting on the four study sites. The second set considered four alternative sub-hypotheses regarding the potential influence of snow depth and ambient temperature on space use patterns. Deer use of dense conifer cover increased the most with increasing snow depth and most abruptly on the two sites where it was most available, suggestive of an energy conservation strategy. Deer use of dense cover decreased the most with decreasing temperatures on the sites where it was most available. 
At all four sites deer made greater daytime use (55 to >80% probability of use) of open vegetation types at the lowest daily minimum temperatures indicating the importance of thermal benefits afforded from increased exposure to solar radiation. Date-time plots of GPS data (24 hr) allowed us to explore individual diurnal and seasonal patterns of habitat use relative to changes in snow depth. There was significant among-animal variability in their propensity to be found in three density classes of conifer cover and other open types, but little difference between diurnal and nocturnal patterns of habitat use. Conclusions/Significance Consistent with our findings reported elsewhere that snow depth has a greater impact on deer survival than ambient temperature, herein our population-level results highlight the importance of dense conifer cover as snow shelter rather than thermal cover. Collectively, our findings suggest that maximizing availability of dense conifer cover in an energetically beneficial arrangement with quality feeding sites should be a prominent component of habitat management for deer. PMID:23785421
Effects of spatial resolution and landscape structure on land cover characterization
NASA Astrophysics Data System (ADS)
Yang, Wenli
This dissertation addressed problems in scaling, problems that are among the main challenges in remote sensing. The principal objective of the research was to investigate the effects of changing spatial scale on the representation of land cover. A second objective was to determine the relationship between such effects, characteristics of landscape structure and scaling procedures. Four research issues related to spatial scaling were examined. They included: (1) the upscaling of Normalized Difference Vegetation Index (NDVI); (2) the effects of spatial scale on indices of landscape structure; (3) the representation of land cover databases at different spatial scales; and (4) the relationships between landscape indices and land cover area estimations. The overall bias resulting from non-linearity of NDVI in relation to spatial resolution is generally insignificant as compared to other factors such as influences of aerosols and water vapor. The bias is, however, related to land surface characteristics. Significant errors may be introduced in heterogeneous areas where different land cover types exhibit strong spectral contrast. Spatially upscaled SPOT and TM NDVIs have information content comparable with the AVHRR-derived NDVI. Indices of landscape structure and spatial resolution are generally related, but the exact forms of the relationships are subject to changes in other factors including the basic patch unit constituting a landscape and the proportional area of foreground land cover under consideration. The extent of agreement between spatially aggregated coarse resolution land cover datasets and full resolution datasets changes with the properties of the original datasets, including the pixel size and class definition. There are close relationships between landscape structure and class areas estimated from spatially aggregated land cover databases. The relationships, however, do not permit extension from one area to another. 
Inversion calibration across different geographic/ecological areas is, therefore, not feasible. Different rules govern the land cover area changes across resolutions when different upscaling methods are used. Special attention should be given to comparison between land cover maps derived using different methods.
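The non-linearity of NDVI under spatial aggregation discussed above can be shown with a two-pixel numeric example: averaging per-pixel NDVI values is not the same as computing NDVI from averaged reflectances when cover types contrast spectrally. The reflectance values below are hypothetical, chosen only to illustrate the effect:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Two contrasting "pixels": dense vegetation vs. bare soil (hypothetical values)
pixels = [(0.50, 0.05), (0.25, 0.20)]  # (NIR, red) reflectance pairs

mean_of_ndvi = sum(ndvi(n, r) for n, r in pixels) / len(pixels)
ndvi_of_mean = ndvi(
    sum(n for n, _ in pixels) / len(pixels),
    sum(r for _, r in pixels) / len(pixels),
)
```

The two quantities differ (roughly 0.46 versus 0.50 here), which is the aggregation bias the dissertation finds to be small overall but non-negligible where land cover types show strong spectral contrast.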
Denneboom, Wilma; Dautzenberg, Maaike GH; Grol, Richard; De Smet, Peter AGM
2007-01-01
Background Older people are prone to problems related to use of medicines. As they tend to use many different medicines, monitoring pharmacotherapy for older people in primary care is important. Aim To determine which procedure for treatment reviews (case conferences versus written feedback) results in more medication changes, measured at different moments in time. To determine the costs and savings related to such an intervention. Design of study Randomised, controlled trial, randomisation at the level of the community pharmacy. Setting Primary care; treatment reviews were performed by 28 pharmacists and 77 GPs concerning 738 older people (≥75 years) on polypharmacy (more than five medicines). Method In one group, pharmacists and GPs performed case conferences on prescription-related problems; in the other group, pharmacists provided results of a treatment review to GPs as written feedback. Number of medication changes was counted following clinically-relevant recommendations. Costs and savings associated with the intervention at various times were calculated. Results In the case-conference group significantly more medication changes were initiated (42 versus 22, P = 0.02). This difference was also present 6 months after treatment reviews (36 versus 19, P = 0.02). Nine months after treatment reviews, the difference was no longer significant (33 versus 19, P = 0.07). Additional costs in the case-conference group seem to be covered by the slightly greater savings in this group. Conclusion Performing treatment reviews with case conferences leads to greater uptake of clinically-relevant recommendations. Extra costs seem to be covered by related savings. The effect of the intervention declines over time, so performing treatment reviews for older people should be integrated in the routine collaboration between GPs and pharmacists. PMID:17761060
Wieczorek, Michael; LaMotte, Andrew E.
2010-01-01
This tabular data set represents the estimated area of land use and land cover from the National Land Cover Dataset 2001 (LaMotte, 2008), compiled for every MRB_E2RF1 catchment of the Major River Basins (MRBs, Crawford and others, 2006). The source data set represents land use and land cover for the conterminous United States for 2001. The National Land Cover Data Set for 2001 was produced through a cooperative project conducted by the Multi-Resolution Land Characteristics (MRLC) Consortium. The MRLC Consortium is a partnership of Federal agencies (http://www.mrlc.gov), consisting of the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Environmental Protection Agency (USEPA), the U.S. Department of Agriculture (USDA), the U.S. Forest Service (USFS), the National Park Service (NPS), the U.S. Fish and Wildlife Service (USFWS), the Bureau of Land Management (BLM), and the USDA Natural Resources Conservation Service (NRCS). The MRB_E2RF1 catchments are based on a modified version of the U.S. Environmental Protection Agency's (USEPA) ERF1_2 and include enhancements to support national and regional-scale surface-water quality modeling (Nolan and others, 2002; Brakebill and others, 2011). Data were compiled for every MRB_E2RF1 catchment for the conterminous United States covering the South Atlantic-Gulf and Tennessee (MRB2), the Great Lakes, Ohio, Upper Mississippi, and Souris-Red-Rainy (MRB3), the Missouri (MRB4), the Lower Mississippi, Arkansas-White-Red, and Texas-Gulf (MRB5) and the Pacific Northwest (MRB7) river basins.
Spoorenberg, Sophie L W; Reijneveld, Sijmen A; Middel, Berrie; Uittenbroek, Ronald J; Kremer, Hubertus P H; Wynia, Klaske
2015-01-01
The aim of the present study was to develop a valid Geriatric ICF Core Set reflecting relevant health-related problems of community-living older adults without dementia. A Delphi study was performed in order to reach consensus (≥70% agreement) on second-level categories from the International Classification of Functioning, Disability and Health (ICF). The Delphi panel comprised 41 older adults, medical and non-medical experts. Content validity of the set was tested in a cross-sectional study including 267 older adults identified as frail or having complex care needs. Consensus was reached for 30 ICF categories in the Delphi study (fourteen Body functions, ten Activities and Participation and six Environmental Factors categories). Content validity of the set was high: the prevalence of all the problems was >10%, except for d530 Toileting. The most frequently reported problems were b710 Mobility of joint functions (70%), b152 Emotional functions (65%) and b455 Exercise tolerance functions (62%). No categories had missing values. The final Geriatric ICF Core Set is a comprehensive and valid set of 29 ICF categories, reflecting the most relevant health-related problems among community-living older adults without dementia. This Core Set may contribute to optimal care provision and support of the older population. Implications for Rehabilitation The Geriatric ICF Core Set may provide a practical tool for gaining an understanding of the relevant health-related problems of community-living older adults without dementia. The Geriatric ICF Core Set may be used in primary care practice as an assessment tool in order to tailor care and support to the needs of older adults. The Geriatric ICF Core Set may be suitable for use in multidisciplinary teams in integrated care settings, since it is based on a broad range of problems in functioning. 
Professionals should pay special attention to health problems related to mobility and emotional functioning since these are the most prevalent problems in community-living older adults.
Indoor air quality and health problems associated with damp floor coverings.
Tuomainen, Anneli; Seuri, Markku; Sieppi, Anne
2004-04-01
To study the relationship between a high incidence of bronchial asthma among employees working in an office building and an indoor air problem related to the degradation of polyvinyl chloride (PVC) floor coverings in the building. The indoor air measurements and results of renovations are also described. Employees' symptoms were surveyed by a questionnaire, and the incidence of asthma was calculated from the medical records for 1997-2000. The quality of indoor air was assessed by microbial sampling and by investigation of the building for possible moisture damage. Indoor air was sampled for volatile organic compounds (VOCs) through Tenax adsorbent tubes. In situ volatile emission measurements from the concrete floor were performed via the field and laboratory emission cell (FLEC) method. In an office with approximately 150 employees, eight new cases of asthma were found in 4 years. In addition, the workers complained of respiratory, conjunctival and nasal symptoms. Emissions indicating the degradation of plastic floor coverings (e.g. 2-ethyl-1-hexanol, 1-butanol) were found in the indoor air and floor material samples. The plastic floor coverings, adhesives and the levelling layers were carefully removed from 12 rooms. The VOCs had diffused into the underlying concrete slabs. The concrete was warmed to remove the diffused VOCs from these areas. After the repairs the concentrations of the VOCs indicating the degradation of PVC decreased, as did the prevalence of the employees' symptoms and several asthma patients' need for medication. The workers in the office building complained of several respiratory, conjunctival and dermal symptoms. The incidence of adult-onset asthma was approximately nine times higher than that among Finns employed in similar work. The most probable single cause of the indoor air problem was the degradation of the plastic floor coverings.
David G. Ray
2013-01-01
Restoring natural fire regimes and diverse ground cover to planted or old-field origin southern pine stands typically requires a substantial reduction in overstory density. While maintaining full canopy cover (CC) is consistent with maximizing fiber production, this approach does not allow sufficient light to reach the forest floor to accomplish a broader set of...
A comparison of the light-reduction capacity of commonly used incubator covers.
Lee, Yi-Hui; Malakooti, Nima; Lotas, Marilyn
2005-01-01
The use of incubator covers to enhance preterm infants' rest and recovery is common in the NICU. However, the kinds of covers used vary extensively among and within nurseries. Few data exist on the effectiveness of different types of covers in reducing light levels to the infant. This study compared several types of commonly used incubator covers as to efficacy of light reduction. A descriptive, comparative design was used in this study. Twenty-three incubator covers were tested, including professional, receiving blanket, hand-crocheted, three-layer quilt, and flannel. The percentage of light level reduction of different incubator covers under various ambient light level settings. The amount of light reduction provided by incubator covers varies depending on type of fabric as well as percentage of incubator surface shielded by the cover. Dark-colored covers provided greater light reduction than bright/light-colored covers when covers identical in fabric type were compared. The light-reduction efficiency of the covers varied depending on the level of ambient light. Covers provided less light reduction in higher ambient light levels.
Validation of tsunami inundation model TUNA-RP using OAR-PMEL-135 benchmark problem set
NASA Astrophysics Data System (ADS)
Koh, H. L.; Teh, S. Y.; Tan, W. K.; Kh'ng, X. Y.
2017-05-01
A standard set of benchmark problems, known as OAR-PMEL-135, is developed by the US National Tsunami Hazard Mitigation Program for tsunami inundation model validation. Any tsunami inundation model must be tested for its accuracy and capability using this standard set of benchmark problems before it can be gainfully used for inundation simulation. The authors have previously developed an in-house tsunami inundation model known as TUNA-RP. This inundation model solves the two-dimensional nonlinear shallow water equations coupled with a wet-dry moving boundary algorithm. This paper presents the validation of TUNA-RP against the solutions provided in the OAR-PMEL-135 benchmark problem set. This benchmark validation testing shows that TUNA-RP can indeed perform inundation simulation with accuracy consistent with that in the tested benchmark problem set.
Connes' embedding problem and Tsirelson's problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Junge, M.; Palazuelos, C.; Navascues, M.
2011-01-15
We show that Tsirelson's problem concerning the set of quantum correlations and Connes' embedding problem on finite approximations in von Neumann algebras (known to be equivalent to Kirchberg's QWEP conjecture) are essentially equivalent. Specifically, Tsirelson's problem asks whether the set of bipartite quantum correlations generated between tensor product separated systems is the same as the set of correlations between commuting C*-algebras. Connes' embedding problem asks whether any separable II₁ factor is a subfactor of the ultrapower of the hyperfinite II₁ factor. We show that an affirmative answer to Connes' question implies a positive answer to Tsirelson's. Conversely, a positive answer to a matrix-valued version of Tsirelson's problem implies a positive one to Connes' problem.
Serials Acquisition Problems in Developing Countries: The Zambian Experience.
ERIC Educational Resources Information Center
Lungu, Charles B. M.
1985-01-01
Analysis of serial acquisition problems in developing nations cites specific references from University of Zambia Library. Discussion covers underdeveloped economic circumstances of Third World nations, overdependence on serials of foreign origin, geographical locations of Third World countries, ill-defined acquisition policies, staffing for…
A Problem-Based Learning Design for Teaching Biochemistry.
ERIC Educational Resources Information Center
Dods, Richard F.
1996-01-01
Describes the design of a biochemistry course that uses problem-based learning. Provides opportunities for students to question, dispute, confirm, and disconfirm their understanding of basic concepts. Emphasizes self-correction through dialogue. Topics covered include amino acids, metabolic pathways and inherited disease, proteins, enzymes and…
SciCADE 95: International conference on scientific computation and differential equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-12-31
This report consists of abstracts from the conference. Topics include algorithms, computer codes, and numerical solutions for differential equations. Linear and nonlinear as well as boundary-value and initial-value problems are covered. Various applications of these problems are also included.
A bounding-based solution approach for the continuous arc covering problem
NASA Astrophysics Data System (ADS)
Wei, Ran; Murray, Alan T.; Batta, Rajan
2014-04-01
Road segments, telecommunication wiring, water and sewer pipelines, canals and the like are important features of the urban environment. They are often conceived of and represented as network-based arcs. As a result of the usefulness and significance of arc-based features, there is a need to site facilities along arcs to serve demand. Examples of such facilities include surveillance equipment, cellular towers, refueling centers and emergency response stations, with the intent of being economically efficient as well as providing good service along the arcs. While this amounts to a continuous location problem by nature, various discretizations are generally relied upon to solve such problems. The result is potential for representation errors that negatively impact analysis and decision making. This paper develops a solution approach for the continuous arc covering problem that theoretically eliminates representation errors. The developed approach is applied to optimally place acoustic sensors and cellular base stations along a road network. The results demonstrate the effectiveness of this approach for ameliorating any error and uncertainty in the modeling process.
SERM Forest Cover Data Layers of the SSA in Vector Format
NASA Technical Reports Server (NTRS)
Nickeson, Jaime; Gruszka, Fern; Hall, Forrest G. (Editor)
2000-01-01
This data set was prepared by the SERM-FBIU. The data include information on forest parameters and cover the area in and near the BOREAS SSA, excluding the PANP. The data were produced from aerial photography taken as recently as 1988.
Geometric Hitting Set for Segments of Few Orientations
Fekete, Sandor P.; Huang, Kan; Mitchell, Joseph S. B.; ...
2016-01-13
Here we study several natural instances of the geometric hitting set problem for input consisting of sets of line segments (and rays, lines) having a small number of distinct slopes. These problems model path monitoring (e.g., on road networks) using the fewest sensors (the "hitting points"). We give approximation algorithms for cases including (i) lines of 3 slopes in the plane, (ii) vertical lines and horizontal segments, (iii) pairs of horizontal/vertical segments. Lastly, we give hardness and hardness of approximation results for these problems. We prove that the hitting set problem for vertical lines and horizontal rays is polynomially solvable.
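The abstract above concerns specialized geometric variants; as general background, the classic greedy heuristic for hitting set can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: each "segment" is abstracted as the set of candidate points that would hit it, and the greedy rule repeatedly picks the point hitting the most uncovered segments (a well-known logarithmic-factor approximation for the general problem).

```python
def greedy_hitting_set(segments):
    """segments: iterable of sets of candidate points; returns a hitting set.

    Greedy rule: repeatedly choose the point contained in the largest
    number of not-yet-hit segments until every segment is hit.
    """
    remaining = [set(s) for s in segments]
    chosen = set()
    while remaining:
        # Count how many remaining segments each candidate point would hit.
        counts = {}
        for seg in remaining:
            for p in seg:
                counts[p] = counts.get(p, 0) + 1
        best = max(counts, key=counts.get)  # point hitting the most segments
        chosen.add(best)
        # Discard every segment that the chosen point hits.
        remaining = [seg for seg in remaining if best not in seg]
    return chosen

# Three abstract "segments", each given by the points that could hit it.
segs = [{1, 2}, {2, 3}, {3, 4}]
print(greedy_hitting_set(segs))  # a small hitting set, e.g. {2, 3}
```

For the structured inputs the paper studies (few distinct slopes), the authors obtain guarantees stronger than this generic greedy bound.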
Hansen, M.C.; Egorov, Alexey; Roy, David P.; Potapov, P.; Ju, J.; Turubanova, S.; Kommareddy, I.; Loveland, Thomas R.
2011-01-01
Vegetation Continuous Field (VCF) layers of 30 m percent tree cover, bare ground, other vegetation and probability of water were derived for the conterminous United States (CONUS) using Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data sets from the Web-Enabled Landsat Data (WELD) project. Turnkey approaches to land cover characterization were enabled due to the systematic WELD Landsat processing, including conversion of digital numbers to calibrated top of atmosphere reflectance and brightness temperature, cloud masking, reprojection into a continental map projection and temporal compositing. Annual, seasonal and monthly WELD composites for 2008 were used as spectral inputs to a bagged regression and classification tree procedure using a large training data set derived from very high spatial resolution imagery and available ancillary data. The results illustrate the ability to perform Landsat land cover characterizations at continental scales that are internally consistent while retaining local spatial and thematic detail.
Integrated resource inventory for southcentral Alaska (INTRISCA)
NASA Technical Reports Server (NTRS)
Burns, T.; Carson-Henry, C.; Morrissey, L. A.
1981-01-01
The Integrated Resource Inventory for Southcentral Alaska (INTRISCA) Project comprised an integrated set of activities related to the land use planning and resource management requirements of the participating agencies within the southcentral region of Alaska. One subproject involved generating a region-wide land cover inventory of use to all participating agencies. Toward this end, participants first obtained a broad overview of the entire region and identified reasonable expectations of a LANDSAT-based land cover inventory through evaluation of an earlier classification generated during the Alaska Water Level B Study. Classification of more recent LANDSAT data was then undertaken by INTRISCA participants. The latter classification produced a land cover data set that was more specifically related to individual agency needs, concurrently providing a comprehensive training experience for Alaska agency personnel. Other subprojects employed multi-level analysis techniques ranging from refinement of the region-wide classification and photointerpretation, to digital edge enhancement and integration of land cover data into a geographic information system (GIS).
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator)
1979-01-01
The spatial characteristics of the data were evaluated. A program was developed to reduce the spatial distortions resulting from variable viewing distance, and geometrically adjusted data sets were generated. The potential need for some level of radiometric adjustment was evidenced by an along track band of high reflectance across different cover types in the Varian imagery. A multiple regression analysis was employed to explore the viewing angle effect on measured reflectance. Areas in the data set which appeared to have no across track stratification of cover type were identified. A program was developed which computed the average reflectance by column for each channel, over all of the scan lines in the designated areas. A regression analysis was then run using the first, second, and third degree polynomials, for each channel. An atmospheric effect as a component of the viewing angle source of variance is discussed. Cover type maps were completed and training and test field selection was initiated.
Minimal measures for Euler-Lagrange flows on finite covering spaces
NASA Astrophysics Data System (ADS)
Wang, Fang; Xia, Zhihong
2016-12-01
In this paper we study the minimal measures for positive definite Lagrangian systems on compact manifolds. We are particularly interested in manifolds with more complicated fundamental groups. Mather’s theory classifies the minimal or action-minimizing measures according to the first (co-)homology group of a given manifold. We extend Mather’s notion of minimal measures to a larger class for compact manifolds with non-commutative fundamental groups, and use finite coverings to study the structure of these extended minimal measures. We also define action-minimizers and minimal measures in the homotopical sense. Our program is to study the structure of homotopical minimal measures by considering Mather’s minimal measures on finite covering spaces. Our goal is to show that, in general, manifolds with a non-commutative fundamental group have a richer set of minimal measures, hence a richer dynamical structure. As an example, we study the geodesic flow on surfaces of higher genus. Indeed, by going to the finite covering spaces, the set of minimal measures is much larger and more interesting.
Operational applications of satellite snowcover observations in Rio Grande drainage of Colorado
NASA Technical Reports Server (NTRS)
Washicheck, J. N.; Mikesell, T.
1975-01-01
Various mapping techniques were tried and evaluated. Many problems were encountered, such as distinguishing clouds from snow and detecting snow under trees. A partial solution to some of the problems involves ground reconnaissance and low air flights. Snow areas, cloud cover, and total areas were planimetered after transferring imagery by use of a zoom transfer scope. These determinations were then compared to areas determined by use of a density slicer. Considerable adjustment is required for these two values to compare. NOAA pictures were also utilized in the evaluation. Forest cover is one of the parameters used in the modeling process. The determination of this percentage is being explored.
[Continuity and discontinuity of the geomerida: the bionomic and biotic aspects].
Kafanov, A I
2005-01-01
The view of the spatial structure of the geomerida (Earth's life cover) as a continuum that prevails in modern phytocoenology is mostly determined by a physiognomic (landscape-bionomic) discrimination of vegetation components. In this connection, the geography of life forms appears as the subject of landscape-bionomic biogeography. In zoocoenology there is a tendency toward a synthesis of alternative concepts, based on the assumption that there is no absolute continuum and no absolute discontinuum in organic nature. The problem of continuum versus discontinuum in the living cover is a problem of scale, arising from the fractal nature of the spatial structure of the geomerida. The continuum mainly belongs to regularities of topological order; at the regional and subregional scale a continuum of biochores is rather rare. Objective evidence of the relative discontinuity of the living cover is provided by significant alterations of species diversity at the regional, subregional and even topological scale. In contrast to the units conventionally discriminated within physiognomically continuous vegetation, the same biotic complexes, represented as operational units of biogeographical and biocoenological zoning, are distinguished repeatedly and independently by different researchers. An area occupied by a certain flora (fauna, biota) can be considered an elementary unit of biotic diversity (an elementary biotic complex).
Linking Family Characteristics with Poor Peer Relations: The Mediating Role of Conduct Problems
Bierman, Karen Linn; Smoot, David L.
2012-01-01
Parent, teacher, and peer ratings were collected for 75 grade school boys to test the hypothesis that certain family interaction patterns would be associated with poor peer relations. Path analyses provided support for a mediational model, in which punitive and ineffective discipline was related to child conduct problems in home and school settings which, in turn, predicted poor peer relations. Further analyses suggested that distinct subgroups of boys could be identified who exhibited conduct problems at home only, at school only, in both settings, or in neither setting. Boys who exhibited cross-situational conduct problems were more likely to experience multiple concurrent problems (e.g., in both home and school settings) and were more likely than any other group to experience poor peer relations. However, only about one-third of the boys with poor peer relations in this sample exhibited problem profiles consistent with the proposed model (e.g., experienced high rates of punitive/ineffective home discipline and exhibited conduct problems in home and school settings), suggesting that the proposed model reflects one common (but not exclusive) pathway to poor peer relations. PMID:1865049
NASA Technical Reports Server (NTRS)
Norris, Joel R.
2005-01-01
This study investigated the spatial pattern of linear trends in surface-observed upper-level (combined mid-level and high-level) cloud cover, precipitation, and surface divergence over the tropical Indo-Pacific Ocean during 1952-1957. Cloud values were obtained from the Extended Edited Cloud Report Archive (EECRA), precipitation values were obtained from the Hulme/Climate Research Unit Data Set, and surface divergence was alternatively calculated from winds reported in the Comprehensive Ocean-Atmosphere Data Set and from the Smith and Reynolds extended reconstructed sea level pressure data.
Inducing mental set constrains procedural flexibility and conceptual understanding in mathematics.
DeCaro, Marci S
2016-10-01
An important goal in mathematics is to flexibly use and apply multiple, efficient procedures to solve problems and to understand why these procedures work. One factor that may limit individuals' ability to notice and flexibly apply strategies is the mental set induced by the problem context. Undergraduate (N = 41, Experiment 1) and fifth- and sixth-grade students (N = 87, Experiment 2) solved mathematical equivalence problems in one of two set-inducing conditions. Participants in the complex-first condition solved problems without a repeated addend on both sides of the equal sign (e.g., 7 + 5 + 9 = 3 + _), which required multistep strategies. Then these students solved problems with a repeated addend (e.g., 7 + 5 + 9 = 7 + _), for which a shortcut strategy could be readily used (i.e., adding 5 + 9). Participants in the shortcut-first condition solved the same problem set but began with the shortcut problems. Consistent with laboratory studies of mental set, participants in the complex-first condition were less likely to use the more efficient shortcut strategy when possible. In addition, these participants were less likely to demonstrate procedural flexibility and conceptual understanding on a subsequent assessment of mathematical equivalence knowledge. These findings suggest that certain problem-solving contexts can help or hinder both flexibility in strategy use and deeper conceptual thinking about the problems.
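The two strategies contrasted in the abstract can be made concrete with a short sketch. This is an illustrative reconstruction, not code from the study: `multistep` computes the full left-hand sum of an equivalence problem a + b + c = d + _, while `shortcut` applies only when the right-hand addend repeats a left-hand one and simply adds the two remaining addends.

```python
def multistep(a, b, c, d):
    # Multistep strategy: total the left-hand side, then subtract the
    # known right-hand addend to find the blank.
    return (a + b + c) - d

def shortcut(a, b, c, d):
    # Shortcut strategy: valid only when d repeats a left-hand addend;
    # cancel the repeated addend and add the two that remain.
    left = [a, b, c]
    if d in left:
        left.remove(d)
        return sum(left)
    return None  # shortcut does not apply to this problem

print(multistep(7, 5, 9, 3))  # 18: no repeated addend, multistep is needed
print(shortcut(7, 5, 9, 7))   # 14: the repeated 7 cancels, just 5 + 9
```

Both strategies agree when the shortcut applies; the study's point is that students primed on complex problems fail to notice when the cheaper strategy is available.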
NASA Astrophysics Data System (ADS)
Nguyen, Dong-Hai
This research project investigates the difficulties students encounter when solving physics problems involving the integral and the area under the curve concepts and the strategies to facilitate students learning to solve those types of problems. The research contexts of this project are calculus-based physics courses covering mechanics and electromagnetism. In phase I of the project, individual teaching/learning interviews were conducted with 20 students in mechanics and 15 students from the same cohort in electromagnetism. The students were asked to solve problems on several topics of mechanics and electromagnetism. These problems involved calculating physical quantities (e.g. velocity, acceleration, work, electric field, electric resistance, electric current) by integrating or finding the area under the curve of functions of related quantities (e.g. position, velocity, force, charge density, resistivity, current density). Verbal hints were provided when students made an error or were unable to proceed. A total number of 140 one-hour interviews were conducted in this phase, which provided insights into students' difficulties when solving the problems involving the integral and the area under the curve concepts and the hints to help students overcome those difficulties. In phase II of the project, tutorials were created to facilitate students' learning to solve physics problems involving the integral and the area under the curve concepts. Each tutorial consisted of a set of exercises and a protocol that incorporated the helpful hints to target the difficulties that students expressed in phase I of the project. Focus group learning interviews were conducted to test the effectiveness of the tutorials in comparison with standard learning materials (i.e. textbook problems and solutions). 
Overall results indicated that students learning with our tutorials outperformed students learning with standard materials in applying the integral and the area under the curve concepts to physics problems. The results of this project provide broader and deeper insights into students' problem solving with the integral and the area under the curve concepts and suggest strategies to facilitate students' learning to apply these concepts to physics problems. This study also has significant implications for further research, curriculum development and instruction.
NASA Astrophysics Data System (ADS)
Shaik, Vaseem A.; Ardekani, Arezoo M.
2017-11-01
In this work we derive the image flow fields for point force singularities placed outside a stationary drop covered with an insoluble, nondiffusing, and incompressible surfactant. We assume the interface to be Newtonian and use the Boussinesq-Scriven constitutive law for the interfacial stress tensor. We use this analytical solution to investigate two different problems. First, we derive the mobility matrix for two drops of arbitrary sizes covered with an incompressible surfactant. In the second example, we calculate the velocity of a swimming microorganism (modeled as a Stokes dipole) outside a drop covered with an incompressible surfactant.