Science.gov

Sample records for performance based logistics

  1. Country logistics performance and disaster impact.

    PubMed

    Vaillancourt, Alain; Haavisto, Ira

    2016-04-01

    The aim of this paper is to deepen the understanding of the relationship between country logistics performance and disaster impact. The relationship is analysed through correlation analysis and regression models for 117 countries for the years 2007 to 2012, with disaster impact variables from the International Disaster Database (EM-DAT) and logistics performance indicators from the World Bank. The results show a significant relationship between country logistics performance and disaster impact overall, and for five out of six specific logistics performance indicators. These specific indicators were further used to explore the relationship between country logistics performance and disaster impact for three specific disaster types (epidemic, flood and storm). The findings enhance the understanding of the role of logistics in a humanitarian context with empirical evidence of the importance of country logistics performance in disaster response operations.

  2. Imbalanced Learning Based on Logistic Discrimination

    PubMed Central

    Guo, Huaping; Zhi, Weimei; Liu, Hongbing; Xu, Mingliang

    2016-01-01

    In recent years, the imbalanced learning problem has attracted more and more attention from both academia and industry; the problem concerns the performance of learning algorithms in the presence of data with severe class distribution skews. In this paper, we apply the well-known statistical model of logistic discrimination to this problem and propose a novel method to improve its performance. To fully account for the class imbalance, we design a new cost function which takes into account the accuracies of both the positive and the negative class as well as the precision of the positive class. Unlike traditional logistic discrimination, the proposed method learns its parameters by maximizing the proposed cost function. Experimental results show that, compared with other state-of-the-art methods, the proposed one performs significantly better on measures of recall, g-mean, f-measure, AUC, and accuracy. PMID:26880877
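
A cost function of the kind the abstract describes, combining positive-class recall, negative-class accuracy, and positive-class precision, can be sketched as below. The equal weighting of the three terms is our own illustrative assumption; the paper's actual cost function and its maximization procedure are not reproduced here.

```python
import numpy as np

def imbalance_aware_cost(y_true, y_pred):
    """Score combining positive-class recall, negative-class accuracy,
    and positive-class precision (equal weights are an assumption)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    recall_pos = tp / (tp + fn) if tp + fn else 0.0
    acc_neg = tn / (tn + fp) if tn + fp else 0.0
    precision_pos = tp / (tp + fp) if tp + fp else 0.0
    return (recall_pos + acc_neg + precision_pos) / 3.0

# A classifier that ignores the minority class scores poorly on this cost
# even though its plain accuracy is high (9 of 10 correct).
y_true = [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
cost_majority = imbalance_aware_cost(y_true, [0] * 10)  # always predict 0
cost_balanced = imbalance_aware_cost(y_true, y_true)    # perfect prediction
```

This is why such a cost is preferable to plain accuracy under skew: the majority-class predictor gets 90% accuracy but only one of the three terms.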

  3. Transfer Learning Based on Logistic Regression

    NASA Astrophysics Data System (ADS)

    Paul, A.; Rottensteiner, F.; Heipke, C.

    2015-08-01

    In this paper we address the problem of classification of remote sensing images in the framework of transfer learning, with a focus on domain adaptation. The main novel contribution is a method for transductive transfer learning in remote sensing on the basis of logistic regression. Logistic regression is a discriminative probabilistic classifier of low computational complexity which can deal with multiclass problems. Transfer learning deals with problems in which labelled training data sets are assumed to be available only for a source domain, while classification is needed in a target domain with different, yet related, characteristics. Classification takes place with a model of weight coefficients for hyperplanes which separate features in the transformed feature space. In terms of logistic regression, our domain adaptation method adjusts the model parameters by iterative labelling of the target test data set. These labelled data features are iteratively added to the current training set, which at the beginning contains only source features, while a number of source features are simultaneously deleted from the current training set. Experimental results based on a test series with synthetic and real data constitute a first proof of concept of the proposed method.
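
The iterative add-target/drop-source loop described above can be sketched as follows, assuming a plain gradient-descent logistic regression and synthetic Gaussian source and target domains. The per-round counts (add 20 confident target samples, drop 10 source samples, 5 rounds) are illustrative choices, not the paper's settings.

```python
import numpy as np

def fit_logreg(X, y, lr=0.5, epochs=300):
    """Plain batch-gradient-descent logistic regression (bias folded in)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_proba(X, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

rng = np.random.default_rng(0)
# Source domain: two Gaussian classes.
Xs = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
ys = np.array([0.0] * 50 + [1.0] * 50)
# Target domain: same classes, shifted (related but different statistics).
Xt = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
yt = np.array([0] * 50 + [1] * 50)  # held out, used only for evaluation

X_train, y_train = Xs.copy(), ys.copy()
for _ in range(5):                          # iterative adaptation rounds
    w = fit_logreg(X_train, y_train)
    p = predict_proba(Xt, w)
    take = np.argsort(np.abs(p - 0.5))[-20:]   # most confident target samples
    X_train = np.vstack([X_train[10:], Xt[take]])  # drop 10 source samples
    y_train = np.concatenate([y_train[10:], (p[take] > 0.5).astype(float)])

w = fit_logreg(X_train, y_train)
target_acc = float(np.mean((predict_proba(Xt, w) > 0.5) == yt))
```

On this easy synthetic shift the adapted classifier reaches high target accuracy; with real remote-sensing data, confidence thresholds and stopping criteria would need care.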

  4. Logistical Consideration in Computer-Based Screening of Astronaut Applicants

    NASA Technical Reports Server (NTRS)

    Galarza, Laura

    2000-01-01

    This presentation reviews the logistical, ergonomic, and psychometric issues and data related to the development and operational use of a computer-based system for the psychological screening of astronaut applicants. The Behavioral Health and Performance Group (BHPG) at the Johnson Space Center upgraded its astronaut psychological screening and selection procedures for the 1999 astronaut applicants and subsequent astronaut selection cycles. The questionnaires, tests, and inventories were upgraded from a paper-and-pencil system to a computer-based system. Members of the BHPG and a computer programmer designed and developed the needed interfaces (screens, buttons, etc.) and programs for the astronaut psychological assessment system. This intranet-based system included user-friendly computer-based administration of tests, test scoring, generation of reports, the integration of test administration and test output into a single system, and a complete database for past, present, and future selection data. Upon completion of the system development phase, four beta and usability tests were conducted with the newly developed system. The first three tests included 1 to 3 participants each. The final system test was conducted with 23 participants tested simultaneously. Usability and ergonomic data were collected from the system (beta) test participants and from 1999 astronaut applicants who volunteered the information in exchange for anonymity. Beta and usability test data were analyzed to examine operational, ergonomic, programming, test administration, and scoring issues related to computer-based testing. Results showed a preference for computer-based testing over paper-and-pencil procedures. The data also reflected specific ergonomic, usability, psychometric, and logistical concerns that should be taken into account in future selection cycles. In conclusion, psychological, psychometric, human, and logistical factors must be examined and considered carefully when developing and

  5. Performance and strategy comparisons of human listeners and logistic regression in discriminating underwater targets.

    PubMed

    Yang, Lixue; Chen, Kean

    2015-11-01

    To improve the design of underwater target recognition systems based on auditory perception, this study compared human listeners with automatic classifiers, using performance measures and strategies from three discrimination experiments: between man-made and natural targets, between ships and submarines, and among three types of ships. In the experiments, the subjects were asked to assign a score to each sound based on how confident they were about the category to which it belonged, and logistic regression, representing linear discriminative models, completed three similar tasks by utilizing many auditory features. The results indicated that the performance of logistic regression improved as the ratio between inter- and intra-class differences became larger, whereas the performance of the human subjects was limited by their unfamiliarity with the targets. Logistic regression performed better than the human subjects in all tasks but the discrimination between man-made and natural targets, and the strategies employed by the best human subjects were similar to those of logistic regression. Logistic regression and several human subjects demonstrated similar performance when discriminating man-made and natural targets, but in this case their strategies were not similar. An appropriate fusion of their strategies led to further improvement in recognition accuracy.

  6. An image encryption scheme based on quantum logistic map

    NASA Astrophysics Data System (ADS)

    Akhshani, A.; Akhavan, A.; Lim, S.-C.; Hassan, Z.

    2012-12-01

    The topic of quantum chaos has drawn increasing attention in recent years, although a satisfactory definition that differentiates it from its classical counterpart is not yet settled. Like classical maps, dissipative quantum maps can be characterized by sensitive dependence on initial conditions. Exploiting this property, an implementation of an image encryption scheme based on the quantum logistic map is proposed. The security and performance of the proposed image encryption are analysed using well-known methods. The results of the reliability analysis are encouraging, and it can be concluded that the proposed scheme is efficient and secure. The results of this study also suggest the application of other quantum maps, such as the quantum standard map and the quantum baker map, in cryptography and other aspects of security and privacy.
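
The sensitive dependence on initial conditions that the scheme relies on is easy to demonstrate with the classical logistic map x → r·x·(1−x); r = 3.99 puts the map in its chaotic regime (the particular values below are illustrative).

```python
def orbits(x0a, x0b, r=3.99, n=50):
    """Iterate two nearby starting points and record the gap at each step."""
    xa, xb, gaps = x0a, x0b, []
    for _ in range(n):
        xa = r * xa * (1 - xa)
        xb = r * xb * (1 - xb)
        gaps.append(abs(xa - xb))
    return gaps

# Starting points differ only in the 9th decimal place, yet the
# trajectories separate to order one within a few dozen iterations.
gaps = orbits(0.400000000, 0.400000001)
```

For a cipher, this means a key (initial condition) wrong in the last digit produces an entirely different keystream.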

  7. Logistic Regression-HSMM-Based Heart Sound Segmentation.

    PubMed

    Springer, David B; Tarassenko, Lionel; Clifford, Gari D

    2016-04-01

    The identification of the exact positions of the first and second heart sounds within a phonocardiogram (PCG), or heart sound segmentation, is an essential step in the automatic analysis of heart sound recordings, allowing for the classification of pathological events. While threshold-based segmentation methods have shown modest success, probabilistic models, such as hidden Markov models, have recently been shown to surpass the capabilities of previous methods. Segmentation performance is further improved when a priori information about the expected duration of the states is incorporated into the model, such as in a hidden semi-Markov model (HSMM). This paper addresses the problem of the accurate segmentation of the first and second heart sound within noisy real-world PCG recordings using an HSMM, extended with the use of logistic regression for emission probability estimation. In addition, we implement a modified Viterbi algorithm for decoding the most likely sequence of states and evaluate this method on a large dataset of 10,172 s of PCG recorded from 112 patients (including 12,181 first and 11,627 second heart sounds). The proposed method achieved an average F1 score of 95.63 ± 0.85%, while the current state of the art achieved 86.28 ± 1.55% when evaluated on unseen test recordings. The greater discrimination between states afforded by using logistic regression, as opposed to the previous Gaussian distribution-based emission probability estimation, together with the use of an extended Viterbi algorithm, allows this method to significantly outperform the current state-of-the-art method (based on a two-sided paired t-test).
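
The core trick of using a discriminative posterior inside a generative HSMM can be sketched with Bayes' rule: scale the classifier's per-frame posterior P(state | features) by the state priors to get a quantity proportional to the emission term P(features | state). The posterior values and priors below are made up for illustration; the published model's coefficients are not reproduced here.

```python
import numpy as np

# Per-frame posterior P(state | features) from a discriminative classifier
# (logistic regression in the paper); rows = frames, cols = 2 states.
posterior = np.array([[0.9, 0.1],
                      [0.3, 0.7],
                      [0.5, 0.5]])
prior = np.array([0.8, 0.2])  # state occupancy in the training data

# Bayes' rule: P(features | state) is proportional to
# P(state | features) / P(state).
emission = posterior / prior
emission /= emission.sum(axis=1, keepdims=True)  # rescale per frame
```

Note how the rare state (prior 0.2) is boosted: a tied 0.5/0.5 posterior becomes a 0.2/0.8 emission score, so the Viterbi decoder is not biased toward the dominant state.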

  8. Research on the Environmental Performance Evaluation of Electronic Waste Reverse Logistics Enterprise

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Xiang; Chen, Fei-Yang; Tong, Tong

    According to the characteristics of e-waste reverse logistics, an environmental performance evaluation system for electronic waste reverse logistics enterprises is proposed. We use the fuzzy analytic hierarchy process method to evaluate the system. In addition, this paper analyzes enterprise X as an example to illustrate the evaluation method. The paper points out the attributes and indexes that should be strengthened during the process of e-waste reverse logistics and provides guidance to domestic e-waste reverse logistics enterprises.

  9. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Fu, Pei-hua; Yin, Hong-bo

    In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First of all, we present the evaluation index system, which contains basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We determined the index weights according to the grades and evaluated the integrated ability of the logistics enterprises using the fuzzy cluster analysis method. We describe the system evaluation module and the cluster analysis module in detail, along with how these two modules were implemented. Finally, we give the results of the system.
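
A minimal fuzzy c-means clustering routine of the kind such an evaluation module could use is sketched below; the toy "enterprise" scores and the choice of two clusters are illustrative assumptions, not the thesis's data.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns membership matrix U (n x c) and centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances from every point to every center (tiny eps avoids 0/0).
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        # Standard membership update: u_ik = 1 / sum_j (d_ik/d_jk)^p.
        U = 1.0 / (d ** p * np.sum(d ** -p, axis=1, keepdims=True))
    return U, centers

# Two obvious groups of "enterprises" scored on two indices (toy data).
X = np.array([[0.90, 0.80], [0.85, 0.90], [0.95, 0.85],
              [0.20, 0.10], [0.15, 0.20], [0.10, 0.15]])
U, centers = fuzzy_c_means(X)
labels = U.argmax(axis=1)   # hard assignment from soft memberships
```

Unlike hard k-means, U keeps graded memberships, which is the point of a fuzzy evaluation: a borderline enterprise gets partial membership in both ability classes.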

  10. Assessing “First Mile” Supply Chain Factors Affecting Timeliness of School-Based Deworming Interventions: Supply and Logistics Performance Indicators

    PubMed Central

    Koporc, Kimberly M.; Strunz, Eric; Holloway, Cassandra; Addiss, David G.; Lin, William

    2015-01-01

    Background Between 2007 and 2012, Children Without Worms (CWW) oversaw the Johnson & Johnson (J&J) donation of Vermox (mebendazole) for treatment of school-age children to control soil-transmitted helminthiasis (STH). To identify factors associated with on-time, delayed, or missed mass drug administration (MDA) interventions, and explore possible indicators for supply chain performance for drug donation programs, we reviewed program data for the 14 STH-endemic countries CWW supported during 2007–2012. Methodology Data from drug applications, shipping records, and annual treatment reports were tracked using Microsoft Excel. Qualitative data from interviews with key personnel were used to provide additional context on the causes of delayed or missed MDAs. Four possible contributory factors to delayed or missed MDAs were considered: production, shipping, customs clearance, and miscellaneous in-country issues. Coverage rates were calculated by dividing the number of treatments administered by the number of children targeted during the MDA. Principal Findings Of the approved requests for 78 MDAs, 54 MDAs (69%) were successfully implemented during or before the scheduled month. Ten MDAs (13%) were classified as delayed; seven of these were delayed by one month or less. An additional 14 MDAs (18%) were classified as missed. For the 64 on-time or delayed MDAs, the mean coverage was approximately 88%. Conclusions and Significance To continue to assess the supply chain processes and identify areas for improvement, we identified four indicators or metrics for supply chain performance that can be applied across all neglected tropical disease (NTD) drug donation programs: (1) donor having available inventory to satisfy the country request for donation; (2) donor shipping the approved number of doses; (3) shipment arriving at the Central Medical Stores one month in advance of the scheduled MDA date; and (4) country programs implementing the MDA as scheduled. PMID:26657842
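
The coverage calculation and the on-time/delayed/missed categorisation described above are simple enough to state directly; the helper names below are our own, and `None` standing for a never-implemented MDA is an assumed convention.

```python
# Coverage rate as defined in the abstract: treatments administered
# divided by children targeted during the MDA.
def coverage_rate(administered, targeted):
    return administered / targeted

# Timing categories follow the abstract: implemented during or before
# the scheduled month = on-time; later = delayed; never = missed.
def classify_mda(scheduled_month, actual_month):
    if actual_month is None:
        return "missed"
    return "on-time" if actual_month <= scheduled_month else "delayed"

cov = coverage_rate(88_000, 100_000)   # 0.88, matching the ~88% mean reported
statuses = [classify_mda(6, 6), classify_mda(6, 7), classify_mda(6, None)]
```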

  11. Logistics measurement and performance for United States-Mexican operations under NAFTA

    SciTech Connect

    Fawcett, S.E.; Smith, S.R.

    1995-12-01

    An empirical study utilizing a survey methodology was undertaken to explore the issues surrounding logistics performance under the recently enacted North American Free Trade Agreement (NAFTA). The study surveyed 524 senior-level managers directly responsible for their strategic business units' operations involving Mexican production sharing. The study focused on what role Mexican production facilities take in the production process, relative technology level, planning activities, final destination of products, and what level of logistics performance was required to operate successfully. Some of the findings suggest a need to reevaluate current strategies to incorporate logistics support systems. Many benefits of true integration may have been overlooked, since logistics was given a secondary position when strategies were formulated. Excessive transportation and distribution costs may be lowered if logistics is given a higher emphasis in corporate decision making.

  12. Quantum image encryption based on generalized affine transform and logistic map

    NASA Astrophysics Data System (ADS)

    Liang, Hao-Ran; Tao, Xiang-Yang; Zhou, Nan-Run

    2016-07-01

    Quantum circuits of the generalized affine transform are devised based on the novel enhanced quantum representation of digital images. A novel quantum image encryption algorithm combining the generalized affine transform with logistic map is suggested. The gray-level information of the quantum image is encrypted by the XOR operation with a key generator controlled by the logistic map, while the position information of the quantum image is encoded by the generalized affine transform. The encryption keys include the independent control parameters used in the generalized affine transform and the logistic map. Thus, the key space is large enough to frustrate the possible brute-force attack. Numerical simulations and analyses indicate that the proposed algorithm is realizable, robust and has a better performance than its classical counterpart in terms of computational complexity.
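
The paper's scheme is quantum, but the XOR-with-a-logistic-map-keystream idea has a direct classical sketch: iterate x → r·x·(1−x), map each chaotic value to a byte, and XOR it with the grey levels. The parameters x0 and r below play the role of keys and are illustrative values, not the paper's.

```python
def logistic_keystream(n, x0=0.3567, r=3.99):
    """Derive n key bytes from the logistic map (illustrative parameters)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)  # map chaotic value in (0,1) to a byte
    return out

def xor_cipher(pixels, x0=0.3567, r=3.99):
    ks = logistic_keystream(len(pixels), x0, r)
    return [p ^ k for p, k in zip(pixels, ks)]

image = [12, 255, 0, 128, 73]   # toy grey-level sequence
cipher = xor_cipher(image)
plain = xor_cipher(cipher)      # XOR with the same keystream is self-inverse
```

Because XOR is an involution, decryption is the same operation with the same key; the large key space claimed in the abstract comes from the sensitivity of the map to x0 and r.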

  13. Risk assessment of logistics outsourcing based on BP neural network

    NASA Astrophysics Data System (ADS)

    Liu, Xiaofeng; Tian, Zi-you

    The purpose of this article is to evaluate the risk of enterprise logistics outsourcing. To this end, the paper first analyses the main risks existing in logistics outsourcing and sets up a risk evaluation index system for logistics outsourcing; it then applies a BP neural network to logistics outsourcing risk evaluation and uses MATLAB for the simulation. The results show that the network error is small and the method has strong practicability, so it can be used by enterprises to evaluate the risks of logistics outsourcing.
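
A backpropagation ("BP") network of the kind applied above can be sketched in a few lines of numpy. The architecture (4 risk indices in, 5 hidden units, 1 risk output), the toy data, and the labelling rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((40, 4))                            # 4 risk-index scores per case
y = (X.mean(axis=1, keepdims=True) > 0.5) * 1.0    # toy high/low risk label

W1 = rng.normal(0, 0.5, (4, 5)); b1 = np.zeros(5)  # input -> hidden
W2 = rng.normal(0, 0.5, (5, 1)); b2 = np.zeros(1)  # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(500):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    d_out = (out - y) * out * (1 - out)      # backward pass (MSE + sigmoid)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out / len(X); b2 -= 0.5 * d_out.mean(axis=0)
    W1 -= 0.5 * X.T @ d_h / len(X);  b1 -= 0.5 * d_h.mean(axis=0)
```

The training loss falls steadily, which is the "small network error" the abstract reports; a real risk model would of course train on expert-scored index data rather than this synthetic rule.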

  14. Determine the optimal carrier selection for a logistics network based on multi-commodity reliability criterion

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Kuei; Yeh, Cheng-Ta

    2013-05-01

    From the perspective of supply chain management, the selected carrier plays an important role in freight delivery. This article proposes a new criterion of multi-commodity reliability and optimises the carrier selection based on such a criterion for logistics networks with routes and nodes, over which multiple commodities are delivered. Carrier selection concerns the selection of exactly one carrier to deliver freight on each route. The capacity of each carrier has several available values associated with a probability distribution, since some of a carrier's capacity may be reserved for various orders. Therefore, the logistics network, given any carrier selection, is a multi-commodity multi-state logistics network. Multi-commodity reliability is defined as a probability that the logistics network can satisfy a customer's demand for various commodities, and is a performance indicator for freight delivery. To solve this problem, this study proposes an optimisation algorithm that integrates genetic algorithm, minimal paths and Recursive Sum of Disjoint Products. A practical example in which multi-sized LCD monitors are delivered from China to Germany is considered to illustrate the solution procedure.

  15. Chaos-based secure communication system using logistic map

    NASA Astrophysics Data System (ADS)

    Singh, Narendra; Sinha, Aloka

    2010-03-01

    We propose a new opto-electronic secure communication system using logistic map and pulse position modulation. A modified version of the electronic circuit of the logistic map is used to generate the chaotic signal. Pulse position modulation scheme together with the logistic map has been used to encrypt the signal. Optical fiber has been used to demonstrate the proposed scheme. Eye pattern has been used to verify the noise-like nature of the encrypted signal. Opto-electronic implementation of the technique has been carried out. Experimental results are presented to verify the validity of the proposed technique.

  16. Actigraphy-based scratch detection using logistic regression.

    PubMed

    Petersen, Johanna; Austin, Daniel; Sack, Robert; Hayes, Tamara L

    2013-03-01

    Incessant scratching as a result of diseases such as atopic dermatitis causes skin breakdown, poor sleep quality, and reduced quality of life for affected individuals. In order to develop more effective therapies, there is a need for objective measures to detect scratching. Wrist actigraphy, which detects wrist movements over time using micro-accelerometers, has shown great promise in detecting scratch because it is lightweight, usable in the home environment, can record longitudinally, and does not require any wires. However, current actigraphy-based scratch-detection methods are limited in their ability to discriminate scratch from other nighttime activities. Our previous work demonstrated the separability of scratch from both walking and restless sleep using a clustering technique which employed four features derived from the actigraphic data: the number of accelerations above 0.01 g, epoch variance, peak frequency, and the autocorrelation value at lag one. In this paper, we extend these results by employing these same features as independent variables in a logistic regression model. This allows us to directly estimate the conditional probability of scratching for each epoch. Our approach outperforms competing actigraphy-based approaches and has both high sensitivity (0.96) and specificity (0.92) for identifying scratch, as validated on experimental data collected from 12 healthy subjects. The model must still be fully validated on clinical data, but shows promise for applications to clinical trials and longitudinal studies of scratch.
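
Scoring epochs with a logistic model over the four features named above looks like the sketch below. The weights and bias are made-up placeholders to show the mechanics; the published model's coefficients are not reproduced here.

```python
import math

FEATURES = ["n_accel_above_0.01g", "epoch_variance",
            "peak_frequency", "autocorr_lag1"]
WEIGHTS = [0.8, 1.5, 0.6, -1.2]   # placeholder coefficients, not the paper's
BIAS = -2.0                       # placeholder intercept

def p_scratch(epoch):
    """Conditional probability of scratching for one epoch (dict of features)."""
    z = BIAS + sum(w * epoch[f] for w, f in zip(WEIGHTS, FEATURES))
    return 1.0 / (1.0 + math.exp(-z))

quiet = {"n_accel_above_0.01g": 0.1, "epoch_variance": 0.05,
         "peak_frequency": 0.2, "autocorr_lag1": 0.9}
active = {"n_accel_above_0.01g": 2.0, "epoch_variance": 1.5,
          "peak_frequency": 1.0, "autocorr_lag1": 0.1}
```

The advantage over the earlier clustering approach is exactly this per-epoch probability: thresholds can then be tuned to trade sensitivity against specificity.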

  17. Development of Performance Assessments in Science: Conceptual, Practical, and Logistical Issues.

    ERIC Educational Resources Information Center

    Solano-Flores, Guillermo; Shavelson, Richard J.

    1997-01-01

    Conceptual, practical, and logistical issues in the development of science performance assessments (SPAs) are discussed. The conceptual framework identifies task, response format, and scoring system as components, and conceives of SPAs as tasks that attempt to recreate conditions in which scientists work. Developing SPAs is a sophisticated effort…

  18. Logistics of a Lunar Based Solar Power Satellite Scenario

    NASA Technical Reports Server (NTRS)

    Melissopoulos, Stefanos

    1995-01-01

    A logistics system composed of two orbital stations for the support of a 500 GW space power satellite scenario in geostationary orbit was investigated in this study. A subsystem mass model, a mass flow model and a life cycle cost model were developed. The results regarding logistics cost and burden rates show that transportation contributed the most (96%) to the overall cost of the scenario. The orbital stations in geostationary and lunar orbit contributed 4% to that cost.

  19. Practical investigation of the performance of robust logistic regression to predict the genetic risk of hypertension.

    PubMed

    Kesselmeier, Miriam; Legrand, Carine; Peil, Barbara; Kabisch, Maria; Fischer, Christine; Hamann, Ute; Lorenzo Bermejo, Justo

    2014-01-01

    Logistic regression is usually applied to investigate the association between inherited genetic variants and a binary disease phenotype. A limitation of standard methods used to estimate the parameters of logistic regression models is their strong dependence on a few observations deviating from the majority of the data. We used data from the Genetic Analysis Workshop 18 to explore the possible benefit of robust logistic regression to estimate the genetic risk of hypertension. The comparison between standard and robust methods relied on the influence of departing hypertension profiles (outliers) on the estimated odds ratios, areas under the receiver operating characteristic curves, and clinical net benefit. Our results confirmed that single outliers may substantially affect the estimated genotype relative risks. The ranking of variants by probability values was different in standard and in robust logistic regression. For cutoff probabilities between 0.2 and 0.6, the clinical net benefit estimated by leave-one-out cross-validation in the investigated sample was slightly larger under robust regression, but the overall area under the receiver operating characteristic curve was larger for standard logistic regression. The potential advantage of robust statistics in the context of genetic association studies should be investigated in future analyses based on real and simulated data.

  20. The International Space Station's Multi-Purpose Logistics Module, Thermal Performance of the First Five Flights

    NASA Technical Reports Server (NTRS)

    Holladay, Jon; Cho, Frank

    2003-01-01

    The Multi-Purpose Logistics Module is the primary carrier for transport of pressurized payload to the International Space Station. Performing five missions within a thirteen month span provided a unique opportunity to gather a great deal of information toward understanding and verifying the orbital performance of the vehicle. This paper will provide a brief overview of the hardware history and design capabilities followed by a summary of the missions flown, resource requirements and possibilities for the future.

  1. A Multicriteria Decision Making Approach Based on Fuzzy Theory and Credibility Mechanism for Logistics Center Location Selection

    PubMed Central

    Wang, Bowen; Jiang, Chengrui

    2014-01-01

    As a hot topic in supply chain management, fuzzy method has been widely used in logistics center location selection to improve the reliability and suitability of the logistics center location selection with respect to the impacts of both qualitative and quantitative factors. However, it does not consider the consistency and the historical assessments accuracy of experts in predecisions. So this paper proposes a multicriteria decision making model based on credibility of decision makers by introducing priority of consistency and historical assessments accuracy mechanism into fuzzy multicriteria decision making approach. In this way, only decision makers who pass the credibility check are qualified to perform the further assessment. Finally, a practical example is analyzed to illustrate how to use the model. The result shows that the fuzzy multicriteria decision making model based on credibility mechanism can improve the reliability and suitability of site selection for the logistics center. PMID:25215319

  2. Logistic regression function for detection of suspicious performance during baseline evaluations using concussion vital signs.

    PubMed

    Hill, Benjamin David; Womble, Melissa N; Rohling, Martin L

    2015-01-01

    This study utilized logistic regression to determine whether performance patterns on Concussion Vital Signs (CVS) could differentiate known groups with either genuine or feigned performance. For the embedded measure development group (n = 174), clinical patients and undergraduate students categorized as feigning obtained significantly lower scores on the overall test battery mean for the CVS, Shipley-2 composite score, and California Verbal Learning Test-Second Edition subtests than did genuinely performing individuals. The final full model of 3 predictor variables (Verbal Memory immediate hits, Verbal Memory immediate correct passes, and Stroop Test complex reaction time correct) was significant and correctly classified individuals in their known group 83% of the time (sensitivity = .65; specificity = .97) in a mixed sample of young-adult clinical cases and simulators. The CVS logistic regression function was applied to a separate undergraduate college group (n = 378) that was asked to perform genuinely and identified 5% as having possibly feigned performance indicating a low false-positive rate. The failure rate was 11% and 16% at baseline cognitive testing in samples of high school and college athletes, respectively. These findings have particular relevance given the increasing use of computerized test batteries for baseline cognitive testing and return-to-play decisions after concussion.

  3. Logistic Regression-Based Trichotomous Classification Tree and Its Application in Medical Diagnosis.

    PubMed

    Zhu, Yanke; Fang, Jiqian

    2016-11-01

    The classification tree is a valuable methodology for predictive modeling and data mining. However, existing classification trees ignore the fact that there might be a subset of individuals who cannot be well classified based on the given set of predictor variables and who might be classified with a higher error rate; moreover, most existing classification trees do not use combinations of variables at each step. An algorithm for a logistic regression-based trichotomous classification tree (LRTCT) is proposed that employs a trichotomous tree structure and linear combinations of predictor variables in the recursive partitioning process. Compared with the widely used classification and regression tree in applications on a series of simulated data and 2 real data sets, the LRTCT performed better in several respects and does not require excessively complicated calculations.

  4. Mars Scenario-Based Visioning: Logistical Optimization of Transportation Architectures

    NASA Astrophysics Data System (ADS)

    1999-01-01

    The purpose of this conceptual design investigation is to examine transportation forecasts for future human missions to Mars. Scenario-based visioning is used to generate possible future demand projections. These scenarios are then coupled with availability, cost, and capacity parameters for indigenously designed Mars Transfer Vehicles (solar electric, nuclear thermal, and chemical propulsion types) and Earth-to-orbit launch vehicles (current, future, and indigenous) to provide a cost-conscious dual-phase launch manifest to meet such future demand. A simulator named M-SAT (Mars Scenario Analysis Tool) is developed using this method. This simulation is used to examine three specific transportation scenarios to Mars: a limited "flags and footprints" mission, a more ambitious scientific expedition similar to an expanded version of NASA's Design Reference Mission, and a long-term colonization scenario. Initial results from the simulation indicate that chemical propulsion systems might be the architecture of choice for all three scenarios. With this in mind, "what if" analyses were performed, which indicated that if nuclear production costs were reduced by 30% for the colonization scenario, the nuclear architecture would have a lower life cycle cost than the chemical one. Results indicate that the most cost-effective solution to the Mars transportation problem is to plan for segmented development; this involves development of one vehicle at one opportunity and derivatives of that vehicle at subsequent opportunities.

  5. Based on 3G and RFID logistic delivery management system application and practice analysis

    NASA Astrophysics Data System (ADS)

    Li, Xiaojun; Peng, Longjun; Zhong, Kaiwen; Huang, Jianming

    2008-10-01

    In view of the characteristics of logistics delivery management, this article analyses why current logistics delivery management cannot satisfy requirements such as rapid reaction and integrated transportation. It elaborates a logistics delivery management system based on 3G (GIS, GPS, and GPRS) and RFID technology, covering the system's contents, its design and architecture, and their effective integration. The system design uses a systems engineering method, follows a user-centred approach, and starts from the users' demands; according to user and network requirements, it is divided into four layers: the decision-making layer, the service layer, the management and maintenance layer, and the technical support layer. The overall structural design includes the system function structure and the software system design. Taking a provincial logistics delivery management system as an example, the design approach and the mode of application are introduced.

  6. Comparing performances of heuristic and logistic regression models for a spatial landslide susceptibility assessment in Maramureş County, Northwestern Romania

    NASA Astrophysics Data System (ADS)

    Mǎgut, F. L.; Zaharia, S.; Glade, T.; Irimuş, I. A.

    2012-04-01

    Various methods exist for analyzing spatial landslide susceptibility and classifying the results into susceptibility classes. The prediction of spatial landslide distribution can be performed using a variety of methods based on GIS techniques. The two very common methods of a heuristic assessment and a logistic regression model are employed in this study in order to compare their performance in predicting the spatial distribution of previously mapped landslides for a study area located in Maramureş County, in Northwestern Romania. The first model determines a susceptibility index by combining the heuristic approach with GIS techniques of spatial data analysis. The criteria used for quantifying each susceptibility factor and the expression used to determine the susceptibility index are taken from the Romanian legislation (Governmental Decision 447/2003). This procedure is followed in any Romanian state-ordered study which relies on financial support. The logistic regression model predicts the spatial distribution of landslides by statistically calculating regression coefficients which describe the dependency of previously mapped landslides on different factors. The identified shallow landslides correspond generally to Pannonian marl and Quaternary contractile clay deposits. The study region is located in the Northwestern part of Romania, including the Baia Mare municipality, the capital of Maramureş County. The study focuses on the former piedmontal region situated to the south of the volcanic mountains Gutâi, in the Baia Mare Depression, where most of the landslide activity has been recorded. In addition, a narrow sector of the volcanic mountains which borders the city of Baia Mare to the north has also been included to test the accuracy of the models in different lithologic units. The results of both models indicate a general medium landslide susceptibility of the study area. The more detailed differences will be discussed with respect to the advantages and

  7. A logistic regression based approach for the prediction of flood warning threshold exceedance

    NASA Astrophysics Data System (ADS)

    Diomede, Tommaso; Trotter, Luca; Stefania Tesini, Maria; Marsigli, Chiara

    2016-04-01

    A method based on logistic regression is proposed for the prediction of river level threshold exceedance at short (+0-18h) and medium (+18-42h) lead times. The aim of the study is to provide a valuable tool for the issuing of warnings by the authority responsible for public safety in case of flood. The role of different precipitation periods as predictors for the exceedance of a fixed river level has been investigated, in order to derive significant information for flood forecasting. Based on catchment-averaged values, a separation of "antecedent" and "peak-triggering" rainfall amounts as independent variables is attempted. In particular, the following flood-related precipitation periods have been considered: (i) the period from 1 to n days before the forecast issue time, which may be relevant for the soil saturation, (ii) the last 24 hours, which may be relevant for the current water level in the river, and (iii) the period from 0 to x hours ahead of the forecast issue time, when the flood-triggering precipitation generally occurs. Several combinations and values of these predictors have been tested to optimise the method implementation. In particular, the period for the antecedent precipitation ranges between 5 and 45 days; the state of the river can be represented by the last 24-h precipitation or, as an alternative, by the current river level. The flood-triggering precipitation has been cumulated over the next 18 hours (for the short lead time) and 36-42 hours (for the medium lead time). The proposed approach requires a specific implementation of logistic regression for each river section and warning threshold. The method performance has been evaluated over the Santerno river catchment (about 450 km2) in the Emilia-Romagna Region, northern Italy. A statistical analysis in terms of false alarms, misses and related scores was carried out by using an 8-year-long database. The results are quite satisfactory, with slightly better performances
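
    The threshold-exceedance classifier described above can be sketched with an off-the-shelf logistic regression. The predictors, coefficients and data below are hypothetical stand-ins for the catchment-averaged precipitation periods named in the abstract (antecedent, last-24h and triggering rainfall), not the authors' fitted model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
# Hypothetical catchment-averaged precipitation predictors (mm)
antecedent_30d = rng.gamma(4.0, 20.0, n)   # rain over the previous 30 days (soil saturation)
last_24h = rng.gamma(2.0, 8.0, n)          # rain over the last 24 hours (current river state)
next_18h = rng.gamma(2.0, 10.0, n)         # forecast rain over the next 18 hours (trigger)

# Synthetic labels: wetter antecedent/triggering conditions -> more likely exceedance
score = 0.01 * antecedent_30d + 0.05 * last_24h + 0.08 * next_18h - 4.0
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-score))).astype(int)

X = np.column_stack([antecedent_30d, last_24h, next_18h])
model = LogisticRegression(max_iter=1000).fit(X, y)

# Probability that the warning threshold is exceeded for a new situation
p = model.predict_proba([[150.0, 40.0, 60.0]])[0, 1]
print(round(p, 3))
```

    In practice one such model would be fitted per river section and per warning threshold, as the abstract notes.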

  8. Research on reverse logistics location under uncertainty environment based on grey prediction

    NASA Astrophysics Data System (ADS)

    Zhenqiang, Bao; Congwei, Zhu; Yuqin, Zhao; Quanke, Pan

    This article constructs a reverse logistics network under an uncertain environment, integrates the reverse logistics network with the distribution network, and forms a closed-loop network. An optimization model based on cost is established to help intermediate centers, manufacturing centers and remanufacturing centers make location decisions. A grey model, GM(1,1), is used to predict the product holdings of the collection points; the prediction results are then carried into the cost optimization model and a solution is obtained. Finally, an example is given to verify the effectiveness and feasibility of the model.
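
    The GM(1,1) grey prediction step is a standard procedure and can be sketched as follows; the holdings series is invented for illustration and this is not the paper's code:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey forecast: fit dx1/dt + a*x1 = b on the accumulated series."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                 # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # mean sequence of x1
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey development coefficients

    def x1_hat(k):                                     # time-response function
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    n = len(x0)
    ks = np.arange(n - 1, n + steps)                   # last fitted point plus forecasts
    return np.diff(x1_hat(ks))                         # inverse accumulation -> x0 scale

# Hypothetical yearly product holdings at a collection point
holdings = [120, 132, 148, 165, 184]
print(gm11_forecast(holdings, steps=2))
```

    The forecasts would then feed the collection quantities used by the location-cost model.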

  9. Case Study on Optimal Routing in Logistics Network by Priority-based Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoguang; Lin, Lin; Gen, Mitsuo; Shiota, Mitsushige

    Recently, research on logistics has attracted more and more attention. One of the important issues in logistics systems is to find optimal delivery routes with the least cost for product delivery. Numerous models have been developed for that reason. However, due to the diversity and complexity of practical problems, the existing models usually cannot find solutions efficiently and conveniently. In this paper, we treat a real-world logistics case involving a company, ABC Co., Ltd., in Kitakyusyu, Japan. Firstly, based on the nature of this conveyance routing problem, we formulate it as a minimum cost flow (MCF) model, as an extension of the transportation problem (TP) and the fixed charge transportation problem (fcTP). Due to the complexity of the fcTP, we propose a priority-based genetic algorithm (pGA) approach to find the most acceptable solution to this problem. In this pGA approach, a two-stage path decoding method is adopted to develop delivery paths from a chromosome. We apply the pGA approach to this problem, compare our results with the current logistics network situation, and calculate the improvement in logistics cost to help management make decisions. Finally, in order to check the effectiveness of the proposed method, the results acquired are compared with those obtained from the two optimization packages LINDO and CPLEX.
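
    Priority-based encoding can be illustrated on a plain transportation problem: the chromosome assigns a priority to every source and sink, and decoding repeatedly expands the highest-priority node toward its cheapest open partner. The instance below is a made-up example in that general style, not the ABC Co. network or the paper's exact decoder:

```python
import numpy as np

def decode_priority(priorities, supply, demand, cost):
    """Decode a priority chromosome into a transportation plan.

    Nodes are ordered sources first, then sinks; assumes a balanced
    problem (total supply == total demand)."""
    supply, demand = list(supply), list(demand)
    m, n = len(supply), len(demand)
    flow = np.zeros((m, n))
    prio = list(priorities)
    while sum(demand) > 0:
        k = max(range(m + n), key=lambda i: prio[i])       # highest-priority node
        if k < m:                                          # a source: serve cheapest open sink
            i = k
            j = min((j for j in range(n) if demand[j] > 0), key=lambda j: cost[i][j])
        else:                                              # a sink: draw from cheapest open source
            j = k - m
            i = min((i for i in range(m) if supply[i] > 0), key=lambda i: cost[i][j])
        q = min(supply[i], demand[j])                      # ship as much as possible
        flow[i, j] += q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            prio[i] = -1                                   # exhausted nodes leave the pool
        if demand[j] == 0:
            prio[m + j] = -1
    return flow

# Made-up instance: 2 sources, 3 sinks, one candidate chromosome
flow = decode_priority([5, 3, 4, 1, 2], [30, 25], [20, 20, 15], [[4, 6, 5], [3, 7, 6]])
print(flow)
```

    A GA would then evolve the priority vectors, using the decoded plan's cost as the fitness value.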

  10. A Graph Summarization Algorithm Based on RFID Logistics

    NASA Astrophysics Data System (ADS)

    Sun, Yan; Hu, Kongfa; Lu, Zhipeng; Zhao, Li; Chen, Ling

    Radio Frequency Identification (RFID) applications are set to play an essential role in object tracking and supply chain management systems. The volume of data generated by a typical RFID application will be enormous, as each item generates a complete history of all the individual locations that it occupied at every point in time. The movement trails of such RFID data form a gigantic commodity flow graph representing the locations and durations of the path stages traversed by each item. In this paper, we use graphs to construct a warehouse of RFID commodity flows, and introduce a database-style operation to summarize graphs, which produces a summary graph by grouping nodes based on user-selected node attributes and further allows users to control the hierarchy of summaries. It can cut down the size of graphs and lets users study just the shrunken graph in which they are interested. Through extensive experiments, we demonstrate the effectiveness and efficiency of the proposed method.

  11. Computerized Classification Testing under the One-Parameter Logistic Response Model with Ability-Based Guessing

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Huang, Sheng-Yun

    2011-01-01

    The one-parameter logistic model with ability-based guessing (1PL-AG) has recently been developed to account for the effect of ability on guessing behavior in multiple-choice items. In this study, the authors developed algorithms for computerized classification testing under the 1PL-AG and conducted a series of simulations to evaluate their…

  12. Reducing false positive incidental findings with ensemble genotyping and logistic regression-based variant filtering methods

    PubMed Central

    Hwang, Kyu-Baek; Lee, In-Hee; Park, Jin-Ho; Hambuch, Tina; Choi, Yongjoon; Kim, MinHyeok; Lee, Kyungjoon; Song, Taemin; Neu, Matthew B.; Gupta, Neha; Kohane, Isaac S.; Green, Robert C.; Kong, Sek Won

    2014-01-01

    As whole genome sequencing (WGS) uncovers variants associated with rare and common diseases, an immediate challenge is to minimize false positive findings due to sequencing and variant calling errors. False positives can be reduced by combining results from orthogonal sequencing methods, but this is costly. Here we present variant filtering approaches using logistic regression (LR) and ensemble genotyping to minimize false positives without sacrificing sensitivity. We evaluated the methods using paired WGS datasets of an extended family prepared using two sequencing platforms and a validated set of variants in NA12878. Using LR- or ensemble-genotyping-based filtering, false negative rates were significantly reduced by 1.1- to 17.8-fold at the same levels of false discovery rates (5.4% for heterozygous and 4.5% for homozygous SNVs; 30.0% for heterozygous and 18.7% for homozygous insertions; 25.2% for heterozygous and 16.6% for homozygous deletions) compared to filtering based on genotype quality scores. Moreover, ensemble genotyping excluded > 98% (105,080 of 107,167) of false positives while retaining > 95% (897 of 937) of true positives in de novo mutation (DNM) discovery, and performed better than a consensus method using two sequencing platforms. Our proposed methods were effective in prioritizing phenotype-associated variants, and ensemble genotyping would be essential to minimize false positive DNM candidates. PMID:24829188

  13. Double-image encryption based on discrete multiple-parameter fractional angular transform and two-coupled logistic maps

    NASA Astrophysics Data System (ADS)

    Sui, Liansheng; Duan, Kuaikuai; Liang, Junli

    2015-05-01

    A new discrete fractional transform defined by fractional order, periodicity and vector parameters is presented, named the discrete multiple-parameter fractional angular transform. Based on this transform and a two-coupled logistic map, a double-image encryption scheme is proposed. First, an enlarged image is obtained by connecting two plaintext images sequentially and scrambled by using a chaotic permutation process, in which the sequences of chaotic pairs are generated by using the two-coupled logistic map. Then, the scrambled enlarged image is decomposed into two new components. Second, a chaotic random phase mask is generated based on the logistic map, with which one of the two components is converted to a modulation phase mask. The other component is encoded into an interim matrix with the help of the modulation phase mask. Finally, the two-dimensional discrete multiple-parameter fractional angular transform is performed on the interim matrix to obtain the ciphertext with a stationary white noise distribution. The proposed encryption scheme has an obvious advantage in that no phase keys are used in the encryption and decryption process, which is convenient for key management. Moreover, the security of the cryptosystem can be enhanced by using extra parameters such as the initial values of the chaos functions, the fractional orders and the vector parameters of the transform. Simulation results and security analysis verify the feasibility and effectiveness of the proposed scheme.
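
    The chaotic-permutation ingredient is easy to sketch with a single logistic map; the paper uses a two-coupled variant, so the single-map version below, with invented parameters, is a simplified stand-in:

```python
import numpy as np

def logistic_sequence(x0, r=3.99, n=256, discard=100):
    """Iterate x_{k+1} = r * x_k * (1 - x_k) in the chaotic regime."""
    x = x0
    for _ in range(discard):      # discard the transient iterations
        x = r * x * (1 - x)
    seq = np.empty(n)
    for k in range(n):
        x = r * x * (1 - x)
        seq[k] = x
    return seq

# The initial value x0 acts as a secret key
seq = logistic_sequence(x0=0.3456, n=16)
perm = np.argsort(seq)            # chaotic values -> a permutation of pixel indices

pixels = np.arange(16)            # stand-in for a flattened image
scrambled = pixels[perm]          # encryption-side scrambling

restored = np.empty_like(scrambled)
restored[perm] = scrambled        # decryption-side inverse permutation
print(np.array_equal(restored, pixels))  # -> True
```

    Sensitivity to x0 is what makes the permutation usable as key material: a tiny change in the initial value yields an unrelated ordering.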

  14. Implementing a High Performance Work Place in the Distribution and Logistics Industry: Recommendations for Leadership & Team Member Development

    ERIC Educational Resources Information Center

    McCann, Laura Harding

    2012-01-01

    Leadership development and employee engagement are two elements critical to the success of organizations. In response to growth opportunities, our Distribution and Logistics company set on a course to implement High Performance Work Place to meet the leadership and employee engagement needs, and to find methods for improving work processes. This…

  15. Organizational Learning, Strategic Flexibility and Business Model Innovation: An Empirical Research Based on Logistics Enterprises

    NASA Astrophysics Data System (ADS)

    Bao, Yaodong; Cheng, Lin; Zhang, Jian

    Using data from 237 Jiangsu logistics firms, this paper empirically studies the relationship among organizational learning capability, business model innovation, and strategic flexibility. The results show the following: organizational learning capability has a positive impact on business model innovation performance; strategic flexibility plays a mediating role in the relationship between organizational learning capability and business model innovation; and the interaction among strategic flexibility, explorative learning and exploitative learning plays a significant role in both radical and incremental business model innovation.

  16. A Multi-Stage Reverse Logistics Network Problem by Using Hybrid Priority-Based Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Lee, Jeong-Eun; Gen, Mitsuo; Rhee, Kyong-Gu

    Today the remanufacturing problem is one of the most important problems regarding the environmental aspects of the recovery of used products and materials. Therefore, reverse logistics is gaining momentum and shows great potential for winning consumers in a more competitive context in the future. This paper considers the multi-stage reverse Logistics Network Problem (m-rLNP), minimizing the total cost, which involves the reverse logistics shipping cost and the fixed cost of opening the disassembly centers and processing centers. In this study, we first formulate the m-rLNP model as a three-stage logistics network model. To solve this problem, we then propose a genetic algorithm (GA) with a two-stage priority-based encoding method, and introduce a new crossover operator called Weight Mapping Crossover (WMX). Additionally, a heuristic approach is applied in the third stage to ship materials from processing centers to the manufacturer. Finally, numerical experiments with various scales of m-rLNP models demonstrate the effectiveness and efficiency of our approach by comparison with recent research.

  17. Business process re-engineering in the logistics industry: a study of implementation, success factors, and performance

    NASA Astrophysics Data System (ADS)

    Shen, Chien-wen; Chou, Ching-Chih

    2010-02-01

    As business process re-engineering (BPR) is an important foundation for ensuring the success of enterprise systems, this study investigates the relationships among BPR implementation, BPR success factors, and business performance for logistics companies. Our empirical findings show that BPR companies outperformed non-BPR companies, not only on information processing, technology applications, organisational structure, and co-ordination, but also on all of the major logistics operations. Comparing the different perceptions of the success factors for BPR, non-BPR companies place greater emphasis on the importance of employee involvement, while BPR companies are more concerned about the influence of risk management. Our findings also suggest that management attitude towards BPR success factors could affect performance with regard to technology applications and logistics operations. Logistics companies which have not yet implemented the BPR approach could refer to our findings to evaluate the advantages of such an undertaking and to attend to those BPR success factors affecting performance before conducting BPR projects.

  18. More efficient approaches to the exponentiated half-logistic distribution based on record values.

    PubMed

    Seo, Jung-In; Kang, Suk-Bok

    2016-01-01

    The exponentiated half-logistic distribution has various shapes depending on its shape parameter. This paper therefore proposes more efficient methods for estimating the shape parameter in the presence of a nuisance parameter, namely a scale parameter, from Bayesian and non-Bayesian perspectives when record values follow an exponentiated half-logistic distribution. In the frequentist approach, estimation methods based on pivotal quantities are proposed which require no complex computation, unlike the maximum likelihood method. In the Bayesian approach, a robust estimation method is developed by constructing a hierarchical structure for the parameter of interest. In addition, both approaches address how the nuisance parameter can be dealt with, and Monte Carlo simulations and analyses based on real data verify that the proposed methods are more efficient than existing ones. PMID:27652009

  19. Econometric analysis on the impact of macroeconomic variables toward financial performance: A case of Malaysian public listed logistics companies

    NASA Astrophysics Data System (ADS)

    Zakariah, Sahidah; Pyeman, Jaafar; Ghazali, Rahmat; Rahman, Ibrahim A.; Rashid, Ahmad Husni Mohd; Shamsuddin, Sofian

    2014-12-01

    The primary concern of this study is to analyse the impact of macroeconomic variables on financial performance, particularly in the case of public listed logistics companies in Malaysia. This study incorporates five macroeconomic variables and four proxies of financial performance. The macroeconomic variables selected are gross domestic product (GDP), total trade (XM), foreign direct investment (FDI), inflation rate (INF), and interest rate (INT). The study extends to the use of ratio analysis to predict financial performance in relation to changes in the macroeconomic variables. As such, four ratios are selected as proxies for financial performance: Operating Profit Margin (OPM), Net Profit Margin (NPM), Return on Assets (ROA), and Return on Equity (ROE). The findings of this study may appear non-controversial to some, but they result in the following important consensus: (1) GDP is found to have the greatest impact on NPM and the least on ROA; (2) XM has a strong positive impact on OPM and the least on ROE; (3) FDI appears to have an insignificant impact on NPM; and (4) INF and INT show a similar negative impact on financial performance, most strongly negative on OPM and least on ROA. Such findings also conform to the local logistics industry setting, specifically with regard to public listed logistics companies and their financial performance.

  20. Evaluation of the Logistic Model for GAC Performance in Water Treatment

    EPA Science Inventory

    Full-scale field measurement and rapid small-scale column test data from the Greater Cincinnati (Ohio) Water Works (GCWW) were used to calibrate and investigate the application of the logistic model for simulating breakthrough of total organic carbon (TOC) in granular activated c...
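
    A minimal sketch of fitting such a logistic breakthrough curve, using invented TOC observations rather than the GCWW data (the two-parameter logistic form below is an assumption for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_breakthrough(v, a, b):
    """Logistic breakthrough curve: effluent/influent TOC ratio vs throughput v."""
    return 1.0 / (1.0 + np.exp(a - b * v))

# Invented bed-volume vs TOC-ratio observations (not GCWW measurements)
bed_volumes = np.array([0.0, 2000.0, 4000.0, 6000.0, 8000.0, 10000.0, 12000.0])
toc_ratio = np.array([0.02, 0.08, 0.22, 0.48, 0.74, 0.90, 0.96])

(a, b), _ = curve_fit(logistic_breakthrough, bed_volumes, toc_ratio, p0=(4.0, 1e-3))

v50 = a / b   # throughput at 50% breakthrough, where a - b*v = 0
print(round(v50))
```

    Calibrating `a` and `b` against rapid small-scale column data and comparing the predicted curve with full-scale measurements mirrors the evaluation the abstract describes.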

  1. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty

    PubMed Central

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses. Nowadays, people pay more and more attention to emergency logistics problems. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that the paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that the travel time is uncertain, we establish a nonlinear programming model whose objective function is the maximization of the time-satisfaction degree. To overcome the drawbacks of incomplete information and uncertain time, this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that the vehicle only follows the optimal path from the emergency logistics center to the affected point, and solved using Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method. PMID:26417946

  2. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty.

    PubMed

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses. Nowadays, people pay more and more attention to emergency logistics problems. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that the paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that the travel time is uncertain, we establish a nonlinear programming model whose objective function is the maximization of the time-satisfaction degree. To overcome the drawbacks of incomplete information and uncertain time, this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that the vehicle only follows the optimal path from the emergency logistics center to the affected point, and solved using Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method.

  3. Spectral and Spatial-Based Classification for Broad-Scale Land Cover Mapping Based on Logistic Regression

    PubMed Central

    Mallinis, Georgios; Koutsias, Nikos

    2008-01-01

    Improvement of satellite sensor characteristics motivates the development of new techniques for satellite image classification. Spatial information seems to be critical in classification processes, especially for heterogeneous and complex landscapes such as those observed in the Mediterranean basin. In our study, a spectral classification method for a LANDSAT-5 TM imagery that uses several binomial logistic regression models was developed, evaluated and compared to the familiar parametric maximum likelihood algorithm. The classification approach based on logistic regression modelling was extended to a contextual one by using autocovariates to consider the spatial dependencies of every pixel with its neighbours. Finally, the maximum likelihood algorithm was upgraded to a contextual one by considering typicality, a measure which indicates the strength of class membership. The use of logistic regression for broad-scale land cover classification presented higher overall accuracy (75.61%) than the maximum likelihood algorithm (64.23%), although the difference was not statistically significant, even when the latter was refined following a spatial approach based on Mahalanobis distance (66.67%). However, the consideration of the spatial autocovariate in the logistic models significantly improved the fit of the models and increased the overall accuracy from 75.61% to 80.49%.
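
    The autocovariate idea can be sketched on synthetic data: alongside a spectral feature, each pixel receives the fraction of its 4-neighbours provisionally assigned to class 1, and both features feed a logistic regression. Everything below (scene, noise level, threshold) is invented for illustration, not the study's LANDSAT-5 TM setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
size = 40
xx, yy = np.meshgrid(np.arange(size), np.arange(size))
truth = ((xx + yy) > size).astype(int)                     # two spatially coherent classes
band = truth * 0.6 + rng.normal(0.0, 0.35, (size, size))   # noisy spectral response

# Autocovariate: fraction of 4-neighbours provisionally assigned to class 1
provisional = (band > 0.3).astype(int)
pad = np.pad(provisional, 1)
autocov = (pad[:-2, 1:-1] + pad[2:, 1:-1] + pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0

y = truth.ravel()
X_spectral = band.reshape(-1, 1)
X_context = np.column_stack([band.ravel(), autocov.ravel()])

acc_spectral = LogisticRegression().fit(X_spectral, y).score(X_spectral, y)
acc_context = LogisticRegression().fit(X_context, y).score(X_context, y)
print(round(acc_spectral, 3), round(acc_context, 3))
```

    Because neighbouring pixels tend to share a class, the autocovariate carries information the single-pixel spectral value lacks, which is the mechanism behind the accuracy gain the abstract reports.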

  4. Android platform based smartphones for a logistical remote association repair framework.

    PubMed

    Lien, Shao-Fan; Wang, Chun-Chieh; Su, Juhng-Perng; Chen, Hong-Ming; Wu, Chein-Hsing

    2014-01-01

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use. PMID:24967603

  5. Android Platform Based Smartphones for a Logistical Remote Association Repair Framework

    PubMed Central

    Lien, Shao-Fan; Wang, Chun-Chieh; Su, Juhng-Perng; Chen, Hong-Ming; Wu, Chein-Hsing

    2014-01-01

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use. PMID:24967603

  6. Comprehensible Predictive Modeling Using Regularized Logistic Regression and Comorbidity Based Features.

    PubMed

    Stiglic, Gregor; Povalej Brzan, Petra; Fijacko, Nino; Wang, Fei; Delibasic, Boris; Kalousis, Alexandros; Obradovic, Zoran

    2015-01-01

    Different studies have demonstrated the importance of comorbidities to better understand the origin and evolution of medical complications. This study focuses on improvement of the predictive model interpretability based on simple logical features representing comorbidities. We use group lasso based feature interaction discovery followed by a post-processing step, where simple logic terms are added. In the final step, we reduce the feature set by applying lasso logistic regression to obtain a compact set of non-zero coefficients that represent a more comprehensible predictive model. The effectiveness of the proposed approach was demonstrated on a pediatric hospital discharge dataset that was used to build a readmission risk estimation model. The evaluation of the proposed method demonstrates a reduction of the initial set of features in a regression model by 72%, with a slight improvement in the Area Under the ROC Curve metric from 0.763 (95% CI: 0.755-0.771) to 0.769 (95% CI: 0.761-0.777). Additionally, our results show improvement in comprehensibility of the final predictive model using simple comorbidity based terms for logistic regression.
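
    The final lasso step can be sketched with scikit-learn's L1-penalized logistic regression on invented binary comorbidity indicators; the dataset, coefficients and penalty strength below are hypothetical, not the study's:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n, p = 400, 30
# Hypothetical binary comorbidity indicators; only the first 5 drive readmission
X = rng.integers(0, 2, size=(n, p)).astype(float)
logit = X[:, :5] @ np.array([1.5, 1.2, -1.0, 0.9, -1.3]) - 0.2
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# L1 (lasso) logistic regression zeroes out weak coefficients -> compact model
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
kept = np.flatnonzero(lasso.coef_[0])
print(len(kept), "of", p, "features kept")
```

    The surviving non-zero coefficients form the compact, more comprehensible model the abstract describes; the strength of the reduction is controlled by `C`.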

  7. Android platform based smartphones for a logistical remote association repair framework.

    PubMed

    Lien, Shao-Fan; Wang, Chun-Chieh; Su, Juhng-Perng; Chen, Hong-Ming; Wu, Chein-Hsing

    2014-06-25

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use.

  8. A Time Scheduling Model of Logistics Service Supply Chain Based on the Customer Order Decoupling Point: A Perspective from the Constant Service Operation Time

    PubMed Central

    Yang, Yi; Xu, Haitao; Liu, Xiaoyan; Wang, Yijia; Liang, Zhicheng

    2014-01-01

    In mass customization logistics service, reasonable scheduling of the logistics service supply chain (LSSC), especially time scheduling, is beneficial to increasing its competitiveness. Therefore, the effect of a customer order decoupling point (CODP) on the time scheduling performance should be considered. To minimize the total order operation cost of the LSSC, minimize the difference between the expected and actual time of completing the service orders, and maximize the satisfaction of functional logistics service providers, this study establishes an LSSC time scheduling model based on the CODP. Matlab 7.8 software is used in the numerical analysis for a specific example. Results show that the order completion time of the LSSC can be delayed or be ahead of schedule but cannot be infinitely advanced or infinitely delayed. Obtaining the optimal comprehensive performance can be effective if the expected order completion time is appropriately delayed. The increase in supply chain comprehensive performance caused by the increase in the relationship coefficient of logistics service integrator (LSI) is limited. The relative concern degree of LSI on cost and service delivery punctuality leads to not only changes in CODP but also to those in the scheduling performance of the LSSC. PMID:24715818

  9. A time scheduling model of logistics service supply chain based on the customer order decoupling point: a perspective from the constant service operation time.

    PubMed

    Liu, Weihua; Yang, Yi; Xu, Haitao; Liu, Xiaoyan; Wang, Yijia; Liang, Zhicheng

    2014-01-01

    In mass customization logistics service, reasonable scheduling of the logistics service supply chain (LSSC), especially time scheduling, is beneficial to increasing its competitiveness. Therefore, the effect of a customer order decoupling point (CODP) on the time scheduling performance should be considered. To minimize the total order operation cost of the LSSC, minimize the difference between the expected and actual time of completing the service orders, and maximize the satisfaction of functional logistics service providers, this study establishes an LSSC time scheduling model based on the CODP. Matlab 7.8 software is used in the numerical analysis for a specific example. Results show that the order completion time of the LSSC can be delayed or be ahead of schedule but cannot be infinitely advanced or infinitely delayed. Obtaining the optimal comprehensive performance can be effective if the expected order completion time is appropriately delayed. The increase in supply chain comprehensive performance caused by the increase in the relationship coefficient of logistics service integrator (LSI) is limited. The relative concern degree of LSI on cost and service delivery punctuality leads to not only changes in CODP but also to those in the scheduling performance of the LSSC.

  10. A time scheduling model of logistics service supply chain based on the customer order decoupling point: a perspective from the constant service operation time.

    PubMed

    Liu, Weihua; Yang, Yi; Xu, Haitao; Liu, Xiaoyan; Wang, Yijia; Liang, Zhicheng

    2014-01-01

    In mass customization logistics service, reasonable scheduling of the logistics service supply chain (LSSC), especially time scheduling, is beneficial to increasing its competitiveness. Therefore, the effect of a customer order decoupling point (CODP) on the time scheduling performance should be considered. To minimize the total order operation cost of the LSSC, minimize the difference between the expected and actual time of completing the service orders, and maximize the satisfaction of functional logistics service providers, this study establishes an LSSC time scheduling model based on the CODP. Matlab 7.8 software is used in the numerical analysis for a specific example. Results show that the order completion time of the LSSC can be delayed or be ahead of schedule but cannot be infinitely advanced or infinitely delayed. Obtaining the optimal comprehensive performance can be effective if the expected order completion time is appropriately delayed. The increase in supply chain comprehensive performance caused by the increase in the relationship coefficient of logistics service integrator (LSI) is limited. The relative concern degree of LSI on cost and service delivery punctuality leads to not only changes in CODP but also to those in the scheduling performance of the LSSC. PMID:24715818

  11. Performance-Based Assessment

    ERIC Educational Resources Information Center

    ERIC Review, 1994

    1994-01-01

    "The ERIC Review" is published three times a year and announces research results, publications, and new programs relevant to each issue's theme topic. This issue explores performance-based assessment via two principal articles: "Performance Assessment" (Lawrence M. Rudner and Carol Boston); and "Alternative Assessment: Implications for Social…

  12. Statistical modelling for thoracic surgery using a nomogram based on logistic regression

    PubMed Central

    Liu, Run-Zhong; Zhao, Ze-Rui

    2016-01-01

    A well-developed clinical nomogram is a popular decision-tool, which can be used to predict the outcome of an individual, bringing benefits to both clinicians and patients. With just a few steps on a user-friendly interface, the approximate clinical outcome of patients can easily be estimated based on their clinical and laboratory characteristics. Therefore, nomograms have recently been developed to predict the different outcomes or even the survival rate at a specific time point for patients with different diseases. However, on the establishment and application of nomograms, there is still a lot of confusion that may mislead researchers. The objective of this paper is to provide a brief introduction on the history, definition, and application of nomograms and then to illustrate simple procedures to develop a nomogram with an example based on a multivariate logistic regression model in thoracic surgery. In addition, validation strategies and common pitfalls have been highlighted. PMID:27621910
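    The scaling step behind a nomogram can be sketched as follows: each predictor's points axis is proportional to its logistic regression coefficient, normalized so that the most influential predictor spans 0-100 points over its observed range. The coefficients and variable ranges here are hypothetical, not from the cited paper.

```python
def nomogram_points(coefs, ranges):
    """coefs: {name: logistic beta}; ranges: {name: (lo, hi)} plausible value range.
    Returns points awarded per one-unit increase of each predictor, scaled so the
    most influential predictor spans 0-100 points over its range."""
    effects = {k: abs(coefs[k]) * (ranges[k][1] - ranges[k][0]) for k in coefs}
    max_effect = max(effects.values())
    return {k: 100.0 * abs(coefs[k]) / max_effect for k in coefs}

# hypothetical fitted model: a smoker contributes 40 points, and the full
# tumor-size range spans the 0-100 point axis
pts = nomogram_points(
    coefs={"age": 0.04, "tumor_size": 0.30, "smoker": 1.20},
    ranges={"age": (30, 90), "tumor_size": (0, 10), "smoker": (0, 1)},
)
```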

  13. Statistical modelling for thoracic surgery using a nomogram based on logistic regression.

    PubMed

    Liu, Run-Zhong; Zhao, Ze-Rui; Ng, Calvin S H

    2016-08-01

    A well-developed clinical nomogram is a popular decision-tool, which can be used to predict the outcome of an individual, bringing benefits to both clinicians and patients. With just a few steps on a user-friendly interface, the approximate clinical outcome of patients can easily be estimated based on their clinical and laboratory characteristics. Therefore, nomograms have recently been developed to predict the different outcomes or even the survival rate at a specific time point for patients with different diseases. However, on the establishment and application of nomograms, there is still a lot of confusion that may mislead researchers. The objective of this paper is to provide a brief introduction on the history, definition, and application of nomograms and then to illustrate simple procedures to develop a nomogram with an example based on a multivariate logistic regression model in thoracic surgery. In addition, validation strategies and common pitfalls have been highlighted. PMID:27621910

  16. Optimization-based decision support to assist in logistics planning for hospital evacuations.

    PubMed

    Glick, Roger; Bish, Douglas R; Agca, Esra

    2013-01-01

    The evacuation of a hospital is a very complex process, and evacuation planning is an important part of a hospital's emergency management plan. Numerous factors affect the evacuation plan, including the nature of the threat, the availability of resources and staff, the characteristics of the evacuee population, and the risk to patients and staff. The safety and health of patients is of fundamental importance, but safely moving patients to alternative care facilities while under threat is a very challenging task. This article describes the logistical issues and complexities involved in the planning and execution of hospital evacuations. Furthermore, it provides examples of how optimization-based decision support tools can help evacuation planners better prepare for complex evacuations by providing real-world solutions to various evacuation scenarios.

  17. Geohydrology and water quality of Marine Corps Logistics Base, Nebo and Yermo annexes, near Barstow, California

    USGS Publications Warehouse

    Densmore, Jill N.; Cox, Brett F.; Crawford, Steven M.

    1997-01-01

    Because ground water is the only dependable source of water in the Barstow area, a thorough understanding of the relationship between the geology and hydrology of this area is needed to make informed ground-water management and remediation decisions. This report summarizes geologic and hydrologic studies done during 1992-95 at the Marine Corps Logistics Base, Nebo and Yermo Annexes, near Barstow, California. The geologic investigation dealt with the stratigraphy and geologic history of the area and determined the location of faults that cross the Marine Corps Logistics Base, Nebo Annex. Two of these faults coincide with significant ground-water barriers. Geologic and hydrologic data collected for this study were used to define two main aquifer systems in this area. The Mojave River aquifer is contained within the sand and gravel of the Mojave River alluvium, and the regional aquifer lies in the bordering alluvial-fan deposits and older alluvium. Water-level data showed that recharge occurs extensively in the Mojave River aquifer but occurs only in small areas of the regional aquifer. Dissolved-solids concentrations showed that ground-water degradation exists in the Mojave River aquifer near the Nebo Annex and extends at least 1 mile downgradient of the Nebo golf course in the younger Mojave River alluvium. Nitrogen concentrations show that more than one source is causing the observed degradation in the Mojave River aquifer. Oxygen-18, deuterium, tritium, and carbon-14 data indicate that the Mojave River and regional aquifers have different sources of recharge and that recent recharge occurs in the Mojave River aquifer but is more limited in the regional aquifer.

  18. Screening for ketosis using multiple logistic regression based on milk yield and composition.

    PubMed

    Kayano, Mitsunori; Kataoka, Tomoko

    2015-11-01

    Multiple logistic regression was applied to milk yield and composition data for 632 records of healthy cows and 61 records of ketotic cows in Hokkaido, Japan. The purpose was to diagnose ketosis from milk yield and composition simultaneously. Because nutritional status, milk yield and composition are affected by parity, the cows were divided into two groups: (1) multiparous, including 314 healthy cows and 45 ketotic cows, and (2) primiparous, including 318 healthy cows and 16 ketotic cows. Multiple logistic regression was applied to these groups separately. For multiparous cows, milk yield (kg/day/cow) and the protein-to-fat (P/F) ratio in milk were significant factors (P<0.05) for the diagnosis of ketosis. For primiparous cows, lactose content (%), solids-not-fat (SNF) content (%) and milk urea nitrogen (MUN) content (mg/dl) were significantly associated with ketosis (P<0.01). A diagnostic rule was constructed for each group of cows: (1) 9.978 × P/F ratio + 0.085 × milk yield <10 and (2) 2.327 × SNF - 2.703 × lactose + 0.225 × MUN <10. The sensitivity, specificity and area under the curve (AUC) of the diagnostic rules were (1) 0.800, 0.729 and 0.811 and (2) 0.813, 0.730 and 0.787, respectively. The P/F ratio, a widely used measure of ketosis, provided sensitivity, specificity and AUC values of (1) 0.711, 0.726 and 0.781 and (2) 0.678, 0.767 and 0.738, respectively.
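    The two diagnostic rules quoted above can be applied directly. The score functions below use the published coefficients and the <10 threshold from the abstract; the input values in the example are illustrative.

```python
KETOSIS_THRESHOLD = 10.0

def multiparous_score(pf_ratio, milk_yield_kg):
    # rule (1) from the abstract: 9.978 x P/F ratio + 0.085 x milk yield
    return 9.978 * pf_ratio + 0.085 * milk_yield_kg

def primiparous_score(snf_pct, lactose_pct, mun_mg_dl):
    # rule (2) from the abstract: 2.327 x SNF - 2.703 x lactose + 0.225 x MUN
    return 2.327 * snf_pct - 2.703 * lactose_pct + 0.225 * mun_mg_dl

def flag_ketosis(score):
    # scores below the threshold flag the cow as at risk of ketosis
    return score < KETOSIS_THRESHOLD

# illustrative multiparous cow: low P/F ratio and modest yield -> flagged
print(flag_ketosis(multiparous_score(pf_ratio=0.65, milk_yield_kg=22.0)))  # prints True
```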

  19. Using Contests to Provide Business Students Project-Based Learning in Humanitarian Logistics: PSAid Example

    ERIC Educational Resources Information Center

    Özpolat, Koray; Chen, Yuwen; Hales, Doug; Yu, Degan; Yalcin, Mehmet G.

    2014-01-01

    Business students appreciate working on classroom projects that are both enjoyable and useful in preparing them for future careers. Promoting competition among project teams is also used as a method to motivate students. The Humanitarian Logistics Project (HLP) teaches undergraduate students the logistical implications of unsolicited material…

  20. Performance-based ratemaking

    SciTech Connect

    Cross, P.S.

    1995-07-15

    Performance-based ratemaking (PBR) departs from the cost-of-service standard in setting just and reasonable utility rates, but that departure isn't as easy as it looks. Up until now, cost-of-service ratemaking has provided relatively stable rates, while enabling utilities to attract enormous amounts of capital. Of late, however, regulators appear to be heeding the argument that changing markets warrant a second look. Throughout the country and across the utility industry, some regulators appear willing to abandon cost of service as a proxy for competition, instead favoring performance-based methods that would rely on competitive forces. These performance-based schemes vary in their details but generally afford utilities the opportunity to increase profits by exceeding targets for efficiency and cost savings. Moreover, these plans purport to streamline the regulatory process. Annual, accounting-type reviews replace rate hearings. Cost-of-service studies might not be required at all once initial rates are fixed. Nevertheless, these PBR plans rely on cost-based rates as a starting point and still contain safeguards to protect ratepayers. PBR falls short of true deregulation. As the Massachusetts Department of Public Utilities noted recently in an order approving a PBR variant known as price-cap regulation for New England Telephone and Telegraph Co., "price-cap regulation is not deregulation; it is merely another way for regulators to control the rates charged by a firm."

  1. 75 FR 43944 - Membership of the Defense Logistics Agency (DLA) Senior Executive Service (SES) Performance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ...) Performance Review Board (PRB) AGENCY: DLA. ACTION: Notice of membership--2010 DLA PRB. SUMMARY: This notice announces the appointment of members to the DLA SES Performance Review Board (PRB). The publication of PRB composition is required by 5 U.S.C. 4314(c)(4). The PRB provides fair and impartial review of SES...

  2. Kernel-based logistic regression model for protein sequence without vectorialization.

    PubMed

    Fong, Youyi; Datta, Saheli; Georgiev, Ivelin S; Kwong, Peter D; Tomaras, Georgia D

    2015-07-01

    Protein sequence data arise more and more often in vaccine and infectious disease research. These types of data are discrete, high-dimensional, and complex. We propose to study the impact of protein sequences on binary outcomes using a kernel-based logistic regression model, which models the effect of protein through a random effect whose variance-covariance matrix is mostly determined by a kernel function. We propose a novel, biologically motivated, profile hidden Markov model (HMM)-based mutual information (MI) kernel. Hypothesis testing can be carried out using the maximum of the score statistics and a parametric bootstrap procedure. To improve the power of testing, we propose intuitive modifications to the test statistic. We show through simulation studies that the profile HMM-based MI kernel can be substantially more powerful than competing kernels, and that the modified test statistics bring incremental gains in power. We use these proposed methods to investigate two problems from HIV-1 vaccine research: (1) identifying segments of HIV-1 envelope (Env) protein that confer resistance to neutralizing antibody and (2) identifying segments of Env that are associated with attenuation of protective vaccine effect by antibodies of isotype A in the RV144 vaccine trial.
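    A minimal sketch of the kernel logistic regression idea described above, with a simple positional-match kernel over toy sequences standing in for the paper's profile HMM-based MI kernel, and plain gradient descent on the penalized logistic likelihood:

```python
import math

def match_kernel(s, t):
    """Fraction of positions at which two equal-length sequences agree
    (a positive semidefinite stand-in for the profile HMM-based MI kernel)."""
    return sum(a == b for a, b in zip(s, t)) / len(s)

def fit_klr(seqs, y, lam=0.1, lr=0.5, iters=500):
    """Fit kernel logistic regression f = K @ alpha by gradient descent on the
    kernel-penalized logistic loss."""
    n = len(seqs)
    K = [[match_kernel(seqs[i], seqs[j]) for j in range(n)] for i in range(n)]
    alpha = [0.0] * n
    for _ in range(iters):
        f = [sum(K[i][j] * alpha[j] for j in range(n)) for i in range(n)]
        p = [1.0 / (1.0 + math.exp(-fi)) for fi in f]
        # gradient of the penalized loss wrt alpha is K @ (p - y + lam * alpha)
        g = [p[i] - y[i] + lam * alpha[i] for i in range(n)]
        grad = [sum(K[i][j] * g[j] for j in range(n)) for i in range(n)]
        alpha = [a - lr * d for a, d in zip(alpha, grad)]
    return K, alpha

# toy "sequences": two similar positives, two similar negatives
seqs = ["ACDE", "ACDF", "GHKL", "GHKM"]
y = [1, 1, 0, 0]
K, alpha = fit_klr(seqs, y)
f0 = sum(K[0][j] * alpha[j] for j in range(len(seqs)))
p0 = 1.0 / (1.0 + math.exp(-f0))   # fitted probability for the first positive
```

The random-effect formulation in the paper is more elaborate (score tests, parametric bootstrap); this sketch only shows how a kernel between raw sequences replaces explicit vectorization.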

  3. Architecture for improving terrestrial logistics based on the Web of Things.

    PubMed

    Castro, Miguel; Jara, Antonio J; Skarmeta, Antonio

    2012-01-01

    Technological advances for improving supply chain efficiency present three key challenges for managing goods: tracking, tracing and monitoring (TTM). These are needed to satisfy the requirements for products such as perishable goods, which European legislation requires to be shipped within a prescribed temperature range to ensure freshness and suitability for consumption. The proposed system integrates RFID for tracking and tracing through a distributed architecture developed for heavy goods vehicles, and the sensors embedded in the SunSPOT platform for monitoring the transported goods, based on the concept of the Internet of Things. This paper presents how the Internet of Things is integrated to improve terrestrial logistics, offering a comprehensive and flexible architecture with high scalability, tailored to the specific needs of an item-level continuous monitoring solution. The major contribution of this work is the optimization of Embedded Web Services based on RESTful principles (the Web of Things) for access to TTM services at any time during the transportation of goods. Specifically, the monitoring patterns, such as observe and blockwise transfer, have been extended to meet the requirements of continuous conditional monitoring and of transferring full and partial inventories based on conditional queries. In summary, this work presents an evolution of previous TTM solutions, which were limited to trailer identification and environment monitoring, to a solution able to provide the exhaustive item-level monitoring required for several use cases. This exhaustive monitoring has required new communication capabilities through the Web of Things, optimized through the use and improvement of a set of communication patterns. PMID:22778657

  6. Biomass Logistics

    SciTech Connect

    J. Richard Hess; Kevin L. Kenney; William A. Smith; Ian Bonner; David J. Muth

    2015-04-01

    Equipment manufacturers have made rapid improvements in biomass harvesting and handling equipment. These improvements have increased transportation and handling efficiencies due to higher biomass densities and reduced losses. Improvements in grinder efficiencies and capacity have reduced biomass grinding costs. Biomass collection efficiencies (the ratio of biomass collected to the amount available in the field) as high as 75% for crop residues and greater than 90% for perennial energy crops have also been demonstrated. However, as collection rates increase, the fraction of entrained soil in the biomass increases, and high biomass residue removal rates can violate agronomic sustainability limits. Advancements in quantifying multi-factor sustainability limits, guided by sustainable residue removal plans, and in mitigating soil contamination through targeted removal rates based on soil type and residue type/fraction are allowing the use of new high-efficiency harvesting equipment and methods. As another consideration, single-pass harvesting and other technologies that improve harvesting costs create biomass storage moisture management challenges, which are further compounded by annual variability in biomass moisture content. Monitoring, sampling, simulation, and analysis provide the basis for moisture, time, and quality relationships in storage, which has allowed the development of moisture-tolerant storage systems and best management processes that combine moisture content and time to accommodate baled storage of wet material based upon “shelf-life.” The key to improving biomass supply logistics costs has been developing the associated agronomic sustainability and biomass quality technologies and processes that allow the implementation of equipment engineering solutions.

  7. 3D DWT-DCT and Logistic MAP Based Robust Watermarking for Medical Volume Data.

    PubMed

    Li, Jingbing; Liu, Yaoli; Zhong, Jiling

    2014-01-01

    Applying digital watermarking techniques to the security protection of medical information systems has become a research hotspot in recent years. In this paper, we present a robust watermarking algorithm for medical volume data using 3D DWT-DCT and the Logistic Map. After applying the Logistic Map to enhance the security of the watermark, the visual feature vector of the medical volume data is obtained using 3D DWT-DCT. Combining the feature vector, the third-party concept and a hash function, a zero-watermarking scheme is achieved. The proposed algorithm mitigates the conflict between robustness and invisibility. Experimental results show that the proposed algorithm is robust to common and geometrical attacks.
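    One common building block of such schemes, scrambling the watermark with a Logistic Map chaotic sequence before XOR-ing it against the extracted feature vector, can be sketched as follows. Parameters are illustrative, and the paper's 3D DWT-DCT feature extraction is not reproduced here.

```python
def logistic_bits(x0, n, mu=4.0):
    """Iterate the Logistic Map x <- mu*x*(1-x) and threshold at 0.5
    to produce n chaotic key bits from the secret seed x0."""
    bits, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

def xor_bits(a, b):
    return [x ^ y for x, y in zip(a, b)]

watermark = [1, 0, 1, 1, 0, 0, 1, 0]            # illustrative watermark bits
key_bits = logistic_bits(x0=0.3456, n=len(watermark))
scrambled = xor_bits(watermark, key_bits)        # share registered with a third party
recovered = xor_bits(scrambled, key_bits)        # XOR is its own inverse
assert recovered == watermark
```

Because only the scrambled share is registered and the host data is never modified, this is a "zero-watermarking" construction: robustness then depends entirely on how stable the extracted feature vector is under attacks.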

  8. Graded Response Model Based on the Logistic Positive Exponent Family of Models for Dichotomous Responses

    ERIC Educational Resources Information Center

    Samejima, Fumiko

    2008-01-01

    Samejima ("Psychometrika "65:319--335, 2000) proposed the logistic positive exponent family of models (LPEF) for dichotomous responses in the unidimensional latent space. The objective of the present paper is to propose and discuss a graded response model that is expanded from the LPEF, in the context of item response theory (IRT). This specific…

  9. Classification of Urban Aerial Data Based on Pixel Labelling with Deep Convolutional Neural Networks and Logistic Regression

    NASA Astrophysics Data System (ADS)

    Yao, W.; Poleswki, P.; Krzystek, P.

    2016-06-01

    The recent success of deep convolutional neural networks (CNN) in a large number of applications can be attributed to large amounts of available training data and increasing computing power. In this paper, a semantic pixel labelling scheme for urban areas using a multi-resolution CNN and hand-crafted spatial-spectral features of airborne remotely sensed data is presented. Both CNN and hand-crafted features are applied to image/DSM patches to produce per-pixel class probabilities with an L1-norm regularized logistic regression classifier. Evidence theory infers a degree of belief for pixel labelling from the different sources, smoothing regions by handling the conflicts present in both classifiers while reducing uncertainty. The aerial data used in this study were provided by ISPRS as benchmark datasets for 2D semantic labelling tasks in urban areas and consist of two data sources: LiDAR and a color infrared camera. The test sites are parts of a city in Germany assumed to consist of typical object classes, including impervious surfaces, trees, buildings, low vegetation, vehicles and clutter. The evaluation is based on the computation of pixel-based confusion matrices by random sampling. The performance of the strategy with respect to scene characteristics and method combination strategies is analyzed and discussed. The competitive classification accuracy can be explained not only by the nature of the input data sources (e.g., the above-ground height of the nDSM highlights the vertical dimension of houses, trees and even cars, and the near-infrared spectrum indicates vegetation) but also by the decision-level fusion of the CNN's texture-based approach with multichannel spatial-spectral hand-crafted features based on evidence combination theory.
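    The decision-level fusion step can be sketched with Dempster's rule of combination restricted to singleton class hypotheses, here combining hypothetical per-pixel posteriors from the two classifiers:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions defined on singleton classes only
    (Dempster's rule: multiply agreeing masses, renormalize by 1 - conflict)."""
    joint = {c: m1[c] * m2[c] for c in m1}
    conflict = 1.0 - sum(joint.values())
    if conflict >= 1.0:
        raise ValueError("totally conflicting sources")
    return {c: v / (1.0 - conflict) for c, v in joint.items()}

# hypothetical per-pixel posteriors from the two classifiers
cnn  = {"building": 0.6, "tree": 0.3, "car": 0.1}   # CNN branch
hand = {"building": 0.5, "tree": 0.4, "car": 0.1}   # hand-crafted-feature branch
fused = dempster_combine(cnn, hand)                 # agreement on "building" is reinforced
```

The paper's formulation also handles compound hypotheses and explicit uncertainty masses; this singleton-only version just shows how agreement between sources sharpens the fused belief.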

  10. Logistics of Guinea worm disease eradication in South Sudan.

    PubMed

    Jones, Alexander H; Becknell, Steven; Withers, P Craig; Ruiz-Tiben, Ernesto; Hopkins, Donald R; Stobbelaar, David; Makoy, Samuel Yibi

    2014-03-01

    From 2006 to 2012, the South Sudan Guinea Worm Eradication Program reduced new Guinea worm disease (dracunculiasis) cases by over 90%, despite substantial programmatic challenges. Program logistics have played a key role in program achievements to date. The program uses disease surveillance and program performance data and integrated technical-logistical staffing to maintain flexible and effective logistical support for active community-based surveillance and intervention delivery in thousands of remote communities. Lessons learned from logistical design and management can resonate across similar complex surveillance and public health intervention delivery programs, such as mass drug administration for the control of neglected tropical diseases and other disease eradication programs. Logistical challenges in various public health scenarios and the pivotal contribution of logistics to Guinea worm case reductions in South Sudan underscore the need for additional inquiry into the role of logistics in public health programming in low-income countries.

  11. Short-term cascaded hydroelectric system scheduling based on chaotic particle swarm optimization using improved logistic map

    NASA Astrophysics Data System (ADS)

    He, Yaoyao; Yang, Shanlin; Xu, Qifa

    2013-07-01

    In order to solve the short-term cascaded hydroelectric system scheduling model, a novel chaotic particle swarm optimization (CPSO) algorithm using an improved logistic map is introduced, which uses the water discharges as the decision variables combined with a death penalty function. Following the principle of maximum power generation, the proposed approach exploits the ergodicity, symmetry and stochastic properties of the improved logistic chaotic map to enhance the performance of the particle swarm optimization (PSO) algorithm. The new hybrid method has been examined and tested on two test functions and a practical cascaded hydroelectric system. The experimental results demonstrate the effectiveness and robustness of the proposed CPSO algorithm in comparison with other traditional algorithms.
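    The chaotic-initialization idea can be sketched with the classic logistic map x ← μx(1−x) as a stand-in for the paper's improved map: chaotic values in (0, 1) are rescaled into the discharge bounds, spreading initial particles over the search space via the map's ergodicity. The bounds shown are illustrative.

```python
def chaotic_positions(n_particles, lo, hi, x0=0.7, mu=4.0):
    """Generate initial water-discharge decision variables in [lo, hi] by
    iterating the logistic map and rescaling each chaotic value."""
    xs, x = [], x0
    for _ in range(n_particles):
        x = mu * x * (1.0 - x)      # logistic map keeps x in [0, 1] for mu = 4
        xs.append(lo + (hi - lo) * x)
    return xs

# illustrative discharge bounds (m^3/s) for one reservoir
positions = chaotic_positions(n_particles=5, lo=100.0, hi=500.0)
```

In a full CPSO, each particle dimension would be seeded this way, and the same map can later re-inject chaos when the swarm stagnates.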

  12. Assessing a cross-border logistics policy using a performance measurement system framework: the case of Hong Kong and the Pearl River Delta region

    NASA Astrophysics Data System (ADS)

    Wong, David W. C.; Choy, K. L.; Chow, Harry K. H.; Lin, Canhong

    2014-06-01

    In the most rapidly growing economic entity in the world, China, a new logistics operation called the indirect cross-border supply chain model has recently emerged. The primary idea of this model is to reduce logistics costs by storing goods at bonded warehouses with low storage costs in certain Chinese regions, such as the Pearl River Delta (PRD). This research proposes a performance measurement system (PMS) framework to assess the direct and indirect cross-border supply chain models. The PMS covers four categories, namely cost, time, quality and flexibility, in assessing the performance of the direct and indirect models. Furthermore, a survey was conducted to investigate the logistics performance of third-party logistics providers (3PLs) in the PRD region, including Guangzhou, Shenzhen and Hong Kong. The proposed PMS framework allows 3PLs to accurately pinpoint the weaknesses and strengths of their current operations policies across the four major performance measurement categories, helping them further enhance their competitiveness and operational efficiency through better resource allocation in warehousing and transportation.

  13. Performance-Based Funding Brief

    ERIC Educational Resources Information Center

    Washington Higher Education Coordinating Board, 2011

    2011-01-01

    A number of states have made progress in implementing performance-based funding (PFB) and accountability. This policy brief summarizes main features of performance-based funding systems in three states: Tennessee, Ohio, and Indiana. The brief also identifies key issues that states considering performance-based funding must address, as well as…

  15. Challenges and models in supporting logistics system design for dedicated-biomass-based bioenergy industry.

    PubMed

    Zhu, Xiaoyan; Li, Xueping; Yao, Qingzhu; Chen, Yuerong

    2011-01-01

    This paper analyzed the uniqueness of, and challenges in, designing the logistics system for the dedicated-biomass-based bioenergy industry, which differs from other industries due to the unique features of dedicated biomass (e.g., switchgrass), including its low bulk density, restrictions on harvesting season and frequency, content variation with time and circumambient conditions, weather effects, scattered distribution over a wide geographical area, and so on. To design it, this paper proposed a mixed integer linear programming model. The model covered the chain from planting and harvesting switchgrass to delivery to a biorefinery, included residue handling, and concentrated on integrating strategic decisions on supply chain design with tactical decisions on annual operation schedules. Numerical examples verified the model and demonstrated its use in practice. This paper showed that the operations of the logistics system differ significantly between harvesting and non-harvesting seasons, and that under a well-designed biomass logistics system, mass production with a steady and sufficient supply of biomass can increase the unit profit of bioenergy. The analytical model and practical methodology proposed in this paper will help realize commercial production in the biomass-to-bioenergy industry. PMID:20863690

  16. Land-Based Wind Turbine Transportation and Logistics Barriers and Their Effects on U.S. Wind Markets (Presentation)

    SciTech Connect

    Cotrell, J.; Stehly, T.; Johnson, J.; Roberts, J.O.; Parker, Z.; Scott, G.; Heimiller, D.

    2014-05-01

    The average size of land-based wind turbines installed in the United States has increased dramatically over time. As a result, wind turbines are facing new transportation and logistics barriers that limit the size of utility-scale land-based wind turbines that can be deployed in the United States. Addressing these transportation and logistics barriers will allow for further increases in U.S. turbine size using technologies under development for offshore markets. These barriers are important because larger, taller turbines have been identified as a path to reducing the levelized cost of energy for electricity. Additionally, increases in turbine size enable the development of new low- and moderate-wind-speed markets in the U.S. In turn, wind industry stakeholder support, market stability, and ultimately domestic content and manufacturing competitiveness are potentially affected. In general, there is very little recent literature that characterizes transportation and logistics barriers and their effects on U.S. wind markets and opportunities. Accordingly, the objective of this paper is to report the results of a recent NREL study that identifies the barriers, assesses their impact, and provides recommendations for strategies and specific actions.

  17. [Optimization for MSW logistics of new Xicheng and new Dongcheng districts in Beijing based on the maximum capacity of transfer stations].

    PubMed

    Yuan, Jing; Li, Guo-xue; Zhang, Hong-yu; Luo, Yi-ming

    2013-09-01

    It is necessary to optimize the MSW logistics of the new Xicheng (combining the former Xicheng and Xuanwu districts) and the new Dongcheng (combining the former Dongcheng and Chongwen districts) districts of Beijing. Based on an analysis of the current MSW logistics system, the transfer stations' processing capacities, and the conditions of the terminal treatment facilities of the four former districts and other districts, an MSW logistics system allowing transregional treatment was built using GIS methods. This article analyzes the MSW material balance of the current and new logistics systems. Results show that the optimization scheme could reduce the MSW collection distance of the new Xicheng and the new Dongcheng by 9.3 x 10(5) km x a(-1), a 10% reduction compared with the current logistics. The new logistics solution, which considers transregional treatment, can reduce the amount of untreated MSW sent to landfill by about 28.3%. If the construction of the three incineration plants is completed under the new logistics, the system's ratio of incineration : biochemical treatment : landfill can reach an optimal 3.8 : 4.5 : 1.7, compared with 1 : 4.8 : 4.2 under the current MSW logistics, approaching the 4 : 3 : 3 target for 2015. The research results are beneficial for increasing the MSW utilization and reduction rates of the new Dongcheng and Xicheng districts and nearby districts.

  18. Lunar Commercial Mining Logistics

    NASA Astrophysics Data System (ADS)

    Kistler, Walter P.; Citron, Bob; Taylor, Thomas C.

    2008-01-01

    Innovative commercial logistics is required to support lunar resource recovery operations and to assist larger consortiums in lunar mining, base operations, camp consumables, and the future commercial sale of propellant over the next 50 years. To help lower overall development costs, ``reuse'' innovation is suggested: reusing modified LTS in-space hardware on the moon's surface, and developing product lines for recovered gases, regolith construction materials, surface logistics services, and other services as they evolve (Kistler, Citron and Taylor, 2005). The surface logistics architecture is designed for sustainable growth over 50 years, financed by private sector partners and capable of cargo transportation in both directions in support of lunar development and resource recovery. The authors' perspective on the importance of logistics is based on five years of experience at remote sites on Earth, where remote base supply chain logistics did not always work (Taylor, 1975a). The planning and control of the flow of goods and materials to and from the moon's surface may be among the most complicated logistics challenges yet attempted. Affordability is tied to the innovation and ingenuity used to keep transportation and surface operations costs as low as practical. Eleven innovations are proposed and discussed by an entrepreneurial commercial space startup team that has had success in introducing commercial space innovation and reducing the cost of space operations in the past. This logistics architecture offers NASA and other exploring nations a commercial alternative for non-essential cargo. Five transportation technologies and eleven surface innovations create the logistics transportation system discussed.

  19. Green Logistics Management

    NASA Astrophysics Data System (ADS)

    Chang, Yoon S.; Oh, Chang H.

    Nowadays, environmental management has become a critical business consideration for companies seeking to survive many regulations and tough business requirements. Most world-leading companies are now aware that environmentally friendly technology and management are critical to the sustainable growth of the company. The environment market has seen continuous growth, reaching 532B in 2000 and 590B in 2004, and is expected to grow to 700B in 2010. It is not hard to see environment-friendly efforts in almost all aspects of business operations, and such trends can easily be found in the logistics area. Green logistics aims to make environmentally friendly decisions throughout a product lifecycle. Therefore, for the success of green logistics, it is critical to have real-time tracking capability for the product throughout its lifecycle and a smart solution service architecture. In this chapter, we introduce an RFID-based green logistics solution and service.

  20. Landslide susceptibility mapping using decision-tree based CHi-squared automatic interaction detection (CHAID) and Logistic regression (LR) integration

    NASA Astrophysics Data System (ADS)

    Althuwaynee, Omar F.; Pradhan, Biswajeet; Ahmad, Noordin

    2014-06-01

    This article uses a methodology based on chi-squared automatic interaction detection (CHAID), a multivariate method with an automatic classification capacity, to analyse large numbers of landslide conditioning factors. This new algorithm was developed to overcome the subjectivity of the manual categorization of scale data of landslide conditioning factors, and to produce a rainfall-induced landslide susceptibility map of Kuala Lumpur city and surrounding areas using a geographic information system (GIS). The main objective of this article is to use the CHAID method to perform the best classification fit for each conditioning factor and then to combine it with logistic regression (LR). The LR model was used to find the coefficients of the best-fitting function that assesses the optimal terminal nodes. A cluster pattern of landslide locations was extracted in a previous study using the nearest neighbour index (NNI), which was then used to identify the range of clustered landslide locations. The clustered locations were used as model training data together with 14 landslide conditioning factors, such as topographically derived parameters, lithology, NDVI, and land use and land cover maps. The Pearson chi-squared value was used to find the best classification fit between the dependent variable and the conditioning factors. Finally, the relationships between the conditioning factors were assessed and the landslide susceptibility map (LSM) was produced. The area under the curve (AUC) was used to test the model's reliability and prediction capability with the training and validation landslide locations, respectively. This study proved the efficiency and reliability of the decision tree (DT) model in landslide susceptibility mapping and provided a valuable scientific basis for spatial decision making in planning and urban management studies.
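
The Pearson chi-squared statistic used above to rate classification fit can be computed from a contingency table of landslide occurrence versus factor classes. A minimal sketch with illustrative counts:

```python
# Hedged sketch: Pearson chi-squared statistic for a two-way contingency
# table, the measure used to rate how well a categorized conditioning factor
# separates landslide from non-landslide cells. Counts are illustrative.
def chi_squared(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / total  # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# rows: landslide / no-landslide cells; columns: three slope classes
table = [[30, 50, 20],
         [70, 50, 80]]
print(round(chi_squared(table), 2))  # larger value = stronger association
```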

  1. DrugLogit: logistic discrimination between drugs and nondrugs including disease-specificity by assigning probabilities based on molecular properties.

    PubMed

    García-Sosa, Alfonso T; Oja, Mare; Hetényi, Csaba; Maran, Uko

    2012-08-27

    The increasing knowledge of both structure and activity of compounds provides a good basis for enhancing the pharmacological characterization of chemical libraries. In addition, pharmacology can be seen as incorporating both advances from molecular biology as well as chemical sciences, with innovative insight provided from studying target-ligand data from a ligand molecular point of view. Predictions and profiling of libraries of drug candidates have previously focused mainly on certain cases of oral bioavailability. Inclusion of other administration routes and disease-specificity would improve the precision of drug profiling. In this work, recent data are extended, and a probability-based approach is introduced for quantitative and gradual classification of compounds into categories of drugs/nondrugs, as well as for disease- or organ-specificity. Using experimental data of over 1067 compounds and multivariate logistic regressions, the classification shows good performance in training and independent test cases. The regressions have high statistical significance in terms of the robustness of coefficients and 95% confidence intervals provided by a 1000-fold bootstrapping resampling. Besides their good predictive power, the classification functions remain chemically interpretable, containing only one to five variables in total, and the physicochemical terms involved can be easily calculated. The present approach is useful for an improved description and filtering of compound libraries. It can also be applied sequentially or in combinations of filters, as well as adapted to particular use cases. The scores and equations may be able to suggest possible routes for compound or library modification. The data is made available for reuse by others, and the equations are freely accessible at http://hermes.chem.ut.ee/~alfx/druglogit.html.
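
The scoring step of such a logistic discrimination filter reduces to a sigmoid over a weighted sum of molecular descriptors. A sketch with hypothetical coefficients and descriptors (not the published DrugLogit equations):

```python
import math

# Hedged sketch of the scoring step in a logistic discrimination filter:
# a fitted function maps a few physicochemical descriptors to a probability
# of being drug-like. Coefficients and descriptors are hypothetical.
def drug_probability(descriptors, coefficients, intercept):
    z = intercept + sum(c * x for c, x in zip(coefficients, descriptors))
    return 1.0 / (1.0 + math.exp(-z))   # logistic (sigmoid) link

# e.g. [logP, molecular weight / 100, H-bond donors] -- illustrative only
compound = [2.1, 3.5, 2.0]
coeffs = [0.4, -0.2, 0.15]              # hypothetical fitted weights
p = drug_probability(compound, coeffs, intercept=-0.3)
print(round(p, 3))                      # probability in (0, 1)
```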

  2. Toward Probabilistic Diagnosis and Understanding of Depression Based on Functional MRI Data Analysis with Logistic Group LASSO.

    PubMed

    Shimizu, Yu; Yoshimoto, Junichiro; Toki, Shigeru; Takamura, Masahiro; Yoshimura, Shinpei; Okamoto, Yasumasa; Yamawaki, Shigeto; Doya, Kenji

    2015-01-01

    Diagnosis of psychiatric disorders based on brain imaging data is highly desirable in clinical applications. However, a common problem in applying machine learning algorithms is that the number of imaging data dimensions often greatly exceeds the number of available training samples. Furthermore, interpretability of the learned classifier with respect to brain function and anatomy is an important, but non-trivial issue. We propose the use of logistic regression with a least absolute shrinkage and selection operator (LASSO) to capture the most critical input features. In particular, we consider application of group LASSO to select brain areas relevant to diagnosis. An additional advantage of LASSO is its probabilistic output, which allows evaluation of diagnosis certainty. To verify our approach, we obtained semantic and phonological verbal fluency fMRI data from 31 depression patients and 31 control subjects, and compared the performances of group LASSO (gLASSO) and sparse group LASSO (sgLASSO) to those of standard LASSO (sLASSO), Support Vector Machine (SVM), and Random Forest. Over 90% classification accuracy was achieved with gLASSO and sgLASSO, as well as with SVM; however, in contrast to SVM, the LASSO approaches allow for identification of the most discriminative weights and estimation of prediction reliability. Semantic task data revealed contributions to the classification from the left precuneus, left precentral gyrus, left inferior frontal cortex (pars triangularis), and left cerebellum (crus 1). Weights for the phonological task indicated contributions from the left inferior frontal operculum, left postcentral gyrus, left insula, left middle frontal cortex, bilateral middle temporal cortices, bilateral precuneus, left inferior frontal cortex (pars triangularis), and left precentral gyrus. The distribution of normalized odds ratios further showed that predictions with absolute odds ratios higher than 0.2 could be regarded as certain. PMID:25932629
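
What makes group LASSO select whole brain areas is its group-level shrinkage: each group of weights is scaled down together and zeroed out entirely when its norm falls below the penalty. A minimal sketch of that proximal step, with illustrative weights:

```python
import math

# Hedged sketch: the group soft-thresholding step behind group LASSO.
# Each group of weights (e.g. all voxels of one brain area) is either shrunk
# toward zero together or zeroed out entirely, which is what makes whole
# brain areas drop out of the classifier. Values are illustrative.
def group_soft_threshold(groups, lam):
    out = []
    for g in groups:
        norm = math.sqrt(sum(w * w for w in g))
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        out.append([scale * w for w in g])
    return out

weights = [[0.9, 1.2], [0.05, -0.02], [0.0, 0.0]]
shrunk = group_soft_threshold(weights, lam=0.1)
# the small second group is eliminated entirely; the first is only shrunk
```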

  3. Susceptibility mapping of shallow landslides using kernel-based Gaussian process, support vector machines and logistic regression

    NASA Astrophysics Data System (ADS)

    Colkesen, Ismail; Sahin, Emrehan Kutlug; Kavzoglu, Taskin

    2016-06-01

    Identification of landslide-prone areas and production of accurate landslide susceptibility zonation maps are crucial topics for hazard management studies. Since prediction of susceptibility is one of the main processing steps in landslide susceptibility analysis, selection of a suitable prediction method plays an important role in the success of the susceptibility zonation process. Although simple statistical algorithms (e.g. logistic regression) have been widely used in the literature, the use of advanced non-parametric algorithms in landslide susceptibility zonation has recently become an active research topic. The main purpose of this study is to investigate the possible application of kernel-based Gaussian process regression (GPR) and support vector regression (SVR) for producing a landslide susceptibility map of the Tonya district of Trabzon, Turkey. Results of these two regression methods were compared with the logistic regression (LR) method, regarded here as a benchmark. Results showed that while the kernel-based GPR and SVR methods generally produced similar results (90.46% and 90.37%, respectively), they outperformed the conventional LR method by about 18%. While confirming the superiority of the GPR method, statistical tests based on ROC statistics, success rate and prediction rate curves revealed the significant improvement in susceptibility map accuracy achieved by applying the kernel-based GPR and SVR methods.
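
A kernel-based GPR prediction reduces to weighting kernel similarities to the training points. A deliberately tiny sketch with two training points and an RBF kernel (hyperparameters illustrative), solving the 2x2 linear system by Cramer's rule:

```python
import math

# Hedged sketch of Gaussian process regression with an RBF kernel, reduced
# to two training points so the linear algebra stays explicit. Length scale
# and noise level are illustrative, not fitted values.
def rbf(x1, x2, length_scale=1.0):
    return math.exp(-((x1 - x2) ** 2) / (2.0 * length_scale ** 2))

X = [0.0, 1.0]      # training inputs
y = [0.0, 1.0]      # training targets
noise = 1e-6        # small jitter on the diagonal for stability

K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(X)]
     for i, a in enumerate(X)]

# Solve K * alpha = y for the 2x2 case by Cramer's rule
det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
alpha = [(y[0] * K[1][1] - y[1] * K[0][1]) / det,
         (K[0][0] * y[1] - K[1][0] * y[0]) / det]

def predict(x_new):
    """GP posterior mean: kernel-weighted sum over training points."""
    return sum(a * rbf(x_new, xi) for a, xi in zip(alpha, X))
```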

  4. Research on public logistics centers of Zhengzhou city based on GIS

    NASA Astrophysics Data System (ADS)

    Zeng, Yuhuai; Chen, Shuisen; Tian, Zhihui; Miao, Quansheng

    2008-10-01

    The regional public logistics center (PLC) is the intermediary that moves goods or commodities from producers to wholesalers, retailers and end consumers through whole supply chains. According to Central Place Theory, PLCs should be multi-centric and graded into several levels. From the road network planning discipline, a unique index, the Importance Degree, is presented to measure the capacity of a PLC. The Importance Degree uses three township criteria, total population, gross industry product and budget income, as weights, calculating the weighted vectors by the principal component analysis method. Finally, through clustering analysis, we can obtain the graded degrees of PLCs. It proves that this research method is very effective for the road network planning of Zhengzhou City.
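
The Importance Degree described above combines normalized township criteria into one composite score. A sketch with hand-picked weights (the paper derives its weights via principal component analysis; all figures below are illustrative):

```python
# Hedged sketch of an "Importance Degree" style composite index: normalize
# each township criterion to [0, 1], then combine with weights. The weights
# here are fixed by hand; the paper derives them via principal component
# analysis. All town figures are illustrative.
def importance_degree(towns, weights):
    keys = list(weights)
    lo = {k: min(t[k] for t in towns.values()) for k in keys}
    hi = {k: max(t[k] for t in towns.values()) for k in keys}
    scores = {}
    for name, t in towns.items():
        norm = {k: (t[k] - lo[k]) / (hi[k] - lo[k]) for k in keys}
        scores[name] = sum(weights[k] * norm[k] for k in keys)
    return scores

towns = {"T1": {"population": 120_000, "gip": 8.0, "budget": 1.2},
         "T2": {"population": 60_000,  "gip": 3.0, "budget": 0.5},
         "T3": {"population": 90_000,  "gip": 6.5, "budget": 0.9}}
weights = {"population": 0.4, "gip": 0.35, "budget": 0.25}  # illustrative
scores = importance_degree(towns, weights)
# the highest-scoring towns are the strongest PLC candidates
```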

  5. Superfund record of decision (EPA Region 4): Marine Corps Logistics Base, operable unit 1, Albany, GA, October 11, 1994

    SciTech Connect

    1996-03-01

    This Decision Document presents the selected interim remedial action to prevent migration of contaminated groundwater for Potential Source of Contamination Three (PSC 3) of the Marine Corps Logistics Base. The selected remedy will include the following major components: groundwater extraction to control migration of the contaminant plume; on-site treatment of the extracted groundwater using an air stripper unit for the purpose of achieving pretreatment levels prior to discharge to the local Publicly Owned Treatment Works (POTW); on-site treatment of vapor-phase emissions from the air stripper unit; and discharge of the treated groundwater to the POTW.

  6. Remote sensing and GIS-based landslide hazard analysis and cross-validation using multivariate logistic regression model on three test areas in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet

    2010-05-01

    This paper presents the results of the cross-validation of a multivariate logistic regression model using remote sensing data and GIS for landslide hazard analysis in the Penang, Cameron, and Selangor areas of Malaysia. Landslide locations in the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. SPOT 5 and Landsat TM satellite imagery were used to map land cover and vegetation index, respectively. Maps of topography, soil type, lineaments and land cover were constructed from the spatial datasets. Ten factors which influence landslide occurrence, i.e., slope, aspect, curvature, distance from drainage, lithology, distance from lineaments, soil type, land cover, rainfall precipitation, and normalized difference vegetation index (NDVI), were extracted from the spatial database, and the logistic regression coefficient of each factor was computed. The landslide hazard was then analysed using the multivariate logistic regression coefficients derived not only from the data for the respective area but also from the coefficients calculated for each of the other two areas (nine hazard maps in all) as a cross-validation of the model. For verification, the results of the analyses were compared with the field-verified landslide locations. Among the three cases applying logistic regression coefficients within the same study area, Selangor based on the Selangor coefficients showed the highest accuracy (94%), whereas Penang based on the Penang coefficients showed the lowest accuracy (86%). Similarly, among the six cases of cross-application of coefficients to the other two areas, Selangor based on the Cameron coefficients showed the highest prediction accuracy (90%), whereas Penang based on the Selangor coefficients showed the lowest accuracy (79%). Qualitatively, the cross

  7. Hydrogeology of the upper Floridan Aquifer in the vicinity of the Marine Corps Logistics Base near Albany, Georgia

    USGS Publications Warehouse

    McSwain, Kristen Bukowski

    1999-01-01

    In 1995, the U.S. Navy requested that the U.S. Geological Survey conduct an investigation to describe the hydrogeology of the Upper Floridan aquifer in the vicinity of the Marine Corps Logistics Base, southeast of and adjacent to Albany, Georgia. The study area encompasses about 90 square miles in the Dougherty Plain District of the Coastal Plain physiographic province, in Dougherty and Worth Counties; the Marine Corps Logistics Base encompasses about 3,600 acres in the central part of the study area. The Upper Floridan aquifer is the shallowest, most widely used source of drinking water for domestic use in the Albany area. The hydrogeologic framework of this aquifer was delineated by description of the geologic and hydrogeologic units that compose the aquifer; evaluation of the lithologic and hydrologic heterogeneity of the aquifer; comparison of the geologic and hydrogeologic setting beneath the base with those of the surrounding area; and determination of ground-water-flow directions, and vertical hydraulic conductivities and gradients in the aquifer. The Upper Floridan aquifer is composed of the Suwannee Limestone and Ocala Limestone and is divided into an upper and lower water-bearing zone. The aquifer is confined below by the Lisbon Formation and is semi-confined above by a low-permeability clay layer in the undifferentiated overburden. The thickness of the aquifer ranges from about 165 feet in the northeastern part of the study area, to about 325 feet in the southeastern part of the study area. Based on slug tests conducted by a U.S. Navy contractor, the upper water-bearing zone has a low horizontal hydraulic conductivity (0.0224 to 2.07 feet per day) and a low vertical hydraulic conductivity (0.0000227 to 0.510 feet per day); the lower water-bearing zone has a horizontal hydraulic conductivity that ranges from 0.0134 to 2.95 feet per day.
Water-level hydrographs of continuously monitored wells on the Marine Corps Logistics Base show excellent correlation between

  8. Practical Session: Logistic Regression

    NASA Astrophysics Data System (ADS)

    Clausel, M.; Grégoire, G.

    2014-12-01

    An exercise is proposed to illustrate logistic regression. One investigates the different risk factors in the occurrence of coronary heart disease. It has been proposed in Chapter 5 of the book by D.G. Kleinbaum and M. Klein, "Logistic Regression", Statistics for Biology and Health, Springer Science+Business Media, LLC (2010) and also by D. Chessel and A.B. Dufour in Lyon 1 (see Sect. 6 of http://pbil.univ-lyon1.fr/R/pdf/tdr341.pdf). This example is based on data given in the file evans.txt coming from http://www.sph.emory.edu/dkleinb/logreg3.htm#data.
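
The exercise's model can be reproduced in miniature: fit a one-predictor logistic regression by gradient ascent on the log-likelihood. The toy data below are illustrative, not the evans.txt data:

```python
import math

# Hedged sketch: one-predictor logistic regression fitted by gradient ascent
# on the log-likelihood. The toy data are illustrative, not the evans.txt
# coronary-heart-disease data the exercise actually uses.
def fit_logistic(xs, ys, lr=0.1, steps=5000):
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient of log-likelihood w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. slope
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# risk factor x (e.g. a centred covariate) against a disease indicator y
xs = [-2, -1, 0, 1, 2, 3]
ys = [0, 0, 0, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
# b1 > 0: the risk factor increases the odds of disease
```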

  9. Performance Monitoring Based on UML Performance Profile

    NASA Astrophysics Data System (ADS)

    Kim, Dong Kwan; Kim, Chul Jin; Cho, Eun Sook

    In this paper we propose a way of measuring software performance metrics such as response time, throughput, and resource utilization. It is obvious that performance-related Quality of Service (QoS) is one of the important factors that must be satisfied to meet users' needs. The proposed approach uses the UML performance profile for the performance specification and the aspect-oriented paradigm for the performance measurement. Code instrumentation in AOP is a mechanism for inserting source code for performance measurement into business logic code. We used AspectJ, an aspect-oriented extension to Java. AspectJ code for performance measurement is kept separate from Java code for functional requirements, and both can be woven together for the performance measurement. The key component of the proposed approach is an AspectJ code generator, which creates AspectJ code for the performance measurement from the UML [1] models containing the performance profile.
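
The code-instrumentation idea is language-neutral: as a sketch of the same weaving concept outside AspectJ, a Python decorator can record a function's response time without touching its body. The function and metric names below are hypothetical:

```python
import functools
import time

# Hedged sketch: the paper weaves AspectJ advice around Java business logic;
# a Python decorator illustrates the same code-instrumentation idea by
# wrapping a function and recording its response time. Names are hypothetical.
def measure(metrics):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)   # business logic runs unmodified
            finally:
                elapsed = time.perf_counter() - start
                metrics.setdefault(fn.__name__, []).append(elapsed)
        return wrapper
    return decorator

metrics = {}

@measure(metrics)
def place_order(items):
    return {"status": "ok", "count": len(items)}  # stand-in business logic

place_order(["book", "pen"])
# metrics["place_order"] now holds one response-time sample
```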

  10. The comparison of landslide ratio-based and general logistic regression landslide susceptibility models in the Chishan watershed after 2009 Typhoon Morakot

    NASA Astrophysics Data System (ADS)

    WU, Chunhung

    2015-04-01

    The research built an original logistic regression landslide susceptibility model (abbreviated as or-LRLSM) and a landslide ratio-based logistic regression landslide susceptibility model (abbreviated as lr-LRLSM), compared the performance of the two models, and explained their error sources. The research assumes that the performance of the logistic regression model can be better if the distribution of the landslide ratio and the weighted value of each variable are similar. The landslide ratio is the ratio of landslide area to total area in a specific area and a useful index to evaluate the seriousness of landslide disasters in Taiwan. The research adopted the landslide inventory induced by 2009 Typhoon Morakot in the Chishan watershed, which was the most serious disaster event of the last decade in Taiwan. The research adopted the 20 m grid as the basic unit in building the LRLSM, and six variables, including elevation, slope, aspect, geological formation, accumulated rainfall, and bank erosion, were included in the two models. In building the or-LRLSM, the six variables were divided into continuous variables (elevation, slope, and accumulated rainfall) and categorical variables (aspect, geological formation and bank erosion), while in building the lr-LRLSM all variables, classified based on landslide ratio, were categorical. Because the number of basic units in the Chishan watershed was too large to process with commercial software, the research used random sampling instead of the whole set of basic units, adopting equal proportions of landslide and non-landslide units in the logistic regression analysis. The research performed random sampling 10 times and selected the group with the best Cox & Snell R2 and Nagelkerke R2 values as the database for the following analysis. 
Based on the best result from the 10 random sampling groups, the or-LRLSM (lr-LRLSM) is significant at the 1% level with Cox & Snell R2 = 0.190 (0.196) and Nagelkerke R2
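
The two fit statistics quoted above compare the null model's log-likelihood with the fitted model's; Nagelkerke R2 rescales Cox & Snell R2 so its maximum is 1. A sketch with hypothetical log-likelihood values:

```python
import math

# Hedged sketch of the two pseudo-R2 statistics quoted above. Both compare
# the null model's log-likelihood (ll_null) with the fitted model's
# (ll_model); Nagelkerke rescales Cox & Snell so its maximum is 1.
# The input values are hypothetical.
def cox_snell_r2(ll_null, ll_model, n):
    return 1.0 - math.exp(2.0 * (ll_null - ll_model) / n)

def nagelkerke_r2(ll_null, ll_model, n):
    max_r2 = 1.0 - math.exp(2.0 * ll_null / n)  # upper bound of Cox & Snell
    return cox_snell_r2(ll_null, ll_model, n) / max_r2

ll_null, ll_model, n = -138.6, -120.0, 200      # hypothetical values
cs = cox_snell_r2(ll_null, ll_model, n)
nk = nagelkerke_r2(ll_null, ll_model, n)
# nk >= cs always, and both lie in [0, 1) here
```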

  11. Comparing the Discrete and Continuous Logistic Models

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.

    2008-01-01

    The solutions of the discrete logistic growth model based on a difference equation and the continuous logistic growth model based on a differential equation are compared and contrasted. The investigation is conducted using a dynamic interactive spreadsheet. (Contains 5 figures.)
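.
The comparison can be reproduced directly: iterate the discrete difference equation and evaluate the closed-form continuous solution with the same parameters (the parameter values below are illustrative):

```python
import math

# Sketch of the comparison: the discrete logistic difference equation
# P_{n+1} = P_n + r*P_n*(1 - P_n/K) versus the closed-form solution of the
# continuous logistic ODE, P(t) = K / (1 + A*exp(-r*t)), with the same
# r, K, and P0. Parameter values are illustrative.
r, K, P0 = 0.4, 100.0, 10.0
A = (K - P0) / P0

def discrete(n):
    p = P0
    for _ in range(n):
        p += r * p * (1 - p / K)
    return p

def continuous(t):
    return K / (1.0 + A * math.exp(-r * t))

for t in range(0, 21, 5):
    print(t, round(discrete(t), 2), round(continuous(t), 2))
# both approach the carrying capacity K = 100, but for these parameters the
# discrete values lag behind the continuous curve
```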

  12. Strategic Concept of Competition Model in Knowledge-Based Logistics in Machinebuilding

    NASA Astrophysics Data System (ADS)

    Medvedeva, O. V.

    2015-09-01

    A competitive labor market needs serious change, and machinebuilding is one of the main problem domains. The current drive to promote human capital competition demands modernization. Therefore, it is necessary to develop a strategy for the social and economic promotion of competition under the conditions of a knowledge-based economy, in particular in machinebuilding. The necessity of such a strategy is demonstrated, as well as the basic difficulties it faces in machinebuilding.

  13. Logistics engineering education from the point of view of the environment

    NASA Astrophysics Data System (ADS)

    Bányai, Ágota

    2010-05-01

    A new field in the MSc programmes offered by the Faculty of Mechanical Engineering and Informatics of the University of Miskolc is the programme in logistics engineering. The Faculty has always laid great emphasis on giving processes connected with environment protection and globalisation issues the appropriate weight in its programmes, based on the fact that it has initiated and been involved in a great number of research and development projects with a substantial emphasis on the fundamental principles of sustainable development. The objective of the programme in logistics engineering is to train engineers who, in possession of the science, engineering, economics, informatics, and industrial and transportation technology knowledge related to the professional field of logistics, are able to analyse, design, organise, and control logistics processes and systems (freight transportation, materials handling, storage, commissioning, loading, purchasing, distribution and waste management), to design and develop machinery and equipment as elements of logistic systems, to be involved in their manufacture and quality control, and to control their operation. The programme prepares its students for performing the logistics management tasks of a company, for creative participation in solving research and development problems in logistics, and for pursuing logistics studies in doctoral programmes. Several laboratories are available for practice-oriented training. The 'Integrated Logistics Laboratory' consists of various fixed and mobile, real industrial (i.e., not model-level) equipment, whose integration in one system facilitates not only the presentation, examination and development of the individual self-standing facilities, but also the study of their interaction in terms of mechatronics, engineering, control engineering, informatics, identification technology and logistics. The state

  14. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…

  15. Classifying hospitals as mortality outliers: logistic versus hierarchical logistic models.

    PubMed

    Alexandrescu, Roxana; Bottle, Alex; Jarman, Brian; Aylin, Paul

    2014-05-01

    The use of hierarchical logistic regression for provider profiling has been recommended due to the clustering of patients within hospitals, but has some associated difficulties. We assess changes in hospital outlier status based on standard logistic versus hierarchical logistic modelling of mortality. The study population consisted of all patients admitted to acute, non-specialist hospitals in England between 2007 and 2011 with a primary diagnosis of acute myocardial infarction, acute cerebrovascular disease or fracture of the neck of femur, or a primary procedure of coronary artery bypass graft or repair of abdominal aortic aneurysm. We compared standardised mortality ratios (SMRs) from non-hierarchical models with SMRs from hierarchical models, without and with shrinkage estimates of the predicted probabilities (Model 1 and Model 2). The SMRs from standard logistic and hierarchical models were highly significantly correlated (r > 0.91, p = 0.01). More outliers were recorded under standard logistic regression than under hierarchical modelling only when shrinkage estimates were used (Model 2): out of a cumulative 565 pairs of hospitals under study, 21 hospitals changed from low-outlier and 8 hospitals from high-outlier status under logistic regression to non-outlier status under the shrinkage estimates. Both standard logistic and hierarchical modelling identified nearly the same hospitals as mortality outliers. The choice of methodological approach should, however, also consider whether the modelling aim is judgment or improvement, as shrinkage may be more appropriate for the former than the latter. PMID:24711175
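
A standardised mortality ratio divides observed deaths by the expected count implied by the risk model. A minimal sketch with illustrative predicted risks:

```python
# Hedged sketch: a standardised mortality ratio (SMR) is observed deaths
# divided by expected deaths, where the expectation sums each patient's
# modelled risk. The probabilities below are illustrative, not fitted values.
def smr(observed_deaths, predicted_risks):
    expected = sum(predicted_risks)
    return observed_deaths / expected

risks = [0.02, 0.10, 0.05, 0.30, 0.03]   # per-patient predicted mortality
value = smr(observed_deaths=1, predicted_risks=risks)
print(round(value, 2))   # > 1 means more deaths than the case mix predicts
```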

  16. Research on 6R Military Logistics Network

    NASA Astrophysics Data System (ADS)

    Jie, Wan; Wen, Wang

    The building of a military logistics network is an important issue for the construction of new forces. This paper proposes a concept model of a 6R military logistics network based on JIT. We then conceive an axis-spoke logistics center network, a flexible 6R organizational network, and a lean 6R military information network based on grid technology. Strategies and proposals for the construction of the three sub-networks of the 6R military logistics network are then given.

  17. To Set Up a Logistic Regression Prediction Model for Hepatotoxicity of Chinese Herbal Medicines Based on Traditional Chinese Medicine Theory

    PubMed Central

    Liu, Hongjie; Li, Tianhao; Zhan, Sha; Pan, Meilan; Ma, Zhiguo; Li, Chenghua

    2016-01-01

    Aims. To establish a logistic regression (LR) prediction model for hepatotoxicity of Chinese herbal medicines (HMs) based on traditional Chinese medicine (TCM) theory and to provide a statistical basis for predicting hepatotoxicity of HMs. Methods. The correlations of hepatotoxic and nonhepatotoxic Chinese HMs with four properties, five flavors, and channel tropism were analyzed with chi-square test for two-way unordered categorical data. LR prediction model was established and the accuracy of the prediction by this model was evaluated. Results. The hepatotoxic and nonhepatotoxic Chinese HMs were related with four properties (p < 0.05), and the coefficient was 0.178 (p < 0.05); also they were related with five flavors (p < 0.05), and the coefficient was 0.145 (p < 0.05); they were not related with channel tropism (p > 0.05). There were totally 12 variables from four properties and five flavors for the LR. Four variables, warm and neutral of the four properties and pungent and salty of five flavors, were selected to establish the LR prediction model, with the cutoff value being 0.204. Conclusions. Warm and neutral of the four properties and pungent and salty of five flavors were the variables to affect the hepatotoxicity. Based on such results, the established LR prediction model had some predictive power for hepatotoxicity of Chinese HMs.
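
The prediction step described above can be sketched as a logistic model over the four selected binary variables with the reported 0.204 cutoff; the coefficients below are hypothetical, since the abstract does not list them:

```python
import math

# Hedged sketch of the prediction step: four binary indicators (warm,
# neutral, pungent, salty) enter a fitted logistic model, and an herb is
# flagged as potentially hepatotoxic when the predicted probability exceeds
# the reported cutoff of 0.204. The coefficients and intercept here are
# hypothetical; the abstract does not list the fitted values.
CUTOFF = 0.204

def predict_hepatotoxic(warm, neutral, pungent, salty,
                        coefs=(0.8, -0.5, 0.6, 0.9), intercept=-1.6):
    z = intercept + sum(c * x
                        for c, x in zip(coefs, (warm, neutral, pungent, salty)))
    p = 1.0 / (1.0 + math.exp(-z))
    return p, p > CUTOFF

p, flagged = predict_hepatotoxic(warm=1, neutral=0, pungent=1, salty=0)
# p is the modelled probability; flagged applies the 0.204 cutoff
```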

  19. To Set Up a Logistic Regression Prediction Model for Hepatotoxicity of Chinese Herbal Medicines Based on Traditional Chinese Medicine Theory.

    PubMed

    Liu, Hongjie; Li, Tianhao; Chen, Lingxiu; Zhan, Sha; Pan, Meilan; Ma, Zhiguo; Li, Chenghua; Zhang, Zhe

    2016-01-01

    Aims. To establish a logistic regression (LR) prediction model for hepatotoxicity of Chinese herbal medicines (HMs) based on traditional Chinese medicine (TCM) theory and to provide a statistical basis for predicting hepatotoxicity of HMs. Methods. The correlations of hepatotoxic and nonhepatotoxic Chinese HMs with four properties, five flavors, and channel tropism were analyzed with chi-square test for two-way unordered categorical data. LR prediction model was established and the accuracy of the prediction by this model was evaluated. Results. The hepatotoxic and nonhepatotoxic Chinese HMs were related with four properties (p < 0.05), and the coefficient was 0.178 (p < 0.05); also they were related with five flavors (p < 0.05), and the coefficient was 0.145 (p < 0.05); they were not related with channel tropism (p > 0.05). There were totally 12 variables from four properties and five flavors for the LR. Four variables, warm and neutral of the four properties and pungent and salty of five flavors, were selected to establish the LR prediction model, with the cutoff value being 0.204. Conclusions. Warm and neutral of the four properties and pungent and salty of five flavors were the variables to affect the hepatotoxicity. Based on such results, the established LR prediction model had some predictive power for hepatotoxicity of Chinese HMs.

  1. Meeting Skills Needs in a Market-Based Training System: A Study of Employer Perceptions and Responses to Training Challenges in the Australian Transport and Logistics Industry

    ERIC Educational Resources Information Center

    Gekara, Victor O.; Snell, Darryn; Chhetri, Prem; Manzoni, Alex

    2014-01-01

    Many countries are adopting market-based training systems to address industry skills needs. This paper examines the marketisation of Australia's training system and the implications for training provision and outcomes in the Transport and Logistics industry. Drawing on qualitative interviews from industry employers and training providers, we…

  2. [Understanding logistic regression].

    PubMed

    El Sanharawi, M; Naudet, F

    2013-10-01

    Logistic regression is one of the most common multivariate analysis models used in epidemiology. It measures the association between the occurrence of an event (a qualitative dependent variable) and the factors that may influence it (explanatory variables). The choice of explanatory variables to include in a logistic regression model is based on prior knowledge of the disease pathophysiology and on the statistical association between each variable and the event, as measured by the odds ratio. The main steps of the procedure, its conditions of application, and the essential tools for its interpretation are discussed concisely. We also discuss the importance of the choice of variables that must be included and retained in the regression model in order to avoid the omission of important confounding factors. Finally, by way of illustration, we provide an example from the literature, which should help the reader test his or her knowledge.
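    As a concrete illustration of the odds ratio mentioned above, the sketch below simulates a single binary exposure and checks that the classic 2x2-table odds ratio coincides with exp(coefficient) from a univariate logistic fit. All numbers are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# One binary exposure, one binary event: the simplest odds-ratio setting.
n = 5000
exposure = rng.integers(0, 2, n)
# True model: logit P(event) = -2 + 1.0 * exposure, so the true odds ratio is e.
p = 1 / (1 + np.exp(-(-2.0 + 1.0 * exposure)))
event = rng.binomial(1, p)

# Odds ratio from the 2x2 contingency table: (a*d)/(b*c).
a = np.sum((exposure == 1) & (event == 1))
b = np.sum((exposure == 1) & (event == 0))
c = np.sum((exposure == 0) & (event == 1))
d = np.sum((exposure == 0) & (event == 0))
or_table = (a * d) / (b * c)

# The same quantity as exp(slope) from a univariate logistic regression,
# fitted here by gradient ascent on the mean log-likelihood.
x = exposure.astype(float)
slope, intercept = 0.0, 0.0
for _ in range(3000):
    pred = 1 / (1 + np.exp(-(intercept + slope * x)))
    slope += 0.1 * np.mean((event - pred) * x)
    intercept += 0.1 * np.mean(event - pred)
or_logistic = np.exp(slope)
```

    With a single binary covariate the logistic maximum-likelihood estimate reproduces the table odds ratio exactly, which is why exp(coefficient) is read as an odds ratio.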

  3. Developing a Referral Protocol for Community-Based Occupational Therapy Services in Taiwan: A Logistic Regression Analysis.

    PubMed

    Mao, Hui-Fen; Chang, Ling-Hui; Tsai, Athena Yi-Jung; Huang, Wen-Ni; Wang, Jye

    2016-01-01

    Because resources for long-term care services are limited, timely and appropriate referral for rehabilitation services is critical for optimizing clients' functions and successfully integrating them into the community. We investigated which client characteristics are most relevant in predicting Taiwan's community-based occupational therapy (OT) service referral based on experts' beliefs. Data were collected in face-to-face interviews using the Multidimensional Assessment Instrument (MDAI). Community-dwelling participants (n = 221) ≥ 18 years old who reported disabilities in the previous National Survey of Long-term Care Needs in Taiwan were enrolled. The standard for referral was the judgment and agreement of two experienced occupational therapists who reviewed the results of the MDAI. Logistic regressions and Generalized Additive Models were used for analysis. Two predictive models were proposed, one using basic activities of daily living (BADLs) and one using instrumental ADLs (IADLs). Dementia, psychiatric disorders, cognitive impairment, joint range-of-motion limitations, fear of falling, behavioral or emotional problems, expressive deficits (in the BADL-based model), and limitations in IADLs or BADLs were significantly correlated with the need for referral. Both models showed high area under the curve (AUC) values on receiver operating curve testing (AUC = 0.977 and 0.972, respectively). The probability of being referred for community OT services was calculated using the referral algorithm. The referral protocol facilitated communication between healthcare professionals to make appropriate decisions for OT referrals. The methods and findings should be useful for developing referral protocols for other long-term care services.
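    The AUC figures quoted in this abstract (0.977 and 0.972) come from receiver operating characteristic analysis. A minimal, self-contained way to compute an AUC from referral scores, using the rank (Mann-Whitney) formulation rather than trapezoidal integration, is sketched below with invented numbers.

```python
import numpy as np

def auc(y_true, scores):
    """AUC as the probability that a randomly chosen positive case
    receives a higher score than a randomly chosen negative case."""
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    greater = np.sum(pos[:, None] > neg[None, :])
    ties = np.sum(pos[:, None] == neg[None, :])
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical referral decisions (1 = experts referred) and model scores.
y = np.array([1, 1, 0, 1, 0, 0])
s = np.array([0.9, 0.8, 0.35, 0.6, 0.7, 0.2])
value = auc(y, s)   # 8 of the 9 positive/negative pairs correctly ordered
```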

  4. Virus evolutionary genetic algorithm for task collaboration of logistics distribution

    NASA Astrophysics Data System (ADS)

    Ning, Fanghua; Chen, Zichen; Xiong, Li

    2005-12-01

    In order to achieve JIT (Just-In-Time) performance and maximum client satisfaction in logistics collaboration, a Virus Evolutionary Genetic Algorithm (VEGA) was put forward under the double constraints of logistics resources and operation sequence. Based on a mathematical description of a multiple-objective function, the algorithm was designed to schedule logistics tasks with different due dates and allocate them to network members. By introducing a penalty item, makespan and customer satisfaction were expressed in the fitness function, and a dynamic adaptive probability of infection was used to improve the performance of local search. Compared with the standard Genetic Algorithm (GA), experimental results illustrate the performance superiority of VEGA. The VEGA can therefore provide a powerful decision-making technique for optimizing resource configuration in logistics networks.
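    The fitness design described here, makespan plus a penalty item for missed due dates, is the part that transfers most directly to code. The sketch below uses a plain genetic algorithm rather than the authors' virus-evolutionary variant, and every number in the instance is hypothetical.

```python
import random

random.seed(42)

# Toy instance: task durations and due dates (hypothetical numbers).
durations = [4, 2, 7, 3, 5, 1]
due = [5, 6, 20, 10, 16, 3]

def fitness(order):
    """Lower is better: makespan plus a weighted lateness penalty,
    mirroring the penalty item the abstract adds to its fitness."""
    t, late = 0, 0
    for task in order:
        t += durations[task]
        late += max(0, t - due[task])
    return t + 5 * late            # penalty weight 5 is an arbitrary choice

def crossover(p1, p2):
    """Order crossover: keep the first half of p1, fill the rest in p2's order."""
    head = p1[:len(p1) // 2]
    return head + [t for t in p2 if t not in head]

pop = [random.sample(range(6), 6) for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness)
    parents = pop[:10]                       # elitist selection
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(20)]
    for child in children:                   # swap mutation
        if random.random() < 0.3:
            i, j = random.sample(range(6), 2)
            child[i], child[j] = child[j], child[i]
    pop = parents + children

best = min(pop, key=fitness)
```

    VEGA's extra ingredients, virus-style infection operators and an adaptive infection probability, would replace the fixed mutation step here.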

  5. Automated Transportation Management System (ATMS) V2.0 logistics module PBI acceptance criteria

    SciTech Connect

    Weidert, R.S.

    1995-02-28

    This document defines the acceptance criteria for the Automated Transportation Management System V2.0 Logistics Module Performance Based Incentive (PBI). This acceptance criteria will be the primary basis for the generation of acceptance test procedures. The purpose of this document is to define the minimum criteria that must be fulfilled to guarantee acceptance of the Logistics Module.

  6. Logistics engineering education from the point of view of the environment

    NASA Astrophysics Data System (ADS)

    Bányai, Ágota

    2010-05-01

    A new field of MSc programme offered by the Faculty of Mechanical Engineering and Informatics of the University of Miskolc is represented by the programme in logistics engineering. The Faculty has always laid great emphasis on assigning processes connected with environment protection and globalisation issues the appropriate weight in its programmes. This is based on the fact that the Faculty has initiated and been involved in a great number of research and development projects with a substantial emphasis on the fundamental principles of sustainable development. The objective of the programme of logistics engineering is to train engineers who, in possession of the science, engineering, economic, informatics and industrial, transportation technological knowledge related to the professional field of logistics, are able to analyse, design, organise, and control logistics processes and systems (freight transportation, materials handling, storage, commissioning, loading, purchasing, distribution and waste management) as well as to design and develop machinery and equipment as the elements of logistic systems and also to be involved in their manufacture and quality control and are able to control their operation. The programme prepares its students for performing the logistics management tasks in a company, for creative participation in solving research and development problems in logistics and for pursuing logistics studies in doctoral programmes. There are several laboratories available for practice-oriented training. The 'Integrated Logistics Laboratory' consists of various fixed and mobile, real industrial, i.e. not model-level equipment, the integration of which in one system facilitates not only the presentation, examination and development of the individual self-standing facilities, but the study of their interaction as well in terms of mechatronics, engineering, control engineering, informatics, identification technology and logistics. The state

  7. An Optimization Model for Expired Drug Recycling Logistics Networks and Government Subsidy Policy Design Based on Tri-level Programming.

    PubMed

    Huang, Hui; Li, Yuyu; Huang, Bo; Pi, Xing

    2015-07-09

    In order to recycle and dispose of all people's expired drugs, the government should design a subsidy policy to stimulate users to return their expired drugs, and drug-stores should take the responsibility of recycling expired drugs, in other words, to be recycling stations. For this purpose it is necessary for the government to select the right recycling stations and treatment stations to optimize the expired drug recycling logistics network and minimize the total costs of recycling and disposal. This paper establishes a tri-level programming model to study how the government can optimize an expired drug recycling logistics network and the appropriate subsidy policies. Furthermore, a Hybrid Genetic Simulated Annealing Algorithm (HGSAA) is proposed to search for the optimal solution of the model. An experiment is discussed to illustrate the good quality of the recycling logistics network and government subsidies obtained by the HGSAA. The HGSAA is proven to have the ability to converge on the global optimal solution, and to act as an effective algorithm for solving the optimization problem of expired drug recycling logistics networks and government subsidies.
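    The HGSAA itself is not reproduced in the abstract, but its simulated-annealing half can be sketched for the station-selection subproblem: choose which candidate recycling stations to open so that opening plus transport costs are minimized. The instance below is entirely invented.

```python
import math
import random

random.seed(7)

# Hypothetical instance: a fixed opening cost per candidate station and
# a transport cost from each of 12 user zones to each station.
n_stations, n_zones = 6, 12
open_cost = [40, 55, 35, 60, 45, 50]
transport = [[random.randint(1, 20) for _ in range(n_stations)]
             for _ in range(n_zones)]

def total_cost(opened):
    """Opening costs plus each zone served by its cheapest open station."""
    if not any(opened):
        return float("inf")            # infeasible: no station open
    cost = sum(c for c, o in zip(open_cost, opened) if o)
    for zone in transport:
        cost += min(d for d, o in zip(zone, opened) if o)
    return cost

# Plain simulated annealing over the open/closed bit-vector.
state = [True] * n_stations
best = state[:]
temp = 50.0
while temp > 0.1:
    cand = state[:]
    flip = random.randrange(n_stations)
    cand[flip] = not cand[flip]        # neighbor: toggle one station
    delta = total_cost(cand) - total_cost(state)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        state = cand
        if total_cost(state) < total_cost(best):
            best = state[:]
    temp *= 0.99                       # geometric cooling schedule
```

    The hybrid in the paper seeds annealing moves into a genetic population; the acceptance rule above is the simulated-annealing component of that hybrid.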

  8. An Optimization Model for Expired Drug Recycling Logistics Networks and Government Subsidy Policy Design Based on Tri-level Programming

    PubMed Central

    Huang, Hui; Li, Yuyu; Huang, Bo; Pi, Xing

    2015-01-01

    In order to recycle and dispose of all people’s expired drugs, the government should design a subsidy policy to stimulate users to return their expired drugs, and drug-stores should take the responsibility of recycling expired drugs, in other words, to be recycling stations. For this purpose it is necessary for the government to select the right recycling stations and treatment stations to optimize the expired drug recycling logistics network and minimize the total costs of recycling and disposal. This paper establishes a tri-level programming model to study how the government can optimize an expired drug recycling logistics network and the appropriate subsidy policies. Furthermore, a Hybrid Genetic Simulated Annealing Algorithm (HGSAA) is proposed to search for the optimal solution of the model. An experiment is discussed to illustrate the good quality of the recycling logistics network and government subsidies obtained by the HGSAA. The HGSAA is proven to have the ability to converge on the global optimal solution, and to act as an effective algorithm for solving the optimization problem of expired drug recycling logistics networks and government subsidies. PMID:26184252

  9. Sparse logistic regression with Lp penalty for biomarker identification.

    PubMed

    Liu, Zhenqiu; Jiang, Feng; Tian, Guoliang; Wang, Suna; Sato, Fumiaki; Meltzer, Stephen J; Tan, Ming

    2007-01-01

    In this paper, we propose a novel method for sparse logistic regression with the non-convex regularization Lp (p < 1). Based on a smooth approximation, we develop several fast algorithms for learning the classifier that are applicable to high-dimensional datasets such as gene expression. To the best of our knowledge, these are the first algorithms to perform sparse logistic regression with an Lp and elastic net (Le) penalty. The regularization parameters are decided by maximizing the area under the ROC curve (AUC) on the test data. Experimental results on methylation and microarray data attest to the accuracy, sparsity, and efficiency of the proposed algorithms. Biomarkers identified with our methods are compared with those in the literature. Our computational results show that Lp logistic regression (p < 1) outperforms L1 logistic regression and the SCAD SVM. Software is available upon request from the first author. PMID:17402921
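    The authors' Lp (p < 1) algorithms are not given in the abstract, but the convex L1 baseline they compare against can be sketched with proximal gradient descent (ISTA): a gradient step on the logistic loss followed by soft-thresholding, which drives many coefficients exactly to zero. The data and hyperparameters below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "gene expression": 100 samples, 200 features, 5 informative.
X = rng.standard_normal((100, 200))
w_true = np.zeros(200)
w_true[:5] = 2.0
y = (X @ w_true + 0.5 * rng.standard_normal(100) > 0).astype(float)

# ISTA for L1-penalized logistic regression: a gradient step on the mean
# logistic loss, then the soft-thresholding proximal operator of the L1 norm.
lam, step = 0.1, 0.01
w = np.zeros(200)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w)))
    w -= step * (X.T @ (p - y)) / len(y)
    w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)

n_selected = int(np.count_nonzero(w))   # most noise features stay exactly zero
```

    An Lp penalty with p < 1 thresholds small coefficients even more aggressively than this L1 operator, which is the source of the sparser biomarker sets the paper reports.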

  10. NASA Space Rocket Logistics Challenges

    NASA Technical Reports Server (NTRS)

    Neeley, James R.; Jones, James V.; Watson, Michael D.; Bramon, Christopher J.; Inman, Sharon K.; Tuttle, Loraine

    2014-01-01

    The Space Launch System (SLS) is the new NASA heavy-lift launch vehicle and is scheduled for its first mission in 2017. The goal of the first mission, which will be uncrewed, is to demonstrate the integrated system performance of the SLS rocket and spacecraft before a crewed flight in 2021. SLS has many of the same logistics challenges as any other large-scale program. Common logistics concerns for SLS include the integration of discrete, geographically separated programs, multiple prime contractors with distinct and different goals, schedule pressures, and funding constraints. However, SLS also faces unique challenges. The new program is a confluence of new and heritage hardware, with heritage hardware constituting seventy-five percent of the program. This unique approach to design makes logistics concerns such as commonality especially problematic. Additionally, a very low manifest rate of one flight every four years makes logistics comparatively expensive. That, along with the SLS architecture being developed using a block-upgrade evolutionary approach, exacerbates long-range planning for supportability considerations. These common and unique logistics challenges must be clearly identified and tackled to allow SLS to have a successful program. This paper addresses the common and unique challenges facing the SLS program, along with the analysis and decisions the NASA logistics engineers are making to mitigate the threats posed by each.

  11. Front-End Analysis Cornerstone of Logistics

    NASA Technical Reports Server (NTRS)

    Nager, Paul J.

    2000-01-01

    The presentation provides an overview of Front-End Logistics Support Analysis (FELSA), when it should be performed, benefits of performing FELSA and why it should be performed, how it is conducted, and examples.

  12. Precise Positioning Method for Logistics Tracking Systems Using Personal Handy-Phone System Based on Mahalanobis Distance

    NASA Astrophysics Data System (ADS)

    Yokoi, Naoaki; Kawahara, Yasuhiro; Hosaka, Hiroshi; Sakata, Kenji

    Focusing on the Personal Handy-phone System (PHS) positioning service used in physical distribution logistics, a positioning error offset method for improving positioning accuracy is proposed. A disadvantage of PHS positioning is that measurement errors caused by the fluctuation of radio waves due to buildings around the terminal are large, ranging from several tens to several hundreds of meters. In this study, an error offset method is developed which learns, in advance, patterns of positioning results (latitude and longitude) containing errors together with the highest signal strength at major logistics points, and matches them with new data measured in actual distribution processes according to the Mahalanobis distance. The matching resolution is thereby improved to 1/40 that of the conventional error offset method.
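    The matching step in this abstract can be illustrated with a small sketch: learn the mean and covariance of (latitude, longitude, strongest-signal) fingerprints collected at known logistics points, then assign a new reading to the point with the smallest Mahalanobis distance. The coordinates, signal levels, and point names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical fingerprints: repeated noisy (latitude, longitude, signal)
# readings collected in advance at two known logistics points.
depot = rng.multivariate_normal([35.60, 139.70, -60.0],
                                np.diag([0.01, 0.01, 9.0]), size=50)
port = rng.multivariate_normal([35.45, 139.65, -75.0],
                               np.diag([0.02, 0.02, 16.0]), size=50)

def mahalanobis(x, samples):
    """Distance from x to the sample cloud's mean, scaled by its covariance,
    so directions with large measurement error count for less."""
    mu = samples.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(samples, rowvar=False))
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# A new noisy reading taken near the depot is matched to the closer pattern.
reading = np.array([35.61, 139.71, -58.0])
match = ("depot" if mahalanobis(reading, depot) < mahalanobis(reading, port)
         else "port")
```

    Scaling by the learned covariance is what lets the method absorb the building-induced error patterns instead of treating all deviations equally.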

  13. A development of logistics management models for the Space Transportation System

    NASA Technical Reports Server (NTRS)

    Carrillo, M. J.; Jacobsen, S. E.; Abell, J. B.; Lippiatt, T. F.

    1983-01-01

    A new analytic queueing approach was described which relates stockage levels, repair level decisions, and the project network schedule of prelaunch operations directly to the probability distribution of the space transportation system launch delay. Finite source population and limited repair capability were additional factors included in this logistics management model developed specifically for STS maintenance requirements. Data presently available to support logistics decisions were based on a comparability study of heavy aircraft components. A two-phase program is recommended by which NASA would implement an integrated data collection system, assemble logistics data from previous STS flights, revise extant logistics planning and resource requirement parameters using Bayes-Lin techniques, and adjust for uncertainty surrounding logistics systems performance parameters. The implementation of these recommendations can be expected to deliver more cost-effective logistics support.

  14. KSC ISS Logistics Support

    NASA Technical Reports Server (NTRS)

    Tellado, Joseph

    2014-01-01

    The presentation contains a status of KSC ISS Logistics Operations. It presents the current top-level ISS logistics tasks being conducted at KSC, current International Partner activities, the hardware processing flow focusing on late stow operations, a list of KSC Logistics POCs, and a backup list of logistics launch site services. This presentation is being given at the annual International Space Station (ISS) Multi-lateral Logistics Maintenance Control Panel meeting to be held in Turin, Italy during the week of May 13-16. The presentation content doesn't contain any potential lessons learned.

  15. Gauss or Bernoulli? A Monte Carlo Comparison of the Performance of the Linear Mixed-Model and the Logistic Mixed-Model Analyses in Simulated Community Trials with a Dichotomous Outcome Variable at the Individual Level.

    ERIC Educational Resources Information Center

    Hannan, Peter J.; Murray, David M.

    1996-01-01

    A Monte Carlo study compared performance of linear and logistic mixed-model analyses of simulated community trials having specific event rates, intraclass correlations, and degrees of freedom. Results indicate that in studies with adequate denominator degrees of freedom, the researcher may use either method of analysis, with certain cautions. (SLD)

  16. Logistic Stick-Breaking Process

    PubMed Central

    Ren, Lu; Du, Lan; Carin, Lawrence; Dunson, David B.

    2013-01-01

    A logistic stick-breaking process (LSBP) is proposed for non-parametric clustering of general spatially- or temporally-dependent data, imposing the belief that proximate data are more likely to be clustered together. The sticks in the LSBP are realized via multiple logistic regression functions, with shrinkage priors employed to favor contiguous and spatially localized segments. The LSBP is also extended for the simultaneous processing of multiple data sets, yielding a hierarchical logistic stick-breaking process (H-LSBP). The model parameters (atoms) within the H-LSBP are shared across the multiple learning tasks. Efficient variational Bayesian inference is derived, and comparisons are made to related techniques in the literature. Experimental analysis is performed for audio waveforms and images, and it is demonstrated that for segmentation applications the LSBP yields generally homogeneous segments with sharp boundaries. PMID:25258593
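    The stick-breaking construction behind the LSBP can be written down in a few lines: each of K-1 logistic functions claims a share of the probability that remains after the earlier sticks, and the final segment takes whatever is left. This is a generic sketch of that construction, not the authors' variational inference code.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lsbp_weights(logits):
    """Turn K-1 logistic 'sticks' into K probabilities summing to one:
    pi_k = sigmoid(z_k) * prod_{j<k} (1 - sigmoid(z_j)); the last
    component absorbs the remaining mass."""
    probs = []
    remaining = 1.0
    for z in logits:
        stick = sigmoid(z) * remaining
        probs.append(stick)
        remaining -= stick
    probs.append(remaining)
    return np.array(probs)

# Where the first stick's logit is large, the first segment dominates --
# in the LSBP those logits are regression functions of spatial location,
# which is how proximate data end up in the same cluster.
weights = lsbp_weights(np.array([3.0, 0.0, -1.0]))
```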

  17. Performance Based Budgeting Update. Information Capsule.

    ERIC Educational Resources Information Center

    Bashford, Joanne

    The report shows the performance of Miami-Dade Community College (M-DCC) (Florida) on the measures stipulated for the 2000-01 allocation of Performance Based Budgeting (PBB). Of the total state funds allocated for Performance Based Budgeting, a certain percentage is designated for each of the measures. Colleges earn "points" according to the…

  18. Analysis of Jingdong Mall Logistics Distribution Model

    NASA Astrophysics Data System (ADS)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in China has accelerated. The role of logistics has been highlighted, and more and more e-commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performing a SWOT analysis of the current situation of its self-built logistics system, identifying the problems in Jingdong Mall's current logistics distribution, and giving appropriate recommendations.

  19. Optimal distributions for multiplex logistic networks

    NASA Astrophysics Data System (ADS)

    Solá Conde, Luis E.; Used, Javier; Romance, Miguel

    2016-06-01

    This paper presents some mathematical models for distribution of goods in logistic networks based on spectral analysis of complex networks. Given a steady distribution of a finished product, some numerical algorithms are presented for computing the weights in a multiplex logistic network that reach the equilibrium dynamics with high convergence rate. As an application, the logistic networks of Germany and Spain are analyzed in terms of their convergence rates.
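    The spectral idea in this abstract can be sketched on a toy network: with a column-stochastic weight matrix, repeated redistribution converges to the steady distribution (the eigenvector for eigenvalue 1), and the gap to the second-largest eigenvalue modulus governs the convergence rate. The 4-node weights below are invented.

```python
import numpy as np

# Toy 4-node logistic network: column j of W gives the fractions of node j's
# stock shipped to each node per step (hypothetical, column-stochastic).
W = np.array([[0.0, 0.3, 0.2, 0.1],
              [0.5, 0.0, 0.3, 0.2],
              [0.3, 0.5, 0.0, 0.7],
              [0.2, 0.2, 0.5, 0.0]])

# Power iteration: repeated redistribution converges to the steady
# distribution, the eigenvector of the dominant eigenvalue 1.
x = np.full(4, 0.25)
for _ in range(2000):
    x = W @ x
    x /= x.sum()

# The gap between the two largest eigenvalue moduli sets the convergence rate.
moduli = np.sort(np.abs(np.linalg.eigvals(W)))[::-1]
spectral_gap = float(moduli[0] - moduli[1])
```

    Choosing weights that widen this gap is, in essence, what the paper's algorithms do to reach equilibrium quickly.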

  20. Optimal distributions for multiplex logistic networks.

    PubMed

    Solá Conde, Luis E; Used, Javier; Romance, Miguel

    2016-06-01

    This paper presents some mathematical models for distribution of goods in logistic networks based on spectral analysis of complex networks. Given a steady distribution of a finished product, some numerical algorithms are presented for computing the weights in a multiplex logistic network that reach the equilibrium dynamics with high convergence rate. As an application, the logistic networks of Germany and Spain are analyzed in terms of their convergence rates.

  2. Integrated Computer System of Management in Logistics

    NASA Astrophysics Data System (ADS)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  3. Nuclear Lunar Logistics Study

    NASA Technical Reports Server (NTRS)

    1963-01-01

    This document has been prepared to incorporate all presentation aid material, together with some explanatory text, used during an oral briefing on the Nuclear Lunar Logistics System given at the George C. Marshall Space Flight Center, National Aeronautics and Space Administration, on 18 July 1963. The briefing and this document are intended to present the general status of the NERVA (Nuclear Engine for Rocket Vehicle Application) nuclear rocket development, the characteristics of certain operational NERVA-class engines, and appropriate technical and schedule information. Some of the information presented herein is preliminary in nature and will be subject to further verification, checking and analysis during the remainder of the study program. In addition, more detailed information will be prepared in many areas for inclusion in a final summary report. This work has been performed by REON, a division of Aerojet-General Corporation under Subcontract 74-10039 from the Lockheed Missiles and Space Company. The presentation and this document have been prepared in partial fulfillment of the provisions of the subcontract. From the inception of the NERVA program in July 1961, the stated emphasis has centered around the demonstration of the ability of a nuclear rocket to perform safely and reliably in the space environment, with the understanding that the assignment of a mission (or missions) would place undue emphasis on performance and operational flexibility. However, all were aware that the ultimate justification for the development program must lie in the application of the nuclear propulsion system to the national space objectives.

  4. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 7: PEP logistics and training plan requirements

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Recommendations for logistics activities and logistics planning are presented, based on the assumption that a system prime contractor will perform logistics functions to support all program hardware and will implement a logistics system that includes the planning and provision of products and services to assure cost-effective coverage of the following: maintainability; maintenance; spares and supply support; fuels, pressurants and fluids; operations and maintenance documentation; training; preservation, packaging and packing; transportation and handling; storage; and logistics management information reporting. The training courses, manpower, materials, and training aids required will be identified and implemented in a training program.

  5. Perspectives on Performance-Based Incentive Plans.

    ERIC Educational Resources Information Center

    Duttweiler, Patricia Cloud; Ramos-Cancel, Maria L.

    This document is a synthesis of the current literature on performance-based incentive systems for teachers and administrators. Section one provides an introduction to the reform movement and to performance-based pay initiatives; a definition of terms; a brief discussion of funding sources; a discussion of compensation strategies; a description of…

  6. TAP 2: Performance-Based Training Manual

    SciTech Connect

    Not Available

    1993-08-01

    The cornerstone of safe operation of DOE nuclear facilities is personnel performing the day-to-day functions which accomplish the facility mission. Performance-based training is fundamental to that safe operation. This manual has been developed to support the Training Accreditation Program (TAP) and to assist contractors in their efforts to develop performance-based training programs. It provides contractors with narrative procedures on performance-based training that can be modified and incorporated for facility-specific application. It is divided into sections dealing with analysis, design, development, implementation, and evaluation.

  7. Alberta's Performance-Based Funding Mechanism.

    ERIC Educational Resources Information Center

    Barnetson, Bob

    This paper provides an overview of the performance indicator-based accountability and funding mechanism implemented in the higher education system of Alberta, Canada. The paper defines the terms accountability and regulation, examines the use of performance indicators to demonstrate accountability, and explains how performance indicator-based…

  8. Launching a performance-based pay plan.

    PubMed

    Berger, S; Moyer, J

    1991-08-19

    Performance-based compensation is increasingly replacing the annual bonus as hospitals seek ways to motivate their management. Two Ernst & Young authorities outline how to establish the incentive approach and put the performance measures in place. In the process, the performance goals should communicate what's important to the organization.

  9. Technical issues: logistics. AAMC.

    PubMed

    Stillman, P L

    1993-06-01

    The author states that she became interested in standardized patients (SPs) around 20 years ago as a means of developing a more uniform and effective way to provide instruction in and evaluation of basic clinical skills. She reflects in detail upon: (1) the logistics of using SPs in teaching; (2) how SPs are used in assessment; (3) what aspects of performance SPs can be trained to record and evaluate; (4) issues concerning checklists; (5) evaluation of interviewing skills; (6) evaluation of written communication skills; (7) the importance of defining what is being tested; (8) various kinds and uses of inter-station exercises and problems of scoring them; (9) case development and the various sources for case material; (10) ways to generate scores; (11) selecting and training SPs; (12) the role of the faculty and the primary importance of bedside training with real patients; and (13) the pros and cons of national versus single-school efforts to use SPs. She concludes by cautioning that further research must be done before SPs can be used for high-stakes certifying and licensing examinations. PMID:8507311

  10. Models of logistic regression analysis, support vector machine, and back-propagation neural network based on serum tumor markers in colorectal cancer diagnosis.

    PubMed

    Zhang, B; Liang, X L; Gao, H Y; Ye, L S; Wang, Y G

    2016-05-13

    We evaluated the application of three machine learning algorithms, including logistic regression, support vector machine and back-propagation neural network, for diagnosing congenital heart disease and colorectal cancer. By inspecting related serum tumor marker levels in colorectal cancer patients and healthy subjects, early diagnosis models for colorectal cancer were built using three machine learning algorithms to assess their corresponding diagnostic values. Except for serum alpha-fetoprotein, the levels of 11 other serum markers of patients in the colorectal cancer group were higher than those in the benign colorectal cancer group (P < 0.05). The results of logistic regression analysis indicated that individual detection of serum carcinoembryonic antigens, CA199, CA242, CA125, and CA153 and their combined detection were effective for diagnosing colorectal cancer. Combined detection had a better diagnostic effect, with a sensitivity of 94.2% and specificity of 97.7%. Combining serum carcinoembryonic antigens, CA199, CA242, CA125, and CA153, support vector machine and back-propagation neural network diagnosis models were built with diagnostic accuracies of 82 and 75%, sensitivities of 85 and 80%, and specificities of 80 and 70%, respectively. Colorectal cancer diagnosis models based on the three machine learning algorithms showed high diagnostic value and can help obtain evidence for the early diagnosis of colorectal cancer.
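
    The sensitivity and specificity figures reported for the combined-marker models reduce to simple confusion-matrix ratios. A minimal sketch, using hypothetical labels and predictions rather than the study's data:

    ```python
    def sensitivity_specificity(y_true, y_pred):
        """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).
        Labels: 1 = colorectal cancer, 0 = healthy/benign."""
        tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
        fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
        tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
        fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical predictions from a combined-marker model:
    y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
    y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
    sens, spec = sensitivity_specificity(y_true, y_pred)  # 0.75, ~0.833
    ```

    The same counts also yield the diagnostic accuracy (TP+TN over all cases) quoted for the SVM and neural network models.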

  11. Space Station fluid management logistics

    NASA Technical Reports Server (NTRS)

    Dominick, Sam M.

    1990-01-01

    Viewgraphs and discussion on space station fluid management logistics are presented. Topics covered include: fluid management logistics - issues for Space Station Freedom evolution; current fluid logistics approach; evolution of Space Station Freedom fluid resupply; launch vehicle evolution; ELV logistics system approach; logistics carrier configuration; expendable fluid/propellant carrier description; fluid carrier design concept; logistics carrier orbital operations; carrier operations at space station; summary/status of orbital fluid transfer techniques; Soviet progress tanker system; and Soviet propellant resupply system observations.

  12. Performance Based Education: A Social Alchemy.

    ERIC Educational Resources Information Center

    Clements, Millard

    1982-01-01

    An exploration of performance-based education is focused through these questions: What image of human beings does it project? What image of professionals does it project? What purpose does it serve? What image of knowledge does it project? (CT)

  13. Logistics planning for phased programs.

    NASA Technical Reports Server (NTRS)

    Cook, W. H.

    1973-01-01

    It is pointed out that the proper and early integration of logistics planning into the phased program planning process will drastically reduce logistics costs. Phased project planning is a phased approach to the planning, approval, and conduct of major research and development activity. A progressive build-up of knowledge of all aspects of the program is provided. Elements of logistics are discussed together with aspects of integrated logistics support, logistics program planning, and logistics activities for phased programs. Continuing logistics support can only be assured if there is a comprehensive sequential listing of all logistics activities tied to the program schedule and a real-time inventory of assets.

  14. Predicting language improvement in acute stroke patients presenting with aphasia: a multivariate logistic model using location-weighted atlas-based analysis of admission CT perfusion scans

    PubMed Central

    Payabvash, Seyedmehdi; Kamalian, Shahmir; Fung, Steve; Wang, Yifei; Passanese, John; Kamalian, Shervin; Souza, Leticia CS; Kemmling, Andre; Harris, Gordon J.; Halpern, Elkan F.; Gonzalez, R. Gilberto; Furie, Karen L.; Lev, Michael H.

    2013-01-01

    Purpose To construct a multivariate model for prediction of early aphasia improvement in stroke patients using admission CT perfusion (CTP) and CT angiography (CTA). Methods Fifty-eight consecutive patients with aphasia due to first-time ischemic stroke of the left hemisphere were included. Language function was assessed based on patients’ admission and discharge NIHSS and clinical records. All patients had brain CTP and CTA within 9 hours of symptom onset. For image analysis, all CTPs were automatically coregistered to MNI-152 brain space and parcellated into mirrored cortical and subcortical regions. Multiple logistic regression analysis was used to find independent imaging and clinical predictors of language recovery. Results By the time of discharge, 21 (36%) patients demonstrated improvement of language. Independent factors predicting improvement in language included relative cerebral blood flow of angular gyrus gray matter (Brodmann’s area 39) and the lower third of the insular ribbon, proximal cerebral artery occlusion on admission CTA, and aphasia score on admission NIHSS exam. Using these 4 variables, we developed a multivariate logistic regression model that could estimate the probability of early improvement in stroke patients presenting with aphasia and predict functional outcome with 91% accuracy. Conclusion An imaging-based, location-weighted multivariate model is developed to predict early language improvement of aphasic patients using admission data collected within 9 hours of stroke onset. This pilot model should be validated in a larger, prospective study; however, the semi-automated atlas-based analysis of brain CTP, along with the statistical approach, could be generalized for prediction of other outcome measures in stroke patients. PMID:20488905

  15. Multisource information fusion for logistics

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Petrov, Plamen; Noll, Warren

    2011-05-01

    Current Army logistical systems and databases contain massive amounts of data that need an effective method to extract actionable information. The databases do not contain the root-cause and case-based analysis needed to diagnose or predict breakdowns. A system is needed to gather data from as many sources as possible, process it in an integrated fashion, and disseminate information products on the readiness of the fleet vehicles. 21st Century Systems, Inc. introduces the Agent-Enabled Logistics Enterprise Intelligence System (AELEIS) tool, designed to assist logistics analysts with assessing the availability and prognostics of assets in the logistics pipeline. AELEIS extracts data from multiple, heterogeneous data sets. This data is then aggregated and mined for data trends. Finally, data reasoning tools and prognostics tools evaluate the data for relevance and potential issues. Multiple types of data mining tools may be employed to extract the data, and an information reasoning capability determines which tools are needed and how to apply them to extract information. This can be visualized as a push-pull system where data trends fire a reasoning engine to search for corroborating evidence and then integrate the data into actionable information. The architecture decides which reasoning engine to use (i.e., it may start with a rule-based method but, if needed, go to condition-based reasoning, and even a model-based reasoning engine for certain types of equipment). Initial results show that AELEIS is able to indicate to the user potential fault conditions and root-cause information mined from a database.

  16. Using Dominance Analysis to Determine Predictor Importance in Logistic Regression

    ERIC Educational Resources Information Center

    Azen, Razia; Traxel, Nicole

    2009-01-01

    This article proposes an extension of dominance analysis that allows researchers to determine the relative importance of predictors in logistic regression models. Criteria for choosing logistic regression R[superscript 2] analogues were determined and measures were selected that can be used to perform dominance analysis in logistic regression. A…
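
    Dominance analysis compares the increment each predictor adds to an R² analogue (for logistic regression, e.g., McFadden's) across all subsets of the remaining predictors. A sketch of the complete-dominance check, using hypothetical R² values rather than measures from the article:

    ```python
    from itertools import combinations

    def completely_dominates(r2, p, q, predictors):
        """True if predictor p's R2 increment is at least q's over every
        subset of the other predictors (complete dominance)."""
        others = [x for x in predictors if x not in (p, q)]
        for k in range(len(others) + 1):
            for sub in combinations(others, k):
                base = frozenset(sub)
                if r2[base | {p}] - r2[base] < r2[base | {q}] - r2[base]:
                    return False
        return True

    # Hypothetical R2 analogues (e.g., McFadden's) for each predictor subset:
    r2 = {frozenset(): 0.0,
          frozenset('a'): 0.30, frozenset('b'): 0.20, frozenset('c'): 0.10,
          frozenset('ab'): 0.40, frozenset('ac'): 0.35, frozenset('bc'): 0.25,
          frozenset('abc'): 0.45}
    ```

    With these values, `'a'` completely dominates `'b'`: its increment is larger both over the empty subset and over `{'c'}`.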

  17. Space Station - An integrated approach to operational logistics support

    NASA Technical Reports Server (NTRS)

    Hosmer, G. J.

    1986-01-01

    Development of an efficient and cost effective operational logistics system for the Space Station will require logistics planning early in the program's design and development phase. This paper will focus on Integrated Logistics Support (ILS) Program techniques and their application to the Space Station program design, production and deployment phases to assure the development of an effective and cost efficient operational logistics system. The paper will provide the methodology and time-phased programmatic steps required to establish a Space Station ILS Program that will provide an operational logistics system based on planned Space Station program logistics support.

  18. Contribution to the modelling and analysis of logistics system performance by Petri nets and simulation models: Application in a supply chain

    NASA Astrophysics Data System (ADS)

    Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said

    2016-02-01

    In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining Petri net modelling with simulation in ARENA. The linear approach typically followed for this kind of problem copes poorly with the modelling difficulty caused by the complexity and the number of parameters of concern. The approach used in this work therefore structures the modelling in a way that covers all aspects of the performance study. The structured modelling approach is first introduced and then applied to the case of an industrial system in the phosphate field. The performance indicators obtained from the developed models made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. In addition, this paper shows how the Arena software can be adopted to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.

  19. Study of the National Science Foundation's South Pole Station as an analogous data base for the logistical support of a Moon laboratory

    NASA Technical Reports Server (NTRS)

    Hickam, H. H., Jr.

    1993-01-01

    The day will come when the United States will want to return to the Earth's Moon. When that occurs, NASA may look to the Apollo program for technical and inspirational guidance. The Apollo program, however, was designed to be an end to itself--the landing of a man on the Moon and his return safely within the decade of the 1960's. When that was accomplished, the program folded because it was not self-sustaining. The next time we return to the Moon, we should base our planning on a program that is designed to be a sustained effort for an indefinite period. It is the thrust of this report that the South Pole Station of the National Science Foundation can be used to develop analogs for the construction, funding, and logistical support of a lunar base. Other analogs include transportation and national efforts versus international cooperation. A recommended lunar base using the South Pole Station as inspiration is provided, as well as details concerning economical construction of the base over a 22-year period.

  20. Vehicle Scheduling Schemes for Commercial and Emergency Logistics Integration

    PubMed Central

    Li, Xiaohui; Tan, Qingmei

    2013-01-01

    In modern logistics operations, large-scale logistics companies, besides active participation in profit-seeking commercial business, also play an essential role during an emergency relief process by dispatching urgently required materials to disaster-affected areas. Therefore, an issue widely addressed by logistics practitioners, which has attracted increasing attention from researchers, is how logistics companies can achieve maximum commercial profit on the condition that emergency tasks are performed effectively and satisfactorily. In this paper, two vehicle scheduling models are proposed to solve the problem. One is a prediction-related scheme, which predicts the amounts of disaster-relief materials and commercial business and then accepts the business that will generate maximum profits; the other is a priority-directed scheme, which first groups commercial and emergency business according to priority grades and then schedules both types of business jointly and simultaneously so as to maximize the total priority served. Moreover, computer-based simulations are carried out to evaluate the performance of these two models by comparing them with two traditional disaster-relief tactics in China. The results testify to the feasibility and effectiveness of the proposed models. PMID:24391724
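
    The priority-directed idea, reduced to an illustrative greedy sketch (the paper's actual scheduling models are richer, and all names below are hypothetical): jobs carry priority grades, and vehicles are committed to maximize the total priority served per vehicle used.

    ```python
    def priority_schedule(jobs, fleet_size):
        """Greedily accept jobs by priority per vehicle required until the
        fleet is exhausted. jobs: (name, priority, vehicles_needed)."""
        accepted, used = [], 0
        for name, priority, need in sorted(jobs, key=lambda j: -j[1] / j[2]):
            if used + need <= fleet_size:
                accepted.append(name)
                used += need
        return accepted

    # Hypothetical mixed commercial/emergency workload:
    jobs = [("emergency-relief", 10, 3),   # disaster-relief dispatch
            ("commercial-A", 4, 2),
            ("commercial-B", 3, 3)]
    plan = priority_schedule(jobs, fleet_size=5)
    # -> ['emergency-relief', 'commercial-A']
    ```

    The emergency job's high priority grade makes it the first commitment; commercial business fills the remaining fleet capacity.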

  3. The External Validity of Scores Based on the Two-Parameter Logistic Model: Some Comparisons between IRT and CTT

    ERIC Educational Resources Information Center

    Ferrando, Pere J.; Chico, Eliseo

    2007-01-01

    A theoretical advantage of item response theory (IRT) models is that trait estimates based on these models provide more test information than any other type of test score. It is still unclear, however, whether using IRT trait estimates improves external validity results in comparison with the results that can be obtained by using simple raw…

  4. Micro-Logistics Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Cirillo, William; Stromgren, Chel; Galan, Ricardo

    2008-01-01

    Traditionally, logistics analysis for space missions has focused on the delivery of elements and goods to a destination. This type of logistics analysis can be referred to as "macro-logistics". While the delivery of goods is a critical component of mission analysis, it captures only a portion of the constraints that logistics planning may impose on a mission scenario. The other component of logistics analysis concerns the local handling of goods at the destination, including storage, usage, and disposal. This type of logistics analysis, referred to as "micro-logistics", may also be a primary driver in the viability of a human lunar exploration scenario. With the rigorous constraints that will be placed upon a human lunar outpost, it is necessary to accurately evaluate micro-logistics operations in order to develop exploration scenarios that will result in an acceptable level of system performance.

  5. The Application of RFID in Third-Party Logistics

    NASA Astrophysics Data System (ADS)

    Mingxiu, Zheng; Chunchang, Fu; Minggen, Yang

    RFID is a non-contact automatic identification technology that will underpin future information storage, extraction, and processing. In recent years it has become mainstream through large-scale development and deployment. RFID is a key technology for third-party logistics information management and automation. An RFID-based logistics system can enlarge logistics operation capacity and improve labor productivity while reducing logistics operation mistakes.

  6. High performance pitch-based carbon fiber

    SciTech Connect

    Tadokoro, Hiroyuki; Tsuji, Nobuyuki; Shibata, Hirotaka; Furuyama, Masatoshi

    1996-12-31

    A high-performance pitch-based carbon fiber with a smaller diameter (six microns) was developed by Nippon Graphite Fiber Corporation. This fiber possesses a high tensile modulus, high tensile strength, excellent yarn handleability, a low thermal expansion coefficient, and high thermal conductivity, which make it an ideal material for space applications such as artificial satellites. The performance of this fiber as a reinforcement of composites was sufficient. With these characteristics, this pitch-based carbon fiber is expected to find a wide variety of applications in space structures, industrial fields, sporting goods, and civil infrastructure.

  7. Performance-Based Evaluation and School Librarians

    ERIC Educational Resources Information Center

    Church, Audrey P.

    2015-01-01

    Evaluation of instructional personnel is standard procedure in our Pre-K-12 public schools, and its purpose is to document educator effectiveness. With Race to the Top and No Child Left Behind waivers, states are required to implement performance-based evaluations that demonstrate student academic progress. This three-year study describes the…

  8. Performance-Based Rewards and Work Stress

    ERIC Educational Resources Information Center

    Ganster, Daniel C.; Kiersch, Christa E.; Marsh, Rachel E.; Bowen, Angela

    2011-01-01

    Even though reward systems play a central role in the management of organizations, their impact on stress and the well-being of workers is not well understood. We review the literature linking performance-based reward systems to various indicators of employee stress and well-being. Well-controlled experiments in field settings suggest that certain…

  9. Performance-based inspection and maintenance strategies

    SciTech Connect

    Vesely, W.E.

    1995-04-01

    Performance-based inspection and maintenance strategies utilize measures of equipment performance to help guide inspection and maintenance activities. A relevant measure of performance for safety system components is component unavailability. The component unavailability can also be input into a plant risk model such as a Probabilistic Risk Assessment (PRA) to determine the associated plant risk performance. Based on the present and projected unavailability performance, or the present and projected risk performance, the effectiveness of current maintenance activities can be evaluated and this information can be used to plan future maintenance activities. A significant amount of information other than downtimes or failure times is collected, or can be collected, when an inspection or maintenance is conducted, which can be used to estimate the component unavailability. This information generally involves observations on the condition or state of the component or component piecepart. The information can be detailed, such as the amount of corrosion buildup, or general, such as the overall state of the component described as "high degradation", "moderate degradation", or "low degradation". Much of the information collected in maintenance logs is qualitative and fuzzy. As part of an NRC research program on performance-based engineering modeling, approaches have been developed to apply Fuzzy Set Theory to information collected on the state of the component to determine the implied component or component piecepart unavailability. Demonstrations of the applications of Fuzzy Set Theory are presented utilizing information from plant maintenance logs. The demonstrations show the power of Fuzzy Set Theory in translating engineering information to reliability and risk implications.
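
    The step from fuzzy degradation states to a component unavailability estimate can be sketched as a weighted defuzzification. The membership values and representative unavailabilities below are illustrative assumptions, not values from the NRC program:

    ```python
    def defuzzify_unavailability(memberships, representative_q):
        """Centroid-style defuzzification: weighted average of the
        representative unavailability for each degradation state."""
        total = sum(memberships.values())
        return sum(memberships[s] * representative_q[s]
                   for s in memberships) / total

    # Illustrative: a maintenance-log entry judged mostly "moderate degradation".
    memberships = {"low": 0.2, "moderate": 0.7, "high": 0.1}
    representative_q = {"low": 1e-4, "moderate": 1e-3, "high": 1e-2}
    q = defuzzify_unavailability(memberships, representative_q)
    ```

    The resulting estimate (here about 1.7e-3) is the kind of unavailability figure that could then feed a PRA plant risk model.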

  10. Analysis of Logistics in Support of a Human Lunar Outpost

    NASA Technical Reports Server (NTRS)

    Cirillo, William; Earle, Kevin; Goodliff, Kandyce; Reeves, j. D.; Andrashko, Mark; Merrill, R. Gabe; Stromgren, Chel

    2008-01-01

    Strategic level analysis of the integrated behavior of lunar transportation system and lunar surface system architecture options is performed to inform NASA Constellation Program senior management on the benefit, viability, affordability, and robustness of system design choices. This paper presents an overview of the approach used to perform the campaign (strategic) analysis, with an emphasis on the logistics modeling and the impacts of logistics resupply on campaign behavior. An overview of deterministic and probabilistic analysis approaches is provided, with a discussion of the importance of each approach to understanding the integrated system behavior. The logistics required to support lunar surface habitation are analyzed from both 'macro-logistics' and 'micro-logistics' perspectives, where macro-logistics focuses on the delivery of goods to a destination and micro-logistics focuses on local handling of re-supply goods at a destination. An example campaign is provided to tie the theories of campaign analysis to results generation capabilities.

  11. Network-Based Logistic Classification with an Enhanced L1/2 Solver Reveals Biomarker and Subnetwork Signatures for Diagnosing Lung Cancer

    PubMed Central

    Huang, Hai-Hui; Liang, Yong; Liu, Xiao-Ying

    2015-01-01

    Identifying biomarker and signaling pathway is a critical step in genomic studies, in which the regularization method is a widely used feature extraction approach. However, most of the regularizers are based on L1-norm and their results are not good enough for sparsity and interpretation and are asymptotically biased, especially in genomic research. Recently, we gained a large amount of molecular interaction information about the disease-related biological processes and gathered them through various databases, which focused on many aspects of biological systems. In this paper, we use an enhanced L1/2 penalized solver to penalize network-constrained logistic regression model called an enhanced L1/2 net, where the predictors are based on gene-expression data with biologic network knowledge. Extensive simulation studies showed that our proposed approach outperforms L1 regularization, the old L1/2 penalized solver, and the Elastic net approaches in terms of classification accuracy and stability. Furthermore, we applied our method for lung cancer data analysis and found that our method achieves higher predictive accuracy than L1 regularization, the old L1/2 penalized solver, and the Elastic net approaches, while fewer but informative biomarkers and pathways are selected. PMID:26185761

  12. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    NASA Astrophysics Data System (ADS)

    Wu, Chunhung

    2016-04-01

    Little research has discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods, including landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II) methods, in an extreme rainfall-induced landslide case. The landslide inventory in the Chishan river watershed, Southwestern Taiwan, after 2009 Typhoon Morakot is the main material in this research. The Chishan river watershed is a tributary watershed of the Kaoping river watershed, which is a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10⁷ MT/yr (ranking 11th in the world). Typhoon Morakot struck Southern Taiwan from Aug. 6-10 in 2009 and dumped nearly 2,000 mm of rainfall in the Chishan river watershed. The 24-hour, 48-hour, and 72-hour accumulated rainfall in the Chishan river watershed exceeded the 200-year return period accumulated rainfall. 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after 2009 Typhoon Morakot. The total landslide area is around 33.5 km², equal to a landslide ratio of 4.1%. The main landslide types based on Varnes' (1978) classification are rotational and translational slides. The two characteristics of an extreme rainfall-induced landslide event are dense landslide distribution and a large share of downslope landslide areas owing to headward erosion and bank erosion in the flooding processes. The area of downslope landslides in the Chishan river watershed after 2009 Typhoon Morakot is 3.2 times that of upslope landslide areas. The prediction accuracy of LS models based on the LRBLR, FR, WOE, and II methods has been proven to exceed 70%. The model performance and applicability of four models in a landslide-prone watershed with dense distribution of rainfall
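
    Of the four methods compared, the frequency ratio (FR) is the simplest to state: the share of landslide cells falling in a factor class, divided by that class's share of the watershed; class ratios are then summed over factors to give a susceptibility index. A minimal sketch with hypothetical cell counts, not the Chishan watershed tabulation:

    ```python
    def frequency_ratio(landslide_in_class, cells_in_class,
                        landslide_total, cells_total):
        """FR > 1 means the class is over-represented among landslide cells."""
        return ((landslide_in_class / landslide_total) /
                (cells_in_class / cells_total))

    # Hypothetical slope-class tabulation for a watershed raster:
    fr = frequency_ratio(landslide_in_class=300, cells_in_class=5000,
                         landslide_total=1000, cells_total=50000)  # ≈ 3.0
    ```

    A class covering 10% of the watershed but containing 30% of the landslide cells gets FR ≈ 3, marking it as strongly landslide-prone.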

  13. Space Station logistics system evolution

    NASA Technical Reports Server (NTRS)

    Tucker, Michael W.

    1990-01-01

    This task investigates logistics requirements and logistics system concepts for the evolutionary Space Station. Requirements are addressed for the basic station, crew, user equipment, and free-flying platforms, as well as for manned exploration initiative elements and crews while at the Space Station. Data are provided which assess the ability of the Space Station Freedom logistics carriers to accommodate the logistics loads per year. Also, advanced carrier concepts are defined and assessed against the logistics requirements. The implications on Earth-to-orbit vehicles of accommodating the logistics requirements, using various types of carriers, are assessed on a year-by-year basis.

  14. Structural vascular disease in Africans: Performance of ethnic-specific waist circumference cut points using logistic regression and neural network analyses: The SABPA study.

    PubMed

    Botha, J; de Ridder, J H; Potgieter, J C; Steyn, H S; Malan, L

    2013-10-01

    A recently proposed model for waist circumference cut points (RPWC), driven by increased blood pressure, was demonstrated in an African population. We therefore aimed to validate the RPWC by comparing the RPWC and the Joint Statement Consensus (JSC) models via Logistic Regression (LR) and Neural Networks (NN) analyses. Urban African gender groups (N=171) were stratified according to the JSC and RPWC cut point models. Ultrasound carotid intima media thickness (CIMT), blood pressure (BP) and fasting bloods (glucose, high density lipoprotein (HDL) and triglycerides) were obtained in a well-controlled setting. The RPWC male model (LR ROC AUC: 0.71, NN ROC AUC: 0.71) was practically equal to the JSC model (LR ROC AUC: 0.71, NN ROC AUC: 0.69) to predict structural vascular disease. Similarly, the female RPWC model (LR ROC AUC: 0.84, NN ROC AUC: 0.82) and JSC model (LR ROC AUC: 0.82, NN ROC AUC: 0.81) equally predicted CIMT as surrogate marker for structural vascular disease. Odds ratios supported validity where prediction of CIMT revealed clinical significance, well over 1, for both the JSC and RPWC models in African males and females (OR 3.75-13.98). In conclusion, the proposed RPWC model was substantially validated utilizing linear and non-linear analyses. We therefore propose ethnic-specific WC cut points (African males, ≥90 cm; females, ≥98 cm) to predict a surrogate marker for structural vascular disease.
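
    The ROC AUC values used here to compare the LR and NN models have a simple rank interpretation (the Mann-Whitney statistic): the probability that a randomly chosen diseased case scores higher than a randomly chosen healthy one. A sketch with hypothetical risk scores, not the SABPA data:

    ```python
    def roc_auc(pos_scores, neg_scores):
        """Pairwise-comparison (Mann-Whitney) estimate of ROC AUC;
        ties between a positive and a negative score count half."""
        wins = sum((p > n) + 0.5 * (p == n)
                   for p in pos_scores for n in neg_scores)
        return wins / (len(pos_scores) * len(neg_scores))

    # Hypothetical model scores for diseased (pos) vs. healthy (neg) subjects:
    auc = roc_auc([0.9, 0.8, 0.6], [0.7, 0.4, 0.3])  # 8/9 ≈ 0.889
    ```

    An AUC of 0.5 corresponds to chance-level discrimination, which is why values such as 0.71 and 0.84 above indicate genuine, if modest, predictive ability.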

  15. Performance-based asphalt mixture design methodology

    NASA Astrophysics Data System (ADS)

    Ali, Al-Hosain Mansour

    performance based design procedure. Finally, the developed guidelines with easy-to-use flow charts for the integrated mix design methodology are presented.

  16. Applying simulation and logistics modeling to transportation issues

    SciTech Connect

    Funkhouser, B.R.; Ballweg, E.L.; Mackoy, R.D.

    1995-08-15

    This paper describes an application where transportation logistics and simulation tools are integrated to create a modeling environment for transportation planning. The Transportation Planning Model (TPM) is a tool developed for the Department of Energy (DOE) to aid in the long-term planning of their transportation resources. The focus of the tool is to aid DOE and Sandia National Laboratory analysts in the planning of future fleet sizes, driver and support personnel sizes, base site locations, and resource balancing among the base sites. The design approach is to develop a rapid modeling environment which integrates graphical user interfaces, logistics optimizing tools, and simulation modeling. Using the TPM an analyst can easily set up a shipment scenario and perform multiple "what if" evaluations. The TPM has been developed on personal computers using commercial off-the-shelf software tools under the Windows® operating environment.

  17. A High Performance COTS Based Computer Architecture

    NASA Astrophysics Data System (ADS)

    Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland

    2014-08-01

    Using Commercial Off The Shelf (COTS) electronic components for space applications is a long-standing idea. Indeed, the difference in processing performance and energy efficiency between radiation-hardened components and COTS components is so important that COTS components are very attractive for use in mass- and power-constrained systems. However, using COTS components in space is not straightforward, as one must account for the effects of the space environment on the COTS components' behavior. In the frame of the ESA-funded activity called High Performance COTS Based Computer, Airbus Defense and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS-based architecture for high performance processing. The rest of the paper is organized as follows: in a first section we recapitulate the interests and constraints of using COTS components for space applications; then we briefly describe existing fault mitigation architectures and present our solution for fault mitigation, based on a component called the SmartIO; in the last part of the paper we describe the prototyping activities executed during the HiP CBC project.

  18. Simulation of changes in heavy metal contamination in farmland soils of a typical manufacturing center through logistic-based cellular automata modeling.

    PubMed

    Qiu, Menglong; Wang, Qi; Li, Fangbai; Chen, Junjian; Yang, Guoyi; Liu, Liming

    2016-01-01

    A customized logistic-based cellular automata (CA) model was developed to simulate changes in heavy metal contamination (HMC) in farmland soils of Dongguan, a manufacturing center in Southern China, and to discover the relationship between HMC and related explanatory variables (continuous and categorical). The model was calibrated through the simulation and validation of HMC in 2012. Thereafter, the model was implemented for the scenario simulation of development alternatives for HMC in 2022. The HMC in 2002 and 2012 was determined through soil tests and cokriging. Continuous variables were divided into two groups by odds ratios. Positive variables (odds ratios >1) included the Nemerow synthetic pollution index in 2002, linear drainage density, distance from the city center, distance from the railway, slope, and secondary industrial output per unit of land. Negative variables (odds ratios <1) included elevation, distance from the road, distance from the key polluting enterprises, distance from the town center, soil pH, and distance from bodies of water. Categorical variables, including soil type, parent material type, organic content grade, and land use type, also significantly influenced HMC according to Wald statistics. The relative operating characteristic and kappa coefficients were 0.91 and 0.64, respectively, which proved the validity and accuracy of the model. The scenario simulation shows that the government should not only implement stricter environmental regulation but also strengthen the remediation of the current polluted area to effectively mitigate HMC.
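
    The odds ratios used above to separate positive from negative explanatory variables come from exponentiating fitted logistic-regression coefficients. A minimal sketch with invented one-dimensional data (not the paper's soil measurements) fits a single-predictor model by gradient ascent and reports exp(b1):

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """Fit P(y=1) = 1/(1+exp(-(b0 + b1*x))) by plain gradient ascent
    on the log-likelihood; returns the intercept and slope."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Invented data: contamination (y=1) becomes more likely as the
# explanatory variable x grows, so exp(b1) should exceed 1.
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)  # multiplicative change in odds per unit of x
```

    An odds ratio above 1 marks a variable that raises the probability of contamination, which is exactly how the paper sorts its continuous variables into the positive and negative groups.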

  19. Mini pressurized logistics module (MPLM)

    NASA Astrophysics Data System (ADS)

    Vallerani, E.; Brondolo, D.; Basile, L.

    1996-06-01

    The MPLM Program was initiated through a Memorandum of Understanding (MOU) between the United States' National Aeronautics and Space Administration (NASA) and the Italian Space Agency (ASI), signed on 6 December 1991. The MPLM is a pressurized logistics module that will be used to transport supplies and materials (up to 20,000 lb), including user experiments, between Earth and International Space Station Alpha (ISSA) using the Shuttle, to support active and passive storage, and to provide a habitable environment for two people when docked to the Station. The Italian Space Agency has selected Alenia Spazio to develop the MPLM modules, which have always been considered a key element of the new International Space Station, benefiting from their design flexibility and the consequent possible cost savings based on maximum utilization of the Shuttle launch capability for any mission. In the frame of the very recent agreement between the U.S. and Russia for cooperation in space, which foresees the utilization of MIR 1 hardware, the Italian MPLM will remain an important element of the logistics system, being the only pressurized module designed for re-entry. Within the new scenario of anticipated Shuttle flights to MIR 1 during Space Station phase 1, the MPLM remains a candidate for one or more missions to provide MIR 1 resupply capabilities and advanced ISSA hardware/procedures verification. Based on the concept of Flexible Carriers, Alenia Spazio is providing NASA with three MPLM flight units that can be configured according to the requirements of the Human-Tended Capability (HTC) and Permanent Human Capability (PHC) of the Space Station. Configurability will allow transportation of passive cargo only, or a combination of passive and cold cargo accommodated in R/F racks. Having developed and qualified the baseline configuration with respect to the worst enveloping condition, each unit could be easily configured to the passive or active version depending upon the

  20. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
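
    The abstract's key modeling point is that the measured holding times are not simple exponentials, so a semi-Markov process is required. The sketch below simulates such a process with hypothetical two-state transition probabilities and Weibull dwell times; none of the numbers come from the IBM 3081 measurements:

```python
import random

random.seed(0)  # reproducible run

def simulate_semi_markov(P, holding, start, steps):
    """Simulate a semi-Markov chain: transitions follow the matrix P,
    but the dwell time in each state is drawn from an arbitrary
    state-dependent distribution instead of the exponential that a
    plain Markov model would impose."""
    state, total, trace = start, 0.0, []
    for _ in range(steps):
        dwell = holding[state]()
        trace.append((state, dwell))
        total += dwell
        state = random.choices(range(len(P[state])), weights=P[state])[0]
    return total, trace

# Hypothetical two-state system: 0 = normal operation, 1 = error/recovery.
P = [[0.9, 0.1],
     [0.7, 0.3]]
holding = {
    0: lambda: random.weibullvariate(100.0, 2.0),  # non-exponential dwell
    1: lambda: random.weibullvariate(5.0, 0.7),
}
total_time, trace = simulate_semi_markov(P, holding, 0, 1000)
```

    Swapping the Weibull draws for exponential ones recovers an ordinary continuous-time Markov chain, which is the sensitivity comparison the paper performs.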

  1. An optimal hierarchical decision model for a regional logistics network with environmental impact consideration.

    PubMed

    Zhang, Dezhi; Li, Shuangyan; Qin, Jin

    2014-01-01

    This paper proposes a new model for the simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, in a regional logistics network with environmental impact considerations. The proposed model addresses the interaction among the three logistics players in a completely competitive logistics service market with CO2 emission charges. We also explicitly incorporate into the model the impacts of the economies of scale of the logistics park and of the logistics users' demand elasticity. The logistics authorities aim to maximize the total social welfare of the system, considering the demands of green logistics development through two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete on logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given the logistics operators' service fares and frequencies. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level. PMID:24977209
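
    The users' choice among competing logistics operators is resolved with a multinomial logit model. A minimal sketch of the logit demand split, using hypothetical disutilities and scale parameter (not the paper's calibrated values):

```python
import math

def logit_shares(disutilities, theta=1.0):
    """Multinomial logit split: each operator's share of demand decays
    exponentially with its perceived disutility; theta sets how
    sharply users discriminate between alternatives."""
    weights = [math.exp(-theta * u) for u in disutilities]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical perceived disutilities (fare plus time, say) for three
# competing logistics operators.
shares = logit_shares([2.0, 2.5, 3.0], theta=1.2)
```

    The shares always sum to one, and the operator with the lowest perceived disutility captures the largest share, which is the behavior the heuristic algorithm iterates on.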

  2. An Optimal Hierarchical Decision Model for a Regional Logistics Network with Environmental Impact Consideration

    PubMed Central

    Zhang, Dezhi; Li, Shuangyan

    2014-01-01

    This paper proposes a new model for the simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, in a regional logistics network with environmental impact considerations. The proposed model addresses the interaction among the three logistics players in a completely competitive logistics service market with CO2 emission charges. We also explicitly incorporate into the model the impacts of the economies of scale of the logistics park and of the logistics users' demand elasticity. The logistics authorities aim to maximize the total social welfare of the system, considering the demands of green logistics development through two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete on logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given the logistics operators' service fares and frequencies. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level. PMID:24977209

  4. Designer protein-based performance materials.

    PubMed

    Kumar, Manoj; Sanford, Karl J; Cuevas, William A; Cuevas, William P; Du, Mai; Collier, Katharine D; Chow, Nicole

    2006-09-01

    Repeat sequence protein polymer (RSPP) technology provides a platform to design and make protein-based performance polymers and represents the best nature has to offer. We report here that the RSPP platform is a novel approach to produce functional protein polymers that have both biomechanical and biofunctional blocks built into one molecule by design, using peptide motifs. We have shown that protein-based designer biopolymers can be made using recombinant DNA technology and fermentation, and they offer the ability to screen for desired properties utilizing the tremendous potential diversity of amino acid combinations. The technology also allows for large-scale manufacturing with a favorable fermentative cost structure to deliver commercially viable performance polymers. Using three diverse examples, with antimicrobial, textile-targeting, and UV-protective agents, we have introduced functional attributes into structural protein polymers and shown, for example, that the functionalized RSPPs have possible applications in biodefense, industrial biotechnology, and personal care. This new class of biobased materials will simulate natural biomaterials that can be modified for desired function and has many advantages over conventional petroleum-based polymers.

  5. The logistics of choice.

    PubMed

    Killeen, Peter R

    2015-07-01

    The generalized matching law (GML) is reconstructed as a logistic regression equation that privileges no particular value of the sensitivity parameter, a. That value will often approach 1 due to the feedback that drives switching that is intrinsic to most concurrent schedules. A model of that feedback reproduced some features of concurrent data. The GML is a law only in the strained sense that any equation that maps data is a law. The machine under the hood of matching is in all likelihood the very law that was displaced by the Matching Law. It is now time to return the Law of Effect to centrality in our science.
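
    The logistic-regression reading of the GML can be made concrete. With sensitivity a and bias log c, the proportion of behavior allocated to one option is a logistic function of the log reinforcement ratio (illustrative parameter values, not fitted data):

```python
import math

def gml_choice_proportion(r1, r2, a=1.0, log_c=0.0):
    """Generalized matching law in logistic form: the proportion of
    behavior allocated to option 1 is a logistic function of the log
    reinforcement ratio, with sensitivity a and bias log_c."""
    x = a * math.log(r1 / r2) + log_c
    return 1.0 / (1.0 + math.exp(-x))

# With a = 1 and no bias this reduces to strict matching:
# a 3:1 reinforcement ratio yields a 0.75 choice proportion.
p = gml_choice_proportion(3.0, 1.0)  # 0.75
```

    Values of a below or above 1 produce under- or overmatching, which is why the abstract stresses that the logistic form privileges no particular sensitivity value.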

  6. Transport logistics in pollen tubes.

    PubMed

    Chebli, Youssef; Kroeger, Jens; Geitmann, Anja

    2013-07-01

    Cellular organelles move within the cellular volume and the effect of the resulting drag forces on the liquid causes bulk movement in the cytosol. The movement of both organelles and cytosol leads to an overall motion pattern called cytoplasmic streaming or cyclosis. This streaming enables the active and passive transport of molecules and organelles between cellular compartments. Furthermore, the fusion and budding of vesicles with and from the plasma membrane (exo/endocytosis) allow for transport of material between the inside and the outside of the cell. In the pollen tube, cytoplasmic streaming and exo/endocytosis are very active and fulfill several different functions. In this review, we focus on the logistics of intracellular motion and transport processes as well as their biophysical underpinnings. We discuss various modeling attempts that have been performed to understand both long-distance shuttling and short-distance targeting of organelles. We show how the combination of mechanical and mathematical modeling with cell biological approaches has contributed to our understanding of intracellular transport logistics.

  7. Advanced colorectal neoplasia risk stratification by penalized logistic regression.

    PubMed

    Lin, Yunzhi; Yu, Menggang; Wang, Sijian; Chappell, Richard; Imperiale, Thomas F

    2016-08-01

    Colorectal cancer is the second leading cause of death from cancer in the United States. To facilitate the efficiency of colorectal cancer screening, there is a need to stratify risk for colorectal cancer among the 90% of US residents who are considered "average risk." In this article, we investigate such risk stratification rules for advanced colorectal neoplasia (colorectal cancer and advanced, precancerous polyps). We use a recently completed large cohort study of subjects who underwent a first screening colonoscopy. Logistic regression models have been used in the literature to estimate the risk of advanced colorectal neoplasia based on quantifiable risk factors. However, logistic regression may be prone to overfitting and instability in variable selection. Since most of the risk factors in our study have several categories, it was tempting to collapse these categories into fewer risk groups. We propose a penalized logistic regression method that automatically and simultaneously selects variables, groups categories, and estimates their coefficients by penalizing the [Formula: see text]-norm of both the coefficients and their differences. Hence, it encourages sparsity in the categories, i.e. grouping of the categories, and sparsity in the variables, i.e. variable selection. We apply the penalized logistic regression method to our data. The important variables are selected, with close categories simultaneously grouped, by penalized regression models with and without the interactions terms. The models are validated with 10-fold cross-validation. The receiver operating characteristic curves of the penalized regression models dominate the receiver operating characteristic curve of naive logistic regressions, indicating a superior discriminative performance.
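
    The penalty structure described above can be written out directly: one absolute-value term on the coefficients encourages variable selection, and one on differences of adjacent category coefficients encourages grouping. This sketch shows only the fused-lasso-style objective; the paper's exact norm and optimizer may differ:

```python
import math

def penalized_nll(beta, X, y, lam):
    """Logistic negative log-likelihood plus a fused-lasso-style
    penalty: lam * (sum_j |b_j| + sum_j |b_j - b_{j+1}|). The first
    term pushes coefficients to zero (variable selection); the second
    pulls adjacent category coefficients together (grouping)."""
    nll = 0.0
    for xi, yi in zip(X, y):
        z = sum(b * x for b, x in zip(beta, xi))
        nll += math.log(1.0 + math.exp(z)) - yi * z
    sparsity = sum(abs(b) for b in beta)
    fusion = sum(abs(beta[j] - beta[j + 1]) for j in range(len(beta) - 1))
    return nll + lam * (sparsity + fusion)
```

    Minimizing this objective over ordered category coefficients yields exactly solutions with some coefficients at zero and some adjacent coefficients tied, i.e. collapsed risk groups.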

  8. Performance-based assessment of reconstructed images

    SciTech Connect

    Hanson, Kenneth

    2009-01-01

    During the early 90s, I engaged in a productive and enjoyable collaboration with Robert Wagner and his colleague, Kyle Myers. We explored the ramifications of the principle that the quality of an image should be assessed on the basis of how well it facilitates the performance of appropriate visual tasks. We applied this principle to algorithms used to reconstruct scenes from incomplete and/or noisy projection data. For binary visual tasks, we used both the conventional disk detection and a new challenging task, inspired by the Rayleigh resolution criterion, of deciding whether an object was a blurred version of two dots or a bar. The results of human and machine observer tests were summarized with the detectability index based on the area under the ROC curve. We investigated a variety of reconstruction algorithms, including ART, with and without a nonnegativity constraint, and the MEMSYS3 algorithm. We concluded that performance of the Rayleigh task was optimized when the strength of the prior was near MEMSYS's default 'classic' value for both human and machine observers. A notable result was that the most-often-used metric of rms error in the reconstruction was not necessarily indicative of the value of a reconstructed image for the purpose of performing visual tasks.

  9. Modeling Student Performance in Mathematics Using Binary Logistic Regression at Selected Secondary Schools a Case Study of Mtwara Municipality and Ilemela District

    ERIC Educational Resources Information Center

    Mabula, Salyungu

    2015-01-01

    This study investigated the performance of secondary school students in Mathematics at the Selected Secondary Schools in Mtwara Municipality and Ilemela District by Absenteeism, Conduct, Type of School and Gender as explanatory Factors. The data used in the study was collected from documented records of 250 form three students with 1:1 gender…

  10. TAP 2, Performance-Based Training Manual

    SciTech Connect

    Not Available

    1991-07-01

    Training programs at DOE nuclear facilities should provide well-trained, qualified personnel to safely and efficiently operate the facilities in accordance with DOE requirements. A need has been identified for guidance regarding the analysis, design, development, implementation, and evaluation of consistent and reliable performance-based training programs. Accreditation of training programs at Category A reactors and at high-hazard and selected moderate-hazard nonreactor facilities will assure consistent, appropriate, and cost-effective training of personnel responsible for the operation, maintenance, and technical support of these facilities. Training programs that are systematically designed around job requirements, instead of subjective estimates of trainee needs, yield training activities that are consistent and that develop or improve knowledge, skills, and abilities directly related to the work setting. Because the training is job-related, the content of these programs more efficiently and effectively meets the needs of the employee. Besides a better-trained workforce, a greater level of operational reactor safety can be realized. This manual is intended to provide an overview of the accreditation process and a brief description of the elements necessary to construct and maintain training programs that are based on the requirements of the job. Two companion manuals provide additional information to assist contractors in their efforts to accredit training programs.

  11. The Relationship between School Quality and the Probability of Passing Standards-Based High-Stakes Performance Assessments. CSE Technical Report 644

    ERIC Educational Resources Information Center

    Goldschmidt, Pete; Martinez-Fernandez, Jose-Felipe

    2004-01-01

    We examine whether school quality affects passing the California High School Exit Exam (CAHSEE), which is a standards-based high-stakes performance assessment. We use 3-level hierarchical logistic and linear models to examine student probabilities of passing the CAHSEE to take advantage of the availability of student, teacher, and school level…

  12. Walk-through survey report: HVLV (high velocity low volume) control technology for aircraft bonded wing and radome maintenance at Air Force Logistics Command, McClellan Air Force Base, Sacramento, California

    SciTech Connect

    Hollett, B.A.

    1983-08-01

    A walk through survey was conducted at the Sacramento Air Logistics Center, McClellan Air Force Base, California, on June 13, 1983, to evaluate the use of High Velocity Low Volume (HVLV) technology in the aircraft-maintenance industry. The HVLV system consisted of 65 ceiling drops in the bonded honeycomb shop where grinding and sanding operations created glass fiber and resin dusts. Preemployment and periodic physical examinations were required. Workers were required to wear disposable coveralls, and disposable dust masks were available. Workers walked through decontamination air jet showers before leaving the area to change clothes. Environmental monitoring revealed no significant dust exposures when the HVLV system was in use. Performance of the exhaust system on the eight-inch-diameter nose cone sanding operation was good, but the three-inch-diameter tools were too large and the shrouds too cumbersome for use on many hand-finishing tasks. The author concludes that the HVLV system is partially successful but requires additional shroud design. Further development of small tool shrouds is recommended.

  13. Classifying machinery condition using oil samples and binary logistic regression

    NASA Astrophysics Data System (ADS)

    Phillips, J.; Cripps, E.; Lau, John W.; Hodkiewicz, M. R.

    2015-08-01

    The era of big data has resulted in an explosion of condition monitoring information. The result is an increasing motivation to automate the costly and time-consuming human elements involved in the classification of machine health. When working with industry it is important to build an understanding of, and hence some trust in, the classification scheme for those who use the analysis to initiate maintenance tasks. Typical "black box" approaches such as artificial neural networks (ANN) and support vector machines (SVM) can be difficult to interpret. In contrast, this paper argues that logistic regression offers easy interpretability to industry experts, providing insight into the drivers of the human classification process and into the ramifications of potential misclassification. Of course, accuracy is of foremost importance in any automated classification scheme, so we also provide a comparative study based on the predictive performance of logistic regression, ANN and SVM. A real-world oil analysis data set from engines on mining trucks is presented, and using cross-validation we demonstrate that logistic regression outperforms the ANN and SVM approaches in terms of prediction for healthy/not healthy engines.
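
    The reported comparison rests on cross-validation. A generic k-fold index splitter (a sketch, not the paper's exact protocol) shows the train/test partitioning used to score each classifier out of sample:

```python
import random

def k_fold_indices(n, k, seed=0):
    """Shuffle n sample indices and split them into k disjoint folds,
    returning (train, test) index pairs for cross-validation."""
    rng = random.Random(seed)
    idx = list(range(n))
    rng.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    return [(sorted(set(idx) - set(fold)), sorted(fold)) for fold in folds]

# Ten samples, three folds: every sample lands in exactly one test fold.
splits = k_fold_indices(10, 3)
```

    Each model is fitted on the train indices and scored on the held-out fold, so the accuracy comparison reflects prediction rather than in-sample fit.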

  14. High Performance Oxides-Based Thermoelectric Materials

    NASA Astrophysics Data System (ADS)

    Ren, Guangkun; Lan, Jinle; Zeng, Chengcheng; Liu, Yaochun; Zhan, Bin; Butt, Sajid; Lin, Yuan-Hua; Nan, Ce-Wen

    2015-01-01

    Thermoelectric materials have attracted much attention due to their applications in waste-heat recovery, power generation, and solid-state cooling. In comparison with thermoelectric alloys, oxide semiconductors, which are thermally and chemically stable in air at high temperature, are regarded as candidates for high-temperature thermoelectric applications. However, their figure-of-merit ZT value has remained low, around 0.1-0.4, for more than 20 years. The poor performance in oxides is ascribed to low electrical conductivity and high thermal conductivity. Since the electrical transport properties in these thermoelectric oxides are strongly correlated, it is difficult to improve both the thermoelectric power and the electrical conductivity simultaneously by conventional methods. This review summarizes recent progress on high-performance oxide-based thermoelectric bulk materials, including n-type ZnO, SrTiO3, and In2O3, and p-type Ca3Co4O9, BiCuSeO, and NiO, enhanced by heavy-element doping, band engineering, and nanostructuring.

  15. Use of Ubiquitous Technologies in Military Logistic System in Iran

    NASA Astrophysics Data System (ADS)

    Jafari, P.; Sadeghi-Niaraki, A.

    2013-09-01

    This study concerns the integration and evaluation of RFID and ubiquitous technologies in military logistic system management. Firstly, supply chain management and the necessity of a revolution in logistic systems, especially in the military area, are explained. Secondly, RFID and ubiquitous technologies and the advantages of their use in supply chain management are introduced. Lastly, a system based on these technologies for controlling and increasing the speed and accuracy of the military logistic system in Iran, with its unique properties, is presented. The system is based on full control of military logistics (supplies) from the time of deployment to replenishment using sensor network, ubiquitous, and RFID technologies.

  16. A Remote Sensing Based Approach for the Assessment of Debris Flow Hazards Using Artificial Neural Network and Binary Logistic Regression Modeling

    NASA Astrophysics Data System (ADS)

    El Kadiri, R.; Sultan, M.; Elbayoumi, T.; Sefry, S.

    2013-12-01

    Efforts to map the distribution of debris flows, to assess the factors controlling their development, and to identify the areas prone to their development are often hampered by the absence or paucity of appropriate monitoring systems and historical databases and by the inaccessibility of these areas in many parts of the world. We developed methodologies that rely heavily on readily available observations extracted from remote sensing datasets and successfully applied these techniques over the Jazan province, in the Red Sea hills of Saudi Arabia. We first identified debris flows (10,334 locations) from high spatial resolution satellite datasets (e.g., GeoEye, Orbview), and verified a subset of these occurrences in the field. We then constructed a GIS to host the identified debris flow locations together with co-registered relevant data (e.g., lithology, elevation) and derived products (e.g., slope, normalized difference vegetation index, etc.). Spatial analysis of the datasets in the GIS indicated various degrees of correspondence between the distribution of debris flows and various variables (e.g., stream power index, topographic position index, normalized difference vegetation index, distance to stream, flow accumulation, slope and soil weathering index, aspect, elevation), suggesting a causal effect. For example, debris flows were found in areas of high slope, low distance to low stream orders, and low vegetation index. To evaluate the extent to which these factors control landslide distribution, we constructed and applied: (1) a stepwise input selection procedure testing all input combinations to make the final model more compact and effective, (2) a statistic-based binary logistic regression (BLR) model, and (3) a mathematical-based artificial neural network (ANN) model. Only 80% (8267 locations) of the data was used for the construction of each of the models, and the remaining samples (2067 locations) were used for accuracy assessment purposes. Results

  17. Space Shuttle operational logistics plan

    NASA Technical Reports Server (NTRS)

    Botts, J. W.

    1983-01-01

    This document describes the Kennedy Space Center plan for logistics to support Space Shuttle operations and establishes the related policies, requirements, and responsibilities. It implements the Directorate of Shuttle Management and Operations logistics responsibilities required by the Kennedy Organizational Manual, along with the self-sufficiency contracting concept. It also implements the Space Shuttle Program Level 1 and Level 2 logistics policies and requirements applicable to KSC that are presented in HQ NASA and Johnson Space Center directives.

  18. Logistics Lessons Learned in NASA Space Flight

    NASA Technical Reports Server (NTRS)

    Evans, William A.; DeWeck, Olivier; Laufer, Deanna; Shull, Sarah

    2006-01-01

    Information System (LLIS) and verified that we received the same result using the internal version of LLIS for our logistics lesson searches. In conducting the research, information from multiple databases was consolidated into a single spreadsheet of 300 lessons learned. Keywords were applied for the purpose of sorting and evaluation. Once the lessons had been compiled, an analysis of the resulting data was performed, first sorting it by keyword, then finding duplication and root cause, and finally sorting by root cause. The data was then distilled into the top 7 lessons learned across programs, centers, and activities.

  19. The Design of Logistics Information Matching Platform for Highway Transportation

    NASA Astrophysics Data System (ADS)

    Chen, Daqiang; Zhu, Xiaoxiao; Tong, Bing; Shen, Xiahong; Feng, Tao

    The state of logistics during the financial crisis requires that shippers' and carriers' overall goals focus on cost reduction. This paper first analyzes the present problem of information mismatch between shippers and carriers, and describes their demand for an information platform. Then, based on a requirement investigation and questionnaire statistics, the specific demands on a logistics information matching platform are analyzed. Finally, a logistics information matching platform system for highway transportation is designed.

  20. Reverse logistics in the construction industry.

    PubMed

    Hosseini, M Reza; Rameezdeen, Raufdeen; Chileshe, Nicholas; Lehmann, Steffen

    2015-06-01

    Reverse logistics in construction refers to the movement of products and materials from salvaged buildings to a new construction site. While there is a plethora of studies looking at various aspects of the reverse logistics chain, there is no systematic review of the literature on this important subject as applied to the construction industry. Therefore, the objective of this study is to integrate the fragmented body of knowledge on reverse logistics in construction, with the aim of promoting the concept among industry stakeholders and the wider construction community. Through a qualitative meta-analysis, the study synthesises the findings of previous studies and presents some actions needed by industry stakeholders to promote this concept within the real-life context. First, the trend of research and the terminology related to reverse logistics are introduced. Second, it unearths the main advantages and barriers of reverse logistics in construction while providing some suggestions to harness the advantages and mitigate the barriers. Finally, it provides future research directions based on the review. PMID:26018543

  2. Using Cultural Algorithms to Improve Intelligent Logistics

    NASA Astrophysics Data System (ADS)

    Ochoa, Alberto; García, Yazmani; Yañez, Javier; Teymanoglu, Yaddik

    Today, logistics is so important within companies that some have departments devoted exclusively to it. It has evolved over time and is now a fundamental part of the competition between businesses seeking to consolidate their position or remain leaders in their field. Logistics can be divided into different classes; our study, however, focuses on timely distribution to the customer with lower cost, higher sales, and better utilization of space, resulting in excellent service. Finally, we prepare a comparative analysis of the results with respect to another method of optimizing over the solution space.

  3. Development of a comprehensive logistics and warfighting simulation system.

    SciTech Connect

    Hummel, J. R.

    1998-08-12

    An efficient logistics system is critical to the success of military operations. Recently, the Department of Defense (DoD) has begun to move from a "just in case" logistics system that relies on large stores of inventoried materials toward a "just in time" system based on obtaining and delivering supplies when and where they are needed. For this new logistics concept to operate smoothly and responsively and be highly robust, one must understand the interrelationships between warfighting and logistics, such as the impact of losses of logistics links/nodes and the changing pace of warfighting operations. Two DoD programs, the Distributed Intelligent Agents for Logistics (DIAL) and the Warfighting Logistics Technology and Assessment Environment (WLTAE), are focusing on different aspects of this problem. These programs are being integrated to develop a Comprehensive Logistics and Warfighting System (CLAWS) that can be used to address a variety of different logistics applications in the military arena. In this paper, we describe how CLAWS will be developed, including the development of a generalized Federation Object Model that could be used in a variety of logistics and military operations applications.

  4. The Effects of Performance-Based Assessment Criteria on Student Performance and Self-Assessment Skills

    ERIC Educational Resources Information Center

    Fastre, Greet Mia Jos; van der Klink, Marcel R.; van Merrienboer, Jeroen J. G.

    2010-01-01

    This study investigated the effect of performance-based versus competence-based assessment criteria on task performance and self-assessment skills among 39 novice secondary vocational education students in the domain of nursing and care. In a performance-based assessment group students are provided with a preset list of performance-based…

  5. Performance Demonstration Based Probability of Detection (POD) Curves for Fatigue Cracks in Piping

    SciTech Connect

    Gosselin, Stephen R.; Simonen, Fredric A.; Heasler, Patrick G.; Becker, F. L.; Doctor, Steven R.; Carter, R. G.

    2005-07-01

    This paper evaluates non-destructive examination (NDE) detection capabilities for fatigue cracks in piping. Industry performance demonstration initiative (PDI) data for fatigue crack detection were used to develop a matrix of statistically based probability of detection (POD) curves that consider various NDE performance factors. Seven primary performance factors were identified – Material, Crack Geometry/Type, NDE Examination Access, NDE Procedure, Examiner Qualification, Pipe Diameter, and Pipe Wall Thickness. A database of 16,181 NDE performance observations, with 18 fields associated with each observation, was created and used to develop statistically based POD curves for 42 stainless steel and 14 carbon steel performance cases. Subsequent comparisons of the POD fits for each of the cases showed that excellent NDE performance for fatigue cracks can be expected for ferritic materials. Very little difference was observed between the POD curves for the 14 carbon steel performance cases considered in this study, and NDE performance could therefore be represented by a single POD curve. For stainless steel, very good performance can also be expected for circumferential cracks located on the same side of the weld from which the NDE examination is made. POD depended primarily on component thickness. Three POD curves for stainless steel were prepared. Best estimates and the associated 95% confidence bounds for POD versus through-wall depth, from logistic regression of the digital data, are provided. Probabilistic fracture mechanics (PFM) calculations were performed to compare best estimate leak probabilities obtained from both the new performance-based POD curves and previous PFM models. This work was performed under joint funding by EPRI and the U.S. Department of Energy (DOE), Office of Nuclear Energy Science and Technology’s Nuclear Energy Plant Optimization (NEPO) program.
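
    The statistically based POD curves described above are logistic fits of detection probability against flaw depth. A minimal sketch of that functional form (the coefficients below are hypothetical illustrations, not fitted PDI values):

```python
import math

def pod(depth, b0, b1):
    """Logistic POD curve: POD(d) = 1 / (1 + exp(-(b0 + b1*d)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * depth)))

def depth_for_pod(target, b0, b1):
    """Invert the curve: through-wall depth (same units as the fit) at which
    detection probability reaches `target`, e.g. the a90 point used in NDE."""
    return (math.log(target / (1.0 - target)) - b0) / b1

# hypothetical coefficients, depth expressed in percent through-wall
b0, b1 = -2.0, 0.15
print(round(pod(30.0, b0, b1), 3))           # POD for a 30% through-wall crack
print(round(depth_for_pod(0.9, b0, b1), 1))  # depth at which POD reaches 90%
```

    The 95% confidence bounds reported in the paper would come from the uncertainty of the fitted b0 and b1, which this sketch omits.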

  6. Logistic Regression: Concept and Application

    ERIC Educational Resources Information Center

    Cokluk, Omay

    2010-01-01

    The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…

  7. A Solution to Separation and Multicollinearity in Multiple Logistic Regression.

    PubMed

    Shen, Jianzhao; Gao, Sujuan

    2008-10-01

    In dementia screening tests, item selection for shortening an existing screening test can be achieved using multiple logistic regression. However, maximum likelihood estimates for such logistic regression models often experience serious bias or even non-existence because of separation and multicollinearity problems resulting from a large number of highly correlated items. Firth (1993, Biometrika, 80(1), 27-38) proposed a penalized likelihood estimator for generalized linear models, which was shown to reduce the bias and non-existence problems. Ridge regression has been used in logistic regression to stabilize the estimates in cases of multicollinearity. However, neither method solves the problem addressed by the other. In this paper, we propose a double penalized maximum likelihood estimator combining Firth's penalized likelihood equation with a ridge parameter. We present a simulation study evaluating the empirical performance of the double penalized likelihood estimator in small to moderate sample sizes. We demonstrate the proposed approach using current screening data from a community-based dementia study.
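
    To make the separation problem concrete: with completely separated data, unpenalized maximum likelihood pushes the slope toward infinity, while any positive L2 penalty keeps it finite. The toy fit below is a minimal sketch of the ridge half only (plain gradient ascent in pure Python; it omits the Firth penalty that the proposed double penalized estimator adds):

```python
import math

def fit_ridge_logistic(X, y, lam=0.1, lr=0.1, iters=2000):
    """Fit logistic regression by maximizing the L2-penalized log-likelihood
    with plain gradient ascent. lam=0 recovers ordinary maximum likelihood,
    which diverges under complete separation; lam > 0 keeps estimates finite."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            mu = 1.0 / (1.0 + math.exp(-sum(b * x for b, x in zip(beta, xi))))
            for j in range(p):
                grad[j] += (yi - mu) * xi[j]
        for j in range(p):
            # the -2*lam*beta term is the ridge penalty's shrinkage toward zero
            beta[j] += lr * (grad[j] / n - 2.0 * lam * beta[j])
    return beta

# completely separated toy data: x < 0 -> class 0, x > 0 -> class 1
X = [[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]]  # column of 1s = intercept
y = [0, 0, 1, 1]
print(fit_ridge_logistic(X, y, lam=0.1))
```

    With lam=0.1 the slope settles near 1.1 here; with lam=0 it would keep growing with every iteration, which is the divergence the penalty prevents.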

  8. Teaching Logistics without Formal Classes: A Case Study

    ERIC Educational Resources Information Center

    Carravilla, Maria Antonia; Oliveira, Jose Fernando

    2004-01-01

    This paper describes a case study concerning the teaching of logistics in the Computers and Electrical Engineering degree at FEUP. The logistics course is taken in the last semester of the degree and there are no lectures given by the teachers. All the learning strategy is based upon the autonomous learning capacity of the students, following the…

  9. A Behavior-Based Employee Performance System.

    ERIC Educational Resources Information Center

    Abernathy, William B.

    2003-01-01

    Discusses human performance technology models for describing and understanding factors involved in day-to-day functioning of employees and then to develop specific remedial interventions as needed, and contrasts it to an organizational performance system perspective used to design an organization before employees are even hired to prevent bad…

  10. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  11. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  12. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  13. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  14. 48 CFR 32.1002 - Bases for performance-based payments.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Bases for performance... REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 32.1002 Bases for performance-based payments. Performance-based payments may be made on any of the following bases:...

  15. GIS-based groundwater spring potential mapping in the Sultan Mountains (Konya, Turkey) using frequency ratio, weights of evidence and logistic regression methods and their comparison

    NASA Astrophysics Data System (ADS)

    Ozdemir, Adnan

    2011-12-01

    In this study, groundwater spring potential maps produced by three different methods, frequency ratio, weights of evidence, and logistic regression, were evaluated using validation data sets and compared to each other. Groundwater spring occurrence potential maps in the Sultan Mountains (Konya, Turkey) were constructed using the relationship between groundwater spring locations and their causative factors. Groundwater spring locations were identified in the study area from a topographic map. Different thematic maps of the study area, such as geology, topography, geomorphology, hydrology, and land use/cover, have been used to identify groundwater potential zones. Seventeen spring-related parameter layers of the entire study area were used to generate groundwater spring potential maps. These are geology (lithology), fault density, distance to fault, relative permeability of lithologies, elevation, slope aspect, slope steepness, curvature, plan curvature, profile curvature, topographic wetness index, stream power index, sediment transport capacity index, drainage density, distance to drainage, land use/cover, and precipitation. The predictive capability of each model was determined by the area under the relative operating characteristic curve. The areas under the curve for frequency ratio, weights of evidence and logistic regression methods were calculated as 0.903, 0.880, and 0.840, respectively. These results indicate that frequency ratio and weights of evidence models are relatively good estimators, whereas logistic regression is a relatively poor estimator of groundwater spring potential mapping in the study area. The frequency ratio model is simple; the process of input, calculation and output can be readily understood. The produced groundwater spring potential maps can serve planners and engineers in groundwater development plans and land-use planning.
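
    The AUC values quoted (0.903, 0.880, 0.840) have a simple rank interpretation: the probability that a randomly chosen spring location is scored higher than a randomly chosen non-spring location. A minimal sketch of computing AUC that way (my illustration, not part of the study's GIS workflow):

```python
def roc_auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney U statistic: the fraction of (positive,
    negative) pairs in which the positive scores higher; ties count one half."""
    pairs = len(pos_scores) * len(neg_scores)
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / pairs

# e.g. model scores at known spring vs. non-spring validation cells
print(roc_auc([0.9, 0.8, 0.7], [0.6, 0.5, 0.4]))  # 1.0: perfect ranking
```

    A value of 0.5 means the map ranks spring and non-spring cells no better than chance, which is why 0.903 versus 0.840 marks frequency ratio as the stronger estimator here.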

  16. Cognitively automated assembly processes: a simulation based evaluation of performance.

    PubMed

    Mayer, Marcel Ph; Odenthal, Barbara; Faber, Marco; Schlick, Christopher M

    2012-01-01

    The numerical control of an experimental assembly cell with two robots--termed a cognitive control unit (CCU)--is able to simulate human information processing at a rule-based level of cognitive control. To enable the CCU to work on a large range of assembly tasks expected of a human operator, the cognitive architecture SOAR is used. The CCU can plan assembly processes autonomously and react to ad-hoc changes in assembly sequences effectively. Extensive simulation studies have shown that cognitive automation based on SOAR is especially suitable for random parts supply, which reduces planning effort in logistics. Conversely, a disproportionate increase in processing time was observed for deterministic parts supply, especially for assemblies containing large numbers of identical parts. In this contribution, the effect of phase-shifts in deterministic part supply is investigated for assemblies containing maximally different parts. It can be shown that the concept of cognitive automation is also suitable for these planning problems.

  17. Multi-Purpose Logistics Module (MPLM) Cargo Heat Exchanger

    NASA Technical Reports Server (NTRS)

    Zampiceni, John J.; Harper, Lon T.

    2002-01-01

    This paper describes the Shuttle Orbiter's new Multi-Purpose Logistics Module (MPLM) Cargo Heat Exchanger (HX) and associated MPLM cooling system. It presents the HX design and the performance characteristics of the system.

  18. A fast, streaming SIMD Extensions 2, logistic squashing function.

    PubMed

    Milner, J J; Grandison, A J

    2008-12-01

    Schraudolph proposed an excellent exponential approximation providing increased performance particularly suited to the logistic squashing function used within many neural networking applications. This note applies Intel's streaming SIMD Extensions 2 (SSE2), where SIMD is single instruction multiple data, of the Pentium IV class processor to Schraudolph's technique, further increasing the performance of the logistic squashing function. It was found that the calculation of the new 32-bit SSE2 logistic squashing function described here was up to 38 times faster than the conventional exponential function and up to 16 times faster than a Schraudolph-style 32-bit method on an Intel Pentium D 3.6 GHz CPU. PMID:18624654
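
    Schraudolph's approximation works by writing a scaled-and-shifted copy of x directly into the exponent field (the upper 32 bits) of an IEEE-754 double, so that reinterpreting the bits yields roughly exp(x). A scalar sketch in portable form (pure Python rather than SSE2 intrinsics; the constants are Schraudolph's published ones, and relative error is on the order of a few percent):

```python
import math
import struct

EXP_A = 1048576 / math.log(2)  # 2**20 / ln 2: scales x into the exponent field
EXP_C = 60801                  # Schraudolph's correction term

def fast_exp(x):
    """Approximate exp(x): place int(EXP_A*x + b) in the high 32 bits of a
    double and reinterpret the bits. 1072693248 is the high word of 1.0."""
    i = int(EXP_A * x + (1072693248 - EXP_C))
    return struct.unpack('<d', struct.pack('<q', i << 32))[0]

def fast_logistic(x):
    """Approximate logistic squashing function 1/(1 + exp(-x))."""
    return 1.0 / (1.0 + fast_exp(-x))
```

    The SSE2 version in the paper applies the same idea in 32-bit precision across packed lanes; the quoted speedups come from that vectorisation plus avoiding the library exponential entirely.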

  19. Nowcasting sunshine number using logistic modeling

    NASA Astrophysics Data System (ADS)

    Brabec, Marek; Badescu, Viorel; Paulescu, Marius

    2013-04-01

    In this paper, we present a formalized approach to statistical modeling of the sunshine number, a binary indicator of whether the Sun is covered by clouds, introduced previously by Badescu (Theor Appl Climatol 72:127-136, 2002). Our statistical approach is based on a Markov chain and logistic regression and yields fully specified probability models that are relatively easily identified (and their unknown parameters estimated) from a set of empirical data (observed sunshine number and sunshine stability number series). We discuss the general structure of the model and its advantages, demonstrate its performance on real data, and compare its results to the classical ARIMA approach as a competitor. Since the model parameters have a clear interpretation, we also illustrate how, e.g., their inter-seasonal stability can be tested. We conclude with an outlook on future developments oriented toward the construction of models allowing the practically desirable smooth transition between data observed at different frequencies, and with a short discussion of the technical problems that such a goal brings.
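
    For a binary series, the Markov chain and logistic regression ingredients coincide in the simplest case: a logistic regression of the current state on the lagged state, logit P(s_t = 1 | s_(t-1)) = b0 + b1*s_(t-1), is a saturated model, so its maximum-likelihood fit reduces to transition frequencies. A minimal sketch of that equivalence (an illustration only; the paper's model is richer, using covariates and the sunshine stability number series):

```python
import math

def fit_lagged_logistic(series):
    """ML fit of logit P(s_t = 1 | s_(t-1)) = b0 + b1*s_(t-1) for a 0/1 series.
    Saturated model, so the fit reduces to the two transition frequencies.
    Assumes both states occur as predecessors and no frequency is 0 or 1."""
    from_0 = [b for a, b in zip(series, series[1:]) if a == 0]
    from_1 = [b for a, b in zip(series, series[1:]) if a == 1]
    p01 = sum(from_0) / len(from_0)   # P(clear step | previous step cloudy)
    p11 = sum(from_1) / len(from_1)   # P(clear step | previous step clear)
    logit = lambda p: math.log(p / (1.0 - p))
    b0 = logit(p01)
    b1 = logit(p11) - b0              # b1 > 0 means the sunshine state persists
    return b0, b1

series = [0, 0, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
print(fit_lagged_logistic(series))
```
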

  20. Performance of Skutterudite-Based Modules

    NASA Astrophysics Data System (ADS)

    Nie, G.; Suzuki, S.; Tomida, T.; Sumiyoshi, A.; Ochi, T.; Mukaiyama, K.; Kikuchi, M.; Guo, J. Q.; Yamamoto, A.; Obara, H.

    2016-08-01

    Due to their excellent thermoelectric (TE) performance, skutterudite materials have been selected by many laboratories and companies for development of TE modules to recover power from waste heat at high temperatures (300°C to 600°C). After years of effort, we have developed reliable n- and p-type skutterudite materials showing a maximum figure of merit (ZT) of 1.0 at 550°C and 0.75 at 450°C, respectively. In this work, we systematically investigated the performance of a module made using these two kinds of skutterudite. We demonstrate ~7.2% conversion efficiency for a temperature of 600°C at the hot side of the module and 50°C at the cold side, and show that the module had excellent stability in the high-temperature environment. After further improving the TE performance of our skutterudites, the conversion efficiency reached ~8.5% under the same conditions.

  1. Logistics in smallpox: the legacy.

    PubMed

    Wickett, John; Carrasco, Peter

    2011-12-30

    Logistics, defined as "the time-related positioning of resources" was critical to the implementation of the smallpox eradication strategy of surveillance and containment. Logistical challenges in the smallpox programme included vaccine delivery, supplies, staffing, vehicle maintenance, and financing. Ensuring mobility was essential as health workers had to travel to outbreaks to contain them. Three examples illustrate a range of logistic challenges which required imagination and innovation. Standard price lists were developed to expedite vehicle maintenance and repair in Bihar, India. Innovative staffing ensured an adequate infrastructure for vehicle maintenance in Bangladesh. The use of disaster relief mechanisms in Somalia provided airlifts, vehicles and funding within 27 days of their initiation. In contrast the Expanded Programme on Immunization (EPI) faces more complex logistical challenges.

  2. Fungible weights in logistic regression.

    PubMed

    Jones, Jeff A; Waller, Niels G

    2016-06-01

    In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights.

  3. Logistic Regression Applied to Seismic Discrimination

    SciTech Connect

    BG Amindan; DN Hagedorn

    1998-10-08

    The usefulness of logistic discrimination was examined in an effort to learn how it performs in a regional seismic setting. Logistic discrimination provides an easily understood method, works with user-defined models and few assumptions about the population distributions, and handles both continuous and discrete data. Seismic event measurements from a data set compiled by Los Alamos National Laboratory (LANL) of Chinese events recorded at station WMQ were used in this demonstration study. PNNL applied logistic regression techniques to the data. All possible combinations of the Lg and Pg measurements were tried, and a best-fit logistic model was created. The best combination of Lg and Pg frequencies for predicting the source of a seismic event (earthquake or explosion) used the Lg 3.0-6.0 Hz and Pg 3.0-6.0 Hz measurements as the predictor variables. A cross-validation test was run, which showed that this model was able to correctly predict 99.7% of earthquakes and 98.0% of explosions for this given data set. Two other models were identified that used Pg and Lg measurements from the 1.5 to 3.0 Hz frequency range. Although these other models did a good job of correctly predicting the earthquakes, they were not as effective at predicting the explosions. Two possible biases were discovered which affect the predicted probabilities for each outcome. The first bias was due to this being a case-controlled study. The sampling fractions caused a bias in the probabilities that were calculated using the models. The second bias is caused by a change in the proportions for each event. If at a later date the proportions (a priori probabilities) of explosions versus earthquakes change, this would cause a bias in the predicted probability for an event. When using logistic regression, the user needs to be aware of the possible biases and what effect they will have on the predicted probabilities.
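
    The first bias described, case-control sampling fractions distorting predicted probabilities, has a standard remedy: shift the model's log-odds by the difference between the population and sample log-odds of the positive class (an intercept correction). A minimal sketch of that generic technique (not code from the PNNL study):

```python
import math

def prior_corrected_prob(p_model, sample_prior, population_prior):
    """Correct a case-control logistic model's predicted probability for the
    true (population) proportion of the positive class by shifting log-odds."""
    logit = lambda p: math.log(p / (1.0 - p))
    z = logit(p_model) - logit(sample_prior) + logit(population_prior)
    return 1.0 / (1.0 + math.exp(-z))

# a model trained on 50% explosions outputs 0.8, but explosions are 5% of events
print(round(prior_corrected_prob(0.8, 0.5, 0.05), 4))  # ≈ 0.17
```

    The same correction handles the second bias noted above: if the a priori proportions change later, re-running it with the new population prior recalibrates the predicted probabilities without refitting the model.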

  4. Tailored logistics: the next advantage.

    PubMed

    Fuller, J B; O'Conor, J; Rawlinson, R

    1993-01-01

    How many top executives have ever visited with managers who move materials from the factory to the store? How many still reduce the costs of logistics to the rent of warehouses and the fees charged by common carriers? To judge by hours of senior management attention, logistics problems do not rank high. But logistics have the potential to become the next governing element of strategy. Whether they know it or not, senior managers of every retail store and diversified manufacturing company compete in logistically distinct businesses. Customer needs vary, and companies can tailor their logistics systems to serve their customers better and more profitably. Companies do not create value for customers and sustainable advantage for themselves merely by offering varieties of goods. Rather, they offer goods in distinct ways. A particular can of Coca-Cola, for example, might be a can of Coca-Cola going to a vending machine, or a can of Coca-Cola that comes with billing services. There is a fortune buried in this distinction. The goal of logistics strategy is building distinct approaches to distinct groups of customers. The first step is organizing a cross-functional team to proceed through the following steps: segmenting customers according to purchase criteria, establishing different standards of service for different customer segments, tailoring logistics pipelines to support each segment, and creating economics of scale to determine which assets can be shared among various pipelines. The goal of establishing logistically distinct businesses is familiar: improved knowledge of customers and improved means of satisfying them. PMID:10126157

  6. Performance Based Education. Technology Activity Modules.

    ERIC Educational Resources Information Center

    Custer, Rodney L., Ed.

    These Technology Activity Modules are designed to serve as an implementation resource for technology education teachers as they integrate technology education with Missouri's Academic Performance Standards and provide a source of activities and activity ideas that can be used to integrate and reinforce learning across the curriculum. The modules…

  7. SEU Performance of TAG Based Flip Flops

    NASA Technical Reports Server (NTRS)

    Shuler, Robert L.; Kouba, Coy; O'Neill, Patrick M.

    2005-01-01

    We describe heavy ion test results for two new SEU-tolerant latches based on transition NAND gates, one for single-rail asynchronous and the other for dual-rail synchronous designs, implemented in an AMI 0.5 micron process.

  8. Biomass Supply Logistics and Infrastructure

    NASA Astrophysics Data System (ADS)

    Sokhansanj, Shahabaddine; Hess, J. Richard

    Feedstock supply system encompasses numerous unit operations necessary to move lignocellulosic feedstock from the place where it is produced (in the field or on the stump) to the start of the conversion process (reactor throat) of the biorefinery. These unit operations, which include collection, storage, preprocessing, handling, and transportation, represent one of the largest technical and logistics challenges to the emerging lignocellulosic biorefining industry. This chapter briefly reviews the methods of estimating the quantities of biomass, followed by harvesting and collection processes based on current practices on handling wet and dry forage materials. Storage and queuing are used to deal with seasonal harvest times, variable yields, and delivery schedules. Preprocessing can be as simple as grinding and formatting the biomass for increased bulk density or improved conversion efficiency, or it can be as complex as improving feedstock quality through fractionation, tissue separation, drying, blending, and densification. Handling and transportation consists of using a variety of transport equipment (truck, train, ship) for moving the biomass from one point to another. The chapter also provides typical cost figures for harvest and processing of biomass.

  9. Biomass Supply Logistics and Infrastructure

    SciTech Connect

    Sokhansanj, Shahabaddine

    2009-04-01

    Feedstock supply system encompasses numerous unit operations necessary to move lignocellulosic feedstock from the place where it is produced (in the field or on the stump) to the start of the conversion process (reactor throat) of the Biorefinery. These unit operations, which include collection, storage, preprocessing, handling, and transportation, represent one of the largest technical and logistics challenges to the emerging lignocellulosic biorefining industry. This chapter briefly reviews methods of estimating the quantities of biomass followed by harvesting and collection processes based on current practices on handling wet and dry forage materials. Storage and queuing are used to deal with seasonal harvest times, variable yields, and delivery schedules. Preprocessing can be as simple as grinding and formatting the biomass for increased bulk density or improved conversion efficiency, or it can be as complex as improving feedstock quality through fractionation, tissue separation, drying, blending, and densification. Handling and Transportation consists of using a variety of transport equipment (truck, train, ship) for moving the biomass from one point to another. The chapter also provides typical cost figures for harvest and processing of biomass.

  10. Containment performance perspectives based on IPE results

    SciTech Connect

    Lehner, J.R.; Lin, C.C.; Pratt, W.T.

    1996-12-31

    Perspectives on Containment Performance were obtained from the accident progression analyses, i.e. level 2 PRA analyses, found in the IPE submittals. Insights related to the containment failure modes, the releases associated with those failure modes, and the factors responsible for the types of containment failures and release sizes reported were gathered. The results summarized here are discussed in detail in volumes 1 and 2 of NUREG 1560. 3 refs., 4 figs.

  11. Performance Evaluation of Triangulation Based Range Sensors

    PubMed Central

    Guidi, Gabriele; Russo, Michele; Magrassi, Grazia; Bordegoni, Monica

    2010-01-01

    The performance of 2D digital imaging systems depends on several factors related with both optical and electronic processing. These concepts have originated standards, which have been conceived for photographic equipment and bi-dimensional scanning systems, and which have been aimed at estimating different parameters such as resolution, noise or dynamic range. Conversely, no standard test protocols currently exist for evaluating the corresponding performances of 3D imaging systems such as laser scanners or pattern projection range cameras. This paper is focused on investigating experimental processes for evaluating some critical parameters of 3D equipment, by extending the concepts defined by the ISO standards to the 3D domain. The experimental part of this work concerns the characterization of different range sensors through the extraction of their resolution, accuracy and uncertainty from sets of 3D data acquisitions of specifically designed test objects whose geometrical characteristics are known in advance. The major objective of this contribution is to suggest an easy characterization process for generating a reliable comparison between the performances of different range sensors and to check if a specific piece of equipment is compliant with the expected characteristics. PMID:22163599

  12. SDG&E's performance-based ratemaking

    SciTech Connect

    Scadding, J.

    1995-11-01

    Performance-based ratemaking (PBR) in the electric utility industry is outlined. The following topics are discussed: average cents/kWh for residential customers; PBR throws its shadow before it; the two phases of SDG&E's PBR; elements of base-rates PBR; the price performance benchmark; 'non-price' performance indicators; two-way conditionality; and sharing and off-ramps.

  13. Competency/Performance-Based Student Teaching Program.

    ERIC Educational Resources Information Center

    Simms, Earline M.

    The competency-based, student teaching program of the Southern University (Baton Rouge, Louisiana) College of Education is described. The program basis is a listing of fourteen competencies (teaching skills) which provides a guide for structured and meaningful activities during the observation period, consistency in directing those experiences,…

  14. Performance Appraisal Is Based on Five Major Assumptions.

    ERIC Educational Resources Information Center

    Silver, Harvey A.

    This review of the performance appraisal process discusses the major assumptions on which performance appraisal is based, the general goals of performance appraisal, and the characteristics of effective performance appraisal programs. The author stresses the dependence of the process on the assumption that human behavior can be changed; he…

  15. Structuring a Performance-Based Teacher Education Program in Science

    ERIC Educational Resources Information Center

    Dziuban, Charles D.; Esler, William K.

    1973-01-01

    Discusses three components of a performance based teacher education program. The program objectives are defined in terms of knowledge; performance; consequence; and affective. The selection of conditions, and evaluation methods for each objective are outlined. (PS)

  16. 48 CFR 970.1100-1 - Performance-based contracting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-based contracting concepts and methodologies that may be generally applied to management and operating... methods of accomplishing the work; use measurable (i.e., terms of quality, timeliness, quantity) performance standards and objectives and quality assurance surveillance plans; provide performance...

  17. Performance-Based Thinking and Training for Competence.

    ERIC Educational Resources Information Center

    Rakow, Joel

    1982-01-01

    Discusses five job behavior functions viewed as necessary for practicing performance-based thinking in instructional development activities. Functions examined include the abilities to plan to perform a job, execute a task, monitor or control execution, troubleshoot, and evaluate. (MER)

  18. Demand Analysis of Logistics Information Matching Platform: A Survey from Highway Freight Market in Zhejiang Province

    NASA Astrophysics Data System (ADS)

    Chen, Daqiang; Shen, Xiahong; Tong, Bing; Zhu, Xiaoxiao; Feng, Tao

    With increasing competition in the logistics industry and growing pressure to lower logistics costs, the construction of a logistics information matching platform for highway transportation plays an important role, and the accuracy of the platform design is key to its successful operation. Based on survey results from logistics service providers, customers, and regulatory authorities on access to information, and on an in-depth analysis of the information demands on such a platform, a survey-based framework for a logistics information matching platform for highway transportation in Zhejiang province is provided.

  19. Intelligent retail logistics scheduling

    SciTech Connect

    Rowe, J.; Jewers, K.; Codd, A.; Alcock, A.

    1996-12-31

    The Supply Chain Integrated Ordering Network (SCION) Depot Bookings system automates the planning and scheduling of perishable and non-perishable commodities and the vehicles that carry them into J. Sainsbury depots. This is a strategic initiative, enabling the business to make the key move from weekly to daily ordering. The system is mission critical, managing the inward flow of commodities from suppliers into J. Sainsbury's depots. The system leverages AI techniques to provide a business solution that meets challenging functional and performance needs. The SCION Depot Bookings system is operational, providing schedules for 22 depots across the UK.

  20. Spreadsheet Based Scaling Calculations and Membrane Performance

    SciTech Connect

    Wolfe, T D; Bourcier, W L; Speth, T F

    2000-12-28

    Many membrane element manufacturers provide a computer program to aid buyers in the use of their elements. However, to date there are few examples of fully integrated public domain software available for calculating reverse osmosis and nanofiltration system performance. The Total Flux and Scaling Program (TFSP), written for Excel 97 and above, provides designers and operators new tools to predict membrane system performance, including scaling and fouling parameters, for a wide variety of membrane system configurations and feedwaters. The TFSP development was funded under EPA contract 9C-R193-NTSX. It is freely downloadable at www.reverseosmosis.com/download/TFSP.zip. TFSP includes detailed calculations of reverse osmosis and nanofiltration system performance. Of special significance, the program provides scaling calculations for mineral species not normally addressed in commercial programs, including aluminum, iron, and phosphate species. In addition, ASTM calculations for common species such as calcium sulfate (CaSO4·2H2O), BaSO4, SrSO4, SiO2, and LSI are also provided. Scaling calculations in commercial membrane design programs are normally limited to the common minerals and typically follow basic ASTM methods, which are for the most part graphical approaches adapted to curves. In TFSP, the scaling calculations for the less common minerals use subsets of the USGS PHREEQE and WATEQ4F databases and use the same general calculational approach as PHREEQE and WATEQ4F. The activities of ion complexes are calculated iteratively. Complexes that are unlikely to form in significant concentration were eliminated to simplify the calculations. The calculation provides the distribution of ions and ion complexes that is used to calculate an effective ion product "Q." The effective ion product is then compared to temperature-adjusted solubility products (Ksp's) of solids in order to calculate a Saturation Index (SI) for each solid of
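    The saturation-index comparison described above reduces to simple arithmetic: SI = log10(Q/Ksp). The sketch below illustrates the idea only; the `saturation_index` helper and the gypsum values are hypothetical placeholders, not numbers from TFSP's PHREEQE/WATEQ4F-derived databases.

```python
import math

def saturation_index(ion_product_q, ksp):
    """Saturation Index SI = log10(Q / Ksp).
    SI > 0: supersaturated (scaling likely); SI < 0: undersaturated."""
    return math.log10(ion_product_q / ksp)

# Hypothetical values for illustration only: gypsum (CaSO4·2H2O) with an
# effective ion product twice its solubility product.
ksp_gypsum = 2.5e-5
q_effective = 5.0e-5
si = saturation_index(q_effective, ksp_gypsum)
print(round(si, 3))  # log10(2) ≈ 0.301
```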

  1. Logistics Management: New trends in the Reverse Logistics

    NASA Astrophysics Data System (ADS)

    Antonyová, A.; Antony, P.; Soewito, B.

    2016-04-01

    The present level and quality of the environment depend directly on our access to natural resources, as well as on their sustainability. In particular, production activities and the phenomena associated with them have a direct impact on the future of our planet. The recycling process, which in large enterprises often becomes an important and integral part of the production program, is usually problematic in small and medium-sized enterprises. We can specify a few factors which have a direct impact on the development and successful application of an effective reverse logistics system. Finding a way to an economically acceptable model of reverse logistics, focusing on converting waste materials into renewable energy, is the task in progress.

  2. The Overall Odds Ratio as an Intuitive Effect Size Index for Multiple Logistic Regression: Examination of Further Refinements

    ERIC Educational Resources Information Center

    Le, Huy; Marcus, Justin

    2012-01-01

    This study used Monte Carlo simulation to examine the properties of the overall odds ratio (OOR), which was recently introduced as an index for overall effect size in multiple logistic regression. It was found that the OOR was relatively independent of study base rate and performed better than most commonly used R-square analogs in indexing model…

  3. Logistics background study: underground mining

    SciTech Connect

    Hanslovan, J. J.; Visovsky, R. G.

    1982-02-01

    Logistical functions that are normally associated with US underground coal mining are investigated and analyzed. These functions comprise all activities and services that support the producing sections of the mine. The report provides a better understanding of how these functions impact coal production in terms of time, cost, and safety. Major underground logistics activities are analyzed, including: transportation of personnel, supplies, and equipment; transportation of coal and rock; electrical distribution and communications systems; water handling; hydraulics; and ventilation systems. Recommended areas for future research are identified and prioritized.

  4. Continual Improvement in Shuttle Logistics

    NASA Technical Reports Server (NTRS)

    Flowers, Jean; Schafer, Loraine

    1995-01-01

    It has been said that Continual Improvement (CI) is difficult to apply to service oriented functions, especially in a government agency such as NASA. However, a constrained budget and increasing requirements are a way of life at NASA Kennedy Space Center (KSC), making it a natural environment for the application of CI tools and techniques. This paper describes how KSC, and specifically the Space Shuttle Logistics Project, a key contributor to KSC's mission, has embraced the CI management approach as a means of achieving its strategic goals and objectives. An overview of how the KSC Space Shuttle Logistics Project has structured its CI effort and examples of some of the initiatives are provided.

  5. Image based performance analysis of thermal imagers

    NASA Astrophysics Data System (ADS)

    Wegner, D.; Repasi, E.

    2016-05-01

    Due to advances in technology, modern thermal imagers resemble sophisticated image processing systems in functionality. Advanced signal and image processing tools enclosed in the camera body extend the basic image capturing capability of thermal cameras. This is done in order to enhance the display presentation of the captured scene or specific scene details. Usually, the implemented methods are proprietary company expertise, distributed without extensive documentation. This makes the comparison of thermal imagers, especially from different companies, a difficult task (or at least a very time consuming/expensive task - e.g. requiring the execution of a field trial and/or an observer trial). For example, a thermal camera equipped with turbulence mitigation capability stands for such a closed system. The Fraunhofer IOSB has started to build up a system for testing thermal imagers by image-based methods in the lab environment. This will extend our capability of measuring the classical IR-system parameters (e.g. MTF, MTDP, etc.) in the lab. The system is set up around the IR-scene projector, which is necessary for the thermal display (projection) of an image sequence for the IR-camera under test. The same set of thermal test sequences might be presented to every unit under test. For turbulence mitigation tests, this could be e.g. the same turbulence sequence. During system tests, gradual variation of input parameters (e.g. thermal contrast) can be applied. First ideas on test-scene selection and on how to assemble an imaging suite (a set of image sequences) for the analysis of imaging thermal systems containing such black boxes in the image forming path are discussed.

  6. Large Unbalanced Credit Scoring Using Lasso-Logistic Regression Ensemble

    PubMed Central

    Wang, Hong; Xu, Qingsong; Zhou, Lifeng

    2015-01-01

    Recently, various ensemble learning methods with different base classifiers have been proposed for credit scoring problems. However, for various reasons, there has been little research using logistic regression as the base classifier. In this paper, given large unbalanced data, we consider the plausibility of ensemble learning using regularized logistic regression as the base classifier to deal with credit scoring problems. In this research, the data is first balanced and diversified by clustering and bagging algorithms. Then we apply a Lasso-logistic regression learning ensemble to evaluate the credit risks. We show that the proposed algorithm outperforms popular credit scoring models such as decision tree, Lasso-logistic regression and random forests in terms of AUC and F-measure. We also provide two importance measures for the proposed model to identify important variables in the data. PMID:25706988

  7. Large unbalanced credit scoring using Lasso-logistic regression ensemble.

    PubMed

    Wang, Hong; Xu, Qingsong; Zhou, Lifeng

    2015-01-01

    Recently, various ensemble learning methods with different base classifiers have been proposed for credit scoring problems. However, for various reasons, there has been little research using logistic regression as the base classifier. In this paper, given large unbalanced data, we consider the plausibility of ensemble learning using regularized logistic regression as the base classifier to deal with credit scoring problems. In this research, the data is first balanced and diversified by clustering and bagging algorithms. Then we apply a Lasso-logistic regression learning ensemble to evaluate the credit risks. We show that the proposed algorithm outperforms popular credit scoring models such as decision tree, Lasso-logistic regression and random forests in terms of AUC and F-measure. We also provide two importance measures for the proposed model to identify important variables in the data.
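    For intuition, the base learner above, L1-penalized ("Lasso") logistic regression, can be fit by proximal gradient descent. This is a minimal sketch of the penalty mechanics only, not the authors' clustering-and-bagging ensemble; the `lasso_logistic` function, its step size, and the toy data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_logistic(X, y, lam=0.1, lr=0.1, iters=2000):
    """L1-penalized logistic regression fit by proximal gradient
    descent (ISTA). X: (n, p) array, y in {0, 1}. Returns weights."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(iters):
        prob = 1.0 / (1.0 + np.exp(-(X @ w)))    # predicted probabilities
        grad = X.T @ (prob - y) / n              # gradient of mean log-loss
        w = soft_threshold(w - lr * grad, lr * lam)  # gradient + L1 prox step
    return w

# Toy data: only the first of five features is informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(float)
w = lasso_logistic(X, y, lam=0.05)
```

The L1 penalty drives the weights of the four noise features toward exactly zero while keeping a large weight on the informative one, which is the variable-selection behavior the ensemble exploits.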

  8. Mathematical study of trade-off relations in logistics systems

    NASA Astrophysics Data System (ADS)

    Nanazawa, Youhei; Suito, Hiroshi; Kawarada, Hideo

    2009-10-01

    This paper presents a mathematical model of trade-off relations arising in third-party logistics, using Pareto optimal solutions for multi-objective optimization problems. The model defines an optimal set of distribution costs and service levels constituting a trade-off relation. An analogy to the concept of the indifference curve in the field of economics is discussed. Numerical experiments for a simplified problem are performed, demonstrating how the utility of logistics increases.

  9. Comparison of a Bayesian network with a logistic regression model to forecast IgA nephropathy.

    PubMed

    Ducher, Michel; Kalbacher, Emilie; Combarnous, François; Finaz de Vilaine, Jérome; McGregor, Brigitte; Fouque, Denis; Fauvel, Jean Pierre

    2013-01-01

    Models are increasingly used in clinical practice to improve the accuracy of diagnosis. The aim of our work was to compare a Bayesian network to logistic regression to forecast IgA nephropathy (IgAN) from simple clinical and biological criteria. Retrospectively, we pooled the results of all biopsies (n = 155) performed by nephrologists in a specialist clinical facility between 2002 and 2009. Two groups were constituted at random. The first subgroup was used to determine the parameters of the models adjusted to data by logistic regression or Bayesian network, and the second was used to compare the performances of the models using receiver operating characteristics (ROC) curves. IgAN was found (on pathology) in 44 patients. Areas under the ROC curves provided by both methods were highly significant but not different from each other. Based on the highest Youden indices, sensitivity reached 100% versus 67%, and specificity 73% versus 95%, for the Bayesian network and logistic regression, respectively. A Bayesian network is at least as efficient as logistic regression to estimate the probability of a patient suffering IgAN, using simple clinical and biological data obtained during consultation.
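    The operating points above are chosen by Youden's index, J = sensitivity + specificity - 1, which picks the ROC threshold that best balances true-positive and false-positive rates. A minimal sketch using the sensitivity/specificity pairs reported in the abstract:

```python
def youden_index(sensitivity, specificity):
    """Youden's J = sensitivity + specificity - 1; the cut-off with the
    highest J is the reported operating point."""
    return sensitivity + specificity - 1.0

# Operating points reported in the abstract:
j_bayes = youden_index(1.00, 0.73)     # Bayesian network
j_logistic = youden_index(0.67, 0.95)  # logistic regression
print(round(j_bayes, 2), round(j_logistic, 2))  # 0.73 0.62
```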

  10. Comparison of a Bayesian Network with a Logistic Regression Model to Forecast IgA Nephropathy

    PubMed Central

    Ducher, Michel; Kalbacher, Emilie; Combarnous, François; Finaz de Vilaine, Jérome; McGregor, Brigitte; Fouque, Denis; Fauvel, Jean Pierre

    2013-01-01

    Models are increasingly used in clinical practice to improve the accuracy of diagnosis. The aim of our work was to compare a Bayesian network to logistic regression to forecast IgA nephropathy (IgAN) from simple clinical and biological criteria. Retrospectively, we pooled the results of all biopsies (n = 155) performed by nephrologists in a specialist clinical facility between 2002 and 2009. Two groups were constituted at random. The first subgroup was used to determine the parameters of the models adjusted to data by logistic regression or Bayesian network, and the second was used to compare the performances of the models using receiver operating characteristics (ROC) curves. IgAN was found (on pathology) in 44 patients. Areas under the ROC curves provided by both methods were highly significant but not different from each other. Based on the highest Youden indices, sensitivity reached 100% versus 67%, and specificity 73% versus 95%, for the Bayesian network and logistic regression, respectively. A Bayesian network is at least as efficient as logistic regression to estimate the probability of a patient suffering IgAN, using simple clinical and biological data obtained during consultation. PMID:24328031

  11. The cross-validated AUC for MCP-logistic regression with high-dimensional data.

    PubMed

    Jiang, Dingfeng; Huang, Jian; Zhang, Ying

    2013-10-01

    We propose a cross-validated area under the receiver operating characteristic (ROC) curve (CV-AUC) criterion for tuning parameter selection for penalized methods in sparse, high-dimensional logistic regression models. We use this criterion in combination with the minimax concave penalty (MCP) method for variable selection. The CV-AUC criterion is specifically designed for optimizing the classification performance for binary outcome data. To implement the proposed approach, we derive an efficient coordinate descent algorithm to compute the MCP-logistic regression solution surface. Simulation studies are conducted to evaluate the finite sample performance of the proposed method and its comparison with the existing methods including the Akaike information criterion (AIC), Bayesian information criterion (BIC) or Extended BIC (EBIC). The model selected based on the CV-AUC criterion tends to have a larger predictive AUC and smaller classification error than those with tuning parameters selected using the AIC, BIC or EBIC. We illustrate the application of the MCP-logistic regression with the CV-AUC criterion on three microarray datasets from the studies that attempt to identify genes related to cancers. Our simulation studies and data examples demonstrate that the CV-AUC is an attractive method for tuning parameter selection for penalized methods in high-dimensional logistic regression models.
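    The AUC underlying the CV-AUC criterion equals the probability that a randomly chosen positive case is scored above a randomly chosen negative case (the normalized Mann-Whitney U statistic). A plain-Python sketch of that computation, not the authors' coordinate-descent MCP implementation; the toy labels and scores are made up:

```python
def auc_score(labels, scores):
    """AUC as the fraction of (positive, negative) pairs ranked
    correctly, counting ties as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.2]
print(auc_score(labels, scores))  # 8 of 9 pairs correct ≈ 0.889
```

The CV-AUC criterion averages this quantity over held-out folds for each candidate tuning parameter and keeps the parameter with the largest average.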

  12. A technique for determining viable military logistics support alternatives

    NASA Astrophysics Data System (ADS)

    Hester, Jesse Stuart

    so, a justifiable course of action (COA) can be determined based on a variety of quantitative and qualitative information available. This thesis describes and applies the ATLAS method to a notional military scenario that involves the Navy concept of Seabasing and the Marine Corps concept of Distributed Operations applied to a platoon-sized element. The small force is tasked to conduct deterrence and combat operations over a seven-day period. This work uses modeling and simulation to incorporate expert opinion and knowledge of military operations, dynamic reasoning methods, and certainty analysis to create a decision support system (DSS) that provides the DM with an enhanced view of the logistics environment, using variables that impact specific measures of effectiveness. The results from applying the ATLAS method give the DM a better understanding of the logistics environment and the ability to conduct logistics planning and execution more efficiently and quickly. This is accomplished by providing relevant data that can be applied to perform dynamic forecasting activities for the platoon and that aids in determining the necessary support architecture to fulfill the forecasted need.

  13. Performance-Based Staff Development: The Cost-Effective Alternative.

    ERIC Educational Resources Information Center

    Boyer, Catherine M.

    1981-01-01

    Describes how to use the performance-based concept in developing staff. Discusses the identification of objectives based on performance expectations and the development of learning experiences that (1) emphasize application of knowledge; (2) integrate adult learning principles; and (3) make use of learning contracts, self-learning packages,…

  14. Team Primacy Concept (TPC) Based Employee Evaluation and Job Performance

    ERIC Educational Resources Information Center

    Muniute, Eivina I.; Alfred, Mary V.

    2007-01-01

    This qualitative study explored how employees learn from Team Primacy Concept (TPC) based employee evaluation and how they use the feedback in performing their jobs. TPC based evaluation is a form of multirater evaluation, during which the employee's performance is discussed by one's peers in a face-to-face team setting. The study used Kolb's…

  15. Standards of Performance for Community Based Educational Institutions: Quick Check of Institutional Performance.

    ERIC Educational Resources Information Center

    Association of Community Based Education, Washington, DC.

    Designed for use with "Standards of Performance for Community Based Educational Institutions" and a "Self-Assessment Workbook," this checklist helps community based educational institutions in identifying areas of performance which need improvement or further study and in assessing the overall effectiveness of the institution in carrying out its…

  16. Competency-Based Performance Appraisals: Improving Performance Evaluations of School Nutrition Managers and Assistants/Technicians

    ERIC Educational Resources Information Center

    Cross, Evelina W.; Asperin, Amelia Estepa; Nettles, Mary Frances

    2009-01-01

    Purpose: The purpose of the research was to develop a competency-based performance appraisal resource for evaluating school nutrition (SN) managers and assistants/technicians. Methods: A two-phased process was used to develop the competency-based performance appraisal resource for SN managers and assistants/technicians. In Phase I, draft…

  17. DISPLA: decision information system for procurement and logistics analysis

    NASA Astrophysics Data System (ADS)

    Calvo, Alberto B.; Danish, Alexander J.; Lamonakis, Gregory G.

    2002-08-01

    This paper describes an information-exchange system for display-system acquisition and logistics support. DISPLA (Decision Information System for Procurement and Logistics Analysis) is an Internet-based system concept for bringing sellers (display system and component suppliers) and buyers (Government Program Offices and System Integrators) together in an electronic exchange to improve the acquisition and logistics analysis support of flat panel displays for the military. A proof-of-concept demonstration is presented in this paper using sample data from vendor Web sites and Government data sources.

  18. Logistics support of space facilities

    NASA Technical Reports Server (NTRS)

    Lewis, William C.

    1988-01-01

    The logistic support of space facilities is described, with special attention given to the problem of sizing the inventory of ready spares kept at the space facility. Where possible, data from the Space Shuttle Orbiter is extrapolated to provide numerical estimates for space facilities. Attention is also given to repair effort estimation and long duration missions.
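    The spares-sizing problem mentioned above is classically approached by stocking against Poisson-distributed demand over the resupply interval. The sketch below is a textbook illustration under that assumption, not the paper's actual extrapolation from Space Shuttle Orbiter data; the demand rate and fill-rate target are hypothetical.

```python
import math

def spares_level(expected_demands, target=0.95):
    """Smallest stock level s with P(Poisson(lam) <= s) >= target:
    a textbook way to size ready spares against demand during the
    resupply interval. Builds the Poisson CDF term by term."""
    s = 0
    term = math.exp(-expected_demands)  # P(D = 0)
    cdf = term
    while cdf < target:
        s += 1
        term *= expected_demands / s    # P(D = s) from P(D = s - 1)
        cdf += term
    return s

# Hypothetical part: 2 expected failures per resupply interval,
# 95% probability of no stockout.
print(spares_level(2.0, 0.95))  # → 5
```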

  19. NASA Space Rocket Logistics Challenges

    NASA Technical Reports Server (NTRS)

    Bramon, Chris; Neeley, James R.; Jones, James V.; Watson, Michael D.; Inman, Sharon K.; Tuttle, Loraine

    2014-01-01

    The Space Launch System (SLS) is the new NASA heavy lift launch vehicle in development and is scheduled for its first mission in 2017. SLS has many of the same logistics challenges as any other large scale program. However, SLS also faces unique challenges. This presentation will address the SLS challenges, along with the analysis and decisions to mitigate the threats posed by each.

  20. Cognitively automated assembly processes: a simulation based evaluation of performance.

    PubMed

    Mayer, Marcel Ph; Odenthal, Barbara; Faber, Marco; Schlick, Christopher M

    2012-01-01

    The numerical control of an experimental assembly cell with two robots--termed a cognitive control unit (CCU)--is able to simulate human information processing at a rule-based level of cognitive control. To enable the CCU to work on a large range of assembly tasks expected of a human operator, the cognitive architecture SOAR is used. The CCU can plan assembly processes autonomously and react to ad-hoc changes in assembly sequences effectively. Extensive simulation studies have shown that cognitive automation based on SOAR is especially suitable for random parts supply, which reduces planning effort in logistics. Conversely, a disproportional increase in processing time was observed for deterministic parts supply, especially for assemblies containing large numbers of identical parts. In this contribution, the effect of phase-shifts in deterministic part supply is investigated for assemblies containing maximal different parts. It can be shown that the concept of cognitive automation is as well suitable for these planning problems. PMID:22317246

  1. New logistics protocols for distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Taylor, Darrin; Morrison, John; Katz, Warren; Felton, Erik; Herman, Deborah A.

    1995-06-01

    In today's environment, the transportation and maintenance of military forces is nearly as important as combat operations. Rapid deployment to regions of low-intensity conflict will become a very common training scenario for the U.S. military. Thus it is desirable to apply distributed simulation technology to train logistics personnel in their combat support roles. Currently, distributed interactive simulation (DIS) only contains rudimentary logistics protocols. This paper introduces new protocols designed to handle the logistics problem. The Newtonian protocol takes a physics-based approach to modeling interactions on the simulation network. This protocol consists of a family of protocol data units (PDUs) which are used to communicate forces in different circumstances. The protocol implements a small set of physical relations. This represents a flexible and general mechanism to describe battlefield interactions between network entities. The migratory object protocol (MOP) family addresses the transfer of control. General mechanisms provide the means to simulate resupply, repair, and maintenance of entities at any level of abstraction (individual soldier to division). It can also increase the fidelity of mine laying, enable handover of weapons for terminal guidance, allow for the distribution of aggregate-level simulation entities, provide capabilities for the simulation of personnel, etc.

  2. Estimating Contraceptive Prevalence Using Logistics Data for Short-Acting Methods: Analysis Across 30 Countries

    PubMed Central

    Cunningham, Marc; Brown, Niquelle; Sacher, Suzy; Hatch, Benjamin; Inglis, Andrew; Aronovich, Dana

    2015-01-01

    Background: Contraceptive prevalence rate (CPR) is a vital indicator used by country governments, international donors, and other stakeholders for measuring progress in family planning programs against country targets and global initiatives as well as for estimating health outcomes. Because of the need for more frequent CPR estimates than population-based surveys currently provide, alternative approaches for estimating CPRs are being explored, including using contraceptive logistics data. Methods: Using data from the Demographic and Health Surveys (DHS) in 30 countries, population data from the United States Census Bureau International Database, and logistics data from the Procurement Planning and Monitoring Report (PPMR) and the Pipeline Monitoring and Procurement Planning System (PipeLine), we developed and evaluated 3 models to generate country-level, public-sector contraceptive prevalence estimates for injectable contraceptives, oral contraceptives, and male condoms. Models included: direct estimation through existing couple-years of protection (CYP) conversion factors, bivariate linear regression, and multivariate linear regression. Model evaluation consisted of comparing the referent DHS prevalence rates for each short-acting method with the model-generated prevalence rate using multiple metrics, including mean absolute error and proportion of countries where the modeled prevalence rate for each method was within 1, 2, or 5 percentage points of the DHS referent value. Results: For the methods studied, family planning use estimates from public-sector logistics data were correlated with those from the DHS, validating the quality and accuracy of current public-sector logistics data. Logistics data for oral and injectable contraceptives were significantly associated (P<.05) with the referent DHS values for both bivariate and multivariate models. For condoms, however, that association was only significant for the bivariate model. With the exception of the CYP-based
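    The direct-estimation model above converts distributed commodity quantities to couple-years of protection (CYP) and divides by the population of women of reproductive age (WRA). The sketch uses commonly cited USAID-style CYP factors; both the factors and the country numbers are illustrative assumptions, not the study's fitted models.

```python
# Commonly cited USAID-style CYP conversion factors (illustrative).
CYP_FACTORS = {
    "condom": 120,      # condoms per couple-year of protection
    "oral": 15,         # pill cycles per CYP
    "injectable": 4,    # 3-month doses per CYP
}

def cyp_prevalence(units_distributed, method, women_reproductive_age):
    """Public-sector prevalence proxy: CYPs generated by distributed
    commodities, expressed as a share of women of reproductive age."""
    cyp = units_distributed / CYP_FACTORS[method]
    return cyp / women_reproductive_age

# Hypothetical country: 1.2 million injectable doses distributed,
# 5 million women of reproductive age.
print(round(cyp_prevalence(1_200_000, "injectable", 5_000_000), 3))  # 0.06
```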

  3. Computing measures of explained variation for logistic regression models.

    PubMed

    Mittlböck, M; Schemper, M

    1999-01-01

    The proportion of explained variation (R2) is frequently used in the general linear model but in logistic regression no standard definition of R2 exists. We present a SAS macro which calculates two R2-measures based on Pearson and on deviance residuals for logistic regression. Also, adjusted versions for both measures are given, which should prevent the inflation of R2 in small samples. PMID:10195643
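    A deviance-based R² of the kind the macro computes can be written as one minus the ratio of model deviance to null deviance. A plain-Python sketch, not the SAS macro itself; the fitted probabilities below are made up:

```python
import math

def deviance_r2(y, p_hat):
    """Deviance-based R² for logistic regression:
    R² = 1 - D_model / D_null, where D = -2 * log-likelihood and the
    null model predicts the observed base rate for every case."""
    eps = 1e-12  # guard against log(0)
    p_null = sum(y) / len(y)
    d_model = -2 * sum(
        yi * math.log(max(pi, eps)) + (1 - yi) * math.log(max(1 - pi, eps))
        for yi, pi in zip(y, p_hat))
    d_null = -2 * sum(
        yi * math.log(p_null) + (1 - yi) * math.log(1 - p_null) for yi in y)
    return 1 - d_model / d_null

y = [1, 1, 1, 0, 0]
p_hat = [0.9, 0.8, 0.7, 0.2, 0.1]  # hypothetical fitted probabilities
r2 = deviance_r2(y, p_hat)
print(round(r2, 3))
```

The adjusted versions mentioned in the abstract additionally penalize the number of fitted parameters to curb R² inflation in small samples.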

  4. Generalized Fiducial Inference for Binary Logistic Item Response Models.

    PubMed

    Liu, Yang; Hannig, Jan

    2016-06-01

    Generalized fiducial inference (GFI) has been proposed as an alternative to likelihood-based and Bayesian inference in mainstream statistics. Confidence intervals (CIs) can be constructed from a fiducial distribution on the parameter space in a fashion similar to those used with a Bayesian posterior distribution. However, no prior distribution needs to be specified, which renders GFI more suitable when no a priori information about model parameters is available. In the current paper, we apply GFI to a family of binary logistic item response theory models, which includes the two-parameter logistic (2PL), bifactor and exploratory item factor models as special cases. Asymptotic properties of the resulting fiducial distribution are discussed. Random draws from the fiducial distribution can be obtained by the proposed Markov chain Monte Carlo sampling algorithm. We investigate the finite-sample performance of our fiducial percentile CI and two commonly used Wald-type CIs associated with maximum likelihood (ML) estimation via Monte Carlo simulation. The use of GFI in high-dimensional exploratory item factor analysis was illustrated by the analysis of a set of the Eysenck Personality Questionnaire data. PMID:26769340
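    Random draws from a fiducial distribution are obtained by MCMC. As a generic illustration of that sampling machinery only, here is a random-walk Metropolis sampler; the proposal scale and the standard-normal target are toy assumptions, not the paper's item-response fiducial density.

```python
import math
import random

def metropolis(logpdf, x0, steps=20000, scale=1.0, seed=1):
    """Random-walk Metropolis: propose x + N(0, scale), accept with
    probability min(1, pdf(prop) / pdf(x)). Returns the chain."""
    rng = random.Random(seed)
    x, lp = x0, logpdf(x0)
    chain = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = logpdf(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject step
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Toy target: standard normal, log-density -x^2/2 up to a constant.
draws = metropolis(lambda v: -0.5 * v * v, x0=0.0)
mean = sum(draws) / len(draws)
```

In GFI the same machinery targets the fiducial density of the item parameters, and percentile CIs are read off the resulting draws.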

  5. [The quality control based on the predictable performance].

    PubMed

    Zheng, D X

    2016-09-01

    In the conventional prosthodontic procedure, the clinical performance can only be evaluated at the last step, which often causes failure because of the mismatch between the expectation and the final performance. To address this situation, quality control based on predictable performance has been suggested. It is a new idea based on reverse thinking: it focuses on the needs of the patient and puts the final performance of the prosthesis in first place. With a prosthodontically driven procedure, dentists can bring the final performance into agreement with the expectation. PMID:27596338

  6. Logistics, electronic commerce, and the environment

    NASA Astrophysics Data System (ADS)

    Sarkis, Joseph; Meade, Laura; Talluri, Srinivas

    2002-02-01

    Organizations realize that a strong supporting logistics or electronic logistics (e-logistics) function is important from both commercial and consumer perspectives. The implications of e-logistics models and practices cover the forward and reverse logistics functions of organizations. They also have direct and profound impact on the natural environment. This paper will focus on a discussion of forward and reverse e-logistics and their relationship to the natural environment. After discussion of the many pertinent issues in these areas, directions of practice and implications for study and research are then described.

  7. Application of a Σpolycyclic aromatic hydrocarbon model and a logistic regression model to sediment toxicity data based on a species-specific, water-only LC50 toxic unit for Hyalella azteca.

    PubMed

    Lee, J H; Landrum, P F; Field, L J; Koh, C H

    2001-09-01

    Two models, a Σpolycyclic aromatic hydrocarbon (ΣPAH) model based on equilibrium partitioning theory and a logistic-regression model, were developed and evaluated to predict sediment-associated PAH toxicity to Hyalella azteca. This study is the first attempt to apply a ΣPAH model based on water-only, median lethal concentration (LC50) toxic unit (TU) values for sediment-associated PAH mixtures to freshwater sediments. To predict the toxicity (i.e., mortality) of contaminated sediments to H. azteca, an interstitial water TU was used, calculated as the ambient interstitial water concentration divided by the water-only LC50, with the interstitial water concentrations predicted by equilibrium partitioning theory. Assuming additive toxicity for PAH, the sum of TUs was calculated to predict the total toxicity of PAH mixtures in sediments. The ΣPAH model was developed from 10- and 14-d H. azteca water-only LC50 values. To obtain estimates of LC50 values for a wide range of PAHs, a quantitative structure-activity relationship (QSAR) model (log LC50 vs log Kow) with a constant slope was derived using the time-variable LC50 values for four PAH congeners. The logistic-regression model was derived to assess the concentration-response relationship for field sediments, which showed that 1.3 (0.6-3.9) TU were required for a 50% probability that a sediment was toxic. The logistic-regression model reflects both the effects of co-occurring contaminants (i.e., nonmeasured PAH and unknown pollutants) and the overestimation of exposure to sediment-associated PAH. An apparent site-specific bioavailability limitation of sediment-associated PAH was found for a site contaminated by creosote. At this site, no toxic samples were found below 3.9 TU.
Finally, the predictability of the ΣPAH model can be affected by species-specific responses (Hyalella vs Rhepoxynius) and chemical-specific responses (PAH vs DDT in
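
    The toxic-unit bookkeeping described above can be sketched as follows; the QSAR slope and intercept, the Koc ≈ Kow assumption, and the congener values are illustrative placeholders, not numbers from the paper.

    ```python
    # Hypothetical sketch of a SigmaPAH toxic-unit calculation.
    # All constants below are placeholders, not values from the study.

    def lc50_from_kow(log_kow, slope=-0.94, intercept=2.2):
        """QSAR with constant slope: log LC50 = intercept + slope * log Kow.
        Returns a water-only LC50 (arbitrary concentration units)."""
        return 10 ** (intercept + slope * log_kow)

    def interstitial_conc(c_sed_oc, log_kow):
        """Equilibrium partitioning: C_iw = C_sed,oc / Koc, assuming Koc ~ Kow."""
        koc = 10 ** log_kow
        return c_sed_oc / koc

    def sum_toxic_units(pahs):
        """Sum TU_i = C_iw,i / LC50_i over congeners (additive toxicity)."""
        return sum(interstitial_conc(c, k) / lc50_from_kow(k) for c, k in pahs)

    # (organic-carbon-normalised sediment concentration, log Kow) per congener
    sample = [(5000.0, 4.5), (12000.0, 5.2), (800.0, 6.0)]
    print(round(sum_toxic_units(sample), 3))
    ```

    A sediment sample would then be flagged as likely toxic when the summed TU exceeds the regression threshold (1.3 TU in the study above).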

  8. Design of a Performance-Adaptive PID Control System Based on Modeling Performance Assessment

    NASA Astrophysics Data System (ADS)

    Yamamoto, Toru

    In industrial processes, as represented by petroleum refining, it is necessary to establish a performance-driven control strategy in order to improve productivity: the control performance is first evaluated, and the controller is then redesigned. This paper describes a design scheme for performance-adaptive PID controllers based on the above control mechanism. In the proposed scheme, system identification is invoked according to the result of a modeling performance assessment, and the PID parameters are computed using the newly estimated system parameters. In calculating the PID parameters, the desired control performance is taken into account. The behaviour of the proposed control scheme is numerically examined in several simulation examples.
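
    A minimal sketch of such a loop, assuming a first-order plant identified by least squares and IMC-style PI tuning; the paper's actual assessment criterion and PID design rule are not reproduced here.

    ```python
    import numpy as np

    def identify_first_order(u, y):
        """Least-squares fit of the model y[k+1] = a*y[k] + b*u[k]."""
        X = np.column_stack([y[:-1], u])
        a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
        return a, b

    def pi_gains(a, b, dt=1.0, lam=5.0):
        """IMC-style PI tuning from the identified discrete model:
        steady-state gain K = b/(1-a), time constant tau = -dt/ln(a)."""
        K = b / (1.0 - a)
        tau = -dt / np.log(a)
        kp = tau / (K * lam)   # lam sets the desired closed-loop speed
        return kp, kp / tau    # (proportional gain, integral gain)

    # Simulate a plant, identify it from I/O data, and compute the gains.
    # A real performance-adaptive scheme would re-run this only when the
    # one-step-ahead modeling error exceeds a threshold.
    rng = np.random.default_rng(0)
    a_true, b_true = 0.9, 0.5
    u = rng.standard_normal(200)
    y = np.zeros(201)
    for k in range(200):
        y[k + 1] = a_true * y[k] + b_true * u[k]
    a_hat, b_hat = identify_first_order(u, y)
    kp, ki = pi_gains(a_hat, b_hat)
    print(round(a_hat, 3), round(b_hat, 3))  # → 0.9 0.5
    ```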

  9. Personal, Social, and Game-Related Correlates of Active and Non-Active Gaming Among Dutch Gaming Adolescents: Survey-Based Multivariable, Multilevel Logistic Regression Analyses

    PubMed Central

    de Vet, Emely; Chinapaw, Mai JM; de Boer, Michiel; Seidell, Jacob C; Brug, Johannes

    2014-01-01

    Background Playing video games contributes substantially to sedentary behavior in youth. A new generation of video games—active games—seems to be a promising alternative to sedentary games to promote physical activity and reduce sedentary behavior. At this time, little is known about correlates of active and non-active gaming among adolescents. Objective The objective of this study was to examine potential personal, social, and game-related correlates of both active and non-active gaming in adolescents. Methods A survey assessing game behavior and potential personal, social, and game-related correlates was conducted among adolescents (12-16 years, N=353) recruited via schools. Multivariable, multilevel logistic regression analyses, adjusted for demographics (age, sex, and educational level of adolescents), were conducted to examine personal, social, and game-related correlates of active gaming ≥1 hour per week (h/wk) and non-active gaming >7 h/wk. Results Active gaming ≥1 h/wk was significantly associated with a more positive attitude toward active gaming (OR 5.3, CI 2.4-11.8; P<.001), a less positive attitude toward non-active games (OR 0.30, CI 0.1-0.6; P=.002), a higher score on habit strength regarding gaming (OR 1.9, CI 1.2-3.2; P=.008), having brothers/sisters (OR 6.7, CI 2.6-17.1; P<.001) and friends (OR 3.4, CI 1.4-8.4; P=.009) who spend more time on active gaming, and a slightly lower score on game engagement (OR 0.95, CI 0.91-0.997; P=.04). Non-active gaming >7 h/wk was significantly associated with a more positive attitude toward non-active gaming (OR 2.6, CI 1.1-6.3; P=.035), a stronger habit regarding gaming (OR 3.0, CI 1.7-5.3; P<.001), having friends who spend more time on non-active gaming (OR 3.3, CI 1.46-7.53; P=.004), and a more positive image of a non-active gamer (OR 2, CI 1.07-3.75; P=.03). Conclusions Various factors were significantly associated with active gaming ≥1 h/wk and non-active gaming >7 h/wk. Active gaming is most
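
    For readers unfamiliar with the reporting format above: each odds ratio and 95% CI comes from a fitted logistic coefficient via OR = exp(β) and CI = exp(β ± 1.96·SE). A small sketch with made-up β and SE values (not the study's estimates):

    ```python
    import math

    def odds_ratio_ci(beta, se, z=1.96):
        """Odds ratio and Wald 95% CI from a logistic regression coefficient."""
        return (math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se))

    # Illustrative coefficient and standard error, chosen arbitrarily:
    or_, lo, hi = odds_ratio_ci(beta=1.667, se=0.406)
    print(round(or_, 1), round(lo, 1), round(hi, 1))  # → 5.3 2.4 11.7
    ```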

  10. Logistic systems with linear feedback

    NASA Astrophysics Data System (ADS)

    Son, Leonid; Shulgin, Dmitry; Ogluzdina, Olga

    2016-08-01

    A wide variety of systems may be described by a specific dependence between an internal characteristic and an external parameter that is known as the logistic curve, or S-curve. Linear feedback between these two quantities may likewise be posited for a wide class of systems. In the present paper, we describe bifurcation behavior for systems with both features and discuss it for two cases: the Ising magnet in an external field, and the development of a manufacturing enterprise.
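
    The mean-field Ising magnet gives a concrete, easily simulated instance of this mechanism: the S-curve is m = tanh(h + J·m), with J the linear-feedback strength (temperature absorbed into the units). A sketch of the resulting bifurcation:

    ```python
    import math

    def fixed_point(J, h, m0, iters=500):
        """Iterate the self-consistency relation m = tanh(h + J*m)."""
        m = m0
        for _ in range(iters):
            m = math.tanh(h + J * m)
        return m

    # Weak feedback (J < 1): a single solution regardless of the start.
    weak = {round(fixed_point(0.5, 0.0, m0), 6) for m0 in (-0.9, 0.9)}
    # Strong feedback (J > 1): two symmetric stable solutions (bifurcation).
    strong = {round(fixed_point(2.0, 0.0, m0), 6) for m0 in (-0.9, 0.9)}
    print(len(weak), len(strong))  # → 1 2
    ```

    Above the critical feedback strength the system keeps a memory of its initial state, which is the bifurcation behavior the abstract refers to.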

  11. Sparse Multinomial Logistic Regression via Approximate Message Passing

    NASA Astrophysics Data System (ADS)

    Byrne, Evan; Schniter, Philip

    2016-11-01

    For the problem of multi-class linear classification and feature selection, we propose approximate message passing approaches to sparse multinomial logistic regression (MLR). First, we propose two algorithms based on the Hybrid Generalized Approximate Message Passing (HyGAMP) framework: one finds the maximum a posteriori (MAP) linear classifier and the other finds an approximation of the test-error-rate minimizing linear classifier. Then we design computationally simplified variants of these two algorithms. Next, we detail methods to tune the hyperparameters of their assumed statistical models using Stein's unbiased risk estimate (SURE) and expectation-maximization (EM), respectively. Finally, using both synthetic and real-world datasets, we demonstrate improved error-rate and runtime performance relative to existing state-of-the-art approaches to sparse MLR.
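
    The HyGAMP algorithms themselves are beyond a short excerpt, but the problem setup can be made concrete with a plain proximal-gradient (ISTA) baseline for L1-regularized sparse MLR; all dimensions and hyperparameters below are illustrative, and this is not the paper's method.

    ```python
    import numpy as np

    def softmax(Z):
        Z = Z - Z.max(axis=1, keepdims=True)
        E = np.exp(Z)
        return E / E.sum(axis=1, keepdims=True)

    def sparse_mlr(X, y, n_classes, lam=0.05, step=0.1, iters=300):
        """L1-regularized multinomial logistic regression via ISTA."""
        n, d = X.shape
        Y = np.eye(n_classes)[y]                      # one-hot labels
        W = np.zeros((d, n_classes))
        for _ in range(iters):
            G = X.T @ (softmax(X @ W) - Y) / n        # gradient of the NLL
            W = W - step * G
            W = np.sign(W) * np.maximum(np.abs(W) - step * lam, 0.0)  # prox
        return W

    # Synthetic data: only the first 3 of 20 features are informative.
    rng = np.random.default_rng(1)
    n, d, k = 300, 20, 3
    W_true = np.zeros((d, k))
    W_true[:3] = rng.standard_normal((3, k)) * 3
    X = rng.standard_normal((n, d))
    y = np.argmax(X @ W_true + 0.1 * rng.standard_normal((n, k)), axis=1)

    W = sparse_mlr(X, y, k)
    acc = (np.argmax(X @ W, axis=1) == y).mean()
    print(round(acc, 2))
    ```

    The soft-thresholding step drives the weights of the uninformative features toward zero, which is the feature-selection behavior the paper targets with far more efficient machinery.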

  12. Space Station logistic support by Aries

    NASA Astrophysics Data System (ADS)

    Cougnet, C.; Groepper, P.

    1987-10-01

    The architecture and functions of Aries, a low-cost expendable vehicle, are discussed. The Aries design is based on the Ariane 5 L5 stage and vehicle equipment bay (VEB). The major components of Aries are an upgraded L5, an upgraded VEB, and a payload adaptor; the design and operations of these components are described. The avionics and propulsion systems of Aries are examined. Aries is to be employed for logistic support, assembly, and the placement of satellites. An example of a mission scenario and diagrams of Aries are provided.

  13. The Effects of Base Rate, Selection Ratio, Sample Size, and Reliability of Predictors on Predictive Efficiency Indices Associated with Logistic Regression Models.

    ERIC Educational Resources Information Center

    Soderstrom, Irina R.; Leitner, Dennis W.

    While it is imperative that attempts be made to assess the predictive accuracy of any prediction model, traditional measures of predictive accuracy have been criticized as suffering from "the base rate problem." The base rate refers to the relative frequency of occurrence of the event being studied in the population of interest, and the problem…

  14. Performance-Based Pay in the Federal Government. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Performance-Based Pay in the Federal Government"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Steve Nelson discusses the evolution of employee pay systems in the federal government, from the inception of the General Schedule to continuing interest in creating more…

  15. Assessment in Performance-Based Secondary Music Classes

    ERIC Educational Resources Information Center

    Pellegrino, Kristen; Conway, Colleen M.; Russell, Joshua A.

    2015-01-01

    After sharing research findings about grading and assessment practices in secondary music ensemble classes, we offer examples of commonly used assessment tools (ratings scale, checklist, rubric) for the performance ensemble. Then, we explore the various purposes of assessment in performance-based music courses: (1) to meet state, national, and…

  16. Logistic regression: a brief primer.

    PubMed

    Stoltzfus, Jill C

    2011-10-01

    Regression techniques are versatile in their application to medical research because they can measure associations, predict outcomes, and control for confounding variable effects. As one such technique, logistic regression is an efficient and powerful way to analyze the effect of a group of independent variables on a binary outcome by quantifying each independent variable's unique contribution. Using components of linear regression reflected in the logit scale, logistic regression iteratively identifies the strongest linear combination of variables with the greatest probability of detecting the observed outcome. Important considerations when conducting logistic regression include selecting independent variables, ensuring that relevant assumptions are met, and choosing an appropriate model building strategy. For independent variable selection, one should be guided by such factors as accepted theory, previous empirical investigations, clinical considerations, and univariate statistical analyses, with acknowledgement of potential confounding variables that should be accounted for. Basic assumptions that must be met for logistic regression include independence of errors, linearity in the logit for continuous variables, absence of multicollinearity, and lack of strongly influential outliers. Additionally, there should be an adequate number of events per independent variable to avoid an overfit model, with commonly recommended minimum "rules of thumb" ranging from 10 to 20 events per covariate. Regarding model building strategies, the three general types are direct/standard, sequential/hierarchical, and stepwise/statistical, with each having a different emphasis and purpose. Before reaching definitive conclusions from the results of any of these methods, one should formally quantify the model's internal validity (i.e., replicability within the same data set) and external validity (i.e., generalizability beyond the current sample). 
The resulting logistic regression model
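
    The events-per-variable rule of thumb mentioned above reduces to simple arithmetic; a sketch with invented counts:

    ```python
    # Events-per-variable (EPV) check: with E events and k candidate
    # covariates, the common rule of thumb asks for EPV = E/k >= 10-20.

    def epv(n_events, n_covariates):
        return n_events / n_covariates

    def max_covariates(n_events, min_epv=10):
        """Largest covariate count the rule of thumb permits."""
        return n_events // min_epv

    # e.g. 84 observed events and 6 candidate predictors:
    print(epv(84, 6), max_covariates(84))  # → 14.0 8
    ```

    Here 84 events across 6 predictors gives 14 events per variable, within the 10-20 range, while the same data would not support more than 8 covariates under the stricter reading of the rule.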

  17. Analysis of efficiency of waste reverse logistics for recycling.

    PubMed

    Veiga, Marcelo M

    2013-10-01

    Brazil is an agricultural country with the highest pesticide consumption in the world. Historically, pesticide packaging has not been disposed of properly. A federal law requires the chemical industry to provide proper waste management for pesticide-related products. A reverse logistics program was implemented, which has been hailed as a great success. This program was designed to target large rural communities, where economies of scale can be achieved. Over the last 10 years, the recovery rate has been very poor in most small rural communities. The objective of this study was to analyze the case of this compulsory reverse logistics program for pesticide packaging under the recent Brazilian Waste Management Policy, which enforces recycling as the main waste management solution. The results of this exploratory research indicate that, despite its aggregate success, the reverse logistics program is not efficient for small rural communities. It is not possible to use the same logistic strategy for small and large communities. The results also indicate that recycling might not be the optimal solution, especially in developing countries with unsatisfactory recycling infrastructure and large transportation costs. Postponement and speculation strategies could be applied to improve reverse logistics performance. In most compulsory reverse logistics programs, there is no economical solution. Companies should comply with the law by ranking cost-effective alternatives. PMID:23997069

  19. Improving Public School Performance through Vision-Based Leadership

    ERIC Educational Resources Information Center

    Kantabutra, Sooksan

    2005-01-01

    While vision-based leadership, frequently referred to as transformational leadership in the education literature, is widely regarded as critical to successful organization transformation, little research has been conducted into the relationship between vision-based leadership and public school performance in Thailand. Derived from substantial…

  20. Evaluating hydrological model performance using information theory-based metrics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
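
    One concrete information-theory-based metric of this kind is the Kullback-Leibler divergence between the binned distributions of measured and simulated flow; the data and binning below are synthetic stand-ins, not the abstract's actual metric suite.

    ```python
    import numpy as np

    def kl_divergence(obs, sim, bins=20):
        """KL divergence between binned distributions of two flow series."""
        lo, hi = min(obs.min(), sim.min()), max(obs.max(), sim.max())
        p, edges = np.histogram(obs, bins=bins, range=(lo, hi))
        q, _ = np.histogram(sim, bins=edges)
        p = (p + 1e-9) / (p + 1e-9).sum()   # smooth to avoid log(0)
        q = (q + 1e-9) / (q + 1e-9).sum()
        return float(np.sum(p * np.log(p / q)))

    rng = np.random.default_rng(2)
    obs = rng.lognormal(0.0, 0.5, 1000)        # "measured" streamflow
    good = obs + rng.normal(0, 0.05, 1000)     # close simulation
    bad = rng.lognormal(0.5, 0.8, 1000)        # biased simulation
    print(kl_divergence(obs, good) < kl_divergence(obs, bad))  # → True
    ```

    Unlike a squared-error score, this kind of metric compares distributional shape, so a simulation with the right timing but wrong flow-duration behavior is penalized differently.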

  1. Memory Benchmarks for SMP-Based High Performance Parallel Computers

    SciTech Connect

    Yoo, A B; de Supinski, B; Mueller, F; Mckee, S A

    2001-11-20

    As the speed gap between CPU and main memory continues to grow, memory accesses increasingly dominate the performance of many applications. The problem is particularly acute for symmetric multiprocessor (SMP) systems, where the shared memory may be accessed concurrently by a group of threads running on separate CPUs. Unfortunately, several key issues governing memory system performance in current systems are not well understood. Complex interactions between the levels of the memory hierarchy, buses or switches, DRAM back-ends, system software, and application access patterns can make it difficult to pinpoint bottlenecks and determine appropriate optimizations, and the situation is even more complex for SMP systems. To partially address this problem, we formulated a set of multi-threaded microbenchmarks for characterizing and measuring the performance of the underlying memory system in SMP-based high-performance computers. We report our use of these microbenchmarks on two important SMP-based machines. This paper has four primary contributions. First, we introduce a microbenchmark suite to systematically assess and compare the performance of different levels in SMP memory hierarchies. Second, we present a new tool based on hardware performance monitors to determine a wide array of memory system characteristics, such as cache sizes, quickly and easily; by using this tool, memory performance studies can be targeted to the full spectrum of performance regimes with many fewer data points than is otherwise required. Third, we present experimental results indicating that the performance of applications with large memory footprints remains largely constrained by memory. Fourth, we demonstrate that thread-level parallelism further degrades memory performance, even for the latest SMPs with hardware prefetching and switch-based memory interconnects.

  2. NASA Advanced Explorations Systems: Concepts for Logistics to Living

    NASA Technical Reports Server (NTRS)

    Shull, Sarah A.; Howe, A. Scott; Flynn, Michael T.; Howard, Robert

    2012-01-01

    , Howard 2010]. Several of the L2L concepts that have shown the most potential in the past are based on NASA cargo transfer bags (CTBs) or their equivalents, which are currently used to transfer cargo to and from the ISS. A high percentage of all logistics supply mass is packaging, and for a 6-month mission a crew of four might need over 100 CTBs. These CTBs are used for on-orbit transfer and storage but eventually become waste after use, since down mass is very limited. The work being done in L2L is also considering innovative interior habitat construction that integrates the CTBs into the walls of future habitats. This direct integration could provide multiple functions: launch packaging, stowage, radiation protection, water processing, life support augmentation, as well as structure. Reuse of these CTBs would reduce the amount of waste generated and also significantly reduce future up-mass requirements for exploration missions. Also discussed here is the L2L water wall, an innovative reuse of an unfolded CTB as a passive water treatment system utilizing forward osmosis. The bags have been modified to have an inner membrane liner that allows them to purify wastewater. They may also provide a structural water-wall element that can be used to provide radiation protection and serve as a structural divider. Integration of the components into vehicle/habitat architecture and consideration of operations concepts and human factors will be discussed. In the future these bags could be designed to treat wastewater, concentrated brines, and solid wastes, and to dewater solid wastes and produce a bio-stabilized construction element. This paper will describe the follow-on work done in the design, fabrication, and demonstration of various L2L concepts, including advanced CTBs for reuse/repurposing, internal outfitting studies, and the CTB-based forward osmosis water wall.

  3. Logistics Handbook, 1976. Colorado Outward Bound School.

    ERIC Educational Resources Information Center

    Colorado Outward Bound School, Denver.

    Logistics, a support mission, is vital to the successful operation of the Colorado Outward Bound School (COBS) courses. Logistics is responsible for purchasing, maintaining, transporting, and replenishing a wide variety of items, i.e., food, mountaineering and camping equipment, medical and other supplies, and vehicles. The Logistics coordinator…

  4. Information logistics: A production-line approach to information services

    NASA Technical Reports Server (NTRS)

    Adams, Dennis; Lee, Chee-Seng

    1991-01-01

    Logistics can be defined as the process of strategically managing the acquisition, movement, and storage of materials, parts, and finished inventory (and the related information flow) through the organization and its marketing channels in a cost-effective manner. It is concerned with delivering the right product to the right customer in the right place at the right time. The logistics function is composed of inventory management, facilities management, communications, unitization, transportation, materials management, and production scheduling. The relationship between logistics and information systems is clear. Systems such as Electronic Data Interchange (EDI), Point of Sale (POS) systems, and Just in Time (JIT) inventory management systems are important elements in the management of product development and delivery. With improved access to market demand figures, logisticians can decrease inventory sizes and better service customer demand. However, without accurate, timely information, little, if any, of this would be feasible in today's global markets. Information systems specialists can learn from logisticians. In a manner similar to logistics management, information logistics is concerned with the delivery of the right data, to the right customer, at the right time. As such, information systems are integral components of the information logistics system, charged with providing customers with accurate, timely, cost-effective, and useful information.
Information logistics is a management style and is composed of elements similar to those associated with the traditional logistics activity: inventory management (data resource management), facilities management (distributed, centralized, and decentralized information systems), communications (participative design and joint application development methodologies), unitization (input/output system design, i.e., packaging or formatting of the information), transportation (voice, data, image, and video communication systems

  5. Differentially private distributed logistic regression using private and public data

    PubMed Central

    2014-01-01

    Background Privacy protection is an important issue in medical informatics, and differential privacy is a state-of-the-art framework for data privacy research. Differential privacy offers provable privacy against attackers who have auxiliary information, and can be applied to data mining models (for example, logistic regression). However, differentially private methods sometimes introduce too much noise and make outputs less useful. Given available public data in medical research (e.g. from patients who sign open-consent agreements), we can design algorithms that use both public and private data sets to decrease the amount of noise that is introduced. Methodology In this paper, we modify the update step in the Newton-Raphson method to propose a differentially private distributed logistic regression model based on both public and private data. Experiments and results We try our algorithm on three different data sets, and show its advantage over: (1) a logistic regression model based solely on public data, and (2) a differentially private distributed logistic regression model based on private data under various scenarios. Conclusion Logistic regression models built with our new algorithm, based on both private and public datasets, demonstrate better utility than models trained on private or public datasets alone, without sacrificing the rigorous privacy guarantee. PMID:25079786
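
    A heavily simplified sketch of the public/private mixing idea: plain gradient steps instead of the paper's Newton-Raphson update, with the private-data gradient perturbed by Gaussian noise and the public-data gradient used as-is. The noise scale is a placeholder, not a calibrated differential-privacy budget.

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def grad(w, X, y):
        """Average gradient of the logistic negative log-likelihood."""
        return X.T @ (sigmoid(X @ w) - y) / len(y)

    def dp_mixed_step(w, X_pub, y_pub, X_priv, y_priv, lr=0.5, noise=0.01, rng=None):
        rng = rng or np.random.default_rng(0)
        g_pub = grad(w, X_pub, y_pub)                 # public data: exact
        g_priv = grad(w, X_priv, y_priv) + noise * rng.standard_normal(w.shape)
        n_pub, n_priv = len(y_pub), len(y_priv)
        g = (n_pub * g_pub + n_priv * g_priv) / (n_pub + n_priv)
        return w - lr * g

    # Synthetic split: 200 "public" and 1000 "private" records.
    rng = np.random.default_rng(3)
    w_true = np.array([1.5, -2.0, 0.5])
    X = rng.standard_normal((1200, 3))
    y = (sigmoid(X @ w_true) > rng.random(1200)).astype(float)
    X_pub, y_pub, X_priv, y_priv = X[:200], y[:200], X[200:], y[200:]

    w = np.zeros(3)
    for _ in range(400):
        w = dp_mixed_step(w, X_pub, y_pub, X_priv, y_priv, rng=rng)
    acc = ((sigmoid(X @ w) > 0.5) == (y > 0.5)).mean()
    print(round(acc, 2))
    ```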

  6. Benefits to space logistics and supportability using intelligent, decision-making self-prognostic equipment

    NASA Astrophysics Data System (ADS)

    Losik, L.

    To improve logistics and supportability for existing and future space systems, the key design driver needs to be changed from equipment and system performance to equipment usable life, as is done on Air Force fighter aircraft and the new Boeing commercial passenger aircraft. Today, all space system procurement contracts require equipment performance to be measured and confirmed before purchase and delivery, but the same procurement contracts do not require the usable life of the equipment to be measured and confirmed, resulting in equipment whose reliability/usable life is dominated by premature (infant mortality) failures. Premature failures drive space system logistics and supportability, increasing cost and decreasing serviceability and availability. In contrast, reliability-centered systems measure equipment usable life to identify any equipment that suffers from infant mortality for replacement before delivery, and offer superior system availability, maintainability, reliability, and supportability while meeting or exceeding equipment and system performance requirements. Today, expensive and outdated routine maintenance programs can be replaced by a cost-saving, condition-based maintenance (CBM) program. CBM includes using intelligent, decision-making self-prognostic equipment that increases availability while lowering life-cycle cost. CBM is ideal for improving the logistics, availability, and supportability of existing and future space exploration programs, which benefit financially from having the right equipment and supplies available at the right time.

  7. Integrating policy-based management and SLA performance monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tzong-Jye; Lin, Chin-Yi; Chang, Shu-Hsin; Yen, Meng-Tzu

    2001-10-01

    Policy-based management systems provide configuration capability so that system administrators can focus on the requirements of customers. The service-level agreement performance monitoring mechanism helps system administrators verify the correctness of policies. However, it is difficult for a device to process policies directly, because policies are management-level concepts. This paper proposes a mechanism to decompose a policy into rules that can be efficiently processed by a device. Thus, the device may process the rules and collect performance statistics efficiently, and the policy-based management system may collect this performance statistics information and report service-level agreement performance monitoring information to the system administrator. The proposed policy-based management system achieves both the policy configuration and service-level agreement performance monitoring requirements. A policy consists of a condition part and an action part. The condition part is a Boolean expression over a source host IP group, a destination host IP group, etc. The action part holds the parameters of services. We say that an address group is compact if it consists only of a range of IP addresses that can be denoted by a single pair of an IP address and a corresponding IP mask. If the condition part of a policy consists only of compact address groups, we say that the policy is a rule. Since a device can efficiently process a compact address and a system administrator prefers to define a range of IP addresses, the policy-based management system has to translate policies into rules and fill the gaps between a policy and its rules. The proposed policy-based management system builds the relationships between VPNs and policies, and between policies and rules.
Since the system administrator wants to monitor the system performance information of VPNs and policies, the proposed policy-based management system downloads the relationships among VPNs, policies and rules to the
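
    The policy-to-rule translation described above is essentially CIDR summarization; Python's `ipaddress.summarize_address_range` performs exactly this decomposition of an arbitrary address range into compact (address, mask) blocks. The range below is an arbitrary example:

    ```python
    import ipaddress

    def policy_range_to_rules(first, last):
        """Decompose a policy's address range into compact CIDR rules."""
        return [str(net) for net in ipaddress.summarize_address_range(
            ipaddress.IPv4Address(first), ipaddress.IPv4Address(last))]

    rules = policy_range_to_rules("192.168.1.5", "192.168.1.20")
    print(rules)
    # → ['192.168.1.5/32', '192.168.1.6/31', '192.168.1.8/29',
    #    '192.168.1.16/30', '192.168.1.20/32']
    ```

    Each resulting block is a single (IP address, mask) pair a device can match in hardware, which is why the 16-address range becomes five rules rather than one.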

  8. Human Factors Considerations for Performance-Based Navigation

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Adams, Catherine A.

    2006-01-01

    A transition toward a performance-based navigation system is currently underway in both the United States and around the world. Performance-based navigation incorporates Area Navigation (RNAV) and Required Navigation Performance (RNP) procedures that do not rely on the location of ground-based navigation aids. These procedures offer significant benefits to both operators and air traffic managers. Under sponsorship from the Federal Aviation Administration (FAA), the National Aeronautics and Space Administration (NASA) has undertaken a project to document human factors issues that have emerged during RNAV and RNP operations and propose areas for further consideration. Issues were found to include aspects of air traffic control and airline procedures, aircraft systems, and procedure design. Major findings suggest the need for human factors-specific instrument procedure design guidelines. Ongoing industry and government activities to address air-ground communication terminology, procedure design improvements, and chart-database commonality are strongly encouraged.

  9. Parallel performance optimizations on unstructured mesh-based simulations

    SciTech Connect

    Sarje, Abhinav; Song, Sukhyun; Jacobsen, Douglas; Huck, Kevin; Hollingsworth, Jeffrey; Malony, Allen; Williams, Samuel; Oliker, Leonid

    2015-06-01

    This paper addresses two key parallelization challenges in the unstructured mesh-based ocean modeling code MPAS-Ocean, which uses a mesh based on Voronoi tessellations: (1) load imbalance across processes, and (2) unstructured data access patterns that inhibit intra- and inter-node performance. Our work analyzes the load imbalance due to naive partitioning of the mesh, and develops methods to generate mesh partitionings with better load balance and reduced communication. Furthermore, we present methods that minimize both inter- and intra-node data movement and maximize data reuse. Our techniques include predictive ordering of data elements for higher cache efficiency, as well as communication reduction approaches. We present detailed performance data from runs on thousands of cores of the Cray XC30 supercomputer and show that our optimization strategies can exceed the original performance by over 2×. Additionally, many of these solutions can be broadly applied to a wide variety of unstructured grid-based computations.

  10. Performance-based development of a basic nuclear engineering course

    SciTech Connect

    Knief, R. A.

    1993-01-01

    Over the past 19 yr, a basic nuclear engineering course has been developed with methods traditionally applied to training programs. The performance-based approach uses elements classified roughly as analysis, design/development, implementation, and evaluation/feedback. Prior to the accident at Three Mile Island unit 2 (TMI-2), performance-based training applications were rare in the nuclear community. An exception was at Sandia Laboratories (SNL). American Telephone & Telegraph (AT&T) - holder of the SNL contract with what is now the US Department of Energy - and its Bell Laboratories subsidiary had for some time emphasized in-house technical education and training, using performance-based methods to ensure that in-house and contracted instructors focused on course outcomes rather than merely subject matter.

  11. Alpha neurofeedback training improves SSVEP-based BCI performance

    NASA Astrophysics Data System (ADS)

    Wan, Feng; Nuno da Cruz, Janir; Nan, Wenya; Wong, Chi Man; Vai, Mang I.; Rosa, Agostinho

    2016-06-01

    Objective. Steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) can provide relatively easy, reliable and high speed communication. However, the performance is still not satisfactory, especially for users who are not able to generate strong enough SSVEP signals. This work aims to strengthen a user’s SSVEP by alpha down-regulating neurofeedback training (NFT) and consequently improve that user’s performance with SSVEP-based BCIs. Approach. An experiment with two steps was designed and conducted. The first step was to investigate the relationship between resting alpha activity and SSVEP-based BCI performance, in order to determine the training parameter for the NFT. In the second step, half of the subjects with ‘low’ performance (i.e. BCI classification accuracy <80%) were randomly assigned to an NFT group to perform real-time NFT, and the other half to a non-NFT control group for comparison. Main results. The first step revealed a significant negative correlation between BCI performance and individual alpha band (IAB) amplitude in the eyes-open resting condition across a total of 33 subjects. In the second step, it was found that during the IAB down-regulating NFT, on average the subjects were able to successfully decrease their IAB amplitude over training sessions. More importantly, the NFT group showed an average increase of 16.5% in SSVEP signal SNR (signal-to-noise ratio) and an average increase of 20.3% in BCI classification accuracy, which was significant compared to the non-NFT control group. Significance. These findings indicate that alpha down-regulating NFT can be used to improve SSVEP signal quality and subjects’ performance with SSVEP-based BCIs. It could be helpful to SSVEP-related studies and would contribute to more effective SSVEP-based BCI applications.
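
    The SNR measure commonly used for SSVEP (and presumably the one behind the 16.5% figure above) is the spectral power at the stimulus frequency divided by the mean power of neighbouring frequency bins. A sketch on a synthetic signal standing in for a recorded EEG channel:

    ```python
    import numpy as np

    def ssvep_snr(x, fs, f_stim, n_neighbors=4):
        """Power at the stimulus-frequency bin over the mean power of
        n_neighbors bins on each side."""
        spec = np.abs(np.fft.rfft(x)) ** 2
        k = int(round(f_stim * len(x) / fs))          # stimulus-frequency bin
        neighbors = list(range(k - n_neighbors, k)) + \
                    list(range(k + 1, k + 1 + n_neighbors))
        return spec[k] / spec[neighbors].mean()

    fs, dur, f_stim = 250, 4, 12.0        # 4 s of 250 Hz data, 12 Hz stimulus
    t = np.arange(fs * dur) / fs
    rng = np.random.default_rng(4)
    eeg = 0.5 * np.sin(2 * np.pi * f_stim * t) + rng.standard_normal(len(t))
    print(ssvep_snr(eeg, fs, f_stim) > 10)  # → True
    ```

    A stronger resting SSVEP response raises the numerator relative to the broadband (alpha-dominated) background, which is why down-regulating resting alpha can raise this ratio.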

  12. Performance Comparison of HPF and MPI Based NAS Parallel Benchmarks

    NASA Technical Reports Server (NTRS)

    Saini, Subhash

    1997-01-01

    Compilers supporting High Performance Fortran (HPF) features first appeared in late 1994 and early 1995 from Applied Parallel Research (APR), Digital Equipment Corporation, and The Portland Group (PGI). IBM introduced an HPF compiler for the IBM RS/6000 SP2 in April of 1996. Over the past two years, these implementations have shown steady improvement in terms of both features and performance. The performance of various hardware/programming model (HPF and MPI) combinations will be compared, based on the latest NAS Parallel Benchmark results, thus providing a cross-machine and cross-model comparison. Specifically, HPF-based NPB results will be compared with MPI-based NPB results to provide perspective on the performance currently obtainable using HPF versus MPI or versus hand-tuned implementations such as those supplied by the hardware vendors. In addition, we also present NPB (Version 1.0) performance results for the following systems: DEC AlphaServer 8400 5/440, Fujitsu VPP Series (VX, VPP300, and VPP700), HP/Convex Exemplar SPP2000, IBM RS/6000 SP P2SC node (120 MHz), NEC SX-4/32, SGI/CRAY T3E, and SGI Origin2000. We also present sustained performance per dollar for the Class B LU, SP, and BT benchmarks.

  13. Roadmap Toward a Predictive Performance-based Commercial Energy Code

    SciTech Connect

    Rosenberg, Michael I.; Hart, Philip R.

    2014-10-01

    Energy codes have provided significant increases in building efficiency over the last 38 years, since the first national energy model code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. The current focus on prescriptive codes has limitations including significant variation in actual energy performance depending on which prescriptive options are chosen, a lack of flexibility for designers and developers, and the inability to handle control optimization that is specific to building type and use. This paper provides a high level review of different options for energy codes, including prescriptive, prescriptive packages, EUI Target, outcome-based, and predictive performance approaches. This paper also explores a next generation commercial energy code approach that places a greater emphasis on performance-based criteria. A vision is outlined to serve as a roadmap for future commercial code development. That vision is based on code development being led by a specific approach to predictive energy performance combined with building specific prescriptive packages that are designed to be both cost-effective and to achieve a desired level of performance. Compliance with this new approach can be achieved by either meeting the performance target as demonstrated by whole building energy modeling, or by choosing one of the prescriptive packages.

  14. Biotechnology-based odour control: design criteria and performance data.

    PubMed

    Quigley, C; Easter, C; Burrowes, P; Witherspoon, J

    2004-01-01

    As neighbouring areas continue to encroach upon wastewater treatment plants, there is an increasing need for odour control to mitigate potential negative offsite odorous impacts. One technology that is gaining widespread acceptance is biotechnology, which utilises the inherent ability of certain microorganisms to biodegrade offensive odorous compounds. Two main advantages of this form of treatment over other odour control technologies include the absence of hazardous chemicals and relatively low operation and maintenance requirements. The purpose of this paper is to provide information related to odour control design criteria used in sizing/selecting biotechnology-based odour control technologies, and to provide odour removal performance data obtained from several different biotechnology-based odour control systems. CH2M HILL has collected biotechnology-based odour control performance data over the last several years in order to track the continued performance of various biofilters and biotowers over time. Specifically, odour removal performance data have been collected from soil-, organic- and inorganic-media biofilters and inert inorganic media biotowers. Results indicate that biotechnology-based odour control is a viable and consistent technology capable of achieving high removal performance for odour and hydrogen sulphide. It is anticipated that the information presented in this paper will be of interest to anyone involved with odour control technology evaluation/selection or design review. PMID:15484776

  16. Performance-Based Technology Selection Filter description report

    SciTech Connect

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  17. The effect of high leverage points on the logistic ridge regression estimator having multicollinearity

    NASA Astrophysics Data System (ADS)

    Ariffin, Syaiba Balqish; Midi, Habshah

    2014-06-01

    This article is concerned with the performance of the logistic ridge regression estimation technique in the presence of multicollinearity and high leverage points. In logistic regression, multicollinearity among the predictors manifests in the information matrix. The maximum likelihood estimator suffers a huge setback in the presence of multicollinearity, which causes regression estimates to have unduly large standard errors. To remedy this problem, a logistic ridge regression estimator is put forward. It is evident that the logistic ridge regression estimator outperforms the maximum likelihood approach for handling multicollinearity. The effect of high leverage points on the performance of the logistic ridge regression estimator is then investigated through a real data set and a simulation study. The findings signify that the logistic ridge regression estimator fails to provide better parameter estimates in the presence of both high leverage points and multicollinearity.
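    The remedy the abstract describes adds a ridge (L2) penalty to the logistic log-likelihood so that near-collinear predictors no longer inflate the estimates. A minimal numpy sketch, not the authors' estimator: the gradient-ascent fit, the penalty weight, and the simulated collinear data are all illustrative assumptions:

```python
import numpy as np

def logistic_fit(X, y, lam=0.0, lr=0.5, n_iter=5000):
    """Gradient ascent on the logistic log-likelihood with an optional
    ridge (L2) penalty; the intercept is left unpenalized."""
    Xb = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ beta))
        grad = Xb.T @ (y - p)
        grad[1:] -= lam * beta[1:]      # shrink slope coefficients toward zero
        beta += lr * grad / len(y)
    return beta

rng = np.random.default_rng(1)
x1 = rng.normal(size=300)
x2 = x1 + rng.normal(scale=0.01, size=300)   # nearly collinear predictor
X = np.column_stack([x1, x2])
y = (rng.random(300) < 1.0 / (1.0 + np.exp(-x1))).astype(float)

b_mle = logistic_fit(X, y, lam=0.0)          # unpenalized (maximum likelihood)
b_ridge = logistic_fit(X, y, lam=5.0)        # ridge-penalized
```

    The ridge fit trades a little bias for a sharply reduced variance of the slope estimates, which is exactly what multicollinearity destroys in the plain maximum likelihood fit.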

  18. Measurement-based performance evaluation technique for high-performance computers

    NASA Technical Reports Server (NTRS)

    Sharma, S.; Natarajan, C.; Iyer, R. K.

    1993-01-01

    A measurement-based performance evaluation technique has been used to characterize the OS performance of Cedar, a hierarchical shared-memory multiprocessor system. Thirteen OS performance meters were used to capture the operating system activities for compute-bound workloads. Three representative applications from the Perfect Benchmark Suite were used to measure the OS performance in a dedicated system and in multiprogrammed workloads. It was found that 13-23 percent of the total execution time on a dedicated system was spent in executing OS-related activities. Under multiprogramming, 12-14 percent of the total execution time was used by the OS. The impact of multiprogramming on the operating system performance meters was also measured.

  19. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Finally, five elevator key performance indicators are calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated, by comparing the key performance indicators calculated based on the estimated car acceleration and the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with the sliding mode observer.
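    The observer idea in the abstract can be illustrated with a generic Kalman filter: a constant-acceleration state model updated by a position-only (encoder-like) measurement, from which the car acceleration is read off as a state. This is a hedged sketch; the state model, the noise levels `q` and `r`, and the test trajectory are assumptions, not the paper's elevator model:

```python
import numpy as np

def estimate_accel(z, dt, q=1e-2, r=1e-4):
    """Kalman observer with state [position, velocity, acceleration];
    only position is measured, acceleration is estimated."""
    F = np.array([[1.0, dt, 0.5 * dt**2],
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])
    H = np.array([[1.0, 0.0, 0.0]])
    Q, R = q * np.eye(3), np.array([[r]])
    x, P = np.zeros(3), np.eye(3)
    acc = []
    for zk in z:
        x, P = F @ x, F @ P @ F.T + Q               # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
        x = x + (K @ (zk - H @ x)).ravel()          # measurement update
        P = (np.eye(3) - K @ H) @ P
        acc.append(x[2])
    return np.array(acc)

dt = 0.01
t = np.arange(0, 10, dt)
z = 0.5 * 1.5 * t**2            # encoder positions for a 1.5 m/s^2 ramp
acc_est = estimate_accel(z, dt)
```

    Ride-quality indicators such as peak jerk or RMS acceleration could then be computed from `acc_est` instead of a dedicated accelerometer, which is the monitoring idea the paper evaluates.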

  20. Reuse and recycling - reverse logistics opportunities

    SciTech Connect

    Kopicki, R.; Berg, M.J.; Legg, L.

    1993-12-31

    This book is intended to serve as a managerial guide for planning and implementing waste reduction programs. It is based on the premise that proactive management of environmental issues is becoming vital to corporate success, and that these issues are creating new roles and opportunities for logistics professionals. Examined in detail are nonhazardous waste reduction activities; reuse and recycling activities; and source reduction. The book is based on in-depth interviews with seventeen firms and several trade associations acknowledged to be leaders in waste reduction efforts. Topics discussed include adapting inbound supply chains to use more recycled goods; minimizing packaging waste; reverse distribution capabilities for taking back products and packaging; and the use of third party services for recycling, reuse, and source reduction activities. Included are two case analyses of progressive firms, E.I. du Pont de Nemours and Home Depot, and their waste reduction efforts.

  1. Performance Invalidity Base Rates Among Healthy Undergraduate Research Participants.

    PubMed

    Ross, Thomas P; Poston, Ashley M; Rein, Patricia A; Salvatore, Andrew N; Wills, Nathan L; York, Taylor M

    2016-02-01

    Few studies have examined base rates of suboptimal effort among healthy, undergraduate students recruited for neuropsychological research. An and colleagues (2012, Conducting research with non-clinical healthy undergraduates: Does effort play a role in neuropsychological test performance? Archives of Clinical Neuropsychology, 27, 849-857) reported high rates of performance invalidity (30.8%-55.6%), calling into question the validity of findings generated from samples of college students. In contrast, subsequent studies have reported much lower base rates, ranging from 2.6% to 12%. The present study replicated and extended previous work by examining the performance of 108 healthy undergraduates on the Dot Counting Test, Victoria Symptom Validity Test, Word Memory Test, and a brief battery of neuropsychological measures. During initial testing, 8.3% of the sample scored below cutoffs on at least one Performance Validity Test, while 3.7% were classified as invalid at Time 2 (M interval = 34.4 days). The present findings add to a growing number of studies suggesting that performance invalidity base rates in samples of non-clinical, healthy college students are much lower than An and colleagues' initial findings. Although suboptimal effort is much less problematic than suggested by An and colleagues, recent reports as high as 12% indicate that including measures of effort may be of value when using college students as participants. Methodological issues and recommendations for future research are presented.

  2. Advanced Organic Permeable-Base Transistor with Superior Performance.

    PubMed

    Klinger, Markus P; Fischer, Axel; Kaschura, Felix; Scholz, Reinhard; Lüssem, Björn; Kheradmand-Boroujeni, Bahman; Ellinger, Frank; Kasemann, Daniel; Leo, Karl

    2015-12-16

    An optimized vertical organic permeable-base transistor (OPBT) competing with the best organic field-effect transistors in performance, while employing low-cost fabrication techniques, is presented. The OPBT stands out by its excellent power efficiency at the highest frequencies.

  3. Leading Instructional Practices in a Performance-Based System

    ERIC Educational Resources Information Center

    Kauble, Anna; Wise, Donald

    2015-01-01

    Given the shift to Common Core, educational leaders are challenged to see new directions in teaching and learning. The purpose of this study was to investigate the instructional practices which may be related to the effectiveness of a performance-based system (PBS) and their impact on student achievement, as part of a thematic set of dissertations…

  4. Joint Workshops. Performance Based Apprentice and Technical Training. Final Report.

    ERIC Educational Resources Information Center

    Oriel, Arthur E.

    A series of five workshops were held to disseminate, to 39 industrial and college and 61 Bureau of Apprenticeship and Training (BAT) personnel, information about the principles, methods, and effectiveness of Performance Based Training (PBT) in apprentice programs. Following the workshops, 90% of the industrial and 61% of the BAT personnel…

  5. Performance-Based Measurement: Action for Organizations and HPT Accountability

    ERIC Educational Resources Information Center

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…

  6. 48 CFR 970.3706 - Performance-based acquisition.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Performance-based acquisition. 970.3706 Section 970.3706 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Facilities Management Contracting...

  7. 48 CFR 970.3706 - Performance-based acquisition.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Performance-based acquisition. 970.3706 Section 970.3706 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Facilities Management Contracting...

  8. Begging the Question: Performativity and Studio-Based Research

    ERIC Educational Resources Information Center

    Petelin, George

    2014-01-01

    The requirement that candidates in studio-based or practice-led higher degrees by research should formulate a research question has been found to be problematic by some writers. The present article argues that this stance, particularly as it is articulated by proponents of the influential category of "performative research" (Haseman,…

  9. Energy Conservation in the Home. Performance Based Lesson Plans.

    ERIC Educational Resources Information Center

    Alabama State Dept. of Education, Montgomery. Home Economics Service.

    These ten performance-based lesson plans concentrate on tasks related to energy conservation in the home. They are (1) caulk cracks, holes, and joints; (2) apply weatherstripping to doors and windows; (3) add plastic/solar screen window covering; (4) arrange furniture for saving energy; (5) set heating/cooling thermostat; (6) replace faucet…

  10. The Evolution of Performance Based Teacher Education Programs.

    ERIC Educational Resources Information Center

    Aubertine, Horace E.

    This document is a discussion of a systemized approach to education theory and practice, especially as it applies to performance-based teacher education. The author uses as the basis of his discussion the physical sciences and their use of approximation models (an illustration of this use is the historical development of the description of matter…

  11. Teachers' Reactions towards Performance-Based Language Assessment

    ERIC Educational Resources Information Center

    Chinda, Bordin

    2014-01-01

    This research aims at examining the reactions of tertiary EFL teachers towards the use of performance-based language assessment. The study employed a mixed-method research methodology. For the quantitative method, 36 teachers responded to a questionnaire survey. In addition, four teachers participated in the in-depth interviews which were…

  12. School-Based Performance Awards: Research Findings and Future Directions.

    ERIC Educational Resources Information Center

    Kelley, Carolyn; Heneman, Herbert, III; Milanowski, Anthony

    This paper synthesizes research on how motivation influenced teachers at two school-based performance award (SBPA) programs in Kentucky and in North Carolina. The research was conducted between 1995 and 1998 by the Consortium for Policy Research in Education. SBPA programs provide teachers and other school staff with pay bonuses for the…

  13. Planning and Implementing a High Performance Knowledge Base.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.

    1999-01-01

    Discusses the conceptual framework for developing a rapid-prototype high-performance knowledge base for the four mission agencies of the United States Department of Agriculture and their university partners. Describes the background of the project and methods used for establishing the requirements; examines issues and problems surrounding semantic…

  14. Critical Arts-Based Research in Education: Performing Undocumented Historias

    ERIC Educational Resources Information Center

    Bagley, Carl; Castro-Salazar, Ricardo

    2012-01-01

    The article seeks to elucidate and academically position the genre of critical arts-based research in education. The article fuses Critical Race Theory (CRT), life history and performance, alongside work with undocumented American students of Mexican origin, to show how a politicised qualitative paradigmatic re envisioning can occur in which…

  15. 48 CFR 970.1100-1 - Performance-based contracting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Performance-based contracting. 970.1100-1 Section 970.1100-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Describing Agency Needs 970.1100-1...

  16. Performance Evaluation in Network-Based Parallel Computing

    NASA Technical Reports Server (NTRS)

    Dezhgosha, Kamyar

    1996-01-01

    Network-based parallel computing is emerging as a cost-effective alternative for solving many problems which require use of supercomputers or massively parallel computers. The primary objective of this project has been to conduct experimental research on performance evaluation for clustered parallel computing. First, a testbed was established by augmenting our existing network of Sun SPARCs with PVM (Parallel Virtual Machine), a software system for linking clusters of machines. Second, a set of three basic applications was selected: a parallel search, a parallel sort, and a parallel matrix multiplication. These application programs were implemented in the C programming language under PVM. Third, we conducted performance evaluation under various configurations and problem sizes. Alternative parallel computing models and workload allocations for application programs were explored. The performance metric was limited to elapsed time or response time, which in the context of parallel computing can be expressed in terms of speedup. The results reveal that the overhead of communication latency between processes is in many cases the restricting factor to performance. That is, coarse-grain parallelism, which requires less frequent communication between processes, will result in higher performance in network-based computing. Finally, we are in the final stages of installing an Asynchronous Transfer Mode (ATM) switch and four ATM interfaces (each 155 Mbps), which will allow us to extend our study to newer applications, performance metrics, and configurations.
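    The conclusion that coarse-grain parallelism wins when communication latency dominates can be illustrated with a toy cost model: divide the compute evenly across processors and charge a fixed latency per message. All numbers below are hypothetical, not the project's measurements:

```python
def parallel_time(work, p, n_msgs, latency):
    """Toy cost model: perfectly divisible compute time plus a fixed
    per-message communication latency (all values in seconds)."""
    return work / p + n_msgs * latency

work, p, latency = 100.0, 8, 0.05
fine_grain = parallel_time(work, p, n_msgs=1000, latency=latency)
coarse_grain = parallel_time(work, p, n_msgs=10, latency=latency)
speedup_fine = work / fine_grain      # latency-bound: speedup = 1.6
speedup_coarse = work / coarse_grain  # compute-bound: speedup ~ 7.7
```

    With 1000 messages the latency term (50 s) swamps the 12.5 s of per-processor compute, capping speedup well below the processor count; with 10 messages the speedup approaches the ideal 8.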

  17. Hierarchical Logistic Regression: Accounting for Multilevel Data in DIF Detection

    ERIC Educational Resources Information Center

    French, Brian F.; Finch, W. Holmes

    2010-01-01

    The purpose of this study was to examine the performance of differential item functioning (DIF) assessment in the presence of a multilevel structure that often underlies data from large-scale testing programs. Analyses were conducted using logistic regression (LR), a popular, flexible, and effective tool for DIF detection. Data were simulated…
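    The LR approach to DIF detection compares nested logistic models for an item response, with and without a group-membership term, via a likelihood-ratio test. A self-contained single-level sketch (it ignores the multilevel structure the study examines; the Newton-Raphson fitter, the simulated data, and the effect size are all illustrative; for df = 1 the chi-square p-value equals erfc(sqrt(G2/2))):

```python
import numpy as np
from math import erfc, sqrt

def logit_loglik(X, y, n_iter=50):
    """Newton-Raphson logistic regression (intercept added internally);
    returns the maximized log-likelihood."""
    Xb = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ beta))
        W = p * (1.0 - p)
        H = (Xb * W[:, None]).T @ Xb + 1e-9 * np.eye(Xb.shape[1])
        beta += np.linalg.solve(H, Xb.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-Xb @ beta))
    return float(np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12)))

def uniform_dif_test(ability, group, item):
    """Likelihood-ratio test of item ~ ability vs item ~ ability + group."""
    g2 = 2.0 * (logit_loglik(np.column_stack([ability, group]), item)
                - logit_loglik(ability[:, None], item))
    return g2, erfc(sqrt(max(g2, 0.0) / 2.0))   # df = 1 chi-square p-value

rng = np.random.default_rng(2)
n = 1000
ability = rng.normal(size=n)
group = rng.integers(0, 2, size=n).astype(float)
logit = ability + 1.0 * group               # item favours group 1: uniform DIF
item = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
g2, pval = uniform_dif_test(ability, group, item)
```

    The hierarchical variant in the paper replaces the flat fit with a multilevel model so that students nested in schools do not inflate the Type I error of this test.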

  18. Simulation-based education and performance assessments for pediatric surgeons.

    PubMed

    Barsness, Katherine

    2014-08-01

    Education in the knowledge, skills, and attitudes necessary for a surgeon to perform at an expert level in the operating room, and beyond, must address all potential cognitive and technical performance gaps, professionalism and personal behaviors, and effective team communication. Educational strategies should also seek to replicate the stressors and distractions that might occur during a high-risk operation or critical care event. Finally, education cannot remain fixed in an apprenticeship model of "See one, do one, teach one," whereby patients are exposed to the risk of harm inherent to any learning curve. The majority of these educational goals can be achieved with the addition of simulation-based education (SBE) as a valuable adjunct to traditional training methods. This article will review relevant principles of SBE, explore currently available simulation-based educational tools for pediatric surgeons, and finally make projections for the future of SBE and performance assessments for pediatric surgeons.

  19. Design and performance comparison of fuzzy logic based tracking controllers

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Jani, Yashvant

    1992-01-01

    Several camera tracking controllers based on fuzzy logic principles have been designed and tested in software simulation in the Software Technology Branch at the Johnson Space Center. The fuzzy logic based controllers utilize range measurements and pixel positions from the image as input parameters and provide pan and tilt gimbal rate commands as output. Two designs of the rulebase, and the tuning process applied to the membership functions, are discussed in light of optimizing performance. Seven test cases have been designed to test the performance of the controllers for proximity operations where approaches like v-bar, fly-around and station keeping are performed. The controllers are compared in terms of responsiveness and ability to maintain the object in the field-of-view of the camera. Advantages of the fuzzy logic approach with respect to the conventional approach are discussed in terms of simplicity and robustness.
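    A rule base of the kind described can be sketched with triangular membership functions over pixel error and a weighted-average (Sugeno-style) defuzzification into a gimbal rate command. The error ranges, rule count, and output rates below are invented for illustration and are not the controllers' actual rulebase:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def pan_rate(pixel_error):
    """Membership-weighted average of singleton pan-rate commands (deg/s),
    one rule per fuzzy category of horizontal pixel error."""
    rules = [  # (membership of this error, commanded rate)
        (tri(pixel_error, -200, -100, 0), -5.0),   # error negative -> pan left
        (tri(pixel_error, -100, 0, 100), 0.0),     # target centred  -> hold
        (tri(pixel_error, 0, 100, 200), 5.0),      # error positive -> pan right
    ]
    w = sum(m for m, _ in rules)
    return sum(m * r for m, r in rules) / w if w else 0.0
```

    Overlapping memberships give a smooth command: an error of 50 pixels fires the "centred" and "positive" rules at 0.5 each, yielding a half-strength pan. Tuning the controller amounts to reshaping these triangles, which is the tuning process the abstract refers to.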

  20. Fundamental performance improvement to dispersive spectrograph based imaging technologies

    NASA Astrophysics Data System (ADS)

    Meade, Jeff T.; Behr, Bradford B.; Cenko, Andrew T.; Christensen, Peter; Hajian, Arsen R.; Hendrikse, Jan; Sweeney, Frederic D.

    2011-03-01

    Dispersive-based spectrometers may be qualified by their spectral resolving power and their throughput efficiency. A device known as a virtual slit is able to improve the resolving power by a factor of several with a minimal loss in throughput, thereby fundamentally improving the quality of the spectrometer. A virtual slit was built and incorporated into a low-performing spectrometer (R ~ 300) and was shown to increase the performance without a significant loss in signal. The operation and description of virtual slits is also given. High-performance, low-light, and high-speed imaging instruments based on a dispersive-type spectrometer see the greatest impact from a virtual slit. The impact of a virtual slit on spectral domain optical coherence tomography (SD-OCT) is shown to improve the imaging quality substantially.

  1. Space station synergetic RAM-logistics analysis

    NASA Technical Reports Server (NTRS)

    Dejulio, Edmund T.; Leet, Joel H.

    1988-01-01

    NASA's Space Station Maintenance Planning and Analysis (MP&A) Study is a step in the overall Space Station Program to define optimum approaches for on-orbit maintenance planning and logistics support. The approach used in the MP&A study and the analysis process used are presented. Emphasis is on maintenance activities and processes that can be accomplished on orbit within the known design and support constraints of the Space Station. From these analyses, recommendations for maintainability/maintenance requirements are established. The ultimate goal of the study is to reduce on-orbit maintenance requirements to a practical and safe minimum, thereby conserving crew time for productive endeavors. The reliability, availability, and maintainability (RAM) and operations performance evaluation models used were assembled and developed as part of the MP&A study and are described. A representative space station system design is presented to illustrate the analysis process.

  2. Performance-Based Compensation: Linking Performance to Teacher Salaries. Ask the Team

    ERIC Educational Resources Information Center

    Behrstock-Sherratt, Ellen; Potemski, Amy

    2013-01-01

    To achieve the goal of attracting and retaining talented professionals in education, performance-based compensation systems (PBCS) must offer salaries that are both fair and sufficiently competitive at each point across an educator's career continuum. Although many states, especially with the support of the Teacher Incentive Fund (TIF) grants,…

  3. High performance network and channel-based storage

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.

    1991-01-01

    In the traditional mainframe-centered view of a computer system, storage devices are coupled to the system through complex hardware subsystems called input/output (I/O) channels. With the dramatic shift towards workstation-based computing, and its associated client/server model of computation, storage facilities are now found attached to file servers and distributed throughout the network. We discuss the underlying technology trends that are leading to high performance network-based storage, namely advances in networks, storage devices, and I/O controller and server architectures. We review several commercial systems and research prototypes that are leading to a new approach to high performance computing based on network-attached storage.

  4. Performability modeling based on real data: A case study

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.

    1988-01-01

    Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.
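    The reward computation for a semi-Markov model can be sketched as follows: solve for the stationary vector of the embedded jump chain, weight it by the mean holding times to obtain the long-run fraction of time in each state, then average the per-state reward rates. A toy two-state (normal/error) example; the transition matrix, holding times, and reward rates are invented, not the measured data:

```python
import numpy as np

def expected_reward_rate(P, hold, reward):
    """Long-run expected reward rate of a semi-Markov process:
    embedded-chain stationary probabilities, time-weighted by mean
    holding times, averaged against per-state reward rates."""
    n = len(P)
    # stationary pi solves pi P = pi with sum(pi) = 1
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.append(np.zeros(n), 1.0)
    pi = np.linalg.lstsq(A, b, rcond=None)[0]
    frac = pi * hold / (pi @ hold)      # fraction of time spent per state
    return float(frac @ reward)

# Hypothetical system alternating between 'normal' (mean hold 9 h, full
# reward) and 'error' (mean hold 1 h, degraded reward).
P = np.array([[0.0, 1.0], [1.0, 0.0]])
rate = expected_reward_rate(P, np.array([9.0, 1.0]), np.array([1.0, 0.2]))
```

    Here the system spends 90% of its time in the normal state, so the expected reward rate is 0.9 x 1.0 + 0.1 x 0.2 = 0.92 of the ideal service rate.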

  5. A measurement-based performability model for a multiprocessor system

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, Ravi K.; Trivedi, K. S.

    1987-01-01

    A measurement-based performability model based on real error data collected on a multiprocessor system is described. Model development from the raw error data to the estimation of cumulative reward is described. Both normal and failure behavior of the system are characterized. The measured data show that the holding times in key operational and failure states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different failure types and recovery procedures.

  6. Performability modeling based on real data: A case study

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.

    1987-01-01

    Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.

  7. Gender-based performance differences in an introductory physics course

    NASA Astrophysics Data System (ADS)

    McKinnon, Mark Lee

    Cognitive research has indicated that the difference between males and females is negligible. Paradoxically, in traditionally-taught college level introductory physics courses, males have outperformed females. UC Davis' Physics 7A (the first class of a three-quarter Introduction to Physics sequence for Life-Science students), however, counters this trend, since females perform similarly to males. The gender-based performance differences within the other two quarters (Physics 7B & 7C) of the radically restructured, active-learning physics sequence still echo those of traditionally-taught courses. In one experiment, I modified the laboratory activity instructions of the Physics 7C course to encourage further group interaction. These modifications did not affect the gender-based performance difference. In a later experiment, I compared students' performance on different forms of assessment for certain physics concepts during the Physics 7C course. Over 500 students took weekly quizzes at different times. The students were given different quiz questions on the same topics. Several quiz questions seemed to favor males while others were more gender equitable. I highlighted comparisons between a few pairs of questions that assessed students' understanding of the same physical concept. Males tended to perform better on questions that seemed to require spatial visualization. Questions that required greater understanding of the physical concept or scientific model were more gender neutral.

  8. Prescriptive vs. performance based cook-off fire testing.

    SciTech Connect

    Nakos, James Thomas; Tieszen, Sheldon Robert; Erikson, William Wilding; Gill, Walter; Blanchat, Thomas K.

    2010-07-01

    In the fire safety community, the trend is toward implementing performance-based standards in place of existing prescriptive ones. Prescriptive standards can be difficult to adapt to changing design methods, materials, and application situations of systems that ultimately must perform well in unwanted fire situations. In general, this trend has produced positive results and is embraced by the fire protection community. The question arises as to whether this approach could be used to advantage in cook-off testing. Prescribed fuel fire cook-off tests have been instigated because of historical incidents that led to extensive damage to structures and loss of life. They are designed to evaluate the propensity for a violent response. The prescribed protocol has several advantages: it can be defined in terms of controllable parameters (wind speed, fuel type, pool size, etc.), and it may be conservative for a particular scenario. However, fires are inherently variable, and prescribed tests are not necessarily representative of a particular accident scenario. Moreover, prescribed protocols are not necessarily adaptable and may not be conservative. We also consider performance-based testing. This requires more knowledge and thought regarding not only the fire environment but also the behavior of the munitions themselves. Sandia uses a performance-based approach in assuring the safe behavior of systems of interest that contain energetic materials. Sandia also conducts prescriptive fire testing for the IAEA, NRC, and the DOT. Here we comment on the strengths and weaknesses of both approaches and suggest a path forward should it be desirable to pursue a performance-based cook-off standard.

  9. The Effects of Acute Abstinence from Smoking and Performance-Based Rewards on Performance Monitoring

    PubMed Central

    Schlienz, Nicolas J.; Hawk, Larry W.; Rosch, Keri S.

    2013-01-01

    Rationale Abstinence from smoking disrupts performance in multiple cognitive domains, and such cognitive effects may serve to maintain smoking behavior. Rather than having specific effects on a narrow domain of processing, abstinence may disrupt more general cognitive control processes and/or motivation. Objectives The present study tested the prediction that overnight abstinence from smoking would disrupt a general performance monitoring system indexed via the error-related negativity (ERN). A secondary aim was to determine the extent to which performance-based monetary rewards improved the ERN among smokers and whether the effect of reward was diminished during abstinence. Methods The ERN was assessed during a flanker task among 25 heavy, non-treatment-seeking smokers both when smoking as usual and after overnight abstinence; reward and no-reward trial blocks occurred within each session. Results As predicted, mean ERN amplitude was reduced during abstinence. The ERN was enhanced by reward; this effect did not vary with smoking abstinence. Conclusion This study provides novel data that suggest acute abstinence from smoking disrupts a neurophysiological index of a general performance monitoring system that is involved in a range of cognitive functions. The ERN may be a useful complement to narrow-band cognitive studies of abstinence and interventions designed to target cognition in addiction. Because the ERN was concurrently sensitive to abstinence and performance-based incentives, it may be particularly useful for examining the interplay of cognition and motivation in smoking and smoking cessation. PMID:23681159

  10. Green Packaging Management of Logistics Enterprises

    NASA Astrophysics Data System (ADS)

    Zhang, Guirong; Zhao, Zongjian

    Starting from the connotation of green logistics management, we discuss the principles of green packaging and, at the two levels of government and enterprises, put forward specific management strategies. Green packaging can be promoted directly and indirectly through laws, regulations, taxation, institutional arrangements, and other measures. The government can also direct new investment toward the development of green packaging materials and establish specialized institutions to certify new packaging materials; the standardization of packaging must likewise be accomplished through government authority. Large-scale enterprises can reduce the use of packaging materials through standardized packaging and containerization, and can develop and adopt green, easily recyclable packaging materials for proper packaging.

  11. Logistics Reduction Technologies for Exploration Missions

    NASA Technical Reports Server (NTRS)

    Broyan, James L., Jr.; Ewert, Michael K.; Fink, Patrick W.

    2014-01-01

    Human exploration missions under study are limited by the launch mass capacity of existing and planned launch vehicles. The logistical mass of crew items is typically considered separately from the vehicle structure, habitat outfitting, and life support systems. Although mass is typically the focus of exploration missions because of its strong impact on the launch vehicle, logistics volume also needs to be considered because of its impact on the habitable volume available to the crew. NASA's Advanced Exploration Systems (AES) Logistics Reduction and Repurposing (LRR) Project is developing six logistics technologies, guided by a systems-engineering cradle-to-grave approach, to enable after-use crew items to augment vehicle systems. Specifically, AES LRR is investigating the direct reduction of clothing mass, the repurposing of logistical packaging, the use of autonomous logistics management technologies, the processing of spent crew items to benefit radiation shielding and water recovery, and the conversion of trash to propulsion gases. Reduction of mass has a corresponding and significant impact on logistical volume. The reduction of logistical volume can reduce the overall pressurized vehicle mass directly, or indirectly benefit the mission by allowing for an increase in habitable volume during the mission. The systematic implementation of these types of technologies will increase launch mass efficiency by enabling items to be used for secondary purposes and will improve the habitability of the vehicle as mission durations increase. Early studies have shown that the use of advanced logistics technologies can save approximately 20 m³ of volume during transit alone for a six-person Mars conjunction-class mission.

  12. Reverse bifurcation and fractal of the compound logistic map

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Liang, Qingyong

    2008-07-01

    The nature of the fixed points of the compound logistic map is investigated, and the boundary equation of the first bifurcation of the map in parameter space is given. Using the quantitative criteria and rules of chaotic systems, the paper reveals the general features of the compound logistic map as it transforms from regularity to chaos, with the following conclusions: (1) chaotic patterns of the map may emerge out of double-periodic bifurcation, and (2) chaotic-crisis phenomena and reverse bifurcation are found. At the same time, we analyze the orbit of the critical point of the compound logistic map and put forward a definition of the Mandelbrot-Julia set of the compound logistic map. We generalize Welstead and Cromer's periodic scanning technique and use it to construct a series of Mandelbrot-Julia sets of the compound logistic map. We investigate the symmetry of the Mandelbrot-Julia set, study the topological inflexibility of the distribution of period regions in the Mandelbrot set, and find that the Mandelbrot set contains abundant information on the structure of Julia sets, established by qualitatively constructing the whole portrait of Julia sets based on the Mandelbrot set.
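The period-doubling route to chaos that this abstract describes can be sketched with the ordinary logistic map (the compound map's exact form is not given here, so the single map stands in for it; transient length and tolerance are arbitrary choices):

```python
import numpy as np

def logistic_orbit(r, x0=0.5, transient=500, keep=64):
    """Iterate x -> r*x*(1-x), discard the transient, return attractor samples."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        x = r * x * (1 - x)
        out.append(x)
    return np.array(out)

def apparent_period(r, tol=1e-6):
    """Rough period estimate: count distinct attractor points after rounding."""
    orbit = logistic_orbit(r)
    return len(set(np.round(orbit / tol).astype(np.int64)))
```

Scanning r shows the doubling cascade: `apparent_period(2.5)` reports a fixed point, 3.2 a 2-cycle, 3.5 a 4-cycle, and 3.9 a large count typical of chaos — the same regularity-to-chaos transition the paper characterizes for the compound map.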

  13. [Unconditioned logistic regression and sample size: a bibliographic review].

    PubMed

    Ortega Calvo, Manuel; Cayuela Domínguez, Aurelio

    2002-01-01

    Unconditioned logistic regression is a highly useful risk-prediction method in epidemiology. This article reviews the solutions provided by different authors concerning the interface between the calculation of the sample size and the use of logistic regression. Based on the information initially provided, a review is made of customized regression and the predictive constriction phenomenon, the design of an ordinal exposure with a binary outcome, the events-of-interest-per-variable concept, indicator variables, the classic Freeman equation, etc. Some skeptical ideas regarding this subject are also included. PMID:12025266
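The events-per-variable concept mentioned above is often applied as a rule of thumb of roughly 10 events per candidate predictor; a minimal sketch under that convention (the function name and default are illustrative, not from the article):

```python
import math

def min_sample_size_epv(n_predictors, event_rate, epv=10):
    """Smallest n such that the rarer outcome supplies `epv` events per
    candidate predictor (the classic events-per-variable rule of thumb)."""
    rare = min(event_rate, 1 - event_rate)   # count the less frequent outcome
    return math.ceil(epv * n_predictors / rare)
```

For example, with 5 candidate predictors and a 25% event rate, the rule suggests about 200 subjects; note it scales with the rarer outcome, so a 75% event rate gives the same answer as 25%.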

  14. Establishing performance requirements of computer based systems subject to uncertainty

    SciTech Connect

    Robinson, D.

    1997-02-01

    An organized systems design approach is dictated by the increasing complexity of computer based systems. Computer based systems are unique in many respects but share many of the same problems that have plagued design engineers for decades. The design of complex systems is difficult at best, but as a design becomes intensively dependent on the computer processing of external and internal information, the design process quickly borders on chaos. This situation is exacerbated by the requirement that these systems operate with a minimal quantity of information, generally corrupted by noise, regarding the current state of the system. Establishing performance requirements for such systems is particularly difficult. This paper briefly sketches a general systems design approach with emphasis on the design of computer based decision processing systems subject to parameter and environmental variation. The approach will be demonstrated with application to an on-board diagnostic (OBD) system for automotive emissions systems now mandated by the state of California and the Federal Clean Air Act. The emphasis is on an approach for establishing probabilistically based performance requirements for computer based systems.

  15. Confidence measure and performance evaluation for HRRR-based classifiers

    NASA Astrophysics Data System (ADS)

    Rago, Constantino; Zajic, Tim; Huff, Melvyn; Mehra, Raman K.; Mahler, Ronald P. S.; Noviskey, Michael J.

    2002-07-01

    The work presented here is a continuation of research first reported in Mahler et al. Our earlier efforts included integrating the Statistical Features algorithm with a Bayesian nonlinear filter, allowing simultaneous determination of target position, velocity, pose and type via maximum a posteriori estimation. We then considered three alternative classifiers: the first based on a principal component decomposition, the second on a linear discriminant approach, and the third on a wavelet representation. In addition, preliminary results were given with regard to assigning a measure of confidence to the output of the wavelet based classifier. In this paper we continue to address the problem of target classification based on high range resolution radar signatures. In particular, we examine the performance of a variant of the principal component based classifier as the number of principal components is varied. We have chosen to quantify the performance in terms of the Bhattacharyya distance. We also present further results regarding the assignment of confidence values to the output of the wavelet based classifier.

  16. Improving the performance of a filling line based on simulation

    NASA Astrophysics Data System (ADS)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. This study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations formed the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate, and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
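The NPV and ROI figures used to compare the scenarios follow standard definitions; a minimal sketch (this simplification treats depreciation, CIT, and inflation as already folded into the cash flows, unlike the fuller analysis in the paper):

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the time-0 cash flow
    (the investment outlay, entered as a negative number)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def roi(total_return, cost):
    """Simple return on investment: net gain relative to cost."""
    return (total_return - cost) / cost
```

An improvement scenario costing 1000 that returns 500 per year for three years has a positive NPV at a 10% discount rate, so it would pass this screen even before the tax and inflation adjustments.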

  17. Performance characterization of structured light-based fingerprint scanner

    NASA Astrophysics Data System (ADS)

    Hassebrook, Laurence G.; Wang, Minghao; Daley, Raymond C.

    2013-05-01

    Our group believes that the evolution of fingerprint capture technology is in transition to include 3-D non-contact fingerprint capture. More specifically we believe that systems based on structured light illumination provide the highest level of depth measurement accuracy. However, for these new technologies to be fully accepted by the biometric community, they must be compliant with federal standards of performance. At present these standards do not exist for this new biometric technology. We propose and define a set of test procedures to be used to verify compliance with the Federal Bureau of Investigation's image quality specification for Personal Identity Verification single fingerprint capture devices. The proposed test procedures include: geometric accuracy, lateral resolution based on intensity or depth, gray level uniformity and flattened fingerprint image quality. Several 2-D contact analogies, performance tradeoffs and optimization dilemmas are evaluated and proposed solutions are presented.

  18. Tools for evaluating team performance in simulation-based training

    PubMed Central

    Rosen, Michael A; Weaver, Sallie J; Lazzara, Elizabeth H; Salas, Eduardo; Wu, Teresa; Silvestri, Salvatore; Schiebel, Nicola; Almeida, Sandra; King, Heidi B

    2010-01-01

    Teamwork training constitutes one of the core approaches for moving healthcare systems toward increased levels of quality and safety, and simulation provides a powerful method of delivering this training, especially for fast-paced and dynamic specialty areas such as Emergency Medicine. Team performance measurement and evaluation plays an integral role in ensuring that simulation-based training for teams (SBTT) is systematic and effective. However, this component of SBTT systems is frequently overlooked. This article addresses this gap by providing a review and practical introduction to the process of developing and implementing evaluation systems in SBTT. First, an overview of team performance evaluation is provided. Second, best practices for measuring team performance in simulation are reviewed. Third, some of the prominent measurement tools in the literature are summarized and discussed relative to the best practices. Subsequently, implications of the review are discussed for the practice of training teamwork in Emergency Medicine. PMID:21063558

  19. Infrared target tracking with kernel-based performance metric and eigenvalue-based similarity measure

    NASA Astrophysics Data System (ADS)

    Ling, Jianguo; Liu, Erqi; Liang, Haiyan; Yang, Jie

    2007-06-01

    An infrared target tracking framework is presented that consists of three main parts: mean shift tracking, its tracking performance evaluation, and position correction. The mean shift tracking algorithm, a widely used kernel-based method, has been adopted for the initial tracking because of its efficiency and effectiveness. A performance evaluation module is applied for the online evaluation of its tracking performance with a kernel-based metric, unifying the tracking and the performance metric within a kernel-based tracking framework. The tracking performance evaluation result is then input into a controller, which decides whether to trigger a position correction process. The position correction module employs a matching method with a new eigenvalue-based similarity measure computed from a local complexity degree weighted covariance matrix. Experimental results on real-life infrared image sequences are presented to demonstrate the efficacy of the proposed method.

  20. ASP Performance Assessment: Toward a Science-Based Understanding

    SciTech Connect

    Sale, K

    2008-04-22

    Several approaches to assessing ASP performance can be contemplated. Perhaps the ideal would be a full cost/benefit analysis (which is probably utterly infeasible). Another approach would be a test-based figure-of-merit (FOM); this approach has the virtue of being quantitative and the challenge that each customer and application would be characterized by a different FOM. The alternative proposed here is an approach that uses information about the limits of detection of real instruments to support informed judgments.

  1. Estimation of Ultrafilter Performance Based on Characterization Data

    SciTech Connect

    Peterson, Reid A.; Geeting, John GH; Daniel, Richard C.

    2007-08-02

    Due to limited availability of test data with actual waste samples, a method was developed to estimate expected filtration performance based on physical characterization data for the Hanford Waste Treatment and Immobilization Plant. A test with simulated waste was analyzed to demonstrate that filtration of this class of waste is consistent with a concentration polarization model. Subsequently, filtration data from actual waste samples were analyzed to demonstrate that centrifuged solids concentrations provide a reasonable estimate of the limiting concentration for filtration.
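The concentration-polarization model referenced above is commonly written as J = k·ln(c_limit/c_bulk), with the centrifuged-solids concentration serving as the limiting concentration; a sketch under that assumption (symbol names are illustrative, not from the report):

```python
import math

def permeate_flux(k_mass_transfer, c_limit, c_bulk):
    """Concentration-polarization (gel) model of crossflow filtration:
    flux falls logarithmically as the bulk solids concentration
    approaches the limiting concentration, J = k * ln(c_limit / c_bulk)."""
    return k_mass_transfer * math.log(c_limit / c_bulk)
```

Flux drops to zero as the bulk concentration approaches the limiting value, which is why a centrifuged-solids measurement can bound the achievable concentration during filtration.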

  2. A High Performance Content Based Recommender System Using Hypernym Expansion

    SciTech Connect

    Potok, Thomas E; Patton, Robert M

    2015-10-20

    There are two major limitations in content-based recommender systems, the first is accurately measuring the similarity of preferred documents to a large set of general documents, and the second is over-specialization which limits the "interesting" documents recommended from a general document set. To address these issues, we propose combining linguistic methods and term frequency methods to improve overall performance and recommendation.

  3. Performance analysis of vortex based mixers for confined flows

    NASA Astrophysics Data System (ADS)

    Buschhagen, Timo

    The hybrid rocket is still sparsely employed within major space or defense projects due to its relatively poor combustion efficiency and low fuel grain regression rate. Although hybrid rockets can claim advantages in safety, environmental, and performance aspects over established solid and liquid propellant systems, the boundary layer combustion process and the diffusion-based mixing within a hybrid rocket grain port leave the core flow unmixed and limit the system performance. One principle used to enhance the mixing of gaseous flows is to induce streamwise vorticity. The counter-rotating vortex pair (CVP) mixer utilizes this principle and introduces two vortices into a confined flow, generating a stirring motion that transports near-wall media towards the core and vice versa. Recent studies investigated the velocity field introduced by this type of swirler. The current work evaluates the mixing performance of the CVP concept, using an experimental setup that simulates an axial primary pipe flow with a radially entering secondary flow; the primary flow is altered by the CVP swirler unit. The resulting setup therefore emulates a hybrid rocket motor with a cylindrical single-port grain. To evaluate the mixing performance, the secondary flow concentration at the pipe assembly exit is measured using a pressure-sensitive-paint-based procedure.

  4. A Comparison of Hierarchical and Nonhierarchical Logistic Regression for Estimating Cutoff Scores in Course Placement.

    ERIC Educational Resources Information Center

    Schulz, E. Matthew; Betebenner, Damian; Ahn, Meeyeon

    This study was performed to determine whether hierarchical logistic regression models could reduce the sample size requirements of ordinary (nonhierarchical) logistic regression models. Data from courses with varying class size were randomly partitioned into two halves per course. Grades of students in college algebra courses were obtained from 40…

  5. ASCAL: A Microcomputer Program for Estimating Logistic IRT Item Parameters.

    ERIC Educational Resources Information Center

    Vale, C. David; Gialluca, Kathleen A.

    ASCAL is a microcomputer-based program for calibrating items according to the three-parameter logistic model of item response theory. It uses a modified multivariate Newton-Raphson procedure for estimating item parameters. This study evaluated this procedure using Monte Carlo Simulation Techniques. The current version of ASCAL was then compared to…
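The three-parameter logistic (3PL) model that ASCAL calibrates gives the probability of a correct item response as a function of examinee ability; a standard formulation with the conventional scaling constant D = 1.7 (a sketch of the model, not of ASCAL's Newton-Raphson estimation itself):

```python
import math

def p_correct_3pl(theta, a, b, c, D=1.7):
    """Three-parameter logistic IRT model: probability of a correct response
    given ability theta, discrimination a, difficulty b, and guessing c."""
    return c + (1 - c) / (1 + math.exp(-D * a * (theta - b)))
```

At theta equal to the item difficulty b, the probability is (1 + c)/2, i.e. halfway between the guessing floor c and certainty; calibration works backwards from observed responses to estimate a, b, and c for each item.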

  6. Perfectionism in Anorexia Nervosa: Novel Performance Based Evidence

    PubMed Central

    Lloyd, Samantha; Yiend, Jenny; Schmidt, Ulrike; Tchanturia, Kate

    2014-01-01

    Existing research into perfectionism in Anorexia Nervosa (AN) is limited by a reliance upon self-report measures. This study used novel performance based measures to investigate whether there is behavioural evidence for elevated perfectionism in AN. 153 participants took part in the study – 81 with a diagnosis of AN and 72 healthy controls (HCs). Participants completed two performance based tasks assessing perfectionism – a text replication task and a bead sorting task – along with self-report measures of perfectionism. Significant group differences were observed on both tasks. In the text replication task the AN group took significantly longer compared with healthy controls (p = 0.03, d = 0.36) and produced significantly higher quality copies (p = <0.01, d = 0.45). In the bead sorting task, there was a trend towards more participants in the AN group choosing to check their work compared with the HC group (p = 0.07, d = 0.30) and the AN group took significantly longer checking than those in the HC group (p = <0.01, d = 0.45). Only copy quality uniquely predicted scores on self report measures of perfectionism. This study provides empirically tested evidence of elevated performance based perfectionism in AN compared with a healthy control group. PMID:25360690

  7. Parallel performance optimizations on unstructured mesh-based simulations

    DOE PAGES Beta

    Sarje, Abhinav; Song, Sukhyun; Jacobsen, Douglas; Huck, Kevin; Hollingsworth, Jeffrey; Malony, Allen; Williams, Samuel; Oliker, Leonid

    2015-06-01

    This paper addresses two key parallelization challenges in the unstructured mesh-based ocean modeling code MPAS-Ocean, which uses a mesh based on Voronoi tessellations: (1) load imbalance across processes, and (2) unstructured data access patterns that inhibit intra- and inter-node performance. Our work analyzes the load imbalance due to naive partitioning of the mesh and develops methods to generate mesh partitionings with better load balance and reduced communication. Furthermore, we present methods that minimize both inter- and intra-node data movement and maximize data reuse. Our techniques include predictive ordering of data elements for higher cache efficiency, as well as communication-reduction approaches. We present detailed performance data from runs on thousands of cores of the Cray XC30 supercomputer and show that our optimization strategies can exceed the original performance by over 2×. Additionally, many of these solutions can be broadly applied to a wide variety of unstructured grid-based computations.

  8. Performance-based competencies for culturally responsive interprofessional collaborative practice.

    PubMed

    Banfield, Valerie; Lackie, Kelly

    2009-11-01

    This paper will highlight how a literature review and stakeholder-expert feedback guided the creation of an interprofessional facilitator-collaborator competency tool, which was then used to design an interprofessional facilitator development program for the Partners for Interprofessional Cancer Education (PICE) Project. Cancer Care Nova Scotia (CCNS), one of the PICE Project partners, uses an Interprofessional Core Curriculum (ICC) to provide continuing education workshops to community-based practitioners, who as a portion of their practice, care for patients experiencing cancer. In order to deliver this curriculum, health professionals from a variety of disciplines required education that would enable them to become culturally sensitive interprofessional educators in promoting collaborative patient-centred practice. The Registered Nurses Professional Development Centre (RN-PDC), another PICE Project partner, has expertise in performance-based certification program design and utilizes a competency-based methodology in its education framework. This framework and methodology was used to develop the necessary interprofessional facilitator competencies that incorporate the knowledge, skills, and attitudes required for performance. Three main competency areas evolved, each with its own set of competencies, performance criteria and behavioural indicators.

  9. Perfectionism in anorexia nervosa: novel performance based evidence.

    PubMed

    Lloyd, Samantha; Yiend, Jenny; Schmidt, Ulrike; Tchanturia, Kate

    2014-01-01

    Existing research into perfectionism in Anorexia Nervosa (AN) is limited by a reliance upon self-report measures. This study used novel performance based measures to investigate whether there is behavioural evidence for elevated perfectionism in AN. 153 participants took part in the study--81 with a diagnosis of AN and 72 healthy controls (HCs). Participants completed two performance based tasks assessing perfectionism--a text replication task and a bead sorting task--along with self-report measures of perfectionism. Significant group differences were observed on both tasks. In the text replication task the AN group took significantly longer compared with healthy controls (p = 0.03, d = 0.36) and produced significantly higher quality copies (p = <0.01, d = 0.45). In the bead sorting task, there was a trend towards more participants in the AN group choosing to check their work compared with the HC group (p = 0.07, d = 0.30) and the AN group took significantly longer checking than those in the HC group (p = <0.01, d = 0.45). Only copy quality uniquely predicted scores on self report measures of perfectionism. This study provides empirically tested evidence of elevated performance based perfectionism in AN compared with a healthy control group. PMID:25360690

  10. Criteria for Identifying Radiologists with Acceptable Screening Mammography Interpretive Performance based on Multiple Performance Measures

    PubMed Central

    Miglioretti, Diana L.; Ichikawa, Laura; Smith, Robert A.; Bassett, Lawrence W.; Feig, Stephen A.; Monsees, Barbara; Parikh, Jay R.; Rosenberg, Robert D.; Sickles, Edward A.; Carney, Patricia A.

    2014-01-01

    Objective Using a combination of performance measures, we updated previously proposed criteria for identifying physicians whose performance interpreting screening mammograms may indicate suboptimal interpretation skills. Materials and Methods In this Institutional Review Board-approved, HIPAA-compliant study, six expert breast imagers used a method based on the Angoff approach to update criteria for acceptable mammography performance on the basis of combined performance measures: (Group 1) sensitivity and specificity, for facilities with complete capture of false-negative cancers; and (Group 2) cancer detection rate (CDR), recall rate, and positive predictive value of a recall (PPV1), for facilities that cannot capture false negatives, but have reliable cancer follow-up information for positive mammograms. Decisions were informed by normative data from the Breast Cancer Surveillance Consortium (BCSC). Results Updated, combined ranges for acceptable sensitivity and specificity of screening mammography are: (1) sensitivity ≥80% and specificity ≥85% or (2) sensitivity 75–79% and specificity 88–97%. Updated ranges for CDR, recall rate, and PPV1 are: (1) CDR ≥6/1000, recall rate 3–20%, and any PPV1; (2) CDR 4–6/1000, recall rate 3–15%, and PPV1 ≥3%; or (3) CDR 2.5–4/1000, recall rate 5–12%, and PPV1 3–8%. Using the original criteria, 51% of BCSC radiologists had acceptable sensitivity and specificity; 40% had acceptable CDR, recall rate, and PPV1. Using the combined criteria, 69% had acceptable sensitivity and specificity and 62% had acceptable CDR, recall rate, and PPV1. Conclusion The combined criteria improve previous criteria by considering the inter-relationships of multiple performance measures and broaden the acceptable performance ranges compared to previous criteria based on individual measures. PMID:25794100
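The Group 2 acceptability ranges listed above can be encoded as a simple check; boundary handling at 4 and 6 per 1000 is ambiguous in the abstract, so the half-open intervals below are an assumption:

```python
def meets_cdr_criteria(cdr_per_1000, recall_pct, ppv1_pct):
    """Check the updated combined acceptability ranges for cancer detection
    rate (CDR), recall rate, and PPV1 from the abstract.  The choice of
    half-open intervals at the CDR cut points is an assumption."""
    if cdr_per_1000 >= 6 and 3 <= recall_pct <= 20:
        return True                      # range (1): any PPV1 is acceptable
    if 4 <= cdr_per_1000 < 6 and 3 <= recall_pct <= 15 and ppv1_pct >= 3:
        return True                      # range (2)
    if 2.5 <= cdr_per_1000 < 4 and 5 <= recall_pct <= 12 and 3 <= ppv1_pct <= 8:
        return True                      # range (3)
    return False
```

The combined criteria are deliberately looser at high CDR (any PPV1 qualifies) and tighter at low CDR, which is how they accommodate the inter-relationships among the three measures.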

  11. FUTURE LOGISTICS AND OPERATIONAL ADAPTABILITY

    SciTech Connect

    Houck, Roger P.

    2009-10-01

    While we cannot predict the future, we can ascertain trends and examine them through the use of alternative futures methodologies and tools. From a logistics perspective, we know that many different futures are possible, all of which are obviously dependent on decisions we make in the present. As professional logisticians we are obligated to provide the field - our Soldiers - with our best professional opinion of what will result in success on the battlefield. Our view of the future should take history and contemporary conflict into account, but it must also consider that continuity with the past cannot be taken for granted. If we are too focused on past and current experience, then our vision of the future will be limited indeed. On the one hand, the future must be explained in language that does not defy common sense. On the other hand, the pace of change is such that we must conduct qualitative and quantitative trend analyses, forecasting, and explorative scenario development in ways that allow for significant breaks - or "shocks" - that may "change the game". We will need capabilities and solutions that are constantly evolving - and improving - to match the operational tempo of a radically changing threat environment. For those who provide quartermaster services, this article will briefly examine what this means from the perspective of creating what might be termed a preferred future.

  12. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial-order methodology can be helpful. In the context described here, partial-order analysis can be seen as an ordinal analysis of data matrices, used especially to simplify the relative comparison of objects according to their data profiles (the ordered set of values an object has). Hence, partial-order methodology offers a unique possibility for evaluating analytical performance. In the present paper, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods, or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate analytical performance taking all indicators into account simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation, and the skewness, which are used simultaneously for the evaluation of analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently (2) a "distance" to the reference laboratory, and (3) a classification based on the concept of "peculiar points".
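The core of the partial-order comparison is profile dominance; a minimal sketch, assuming every indicator has been oriented so that smaller values are closer to the true value (the function names are illustrative):

```python
def dominates(p, q):
    """Profile p dominates profile q when p is at least as good on every
    indicator and strictly better on at least one (smaller = better)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def comparable(p, q):
    """Two profiles are comparable when one dominates the other (or they
    are identical); otherwise they are incomparable in the partial order."""
    return p == q or dominates(p, q) or dominates(q, p)
```

Profiles where neither laboratory dominates the other are incomparable, and these are exactly the cases a single linear score would force into an arbitrary ranking.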

  13. Testing Game-Based Performance in Team-Handball.

    PubMed

    Wagner, Herbert; Orwat, Matthias; Hinz, Matthias; Pfusterschmied, Jürgen; Bacharach, David W; von Duvillard, Serge P; Müller, Erich

    2016-10-01

    Wagner, H, Orwat, M, Hinz, M, Pfusterschmied, J, Bacharach, DW, von Duvillard, SP, and Müller, E. Testing game-based performance in team-handball. J Strength Cond Res 30(10): 2794-2801, 2016. Team-handball is a fast-paced game of defensive and offensive action that includes specific movements of jumping, passing, throwing, checking, and screening. To date and to the best of our knowledge, a game-based performance test (GBPT) for team-handball does not exist. Therefore, the aim of this study was to develop and validate such a test. Seventeen experienced team-handball players performed 2 GBPTs separated by 7 days between each test, an incremental treadmill running test, and a team-handball test game (TG) (2 × 20 minutes). Peak oxygen uptake (V̇O2peak), blood lactate concentration (BLC), heart rate (HR), sprinting time, time of offensive and defensive actions as well as running intensities, ball velocity, and jump height were measured in the game-based test. Reliability of the tests was calculated using an intraclass correlation coefficient (ICC). Additionally, we measured V̇O2peak in the incremental treadmill running test and BLC, HR, and running intensities in the team-handball TG to determine the validity of the GBPT. For the test-retest reliability, we found an ICC >0.70 for the peak BLC and HR, mean offense and defense time, as well as ball velocity, and an ICC >0.90 for the V̇O2peak in the GBPT. Walking and standing constituted 73% of total time. Moderate (18%) and high (9%) intensity running in the GBPT was similar to the team-handball TG. Our results indicated that the GBPT is a valid and reliable test to analyze team-handball performance (physiological and biomechanical variables) under conditions similar to competition.

  14. Cluster-localized sparse logistic regression for SNP data.

    PubMed

    Binder, Harald; Müller, Tina; Schwender, Holger; Golka, Klaus; Steffens, Michael; Hengstler, Jan G; Ickstadt, Katja; Schumacher, Martin

    2012-08-14

    The task of analyzing high-dimensional single nucleotide polymorphism (SNP) data in a case-control design using multivariable techniques has only recently been tackled. While many available approaches investigate only main effects in a high-dimensional setting, we propose a more flexible technique, cluster-localized regression (CLR), based on localized logistic regression models, that allows different SNPs to have an effect for different groups of individuals. Separate multivariable regression models are fitted for the different groups of individuals by incorporating weights into componentwise boosting, which provides simultaneous variable selection, hence sparse fits. For model fitting, these groups of individuals are identified using a clustering approach, where each group may be defined via different SNPs. This allows for representing complex interaction patterns, such as compositional epistasis, that might not be detected by a single main effects model. In a simulation study, the CLR approach results in improved prediction performance, compared to the main effects approach, and identification of important SNPs in several scenarios. Improved prediction performance is also obtained for an application example considering urinary bladder cancer. Some of the identified SNPs are predictive for all individuals, while others are only relevant for a specific group. Together with the sets of SNPs that define the groups, potential interaction patterns are uncovered.
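The two-stage structure of the approach can be illustrated with a minimal stand-in (not the authors' implementation): individuals are grouped by k-means, then a separate logistic model is fitted per group. The paper obtains sparse fits via componentwise boosting; a plain gradient-ascent logistic fit is used here purely to show the cluster-localized structure. All data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k=2, iters=25):
    """Naive k-means: assign to nearest center, recompute centers."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None, :] - centers) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels

def fit_logistic(X, y, lr=0.5, steps=2000):
    """Unpenalized logistic regression by gradient ascent on the log-likelihood."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

X = rng.integers(0, 3, size=(200, 5)).astype(float)   # SNPs coded 0/1/2
group = kmeans(X)                                     # stage 1: data-driven grouping
# A different SNP drives the outcome in each group (compositional-epistasis-like).
y = np.where(group == 0, X[:, 0] == 2, X[:, 1] == 2).astype(float)

X1 = np.hstack([np.ones((len(X), 1)), X])             # add intercept column
models = {g: fit_logistic(X1[group == g], y[group == g]) for g in (0, 1)}
for g, w in models.items():
    print("group", g, "coefficients:", np.round(w, 2))
```

A single main-effects model fitted to all 200 individuals would average the two group-specific effects away, which is the failure mode the localized models are designed to avoid.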

  15. Visualizing the Logistic Map with a Microcontroller

    ERIC Educational Resources Information Center

    Serna, Juan D.; Joshi, Amitabh

    2012-01-01

    The logistic map is one of the simplest nonlinear dynamical systems that clearly exhibits the route to chaos. In this paper, we explore the evolution of the logistic map using an open-source microcontroller connected to an array of light-emitting diodes (LEDs). We divide the one-dimensional domain interval [0,1] into ten equal parts, an associate…
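The visualization idea translates directly into a few lines of code. The sketch below assumes, as in the article's setup, that each LED corresponds to one tenth of [0, 1]; the parameter values are illustrative, not taken from the paper.

```python
def logistic_map(r, x0, n):
    """Yield n iterates of x -> r*x*(1-x) starting from x0."""
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        yield x

def led_index(x):
    """Which of the ten equal subintervals of [0, 1] contains x (0..9)."""
    return min(int(x * 10), 9)

# r = 2.8: a single attracting fixed point, so one LED stays lit after transients.
orbit = list(logistic_map(2.8, 0.2, 200))
print(sorted({led_index(x) for x in orbit[-50:]}))   # -> [6]

# r = 3.9: chaotic regime; the orbit wanders over many LEDs.
orbit = list(logistic_map(3.9, 0.2, 200))
print(sorted({led_index(x) for x in orbit[-50:]}))
```

The first run settles on the fixed point x* = 1 - 1/r ≈ 0.643 (LED 6), while the second lights most of the array, which is the route-to-chaos contrast the LED display makes visible.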

  16. Exploration Mission Benefits From Logistics Reduction Technologies

    NASA Technical Reports Server (NTRS)

    Broyan, James Lee, Jr.; Ewert, Michael K.; Schlesinger, Thilini

    2016-01-01

    Technologies that reduce logistical mass, volume, and the crew time dedicated to logistics management become more important as exploration missions extend further from the Earth. Even modest reductions in logistical mass can have a significant impact because they also reduce the packaging burden. NASA's Advanced Exploration Systems' Logistics Reduction Project is developing technologies that can directly reduce the mass and volume of crew clothing and metabolic waste collection. Also, cargo bags have been developed that can be reconfigured for crew outfitting, and trash processing technologies are under development to increase habitable volume and improve protection against solar storm events. Additionally, Mars-class missions are sufficiently distant that even logistics management without resupply can be problematic because of the communication time delay with Earth. Although exploration vehicles are launched with all consumables and logistics in a defined configuration, the configuration continually changes as the mission progresses. Traditionally, significant ground and crew time has been required to understand the evolving configuration and to help locate misplaced items. For key mission events and unplanned contingencies, the crew will not be able to rely on the ground for logistics localization assistance. NASA has been developing a radio-frequency-identification autonomous logistics management system to reduce crew time for general inventory and enable greater crew self-response to unplanned events when a wide range of items may need to be located in a very short time period. This paper provides a status of the technologies being developed and their mission benefits for exploration missions.


  17. The Automated Logistics Element Planning System (ALEPS)

    NASA Technical Reports Server (NTRS)

    Schwaab, Douglas G.

    1991-01-01

    This paper describes the design and functions of ALEPS (Automated Logistics Element Planning System), a computer system that will automate planning and decision support for Space Station Freedom Logistics Elements (LEs) resupply and return operations. ALEPS provides data management, planning, analysis, monitoring, interfacing, and flight certification in support of LE flight load planning activities. The prototype ALEPS algorithm development is described.

  18. Standards for Standardized Logistic Regression Coefficients

    ERIC Educational Resources Information Center

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
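One concrete way to construct such coefficients (a common latent-variable proposal, not necessarily the single approach the article recommends) scales each logit coefficient by the standard deviation of its predictor and by the standard deviation of the latent outcome, where the standard logistic error contributes variance π²/3. All numbers below are invented for illustration.

```python
import math

PI2_3 = math.pi ** 2 / 3  # variance of the standard logistic error distribution

def standardize(b, sd_x, var_linear_predictor):
    """Fully standardized logit coefficients: b* = b * sd(x) / sd(latent y)."""
    sd_latent = math.sqrt(var_linear_predictor + PI2_3)
    return [bi * si / sd_latent for bi, si in zip(b, sd_x)]

b = [0.8, -1.2]   # hypothetical raw logit coefficients
sd_x = [2.0, 0.5] # hypothetical predictor standard deviations
var_lp = 1.1      # hypothetical sample variance of the linear predictor b'x
print([round(v, 3) for v in standardize(b, sd_x, var_lp)])
```

On this scale the two coefficients become directly comparable, analogous to beta weights in linear regression: a one-SD change in each predictor shifts the latent outcome by b* latent-SD units.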

  19. The Automated Logistics Element Planning System (ALEPS)

    NASA Technical Reports Server (NTRS)

    Schwaab, Douglas G.

    1992-01-01

    ALEPS, which is being developed to provide the SSF program with a computer system to automate logistics resupply/return cargo load planning and verification, is presented. ALEPS will make it possible to simultaneously optimize both the resupply flight load plan and the return flight reload plan for any of the logistics carriers. In the verification mode ALEPS will support the carrier's flight readiness reviews and control proper execution of the approved plans. It will also support the SSF inventory management system by providing electronic block updates to the inventory database on the cargo arriving at or departing the station aboard a logistics carrier. A prototype drawer packing algorithm is described which is capable of generating solutions for 3D packing of cargo items into a logistics carrier storage accommodation. It is concluded that ALEPS will provide the capability to generate and modify optimized loading plans for the logistics elements fleet.

  20. Do Performance-Based Codes Support Universal Design in Architecture?

    PubMed

    Grangaard, Sidse; Frandsen, Anne Kathrine

    2016-01-01

    The research project 'An analysis of the accessibility requirements' studies how Danish architectural firms experience the accessibility requirements of the Danish Building Regulations and examines their opinions on how future regulative models can support innovative and inclusive design - Universal Design (UD). The empirical material consists of input from six workshops to which all 700 Danish architectural firms were invited, as well as eight group interviews. The analysis shows that the current prescriptive requirements are criticized for being too homogeneous, and possibilities for differentiation and zoning are called for. Therefore, a majority of professionals are interested in a performance-based model because they think that such a model will support 'accessibility zoning', achieving flexibility through different levels of accessibility in a building according to its performance. The common understanding of accessibility and UD is directly related to buildings like hospitals and care centers. When the objective is both innovative and inclusive architecture, the request for a performance-based model should be followed up by a knowledge-enhancement effort in the building sector. Bloom's taxonomy of educational objectives is suggested as a tool for such a boost. The research project has been financed by the Danish Transport and Construction Agency. PMID:27534292

  1. Do Performance-Based Codes Support Universal Design in Architecture?

    PubMed

    Grangaard, Sidse; Frandsen, Anne Kathrine

    2016-01-01

    The research project 'An analysis of the accessibility requirements' studies how Danish architectural firms experience the accessibility requirements of the Danish Building Regulations and examines their opinions on how future regulative models can support innovative and inclusive design - Universal Design (UD). The empirical material consists of input from six workshops to which all 700 Danish architectural firms were invited, as well as eight group interviews. The analysis shows that the current prescriptive requirements are criticized for being too homogeneous, and possibilities for differentiation and zoning are called for. Therefore, a majority of professionals are interested in a performance-based model because they think that such a model will support 'accessibility zoning', achieving flexibility through different levels of accessibility in a building according to its performance. The common understanding of accessibility and UD is directly related to buildings like hospitals and care centers. When the objective is both innovative and inclusive architecture, the request for a performance-based model should be followed up by a knowledge-enhancement effort in the building sector. Bloom's taxonomy of educational objectives is suggested as a tool for such a boost. The research project has been financed by the Danish Transport and Construction Agency.

  2. Driver performance-based assessment of thermal display degradation effects

    NASA Astrophysics Data System (ADS)

    Ruffner, John W.; Massimi, Michael S.; Choi, Yoon S.; Ferrett, Donald A.

    1998-07-01

    The Driver's Vision Enhancer (DVE) is a thermal sensor and display combination currently being procured for use in U.S. Army combat and tactical wheeled vehicles. During the DVE production process, a given number of sensor or display pixels may either vary from the desired luminance values (nonuniform) or be inactive (nonresponsive). The amount and distribution of pixel luminance nonuniformity (NU) and nonresponsivity (NR) allowable in production DVEs is a significant cost factor. No driver performance-based criteria exist for determining the maximum amount of allowable NU and NR. For safety reasons, these characteristics are specified conservatively. This paper describes an experiment to assess the effects of different levels of display NU and NR on Army drivers' ability to identify scene features and obstacles using a simulated DVE display and videotaped driving scenarios. Baseline, NU, and NR display conditions were simulated using real-time image processing techniques and a computer graphics workstation. The results indicate that there is a small, but statistically insignificant decrease in identification performance with the NU conditions tested. The pattern of the performance-based results is consistent with drivers' subjective assessments of display adequacy. The implications of the results for specifying NU and NR criteria for the DVE display are discussed.

  3. Spatial correlation in Bayesian logistic regression with misclassification.

    PubMed

    Bihrmann, Kristine; Toft, Nils; Nielsen, Søren Saxmose; Ersbøll, Annette Kjær

    2014-06-01

    Standard logistic regression assumes that the outcome is measured perfectly. In practice, this is often not the case, which could lead to biased estimates if not accounted for. This study presents Bayesian logistic regression with adjustment for misclassification of the outcome applied to data with spatial correlation. The models assessed include a fixed effects model, an independent random effects model, and models with spatially correlated random effects modelled using conditional autoregressive prior distributions (ICAR and ICAR(ρ)). Performance of these models was evaluated in a simulation study. Parameters were estimated by Markov Chain Monte Carlo methods, using slice sampling to improve convergence. The results demonstrated that adjustment for misclassification must be included to produce unbiased regression estimates. With strong correlation the ICAR model performed best. With weak or moderate correlation the ICAR(ρ) performed best. With unknown spatial correlation the recommended model would be the ICAR(ρ), assuming convergence can be obtained. PMID:24889989
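The misclassification adjustment at the core of such models can be sketched in a simplified, non-Bayesian form without the spatial term: with sensitivity Se and specificity Sp of the outcome measurement, the probability of *observing* a positive is Se·p + (1 - Sp)·(1 - p), where p = logistic(x'β), and the likelihood is built from that observed probability rather than from p. Data and parameter values below are invented.

```python
import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def obs_prob(p, se, sp):
    """P(observed positive) given true-positive probability p and test Se/Sp."""
    return se * p + (1.0 - sp) * (1.0 - p)

def log_lik(beta0, beta1, data, se, sp):
    """Bernoulli log-likelihood built on the misclassification-adjusted probability."""
    ll = 0.0
    for x, y in data:
        q = obs_prob(logistic(beta0 + beta1 * x), se, sp)
        ll += y * math.log(q) + (1 - y) * math.log(1.0 - q)
    return ll

data = [(0.0, 0), (1.0, 1), (2.0, 1), (-1.0, 0)]   # toy (x, observed y) pairs
print(round(log_lik(-0.5, 1.0, data, se=0.9, sp=0.95), 3))
```

Setting se = sp = 1 recovers the standard logistic likelihood, which makes explicit why ignoring imperfect measurement biases the estimates: the two likelihoods place their maxima at different β.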

  4. Performance modeling of feature-based classification in SAR imagery

    NASA Astrophysics Data System (ADS)

    Boshra, Michael; Bhanu, Bir

    1998-09-01

    We present a novel method for modeling the performance of a vote-based approach for target classification in SAR imagery. In this approach, the geometric locations of the scattering centers are used to represent 2D model views of a 3D target for a specific sensor under a given viewing condition (azimuth, depression and squint angles). Performance of such an approach is modeled in the presence of data uncertainty, occlusion, and clutter. The proposed method captures the structural similarity between model views, which plays an important role in determining the classification performance. In particular, performance would improve if the model views are dissimilar and vice versa. The method consists of the following steps. In the first step, given a bound on data uncertainty, model similarity is determined by finding feature correspondence in the space of relative translations between each pair of model views. In the second step, statistical analysis is carried out in the vote, occlusion and clutter space, in order to determine the probability of misclassifying each model view. In the third step, the misclassification probability is averaged for all model views to estimate the probability-of-correct-identification (PCI) plot as a function of occlusion and clutter rates. Validity of the method is demonstrated by comparing predicted PCI plots with ones that are obtained experimentally. Results are presented using both XPATCH and MSTAR SAR data.

  5. Performance analysis of charge plasma based dual electrode tunnel FET

    NASA Astrophysics Data System (ADS)

    Anand, Sunny; Intekhab Amin, S.; Sarin, R. K.

    2016-05-01

    This paper proposes the charge plasma based dual electrode doping-less tunnel FET (DEDLTFET). The paper compares the device performance of the conventional doping-less TFET (DLTFET) and doped TFET (DGTFET). DEDLTFET gives superior results, with a high ON-state current (I_ON ∼ 0.56 mA/μm), an I_ON/I_OFF ratio ∼ 9.12 × 10¹³, and an average subthreshold swing (AV-SS ∼ 48 mV/dec). Variations of different device parameters such as channel length, gate oxide material, gate oxide thickness, silicon thickness, gate work function, and temperature are examined and compared with DLTFET and DGTFET. Through this extensive analysis it is found that DEDLTFET shows better performance than the other two devices, which points to an excellent future in low-power applications.

  6. GPU-based high-performance computing for radiation therapy.

    PubMed

    Jia, Xun; Ziegenhein, Peter; Jiang, Steve B

    2014-02-21

    Recent developments in radiation therapy demand high computation power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous amount of study has been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPU with other platforms will also be presented.

  7. GPU-based High-Performance Computing for Radiation Therapy

    PubMed Central

    Jia, Xun; Ziegenhein, Peter; Jiang, Steve B.

    2014-01-01

    Recent developments in radiation therapy demand high computation power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous number of studies have been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this article, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPU with other platforms will also be presented. PMID:24486639

  8. Transportation mode performance comparison for a sustained manned Mars base

    NASA Technical Reports Server (NTRS)

    Hoffman, S. J.; Friedlander, A. L.; Nock, K. T.

    1986-01-01

    The results of a study performed to characterize the propellant mass requirements of two new types of orbit transfers between earth and Mars are discussed. These new orbit types, called VISIT and Up/Down Escalators, cycle continuously between the two planets, allowing a large crew facility, or CASTLE, to remain in this orbit with smaller crew transfer vehicles, or Taxis, used to shuttle between planetary orbits and this vehicle. Trajectory options and infrastructure elements are discussed along with the assumptions made in this study. The latter include the existence of a mature Martian base and the production and use of extraterrestrial propellants. Performance results from each of the three orbit transfer options presented here are discussed.

  9. Personality based clusters as predictors of aviator attitudes and performance

    NASA Technical Reports Server (NTRS)

    Gregorich, Steve; Helmreich, Robert L.; Wilhelm, John A.; Chidester, Thomas

    1989-01-01

    The feasibility of identifying personality-based population clusters was investigated, along with the relationships of these subpopulations to relevant attitude and performance measures. The results of instrumental and expressive personality tests, using the Personal Characteristics Inventory (PCI) test battery and the Cockpit Management Attitudes Questionnaire, suggest that theoretically meaningful subpopulations exist among aviators, and that these groupings are useful in understanding personality factors acting as moderator variables in the determination of aviator attitudes and performance. Of the three clusters, which are most easily described in terms of their relative elevations on the PCI subscales ('the right stuff', 'the wrong stuff', and 'the no stuff'), the members of the right-stuff cluster tended to have more desirable patterns of responses along relevant attitudinal dimensions.

  10. Vanadium based materials as electrode materials for high performance supercapacitors

    NASA Astrophysics Data System (ADS)

    Yan, Yan; Li, Bing; Guo, Wei; Pang, Huan; Xue, Huaiguo

    2016-10-01

    As a class of supercapacitors, pseudocapacitors have attracted wide attention in recent years. The capacitance of electrochemical capacitors based on pseudocapacitance arises mainly from redox reactions between electrolytes and active materials. These materials usually have several oxidation states available for oxidation and reduction. Many research teams have focused on the development of alternative materials for electrochemical capacitors. Many transition metal oxides have been shown to be suitable as electrode materials for electrochemical capacitors. Among them, vanadium based materials are being developed for this purpose. Vanadium based materials are known as some of the best active materials for high power/energy density electrochemical capacitors due to their outstanding specific capacitance, long cycle life, high conductivity, and good electrochemical reversibility. Different kinds of synthetic methods, such as the sol-gel, hydrothermal/solvothermal, template, electrospinning, atomic layer deposition, and electrodeposition methods, have been successfully applied to prepare vanadium based electrode materials. In our review, we give an overall summary and evaluation of the recent progress in research on vanadium based materials for electrochemical capacitors, including synthesis methods, the electrochemical performance of the electrode materials, and the devices.

  11. Performance based vs. compliance based auditing: The similarities and the differences

    SciTech Connect

    Malsbury, J.A.

    1996-09-26

    Princeton University's Plasma Physics Laboratory (PPPL) is a world leader in research associated with plasma science including the use of materials, the development of future fusion devices, and the application of plasma techniques in industry. At PPPL, one of Quality Assurance's responsibilities includes the internal audit/appraisal program. In early FY95 a task force, including representation from internal customers, was created to improve the program and to assure that the program better supports the mission of the Laboratory. One of the most significant changes recommended by the task force was to move from a compliance based auditing program to a performance based program. A trial of this change was successfully performed in fiscal year 1995. Because of the success of the trial, this change was adopted as standard practice. Today, a scheduled audit may be performance based, compliance based, or a combination of the two as determined jointly by the Quality Assurance Manager and the management of the program to be audited. This paper discusses the similarities and differences between these two types of audits. Both audits are performed to effect improvements in the program being audited. However, compliance based audits focus on compliance issues with the risk of missing performance or efficiency issues. Performance based audits identify system level problems and inefficiencies but may miss compliance issues.

  12. Electrolytic actuators: alternative, high-performance, material-based devices.

    PubMed

    Cameron, Colin G; Freund, Michael S

    2002-06-11

    The emerging field of materials-based actuation continues to be the focus of considerable research because of its inherent scalability and its promise to drive micromechanical devices that cannot be realized with conventional mechanical actuator strategies. The electrolytic phase transformation actuator offers a new broad-spectrum solution to the problem of direct conversion of electrical to mechanical energy. Strains of 136,000% and unoptimized work cycle efficiencies near 50% are demonstrated in a prototype device. Conceivably capable of generating stress beyond 200 MPa, this new approach promises performance orders of magnitude beyond other novel actuation strategies.

  13. Small Cation-Based High-Performance Energetic Nitraminofurazanates.

    PubMed

    Tang, Yongxing; He, Chunlin; Mitchell, Lauren A; Parrish, Damon A; Shreeve, Jean'ne M

    2016-08-01

    Large nitramino-substituted furazan anions were combined with small cations (hydroxylammonium, hydrazinium, and ammonium) to form a series of energetic salts that was fully characterized. The structures of several of the compounds (1 a, 2 a, 3 a, and 4 a) were further confirmed by single-crystal X-ray diffraction. Based on their physiochemical properties, such as density, thermal stability, and sensitivity, together with the calculated detonation properties, it was found that they exhibit good detonation performance and have potential application as high-energy-density materials. PMID:27356077

  14. Small Cation-Based High-Performance Energetic Nitraminofurazanates.

    PubMed

    Tang, Yongxing; He, Chunlin; Mitchell, Lauren A; Parrish, Damon A; Shreeve, Jean'ne M

    2016-08-01

    Large nitramino-substituted furazan anions were combined with small cations (hydroxylammonium, hydrazinium, and ammonium) to form a series of energetic salts that was fully characterized. The structures of several of the compounds (1 a, 2 a, 3 a, and 4 a) were further confirmed by single-crystal X-ray diffraction. Based on their physiochemical properties, such as density, thermal stability, and sensitivity, together with the calculated detonation properties, it was found that they exhibit good detonation performance and have potential application as high-energy-density materials.

  15. Validation of Ultrafilter Performance Model Based on Systematic Simulant Evaluation

    SciTech Connect

    Russell, Renee L.; Billing, Justin M.; Smith, Harry D.; Peterson, Reid A.

    2009-11-18

    Because of limited availability of test data with actual Hanford tank waste samples, a method was developed to estimate expected filtration performance based on physical characterization data for the Hanford Tank Waste Treatment and Immobilization Plant. A test with simulated waste was analyzed to demonstrate that filtration of this class of waste is consistent with a concentration polarization model. Subsequently, filtration data from actual waste samples were analyzed to demonstrate that centrifuged solids concentrations provide a reasonable estimate of the limiting concentration for filtration.
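The concentration polarization model referred to above is commonly written as J = k · ln(C_limit / C_bulk): permeate flux falls logarithmically as the bulk solids concentration approaches the limiting concentration (which the paper estimates from centrifuged-solids concentrations). The sketch below uses the standard form of the model; the parameter values are hypothetical, not taken from the report.

```python
import math

def permeate_flux(k, c_limit, c_bulk):
    """Concentration-polarization flux J = k * ln(C_limit / C_bulk)."""
    if not 0.0 < c_bulk < c_limit:
        raise ValueError("model valid for 0 < C_bulk < C_limit")
    return k * math.log(c_limit / c_bulk)

K = 0.02        # hypothetical mass-transfer coefficient, gpm/ft^2
C_LIMIT = 20.0  # hypothetical limiting solids concentration, wt%
for c in (2.0, 5.0, 10.0, 19.0):
    print(f"C_bulk = {c:5.1f} wt%  ->  J = {permeate_flux(K, C_LIMIT, c):.4f}")
```

The run shows the characteristic behavior used to validate the model: flux declines monotonically and approaches zero as C_bulk nears C_limit.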

  16. Trajectory-Based Performance Assessment for Aviation Weather Information

    NASA Technical Reports Server (NTRS)

    Vigeant-Langlois, Laurence; Hansman, R. John, Jr.

    2003-01-01

    Based on an analysis of aviation decision-makers' time-related weather information needs, an abstraction of the aviation weather decision task was developed, that involves 4-D intersection testing between aircraft trajectory hypertubes and hazardous weather hypervolumes. The framework builds on the hypothesis that hazardous meteorological fields can be simplified using discrete boundaries of surrogate threat attributes. The abstractions developed in the framework may be useful in studying how to improve the performance of weather forecasts from the trajectory-centric perspective, as well as for developing useful visualization techniques of weather information.

  17. Electrolytic actuators: Alternative, high-performance, material-based devices

    PubMed Central

    Cameron, Colin G.; Freund, Michael S.

    2002-01-01

    The emerging field of materials-based actuation continues to be the focus of considerable research because of its inherent scalability and its promise to drive micromechanical devices that cannot be realized with conventional mechanical actuator strategies. The electrolytic phase transformation actuator offers a new broad-spectrum solution to the problem of direct conversion of electrical to mechanical energy. Strains of 136,000% and unoptimized work cycle efficiencies near 50% are demonstrated in a prototype device. Conceivably capable of generating stress beyond 200 MPa, this new approach promises performance orders of magnitude beyond other novel actuation strategies. PMID:12060728

  18. Component-based software for high-performance scientific computing

    NASA Astrophysics Data System (ADS)

    Alexeev, Yuri; Allan, Benjamin A.; Armstrong, Robert C.; Bernholdt, David E.; Dahlgren, Tamara L.; Gannon, Dennis; Janssen, Curtis L.; Kenny, Joseph P.; Krishnan, Manojkumar; Kohl, James A.; Kumfert, Gary; Curfman McInnes, Lois; Nieplocha, Jarek; Parker, Steven G.; Rasmussen, Craig; Windus, Theresa L.

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  19. 48 CFR 225.7401 - Contracts requiring performance or delivery in a foreign country.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... operational area, follow the procedures at PGI 225.7401(a). (b) For work performed in Germany, eligibility for logistics support or base privileges of contractor employees is governed by U.S.-German bilateral...

  20. Power and Sample Size Calculations for Logistic Regression Tests for Differential Item Functioning

    ERIC Educational Resources Information Center

    Li, Zhushan

    2014-01-01

    Logistic regression is a popular method for detecting uniform and nonuniform differential item functioning (DIF) effects. Theoretical formulas for the power and sample size calculations are derived for likelihood ratio tests and Wald tests based on the asymptotic distribution of the maximum likelihood estimators for the logistic regression model.…

  1. The performance of biomass-based AMBI in lagoonal ecosystems.

    PubMed

    Mistri, Michele; Munari, Cristina

    2015-10-15

    We studied the performance of the AZTI Marine Biotic Index (AMBI) by manipulating input data collected from lagoonal ecosystems. Our data set consisted of macrofaunal abundance and biomass counts gathered at a variety of sites at which the disturbance status was known. Input data were also manipulated using a set of transformations of increasing severity. Biotic indices were calculated using raw and transformed abundance, biomass, and production. Among the three categories of AMBI-based indices, medium transformation of the data gave the highest correlation with pressures. However, increasing the severity of transformation generally resulted in a decrease of the correlation with environmental factors. The relative importance of ecological groups changed when using abundance or biomass, sometimes leading to an improved ecological status classification. Since biomass and production are more ecologically relevant than abundance, using them to derive new AMBI-based indices seems intriguing, at least in lagoonal waters, where the community is naturally disturbed and dominated by opportunists. PMID:26219686
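The AMBI formula weights the proportion of fauna in each of five ecological groups, from sensitive taxa (GI) to first-order opportunists (GV): AMBI = (0·%GI + 1.5·%GII + 3·%GIII + 4.5·%GIV + 6·%GV) / 100. A biomass-based variant, as explored above, plugs in biomass (or production) proportions instead of abundance proportions. The figures below are invented to show how the classification can shift between the two inputs.

```python
WEIGHTS = (0.0, 1.5, 3.0, 4.5, 6.0)  # ecological groups GI..GV

def ambi(percentages):
    """AMBI from GI..GV shares summing to 100 (abundance or biomass based)."""
    assert abs(sum(percentages) - 100.0) < 1e-6
    return sum(w * p for w, p in zip(WEIGHTS, percentages)) / 100.0

abundance_pct = (10.0, 10.0, 20.0, 20.0, 40.0)  # counts dominated by opportunists
biomass_pct = (35.0, 25.0, 20.0, 10.0, 10.0)    # biomass held by sensitive taxa
print(f"abundance-based AMBI: {ambi(abundance_pct):.2f}")
print(f"biomass-based AMBI:   {ambi(biomass_pct):.2f}")
```

Here the same hypothetical community scores 4.05 on abundance but about 2.03 on biomass, i.e., a different status class, which is exactly the sensitivity the abstract describes for opportunist-dominated lagoonal assemblages.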

  2. Insight into mechanisms of reduced orthostatic performance after exposure to microgravity: comparison of ground-based and space flight data

    NASA Technical Reports Server (NTRS)

    Convertino, V. A.

    1998-01-01

    Since the beginning of human spaceflight, the value of understanding mechanisms of physiological adaptation to microgravity became apparent to life scientists interested in maintaining crew health and developing countermeasures against adverse effects of the mission. However, several characteristics associated with the logistics of spaceflight presented significant limitations to the scientific study of human adaptation to microgravity. Because space missions are so infrequent and involve minimal numbers of crewmembers, meaningful statistical analysis of the data is limited. Reproducibility of results from spaceflight experiments is difficult to assess since there are few repeated space missions involving the same crewmembers. Since the emphasis of space missions is placed on operations, experiments are compromised without adequate control over various factors (e.g., time, diet, physical activities, etc.) that can impact measured responses. With the minimal opportunity to collect spaceflight data, there is a high risk that experiments will interfere with one another as demands on the crewmembers to participate in numerous experiments proposed by multiple investigators increase. The technology and ability to measure physiological functions necessary to test specific hypotheses can be severely limited by the physical space and power constraints of the space environment. Finally, technical and logistical aspects of space missions such as launch delays, extended missions, and inflight operational emergencies can significantly compromise the timing and control of experiments. These limitations have stimulated scientists to develop ground-based analogs of microgravity in an effort to investigate the effects of spaceflight on physiological function in a controlled experimental setting. The purpose of this paper is to provide a selected comparison of data collected from ground-based experiments with those obtained from spaceflight in an effort to

  3. Ranking streamflow model performance based on Information theory metrics

    NASA Astrophysics Data System (ADS)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to determine whether information theory-based metrics can be used as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: watersheds served as information filters, and streamflow time series were less random and more complex than those of precipitation. This reflects the fact that the watershed acts as an information filter in the hydrologic conversion process from precipitation to streamflow. The Nash-Sutcliffe efficiency metric increased as the complexity of the models increased, but in many cases several models had efficiency values that were not statistically different from each other. In such cases, ranking models by the closeness of the information theory-based parameters of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
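The symbolization step and the first of the three metrics can be sketched as below. The abstract does not give the alphabet size or block length used in the study, so the choices here (quantile bins, one-symbol context) are assumptions for illustration only.

```python
import numpy as np

def symbolize(x, n_symbols=4):
    """Map each value to the index of its quantile bin (the symbol alphabet)."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def mean_information_gain(symbols, block=1):
    """H(next symbol | preceding block): near 0 for a predictable series,
    near log2(alphabet size) for a random one."""
    symbols = np.asarray(symbols)
    contexts = {}
    for i in range(len(symbols) - block):
        ctx = tuple(symbols[i:i + block])
        nxt = symbols[i + block]
        contexts.setdefault(ctx, {}).setdefault(nxt, 0)
        contexts[ctx][nxt] += 1
    n = len(symbols) - block
    h = 0.0
    for ctx, nxts in contexts.items():
        ctx_total = sum(nxts.values())
        for count in nxts.values():
            # joint probability of (context, next) times conditional surprisal
            h -= (count / n) * np.log2(count / ctx_total)
    return h
```

A strictly periodic series scores zero (fully predictable next symbol), which is the sense in which streamflow, being less random than precipitation, yields a smaller mean information gain.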

  4. Exploration Mission Benefits From Logistics Reduction Technologies

    NASA Technical Reports Server (NTRS)

    Broyan, James Lee, Jr.; Schlesinger, Thilini; Ewert, Michael K.

    2016-01-01

    Technologies that reduce logistical mass, volume, and the crew time dedicated to logistics management become more important as exploration missions extend further from the Earth. Even modest reductions in logistical mass can have a significant impact because they also reduce the packing burden. NASA's Advanced Exploration Systems' Logistics Reduction Project is developing technologies that can directly reduce the mass and volume of crew clothing and metabolic waste collection. Cargo bags have also been developed that can be reconfigured for crew outfitting, and trash processing technologies to increase habitable volume and improve protection against solar storm events are under development. Additionally, Mars-class missions are sufficiently distant that even logistics management without resupply can be problematic due to the communication time delay with Earth. Although exploration vehicles are launched with all consumables and logistics in a defined configuration, the configuration continually changes as the mission progresses. Traditionally, significant ground and crew time has been required to understand the evolving configuration and locate misplaced items. For key mission events and unplanned contingencies, the crew will not be able to rely on the ground for logistics localization assistance. NASA has been developing a radio frequency identification autonomous logistics management system to reduce crew time for general inventory and enable greater crew self-response to unplanned events when a wide range of items may need to be located in a very short time period. This paper provides a status of the technologies being developed and their mission benefits for exploration missions.

  5. ISS Logistics Hardware Disposition and Metrics Validation

    NASA Technical Reports Server (NTRS)

    Rogers, Toneka R.

    2010-01-01

    I was assigned to the Logistics Division of the International Space Station (ISS)/Spacecraft Processing Directorate. The Division consists of eight NASA engineers and specialists who oversee the logistics portion of the Checkout, Assembly, and Payload Processing Services (CAPPS) contract. Boeing, its sub-contractors, and the Boeing prime contract out of Johnson Space Center provide the Integrated Logistics Support for ISS activities at Kennedy Space Center. Essentially, they ensure that spares are available to support flight hardware processing and the associated ground support equipment (GSE). Boeing maintains a depot for electrical, mechanical and structural modifications and/or repair capability as required. My assigned task was to learn the project management techniques utilized by NASA and its contractors to provide an efficient and effective logistics support infrastructure to the ISS program. Within the Space Station Processing Facility (SSPF) I was exposed to logistics support components, such as the NASA Spacecraft Services Depot (NSSD) capabilities, mission processing tools, techniques, and warehouse support issues, required for integrating Space Station elements at the Kennedy Space Center. I also supported the identification of near-term ISS hardware and ground support equipment (GSE) candidates for excessing/disposition prior to October 2010, and the validation of several logistics metrics used by the contractor to measure logistics support effectiveness.

  6. Logistics Modeling for Lunar Exploration Systems

    NASA Technical Reports Server (NTRS)

    Andraschko, Mark R.; Merrill, R. Gabe; Earle, Kevin D.

    2008-01-01

    The extensive logistics required to support extended crewed operations in space make effective modeling of logistics requirements and deployment critical to predicting the behavior of human lunar exploration systems. This paper discusses the software that has been developed as part of the Campaign Manifest Analysis Tool in support of strategic analysis activities under the Constellation Architecture Team - Lunar. The described logistics module enables definition of logistics requirements across multiple surface locations and allows for the transfer of logistics between those locations. A key feature of the module is the loading algorithm that is used to efficiently load logistics by type into carriers and then onto landers. Attention is given to the capabilities and limitations of this loading algorithm, particularly with regard to surface transfers. These capabilities are described within the context of the object-oriented software implementation, with details provided on the applicability of using this approach to model other human exploration scenarios. Some challenges of incorporating probabilistics into this type of logistics analysis model are discussed at a high level.
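A minimal sketch of the type of loading algorithm described, here reduced to first-fit-decreasing packing of item masses into capacity-limited carriers. The actual module also tracks logistics type, volume, and the subsequent carrier-to-lander assignment, none of which is modeled here; all names and values are illustrative.

```python
def load_carriers(item_masses, capacity):
    """First-fit-decreasing sketch of loading logistics items into carriers.

    Sort items heaviest first, then place each into the first carrier
    with enough remaining capacity, opening a new carrier when none fits.
    """
    carriers = []  # each carrier: [remaining_capacity, [loaded masses]]
    for mass in sorted(item_masses, reverse=True):
        for carrier in carriers:
            if carrier[0] >= mass:
                carrier[0] -= mass
                carrier[1].append(mass)
                break
        else:
            carriers.append([capacity - mass, [mass]])
    return [loaded for _, loaded in carriers]
```

First-fit-decreasing is a standard heuristic that keeps the carrier count close to optimal without an expensive exact search, which matters when the packing is re-run across many campaign scenarios.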

  7. Tobacco Stem-Based Activated Carbons for High Performance Supercapacitors

    NASA Astrophysics Data System (ADS)

    Xia, Xiaohong; Liu, Hongbo; Shi, Lei; He, Yuede

    2012-09-01

    Tobacco stem-based activated carbons (TS-ACs) were prepared by simple KOH activation and successfully applied as electrodes in electrical double-layer capacitors (EDLCs). The BET surface area, pore volume, and pore size distribution of the TS-ACs were evaluated based on N2 adsorption isotherms at 77 K. The surface area of the obtained activated carbons varied over a wide range (1472.8-3326.7 m2/g), and the mesoporosity was enhanced significantly as the ratio of KOH to tobacco stem (TS) increased. The electrochemical behaviors of the series of TS-ACs were characterized by means of galvanostatic charging/discharging, cyclic voltammetry, and impedance spectroscopy, and the correlation between electrochemical properties and pore structure was investigated. A specific capacitance as high as 190 F/g at 1 mA/cm2 was obtained in 1 M LiPF6-EC/DMC/DEC electrolyte solution. Furthermore, good performance was achieved even at high current densities. The development of a new use for TS as a valuable energy storage material is thus explored.

  8. Integrating reconfigurable hardware-based grid for high performance computing.

    PubMed

    Dondo Gazzano, Julio; Sanchez Molina, Francisco; Rincon, Fernando; López, Juan Carlos

    2015-01-01

    FPGAs have shown several characteristics that make them very attractive for high performance computing (HPC). The impressive speed-up factors that they are able to achieve, the reduced power consumption, and the ease and flexibility of the design process, with fast iterations between consecutive versions, are examples of the benefits obtained with their use. However, there are still some difficulties that need to be addressed when using reconfigurable platforms as accelerators: the need for an in-depth application study to identify potential acceleration, the lack of tools for the deployment of computational problems on distributed hardware platforms, and the low portability of components, among others. This work proposes a complete grid infrastructure for distributed high performance computing based on dynamically reconfigurable FPGAs. In addition, a set of services designed to facilitate application deployment is described. An example application and a comparison with other hardware and software implementations are shown. Experimental results show that the proposed architecture offers encouraging advantages for the deployment of high performance distributed applications while simplifying the development process.

  11. Performance Characteristics of MEMS-Based IMUs for UAV Navigation

    NASA Astrophysics Data System (ADS)

    Mohamed, H. A.; Hansen, J. M.; Elhabiby, M. M.; El-Sheimy, N.; Sesay, A. B.

    2015-08-01

    Accurate 3D reconstruction has become essential for non-traditional mapping applications such as urban planning, the mining industry, environmental monitoring, navigation, surveillance, pipeline inspection, infrastructure monitoring, landslide hazard analysis, indoor localization, and military simulation. The needs of these applications cannot be satisfied by traditional mapping, which is based on dedicated data acquisition systems designed for mapping purposes. Recent advances in hardware and software development have made it possible to conduct accurate 3D mapping without using costly, high-end data acquisition systems. Low-cost digital cameras, laser scanners, and navigation systems can provide accurate mapping if they are properly integrated at the hardware and software levels. Unmanned Aerial Vehicles (UAVs) are emerging as a mobile mapping platform that can provide additional economical and practical advantages. However, such economical and practical requirements need navigation systems that can provide an uninterrupted navigation solution. Hence, testing the performance characteristics of Micro-Electro-Mechanical Systems (MEMS) or other low-cost navigation sensors for various UAV applications is an important research topic. This work focuses on studying the performance characteristics under different manoeuvres using inertial measurements integrated with single point positioning, Real-Time Kinematic (RTK) positioning, and additional navigational aiding sensors. Furthermore, the performance of the inertial sensors is tested during Global Positioning System (GPS) signal outages.

  12. Neurophysiological predictor of SMR-based BCI performance.

    PubMed

    Blankertz, Benjamin; Sannelli, Claudia; Halder, Sebastian; Hammer, Eva M; Kübler, Andrea; Müller, Klaus-Robert; Curio, Gabriel; Dickhaus, Thorsten

    2010-07-15

    Brain-computer interfaces (BCIs) allow a user to control a computer application by brain activity as measured, e.g., by electroencephalography (EEG). After about 30 years of BCI research, the success of control that is achieved by means of a BCI system still varies greatly between subjects. For about 20% of potential users the obtained accuracy does not reach the level criterion, meaning that BCI control is not accurate enough to control an application. The determination of factors that may serve to predict BCI performance, and the development of methods to quantify a predictor value from psychological and/or physiological data, serve two purposes: a better understanding of the 'BCI illiteracy phenomenon', and avoidance of a costly and eventually frustrating training procedure for participants who might not obtain BCI control. Furthermore, such predictors may lead to approaches to antagonize BCI illiteracy. Here, we propose a neurophysiological predictor of BCI performance which can be determined from a two-minute recording of a 'relax with eyes open' condition using two Laplacian EEG channels. A correlation of r=0.53 between the proposed predictor and BCI feedback performance was obtained on a large database with N=80 BCI-naive participants in their first session with the Berlin brain-computer interface (BBCI) system, which operates on modulations of sensorimotor rhythms (SMRs). PMID:20303409
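A crude stand-in for such a predictor is the SMR-band power of the stronger of the two Laplacian channels during the relax recording. The published predictor fits a spectral model to separate the rhythm peak from the 1/f noise floor, which this sketch omits; the channel names and the 8-15 Hz band edges are assumptions.

```python
import numpy as np

def band_power(x, fs, lo=8.0, hi=15.0):
    """Power in the sensorimotor-rhythm band from a raw periodogram."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].sum()

def smr_predictor(lap_c3, lap_c4, fs):
    """Take the stronger of the two Laplacian channels, in dB."""
    return 10.0 * np.log10(max(band_power(lap_c3, fs),
                               band_power(lap_c4, fs)))
```

In practice one would use a windowed PSD estimate (e.g. Welch's method) and subtract the fitted noise floor, since absolute band power confounds rhythm amplitude with broadband EEG power.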

  13. Case base classification on digital mammograms: improving the performance of case base classifier

    NASA Astrophysics Data System (ADS)

    Raman, Valliappan; Then, H. H.; Sumari, Putra; Venkatesa Mohan, N.

    2011-10-01

    Breast cancer continues to be a significant public health problem in the world, and early detection is the key to improving breast cancer prognosis. The aim of the research presented here is twofold. The first stage involves machine learning techniques that segment and extract features from masses in digital mammograms. The second stage takes a problem-solving approach, classifying the masses with a performance-based case-base classifier. In this paper we build a case-based classifier in order to diagnose mammographic images, and we explain the different methods and behaviors that have been added to the classifier to improve its performance. An initial performance-based classifier with bagging is proposed and implemented, and it shows an improvement in specificity and sensitivity.
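Bagging a case-base classifier can be sketched as voting over nearest-neighbor retrievals from bootstrap resamples of the case base. The mammogram-specific segmentation and feature extraction are omitted, and all names and parameters below are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(7)

def retrieve(case_X, case_y, queries):
    """Plain case-base classification: label of the nearest stored case."""
    d = np.linalg.norm(queries[:, None, :] - case_X[None, :, :], axis=2)
    return case_y[np.argmin(d, axis=1)]

def bagged_retrieve(case_X, case_y, queries, n_bags=15):
    """Bagging: majority vote over retrievals from bootstrap resamples."""
    votes = []
    for _ in range(n_bags):
        idx = rng.integers(0, len(case_X), size=len(case_X))  # bootstrap
        votes.append(retrieve(case_X[idx], case_y[idx], queries))
    # majority vote over the ensemble (binary labels assumed)
    return (np.stack(votes).mean(axis=0) > 0.5).astype(int)
```

The vote smooths out the sensitivity of single-nearest-case retrieval to individual noisy cases, which is the usual route by which bagging improves specificity and sensitivity.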

  14. Logistics intra-theater support tool for course of action logistical feasibility analysis

    SciTech Connect

    Macal, C.M.; Van Groningen, C.N.; Widing, M.a.; Duffy, M.K.

    1989-01-01

    The Logistics Intra-theater Support Tool (LIST) is a prototype system that evaluates a course of action for its logistical feasibility. LIST addresses the question of whether a theater's infrastructure and strategic lift allocation are adequate to support the movement of specified ground forces and supplies to a given destination by a specified day. The logistical requirements of a course of action are based on the force units, latest arrival dates, and other scenario data. Capabilities are estimated by simulating the movement of personnel, equipment, and supplies through the theater's seaports, airports, and road/rail network to the destination. The system reports conclusions to the user, identifies real and potential bottlenecks, and recommends modifications to the course of action that are likely to improve its feasibility. LIST includes an expert system written in Prolog that selects routes and seaports and moves personnel, equipment, and supplies through a road/rail network. A user interacts with LIST through menus and a series of map displays. The maps show relevant features and allow the user to access an object-oriented database. 7 refs., 6 figs.
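The route-selection step can be illustrated with a least-time shortest path over the theater network. LIST's Prolog expert system also weighs port throughput and other constraints; this sketch uses plain Dijkstra's algorithm, and the node names and edge times are invented.

```python
import heapq

def select_route(graph, origin, destination):
    """Dijkstra sketch of route selection through a road/rail network.

    graph maps node -> {neighbor: transit_hours}.  Returns the least
    total transit time and the corresponding path of nodes.
    """
    queue = [(0.0, origin, [origin])]
    seen = set()
    while queue:
        hours, node, path = heapq.heappop(queue)
        if node == destination:
            return hours, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, leg in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (hours + leg, nxt, path + [nxt]))
    return float("inf"), []  # destination unreachable
```

Running the search per force unit against the remaining network capacity is one simple way such a tool can surface bottlenecks: a leg that appears on most least-time routes is a candidate choke point.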

  15. An Approach for Performance Based Glove Mobility Requirements

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay; Benson, Elizabeth; England, Scott

    2015-01-01

    occupational therapy arenas to develop a protocol that assesses gloved range of motion, strength, dexterity, tactility, and fit in comparative quantitative terms and also provides qualitative insight to direct hardware design iterations. The protocol was evaluated using five experienced test subjects wearing the EMU pressurized to 4.3 psid with three different glove configurations. The results of the testing are presented to illustrate where the protocol is and is not valid for benchmark comparisons. The process for requirements development based upon the results is also presented along with suggested performance values for the High Performance EVA Gloves to be procured in fiscal year 2015.

  16. An Approach for Performance Based Glove Mobility Requirements

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay; Benson, Elizabeth; England, Scott

    2016-01-01

    occupational therapy arenas to develop a protocol that assesses gloved range of motion, strength, dexterity, tactility, and fit in comparative quantitative terms and also provides qualitative insight to direct hardware design iterations. The protocol was evaluated using five experienced test subjects wearing the EMU pressurized to 4.3 psid with three different glove configurations. The results of the testing are presented to illustrate where the protocol is and is not valid for benchmark comparisons. The process for requirements development based upon the results is also presented along with suggested performance values for the High Performance EVA Gloves currently in development.

  17. Removing very low-performing therapists: A simulation of performance-based retention in psychotherapy

    PubMed Central

    Imel, Zac E.; Sheng, Elisa; Baldwin, Scott A.; Atkins, David C.

    2016-01-01

    Therapists can impact the likelihood that a given patient will benefit from psychotherapy. However, therapists are rarely held accountable for their patients' outcomes. As a result, low-performing providers likely continue to practice alongside providers with high response rates. In the current study, we conducted a Monte Carlo simulation to illustrate a thought experiment: what happens to patient outcomes if the therapists with the worst outcomes are removed from practice? We drew initial samples of 50 therapists from three simulated populations of 1,000 therapists with a mean patient response rate of 50% and different effect sizes for therapist variability in outcomes. We simulated 30 patient outcomes for each therapist, with outcome defined as response to treatment versus no response. We removed therapists with response rates in the bottom 5% and replaced them with a random sample of therapists from the population. Over 10 years, the difference in responses between the lowest and highest performing therapists was substantial (between 697 and 997 additional responses to treatment). After repeatedly removing the lowest performing providers 40 times (simulating a 10-year time span), response rates increased substantially. The cumulative number of patient responses (i.e., summing the total number of responses across 10 years) increased by 4,266, 6,404, and 9,307 when therapists accounted for 5%, 10%, or 20% of the patient outcome variance, respectively. These findings indicate that performance-based retention of therapists could improve the quality of psychotherapy in health systems by improving the average response rate and decreasing the probability that a patient will be treated by a therapist who has little chance of helping. PMID:26301424
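The thought experiment can be reproduced in miniature as below. The population distribution, the amount of therapist variability, and the replacement rule are simplified assumptions for illustration, not the paper's calibrated parameters.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_retention(n_staff=50, n_patients=30, therapist_sd=0.10,
                       rounds=40, cut_fraction=0.05):
    """Sketch of performance-based retention: each round, observe 30
    patient outcomes per therapist, drop the bottom 5% by observed
    response count, and refill from the therapist population."""
    population = rng.normal(0.5, therapist_sd, 1000).clip(0.05, 0.95)
    staff = rng.choice(population, n_staff)
    start_rate = staff.mean()
    k = max(1, int(cut_fraction * n_staff))
    for _ in range(rounds):
        observed = rng.binomial(n_patients, staff)  # responses out of 30
        worst = np.argsort(observed)[:k]            # bottom 5% by outcomes
        staff[worst] = rng.choice(population, k)    # replace from population
    return start_rate, staff.mean()
```

Because observed response counts are only a noisy proxy for a therapist's true response rate, some removals hit unlucky average therapists; the mean true rate nonetheless drifts upward over repeated rounds, which is the paper's central point.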

  19. Frying performance of palm-based solid frying shortening.

    PubMed

    Omar, M N; Nor-Nazuha, M N; Nor-Dalilah, M N; Sahri, M M

    2010-03-15

    In order to evaluate the frying performance of a palm-based solid frying shortening against standard palm olein, fresh potato chips were fried in both frying media using an open fryer. After frying the chips for 40 h in an open batch fryer, it was found that the frying quality of the palm-based solid frying shortening was better than that of standard palm olein in terms of free fatty acid (FFA) values, total polar content (TPC) and total polymeric material (TPM). Solid shortening gave FFA, TPC and TPM values of 0.7, 15.3 and 2.67%, respectively, whilst standard palm olein gave values of 1.2, 19.6 and 3.10%, respectively. In terms of sensory mean scores, sensory panelists preferred the color of potato chips fried in solid shortening on the first day of frying, while on the third and fifth days of frying there were no significant differences (p > 0.05) in the sensory scores of products fried in either medium. However, on the fifth day of frying, panelists gave higher scores in terms of taste, flavor and crispness for potato chips fried in solid shortening. These findings show that the palm-based solid shortening is better than palm olein when used for deep-fat frying in terms of FFA values, total polar content and total polymeric material, especially for starch-based products such as potato chips. The results also show that, in terms of sensory mean scores, after frying for 40 h the sensory panelists gave higher scores in terms of taste, flavor and crispness for potato chips fried in palm-based solid shortening.

  20. In-space propellant logistics. Volume 4: Project planning data

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The pre-phase A conceptual project planning data are presented as they pertain to the development of the selected logistics module configuration transported into Earth orbit by the space shuttle orbiter. The data represent the test, implementation, and supporting research and technology requirements for attaining a propellant transfer operational capability in early 1985. The plan is based on a propellant module designed to support the space-based tug with cryogenic oxygen-hydrogen propellants. A logical sequence of activities required to define, design, develop, fabricate, test, launch, and flight test the propellant logistics module is described, including the facility and ground support equipment requirements. The schedule of activities is based on the evolution of, and relationship between, the research and technology (R&T) effort, the development issues, and the resultant test program.

  1. Disentangling the effect of event-based cues on children's time-based prospective memory performance.

    PubMed

    Redshaw, Jonathan; Henry, Julie D; Suddendorf, Thomas

    2016-10-01

    Previous time-based prospective memory research, both with children and with other groups, has measured the ability to perform an action with the arrival of a time-dependent yet still event-based cue (e.g., the occurrence of a specific clock pattern) while also engaged in an ongoing activity. Here we introduce a novel means of operationalizing time-based prospective memory and assess children's growing capacities when the availability of an event-based cue is varied. Preschoolers aged 3, 4, and 5 years (N=72) were required to ring a bell when a familiar 1-min sand timer had completed a cycle under four conditions. In a 2×2 within-participants design, the timer was either visible or hidden and was either presented in the context of a single task or embedded within a dual picture-naming task. Children were more likely to ring the bell before 2 min had elapsed in the visible-timer and single-task conditions, with performance improving with age across all conditions. These results suggest a divergence in the development of time-based prospective memory in the presence versus absence of event-based cues, and they also suggest that performance on typical time-based tasks may be partly driven by event-based prospective memory. PMID:27295204

  3. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    NASA Technical Reports Server (NTRS)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  4. Classification of microarray data with penalized logistic regression

    NASA Astrophysics Data System (ADS)

    Eilers, Paul H. C.; Boer, Judith M.; van Ommen, Gert-Jan; van Houwelingen, Hans C.

    2001-06-01

    Classification of microarray data needs a firm statistical basis. In principle, logistic regression can provide it, modeling the probability of membership of a class with (transforms of) linear combinations of explanatory variables. However, classical logistic regression does not work for microarrays, because generally there will be far more variables than observations. One problem is multicollinearity: estimating equations become singular and have no unique and stable solution. A second problem is over-fitting: a model may fit a data set well but perform badly when used to classify new data. We propose penalized likelihood as a solution to both problems. The values of the regression coefficients are constrained in a similar way as in ridge regression. All variables play an equal role; there is no ad hoc selection of most relevant or most expressed genes. The dimension of the resulting systems of equations is equal to the number of variables, and will generally be too large for most computers, but it can be dramatically reduced with the singular value decomposition of some matrices. The penalty is optimized with AIC (Akaike's Information Criterion), which essentially is a measure of prediction performance. We find that penalized logistic regression performs well on a public data set (the MIT ALL/AML data).
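    As a rough illustration of the penalized approach described above (an illustrative sketch, not the authors' SVD-reduced implementation; the function names and the simple gradient-descent optimizer are our own), the following Python fragment fits an L2-penalized logistic model, which remains well posed even when variables outnumber observations:

```python
import numpy as np

def fit_penalized_logistic(X, y, lam=1.0, lr=0.1, n_iter=2000):
    """Ridge-penalized logistic regression via gradient descent.

    Minimizes the average negative log-likelihood plus lam/2 * ||w||^2,
    which keeps the solution unique and stable even when X has far more
    columns than rows (the microarray p >> n setting)."""
    n, p = X.shape
    w = np.zeros(p)
    b = 0.0
    for _ in range(n_iter):
        prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # class probabilities
        grad_w = X.T @ (prob - y) / n + lam * w    # penalized gradient
        grad_b = np.mean(prob - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def predict_proba(X, w, b):
    """Probability of class membership for new samples."""
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))
```

    In practice the penalty weight `lam` would be tuned, e.g. by AIC as in the paper.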

  5. A Bayesian goodness of fit test and semiparametric generalization of logistic regression with measurement data.

    PubMed

    Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E

    2013-06-01

    Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework.

  6. Implementation of Benchmarking Transportation Logistics Practices and Future Benchmarking Organizations

    SciTech Connect

    Thrower, A.W.; Patric, J.; Keister, M.

    2008-07-01

    The purpose of the Office of Civilian Radioactive Waste Management's (OCRWM) Logistics Benchmarking Project is to identify established government and industry practices for the safe transportation of hazardous materials which can serve as a yardstick for design and operation of OCRWM's national transportation system for shipping spent nuclear fuel and high-level radioactive waste to the proposed repository at Yucca Mountain, Nevada. The project will present logistics and transportation practices and develop implementation recommendations for adaptation by the national transportation system. This paper will describe the process used to perform the initial benchmarking study, highlight interim findings, and explain how these findings are being implemented. It will also provide an overview of the next phase of benchmarking studies. The benchmarking effort will remain a high-priority activity throughout the planning and operational phases of the transportation system. The initial phase of the project focused on government transportation programs to identify those practices which are most clearly applicable to OCRWM. These Federal programs have decades of safe transportation experience, strive for excellence in operations, and implement effective stakeholder involvement, all of which parallel OCRWM's transportation mission and vision. The initial benchmarking project focused on four business processes that are critical to OCRWM's mission success, and can be incorporated into OCRWM planning and preparation in the near term. The processes examined were: transportation business model, contract management/out-sourcing, stakeholder relations, and contingency planning. More recently, OCRWM examined logistics operations of AREVA NC's Business Unit Logistics in France. The next phase of benchmarking will focus on integrated domestic and international commercial radioactive logistics operations. The prospective companies represent large-scale shippers and have vast experience in

  7. Performance of partial statistics in individual-based landscape genetics.

    PubMed

    Kierepka, E M; Latch, E K

    2015-05-01

    Individual-based landscape genetic methods have become increasingly popular for quantifying fine-scale landscape influences on gene flow. One complication for individual-based methods is that gene flow and landscape variables are often correlated with geography. Partial statistics, particularly Mantel tests, are often employed to control for these inherent correlations by removing the effects of geography while simultaneously correlating measures of genetic differentiation and landscape variables of interest. Concerns about the reliability of Mantel tests prompted this study, in which we use simulated landscapes to evaluate the performance of partial Mantel tests and two ordination methods, distance-based redundancy analysis (dbRDA) and redundancy analysis (RDA), for detecting isolation by distance (IBD) and isolation by landscape resistance (IBR). Specifically, we described the effects of suitable habitat amount, fragmentation and resistance strength on metrics of accuracy (frequency of correct results, type I/II errors and strength of IBR according to underlying landscape and resistance strength) for each test using realistic individual-based gene flow simulations. Mantel tests were very effective for detecting IBD, but exhibited higher error rates when detecting IBR. Ordination methods were overall more accurate in detecting IBR, but had high type I errors compared to partial Mantel tests. Thus, no one test outperformed another completely. A combination of statistical tests, for example partial Mantel tests to detect IBD paired with appropriate ordination techniques for IBR detection, provides the best characterization of fine-scale landscape genetic structure. Realistic simulations of empirical data sets will further increase power to distinguish among putative mechanisms of differentiation.
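    A residual-based partial Mantel test of the kind evaluated above can be sketched as follows. This is a didactic Python version (the function names are our own, and a vetted statistical package should be preferred for real analyses): it correlates the residuals of genetic and environmental distances after regressing each on geographic distance, and assesses significance by permuting the rows and columns of the genetic matrix.

```python
import numpy as np

def _offdiag(M):
    # Vectorize the upper triangle of a symmetric distance matrix.
    iu = np.triu_indices_from(M, k=1)
    return M[iu]

def partial_mantel(G, E, D, n_perm=999, rng=None):
    """Partial Mantel test: correlation between genetic (G) and
    environmental (E) distance matrices, controlling for geographic
    distance (D). Returns the observed statistic and a one-sided
    permutation p-value."""
    rng = np.random.default_rng(rng)
    g, e, d = _offdiag(G), _offdiag(E), _offdiag(D)

    def resid(y, x):
        # Residuals of a simple linear regression of y on x.
        beta = np.polyfit(x, y, 1)
        return y - np.polyval(beta, x)

    r_obs = np.corrcoef(resid(g, d), resid(e, d))[0, 1]
    n = G.shape[0]
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(n)
        gp = _offdiag(G[np.ix_(perm, perm)])  # permute rows/columns jointly
        if np.corrcoef(resid(gp, d), resid(e, d))[0, 1] >= r_obs:
            count += 1
    p_value = (count + 1) / (n_perm + 1)
    return r_obs, p_value
```

    The ordination alternatives mentioned above (dbRDA, RDA) replace the pairwise correlation step with a constrained ordination of the genetic distances.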

  8. Design and performance of nitride-based ultraviolet (UV) LEDs

    SciTech Connect

    CRAWFORD,MARY H.; HAN,JUNG

    2000-04-24

    The authors overview several of the challenges in achieving high efficiency nitride-based UV (< 400 nm) LEDs. The issue of optical efficiency is presented through temperature-dependent photoluminescence studies of various UV active regions. These studies demonstrate enhanced optical efficiencies for active regions with In-containing alloys (InGaN, AlInGaN). The authors compare the performance of two distinct UV LED structures. GaN/AlGaN quantum well LEDs with λ < 360 nm emission have demonstrated output powers > 0.1 mW, but present designs suffer from internal absorption effects. InGaN/AlInGaN quantum well LEDs with 370 nm < λ < 390 nm emission and > 1 mW output power are also presented.

  9. Mining Behavior Based Safety Data to Predict Safety Performance

    SciTech Connect

    Jeffrey C. Joe

    2010-06-01

    The Idaho National Laboratory (INL) operates a behavior based safety program called Safety Observations Achieve Results (SOAR). This peer-to-peer observation program encourages employees to perform in-field observations of each other's work practices and habits (i.e., behaviors). The underlying premise of conducting these observations is that more serious accidents are prevented from occurring because lower level “at risk” behaviors are identified and corrected before they can propagate into culturally accepted “unsafe” behaviors that result in injuries or fatalities. Although the approach increases employee involvement in safety, the premise of the program has not been subject to sufficient empirical evaluation. The INL now has a significant amount of SOAR data on these lower level “at risk” behaviors. This paper describes the use of data mining techniques to analyze these data to determine whether they can predict if and when a more serious accident will occur.

  10. Automated tools for the generation of performance-based training

    SciTech Connect

    Trainor, M.S.; Fries, J.

    1990-01-01

    The field of educational technology is not a new one, but the emphasis in the past has been on the use of technologies for the delivery of instruction and tests. This paper explores the application of technology to the development of performance-based instruction and to the analyses leading up to the development of the instruction. Several technologies are discussed, with specific software packages described. The purpose of these technologies is to streamline the instructional analysis and design process, using the computer for its strengths to aid the human-in-the-loop. Currently, the process is accomplished entirely manually. Applying automated tools to the process frees the humans from some of the tedium involved so that they can be dedicated to the more complex aspects of the process. 12 refs.

  11. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.

  12. An easily fabricated high performance ionic polymer based sensor network

    NASA Astrophysics Data System (ADS)

    Zhu, Zicai; Wang, Yanjie; Hu, Xiaopin; Sun, Xiaofei; Chang, Longfei; Lu, Pin

    2016-08-01

    Ionic polymer materials can generate an electrical potential from ion migration under an external force. For traditional ionic polymer metal composite sensors, the output voltage is very small (a few millivolts), and the fabrication process is complex and time-consuming. This letter presents an ionic polymer based network of pressure sensors which is easily and quickly constructed, and which can generate high voltage. A 3 × 3 sensor array was prepared by casting Nafion solution directly over copper wires. Under applied pressure, two different levels of voltage response were observed among the nine nodes in the array. For the group producing the higher level, peak voltages reached as high as 25 mV. Computational stress analysis revealed the physical origin of the different responses. High voltages resulting from the stress concentration and asymmetric structure can be further utilized to modify subsequent designs to improve the performance of similar sensors.

  13. Higher-dimensional performance of port-based teleportation.

    PubMed

    Wang, Zhi-Wei; Braunstein, Samuel L

    2016-01-01

    Port-based teleportation (PBT) is a variation of regular quantum teleportation that operates without a final unitary correction. However, its behavior for higher-dimensional systems has been hard to calculate explicitly beyond dimension d = 2. Indeed, relying on conventional Hilbert-space representations entails an exponential overhead with increasing dimension. Some general upper and lower bounds for various success measures, such as (entanglement) fidelity, are known, but some become trivial in higher dimensions. Here we construct a graph-theoretic algebra (a subset of Temperley-Lieb algebra) which allows us to explicitly compute the higher-dimensional performance of PBT for so-called "pretty-good measurements" with negligible representational overhead. This graphical algebra allows us to explicitly compute the success probability to distinguish the different outcomes and fidelity for arbitrary dimension d and low number of ports N, obtaining in addition a simple upper bound. The results for low N and arbitrary d show that the entanglement fidelity asymptotically approaches N/d² for large d, confirming the performance of one lower bound from the literature. PMID:27605383

  14. Design and performance of nitride-based UV LEDs

    SciTech Connect

    CRAWFORD,MARY H.; HAN,JUNG; CHOW,WENG W.; BANAS,MICHAEL ANTHONY; FIGIEL,JEFFREY J.; ZHANG,LEI; SHUL,RANDY J.

    2000-02-16

    In this paper, the authors overview several of the critical materials growth, design and performance issues for nitride-based UV (< 400 nm) LEDs. The critical issue of optical efficiency is presented through temperature-dependent photoluminescence studies of various UV active regions. These studies demonstrate enhanced optical efficiencies for active regions with In-containing alloys (InGaN, AlInGaN). The authors discuss the trade-off between the challenging growth of high Al containing alloys (AlGaN, AlGaInN), and the need for sufficient carrier confinement in UV heterostructures. Carrier leakage for various composition AlGaN barriers is examined through a calculation of the total unconfined carrier density in the quantum well system. They compare the performance of two distinct UV LED structures: GaN/AlGaN quantum well LEDs for λ < 360 nm emission, and InGaN/AlGaInN quantum well LEDs for 370 nm < λ < 390 nm emission.

  15. Higher-dimensional performance of port-based teleportation

    PubMed Central

    Wang, Zhi-Wei; Braunstein, Samuel L.

    2016-01-01

    Port-based teleportation (PBT) is a variation of regular quantum teleportation that operates without a final unitary correction. However, its behavior for higher-dimensional systems has been hard to calculate explicitly beyond dimension d = 2. Indeed, relying on conventional Hilbert-space representations entails an exponential overhead with increasing dimension. Some general upper and lower bounds for various success measures, such as (entanglement) fidelity, are known, but some become trivial in higher dimensions. Here we construct a graph-theoretic algebra (a subset of Temperley-Lieb algebra) which allows us to explicitly compute the higher-dimensional performance of PBT for so-called “pretty-good measurements” with negligible representational overhead. This graphical algebra allows us to explicitly compute the success probability to distinguish the different outcomes and fidelity for arbitrary dimension d and low number of ports N, obtaining in addition a simple upper bound. The results for low N and arbitrary d show that the entanglement fidelity asymptotically approaches N/d2 for large d, confirming the performance of one lower bound from the literature. PMID:27605383

  16. Regime-based forecast performance during WFIP 1

    NASA Astrophysics Data System (ADS)

    Freedman, J. M.; Zack, J. W.; Manobianco, J.; Beaucage, P.; Rojowsky, K.

    2015-12-01

    The principal objectives of the first Wind Forecast Improvement Project (WFIP 1) were to improve short-term (0 - 6 hr) wind power forecasts through the assimilation of targeted remote sensing and surface observations with an enhanced model ensemble forecast system. The WFIP 1 field deployment/modeling campaign in the Southern Study Area (SSA--encompassing most of central and western Texas) ran from August 2011 through September 2012. This ensured observational data and model output were available for all representative weather regimes affecting the SSA. Cold and warm season regimes featured synoptic-scale, convective, and low-level jet (LLJ) phenomena that are responsible for the favorable wind resource in the SSA, and also posed a challenge for assigning specific explanations for the observed forecast improvements (e.g. additional observations, model improvements, or a combination of both). LLJs produced hourly capacity factors exceeding 80% in aggregated wind farm power production, while synoptic-scale systems were responsible for the largest ramp events observed during WFIP 1. Accurately forecasting convective phenomena (such as outflow boundaries) during WFIP 1 was at times problematic. Here, we present regime-based and phenomenologically related forecast performance results for WFIP 1. These performance metrics suggest future research pathways that will facilitate improvements in operational wind power forecasts.

  17. Higher-dimensional performance of port-based teleportation.

    PubMed

    Wang, Zhi-Wei; Braunstein, Samuel L

    2016-01-01

    Port-based teleportation (PBT) is a variation of regular quantum teleportation that operates without a final unitary correction. However, its behavior for higher-dimensional systems has been hard to calculate explicitly beyond dimension d = 2. Indeed, relying on conventional Hilbert-space representations entails an exponential overhead with increasing dimension. Some general upper and lower bounds for various success measures, such as (entanglement) fidelity, are known, but some become trivial in higher dimensions. Here we construct a graph-theoretic algebra (a subset of Temperley-Lieb algebra) which allows us to explicitly compute the higher-dimensional performance of PBT for so-called "pretty-good measurements" with negligible representational overhead. This graphical algebra allows us to explicitly compute the success probability to distinguish the different outcomes and fidelity for arbitrary dimension d and low number of ports N, obtaining in addition a simple upper bound. The results for low N and arbitrary d show that the entanglement fidelity asymptotically approaches N/d² for large d, confirming the performance of one lower bound from the literature.

  18. Higher-dimensional performance of port-based teleportation

    NASA Astrophysics Data System (ADS)

    Wang, Zhi-Wei; Braunstein, Samuel L.

    2016-09-01

    Port-based teleportation (PBT) is a variation of regular quantum teleportation that operates without a final unitary correction. However, its behavior for higher-dimensional systems has been hard to calculate explicitly beyond dimension d = 2. Indeed, relying on conventional Hilbert-space representations entails an exponential overhead with increasing dimension. Some general upper and lower bounds for various success measures, such as (entanglement) fidelity, are known, but some become trivial in higher dimensions. Here we construct a graph-theoretic algebra (a subset of Temperley-Lieb algebra) which allows us to explicitly compute the higher-dimensional performance of PBT for so-called “pretty-good measurements” with negligible representational overhead. This graphical algebra allows us to explicitly compute the success probability to distinguish the different outcomes and fidelity for arbitrary dimension d and low number of ports N, obtaining in addition a simple upper bound. The results for low N and arbitrary d show that the entanglement fidelity asymptotically approaches N/d2 for large d, confirming the performance of one lower bound from the literature.

  19. Performance Study of optical Modulator based on electrooptic effect

    NASA Astrophysics Data System (ADS)

    Palodiya, V.; Raghuwanshi, S. K.

    2016-08-01

    In this paper, we have studied and derived the performance parameters of a highly integrated lithium niobate optical modulator. This is a chirp-free modulator having low switching voltage and large bandwidth. For an external modulator in which the travelling-wave electrode length L sets the modulating switching voltage, the product of Vπ and L is fixed for a given electro-optic material such as lithium niobate. We investigate how to achieve a low Vπ through the magnitude of the electro-optic coefficient for a wide variety of electro-optic materials. A Sellmeier equation for the extraordinary index of congruent lithium niobate is derived. For phase-matching, predictions are accurate for temperatures between room temperature and 250°C and wavelengths ranging from 0.4 to 5 µm. The Sellmeier equation predicts refractive indices more accurately at long wavelengths. Theoretical results are confirmed by simulation. We have analysed various parameters such as switching voltage, device performance index, time constant, transmittance, cut-off frequency, 3-dB bandwidth, power absorption coefficient and transmission bit rate of the lithium niobate optical modulator based on the electro-optic effect.
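    A one-oscillator Sellmeier form of the kind referred to above can be sketched in a few lines. The coefficients in this Python illustration are placeholders chosen only to show normal dispersion; they are not the paper's fitted temperature-dependent values for congruent lithium niobate:

```python
def sellmeier_index(wavelength_um, A, B, C, D):
    """Extraordinary refractive index from a one-term Sellmeier form:
    n^2 = A + B / (lambda^2 - C) - D * lambda^2, with lambda in
    micrometers. A, B, C, D are material-fit coefficients (the values
    used in any real calculation must come from a published fit)."""
    lam2 = wavelength_um ** 2
    n2 = A + B / (lam2 - C) - D * lam2
    return n2 ** 0.5
```

    With physically plausible coefficients, the index decreases smoothly with wavelength across the visible and near infrared, which is the behavior the paper's fit reproduces.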

  20. Logistics Reduction Technologies for Exploration Missions

    NASA Technical Reports Server (NTRS)

    Broyan, James L., Jr.; Ewert, Michael K.; Fink, Patrick W.

    2014-01-01

    Human exploration missions under study are very limited by the launch mass capacity of existing and planned vehicles. The logistical mass of crew items is typically considered separate from the vehicle structure, habitat outfitting, and life support systems. Consequently, crew item logistical mass is typically competing with vehicle systems for mass allocation. NASA's Advanced Exploration Systems (AES) Logistics Reduction and Repurposing (LRR) Project is developing five logistics technologies guided by a systems engineering cradle-to-grave approach to enable used crew items to augment vehicle systems. Specifically, AES LRR is investigating the direct reduction of clothing mass, the repurposing of logistical packaging, the use of autonomous logistics management technologies, the processing of spent crew items to benefit radiation shielding and water recovery, and the conversion of trash to propulsion gases. The systematic implementation of these types of technologies will increase launch mass efficiency by enabling items to be used for secondary purposes and improve the habitability of the vehicle as the mission duration increases. This paper provides a description and the challenges of the five technologies under development and the estimated overall mission benefits of each technology.

  1. Satellite rainfall retrieval by logistic regression

    NASA Technical Reports Server (NTRS)

    Chiu, Long S.

    1986-01-01

    The potential use of logistic regression in rainfall estimation from satellite measurements is investigated. Satellite measurements provide covariate information in terms of radiances from different remote sensors. The logistic regression technique can effectively accommodate many covariates and test their significance in the estimation. The outcome from the logistical model is the probability that the rainrate of a satellite pixel is above a certain threshold. By varying the thresholds, a rainrate histogram can be obtained, from which the mean and the variance can be estimated. A logistical model is developed and applied to rainfall data collected during GATE, using as covariates the fractional rain area and a radiance measurement which is deduced from a microwave temperature-rainrate relation. It is demonstrated that the fractional rain area is an important covariate in the model, consistent with the use of the so-called Area Time Integral in estimating total rain volume in other studies. To calibrate the logistical model, simulated rain fields generated by rainfield models with prescribed parameters are needed. A stringent test of the logistical model is its ability to recover the prescribed parameters of simulated rain fields. A rain field simulation model which preserves the fractional rain area and lognormality of rainrates as found in GATE is developed. A stochastic regression model of branching and immigration whose solutions are lognormally distributed in some asymptotic limits has also been developed.
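    The threshold-varying idea can be sketched numerically: given logistic-model exceedance probabilities P(rainrate > t) at several increasing thresholds, differencing them yields a histogram over rainrate bins, from which a crude mean follows. The helper below is our own illustration of that step, not the paper's calibration procedure (the logistic fits themselves are assumed done elsewhere):

```python
import numpy as np

def histogram_from_exceedance(thresholds, p_exceed):
    """Recover an approximate rainrate histogram from exceedance
    probabilities P(rainrate > t) at increasing thresholds t, then
    estimate the mean rainrate. Bin mass is the drop in exceedance
    probability between consecutive thresholds; the tail beyond the
    last threshold is handled crudely as a lower bound."""
    t = np.asarray(thresholds, float)
    p = np.asarray(p_exceed, float)
    edges = np.concatenate(([0.0], t))
    mass = -np.diff(np.concatenate(([1.0], p)))   # mass per bin
    centers = 0.5 * (edges[:-1] + edges[1:])      # bin midpoints
    mean = float(np.sum(mass * centers)) + float(p[-1]) * t[-1]
    return mass, mean
```

    For a rainrate distribution that is roughly exponential, the recovered mean tracks the true mean to within the coarseness of the threshold grid.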

  2. Performance of Lightweight Concrete based on Granulated Foamglass

    NASA Astrophysics Data System (ADS)

    Popov, M.; Zakrevskaya, L.; Vaganov, V.; Hempel, S.; Mechtcherine, V.

    2015-11-01

    The paper presents an investigation of the properties of lightweight concretes based on granulated foamglass (GFG) aggregates (GFG-LWC). The application of GFG in concrete might significantly reduce the volume of waste glass and enhance the recycling industry in order to improve environmental performance. The conducted experiments showed high strength and thermal properties for GFG-LWC. However, the use of GFG in concrete is associated with the risk of harmful alkali-silica reactions (ASR). Thus, one of the main aims was to study ASR manifestation in GFG-LWC. It was found that lightweight concrete based on porous aggregates and ordinary concrete have different mechanisms of ASR. In GFG-LWC, microstructural changes, partial destruction of granules, and accumulation of silica hydro-gel in pores were observed. According to the existing methods of analysis of ASR manifestation in concrete, sample expansion was measured; however, this method was found not to be appropriate for indicating ASR in concrete with porous aggregates. Microstructural analysis and testing of the concrete strength are needed to evaluate the degree of damage due to ASR. Low-alkali cement and various pozzolanic additives were chosen as preventive measures against ASR. The final composition of the GFG-LWC provides very good characteristics with respect to compressive strength, thermal conductivity and durability. On the whole, the potential for GFG-LWC has been identified.

  3. Design consideration and performance analysis of OCT-based topography

    NASA Astrophysics Data System (ADS)

    Meemon, Panomsak; Yao, Jianing; Rolland, Jannick P.

    2014-03-01

    We report a study on design consideration and performance analysis of OCT-based topography by tracking of maximum intensity at each layer's interface. We demonstrate that, for a given stabilized OCT system, a high precision and accuracy of OCT-based layer and thickness topography on the order of tens of nanometers can be achieved by using a technique of maximum amplitude tracking. The submicron precision was obtained by oversampling through the FFT of the acquired spectral fringes but was eventually limited by the system stability. Furthermore, we report characterization of the precision, repeatability, and accuracy of the surface, sub-surface, and thickness topography using our optimized FD-OCT system. We verified that for a given stability of our OCT system, a precision of down to 20 nm in the detected position of the signal's peak was obtained. In addition, we quantified the degradation of the precision caused by sensitivity fall-off over depth of FD-OCT. The measured precision is about 20 nm at about 0.1 mm depth, and degrades to about 80 nm at 1 mm depth, a position of about 10 dB sensitivity fall-off. The measured repeatability of thickness measurements over depth was approximately 0.04 micron. Finally, the accuracy of the system was verified by comparison with a digital micrometer gauge.
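    The peak-tracking-by-oversampling step described above can be illustrated with a short sketch (our own simplified version; the function name and padding factor are assumptions). Zero-padding the FFT of the spectral fringes interpolates the depth profile, so a reflector's position can be located to a fraction of a bin:

```python
import numpy as np

def locate_peak_depth(fringe, pad_factor=32):
    """Locate a reflector from spectral fringes by zero-padded FFT
    (oversampling), as in FD-OCT maximum-amplitude tracking.
    Returns the peak position in fractional bins of the original
    (unpadded) transform."""
    n = len(fringe)
    spec = np.abs(np.fft.rfft(fringe, n * pad_factor))  # oversampled A-scan
    spec[0] = 0.0  # suppress the DC term
    return np.argmax(spec) / pad_factor
```

    The achievable localization is then limited by the padding factor and, as the paper notes, ultimately by system stability and sensitivity fall-off rather than by the interpolation itself.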

  4. Quantitative microbial risk assessment for Escherichia coli O157:H7, Salmonella enterica, and Listeria monocytogenes in leafy green vegetables consumed at salad bars, based on modeling supply chain logistics.

    PubMed

    Tromp, S O; Rijgersberg, H; Franz, E

    2010-10-01

    Quantitative microbial risk assessments do not usually account for the planning and ordering mechanisms (logistics) of a food supply chain. These mechanisms and consumer demand determine the storage and delay times of products. The aim of this study was to quantitatively assess the difference between simulating supply chain logistics (MOD) and assuming fixed storage times (FIX) in microbial risk estimation for the supply chain of fresh-cut leafy green vegetables destined for working-canteen salad bars. The results of the FIX model were previously published (E. Franz, S. O. Tromp, H. Rijgersberg, and H. J. van der Fels-Klerx, J. Food Prot. 73:274-285, 2010). Pathogen growth was modeled using stochastic discrete-event simulation of the applied logistics concept. The public health effects were assessed by conducting an exposure assessment and risk characterization. The relative growths of Escherichia coli O157 (17%) and Salmonella enterica (15%) were identical in the MOD and FIX models. In contrast, the relative growth of Listeria monocytogenes was considerably higher in the MOD model (1,156%) than in the FIX model (194%). The probability of L. monocytogenes infection in The Netherlands was higher in the MOD model (5.18 × 10⁻⁸) than in the FIX model (1.23 × 10⁻⁸). The risk of listeriosis-induced fetal mortality in the perinatal population increased from 1.24 × 10⁻⁴ (FIX) to 1.66 × 10⁻⁴ (MOD). Modeling the probabilistic nature of supply chain logistics is of additional value for microbial risk assessments regarding psychrotrophic pathogens in food products for which time and temperature are the postharvest preventive measures in guaranteeing food safety.
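    The contrast between fixed and stochastic storage times can be illustrated with a toy Monte Carlo sketch. The stage structure and parameters below are invented for illustration and are not the paper's fitted values or its discrete-event model; the point is only that growth accumulated over randomly drawn storage times has a distribution, not a single value:

```python
import numpy as np

def simulate_chain_growth(n_runs=10000, rng=None):
    """Monte Carlo sketch of pathogen growth across a supply chain:
    each stage contributes (random storage time in hours) x (growth
    rate in log10 CFU per hour). Stage parameters are illustrative
    placeholders only."""
    rng = np.random.default_rng(rng)
    # (mean_time_h, sd_time_h, growth_rate_log10_per_h) per stage
    stages = [(24, 6, 0.005),   # distribution centre
              (48, 12, 0.010),  # retail storage
              (4, 1, 0.020)]    # salad bar display
    growth = np.zeros(n_runs)
    for mean_t, sd_t, rate in stages:
        t = np.clip(rng.normal(mean_t, sd_t, n_runs), 0, None)
        growth += rate * t
    return growth  # log10 relative growth per simulated run
```

    Replacing each random draw with its mean recovers the FIX-style point estimate, while the simulated distribution shows how variable logistics can shift the risk tail upward, as observed for L. monocytogenes.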

  5. Scientific research tools as an aid to Antarctic logistics

    NASA Astrophysics Data System (ADS)

    Dinn, Michael; Rose, Mike; Smith, Andrew; Fleming, Andrew; Garrod, Simon

    2013-04-01

    Logistics have always been a vital part of polar exploration and research. The more efficient those logistics can be made, the greater the likelihood that research programmes will be delivered on time, safely and to maximum scientific effectiveness. Over the last decade, the potential for symbiosis between logistics and some of the scientific research methods themselves has increased remarkably; suites of scientific tools can help to optimise logistic efforts, thereby enhancing the effectiveness of further scientific activity. We present one recent example of input to logistics from scientific activities, in support of the NERC iSTAR Programme, a major ice sheet research effort in West Antarctica. We used data output from a number of research tools, spanning a range of techniques and international agencies, to support the deployment of a tractor-traverse system into a remote area of mainland Antarctica. The tractor system was deployed from RRS Ernest Shackleton onto the Abbot Ice Shelf, then driven inland to the research area on Pine Island Glacier. Data from NASA ICEBRIDGE were used to determine the ice-front freeboard and surface gradients for the traverse route off the ice shelf and onwards into the continent. Quickbird high resolution satellite imagery provided clear images of the route track and some insight into snow surface roughness. Polarview satellite data gave sea ice information in the Amundsen Sea, both the previous multi-annual historical characteristics and real-time information during deployment. Likewise, meteorological data contributed both historical information and real-time support during deployment. Finally, during the tractors' inland journey, ground-based high frequency radar was used to determine a safe, crevasse-free route.

  6. Performance-based selection of likelihood models for phylogeny estimation.

    PubMed

    Minin, Vladimir; Abdo, Zaid; Joyce, Paul; Sullivan, Jack

    2003-10-01

    Phylogenetic estimation has largely come to rely on explicitly model-based methods. This approach requires that a model be chosen and that that choice be justified. To date, justification has largely been accomplished through use of likelihood-ratio tests (LRTs) to assess the relative fit of a nested series of reversible models. While this approach certainly represents an important advance over arbitrary model selection, the best fit of a series of models may not always provide the most reliable phylogenetic estimates for finite real data sets, where all available models are surely incorrect. Here, we develop a novel approach to model selection, which is based on the Bayesian information criterion, but incorporates relative branch-length error as a performance measure in a decision theory (DT) framework. This DT method includes a penalty for overfitting, is applicable prior to running extensive analyses, and simultaneously compares all models being considered and thus does not rely on a series of pairwise comparisons of models to traverse model space. We evaluate this method by examining four real data sets and by using those data sets to define simulation conditions. In the real data sets, the DT method selects the same or simpler models than conventional LRTs. In order to lend generality to the simulations, codon-based models (with parameters estimated from the real data sets) were used to generate simulated data sets, which are therefore more complex than any of the models we evaluate. On average, the DT method selects models that are simpler than those chosen by conventional LRTs. Nevertheless, these simpler models provide estimates of branch lengths that are more accurate both in terms of relative error and absolute error than those derived using the more complex (yet still wrong) models chosen by conventional LRTs. This method is available in a program called DT-ModSel. PMID:14530134
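    As a reference point for the criterion underlying the DT method above, the Bayesian information criterion trades log-likelihood against parameter count: BIC = -2 ln L + k ln n. A minimal sketch with hypothetical likelihood values, not figures from the study's data sets:

```python
import math

def bic(log_likelihood, n_params, n_sites):
    """Bayesian information criterion: lower is better."""
    return -2.0 * log_likelihood + n_params * math.log(n_sites)

# Hypothetical fits of two nested substitution models to a 1000-site alignment;
# the richer model gains too little likelihood to justify its extra parameters.
print(bic(-5210.0, 1, 1000))  # JC69-like model, 1 free parameter
print(bic(-5205.0, 9, 1000))  # GTR-like model, 9 free parameters (higher BIC)
```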

  7. Quantifying the complexity of the delayed logistic map.

    PubMed

    Masoller, Cristina; Rosso, Osvaldo A

    2011-01-28

    Statistical complexity measures are used to quantify the degree of complexity of the delayed logistic map, with linear and nonlinear feedback. We employ two methods for calculating the complexity measures, one with the 'histogram-based' probability distribution function and the other with ordinal patterns. We show that these methods provide complementary information about the complexity of the delay-induced dynamics: there are parameter regions where the histogram-based complexity is zero while the ordinal pattern complexity is not, and vice versa. We also show that the time series generated from the nonlinear delayed logistic map can present no missing or forbidden patterns, i.e., all possible ordinal patterns are realized in the orbits.
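    The ordinal-pattern analysis can be sketched in a few lines: iterate a delayed logistic map x[t+1] = r·x[t]·(1 − x[t−τ]) and count which of the order! permutation patterns occur; patterns with zero count are the "forbidden" ones. The map form and parameter values here are illustrative assumptions, not necessarily those used in the paper.

```python
import itertools
import numpy as np

def delayed_logistic(r, tau, n, x0=0.5):
    """Iterate the delayed logistic map x[t+1] = r * x[t] * (1 - x[t - tau])."""
    x = [x0] * (tau + 1)
    for _ in range(n):
        x.append(r * x[-1] * (1 - x[-1 - tau]))
    return np.array(x[tau + 1:])

def ordinal_counts(series, order=3):
    """Count occurrences of each of the order! ordinal (permutation) patterns."""
    counts = {p: 0 for p in itertools.permutations(range(order))}
    for i in range(len(series) - order + 1):
        counts[tuple(np.argsort(series[i:i + order]))] += 1
    return counts

x = delayed_logistic(r=2.1, tau=1, n=5000)
forbidden = [p for p, c in ordinal_counts(x).items() if c == 0]
print(len(forbidden))  # number of patterns that never occur in this orbit
```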

  8. ISS Update: Logistics Reduction and Repurposing (Part 2)

    NASA Video Gallery

    Public Affairs Officer Brandi Dean interviews Sarah Shull, Deputy Project Manager Logistics Reduction and Repurposing. Shull, who is with the Advanced Exploration Systems, discusses the Logistics t...

  9. ISS Update: Logistics Reduction and Repurposing (Part 1)

    NASA Video Gallery

    Public Affairs Officer Brandi Dean interviews Sarah Shull, Deputy Project Manager Logistics Reduction and Repurposing. Shull, who is with the Advanced Exploration Systems, discusses the Logistics t...

  10. Concurrent and Longitudinal Patterns and Trends in Performance on Early Numeracy Curriculum-Based Measures in Kindergarten through Third Grade

    ERIC Educational Resources Information Center

    Missall, Kristen N.; Mercer, Sterett H.; Martinez, Rebecca S.; Casebeer, Dian

    2012-01-01

    The purpose of this study was to extend the research on the "Tests of Early Numeracy Curriculum-Based Measurement" (TEN-CBM) tools by examining concurrent and predictive relations from kindergarten through third grade. Using a longitudinal sample of 535 students, this study included logistic regression, latent cluster, and latent transition…

  11. Exploring Proficiency-Based vs. Performance-Based Items with Elicited Imitation Assessment

    ERIC Educational Resources Information Center

    Cox, Troy L.; Bown, Jennifer; Burdis, Jacob

    2015-01-01

    This study investigates the effect of proficiency- vs. performance-based elicited imitation (EI) assessment. EI requires test-takers to repeat sentences in the target language. The accuracy at which test-takers are able to repeat sentences highly correlates with test-takers' language proficiency. However, in EI, the factors that render an item…

  12. NASA Space Exploration Logistics Workshop Proceedings

    NASA Technical Reports Server (NTRS)

    de Weck, Olivier; Evans, William A.; Parrish, Joe; James, Sarah

    2006-01-01

    As NASA has embarked on a new Vision for Space Exploration, there is new energy and focus around the area of manned space exploration. These activities encompass the design of new vehicles such as the Crew Exploration Vehicle (CEV) and Crew Launch Vehicle (CLV) and the identification of commercial opportunities for space transportation services, as well as continued operations of the Space Shuttle and the International Space Station. Reaching the Moon and eventually Mars with a mix of both robotic and human explorers for short-term missions is a formidable challenge in itself. How to achieve this in a safe, efficient and long-term sustainable way is yet another question. The challenge is not only one of vehicle design, launch, and operations but also one of space logistics. Oftentimes, logistical issues are not given enough consideration upfront, relative to the large share of operating budgets they consume. In this context, a group of 54 experts in space logistics met for a two-day workshop to discuss the following key questions: 1. What is the current state-of-the-art in space logistics, in terms of architectures, concepts, technologies as well as enabling processes? 2. What are the main challenges for space logistics for future human exploration of the Moon and Mars, at the intersection of engineering and space operations? 3. What lessons can be drawn from past successes and failures in human space flight logistics? 4. What lessons and connections do we see from terrestrial analogies as well as activities in other areas, such as U.S. military logistics? 5. What key advances are required to enable long-term success in the context of a future interplanetary supply chain? These proceedings summarize the outcomes of the workshop, reference particular presentations, panels and breakout sessions, and record specific observations that should help guide future efforts.

  13. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    NASA Astrophysics Data System (ADS)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel

  14. Evaluating performances of simplified physically based models for landslide susceptibility

    NASA Astrophysics Data System (ADS)

    Formetta, G.; Capparelli, G.; Versace, P.

    2015-12-01

    Rainfall-induced shallow landslides cause loss of life and significant damage to private and public property, transportation systems, etc. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually two main approaches are used to accomplish this task: statistical or physically based models. Reliable model application involves automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analysis. This paper presents a methodology to systematically and objectively calibrate, verify and compare different models and different model performance indicators, in order to identify and eventually select the models whose behavior is most reliable for a given case study. The procedure was implemented in a package of models for landslide susceptibility analysis integrated into the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing pixel-by-pixel model results and measurement data. Moreover, the package's integration in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes and automatic calibration algorithms to estimate model parameters. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, optimization of the index of distance to perfect classification in the receiver operating characteristic plane (D2PC) coupled with model M3 is the best modeling solution for our test case.
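    The D2PC index named above is the Euclidean distance from a classifier's point in the ROC plane to the perfect-classification corner (TPR = 1, FPR = 0). A minimal sketch with hypothetical pixel-by-pixel confusion counts:

```python
import math

def d2pc(tp, fn, fp, tn):
    """Distance to perfect classification in the ROC plane:
    sqrt((1 - TPR)^2 + FPR^2); 0 means a perfect classifier."""
    tpr = tp / (tp + fn)
    fpr = fp / (fp + tn)
    return math.sqrt((1.0 - tpr) ** 2 + fpr ** 2)

# Hypothetical pixel-by-pixel confusion counts for two candidate maps
print(d2pc(tp=80, fn=20, fp=10, tn=90))  # ≈ 0.224, the better map
print(d2pc(tp=60, fn=40, fp=5, tn=95))   # ≈ 0.403
```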

  15. Humanitarian response: improving logistics to save lives.

    PubMed

    McCoy, Jessica

    2008-01-01

    Each year, millions of people worldwide are affected by disasters, underscoring the importance of effective relief efforts. Many highly visible disaster responses have been inefficient and ineffective. Humanitarian agencies typically play a key role in disaster response (eg, procuring and distributing relief items to an affected population, assisting with evacuation, providing healthcare, assisting in the development of long-term shelter), and thus their efficiency is critical for a successful disaster response. The field of disaster and emergency response modeling is well established, but the application of such techniques to humanitarian logistics is relatively recent. This article surveys models of humanitarian response logistics and identifies promising opportunities for future work. Existing models analyze a variety of preparation and response decisions (eg, warehouse location and the distribution of relief supplies), consider both natural and manmade disasters, and typically seek to minimize cost or unmet demand. Opportunities to enhance the logistics of humanitarian response include the adaptation of models developed for general disaster response; the use of existing models, techniques, and insights from the literature on commercial supply chain management; the development of working partnerships between humanitarian aid organizations and private companies with expertise in logistics; and the consideration of behavioral factors relevant to a response. Implementable, realistic models that support the logistics of humanitarian relief can improve the preparation for and the response to disasters, which in turn can save lives.

  16. Programmable Thermostat Module Upgrade for the Multipurpose Logistics Module

    NASA Technical Reports Server (NTRS)

    Clark, D. W.; Glasgow, S. d.; Reagan, S. E.; Presson, K. H.; Howard, D. E.; Smith, D. A.

    2007-01-01

    The STS-121/ULF 1.1 mission was the maiden flight of the programmable thermostat module (PTM) system used to control the 28 V shell heaters on the multi-purpose logistics module (MPLM). These PTMs, in conjunction with a data recorder module (DRM), provide continuous closed loop temperature control and data recording of MPLM on-orbit heater operations. This Technical Memorandum discusses the hardware design, development, test, and verification (DDT&V) activities performed at the Marshall Space Flight Center as well as the operational implementation and mission performance.

  17. A Case Study Using Modeling and Simulation to Predict Logistics Supply Chain Issues

    NASA Technical Reports Server (NTRS)

    Tucker, David A.

    2007-01-01

    Optimization of critical supply chains to deliver thousands of parts, materials, sub-assemblies, and vehicle structures as needed is vital to the success of the Constellation Program. Thorough analysis needs to be performed on the integrated supply chain processes to plan, source, make, deliver, and return critical items efficiently. Process modeling provides simulation technology-based, predictive solutions for supply chain problems which enable decision makers to reduce costs, accelerate cycle time and improve business performance. For example, United Space Alliance, LLC utilized this approach in late 2006 to build simulation models that recreated shuttle orbiter thruster failures and predicted the potential impact of thruster removals on logistics spare assets. The main objective was the early identification of possible problems in providing thruster spares for the remainder of the Shuttle Flight Manifest. After extensive analysis the model results were used to quantify potential problems and led to improvement actions in the supply chain. Similarly the proper modeling and analysis of Constellation parts, materials, operations, and information flows will help ensure the efficiency of the critical logistics supply chains and the overall success of the program.

  18. The Logistics Equipment Carbon Emission Monitoring System for a Green Logistics

    NASA Astrophysics Data System (ADS)

    Choi, Hyungrim; Park, Byoungkwon; Lee, Byungha; Park, Yongsung; Lee, Changsup; Ha, Jeongsoo

    Recently, due to the global enforcement of obligations to reduce greenhouse gases and various environmental regulations, low-carbon green growth strategies are required. Currently, in our country, environmentally friendly logistics activities remain at an early stage compared to advanced countries because of our country's energy-intensive industrial structure. As a measure to respond to the reinforcement of international environmental regulations in the logistics sector, active green logistics systems should be established. To address this problem, this study is intended to develop a monitoring system that can manage the carbon emissions of logistics equipment (container trucks, discharging equipment, etc.) in real time using a new technology named IP-RFID. The monitoring system developed in this study can actively manage the carbon emissions of individual logistics equipment by attaching IP-Tags that measure the carbon emissions of that equipment in real time and transmit the measured information directly to users through IP communication. Since carbon emissions can be managed per item of logistics equipment, and drivers can check the carbon emissions of their equipment through this system, the carbon emissions generated in the logistics sector may be reduced by using it.

  19. Interacting Bose gas, the logistic law, and complex networks

    NASA Astrophysics Data System (ADS)

    Sowa, A.

    2015-01-01

    We discuss a mathematical link between Quantum Statistical Mechanics and logistic growth and decay processes. It is based on an observation that a certain nonlinear operator evolution equation, which we refer to as the Logistic Operator Equation (LOE), provides an extension of the standard model of noninteracting bosons. We discuss formal solutions (asymptotic formulas) for a special calibration of the LOE, which sets it in the number-theoretic framework. This trick, in the tradition of Julia and Bost-Connes, makes it possible for us to tap into the vast resources of classical mathematics and, in particular, to construct explicit solutions of the LOE via the Dirichlet series. The LOE is applicable to a range of modeling and simulation tasks, from characterization of interacting boson systems to simulation of some complex man-made networks. The theoretical results enable numerical simulations, which, in turn, shed light on the unique complexities of the rich and multifaceted models resulting from the LOE.

  20. Reverse logistics in the Brazilian construction industry.

    PubMed

    Nunes, K R A; Mahler, C F; Valle, R A

    2009-09-01

    In Brazil most Construction and Demolition Waste (C&D waste) is not recycled. This situation is expected to change significantly, since new federal regulations oblige municipalities to create and implement sustainable C&D waste management plans which assign an important role to recycling activities. The recycling organizational network and its flows and components are fundamental to C&D waste recycling feasibility. Organizational networks, flows and components involve reverse logistics. The aim of this work is to introduce the concepts of reverse logistics and reverse distribution channel networks and to study the Brazilian C&D waste case. PMID:19481331

  1. Reliability Driven Space Logistics Demand Analysis

    NASA Technical Reports Server (NTRS)

    Knezevic, J.

    1995-01-01

    Accurate selection of the quantity of logistic support resources has a strong influence on mission success, system availability and the cost of ownership. At the same time the accurate prediction of these resources depends on the accurate prediction of the reliability measures of the items involved. This paper presents a method for the advanced and accurate calculation of the reliability measures of complex space systems which are the basis for the determination of the demands for logistics resources needed during the operational life or mission of space systems. The applicability of the method presented is demonstrated through several examples.
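    A standard way reliability measures translate into logistics demand, in the spirit of the method described (the formulation below is a generic textbook sketch, not the author's): if item failures follow a Poisson process with rate λ, the spares stock for a mission of duration t is the smallest s whose Poisson cumulative probability meets the required confidence.

```python
import math

def spares_needed(failure_rate, mission_time, confidence=0.95):
    """Smallest spares stock s with P(failures <= s) >= confidence, where the
    number of failures is Poisson with mean failure_rate * mission_time."""
    mean = failure_rate * mission_time
    term = math.exp(-mean)   # Poisson P(0 failures)
    cumulative, s = term, 0
    while cumulative < confidence:
        s += 1
        term *= mean / s     # P(s) = P(s-1) * mean / s
        cumulative += term
    return s

# Hypothetical unit: 2e-4 failures/hour over a 10,000-hour mission (mean = 2)
print(spares_needed(2e-4, 10_000))  # 5 spares for 95% confidence
```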

  2. Inspection logistics planning for multi-stage production systems with applications to semiconductor fabrication lines

    NASA Astrophysics Data System (ADS)

    Chen, Kyle Dakai

    Since the market for semiconductor products has become more lucrative and competitive, research into improving yields for semiconductor fabrication lines has lately received a tremendous amount of attention. One of the most critical tasks in achieving such yield improvements is to plan the in-line inspection sampling efficiently so that any potential yield problems can be detected early and eliminated quickly. We formulate a multi-stage inspection planning model based on configurations in actual semiconductor fabrication lines, specifically taking into account both the capacity constraint and the congestion effects at the inspection station. We propose a new mixed First-Come-First-Serve (FCFS) and Last-Come-First-Serve (LCFS) discipline for serving the inspection samples to expedite the detection of potential yield problems. Employing this mixed FCFS and LCFS discipline, we derive approximate expressions for the queueing delays in yield problem detection time and develop near-optimal algorithms to obtain the inspection logistics planning policies. We also investigate the queueing performance with this mixed type of service discipline under different assumptions and configurations. In addition, we conduct numerical tests and generate managerial insights based on input data from actual semiconductor fabrication lines. To the best of our knowledge, this research is novel in developing, for the first time in the literature, near-optimal results for inspection logistics planning in multi-stage production systems with congestion effects explicitly considered.

  3. Reducing Electrical Power Use with a Performance Based Incentive

    SciTech Connect

    M. Kathleen Nell

    2004-07-01

    This Departmental Energy Management Program (DEMP) funded Model Program Study developed out of a potential DOE-ID Performance Based Incentive for the Idaho National Engineering and Environmental Laboratory (INEEL), lasting from October 2001 through May 2002, which stressed reductions in electrical usage. An analysis of demand usage obtained from monthly INEEL Power Management electric reports revealed reductions in demand from a majority of the site areas. The purpose of this Model Program study was to determine the methods and activities that were used at these site areas to achieve the reductions in demand and to develop these demand reduction methods and activities into a Model Program that could be shared throughout the INEEL and DOE complex-wide for additional demand savings. INEEL Energy Management personnel interviewed contacts from the eight areas which had achieved a consistent reduction in demand during the study period, namely, Idaho Nuclear Technology and Engineering Center (INTEC), Test Area North (TAN), Power Burst Facility (PBF), Test Reactor Area (TRA) including Advanced Test Reactor ATR), Engineering Test Reactor (ETR), and Materials Test Reactor (MTR) areas, Central Facilities Area (CFA), Specific Manufacturing Capability (SMC), Radioactive Waste Management Complex (RWMC), and Argonne National Laboratory-West (ANLW). The information that resulted from the interviews indicated that more than direct demand and energy reduction actions were responsible for the recorded reductions in demand. INEEL Energy Management identified five categories of actions or conditions that contributed to the demand reduction. These categories are Decontamination and Decommissioning (D&D), employee actions, improvements, inactivation for maintenance, and processes. The following information details the findings from the study.

  4. Lunar Surface Architecture Utilization and Logistics Support Assessment

    NASA Astrophysics Data System (ADS)

    Bienhoff, Dallas; Findiesen, William; Bayer, Martin; Born, Andrew; McCormick, David

    2008-01-01

    Crew and equipment utilization and logistics support needs for the point of departure lunar outpost as presented by the NASA Lunar Architecture Team (LAT) and alternative surface architectures were assessed for the first ten years of operation. The lunar surface architectures were evaluated and manifests created for each mission. Distances between Lunar Surface Access Module (LSAM) landing sites and emplacement locations were estimated. Physical characteristics were assigned to each surface element and operational characteristics were assigned to each surface mobility element. Stochastic analysis was conducted to assess probable times to deploy surface elements, conduct exploration excursions, and perform defined crew activities. Crew time is divided into Outpost-related, exploration and science, overhead, and personal activities. Outpost-related time includes element deployment, EVA maintenance, IVA maintenance, and logistics resupply. Exploration and science activities include mapping, geological surveys, science experiment deployment, sample analysis and categorizing, and physiological and biological tests in the lunar environment. Personal activities include sleeping, eating, hygiene, exercising, and time off. Overhead activities include precursor or close-out tasks that must be accomplished but don't fit into the other three categories such as: suit donning and doffing, airlock cycle time, suit cleaning, suit maintenance, post-landing safing actions, and pre-departure preparations. Equipment usage time, spares, maintenance actions, and Outpost consumables are also estimated to provide input into logistics support planning. Results are normalized relative to the NASA LAT point of departure lunar surface architecture.

  5. Observer based output feedback tuning for underwater remotely operated vehicle based on linear quadratic performance

    NASA Astrophysics Data System (ADS)

    Aras, Mohd Shahrieel Mohd; Abdullah, Shahrum Shah; Kamarudin, Muhammad Nizam; Rahman, Ahmad Fadzli Nizam Abdul; Azis, Fadilah Abd; Jaafar, Hazriq Izzuan

    2015-05-01

    This paper describes the effectiveness of observer-based output feedback for an Unmanned Underwater Vehicle (UUV) with Linear Quadratic Regulation (LQR) performance. Tuning of the observer parameters is crucial for tracking purposes. Prior to tuning, the ranges of the observer and LQR parameters are obtained via the system output and error. The validation of this technique using an unmanned underwater vehicle called a Remotely Operated Vehicle (ROV) as the modelling case helps to improve the steady-state performance of the system response. The ROV modelling is focused on depth control using ROV 1, developed by the Underwater Technology Research Group (UTeRG). The results show that this technique improves steady-state performance in terms of overshoot and settling time of the system response.
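    For readers unfamiliar with the LQR side of this tuning problem, the feedback gain can be computed by iterating the discrete algebraic Riccati equation. The 2-state depth model below is a hypothetical stand-in, not the UTeRG ROV model:

```python
import numpy as np

# Hypothetical discretized 2-state depth model (dt = 0.1 s):
# x = [depth error, heave rate], u = thruster command
A = np.array([[1.0, 0.1], [0.0, 0.95]])
B = np.array([[0.0], [0.1]])
Q = np.diag([10.0, 1.0])   # penalize depth error most
R = np.array([[1.0]])

# Solve the discrete algebraic Riccati equation by fixed-point iteration
P = Q.copy()
for _ in range(500):
    G = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ G)

K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # feedback law u = -K x
print(K)
print(np.abs(np.linalg.eigvals(A - B @ K)))  # all moduli < 1: stable loop
```

    Heavier weights in Q relative to R trade control effort for faster depth regulation, which is the overshoot/settling-time trade-off the paper tunes.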

  6. Equation-based triangle orientation discrimination sensor performance model based on sampling effects.

    PubMed

    Wang, Xiao-rui; Zhang, Jian-qi; Feng, Zhuo-xiang; Chang, Hong-hua

    2005-02-01

    An equation-based triangle orientation discrimination (TOD) performance model was first developed to focus on staring thermal imagers. Specifically, the spatial spectra distribution of a standard triangle pattern is determined. The modulation effects of the overall components of the system on a nonperiodic standard triangle pattern are analyzed and modeled. The matched filter idea is used to characterize quantitatively the spatiotemporal integration of the eye and brain to signal, aliasing, and various noise components over a triangle pattern area, and the perceived signal-to-noise ratio of the staring thermal imager is derived. Further, the TOD performance theoretical model is established. Comparison with the experimental results shows that this theoretical model gives a reasonable prediction of the TOD performance curve for staring thermal imagers. Although more tests and modifications are required, these preliminary results suggest that this model can be developed into a model that predicts the TOD for a wide range of sensors. PMID:15726945

  7. The Columbus logistics support at the APMC: Requirements and implementation aspects

    NASA Technical Reports Server (NTRS)

    Canu, C.; Battocchio, L.; Masullo, S.

    1993-01-01

    This paper focuses on the logistics support to be provided by the APM Center (APMC). Among the Columbus ground infrastructures, this center is tasked to provide logistics, sustaining engineering and P/L integration support to the ongoing missions of the APM, i.e. the Columbus Laboratory attached to the Freedom Space Station. The following is illustrated: an analysis of the requirements that are levied on the logistics support of the APM; how such requirements are reflected in the corresponding support to be available on-ground and at the APMC; the functional components of the APMC logistics support and how such components interact with each other; how the logistics support function interfaces with the other functions of the ground support; and how the logistics support is being designed in terms of resources (such as hardware, databases, etc.). Emphasis is given to the data handling aspects and to the related databases that will constitute the fundamental source of information for the logistics activities during the APM's planned lifetime. Functional and physical architectures, together with trades for possible implementation, are addressed. Commonalities with other centers are taken into account and recommendations are made for possible reuse of tools already developed in the C/D phase. Finally, programmatic considerations are discussed for the actual implementation of the center.

  8. Expert music performance: cognitive, neural, and developmental bases.

    PubMed

    Brown, Rachel M; Zatorre, Robert J; Penhune, Virginia B

    2015-01-01

    In this chapter, we explore what happens in the brain of an expert musician during performance. Understanding expert music performance is interesting to cognitive neuroscientists not only because it tests the limits of human memory and movement, but also because studying expert musicianship can help us understand skilled human behavior in general. In this chapter, we outline important facets of our current understanding of the cognitive and neural basis for music performance, and developmental factors that may underlie musical ability. We address three main questions. (1) What is expert performance? (2) How do musicians achieve expert-level performance? (3) How does expert performance come about? We address the first question by describing musicians' ability to remember, plan, execute, and monitor their performances in order to perform music accurately and expressively. We address the second question by reviewing evidence for possible cognitive and neural mechanisms that may underlie or contribute to expert music performance, including the integration of sound and movement, feedforward and feedback motor control processes, expectancy, and imagery. We further discuss how neural circuits in auditory, motor, parietal, subcortical, and frontal cortex all contribute to different facets of musical expertise. Finally, we address the third question by reviewing evidence for the heritability of musical expertise and for how expertise develops through training and practice. We end by discussing outlooks for future work. PMID:25725910

  9. 48 CFR 970.1100-1 - Performance-based contracting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... requirements and expectations should be consistent with the Department's strategic planning goals and... planning processes. Measurable performance criteria, objective measures, and where appropriate,...

  10. An empirical hierarchical memory model based on hardware performance counters

    SciTech Connect

    Lubeck, O.M.; Luo, Y.; Wasserman, H.; Bassetti, F.

    1998-09-01

    In this paper, the authors characterize application performance with a memory-centric view. Using a simple strategy and performance data measured by on-chip hardware performance counters, they model the performance of a simple memory hierarchy and infer the contribution of each level in the memory system to an application's overall cycles per instruction (cpi). They account for the overlap of processor execution with memory accesses--a key parameter not directly measurable on most systems. They infer the separate contributions of three major architecture features in the memory subsystem of the Origin 2000: cache size, outstanding loads-under-miss, and memory latency.
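
    The abstract's additive view of per-level memory contributions to CPI can be sketched as a toy model. The function below is an illustrative assumption (the names and the single overlap factor are hypothetical simplifications, not the authors' actual model): total CPI is the ideal CPI plus, for each level, misses-per-instruction times latency, discounted by the fraction of memory time hidden by overlapped execution.

```python
def cpi_with_memory(cpi_ideal, levels, overlap_fraction):
    """Toy memory-centric CPI model.

    levels: list of (misses_per_instruction, latency_cycles) tuples,
            one per memory-hierarchy level.
    overlap_fraction: fraction of memory stall cycles hidden by
            overlapping execution (0 = no overlap, 1 = fully hidden).
    Returns (total_cpi, per_level_contributions).
    """
    contribs = [mpi * latency * (1.0 - overlap_fraction)
                for mpi, latency in levels]
    return cpi_ideal + sum(contribs), contribs
```

    For example, an ideal CPI of 1.0 with 0.02 L1 misses/instruction at 10 cycles and 0.005 L2 misses/instruction at 60 cycles, with 25% overlap, gives a total CPI of 1.375.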

  12. Performance-Based Contracting Within a State Substance Abuse Treatment System: A Preliminary Exploration of Differences in Client Access and Client Outcomes

    PubMed Central

    Brucker, Debra L.; Stewart, Maureen

    2013-01-01

    To explore whether the implementation of performance-based contracting (PBC) within the State of Maine’s substance abuse treatment system resulted in improved performance, one descriptive and two empirical analyses were conducted. The first analysis examined utilization and payment structure. The second study was designed to examine whether timeliness of access to outpatient (OP) and intensive outpatient (IOP) substance abuse assessments and treatment, measures that only became available after the implementation of PBC, differed between PBC and non-PBC agencies in the year following implementation of PBC. Using treatment admission records from the state treatment data system (N=9,128), logistic regression models run using generalized estimating equation techniques found no significant difference between PBC agencies and other agencies on timeliness of access to assessments or treatment, for both OP and IOP services. The third analysis, conducted using discharge data from the years prior to and after the implementation of performance-based contracting (N=6,740) for those agencies that became a part of the performance-based contracting system, was designed to assess differences in level of participation, retention, and completion of treatment. Regression models suggest that performance on OP client engagement and retention measures was significantly poorer in the year after the implementation of PBC, but that temporal effects rather than PBC effects were the more significant driver. No differences were found between years for IOP level of participation or completion of treatment measures. PMID:21249461

  13. Rote Learning in a Performance-based Pedagogy.

    ERIC Educational Resources Information Center

    Dale, Moyra Buntine

    2001-01-01

    An ethnographic study of adult literacy classes in Egypt depicted rote recitation, copying, and dictation as forms of ritual that maintain social order. A sequence of framing, modeling, practicing, performing, and evaluating was followed. Focus was on correct performance rather than interpretation, comprehension, or critical evaluation. (SK)

  14. Gender-Based Differential Item Performance in Mathematics Achievement Items.

    ERIC Educational Resources Information Center

    Doolittle, Allen E.

    A procedure for the detection of differential item performance (DIP) is used to investigate the relationships between characteristics of mathematics achievement items and gender differences in performance. Eight randomly equivalent samples of high school seniors were each given a unique form of the ACT Assessment Mathematics Usage Test (ACTM).…

  15. Influential factors of red-light running at signalized intersection and prediction using a rare events logistic regression model.

    PubMed

    Ren, Yilong; Wang, Yunpeng; Wu, Xinkai; Yu, Guizhen; Ding, Chuan

    2016-10-01

    Red light running (RLR) has become a major safety concern at signalized intersections. To prevent RLR-related crashes, it is critical to identify the factors that significantly influence drivers' RLR behavior, and to predict potential RLR in real time. In this research, nine months of RLR events extracted from high-resolution traffic data collected by loop detectors at three signalized intersections were used to identify the factors that significantly affect RLR behavior. The data analysis indicated that occupancy time, time gap, used yellow time, time left to yellow start, whether the preceding vehicle ran through the intersection during yellow, and whether a vehicle passed through the intersection on the adjacent lane were significant factors for RLR behavior. Furthermore, due to the rare-events nature of RLR, a modified rare events logistic regression model was developed for RLR prediction. The rare events logistic regression method has been applied in many fields for rare-events studies and shows impressive performance, but so far no previous research has applied this method to study RLR. The results showed that the rare events logistic regression model performed significantly better than the standard logistic regression model. More importantly, the proposed RLR prediction method is based purely on loop detector data collected from a single advance loop detector located 400 feet away from the stop bar. This brings great potential for future field applications of the proposed method, since loops have been widely implemented at many intersections and can collect data in real time. This research is expected to contribute significantly to the improvement of intersection safety.
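
    The abstract does not spell out the authors' modification, but a standard correction used in rare-events logistic regression is King and Zeng's prior correction of the intercept after events have been oversampled relative to their population rate. The sketch below assumes that setting; the function name is hypothetical:

```python
import math

def corrected_intercept(beta0_sample, ybar, tau):
    """King-Zeng prior correction for a logistic-regression intercept.

    beta0_sample: intercept fitted on the (event-enriched) sample
    ybar:         fraction of events in that sample
    tau:          true fraction of events in the population
    """
    return beta0_sample - math.log(((1.0 - tau) / tau) * (ybar / (1.0 - ybar)))
```

    With a balanced estimation sample (ybar = 0.5) and a true RLR rate of 1%, the intercept is shifted down by ln 99 ≈ 4.6, so predicted probabilities are no longer inflated by the oversampling.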

  16. Pattern formation, logistics, and maximum path probability

    NASA Astrophysics Data System (ADS)

    Kirkaldy, J. S.

    1985-05-01

    The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are

  17. 48 CFR 2937.602 - Elements of performance-based contracting.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Elements of performance-based contracting. 2937.602 Section 2937.602 Federal Acquisition Regulations System DEPARTMENT OF LABOR...) 2937.602 Elements of performance-based contracting. (a) Performance-based contracting is defined in...

  18. 48 CFR 2937.602 - Elements of performance-based contracting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Elements of performance-based contracting. 2937.602 Section 2937.602 Federal Acquisition Regulations System DEPARTMENT OF LABOR...) 2937.602 Elements of performance-based contracting. (a) Performance-based contracting is defined in...

  19. 48 CFR 2937.602 - Elements of performance-based contracting.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Elements of performance-based contracting. 2937.602 Section 2937.602 Federal Acquisition Regulations System DEPARTMENT OF LABOR...) 2937.602 Elements of performance-based contracting. (a) Performance-based contracting is defined in...

  20. 48 CFR 2937.602 - Elements of performance-based contracting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Elements of performance-based contracting. 2937.602 Section 2937.602 Federal Acquisition Regulations System DEPARTMENT OF LABOR...) 2937.602 Elements of performance-based contracting. (a) Performance-based contracting is defined in...

  1. 48 CFR 2937.602 - Elements of performance-based contracting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Elements of performance-based contracting. 2937.602 Section 2937.602 Federal Acquisition Regulations System DEPARTMENT OF LABOR...) 2937.602 Elements of performance-based contracting. (a) Performance-based contracting is defined in...

  2. Analyzing the Operation of Performance-Based Accountability Systems for Public Services. Technical Report

    ERIC Educational Resources Information Center

    Camm, Frank; Stecher, Brian M.

    2010-01-01

    Empirical evidence of the effects of performance-based public management is scarce. This report describes a framework used to organize available empirical information on one form of performance-based management, a performance-based accountability system (PBAS). Such a system identifies individuals or organizations that must change their behavior…

  3. Operation mechanism of high performance organic permeable base transistors with an insulated and perforated base electrode

    NASA Astrophysics Data System (ADS)

    Kaschura, Felix; Fischer, Axel; Klinger, Markus P.; Doan, Duy Hai; Koprucki, Thomas; Glitzky, Annegret; Kasemann, Daniel; Widmer, Johannes; Leo, Karl

    2016-09-01

    The organic permeable base transistor is a vertical transistor architecture that enables high performance while maintaining a simple low-resolution fabrication. It has been argued that the charge transport through the nano-sized openings of the central base electrode limits the performance. Here, we demonstrate by using 3D drift-diffusion simulations that this is not the case in the relevant operation range. At low current densities, the applied base potential controls the number of charges that can pass through an opening and the opening is the current limiting factor. However, at higher current densities, charges accumulate within the openings and in front of the base insulation, allowing for an efficient lateral transport of charges towards the next opening. The on-state in the current-voltage characteristics reaches the maximum possible current given by space charge limited current transport through the intrinsic semiconductor layers. Thus, even a small effective area of the openings can drive huge current densities, and further device optimization has to focus on reducing the intrinsic layer thickness to a minimum.

  4. Linear Logistic Test Modeling with R

    ERIC Educational Resources Information Center

    Baghaei, Purya; Kubinger, Klaus D.

    2015-01-01

    The present paper gives a general introduction to the linear logistic test model (Fischer, 1973), an extension of the Rasch model with linear constraints on item parameters, along with eRm (an R package to estimate different types of Rasch models; Mair, Hatzinger, & Mair, 2014) functions to estimate the model and interpret its parameters. The…
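
    The LLTM's core idea, that each Rasch item difficulty is a weighted sum of basic operation parameters via a design (Q) matrix, can be sketched in a few lines. This is a hypothetical illustration of the model equations, not the eRm estimation routine:

```python
import math

def lltm_difficulty(q_row, eta):
    """Item difficulty as a linear combination of basic parameters:
    beta_i = sum_j q_ij * eta_j, where q_row is the item's Q-matrix row."""
    return sum(q * e for q, e in zip(q_row, eta))

def rasch_prob(theta, beta):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - beta)))
```

    For instance, an item requiring one application of a basic operation with eta = 0.5 and two applications of one with eta = 0.25 has difficulty 1.0, so a person with ability theta = 1.0 solves it with probability 0.5.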

  5. Mission Benefits Analysis of Logistics Reduction Technologies

    NASA Technical Reports Server (NTRS)

    Ewert, Michael K.; Broyan, James L.

    2012-01-01

    Future space exploration missions will need to use less logistical supplies if humans are to live for longer periods away from our home planet. Anything that can be done to reduce initial mass and volume of supplies or reuse or recycle items that have been launched will be very valuable. Reuse and recycling also reduce the trash burden and associated nuisances, such as smell, but require good systems engineering and operations integration to reap the greatest benefits. A systems analysis was conducted to quantify the mass and volume savings of four different technologies currently under development by NASA's Advanced Exploration Systems (AES) Logistics Reduction and Repurposing project. Advanced clothing systems lead to savings by direct mass reduction and increased wear duration. Reuse of logistical items, such as packaging, for a second purpose allows fewer items to be launched. A device known as a heat melt compactor drastically reduces the volume of trash, recovers water and produces a stable tile that can be used instead of launching additional radiation protection. The fourth technology, called trash-to-supply-gas, can benefit a mission by supplying fuel such as methane to the propulsion system. This systems engineering work will help improve logistics planning and overall mission architectures by determining the most effective use, and reuse, of all resources.

  6. Mission Benefits Analysis of Logistics Reduction Technologies

    NASA Technical Reports Server (NTRS)

    Ewert, Michael K.; Broyan, James Lee, Jr.

    2013-01-01

    Future space exploration missions will need to use less logistical supplies if humans are to live for longer periods away from our home planet. Anything that can be done to reduce initial mass and volume of supplies or reuse or recycle items that have been launched will be very valuable. Reuse and recycling also reduce the trash burden and associated nuisances, such as smell, but require good systems engineering and operations integration to reap the greatest benefits. A systems analysis was conducted to quantify the mass and volume savings of four different technologies currently under development by NASA's Advanced Exploration Systems (AES) Logistics Reduction and Repurposing project. Advanced clothing systems lead to savings by direct mass reduction and increased wear duration. Reuse of logistical items, such as packaging, for a second purpose allows fewer items to be launched. A device known as a heat melt compactor drastically reduces the volume of trash, recovers water and produces a stable tile that can be used instead of launching additional radiation protection. The fourth technology, called trash-to-gas, can benefit a mission by supplying fuel such as methane to the propulsion system. This systems engineering work will help improve logistics planning and overall mission architectures by determining the most effective use, and reuse, of all resources.

  7. LOGISTICS OF ECOLOGICAL SAMPLING ON LARGE RIVERS

    EPA Science Inventory

    The objectives of this document are to provide an overview of the logistical problems associated with the ecological sampling of boatable rivers and to suggest solutions to those problems. It is intended to be used as a resource for individuals preparing to collect biological dat...

  8. Biomass round bales infield aggregation logistic scenarios

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Biomass bales often need to be aggregated (collected into groups and transported) to a field-edge stack for temporary storage for feedlots or processing facilities. Aggregating the bales with the least total distance involved is a goal of producers and bale handlers. Several logistics scenarios for ...

  9. Predicting Social Trust with Binary Logistic Regression

    ERIC Educational Resources Information Center

    Adwere-Boamah, Joseph; Hufstedler, Shirley

    2015-01-01

    This study used binary logistic regression to predict social trust with five demographic variables from a national sample of adult individuals who participated in The General Social Survey (GSS) in 2012. The five predictor variables were respondents' highest degree earned, race, sex, general happiness and the importance of personally assisting…
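
    The study's method, binary logistic regression, can be illustrated with a minimal pure-Python fit by gradient descent on the log-loss. The single-predictor setup and toy data below are hypothetical, not the GSS variables:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit P(y=1|x) = sigmoid(w*x + b) for one predictor by gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x   # gradient of log-loss w.r.t. w
            gb += (p - y)       # gradient of log-loss w.r.t. b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    """Predicted probability of the positive class."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))
```

    On separable toy data such as ys = [0, 0, 0, 1, 1, 1] over xs = 0..5, the fitted slope is positive and predictions cross 0.5 near the class boundary.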

  10. Application of statistical distribution theory to launch-on-time for space construction logistic support

    NASA Technical Reports Server (NTRS)

    Morgenthaler, George W.

    1989-01-01

    The ability to launch-on-time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction and the subsequent planned operation of space stations, large unmanned space structures, lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center of Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis will include development of a better understanding of launch-on-time capability and simulation of required support systems for vehicle assembly and launch which are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models which describe these historical data and that can be used for several purposes such as: inputs to broader simulations of launch vehicle logistic space construction support processes and the determination of which launch operations sources cause the majority of the unscheduled 'holds', and hence to suggest changes which might improve launch-on-time. In particular, the paper investigates the ability of a compound distribution probability model to fit actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
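
    Comparing candidate probability models for hold-duration data can be sketched as a maximum-likelihood exercise. The functions below are a hypothetical illustration (an exponential MLE fit versus a method-of-moments gamma fit, compared by AIC), not the compound distribution model the author investigates:

```python
import math

def exp_loglik(data):
    """Log-likelihood of an exponential fit with MLE rate = 1/mean."""
    lam = len(data) / sum(data)
    return sum(math.log(lam) - lam * x for x in data)

def gamma_loglik(data):
    """Log-likelihood of a gamma fit by method of moments."""
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / n
    k, theta = mean * mean / var, var / mean  # shape, scale
    return sum((k - 1) * math.log(x) - x / theta
               - k * math.log(theta) - math.lgamma(k) for x in data)

def aic(loglik, n_params):
    """Akaike information criterion: lower indicates a better trade-off."""
    return 2 * n_params - 2 * loglik
```

    The model with the lower AIC on the historical hold durations would be preferred; richer compound models pay a parameter-count penalty that must be earned in fit.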

  11. Harvesting forest biomass for energy in Minnesota: An assessment of guidelines, costs and logistics

    NASA Astrophysics Data System (ADS)

    Saleh, Dalia El Sayed Abbas Mohamed

    The emerging market for renewable energy in Minnesota has generated a growing interest in utilizing more forest biomass for energy. However, this growing interest is paralleled with limited knowledge of the environmental impacts and cost effectiveness of utilizing this resource. To address environmental and economic viability concerns, this dissertation has addressed three areas related to biomass harvest: First, existing biomass harvesting guidelines and sustainability considerations are examined. Second, the potential contribution of biomass energy production to reduce the costs of hazardous fuel reduction treatments in these trials is assessed. Third, the logistics of biomass production trials are analyzed. Findings show that: (1) Existing forest related guidelines are not sufficient to allow large-scale production of biomass energy from forest residue sustainably. Biomass energy guidelines need to be based on scientific assessments of how repeated and large scale biomass production is going to affect soil, water and habitat values, in an integrated and individual manner over time. Furthermore, such guidelines would need to recommend production logistics (planning, implementation, and coordination of operations) necessary for a potential supply with the least site and environmental impacts. (2) The costs of biomass production trials were assessed and compared with conventional treatment costs. In these trials, conventional mechanical treatment costs were lower than biomass energy production costs less income from biomass sale. However, a sensitivity analysis indicated that costs reductions are possible under certain site, prescriptions and distance conditions. (3) Semi-structured interviews with forest machine operators indicate that existing fuel reduction prescriptions need to be more realistic in making recommendations that can overcome operational barriers (technical and physical) and planning and coordination concerns (guidelines and communications

  12. Logistics Process Analysis Tool

    SciTech Connect

    2008-03-31

    LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, with the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  14. Evaluating hospital performance based on excess cause-specific incidence.

    PubMed

    Van Rompaye, Bart; Eriksson, Marie; Goetghebeur, Els

    2015-04-15

    Formal evaluation of hospital performance in specific types of care is becoming an indispensable tool for quality assurance in the health care system. When the prime concern lies in reducing the risk of a cause-specific event, we propose to evaluate performance in terms of an average excess cumulative incidence, referring to the center's observed patient mix. Its intuitive interpretation helps give meaning to the evaluation results and facilitates the determination of important benchmarks for hospital performance. We apply it to the evaluation of cerebrovascular deaths after stroke in Swedish stroke centers, using data from Riksstroke, the Swedish stroke registry.
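
    The proposed summary, an average excess cumulative incidence referred to the center's own patient mix, can be sketched as the mean difference between observed events and model-expected risks. This is a simplified illustration that ignores the censoring and competing-risks machinery handled in the paper:

```python
def average_excess_incidence(observed, expected):
    """Observed event indicators minus case-mix-expected risks, averaged
    over a center's patients. Positive values mean the center saw more
    cause-specific events than its patient mix predicts."""
    n = len(observed)
    return sum(observed) / n - sum(expected) / n
```

    A center whose stroke patients had expected cerebrovascular-death risks averaging 0.20, but where half actually died of that cause, would show an excess incidence of 0.30.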

  15. Developing a Logistics Data Process for Support Equipment for NASA Ground Operations

    NASA Technical Reports Server (NTRS)

    Chakrabarti, Suman

    2010-01-01

    The United States NASA Space Shuttle has long been considered an extremely capable yet relatively expensive rocket. A great part of the roughly US $500 million per launch expense was the support footprint: refurbishment and maintenance of the space shuttle system, together with the long list of resources required to support it, including personnel, tools, facilities, transport and support equipment. NASA determined to give its next rocket system a smaller logistics footprint, making it more cost-effective and quicker to turn around. The logical solution was to adopt a standard Logistics Support Analysis (LSA) process based on GEIA-STD-0007 (http://www.logisticsengineers.org/may09pres/GEIASTD0007DEXShortIntro.pdf), the successor of MIL-STD-1388-2B widely used by U.S., NATO, and other world military services and industries. This approach is unprecedented at NASA: it is the first time a major program of programs, Project Constellation, is factoring logistics and supportability into design at many levels. This paper will focus on one of those levels, NASA ground support equipment for the next generation of NASA rockets, and on building a Logistics Support Analysis Record (LSAR) for developing and documenting a support solution and inventory of resources. This LSAR is actually a standards-based database, containing analyses of the time and tools, personnel, facilities and support equipment required to assemble and integrate the stages and umbilicals of a rocket. This paper will cover building this database from scratch, including creating and importing a hierarchical bill of materials (BOM) from legacy data; identifying line-replaceable units (LRUs) of a given piece of equipment; analyzing reliability and maintainability of said LRUs; and thereby assessing back to design whether the support solution for a piece of equipment is too resource-intensive. If one must replace or inspect an LRU too much, perhaps a modification of

  16. Performance Analysis of Web Applications Based on User Navigation

    NASA Astrophysics Data System (ADS)

    Zhou, Quanshu; Ye, Hairong; Ding, Zuohua

    This paper proposes a method to conduct performance analysis of web applications. A behavior model is first built from the log file produced by user navigation; an extended state diagram is then extracted from this log, and finally a multiple Markov model is incorporated into the state diagram, from which the performance analysis is obtained. Five indexes are used to measure the performance: service response time, service path length, service utilization, service implementation rate and access error rate. The performance analysis results suggest ways to improve the design of web applications and optimize their services. A case study of the Zhejiang Chess web site demonstrates the advantage of the method.
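
    The Markov step of such a method can be sketched as estimating first-order transition probabilities between pages from session logs; the session data and function name below are hypothetical:

```python
from collections import defaultdict

def transition_probs(sessions):
    """Estimate first-order Markov transition probabilities from
    navigation sessions (each session is a list of page names)."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for src, dst in zip(session, session[1:]):
            counts[src][dst] += 1
    # Normalize each row of counts into probabilities.
    return {src: {dst: c / sum(nxt.values()) for dst, c in nxt.items()}
            for src, nxt in counts.items()}
```

    Metrics such as expected service path length then follow from standard Markov-chain calculations on this transition matrix.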

  17. Criterion-based (proficiency) training to improve surgical performance.

    PubMed

    Fried, Marvin P; Kaye, Rachel J; Gibber, Marc J; Jackman, Alexis H; Paskhover, Boris P; Sadoughi, Babak; Schiff, Bradley; Fraioli, Rebecca E; Jacobs, Joseph B

    2012-11-01

    OBJECTIVE To investigate whether training otorhinolaryngology residents to criterion performance levels (proficiency) on the Endoscopic Sinus Surgery Simulator produces individuals whose performance in the operating room is at least equal to those who are trained by performing a fixed number of surgical procedures. DESIGN Prospective cohort. SETTING Two academic medical centers in New York City. PARTICIPANTS Otorhinolaryngology junior residents, comprising 8 experimental subjects and 6 control subjects, plus 6 attending surgeons. INTERVENTION Experimental subjects achieved benchmark proficiency criteria on the Endoscopic Sinus Surgery Simulator; control subjects repeated the surgical procedure twice. MAIN OUTCOME MEASURES Residents completed validated objective tests to assess baseline abilities. All subjects were videotaped performing an initial standardized surgical procedure. Residents were videotaped performing a final surgery. Videotapes were assessed for metrics by an expert panel. RESULTS Attendings outperformed the residents in most parameters on the initial procedure. Experimental and attending groups outperformed controls in some parameters on the final procedure. There was no difference between resident groups in initial performance, but the experimental subjects outperformed the control subjects in navigation in the final procedure. Most important, there was no difference in final performance between subgroups of the experimental group on the basis of the number of trials needed to attain proficiency. CONCLUSIONS Simulator training can improve resident technical skills so that each individual attains a proficiency level, despite the existence of an intrinsic range of abilities. This proficiency level translates to at least equal, if not superior, operative performance compared with that of current conventional training with finite repetition of live surgical procedures.

  18. TMD-Based Structural Control of High Performance Steel Bridges

    NASA Astrophysics Data System (ADS)

    Kim, Tae Min; Kim, Gun; Kyum Kim, Moon

    2012-08-01

    The purpose of this study is to investigate the effectiveness of structural control using a tuned mass damper (TMD) for suppressing excessive traffic-induced vibration of a high performance steel bridge. The study considered a 1-span steel plate girder bridge and bridge-vehicle interaction using the HS-24 truck model. A numerical model of the steel plate girder, traffic load, and TMD is constructed and time history analysis is performed using the commercial structural analysis program ABAQUS 6.10. Results from the analyses show that the high performance steel bridge has a dynamic serviceability problem compared to a relatively low performance steel bridge. Therefore, structural control using a TMD is implemented in order to alleviate the dynamic serviceability problem. The TMD is applied to the high performance steel bridge and the vertical vibration due to dynamic behavior is assessed again. Consequently, with the TMD, the residual amplitude in steady-state vibration is appreciably reduced by 85%. Moreover, the vibration serviceability assessment using the Reiher-Meister curve is also remarkably improved. As a result, this paper provides a guideline for the economical design of I-girders using high performance steel and simultaneously evaluates the effectiveness of structural control using a TMD.
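
    A TMD is tuned to the bridge's dominant vertical mode; the classical Den Hartog formulas give optimal damper parameters for an undamped primary system under harmonic load. The abstract does not state how its TMD was tuned, so the sketch below is a generic illustration, not the study's design values:

```python
import math

def den_hartog_tmd(mass_ratio):
    """Den Hartog optimal TMD tuning for an undamped primary structure.

    mass_ratio: TMD mass / modal mass of the structure (mu).
    Returns (frequency_ratio, damping_ratio) for the damper:
      f_opt    = 1 / (1 + mu)
      zeta_opt = sqrt(3*mu / (8*(1 + mu)**3))
    """
    mu = mass_ratio
    f_opt = 1.0 / (1.0 + mu)
    zeta_opt = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))
    return f_opt, zeta_opt
```

    For a typical 5% mass ratio, the damper is tuned roughly 4.8% below the structural frequency with about 13% damping.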

  19. Perspectives on containment performance improvement based on the IPEs

    SciTech Connect

    Lehner, J.R.; Lin, C.C.; Pratt, W.T.

    1995-04-01

    Generic Letter 88-20, "Individual Plant Examination (IPE) for Severe Accident Vulnerabilities - 10CFR 50.54(f)," was issued by the NRC on November 23, 1988. In addition to assessing the core damage frequency from severe accidents, licensees were requested to report the results of their analyses regarding containment performance. Supplements to the Generic Letter forwarded technical insights obtained by the NRC staff through its Containment Performance Improvement (CPI) program. At this time, most of the IPEs have been submitted by the licensees. In a follow-on effort to support regulatory activities, the NRC staff, with assistance from Brookhaven National Laboratory, has initiated a program involving a global examination of the containment performance results documented in the IPEs. The objective is to identify insights of potential generic safety significance relative to plant design, operation and maintenance, as well as to assess response to the previously forwarded CPI insights. The containment performance results of the IPEs are being categorized for commonalities and differences for different reactor and containment types. Preliminary results show that not only differences in plant design but also the methods, data, boundary conditions, and assumptions used in the different IPEs have a major impact on the containment performance results obtained. This paper presents preliminary results regarding the differences in containment performance observed in the IPEs and discusses some of the underlying reasons for these differences.

  20. An integrated conceptual framework for selecting reverse logistics providers in the presence of vagueness

    NASA Astrophysics Data System (ADS)

    Fırdolaş, Tugba; Önüt, Semih; Kongar, Elif

    2005-11-01

    In recent years, in connection with organizations' attitudes towards sustainable development, environmental management has gained increasing interest among researchers in supply chain management. Given the long-term need to shift from a linear economy towards a cycle economy, businesses should be motivated to embrace change brought about by consumers, government, competition, and ethical responsibility. To achieve business goals and objectives, a company must respond to increasing consumer demand for "green" products and implement environmentally responsible plans. Reverse logistics is an activity within organizations, often delegated to the customer service function, in which customers return warranted or defective products to their supplier. The emergence of reverse logistics provides a competitive advantage and a significant return on investment, with an indirect effect on profitability. Many organizations hire third-party providers to implement reverse logistics programs designed to retain value by getting products back. Reverse logistics vendors play an important role in helping organizations close the loop for the products they offer. In this regard, the selection of third-party providers is increasingly becoming an important area of reverse logistics research and practice. This study aims to assist managers in determining which third-party logistics provider to collaborate with in the reverse logistics process, using an alternative approach based on an integrated model combining neural networks and fuzzy logic. An illustrative case study is discussed and the best provider is identified through the solution of this model.

  1. NASA Shuttle Logistics Depot (NSLD) - The application of ATE

    NASA Technical Reports Server (NTRS)

    Simpkins, Lorenz G.; Jenkins, Henry C.; Mauceri, A. Jack

    1990-01-01

    The concept of the NASA Shuttle Logistics Depot (NSLD) developed for the Space Shuttle Orbiter Program is described. The function of the NSLD at Cape Canaveral is to perform acceptance and diagnostic testing of the Shuttle's space-rated line-replaceable units and shop-replaceable units (SRUs). The NSLD includes a comprehensive electronic automatic test station, program development stations, and assorted manufacturing support equipment (including thermal and vibration test equipment, special test equipment, and a card SRU test system). Depot activities also include establishing functions for the manufacture of mechanical parts, soldering, welding, painting, clean-room operation, procurement, and subcontract management.

  2. The Concept of a Logistics Centre Model as a Nodal Point of a Transport and Logistics Network

    NASA Astrophysics Data System (ADS)

    Krzyżaniak, Stanisław; Hajdul, Marcin; Fechner, Ireneusz

    2012-06-01

    The paper presents a concept of a logistics centre model. The model is based on defined output/input flows of goods, which makes it possible to determine and estimate the load on those elements of the centre's infrastructure that are at the disposal of the individual service providers operating within the centre. Input and output flows are defined on the basis of the balance of flows directed to the centre from the whole logistics and transport network and are broken down into product groups, types of transport, and the various transport forms of cargo. The seasonal nature of these flows has been taken into consideration. The calculations of infrastructure load assume that every service provider operating within the centre can also use elements of infrastructure owned by other centre users. The proposed model can serve various purposes. An emphasis is put herein on the possibility of estimating the degree of transport and logistics infrastructure load of the centre on the basis of historical data as well as forecasts. As a result, the needs related to access to the infrastructure can be specified and used to determine and verify investment plans.
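
    The kind of infrastructure-load estimate described above can be illustrated with a toy calculation. All flows, capacities, and seasonal factors below are invented for the sketch; the paper's model is considerably more detailed (per-provider infrastructure, transport forms of cargo, forecast inputs):

```python
# Hypothetical sketch: estimate the load on shared terminals of a logistics
# centre from input/output flows split by product group and transport mode,
# scaled by a simple seasonal multiplier. All numbers are invented.
MONTHLY_FLOWS = {                 # pallets/month directed to the centre
    ("food", "road"): 1200,
    ("food", "rail"): 400,
    ("electronics", "road"): 800,
}
SEASONAL_FACTOR = {1: 0.9, 6: 1.0, 12: 1.4}      # month -> demand multiplier
HANDLING_CAPACITY = {"road": 2000, "rail": 600}  # pallets/month per terminal


def terminal_utilisation(month):
    """Return the utilisation ratio of each transport terminal for a month."""
    load = {}
    for (_, mode), pallets in MONTHLY_FLOWS.items():
        load[mode] = load.get(mode, 0.0) + pallets * SEASONAL_FACTOR[month]
    return {mode: load[mode] / HANDLING_CAPACITY[mode] for mode in load}


print(terminal_utilisation(12))  # peak-season load; road terminal exceeds 100%
```

    A utilisation above 1.0 in the peak month is exactly the kind of signal the authors propose for verifying investment plans against historical or forecast flows.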

  3. Lessons from the private sector on performance-based management

    SciTech Connect

    Stoeckle, K.E.; Kolster, W.G.; Shangraw, R.F.

    1996-03-01

    Implementation of the Government Performance and Results Act of 1993 (GPRA) has provided a unique challenge for Federal Agencies, such as the Department of Energy (DOE) Office of Waste Management (OWM). While performance measurement, as required by GPRA, is new to Federal Agencies, private industry has applied it at all organizational levels to better manage their operations for some time. There has been significant discussion about how the private sector uses performance measures, but there have been very few empirical studies systematically examining their use. To gather information on comparable private industry practices, waste management industry firms were surveyed through questionnaires and follow-on interviews. Questionnaires were sent to 75 waste management firms throughout the United States and Canada. Twenty-four percent of the firms responded to the questionnaire and participated in the follow-on interviews. The questionnaires were typically completed by vice-presidents or senior financial officers. Information collected from the questionnaire and follow-on interviews provided valuable insight into industry practices in the area of performance measurement. This paper discusses the study results and how they can be incorporated in the DOE OWM performance measures and influence the character of the "critical few" metrics used by senior DOE managers.

  4. Evaluating hospital performance based on excess cause-specific incidence

    PubMed Central

    Van Rompaye, Bart; Eriksson, Marie; Goetghebeur, Els

    2015-01-01

    Formal evaluation of hospital performance in specific types of care is becoming an indispensable tool for quality assurance in the health care system. When the prime concern lies in reducing the risk of a cause-specific event, we propose to evaluate performance in terms of an average excess cumulative incidence, referring to the center's observed patient mix. Its intuitive interpretation helps give meaning to the evaluation results and facilitates the determination of important benchmarks for hospital performance. We apply it to the evaluation of cerebrovascular deaths after stroke in Swedish stroke centers, using data from Riksstroke, the Swedish stroke registry. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. PMID:25640288

  5. Progress toward a performance based specification for diamond grinding wheels

    SciTech Connect

    Taylor, J.S.; Piscotty, M.S.; Blaedel, K.L.

    1996-11-12

    This work sought to improve communication between users and makers of fine diamond grinding wheels. A promising avenue is to formulate a voluntary product standard comprising performance indicators that bridge the gap between specific user requirements and the details of wheel formulations. We propose a set of performance specifiers, or figures-of-merit, that could be assessed by straightforward and traceable testing methods without compromising proprietary information of the wheel user or wheel maker. One such performance indicator might be wheel hardness. In addition, we consider technologies that might be required to realize the benefits of optimized grinding wheels. A non-contact wheel-to-workpiece proximity sensor may provide a means of monitoring wheel wear, and thus wheel position, for wheels that exhibit high wear rates in exchange for improved surface finish.

  6. Mate choice based on a key ecological performance trait.

    PubMed

    Snowberg, L K; Benkman, C W

    2009-04-01

    Mate preference for well-adapted individuals may strengthen divergent selection and thereby facilitate adaptive divergence. We performed mate choice experiments in which we manipulated male red crossbill (Loxia curvirostra complex) feeding rates. Using association time as a proxy for preference, we found that females preferred faster foragers, which reinforces natural selection because poorly adapted males would be less likely to obtain a mate as well as less likely to survive. Although theoretical models predict direct preference for adaptation and performance, to the best of our knowledge this experiment provides the first evidence of individuals directly assessing feeding performance in mate choice. In species where assessing the ecological adaptation of potential mates is possible, females may gain fitness benefits from choosing a well-adapted mate directly or indirectly, promoting the use of information about ecological adaptation in mate choice. PMID:19320795

  7. An Application-Based Performance Characterization of the Columbia Supercluster

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Djomehri, Jahed M.; Hood, Robert; Jin, Hoaqiang; Kiris, Cetin; Saini, Subhash

    2005-01-01

    Columbia is a 10,240-processor supercluster consisting of 20 Altix nodes with 512 processors each, and currently ranked as the second-fastest computer in the world. In this paper, we present the performance characteristics of Columbia obtained on up to four computing nodes interconnected via the InfiniBand and/or NUMAlink4 communication fabrics. We evaluate floating-point performance, memory bandwidth, message passing communication speeds, and compilers using a subset of the HPC Challenge benchmarks, and some of the NAS Parallel Benchmarks including the multi-zone versions. We present detailed performance results for three scientific applications of interest to NASA, one from molecular dynamics, and two from computational fluid dynamics. Our results show that both the NUMAlink4 and the InfiniBand hold promise for application scaling to a large number of processors.

  8. Performing Environmental Change: MED Theatre and the Changing Face of Community-Based Performance Research

    ERIC Educational Resources Information Center

    Schaefer, Kerrie

    2012-01-01

    This article examines a programme of work produced by community-based theatre company, Manaton and East Dartmoor (MED) Theatre, addressing issues of climate change as they impact on life in rural Devon, UK. After some discussion of MED Theatre's constitution as a community-based company and the group's long-term engagement with the place, history,…

  9. Performance effects of plan-based displays in commercial aviation

    NASA Technical Reports Server (NTRS)

    Shalin, Valerie L.; Mikesell, Brian G.; Ramamurthy, Maya; Geddes, Norman D.

    1993-01-01

    The experiment reported here examines the performance consequences of alignment between display formats and the methods used during descent. Twelve commercial pilots flew one of nine Standard Terminal Arrival Routes on a stand-alone flight simulator run on a Sparc II workstation. Three different methods for descent (Manual Control, Autopilot, and Spoilers) were crossed with three display formats differing in the scope, resolution, and bandwidth of the critical information displayed. Performance variability in speed control supports our claim that the effectiveness of a display is not an independent property of the display itself but rather a function of the interaction between the display and the particular methods used for achieving task goals.

  10. Objective Situation Awareness Measurement Based on Performance Self-Evaluation

    NASA Technical Reports Server (NTRS)

    DeMaio, Joe

    1998-01-01

    The research was conducted in support of the NASA Safe All-Weather Flight Operations for Rotorcraft (SAFOR) program. The purpose of the work was to investigate the utility of two measurement tools developed by the British Defence Evaluation and Research Agency: a subjective workload assessment scale, the DRA Workload Scale, and a situation awareness measurement tool. The situation awareness tool compares the crew's self-evaluation of performance against actual performance in order to determine what information the crew attended to during the task. These two measurement tools were evaluated in the context of a test of an innovative approach to alerting the crew by way of a helmet-mounted display. The situation assessment data are reported here. The performance self-evaluation metric of situation awareness was found to be highly effective. It was used to evaluate situation awareness on a tank reconnaissance task, a tactical navigation task, and a stylized task used to evaluate handling qualities. Using the self-evaluation metric, it was possible to evaluate situation awareness without exact knowledge of the relevant information in some cases, and to identify information to which the crew attended or failed to attend in others.

  11. Storytelling for Fluency and Flair: A Performance-Based Approach

    ERIC Educational Resources Information Center

    Campbell, Terry; Hlusek, Michelle

    2015-01-01

    In the classroom experiences described in this article, grade three students were introduced to storytelling through the interactive read aloud of a mentor text and a storytelling demonstration, followed by daily collaborative activities involving listening, speaking, reading, and writing, culminating in dramatic storytelling performances. The…

  12. Performance-Based Accountability: Newark's Charter School Experience.

    ERIC Educational Resources Information Center

    Callahan, Kathe; Sadovnik, Alan; Visconti, Louisa

    This study assessed how New Jersey's state accountability system encouraged or thwarted charter school success, how effectively performance standards were defined and enacted by authorizing agents, and how individual charter schools were developing accountability processes that made them more or less successful than their charter school…

  13. Great Performances: Creating Classroom-Based Assessment Tasks. Second Edition

    ERIC Educational Resources Information Center

    Shoemaker, Betty; Lewin, Larry

    2011-01-01

    Get an in-depth understanding of how to create fun, engaging, and challenging performance assessments that require students to elaborate on content and demonstrate mastery of skills. This update of an ASCD (Association for Supervision and Curriculum Development) classic includes new scoring methods, reading assessments, and insights on navigating…

  14. Holding Schools Accountable: Performance-Based Reform in Education.

    ERIC Educational Resources Information Center

    Ladd, Helen F., Ed.

    Many people believe that future reforms of education should focus on the primary mission of elementary and secondary schools and that these schools must be held more accountable for the academic performance of their students. This book brings together researchers from various disciplines--most notably economics, educational policy and management,…

  15. Activity-Based Costing Model for Assessing Economic Performance.

    ERIC Educational Resources Information Center

    DeHayes, Daniel W.; Lovrinic, Joseph G.

    1994-01-01

    An economic model for evaluating the cost performance of academic and administrative programs in higher education is described. Examples from its application at Indiana University-Purdue University Indianapolis are used to illustrate how the model has been used to control costs and reengineer processes. (Author/MSE)

  16. Hexafluorozirconic Acid Based Surface Pretreatments: Characterization and Performance Assessment

    SciTech Connect

    Adhikari, Saikat; Unocic, Kinga A; Zhai, Yumei; Frankel, Gerald; Zimmerman, John; Fristad, W

    2010-01-01

    A new phosphate-free pretreatment from Henkel Corp. named TecTalis, was investigated. The treatment bath is composed of dilute hexafluorozirconic acid with small quantities of non-hazardous components containing Si and Cu. The performance of treated steel was compared to samples treated in a phosphate conversion coating bath, in simple hexafluorozirconic acid, and in TecTalis without the addition of the Cu-containing component. Atomic Force Microscopy (AFM) and Transmission Electron Microscopy (TEM) were used to characterize the coating surface morphology, structure and composition. A Quartz Crystal Microbalance (QCM) was used for studying film growth kinetics on thin films of pure Fe, Al and Zn. Electrochemical Impedance Spectroscopy (EIS) was performed on treated and painted steel for studying long-term corrosion performance of the coatings. The phosphate-free coating provided long-term corrosion performance comparable to that of phosphate conversion coatings. The coatings uniformly cover the surface in the form of 10-20 nm sized nodules and clusters of these features up to 500 nm in size. The coatings are usually about 20-30 nm thick and are mostly composed of Zr and O with enrichment of copper at randomly distributed locations and clusters.

  17. Performance evaluation of algorithms for SAW-based temperature measurement.

    PubMed

    Schuster, Stefan; Scheiblhofer, Stefan; Reindl, Leonhard; Stelzer, Andreas

    2006-06-01

    Whenever harsh environmental conditions such as high temperatures, accelerations, radiation, etc., prohibit usage of standard temperature sensors, surface acoustic wave-based temperature sensors are the first choice for highly reliable wireless temperature measurement. Interrogation of these sensors is often based on frequency modulated or frequency stepped continuous wave-based radars (FMCW/FSCW). We investigate known algorithms regarding their achievable temperature accuracy and their applicability in practice. Furthermore, some general rules of thumb for FMCW/FSCW radar-based range estimation by means of the Cramer-Rao lower bound (CRLB) for frequency and phase estimation are provided. The theoretical results are verified on both simulated and measured data. PMID:16846150
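
    The role the Cramer-Rao lower bound plays in such rules of thumb can be made concrete. As an illustrative sketch (not the paper's own derivation), the textbook Rife-Boorstyn/Kay bound for the frequency of a single sinusoid in white Gaussian noise shows how estimation accuracy scales with SNR and the number of samples N; the function name and parameter values are assumptions:

```python
import math


def freq_crlb_hz2(snr_db, n_samples, f_s):
    """Cramer-Rao lower bound on the variance (Hz^2) of a sinusoid-frequency
    estimate in white Gaussian noise, in the standard Rife-Boorstyn/Kay form.
    snr_db is the signal-to-noise ratio in dB, f_s the sample rate in Hz."""
    eta = 10.0 ** (snr_db / 10.0)
    # Bound in (cycles/sample)^2, then rescaled to Hz^2 by the sample rate.
    var_norm = 12.0 / ((2.0 * math.pi) ** 2 * eta * n_samples * (n_samples ** 2 - 1))
    return var_norm * f_s ** 2


# Doubling the observation length cuts the variance by roughly 8x (N^-3 scaling),
# the kind of rule of thumb the paper derives for FMCW/FSCW range estimation.
print(freq_crlb_hz2(20.0, 1024, 1e6))
print(freq_crlb_hz2(20.0, 2048, 1e6) / freq_crlb_hz2(20.0, 1024, 1e6))
```

    Because a SAW sensor's temperature shifts the echo delay, and hence the beat frequency of an FMCW/FSCW readout, approximately linearly, a bound on frequency variance translates directly into a bound on temperature accuracy.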

  18. Development of high-performance diamine-based polybenzoxazines

    NASA Astrophysics Data System (ADS)

    Allen, Douglas James

    A series of linear aliphatic diamine-based benzoxazine monomers has been synthesized and characterized. These benzoxazine monomers polymerize at elevated temperatures into transparent, crosslinked materials with an inherently flexible network structure. These aliphatic diamine-based benzoxazines are atypical in that polyfunctionality is provided by the amine portion of the benzoxazine structure, rather than by a multifunctional phenol as is more conventionally utilized. The kinetics of thermally activated polymerization for this series of diamine-based benzoxazine monomers is investigated by infrared spectroscopy and reveals that the rate of polymerization is inversely proportional to the length of the aliphatic diamine chain. The monomer based on the shortest diamine, ethylene diamine, is shown by differential scanning calorimetry to polymerize by a dual-mode process with an onset temperature that is lower than ever previously seen for bisphenol-based benzoxazines. The normalized heat of polymerization for monomers based on longer diamines is largely independent of the diamine chain length. The polybenzoxazines that are prepared from the linear aliphatic diamine-based benzoxazine monomers display a high degree of inherent flexibility, with mechanical and physical properties that strongly depend on the length of the aliphatic chain. Despite the flexible nature of the diamine chains, several of the aliphatic diamine-based polybenzoxazines have high glass transition temperatures and crosslink densities that exceed those of typical bisphenol-based polybenzoxazines. By selective blocking of the ortho, para, and meta reactive sites on the aromatic ring with a series of methyl-substituted monomers, polymerization is regiospecified and the type of linkage in the network structure is controlled. The substituted benzoxazines are shown to polymerize by similar mechanisms and possess a heat of polymerization by differential scanning calorimetry that is independent of both…

  19. Performance-based staff development. A baseline for clinical competence.

    PubMed

    Snyder-Halpern, R; Buczkowski, E

    1990-01-01

    Changes in health care financing, complex technological advancements, and an expanding nursing knowledge base have made it increasingly difficult for health care educators to assess and enhance the development of professional nurses. Using a set of objective evaluation methods and tools, educators assessed the competency of novice and experienced staff. This information identified learning and practice needs that formed the basis for individualized orientation and clinical educational programs.

  20. Research needs for risk-informed, performance-based regulations

    SciTech Connect

    Thadani, A.C.

    1997-01-01

    This article summarizes the activities of the Office of Research of the NRC, both historically and as they apply to risk-based decision making. The office has been actively involved in problems related to understanding the risks of core accidents, the aging of reactor components and materials over years of service, and the understanding and analysis of severe accidents. In addition, new policy statements regarding the role of risk assessment in regulatory applications have brought the need for further work into focus. The NRC has used risk assessment in regulatory questions in the past, but in a fairly ad hoc manner. The new policies will clearly require a better defined application of risk assessment, along with guidance for reviewers who must judge applications when a component of them rests on risk-based decision making. To address this, standard review plans are being prepared to serve as guides for such questions. In addition, with regulatory decisions being allowed to rest on risk-based arguments, it is necessary to prepare an adequate data base, made publicly available, to support such a position.

  1. Logistic regression when binary predictor variables are highly correlated.

    PubMed

    Barker, L; Brown, C

    Standard logistic regression can produce estimates having large mean square error when predictor variables are multicollinear. Ridge regression and principal components regression can reduce the impact of multicollinearity in ordinary least squares regression. Generalizations of these, applicable in the logistic regression framework, are alternatives to standard logistic regression. It is shown that estimates obtained via ridge and principal components logistic regression can have smaller mean square error than estimates obtained through standard logistic regression. Recommendations for choosing among standard, ridge and principal components logistic regression are developed. Published in 2001 by John Wiley & Sons, Ltd.
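
    The ridge variant described above can be sketched in a few lines. The following numpy implementation is illustrative only (not the authors' code): an L2 penalty added to the logistic-loss gradient shrinks the coefficients when two binary predictors are nearly collinear. The data-generating parameters and penalty value are invented.

```python
import numpy as np


def fit_logistic(X, y, l2=0.0, lr=0.1, n_iter=2000):
    """Fit logistic regression by gradient descent, with an optional L2
    (ridge) penalty. For brevity the penalty is applied to all coefficients,
    including the intercept."""
    n = X.shape[0]
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))          # predicted probabilities
        grad = X.T @ (p - y) / n + l2 * w          # penalized gradient
        w -= lr * grad
    return w


rng = np.random.default_rng(0)
# Two highly correlated binary predictors, mimicking the paper's setting.
x1 = rng.integers(0, 2, 500)
x2 = np.where(rng.random(500) < 0.95, x1, 1 - x1)  # ~95% agreement with x1
X = np.column_stack([np.ones(500), x1, x2])
logit = -1.0 + 1.0 * x1 + 1.0 * x2
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

w_mle = fit_logistic(X, y, l2=0.0)
w_ridge = fit_logistic(X, y, l2=0.5)
# The penalty shrinks the coefficient vector, trading bias for variance.
print(np.linalg.norm(w_mle[1:]), np.linalg.norm(w_ridge[1:]))
```

    The shrunken estimates are biased but, as the abstract notes, can have smaller mean square error than the unpenalized maximum-likelihood fit when the predictors are multicollinear.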

  2. Determination of riverbank erosion probability using Locally Weighted Logistic Regression

    NASA Astrophysics Data System (ADS)

    Ioannidou, Elena; Flori, Aikaterini; Varouchakis, Emmanouil A.; Giannakis, Georgios; Vozinaki, Anthi Eirini K.; Karatzas, George P.; Nikolaidis, Nikolaos

    2015-04-01

    Riverbank erosion is a natural geomorphologic process that affects the fluvial environment. The most important issue concerning riverbank erosion is the identification of the vulnerable locations. An alternative to the usual hydrodynamic models to predict vulnerable locations is to quantify the probability of erosion occurrence. This can be achieved by identifying the underlying relations between riverbank erosion and the geomorphological or hydrological variables that prevent or stimulate erosion. Thus, riverbank erosion can be determined by a regression model using independent variables that are considered to affect the erosion process. The impact of such variables may vary spatially; therefore, a non-stationary regression model is preferred to a stationary equivalent. Locally Weighted Regression (LWR) is proposed as a suitable choice. This method can be extended to predict the binary presence or absence of erosion based on a series of independent local variables by using the logistic regression model. It is referred to as Locally Weighted Logistic Regression (LWLR). Logistic regression is a type of regression analysis used for predicting the outcome of a categorical dependent variable (e.g. binary response) based on one or more predictor variables. The method can be combined with LWR to assign weights to local independent variables of the dependent one. LWR allows model parameters to vary over space in order to reflect spatial heterogeneity. The probabilities of the possible outcomes are modelled as a function of the independent variables using a logistic function. Logistic regression measures the relationship between a categorical dependent variable and, usually, one or several continuous independent variables by converting the dependent variable to probability scores. Then, a logistic regression is formed, which predicts success or failure of a given binary variable (e.g. erosion presence or absence) for any value of the independent variables…
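
    A locally weighted logistic model of this kind can be put together as follows. This is a hypothetical sketch, not the authors' implementation: the Gaussian distance kernel, the single "bank slope" covariate, and all parameter values are assumptions made for illustration.

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def lwlr_probability(X, y, coords, x0, s0, bandwidth=5.0, lr=0.5, n_iter=1000):
    """Fit a logistic regression whose observations are weighted by a Gaussian
    kernel on spatial distance to the prediction site s0 (so coefficients vary
    over space), then return the predicted probability for covariates x0."""
    w = np.exp(-np.sum((coords - s0) ** 2, axis=1) / (2.0 * bandwidth ** 2))
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = sigmoid(X @ beta)
        grad = X.T @ (w * (p - y)) / w.sum()   # weighted log-likelihood gradient
        beta -= lr * grad
    return sigmoid(x0 @ beta)


rng = np.random.default_rng(1)
coords = rng.uniform(0.0, 10.0, size=(300, 2))       # bank-site locations
slope = rng.uniform(0.0, 1.0, 300)                   # one local covariate
X = np.column_stack([np.ones(300), slope])
y = (rng.random(300) < sigmoid(-2.0 + 4.0 * slope)).astype(float)  # erosion flag

p_hat = lwlr_probability(X, y, coords, np.array([1.0, 0.9]), coords[0])
print(p_hat)  # predicted erosion probability at the first site
```

    Shrinking the bandwidth makes the fit more local, which is how the spatially varying coefficients of LWLR arise from an ordinary logistic likelihood.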

  3. Technological Support for Logistics Transportation Systems

    NASA Astrophysics Data System (ADS)

    Bujak, Andrzej; Śliwa, Zdzisław; Gębczyńska, Alicja

    The modern world is changing, introducing robots, remotely controlled vehicles, and other crewless means of transportation to reduce human error, the main cause of incidents and crashes in traffic. New technologies support operators and drivers and, according to some studies, can even replace them. Programs such as AHS, UAH, IVBSS and MTVR are under development to improve traffic flow and safety and to reduce traffic hazards and crashes. It is necessary to analyze such concepts and implement them boldly, including in Polish logistics companies, new programs, and the highway system, as they will be applied in the future; logistics infrastructure should therefore be prepared ahead of time in order to capitalize on these improvements. The problem is quite urgent, as transportation in the country must not become outdated if it is to meet clients' expectations and keep pace with competing foreign companies.

  4. Exact solution to fractional logistic equation

    NASA Astrophysics Data System (ADS)

    West, Bruce J.

    2015-07-01

    The logistic equation is one of the most familiar nonlinear differential equations in the biological and social sciences. Herein we provide an exact solution to an extension of this equation to incorporate memory through the use of fractional derivatives in time. The solution to the fractional logistic equation (FLE) is obtained using the Carleman embedding technique that allows the nonlinear equation to be replaced by an infinite-order set of linear equations, which we then solve exactly. The formal series expansion for the initial value solution of the FLE is shown to be expressed in terms of a series of weighted Mittag-Leffler functions that reduces to the well-known analytic solution in the limit where the fractional index for the derivative approaches unity. The numerical integration of the FLE provides an excellent fit to the analytic solution. We propose this approach as a general technique for solving a class of nonlinear fractional differential equations.
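
    The limit mentioned in the abstract is easy to check numerically. Writing the series as u(t) = sum over n of ((u0 - 1)/u0)^n E_alpha(-n k t^alpha), setting alpha = 1 gives E_1(z) = exp(z), and the series collapses to a geometric sum equal to the classical logistic solution. The sketch below checks only this limiting case (evaluating the general-alpha Mittag-Leffler function is not attempted here):

```python
import math


def logistic_exact(u0, k, t):
    """Classical logistic solution u(t) = u0 e^{kt} / (1 - u0 + u0 e^{kt})."""
    return u0 * math.exp(k * t) / (1.0 - u0 + u0 * math.exp(k * t))


def logistic_series_alpha1(u0, k, t, n_terms=80):
    """The Mittag-Leffler series at alpha = 1, where E_1(z) = exp(z).
    The geometric sum converges for u0 > 1/2, i.e. |(u0 - 1)/u0| < 1."""
    r = (u0 - 1.0) / u0
    return sum(r ** n * math.exp(-n * k * t) for n in range(n_terms))


print(logistic_exact(0.8, 1.0, 1.0))
print(logistic_series_alpha1(0.8, 1.0, 1.0))
```

    Both prints agree to machine precision, confirming that the series representation reproduces the familiar sigmoid when the fractional index reaches unity.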

  5. High-performance CAM-based Prolog execution scheme

    NASA Astrophysics Data System (ADS)

    Ali-Yahia, Tahar; Dana, Michel

    1991-03-01

    In this paper, we present an execution scheme allowing direct, pipelined evaluation of a Prolog program. The scheme enhances Prolog performance in interpreted mode by means of associative processing tools embodied in Content Addressable Memories (CAMs) and the potential parallelism existing between clause selection, unification, and access to clause arguments. The interpretation algorithm is distributed over several processing units, which are CAMs. These are generic and reconfigurable, addressing a broad range of Artificial Intelligence applications through target languages such as Prolog, Lisp, and object-oriented languages. The model has been evaluated with a functional simulator written in Le-lisp. The results demonstrate the feasibility of CAMs for improving Prolog execution, with performance greater than 160 KLIPS in interpreted mode.

  6. Improvement in fresh fruit and vegetable logistics quality: berry logistics field studies.

    PubMed

    do Nascimento Nunes, M Cecilia; Nicometo, Mike; Emond, Jean Pierre; Melis, Ricardo Badia; Uysal, Ismail

    2014-06-13

    Shelf life of fresh fruits and vegetables is greatly influenced by environmental conditions. Increasing temperature usually results in accelerated loss of quality and reduced shelf life, which is not physically visible until too late in the supply chain to adjust logistics to match shelf life. A blackberry study showed that temperatures inside pallets varied significantly and that 57% of the berries arriving at the packinghouse did not have enough remaining shelf life for the longest supply routes. Yet the advanced shelf-life loss was not physically visible, so some of those pallets would be sent on longer supply routes than necessary, creating avoidable waste. Other studies showed that variable pre-cooling at the centre of pallets resulted in physically invisible, uneven shelf life. We have shown that, using simple temperature measurements, much waste can be avoided through 'first expiring, first out' dispatch. Results from our studies showed that shelf-life prediction should not be based on a single quality factor because, depending on the temperature history, the quality attribute that limits shelf life may vary. Finally, methods have been developed to use air temperature to predict product temperature for highest shelf-life prediction accuracy in the absence of individual sensors for each monitored product. Our results show a significant reduction of up to 98% in the root-mean-square-error difference between the product temperature and air temperature when advanced estimation methods are used.
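
    'First expiring, first out' dispatch needs a way to turn a recorded temperature history into remaining shelf life. A minimal sketch, assuming a generic Q10 spoilage model with invented parameters (the paper's actual, attribute-specific kinetic models are not reproduced here):

```python
def remaining_shelf_life(temp_history_c, hours_per_step, shelf_life_at_ref_h,
                         ref_temp_c=5.0, q10=2.5):
    """Accumulate shelf-life loss over a temperature history using a Q10 model:
    every 10 degC above the reference multiplies the spoilage rate by q10.
    All parameter values are illustrative, not taken from the paper."""
    used_h = 0.0
    for t_c in temp_history_c:
        rate = q10 ** ((t_c - ref_temp_c) / 10.0)  # spoilage rate vs. reference
        used_h += rate * hours_per_step
    return max(shelf_life_at_ref_h - used_h, 0.0)


# A day held at 15 degC consumes shelf life ~2.5x faster than a day at 5 degC.
print(remaining_shelf_life([5.0] * 24, 1.0, 240.0))
print(remaining_shelf_life([15.0] * 24, 1.0, 240.0))
```

    Ranking pallets by such a remaining-shelf-life estimate, rather than by arrival date, is what lets the longer supply routes be reserved for the pallets that can still survive them.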

  7. Scalable, high performance, enzymatic cathodes based on nanoimprint lithography

    PubMed Central

    Pankratov, Dmitry; Sundberg, Richard; Sotres, Javier; Suyatin, Dmitry B; Maximov, Ivan; Montelius, Lars

    2015-01-01

    Here we detail high performance, enzymatic electrodes for oxygen bio-electroreduction, which can be easily and reproducibly fabricated with industry-scale throughput. Planar and nanostructured electrodes were built on biocompatible, flexible polymer sheets, while nanoimprint lithography was used for electrode nanostructuring. To the best of our knowledge, this is one of the first reports concerning the usage of nanoimprint lithography for amperometric bioelectronic devices. The enzyme (Myrothecium verrucaria bilirubin oxidase) was immobilised on planar (control) and artificially nanostructured, gold electrodes by direct physical adsorption. The detailed electrochemical investigation of bioelectrodes was performed and the following parameters were obtained: open circuit voltage of approximately 0.75 V, and maximum bio-electrocatalytic current densities of 18 µA/cm2 and 58 µA/cm2 in air-saturated buffers versus 48 µA/cm2 and 186 µA/cm2 in oxygen-saturated buffers for planar and nanostructured electrodes, respectively. The half-deactivation times of planar and nanostructured biocathodes were measured to be 2 h and 14 h, respectively. The comparison of standard heterogeneous and bio-electrocatalytic rate constants showed that the improved bio-electrocatalytic performance of the nanostructured biocathodes compared to planar biodevices is due to the increased surface area of the nanostructured electrodes, whereas their improved operational stability is attributed to stabilisation of the enzyme inside nanocavities. PMID:26199841

  8. Scalable, high performance, enzymatic cathodes based on nanoimprint lithography.

    PubMed

    Pankratov, Dmitry; Sundberg, Richard; Sotres, Javier; Suyatin, Dmitry B; Maximov, Ivan; Shleev, Sergey; Montelius, Lars

    2015-01-01

    Here we detail high performance, enzymatic electrodes for oxygen bio-electroreduction, which can be easily and reproducibly fabricated with industry-scale throughput. Planar and nanostructured electrodes were built on biocompatible, flexible polymer sheets, while nanoimprint lithography was used for electrode nanostructuring. To the best of our knowledge, this is one of the first reports concerning the usage of nanoimprint lithography for amperometric bioelectronic devices. The enzyme (Myrothecium verrucaria bilirubin oxidase) was immobilised on planar (control) and artificially nanostructured, gold electrodes by direct physical adsorption. The detailed electrochemical investigation of bioelectrodes was performed and the following parameters were obtained: open circuit voltage of approximately 0.75 V, and maximum bio-electrocatalytic current densities of 18 µA/cm2 and 58 µA/cm2 in air-saturated buffers versus 48 µA/cm2 and 186 µA/cm2 in oxygen-saturated buffers for planar and nanostructured electrodes, respectively. The half-deactivation times of planar and nanostructured biocathodes were measured to be 2 h and 14 h, respectively. The comparison of standard heterogeneous and bio-electrocatalytic rate constants showed that the improved bio-electrocatalytic performance of the nanostructured biocathodes compared to planar biodevices is due to the increased surface area of the nanostructured electrodes, whereas their improved operational stability is attributed to stabilisation of the enzyme inside nanocavities.

  9. Actuation performance of cellulose based electro-active papers

    NASA Astrophysics Data System (ADS)

    Kim, Jaehwan; Song, Chunseok; Bae, Seung-Hun

    2005-05-01

    Electro-Active Paper (EAPap) is attractive as an EAP actuator material owing to its light weight, dry operating condition, large displacement output, low actuation voltage and low power consumption. This paper presents the fabrication and performance testing of EAPap actuators. The EAPap material is made from cellulose: cellulose fibre is dissolved into a solution and formed into a sheet using a spin coater, and thin electrodes are deposited on the cellophane sheet to complete the EAPap. The EAPap is then cut into plate or beam specimens along a specific orientation to enhance actuator performance. The EAPap is clamped in an electric power connector and placed in an environmental chamber, where the tip displacement of the EAPap is measured with a laser sensor. The blocking force of the EAPap sample is also measured, and the measured force is compared with a theoretical beam model. These measurements are performed under a variety of environmental and input factors, including frequency, actuation voltage, temperature and humidity. Characteristics of EAPap in terms of its fibrous nature, crystallinity, and mechanical, physical and electrochemical properties are presented.
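The kind of theoretical beam model the measured blocking force can be compared against is, in its simplest linear form, the cantilever relation F = 3EIδ/L³. This is a hedged sketch with illustrative numbers, not the EAPap measurements or the authors' actual model:

```python
def blocking_force(E, width, thickness, length, delta):
    """Tip force holding a clamped-free beam at tip deflection delta (SI units)."""
    I = width * thickness ** 3 / 12.0  # second moment of a rectangular section
    return 3.0 * E * I * delta / length ** 3

# Illustrative numbers for a thin, compliant strip.
F = blocking_force(E=5e9, width=10e-3, thickness=20e-6, length=30e-3, delta=1e-3)
```

Because the force scales with thickness cubed, very thin paper-like actuators produce small blocking forces even at visible deflections.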

  10. 5 CFR 9901.343 - Pay reduction based on unacceptable performance and/or conduct.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... performance and/or conduct. 9901.343 Section 9901.343 Administrative Personnel DEPARTMENT OF DEFENSE HUMAN...) DEPARTMENT OF DEFENSE NATIONAL SECURITY PERSONNEL SYSTEM (NSPS) Pay and Pay Administration Performance-Based Pay § 9901.343 Pay reduction based on unacceptable performance and/or conduct. An employee's rate...

  11. Performance-Based Funding & Online Learning: Maximizing Resources for Student Success

    ERIC Educational Resources Information Center

    Patrick, Susan; Myers, John; Silverstein, Justin; Brown, Amanda; Watson, John

    2015-01-01

    There is a new conversation taking place in public education on creating systemic incentives through school finance to encourage schools to innovate and be rewarded for positive student outcomes and performance. What if education funding was not based on seat-time, but on rewarding student performance? Performance-based funding is a term that…

  12. Model selection for logistic regression models

    NASA Astrophysics Data System (ADS)

    Duller, Christine

    2012-09-01

    Model selection for logistic regression models decides which of the given potential regressors have an effect and hence should be included in the final model. A second question of interest is whether a certain factor is heterogeneous among some subsets, i.e. whether the model should include a random intercept or not. In this paper these questions are answered with classical as well as Bayesian methods. The applications show results of recent research projects in medicine and business administration.
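A classical way to decide whether a regressor belongs in the model is to fit nested logistic regressions by maximum likelihood and compare them with an information criterion such as AIC. The sketch below uses synthetic data and a plain gradient-ascent fit (illustrative, not the paper's method): `x1` drives the outcome while `x2` is pure noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                              # irrelevant regressor
p = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * x1)))
y = (rng.random(n) < p).astype(float)

def fit_logistic(X, y, steps=2000, lr=0.1):
    """Maximum-likelihood fit by gradient ascent; returns (beta, log-likelihood)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(Xd.shape[1])
    for _ in range(steps):
        mu = 1.0 / (1.0 + np.exp(-Xd @ beta))
        beta += lr * Xd.T @ (y - mu) / len(y)        # average score
    mu = 1.0 / (1.0 + np.exp(-Xd @ beta))
    ll = np.sum(y * np.log(mu) + (1 - y) * np.log(1 - mu))
    return beta, ll

def aic(ll, k):
    return 2 * k - 2 * ll                            # k = number of parameters

_, ll1 = fit_logistic(np.column_stack([x1]), y)      # x1 only
_, ll2 = fit_logistic(np.column_stack([x1, x2]), y)  # x1 and x2
aic1, aic2 = aic(ll1, 2), aic(ll2, 3)
```

The larger model fits at least as well in likelihood, but AIC penalises the extra parameter, which typically favours dropping the noise regressor.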

  13. On the Performance of Distributed Lock-Based Synchronization

    NASA Astrophysics Data System (ADS)

    Lubowich, Yuval; Taubenfeld, Gadi

    We study the relation between two classical types of distributed locking mechanisms, called token-based locking and permission-based locking, and several distributed data structures that use locking for synchronization. We have proposed, implemented and tested several lock-based distributed data structures, namely two different types of counters, called find&increment and increment&publish, a queue, a stack and a linked list. For each of them we have determined the preferred type of lock to use as the underlying locking mechanism. Furthermore, we have determined which of the two proposed counters performs better, both as a stand-alone data structure and as a building block for implementing other high-level data structures.
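The find&increment counter named above can be sketched as a lock-protected read-then-increment: each operation takes the lock, returns the value it found, and bumps the counter. This is a hedged local sketch, with a `threading.Lock` standing in for the distributed token- or permission-based lock studied in the paper.

```python
import threading

class FindAndIncrementCounter:
    def __init__(self):
        self._lock = threading.Lock()
        self._value = 0

    def find_and_increment(self):
        with self._lock:           # critical section guarded by the lock
            found = self._value
            self._value += 1
            return found

counter = FindAndIncrementCounter()
results = []

def worker():
    for _ in range(1000):
        results.append(counter.find_and_increment())

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Because every read-increment pair runs under the lock, the 4000 returned values are exactly 0..3999 with no duplicates, which is the correctness property such counters must preserve under contention.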

  14. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  15. Forest biomass supply logistics for a power plant using the discrete-event simulation approach

    SciTech Connect

    Mobini, Mahdi; Sowlati, T.; Sokhansanj, Shahabaddine

    2011-04-01

    This study investigates the logistics of supplying forest biomass to a potential power plant. Due to the complexities of such a supply logistics system, a simulation model based on the framework of the Integrated Biomass Supply Analysis and Logistics (IBSAL) model is developed in this study to evaluate the cost of delivered forest biomass, the equilibrium moisture content, and carbon emissions from the logistics operations. The model is applied to a proposed 300 MW power plant in Quesnel, BC, Canada. The results show that the biomass demand of the power plant would not be met every year. The weighted average cost of biomass delivered to the gate of the power plant is about C$90 per dry tonne. Estimates of the equilibrium moisture content of delivered biomass and the CO2 emissions resulting from the processes are also provided.
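The core mechanism of a discrete-event simulation like IBSAL is a time-ordered event queue processed chronologically. A toy sketch in that spirit, with all numbers (truck payload, cost per tonne, schedule) illustrative rather than values from the Quesnel case study:

```python
import heapq

events = []  # min-heap of (arrival_time_h, payload_t, cost_per_t)
for truck in range(10):
    depart = truck * 2.0          # a truck departs every 2 hours
    travel = 3.5                  # hours from roadside to the plant
    heapq.heappush(events, (depart + travel, 20.0, 90.0))

clock, delivered_t, total_cost = 0.0, 0.0, 0.0
while events:
    t, payload, cost_per_t = heapq.heappop(events)
    clock = t                     # events come out in time order
    delivered_t += payload
    total_cost += payload * cost_per_t

avg_cost = total_cost / delivered_t   # C$ per tonne delivered
```

A real model would add stochastic weather, moisture drying curves, and equipment queues as further event types, but the queue-and-clock skeleton stays the same.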

  16. Accounting for Informatively Missing Data in Logistic Regression by Means of Reassessment Sampling

    PubMed Central

    Lin, Ji; Lyles, Robert H.

    2015-01-01

    We explore the “reassessment” design in a logistic regression setting, where a second wave of sampling is applied to recover a portion of the missing data on a binary exposure and/or outcome variable. We construct a joint likelihood function based on the original model of interest and a model for the missing data mechanism, with emphasis on non-ignorable missingness. The estimation is carried out by numerical maximization of the joint likelihood function with close approximation of the accompanying Hessian matrix, using sharable programs that take advantage of general optimization routines in standard software. We show how likelihood ratio tests can be used for model selection, and how they facilitate direct hypothesis testing for whether missingness is at random. Examples and simulations are presented to demonstrate the performance of the proposed method. PMID:25707010
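The joint likelihood described above couples an outcome model with a missingness model and, for subjects whose outcome is missing, sums over both possible values. A schematic sketch, with illustrative parameterisation (not the authors' exact specification): P(Y=1|x) = sigmoid(b0 + b1·x) and a non-ignorable missingness model P(observed|y) = sigmoid(a0 + a1·y).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def joint_loglik(params, data):
    """data: list of (x, y, observed) with y=None when the outcome is missing."""
    b0, b1, a0, a1 = params
    ll = 0.0
    for x, y, observed in data:
        def contrib(yv):
            py = sigmoid(b0 + b1 * x)               # outcome model
            p_y = py if yv == 1 else 1.0 - py
            p_r = sigmoid(a0 + a1 * yv)             # missingness depends on y
            return p_y * (p_r if observed else 1.0 - p_r)
        if y is None:
            ll += math.log(contrib(0) + contrib(1))  # marginalise over missing Y
        else:
            ll += math.log(contrib(y))
    return ll

data = [(0.2, 1, True), (1.5, 0, True), (-0.7, None, False), (0.9, 1, True)]
ll = joint_loglik((0.0, 1.0, 0.5, -0.5), data)
```

Maximising this function numerically over (b0, b1, a0, a1), and comparing fits with a1 fixed at zero versus free, is the shape of the likelihood-ratio test for whether missingness is at random.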

  17. C*-algebras associated with reversible extensions of logistic maps

    NASA Astrophysics Data System (ADS)

    Kwaśniewski, Bartosz K.

    2012-10-01

    The construction of reversible extensions of dynamical systems presented in a previous paper by the author and A.V. Lebedev is enhanced, so that it applies to arbitrary mappings (not necessarily with open range). It is based on calculating the maximal ideal space of C*-algebras that extends endomorphisms to partial automorphisms via partial isometric representations, and involves a new set of 'parameters' (the role of parameters is played by chosen sets or ideals). As model examples, we give a thorough description of reversible extensions of logistic maps and a classification of systems associated with compression of unitaries generating homeomorphisms of the circle. Bibliography: 34 titles.
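What makes a reversible extension non-trivial is that the logistic map x ↦ ax(1−x) is not invertible: a generic point of its range has two preimages. A quick numerical illustration (the parameter a = 4 is chosen for convenience):

```python
import math

a = 4.0
f = lambda x: a * x * (1.0 - x)   # the logistic map

def preimages(y):
    """Both solutions of a*x*(1-x) = y for 0 <= y <= a/4."""
    d = math.sqrt(1.0 - 4.0 * y / a)
    return (1.0 - d) / 2.0, (1.0 + d) / 2.0

y = 0.75
x_lo, x_hi = preimages(y)          # two distinct points mapping to the same y
```

A reversible extension must enlarge the state space so that these two branches become distinguishable, which is precisely what the partial-isometry construction encodes at the C*-algebra level.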

  18. C*-algebras associated with reversible extensions of logistic maps

    SciTech Connect

    Kwasniewski, Bartosz K

    2012-10-31

    The construction of reversible extensions of dynamical systems presented in a previous paper by the author and A.V. Lebedev is enhanced, so that it applies to arbitrary mappings (not necessarily with open range). It is based on calculating the maximal ideal space of C*-algebras that extends endomorphisms to partial automorphisms via partial isometric representations, and involves a new set of 'parameters' (the role of parameters is played by chosen sets or ideals). As model examples, we give a thorough description of reversible extensions of logistic maps and a classification of systems associated with compression of unitaries generating homeomorphisms of the circle. Bibliography: 34 titles.

  19. An Alternative Flight Software Trigger Paradigm: Applying Multivariate Logistic Regression to Sense Trigger Conditions using Inaccurate or Scarce Information

    NASA Technical Reports Server (NTRS)

    Smith, Kelly M.; Gay, Robert S.; Stachowiak, Susan J.

    2013-01-01

    In late 2014, NASA will fly the Orion capsule on a Delta IV-Heavy rocket for the Exploration Flight Test-1 (EFT-1) mission. For EFT-1, the Orion capsule will be flying with a new GPS receiver and new navigation software. Given the experimental nature of the flight, the flight software must be robust to the loss of GPS measurements. Once the high-speed entry is complete, the drogue parachutes must be deployed within the proper conditions to stabilize the vehicle prior to deploying the main parachutes. When GPS is available in nominal operations, the vehicle will deploy the drogue parachutes based on an altitude trigger. However, when GPS is unavailable, the navigated altitude errors become excessively large, driving the need for a backup barometric altimeter. In order to increase overall robustness, the vehicle also has an alternate method of triggering the drogue parachute deployment based on planet-relative velocity if both the GPS and the barometric altimeter fail. However, this velocity-based trigger results in large altitude errors relative to the targeted altitude. Motivated by this challenge, this paper demonstrates how logistic regression may be employed to automatically generate robust triggers based on statistical analysis. Logistic regression is used as a ground processor pre-flight to develop a classifier. The classifier would then be implemented in flight software and executed in real-time. This technique offers excellent performance even in the face of highly inaccurate measurements. Although the logistic regression-based trigger approach will not be implemented within EFT-1 flight software, the methodology can be carried forward for future missions and vehicles.
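The paper's core idea, training a logistic-regression classifier offline and executing only the cheap sigmoid evaluation in flight software, can be sketched as follows. Everything numeric here is an illustrative assumption: a latent altitude defines the true deploy condition, and only a noisy, correlated velocity measurement is available to the classifier.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
altitude = rng.uniform(5000.0, 12000.0, n)                  # truth (not measurable)
velocity = 60.0 + 0.01 * altitude + rng.normal(0, 5.0, n)   # noisy measurement
deploy = (altitude < 8000.0).astype(float)                  # desired trigger label

# Standardise the feature, then fit by gradient ascent (ground processing).
v = (velocity - velocity.mean()) / velocity.std()
X = np.column_stack([np.ones(n), v])
beta = np.zeros(2)
for _ in range(3000):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (deploy - mu) / n

# In-flight evaluation is just a sigmoid and a threshold.
predicted = (1.0 / (1.0 + np.exp(-X @ beta)) > 0.5)
accuracy = (predicted == (deploy == 1.0)).mean()
```

The expensive training happens pre-flight; the onboard trigger reduces to a dot product and a comparison, which is why the approach is attractive for flight software even with inaccurate measurements.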

  20. Accounting for Slipping and Other False Negatives in Logistic Models of Student Learning

    ERIC Educational Resources Information Center

    MacLellan, Christopher J.; Liu, Ran; Koedinger, Kenneth R.

    2015-01-01

    Additive Factors Model (AFM) and Performance Factors Analysis (PFA) are two popular models of student learning that employ logistic regression to estimate parameters and predict performance. This is in contrast to Bayesian Knowledge Tracing (BKT) which uses a Hidden Markov Model formalism. While all three models tend to make similar predictions,…
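The AFM prediction named above can be sketched directly: the log-odds of a correct answer are the student proficiency plus, for each knowledge component (KC) the item exercises, that KC's difficulty plus a learning-rate term scaled by prior practice. Parameter values here are illustrative.

```python
import math

def afm_p_correct(theta, kcs, beta, gamma, opportunities):
    """theta: student ability; kcs: KC ids the item uses; beta/gamma: per-KC
    intercept and learning rate; opportunities: prior practice counts per KC."""
    logit = theta + sum(beta[k] + gamma[k] * opportunities[k] for k in kcs)
    return 1.0 / (1.0 + math.exp(-logit))

beta = {"fractions": -1.0}    # KC difficulty (negative = hard)
gamma = {"fractions": 0.3}    # learning rate per practice opportunity

p_first = afm_p_correct(0.2, ["fractions"], beta, gamma, {"fractions": 0})
p_fifth = afm_p_correct(0.2, ["fractions"], beta, gamma, {"fractions": 4})
```

Note that this probability approaches 1 as practice accumulates, with no ceiling for slips; that absence of a slipping term in standard logistic models is exactly the gap the paper addresses.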